
Tractobots, my attempts at field robots

Loy,

I used the Bidirectional UTM-WGS84 converter for python, but I’m certainly interested in using more ROS code. Thank you, I will take a look. I appreciate the code example.

Ideally, what would I do with the Point? I’m thinking…

  1. Publish it (as is).
  2. Use it along with the secondary antenna projections to compute a Point for the secondary antenna.
  3. Publish a Point for the secondary antenna.

I suspect that I should use tf2 for the operations in #2. It’s going to take me a while to get up to speed there. Is having the second point helpful, though? Or should I just use the data to publish a Pose?

I would love to find a simple example that’s similar to what I’m doing. Pointers will be appreciated.

My use-case for this code is slightly different; I need to define a route for my robot based on lat/lon coordinates and put those in a Path, which is a list of PoseStampeds, which each consist of Points.

It’s good practice to stamp geometric data with a frame of reference (UTM in this case; robot_localization can tell you where the /utm frame is with respect to /map). I wanted to link to a tutorial on geometry_msgs, but I was surprised that I couldn’t find anything in the limited time I had.

from geometry_msgs.msg import PoseStamped, Quaternion
pose_stamped = PoseStamped()
pose_stamped.header.frame_id = "utm"
pose_stamped.pose.position = utm_coord.toPoint()
pose_stamped.pose.orientation = Quaternion(0, 0, 0, 1)  # identity quaternion (no rotation). If you only have a position and no orientation, a PointStamped would do just as well.
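
For the route use-case I mentioned, a minimal sketch of building the Path could look like this (assuming the geodesy package for the lat/lon to UTM conversion; the waypoints are made up):

import rospy
from geodesy import utm
from geometry_msgs.msg import PoseStamped, Quaternion
from nav_msgs.msg import Path

rospy.init_node("route_builder")

waypoints = [(51.9876, 5.6636), (51.9878, 5.6640)]  # made-up lat/lon pairs

path = Path()
path.header.frame_id = "utm"
path.header.stamp = rospy.Time.now()

for lat, lon in waypoints:
    pose_stamped = PoseStamped()
    pose_stamped.header = path.header
    pose_stamped.pose.position = utm.fromLatLong(lat, lon).toPoint()
    pose_stamped.pose.orientation = Quaternion(0, 0, 0, 1)  # no orientation information
    path.poses.append(pose_stamped)

path_pub = rospy.Publisher("route", Path, queue_size=1, latch=True)
path_pub.publish(path)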

Not sure why you would want to publish a Point for the second antenna?

I could not find this linked above, but http://docs.ros.org/kinetic/api/robot_localization/html/index.html might help as well.

a little different angle on this topic…

I recently got a couple Lepton thermal cameras. I’m hoping to use them for avoiding people. I considered LEDDAR but these are looking like a much better people-oriented sensor.

I’ll welcome suggestions for background subtraction, etc.

Kyler, can you share the pinout of the Delphi sensor?

I did not find a data sheet for the Delphi ride height sensor. Here is what I determined:
A: +5
B: signal
C: ground

I’ve been looking at other options like rudder sensors, but damn they’re expensive. It’s hard for me to justify buying a Garmin GRF10 when just the extension cable costs more than everything I’m using. Maybe someday I’ll play with one and see if it’s worth the cost (especially considering that I like to have several on hand as spares).


Thanks for posting the link so I can keep following your progress… this is really interesting stuff. I’m amazed at what you’re doing.

On the latest versions, are you still using the IMU or are you back to dual GPS only?

I picked up a TinkerForge IMU2.0 and it’s looking really interesting on the desk. Haven’t had a chance to get it out in the field. I’m bouncing between using dual GPS antennas for tilt or using single + IMU. My end goal is to provide some level of a smart signal correction for the GPS and be able to feed that into AgOpenGPS. What’s driving the direction you are going with dual GPS?

My goal for this year is to use a Raspberry Pi as the correction box to handle the calculations and transmit the most accurate GPS location possible. Ultimately, I’d like to use the R-Pi + ROS to do more in the future.

I’ve been using the BNO055 (the same chip as in the TinkerForge) and I’ve found the pitch and roll wander upward a few degrees, even as high as 8 to 10 degrees, while traveling forward. As soon as you stop moving, the numbers return to zero. I have given up using it as I can’t seem to correct the problem.

Has anyone else noticed this?

87yj,

The IMU vs. dual-RTK situation is a hot-button for me right now. A firmware update went bad on one of my NVS-NV08C-RTK-A receivers. It’s now in “console mode” which NVS refuses to support. So it’s bricked.

The Tersus is delayed so I ordered a couple Piksi Multis and a couple more IMU Brick V2s. I’m hoping to migrate to them over the next week so that I can start planting.

I’m not sure what you mean by “driving the direction you are going” but I use the NV08C heading.

My issue now is that I wander around +/- 10cm from the line. There are a few things I still need to tweak but I’m hoping an IMU will help me stabilize.

I also bought some A/D converters for the Pi. I’m thinking about controlling steering (and everything else) directly from the Pi instead of going through rosserial to an Arduino. It should cut down on latency a bit and simplify the system.

Thanks Kyler…

“Driving the direction…” is a poor choice of words in the context. I was asking what factors are leading to the choices you are making related to # of GPS’s and/or IMU usage.

I have some rough Python code that will take 2 GPS signals with known mounting height and distance from the center-line, and calculate the center point and heading while compensating for tilt. I was starting to move to 1 GPS and an IMU to simplify the calculation and coordination of the 2 GPS signals. I’d like to build out the code to handle either configuration.

BTW, here’s the meat of my steering code. (Prepare to be amazed.)

pid_output = self.pid_captured(self.cross_track_distance)
desired_heading = (self.course + minmax(-90, 90, pid_output)) % 360
bearing = degdiff(desired_heading - heading)
steer_for_bearing(bearing)  # horribly uncalibrated

Yup, that’s right; I send the cross-track distance into a PID and then use the PID output to determine how big my intercept angle should be. This was basically my first shot at a steering algorithm and it worked so well that I just kept using it. But it’s not working well enough now and there are so many parts that need improvement.
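
(For anyone reading along: minmax and degdiff are nothing fancy, just a clamp and an angle wrap, roughly like this.)

def minmax(lo, hi, value):
    # Clamp value into the range [lo, hi].
    return max(lo, min(hi, value))

def degdiff(angle):
    # Wrap an angle difference (in degrees) into [-180, 180).
    return (angle + 180) % 360 - 180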

I’m thinking that if I throw an IMU into the mix, I could still use my algorithm but make the steering changes in response to IMU updates (at a very high rate). So I’d use the old code to say “Steer 305 degrees” but use the IMU to help me direct the steering wheels there.

Hi,

Great project! Looks like you’re making good progress.

I just wanted to comment on your plan of doing steering directly from the Pi. I know some experienced engineers who built their hobby systems using a Pi for logic and an Arduino for motor control, on purpose. Their justification was that the Pi can get bogged down doing other OS-level tasks (accepting user input, etc.) that can interfere with a control loop and affect performance. To finally fix the issue, they ended up moving to an FPGA-based system, which is able to run the control law in real time while also running program logic.


Adam,

Yes, I was excited to do more control directly from the Pi, but I was concerned about delays throwing a wrench into timing-critical operations. I appreciate you reinforcing that issue.

I ended up doing some testing directly from the Pi. That allowed me to tweak the steering settings. I had great success! I found that I need to use PWM at around 40 Hz to get good control of the proportional valve. It cracks at around 233/255 duty cycle. I am able to make the wheels steer at a nearly imperceptible rate. It’s so cool.

I have been concerned about proportional valve control since I started. It was never clear to me if I need current control or if I can simply use straight PWM. Until now I’ve been unable to get slow steer speeds (low flow) and I thought that it might be because I’m not controlling current adequately. But now I know I don’t need to do that! It seems that I get the full range of flow with straight PWM at a sufficiently low frequency. It’s easy to find the right frequency - if it’s too high then low flows aren’t possible, and if it’s too low it pulses. This is so important! And yet even when I suspected it was an issue, I didn’t find much mention of frequency adjustments.
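
For anyone wanting to reproduce the test, the Pi side amounts to little more than setting up PWM at the right frequency. Here’s a rough sketch (this uses RPi.GPIO software PWM; the pin is arbitrary and the numbers are just the ones that worked on my valve):

import RPi.GPIO as GPIO

VALVE_PIN = 18                    # arbitrary example pin (BCM numbering)
PWM_FREQUENCY_HZ = 40             # low enough that the full flow range is usable
CRACK_DUTY = 233 / 255.0 * 100    # ~91% duty, where the valve just starts to open

GPIO.setmode(GPIO.BCM)
GPIO.setup(VALVE_PIN, GPIO.OUT)

pwm = GPIO.PWM(VALVE_PIN, PWM_FREQUENCY_HZ)
pwm.start(0)                      # start with the valve closed
pwm.ChangeDutyCycle(CRACK_DUTY)   # just past the cracking point: very slow steering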

Also, inspired by a video I found (https://www.youtube.com/watch?v=-SE6X6t1HNQ), I chopped I and D from my PID steering (angle) controller. With the ability to go at a low rate, proportional seems to be all that I need.

Once I did all of my testing on the Pi, I re-worked my Arduino code a bit and it seems to be good. I’ll stick with it for now.

I’ve been thinking about how well Tractobot02 (300HP tracked tractor) stuck to my line, even at high speeds (19 MPH). I want that level of control with Tractobot03. Tracked tractors are designed to go straight but I’m sure I can do something similar with an IMU. So tonight I laid a Tinkerforge IMU Brick V2 in Tractobot02’s enclosure. Here’s what I’m considering…

Instead of trying to calibrate the steer angle sensor to steering angles, I’ll simply control the steering valve with a PID fed by rate of turn data. That would allow me to say “Go straight” and it should find the steer angle to do so. That gets me to Tractobot02’s level of control.

Once I can control rate of turn, I can use the PID that’s fed by the cross-track distance to feed the rate of turn PID. This means that if I’m left of my line, the PID would command an appropriate rate of turn to the right to re-capture the line.
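
Here’s a rough sketch of the cascade (the PID class and gains are placeholders, and the valve hook is a stub):

class SimplePID(object):
    # Minimal proportional-only controller, just enough to show the structure.
    def __init__(self, kp, out_min, out_max):
        self.kp = kp
        self.out_min = out_min
        self.out_max = out_max
        self.setpoint = 0.0

    def update(self, measurement):
        error = self.setpoint - measurement
        return max(self.out_min, min(self.out_max, self.kp * error))

# Outer loop: cross-track distance (m) -> desired rate of turn (deg/s).
# Sign conventions depend on which direction counts as positive cross-track / yaw rate.
cross_track_pid = SimplePID(kp=2.0, out_min=-10.0, out_max=10.0)
# Inner loop: rate-of-turn error (deg/s) -> steering valve command
rate_pid = SimplePID(kp=5.0, out_min=-100.0, out_max=100.0)

def send_to_valve(command):
    pass  # stub: write PWM duty / direction to the steering valve here

def on_imu(yaw_rate_deg_per_s):
    # Inner loop runs at the IMU rate.
    send_to_valve(rate_pid.update(yaw_rate_deg_per_s))

def on_gps(cross_track_distance_m):
    # Outer loop runs at the GPS rate and re-targets the inner loop.
    rate_pid.setpoint = cross_track_pid.update(cross_track_distance_m)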

I thought about throwing in a PID to maintain a given heading but I don’t think that’s needed if I can do rate of turn. We’ll see…

–kyler

I installed the IMU. It’s cool! I enjoy watching the steering wheels adjust as they move across rough spots.

Now to figure out how to use it with the GPS. I know there’s lots of ROS fusion goodness but I’m not there yet. I’m still working with my own code.

I’m excited about the possibility of ditching the NVS dual-RTK receivers. I will have a lot more flexibility working with a single receiver plus IMU.

It looks pretty awesome. For the fusion you could use the robot_localization package.

Here is the link to the API. The point of this algorithm is that it can fuse a NavSatFix message and an IMU message (an odometry message can also be used).


solosito,

Now that I’ve had a taste of working with the IMU, I desperately want fusion. I need remedial help.

One of the issues I’ve had is that I’ve wanted to incorporate the yaw and roll data from the dual GPS. I’m leaning hard toward switching to a single GPS(/GNSS) receiver. That should make my configuration a bit more standard.

Today I installed nmea_navsat_driver and the TinkerForge sensor driver. I now have /fix, /vel, and /tfsensors/imu1 (among others). Unfortunately, my GPS went sour last night and I can’t get RTK now. I’ll be debugging it for a while.

I’m looking at the steps that xqms suggests. Converting the /fix data to UTM seems straightforward enough but the transforms are completely beyond me. I would love to have a simple concrete step to take toward this.
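
For the UTM conversion piece, I’m picturing something as simple as this (a sketch using the geodesy package; topic names are placeholders):

import rospy
from geodesy import utm
from geometry_msgs.msg import PointStamped
from sensor_msgs.msg import NavSatFix

def fix_callback(fix):
    # Convert the lat/lon fix into a UTM point (geodesy handles the zone math).
    point = PointStamped()
    point.header.stamp = fix.header.stamp
    point.header.frame_id = "utm"
    point.point = utm.fromLatLong(fix.latitude, fix.longitude).toPoint()
    utm_pub.publish(point)

rospy.init_node("fix_to_utm")
utm_pub = rospy.Publisher("utm_point", PointStamped, queue_size=1)
rospy.Subscriber("fix", NavSatFix, fix_callback)
rospy.spin()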

Also, I’m back to studying robot_localization. Again, I’m encountering the odometry requirement.

I decided to use RMC messages from the GPS so that I can get velocity data. There’s lots more information that I need, though. For example, I’ll want to check on the (RTCM) corrections, and I must stop if I ever lose an RTK solution. I also need to push raw sentences over Ethernet to my planter monitor.

To handle this, I configured one node to listen to the GPS and another to interpret the messages. Here’s my launch file:

<launch>
  <node name="gps" pkg="nmea_navsat_driver" type="nmea_topic_serial_reader">
    <param name="port" value="/dev/gps_nmea" />
    <param name="baud" value="115200" />
  </node>

  <node name="nmea" pkg="nmea_navsat_driver" type="nmea_topic_driver">
    <param name="useRMC" value="True" />
  </node>
</launch>

Now, in addition to /fix and /vel, I get raw output on /nmea_sentence.
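
That raw output should also let me watch the RTK status. Here’s a rough sketch of the watchdog I have in mind, keying off the GGA fix-quality field (4 = RTK fixed, 5 = RTK float); the stop action is just a placeholder:

import rospy
from nmea_msgs.msg import Sentence

def sentence_callback(msg):
    fields = msg.sentence.split(",")
    if fields[0].endswith("GGA") and len(fields) > 6:
        quality = fields[6]  # GGA fix quality: 4 = RTK fixed, 5 = RTK float
        if quality != "4":
            rospy.logwarn("Lost RTK fixed solution (quality=%s)", quality)
            # placeholder: command the tractor to stop here

rospy.init_node("rtk_watchdog")
rospy.Subscriber("nmea_sentence", Sentence, sentence_callback)
rospy.spin()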

I am just catching up on this entire project (which I must say is really interesting!), and see that you need to fuse IMU and GPS data. If you do want to use robot_localization, note that you don’t necessarily need wheel odometry, but rather a nav_msgs/Odometry message, which is generated by an instance of ekf_localization_node. You could do something like the following:

ekf_localization_node
Inputs: IMU, nav_msgs/Odometry message containing the GPS pose (as produced by navsat_transform_node), and possibly GPS velocity (more on that in a moment)
Outputs: nav_msgs/Odometry message with your tractor’s state

navsat_transform_node
Inputs: IMU, GPS, and nav_msgs/Odometry message from EKF
Outputs: nav_msgs/Odometry message containing the GPS pose, transformed into the EKF’s world frame, and (optionally) a filtered GPS fix

You’ll note that there’s a circular dependency here, which is that the EKF has to produce an odometry message for navsat_transform_node, which is then spitting out its own odometry message that gets fed back to the EKF. The reasons for it are probably a bit much to go into here, but it will work: the EKF will get its first IMU measurement and immediately produce a state estimate, which will get fed to navsat_transform_node along with the IMU and GPS data, and navsat_transform_node will spit out a pose that is consistent with your EKF’s coordinate frame, and can be fused back into it. This lets you control your tractor based on its world frame pose (the EKF output).

There are two things that I’d want to know more about, however:

  1. If you want to use the GPS velocity data and fuse that into the EKF (which I would recommend), you’d first want to make sure that the data is in the body frame of the tractor. A cursory glance at the nmea_navsat_driver source seems to suggest that they are converting the velocity into a world frame (apologies if I am misreading that), but if you want to fuse it with the EKF, you need the body-frame velocity (i.e., your tractor’s linear velocity would always be along +/- X).
  2. I’d need to know your IMU’s coordinate frame. The data sheet seems to suggest that it already assumes that the signs of the data would be robot_localization friendly (e.g., a counter-clockwise turn should correspond to an increase in yaw), but I’m assuming that your IMU reads 0 when facing magnetic north, correct? I only ask because we assume a heading of 0 for east, not north. There’s a parameter in navsat_transform_node for specifying that offset, however, along with the magnetic declination for your location.

Hi, Tom. I appreciate your explanations and suggestions. Getting the accuracy I need is becoming critical and I’m becoming anxious.

I have one of my IMUs in front of me now. Yes, it reads 0 when facing magnetic north.

I think I recall seeing a video that explained the circular dependency you mention. That makes sense to me. But this is all still very fuzzy. I need some basic help with frames.

Could you give me a simple first step or two? A launch file, for example, would be wonderful.

Thank you!

Hi Kyler, I uploaded a starting point for using robot_localization. I am still adapting it to your system. The navsat_transform node is already done and “just” the ekf node must be set:

tractobots_robot_localization

More or less it is the same as what Tom proposed.
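
For the ekf node itself, I am thinking of something roughly like this (only a sketch; the frames, the fused fields, and the frequency still need to be adapted to the tractor):

<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true">
    <param name="frequency" value="30" />
    <param name="two_d_mode" value="true" />

    <param name="map_frame" value="map" />
    <param name="odom_frame" value="odom" />
    <param name="base_link_frame" value="base_link" />
    <param name="world_frame" value="odom" />

    <!-- GPS pose from navsat_transform_node: fuse x and y -->
    <param name="odom0" value="odometry/gps" />
    <rosparam param="odom0_config">[true,  true,  false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>

    <!-- IMU: fuse orientation, angular velocity, and linear acceleration -->
    <param name="imu0" value="imu/data" />
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   true,  true,  true,
                                   true,  true,  true]</rosparam>
  </node>
</launch>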

I submitted a PR against that branch with a few tweaks.