Tractobots, my attempts at field robots

I installed the IMU. It’s cool! I enjoy watching the steering wheels adjust as they move across rough spots.

Now to figure out how to use it with the GPS. I know there’s lots of ROS fusion goodness but I’m not there yet. I’m still working with my own code.

I’m excited about the possibility of ditching the NVS dual-RTK receivers. I will have a lot more flexibility working with a single receiver plus IMU.

It looks pretty awesome. For the fusion you could use the robot_localization package.

Here you have the link to the API. The point of this algorithm is that it can fuse a NavSatFix message and an IMU message (odometry messages can also be used).


solosito,

Now that I’ve had a taste of working with the IMU, I desperately want fusion. I need remedial help.

One of the issues I’ve had is that I’ve wanted to incorporate the yaw and roll data from the dual GPS. I’m leaning hard toward switching to a single GPS(/GNSS) receiver. That should make my configuration a bit more standard.

Today I installed nmea_navsat_driver and the TinkerForge sensor driver. I now have /fix, /vel, and /tfsensors/imu1 (among others). Unfortunately, my GPS went sour last night and I can’t get RTK now. I’ll be debugging it for a while.

I’m looking at the steps that xqms suggests. Converting the /fix data to UTM seems straightforward enough but the transforms are completely beyond me. I would love to have a simple concrete step to take toward this.
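For what it’s worth, the lat/lon-to-plane step can be sketched without any ROS machinery. This is only an illustration under assumptions: it uses a local tangent-plane approximation around a datum point rather than true UTM, which is usually fine over a single field of a few kilometers.

```python
import math

# Sketch: project GPS lat/lon onto a local flat plane around a datum point.
# This is an equirectangular approximation, not true UTM, but over a single
# field the error is negligible and it needs no extra libraries.
EARTH_RADIUS_M = 6371000.0

def fix_to_local_xy(lat, lon, datum_lat, datum_lon):
    """Return (east_m, north_m) of (lat, lon) relative to the datum."""
    d_lat = math.radians(lat - datum_lat)
    d_lon = math.radians(lon - datum_lon)
    north = d_lat * EARTH_RADIUS_M
    east = d_lon * EARTH_RADIUS_M * math.cos(math.radians(datum_lat))
    return east, north

# One degree of latitude is roughly 111 km of northing:
east, north = fix_to_local_xy(41.0, -86.0, 40.0, -86.0)
```

For real work you would use a proper UTM conversion (which is what robot_localization’s navsat_transform_node does internally), but this captures the idea.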

Also, I’m back to studying robot_localization. Again, I’m encountering the odometry requirement.

I decided to use RMC messages from the GPS so that I can get velocity data. There’s lots more information that I need, though. For example, I’ll want to check on the (RTCM) corrections, and I must stop if I ever lose an RTK solution. I also need to push raw sentences over Ethernet to my planter monitor.

To handle this, I configured one node to listen to the GPS and another to interpret the messages. Here’s my launch file:

<launch>
  <node name="gps" pkg="nmea_navsat_driver" type="nmea_topic_serial_reader">
    <param name="port" value="/dev/gps_nmea" />
    <param name="baud" value="115200" />
  </node>

  <node name="nmea" pkg="nmea_navsat_driver" type="nmea_topic_driver">
    <param name="useRMC" value="True" />
  </node>
</launch>

Now, in addition to /fix and /vel, I get raw output on /nmea_sentence.

I am just catching up on this entire project (which I must say is really interesting!), and see that you need to fuse IMU and GPS data. If you do want to use robot_localization, note that you don’t necessarily need wheel odometry, but rather a nav_msgs/Odometry message, which is generated by an instance of ekf_localization_node. You could do something like the following:

ekf_localization_node
Inputs: IMU, nav_msgs/Odometry message containing the GPS pose (as produced by navsat_transform_node), and possibly GPS velocity (more on that in a moment)
Outputs: nav_msgs/Odometry message with your tractor’s state

navsat_transform_node
Inputs: IMU, GPS, and nav_msgs/Odometry message from EKF
Outputs: nav_msgs/Odometry message containing the GPS pose, transformed into the EKF’s world frame, and (optionally) a filtered GPS fix

You’ll note that there’s a circular dependency here, which is that the EKF has to produce an odometry message for navsat_transform_node, which is then spitting out its own odometry message that gets fed back to the EKF. The reasons for it are probably a bit much to go into here, but it will work: the EKF will get its first IMU measurement and immediately produce a state estimate, which will get fed to navsat_transform_node along with the IMU and GPS data, and navsat_transform_node will spit out a pose that is consistent with your EKF’s coordinate frame, and can be fused back into it. This lets you control your tractor based on its world frame pose (the EKF output).
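That wiring could look roughly like the launch file below. This is only a sketch: the node types and parameter names come from the robot_localization package, but the topic remappings, frame names, and offsets are assumptions for this particular tractor, and the odom0_config/imu0_config sensor matrices are omitted entirely.

```xml
<launch>
  <!-- EKF: fuses the IMU with the GPS odometry from navsat_transform_node -->
  <node name="ekf_localization" pkg="robot_localization" type="ekf_localization_node">
    <param name="world_frame" value="odom" />
    <param name="odom0" value="/odometry/gps" />
    <param name="imu0" value="/tfsensors/imu1" />
    <!-- odom0_config / imu0_config matrices omitted for brevity -->
  </node>

  <!-- Turns /fix into a nav_msgs/Odometry message in the EKF's world frame -->
  <node name="navsat_transform" pkg="robot_localization" type="navsat_transform_node">
    <!-- IMU reads 0 at magnetic north, but 0 is assumed to be east: offset by pi/2 -->
    <param name="yaw_offset" value="1.5707963" />
    <param name="magnetic_declination_radians" value="0.0" /> <!-- set for your location -->
    <remap from="imu/data" to="/tfsensors/imu1" />
    <remap from="gps/fix" to="/fix" />
    <remap from="odometry/filtered" to="/odometry/filtered" />
  </node>
</launch>
```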

There are two things that I’d want to know more about, however:

  1. If you want to use the GPS velocity data and fuse that into the EKF (which I would recommend), you’d first want to make sure that the data is in the body frame of the tractor. A cursory glance at the nmea_navsat_driver source seems to suggest that they are converting the velocity into a world frame (apologies if I am misreading that), but if you want to fuse it with the EKF, you need the body-frame velocity (i.e., your tractor’s linear velocity would always be +/- X).
  2. I’d need to know your IMU’s coordinate frame. The data sheet seems to suggest that it already assumes that the signs of the data would be robot_localization friendly (e.g., a counter-clockwise turn should correspond to an increase in yaw), but I’m assuming that your IMU reads 0 when facing magnetic north, correct? I only ask because we assume a heading of 0 for east, not north. There’s a parameter in navsat_transform_node for specifying that offset, however, along with the magnetic declination for your location.
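Point 1 above boils down to a single rotation. A minimal sketch, assuming REP-103 conventions (yaw = 0 facing east, counter-clockwise positive):

```python
import math

# Sketch: rotate a world-frame (east/north) velocity into the vehicle's body
# frame using the current yaw. A tractor driving straight ahead should end
# up with all of its speed on the body-frame +X axis.
def world_to_body_velocity(vx_east, vy_north, yaw):
    """Return (vx_body, vy_body) given world-frame velocity and yaw in radians."""
    vx_body = math.cos(yaw) * vx_east + math.sin(yaw) * vy_north
    vy_body = -math.sin(yaw) * vx_east + math.cos(yaw) * vy_north
    return vx_body, vy_body

# Tractor heading due north (yaw = pi/2), moving north at 2 m/s: all of the
# speed lands on the body-frame forward (+X) axis.
vx, vy = world_to_body_velocity(0.0, 2.0, math.pi / 2)
```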

Hi, Tom. I appreciate your explanations and suggestions. Getting the accuracy I need is becoming critical and I’m becoming anxious.

I have one of my IMUs in front of me now. Yes, it reads 0 when facing magnetic north.

I think I recall seeing a video that explained the circular dependency you mention. That makes sense to me. But this is all still very fuzzy. I need some basic help with frames.

Could you give me a simple first step or two? A launch file, for example, would be wonderful.

Thank you!

Hi Kyler, here I uploaded a starting point for using robot_localization. I am still adapting it to your system. The navsat_transform_node is already done and “just” the EKF node must be set up:

tractobots_robot_localization

More or less it is the same as what Tom proposed.

I submitted a PR against that branch with a few tweaks.

I am feeling a lot of pressure these days. I need to get in the field. I only have one dual-RTK receiver and I really don’t want to buy another one from NVS. It would be wonderful if I could get fusion working enough to make a single antenna receiver work. I greatly appreciate the help I’ve received in working toward that. However…

Today I decided to return to what I’ve done already. I realized that my tracked tractor, Tractobot02, has an IMU which assists the steering. That’s part of why it’s so easy to control. I wanted to take a similar approach to Tractobot03 - use the GPS for heading but then use the IMU for steering corrections. It simply worked.

I am still excited about ROS fusion goodness, rviz, and other great tools, but right now I’m going to use what I know in order to get in the field.

Sounds good! Ping me if you ever want to try again, and good luck!

Hi, all! A couple updates while I sit in the truck watching Tractobot02…

solosito is helping me with fusion. I’m excited about it. I feel confident that I can do what I need with my dual-antenna receiver plus IMU but I’m hopeful that I can get down to a single antenna. I now understand more of what Tom_Moore has been saying. Thanks for everyone’s help!

I got Tractobot02 back from the shop and loaded the code I’d developed for VT a couple of weeks ago. I had to make a few changes to it but I finally got it working this afternoon. It tilled about 100 acres before I quit at dark.

Tractobot02 is tracking surprisingly well. I suspect that pulling a heavy implement is helping to maintain a straight course.


Kyler:

Two thumbs up! This is so cool.

What are you using for GPS receivers? I know you’ve played with NVS, Piksi, etc.

I have a couple of Emlid Reach units. I did some testing this weekend using one Reach with NTRIP CORS VRS corrections provided by the Ohio Dept of Transportation. The Reach unit would only hold RTK fix for 2-3 minutes, then drop into float and take 2-5 minutes to get back to fix.

This isn’t good enough for future autosteer, since when it changes modes, the position also shifts 20-30 cm.

I was amazed that on my simple setup, when the system was in RTK fix, I could easily stay within about 6" of the target. This is the first time a low-cost lightbar type of system (using AgOpen) has shown promise. My goal is to move to autosteer by next year (or this fall with the combine if things come together).

Right now, I’m targeting a mechanical system, as all my equipment is too old or too small to have factory valves and I can’t financially justify adding electronic steering valves. I’m targeting one unit I can move around.

87yj,

I’m using the NVS-NV08C-RTK-A. I want to move away from it, though. I have a Piksi, REACH RS, and a couple of Piksi Multis. I am thinking that the Tersus BX306 might be the way to go, though. I’m hoping that Piksi gets their RTCM code soon.

Throw an IMU into your system. I wish I had done that sooner.

What are you using from the IMU to help with guidance?

https://www.tinkerforge.com/en/blog/new-imu-brick-imu-20/

Hee hee, not which IMU: what parameters are you using to help with guidance?

Sorry I didn’t explain that. The IMU Brick does its own fusion. I just use its heading output right now.

Are you only using the heading from the IMU, or complementary filtering with the GPS heading?

Actually, I misspoke. I forgot that I’m using the IMU for turn rate also.

But mostly I’m using the IMU heading so that I can “Turn 10 degrees left (of where you are now)” and it’ll do it. The IMU heading drifts but I use the GPS heading to determine how much I need to turn, so it’s not a big issue.
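The idea is simple enough to sketch in a few lines. This is just an illustration of the scheme described above, not my actual controller code: the absolute error comes from the drift-free GPS heading, and the turn itself is executed against the smooth but slowly drifting IMU heading.

```python
# Sketch of the "turn N degrees left of where you are now" idea: compute the
# required correction from GPS heading, then steer to an IMU heading target.
def heading_error(target_deg, current_deg):
    """Smallest signed angle from current to target, in [-180, 180)."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def imu_turn_target(imu_heading_deg, gps_heading_deg, desired_gps_heading_deg):
    """IMU heading to steer to, so that the GPS heading reaches the target."""
    correction = heading_error(desired_gps_heading_deg, gps_heading_deg)
    return (imu_heading_deg + correction) % 360.0

# GPS says we're heading 95 degrees but want 90; the IMU currently reads 100
# (it has drifted). Steer until the IMU reads 95: the IMU's drift cancels out
# because only its relative change matters.
target = imu_turn_target(100.0, 95.0, 90.0)
```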

Since you have RTK and cm-level fix accuracy, why not just use the GPS to determine your heading? I promise that’s the last question, lol.