
Tractobots, my attempts at field robots


#43

I am just catching up on this entire project (which I must say is really interesting!), and see that you need to fuse IMU and GPS data. If you do want to use robot_localization, note that you don’t necessarily need wheel odometry, but rather a nav_msgs/Odometry message, which is generated by an instance of ekf_localization_node. You could do something like the following:

ekf_localization_node
Inputs: IMU, nav_msgs/Odometry message containing the GPS pose (as produced by navsat_transform_node), and possibly GPS velocity (more on that in a moment)
Outputs: nav_msgs/Odometry message with your tractor’s state

navsat_transform_node
Inputs: IMU, GPS, and nav_msgs/Odometry message from EKF
Outputs: nav_msgs/Odometry message containing the GPS pose transformed into the EKF’s world frame, and (optionally) a filtered GPS fix

You’ll note that there’s a circular dependency here, which is that the EKF has to produce an odometry message for navsat_transform_node, which is then spitting out its own odometry message that gets fed back to the EKF. The reasons for it are probably a bit much to go into here, but it will work: the EKF will get its first IMU measurement and immediately produce a state estimate, which will get fed to navsat_transform_node along with the IMU and GPS data, and navsat_transform_node will spit out a pose that is consistent with your EKF’s coordinate frame, and can be fused back into it. This lets you control your tractor based on its world frame pose (the EKF output).
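The wiring described above can be sketched as a ROS launch file. This is only a minimal sketch, not a working configuration: the topic names (/imu/data, /gps/fix), frame names, frequencies, and declination/offset values are my assumptions and must be adapted to the tractor. The node names, parameters, and standard topics come from the robot_localization package.

```xml
<launch>
  <!-- EKF: fuses the IMU with the GPS odometry produced by navsat_transform_node -->
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization" clear_params="true">
    <param name="frequency" value="30"/>
    <param name="two_d_mode" value="true"/>
    <param name="world_frame" value="odom"/>

    <!-- Fuse IMU orientation and angular velocity -->
    <param name="imu0" value="/imu/data"/>
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   true,  true,  true,
                                   false, false, false]</rosparam>

    <!-- Fuse the x/y pose that navsat_transform_node publishes on /odometry/gps -->
    <param name="odom0" value="/odometry/gps"/>
    <rosparam param="odom0_config">[true,  true,  false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>
  </node>

  <!-- Transforms raw GPS fixes into the EKF's world frame -->
  <node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform" clear_params="true">
    <!-- Set for your location -->
    <param name="magnetic_declination_radians" value="0.0"/>
    <!-- IMU reads 0 facing north; robot_localization expects 0 = east, so offset by +pi/2 -->
    <param name="yaw_offset" value="1.5707963"/>
    <param name="zero_altitude" value="true"/>
    <remap from="/imu/data" to="/imu/data"/>
    <remap from="/gps/fix" to="/gps/fix"/>
    <remap from="/odometry/filtered" to="/odometry/filtered"/>
  </node>
</launch>
```

Note how the circular dependency shows up here: the EKF subscribes to /odometry/gps from navsat_transform_node, which in turn subscribes to the EKF's /odometry/filtered output.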

There are two things that I’d want to know more about, however:

  1. If you want to use the GPS velocity data and fuse that into the EKF (which I would recommend), you’d first want to make sure that the data is in the body frame of the tractor. A cursory glance at the nmea_navsat_driver source seems to suggest that they are converting the velocity into a world frame (apologies if I am misreading that), but if you want to fuse it with the EKF, you need the body-frame velocity (i.e., your tractor’s linear velocity would always be +/- X).
  2. I’d need to know your IMU’s coordinate frame. The data sheet seems to suggest that it already assumes that the signs of the data would be robot_localization friendly (e.g., a counter-clockwise turn should correspond to an increase in yaw), but I’m assuming that your IMU reads 0 when facing magnetic north, correct? I only ask because we assume a heading of 0 for east, not north. There’s a parameter in navsat_transform_node for specifying that offset, however, along with the magnetic declination for your location.
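To make the two conventions above concrete, here is a small sketch in plain Python (the function names are mine, not from any ROS package): converting a north-referenced, clockwise-positive compass heading to the east-referenced, counter-clockwise-positive ENU yaw, and rotating a world-frame ENU velocity into the tractor's body frame as item 1 requires.

```python
import math

def compass_to_enu_yaw(compass_deg, declination_deg=0.0):
    """Convert a compass heading (0 = magnetic north, clockwise-positive)
    to an ENU yaw in radians (0 = east, counter-clockwise-positive).
    Adding the declination first converts magnetic north to true north."""
    true_deg = compass_deg + declination_deg
    yaw = math.radians(90.0 - true_deg)  # swap north->east, flip rotation sense
    # wrap to (-pi, pi]
    return math.atan2(math.sin(yaw), math.cos(yaw))

def world_to_body_velocity(vel_east, vel_north, yaw):
    """Rotate a world-frame (ENU) velocity into the body frame,
    where +X points out the front of the tractor."""
    cos_y, sin_y = math.cos(yaw), math.sin(yaw)
    vx_body = cos_y * vel_east + sin_y * vel_north
    vy_body = -sin_y * vel_east + cos_y * vel_north
    return vx_body, vy_body

# Facing magnetic north with zero declination -> ENU yaw of +pi/2
print(compass_to_enu_yaw(0.0))  # 1.5707963267948966
# Driving straight north while facing north -> all velocity on body +X
print(world_to_body_velocity(0.0, 1.0, math.pi / 2))
```

This is exactly the bookkeeping that the yaw_offset and magnetic_declination_radians parameters of navsat_transform_node exist to absorb.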

#44

Hi, Tom. I appreciate your explanations and suggestions. Getting the accuracy I need is becoming critical and I’m becoming anxious.

I have one of my IMUs in front of me now. Yes, it reads 0 when facing magnetic north.

I think I recall seeing a video that explained the circular dependency you mention. That makes sense to me. But this is all still very fuzzy. I need some basic help with frames.

Could you give me a simple first step or two? A launch file, for example, would be wonderful.

Thank you!


#45

Hi Kyler, I uploaded a starting point for using robot_localization here. I am still adapting it to your system. The navsat_transform_node is already done; “just” the EKF node remains to be set up:

tractobots_robot_localization

More or less it is the same as what Tom proposed.


#46

I submitted a PR against that branch with a few tweaks.


#47

I am feeling a lot of pressure these days. I need to get in the field. I only have one dual-RTK receiver and I really don’t want to buy another one from NVS. It would be wonderful if I could get fusion working enough to make a single antenna receiver work. I greatly appreciate the help I’ve received in working toward that. However…

Today I decided to return to what I’ve done already. I realized that my tracked tractor, Tractobot02, has an IMU which assists the steering. That’s part of why it’s so easy to control. I wanted to take a similar approach to Tractobot03 - use the GPS for heading but then use the IMU for steering corrections. It simply worked.

I am still excited about ROS fusion goodness, rviz, and other great tools, but right now I’m going to use what I know in order to get in the field.


#48

Sounds good! Ping me if you ever want to try again, and good luck!


#49

Hi, all! A couple updates while I sit in the truck watching Tractobot02…

solosito is helping me with fusion. I’m excited about it. I feel confident that I can do what I need with my dual-antenna receiver plus IMU but I’m hopeful that I can get down to a single antenna. I now understand more of what Tom_Moore has been saying. Thanks for everyone’s help!

I got Tractobot02 back from the shop and loaded the code I’d developed for VT a couple weeks ago. I had to make a few changes to it, but I finally got it working this afternoon. It tilled about 100 acres before I quit at dark.

Tractobot02 is tracking surprisingly well. I suspect that pulling a heavy implement is helping to maintain a straight course.


#50

Kyler:

Two thumbs up! This is so cool.

What are you using for GPSs? I know you’ve played with NVS, Piksi, etc.

I have a couple Emlid Reach units. I did some testing this weekend using one Reach with NTRIP CORS VRS corrections provided by the Ohio Dept of Transportation. The Reach unit would only hold RTK fix for 2-3 min, then drop into float and take 2-5 min to get back to fix.

This isn’t good enough for future autosteer, since when it changes modes, the position also shifts 20-30 cm.

I was amazed that on my simple setup, when the system was in RTK fix, I could easily stay within about 6" of target. This is the first time a low-cost lightbar type of system (using AgOpen) has shown promise. My goal is to move to autosteer by next year (or this fall with the combine, if things come together).

Right now, I’m targeting a mechanical system, as all my equipment is too old or too small to have factory valves and I can’t financially justify adding electronic steering valves. I’m targeting one unit I can move around.


#51

87yj,

I’m using the NVS-NV08C-RTK-A. I want to move away from it, though. I have a Piksi, REACH RS, and a couple of Piksi Multis. I am thinking that the Tersus BX306 might be the way to go, though. I’m hoping that Piksi gets their RTCM code soon.

Throw an IMU into your system. I wish I had done that sooner.


#52

What are you using from the IMU to help with guidance?


#53

https://www.tinkerforge.com/en/blog/new-imu-brick-imu-20/


#54

Hee hee, not which IMU; what parameters are you using to help with guidance?


#55

Sorry I didn’t explain that. The IMU Brick does its own fusion. I just use its heading output right now.


#56

Are you only using the heading from the IMU, or complementary filtering with the GPS heading?


#57

Actually, I misspoke. I forgot that I’m using the IMU for turn rate also.

But mostly I’m using the IMU heading so that I can “Turn 10 degrees left (of where you are now)” and it’ll do it. The IMU heading drifts but I use the GPS heading to determine how much I need to turn, so it’s not a big issue.
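The “turn 10 degrees left of where you are now” scheme can be sketched like this (a toy illustration in plain Python; the function names and gains are hypothetical, not Kyler's actual code): the relative target is set once against the current IMU heading, and the smooth but drifting IMU heading plus its turn rate close the loop, with the GPS heading correcting the target between updates.

```python
def heading_error(target_deg, current_deg):
    """Signed shortest-path heading difference, wrapped to (-180, 180]."""
    return (target_deg - current_deg + 180.0) % 360.0 - 180.0

def steer_command(target_deg, imu_heading_deg, imu_turn_rate_dps,
                  kp=0.05, kd=0.02):
    """Toy P-D steering: proportional on heading error, damped by the
    IMU's turn rate. Gains are placeholders, not tuned values."""
    err = heading_error(target_deg, imu_heading_deg)
    return kp * err - kd * imu_turn_rate_dps

# "Turn 10 degrees left of where you are now", with IMU heading = 355:
target = (355.0 - 10.0) % 360.0      # 345
print(heading_error(target, 355.0))  # -10.0
```

Because only heading *differences* matter here, a slow IMU drift cancels out of the error term, which is why the drift is tolerable as long as GPS periodically re-anchors the absolute target.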


#58

Since you have RTK and cm-level fix accuracy, why not just use the GPS to determine your heading? Promise, that’s the last question lol.


#59

I need an accurate heading and a high refresh rate. RTK doesn’t give me either. (If I were smarter, I’d use kinematic models, but…)


#60

The Tinkerforge IMU 2 looks interesting. What ROS package are you using for it? So far I’ve only seen tinkerforge_laser_transform and another package from the same author.
But I’m not happy with the node’s covariance matrices (all 3x3 values are 0.1).

I’m not an expert on covariance, but having all three 3x3 matrices filled with the same value looks a bit arbitrary to me. Also, I don’t see any dependency of one axis on another.


#61

Humpelstilzchen,
The Tinkerforge ROS package is now much more complete:

I ended up just connecting directly to brickd, though.


#62

I like boring robots. Tractobot03 is approaching that stage. Here is a boring video of it planting:

Tractobot03 has planted 122 acres this week. I have not driven it locally for any planting. It was painfully slow and tedious for a couple days. I had to enter each line into my code, ensuring that it went the correct way, start following the line, drop the planter, and start the fertilizer. Then I had to raise the planter, stop the fertilizer, turn, then load a new line.

Today felt much better. It’s running pretty much on its own. I even ran home once while it traveled from the far end of the field. The main problem now is that I lose GPS periodically. At least I have it programmed to stop when that happens. Then I back up and resume.

Because of the pressure to plant, I fell back on my old navigation code. My lines are not at all straight. I’m calling them “heritage lines” - like my grandfather planted. I look forward to improving them.