Now that I’ve had a taste of working with the IMU, I desperately want fusion. I need remedial help.
One of the issues I’ve had is that I’ve wanted to incorporate the yaw and roll data from the dual GPS. I’m leaning hard toward switching to a single GPS(/GNSS) receiver. That should make my configuration a bit more standard.
Today I installed nmea_navsat_driver and the TinkerForge sensor driver. I now have /fix, /vel, and /tfsensors/imu1 (among others). Unfortunately, my GPS went sour last night and I can’t get RTK now. I’ll be debugging it for a while.
I’m looking at the steps that xqms suggests. Converting the /fix data to UTM seems straightforward enough but the transforms are completely beyond me. I would love to have a simple concrete step to take toward this.
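As a simple concrete first step, here is a sketch of going from a /fix lat/lon to planar meters. This is not true UTM (for that you'd use a library such as `utm` or `pyproj`); it's a flat-earth local tangent-plane approximation, which is accurate to well under a centimeter over field-sized areas. The datum point is hypothetical; you'd use a fix near your own field.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius; fine for a local approximation

def fix_to_local_xy(lat, lon, lat0, lon0):
    """Convert a GPS fix (degrees) to meters east/north of datum (lat0, lon0)."""
    d_lat = math.radians(lat - lat0)
    d_lon = math.radians(lon - lon0)
    x = d_lon * math.cos(math.radians(lat0)) * EARTH_RADIUS_M  # east
    y = d_lat * EARTH_RADIUS_M                                 # north
    return x, y

# Moving 0.001 degrees north is roughly 111 m anywhere on Earth.
x, y = fix_to_local_xy(40.001, -83.0, 40.0, -83.0)
```

The point is just that once the fix is in meters, the transform machinery (and navsat_transform_node, discussed below) has something planar to work with.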
Also, I’m back to studying robot_localization. Again, I’m encountering the odometry requirement.
I decided to use RMC messages from the GPS so that I can get velocity data. There’s lots more information that I need, though. For example, I’ll want to check on the (RTCM) corrections, and I must stop if I ever lose an RTK solution. I also need to push raw sentences over Ethernet to my planter monitor.
To handle this, I configured one node to listen to the GPS and another to interpret the messages. Here’s my launch file:
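The original launch file isn't reproduced here, but a minimal sketch of that two-node split, assuming nmea_navsat_driver's serial-reader/parser pair (port and baud are placeholders), might look like:

```xml
<launch>
  <!-- Node 1: listen to the GPS serial port and publish raw NMEA sentences -->
  <node pkg="nmea_navsat_driver" type="nmea_topic_serial_reader" name="gps_reader">
    <param name="port" value="/dev/ttyUSB0"/>
    <param name="baud" value="115200"/>
  </node>

  <!-- Node 2: interpret the sentences and publish /fix and /vel -->
  <node pkg="nmea_navsat_driver" type="nmea_topic_driver" name="gps_parser"/>
</launch>
```

With the raw sentences on their own topic, a third node could watch for RTK status and forward sentences over Ethernet without touching the parser.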
I am just catching up on this entire project (which I must say is really interesting!), and see that you need to fuse IMU and GPS data. If you do want to use robot_localization, note that you don’t necessarily need wheel odometry, but rather a nav_msgs/Odometry message, which is generated by an instance of ekf_localization_node. You could do something like the following:
ekf_localization_node
- Inputs: IMU; a nav_msgs/Odometry message containing the GPS pose (as produced by navsat_transform_node); and possibly GPS velocity (more on that in a moment)
- Outputs: a nav_msgs/Odometry message with your tractor’s state

navsat_transform_node
- Inputs: IMU, GPS, and the nav_msgs/Odometry message from the EKF
- Outputs: a nav_msgs/Odometry message containing the GPS pose, transformed into the EKF’s world frame, and (optionally) a filtered GPS fix
You’ll note that there’s a circular dependency here, which is that the EKF has to produce an odometry message for navsat_transform_node, which is then spitting out its own odometry message that gets fed back to the EKF. The reasons for it are probably a bit much to go into here, but it will work: the EKF will get its first IMU measurement and immediately produce a state estimate, which will get fed to navsat_transform_node along with the IMU and GPS data, and navsat_transform_node will spit out a pose that is consistent with your EKF’s coordinate frame, and can be fused back into it. This lets you control your tractor based on its world frame pose (the EKF output).
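A minimal sketch of that wiring as a launch file (all parameter values are placeholders, and the topic names are assumed from earlier in the thread):

```xml
<launch>
  <node pkg="robot_localization" type="ekf_localization_node" name="ekf_localization">
    <param name="frequency" value="30"/>
    <param name="two_d_mode" value="true"/>

    <!-- IMU: fuse orientation and angular velocity -->
    <param name="imu0" value="/tfsensors/imu1"/>
    <rosparam param="imu0_config">[false, false, false,
                                   true,  true,  true,
                                   false, false, false,
                                   true,  true,  true,
                                   false, false, false]</rosparam>

    <!-- GPS pose from navsat_transform_node: fuse x, y -->
    <param name="odom0" value="/odometry/gps"/>
    <rosparam param="odom0_config">[true,  true,  false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false,
                                    false, false, false]</rosparam>
  </node>

  <node pkg="robot_localization" type="navsat_transform_node" name="navsat_transform">
    <param name="magnetic_declination_radians" value="0.0"/>
    <!-- IMU reads 0 facing north; the node wants 0 = east, hence pi/2 -->
    <param name="yaw_offset" value="1.5707963"/>
    <remap from="imu/data" to="/tfsensors/imu1"/>
    <remap from="gps/fix" to="/fix"/>
  </node>
</launch>
```

The circular dependency is handled by the topics: the EKF publishes odometry/filtered, which navsat_transform_node consumes, and navsat_transform_node publishes odometry/gps, which feeds back in as odom0.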
There are two things that I’d want to know more about, however:
If you want to use the GPS velocity data and fuse that into the EKF (which I would recommend), you’d first want to make sure that the data is in the body frame of the tractor. A cursory glance at the nmea_navsat_driver source seems to suggest that they are converting the velocity into a world frame (apologies if I am misreading that), but if you want to fuse it with the EKF, you need the body-frame velocity (i.e., on the tractor, your tractor’s linear velocity would always be +/- X).
I’d need to know your IMU’s coordinate frame. The data sheet seems to suggest that it already assumes that the signs of the data would be robot_localization friendly (e.g., a counter-clockwise turn should correspond to an increase in yaw), but I’m assuming that your IMU reads 0 when facing magnetic north, correct? I only ask because we assume a heading of 0 for east, not north. There’s a parameter in navsat_transform_node for specifying that offset, however, along with the magnetic declination for your location.
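Both of these points are small frame conversions; here is a sketch of each, under assumptions stated in the comments (the world frame is ENU, and the compass heading is 0 at magnetic north, clockwise positive):

```python
import math

# Assumptions:
#  - world frame is ENU (x = east, y = north); yaw 0 = east, CCW positive
#  - compass heading is 0 = magnetic north, clockwise positive

def compass_to_enu_yaw_deg(heading_mag_deg, declination_deg):
    """Magnetic compass heading (degrees) -> ENU yaw (degrees)."""
    heading_true = heading_mag_deg + declination_deg
    return (90.0 - heading_true) % 360.0

def world_to_body_velocity(vx_east, vy_north, yaw_rad):
    """Rotate an ENU-frame velocity into the vehicle body frame."""
    forward = vx_east * math.cos(yaw_rad) + vy_north * math.sin(yaw_rad)
    lateral = -vx_east * math.sin(yaw_rad) + vy_north * math.cos(yaw_rad)
    return forward, lateral

# Facing magnetic north with zero declination: ENU yaw is 90 degrees.
yaw_deg = compass_to_enu_yaw_deg(0.0, 0.0)

# Driving straight north at 5 m/s while facing north:
# the body-frame velocity should be all forward (+X), no lateral.
fwd, lat = world_to_body_velocity(0.0, 5.0, math.radians(yaw_deg))
```

This is the same rotation navsat_transform_node's yaw_offset and magnetic-declination parameters encapsulate; the sketch just makes the signs visible.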
Hi, Tom. I appreciate your explanations and suggestions. Getting the accuracy I need is becoming critical and I’m becoming anxious.
I have one of my IMUs in front of me now. Yes, it reads 0 when facing magnetic north.
I think I recall seeing a video that explained the circular dependency you mention. That makes sense to me. But this is all still very fuzzy. I need some basic help with frames.
Could you give me a simple first step or two? A launch file, for example, would be wonderful.
Hi Kyler, I uploaded a starting point for using robot_localization here. I am still adapting it to your system. The navsat_transform_node is already done; “just” the EKF node remains to be set up:
I am feeling a lot of pressure these days. I need to get in the field. I only have one dual-RTK receiver and I really don’t want to buy another one from NVS. It would be wonderful if I could get fusion working enough to make a single antenna receiver work. I greatly appreciate the help I’ve received in working toward that. However…
Today I decided to return to what I’ve done already. I realized that my tracked tractor, Tractobot02, has an IMU which assists the steering. That’s part of why it’s so easy to control. I wanted to take a similar approach to Tractobot03 - use the GPS for heading but then use the IMU for steering corrections. It simply worked.
I am still excited about ROS fusion goodness, rviz, and other great tools, but right now I’m going to use what I know in order to get in the field.
Hi, all! A couple updates while I sit in the truck watching Tractobot02…
solosito is helping me with fusion. I’m excited about it. I feel confident that I can do what I need with my dual-antenna receiver plus IMU but I’m hopeful that I can get down to a single antenna. I now understand more of what Tom_Moore has been saying. Thanks for everyone’s help!
I got Tractobot02 back from the shop and loaded the code I’d developed for VT a couple weeks ago. I had to make a few changes to it, but I finally got it working this afternoon. It tilled about 100 acres before I quit at dark.
Tractobot02 is tracking surprisingly well. I suspect that pulling a heavy implement is helping to maintain a straight course.
What are you using for GPSs? I know you’ve played with NVS, Piksi, etc.
I have a couple Emlid Reach units. I did some testing this weekend using one Reach with NTRIP CORS VRS corrections provided by the Ohio Dept of Transportation. The Reach unit would only hold RTK fix for 2-3 minutes, then drop into float and take 2-5 minutes to get back to fix.
This isn’t good enough for future autosteer, since when it changes modes the position also shifts 20-30 cm.
I was amazed that on my simple setup, when the system was in RTK fix, I could easily stay within about 6" of the target. This is the first time a low-cost lightbar type of system (using AgOpen) has shown promise. My goal is to move to autosteer by next year (or this fall with the combine if things come together).
Right now, I’m targeting a mechanical system, as all my equipment is too old or too small to have factory valves and I can’t financially justify adding electronic steering valves. I’m targeting one unit I can move around.
I’m using the NVS-NV08C-RTK-A. I want to move away from it, though. I have a Piksi, REACH RS, and a couple of Piksi Multis. I am thinking that the Tersus BX306 might be the way to go, though. I’m hoping that Piksi gets their RTCM code soon.
Throw an IMU into your system. I wish I had done that sooner.
Actually, I misspoke. I forgot that I’m using the IMU for turn rate also.
But mostly I’m using the IMU heading so that I can “Turn 10 degrees left (of where you are now)” and it’ll do it. The IMU heading drifts but I use the GPS heading to determine how much I need to turn, so it’s not a big issue.
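A minimal sketch of that idea (not Kyler's actual code): treat the IMU heading as fast but drifting, and re-zero its bias whenever a trustworthy GPS heading arrives, so that relative commands like "turn 10 degrees left" stay meaningful.

```python
def wrap_deg(angle):
    """Wrap an angle in degrees to (-180, 180]."""
    return (angle + 180.0) % 360.0 - 180.0

class HeadingController:
    """Fuse a drifting IMU heading with an absolute GPS heading."""

    def __init__(self):
        self.imu_offset = 0.0  # gps_heading - imu_heading, updated per GPS fix

    def on_gps_heading(self, gps_heading, imu_heading):
        # GPS heading is drift-free, so use it to re-zero the IMU bias.
        self.imu_offset = wrap_deg(gps_heading - imu_heading)

    def corrected_heading(self, imu_heading):
        return wrap_deg(imu_heading + self.imu_offset)

ctrl = HeadingController()
ctrl.on_gps_heading(gps_heading=90.0, imu_heading=85.0)  # IMU has drifted 5 deg
h = ctrl.corrected_heading(95.0)  # IMU says 95; true heading is about 100
```

Between GPS fixes the IMU drives steering at its full update rate; the correction only has to happen as often as the drift matters.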
The Tinkerforge IMU 2 looks interesting. What ROS package are you using for it? So far I’ve only seen tinkerforge_laser_transform and another package from the same author.
But I’m not happy with the node’s covariance matrices (all 3x3 values are 0.1).
I’m not an expert on covariance, but having all three 3x3 matrices filled with the same value looks a bit arbitrary to me. Also, I don’t see any dependency of one axis on another.
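For illustration, here is what a less arbitrary orientation covariance might look like, assuming the ROS convention: a row-major 3x3 matrix over (roll, pitch, yaw), with variances (sigma squared, in rad^2) on the diagonal and cross-axis correlations off-diagonal. The sigma values below are made up; real ones should come from the sensor datasheet or from logging the sensor while it sits still.

```python
import math

# Hypothetical 1-sigma orientation errors (NOT from the Tinkerforge datasheet):
sigma_roll = math.radians(1.0)   # accelerometer-aided axes tend to be better...
sigma_pitch = math.radians(1.0)
sigma_yaw = math.radians(2.5)    # ...than magnetometer-based yaw

# Row-major 3x3 covariance over (roll, pitch, yaw); zeros off-diagonal
# assert "no modeled cross-axis dependency", which is itself a choice.
orientation_covariance = [
    sigma_roll ** 2, 0.0, 0.0,
    0.0, sigma_pitch ** 2, 0.0,
    0.0, 0.0, sigma_yaw ** 2,
]
```

Even a diagonal matrix with honest per-axis variances tells a downstream filter like robot_localization far more than a uniform 0.1 everywhere.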