Tractobots, my attempts at field robots

I assume the desire for odometry is to allow use of existing ROS
packages for navigation or other purposes. Many require odometry as an
input. I was working on my slightly smaller robots last year and ran
into that requirement. I got distracted by a competition using simulation,
so I didn't follow through, although I did purchase motors with encoders.

A quick search found some GPS-to-odometry discussions, so something
suitable is likely available.

73,
Rud Merriam K5RUD
Mystic Lake Software, http://mysticlakesoftware.com/

Thanks for the help, xqms! I’ll try to respond to your points in order…

One of the best parts about my new kill switch system is that I can program it. Yes, I originally made it so that there was a heartbeat, and if the switch was hit, the heartbeat stopped and the tractor was killed. I switched to transmitting only a kill signal for a couple of reasons. The main one is that I wanted to save the battery. I don't know how long it will last if I'm transmitting all of the time, but I sure don't want to run out of battery at the end of a day and be tempted to bypass the kill switch. I also plan to have several kill switches operating several robots which could be miles apart, and I want kill switches to be able to move in and out of range without killing tractors. I am considering having one heartbeat switch that stays in the field with the tractor (connected to a power source like the command center) and then other switches that send signals only when activated. I like keeping it simple for now, though.

I do not use a localization filter. I work directly from the GNSS data, which is plenty accurate for me. I would love to fuse an IMU, though, so I can get higher refresh rates and lower latency. Yes, “direct tf transforms” sounds like my current need.

Yes, I get course and pitch from the receivers. Here’s the data I get about the base line to the second antenna:

  1. Latitude-projection of base-line, m
    
  2. Longitude-projection of base-line, m
    
  3. Height-projection of base-line, m
    
  4. Base-line length (Rover-to-Base distance), m
    
  5. Base-line course (angle between base-line vector and North direction), degrees
    
  6. Base-line pitch (angle between base-line vector and horizontal), degrees
    

Right now I'm using nvector for all of my calculations. I've thought about switching to UTM, but I'm a little concerned about conversion errors, especially in fields that are over a mile in one dimension. I suppose that if I calculate my lines in nvector and then convert them to UTM, I should be OK.
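
For example, here's the kind of round-trip sanity check I have in mind, using the Python utm package (untested sketch; the coordinates are made up):

    import utm  # the "Bidirectional UTM-WGS84 converter" pip package

    # A made-up fix in UTM zone 16
    lat, lon = 40.1234567, -88.7654321

    easting, northing, zone, letter = utm.from_latlon(lat, lon)
    lat2, lon2 = utm.to_latlon(easting, northing, zone, letter)

    # Round-trip error in degrees; 1e-9 degrees is roughly 0.1 mm
    print(abs(lat - lat2), abs(lon - lon2))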

Alright, modeling the robot in URDF seems to be the place to start. I’ve tried it a few times and not gotten far. I’m sure it’s something I can do. I will give it a shot.

The next steps overwhelm me right now. Good to know I’m on the right path, though.

Thank you!

Rud,

Yes, it seems like the tools I find always want odometry. I have read some discussions about using GPS only, but they get complicated quickly. I suspect the issue is that typical GPS data is very noisy, so more input is needed to produce a usable position. I already have very usable position and pose data.

We need a kill switch as well, and we found that the only suitable wireless kill switches start close to 1000€. Thus we also decided to build our own kill switch (using a heartbeat plus an active kill signal, for the reasons below). We also ordered Feather boards and will use an 868 MHz signal. We calculated (most pessimistic assessment, heartbeat at 100 Hz) that our 2200 mAh battery will last at least 10 hours. Judging from your picture, I would assume that you have a larger battery, so you should have no problem sending a heartbeat signal at 10 Hz for the whole day.

Yes, it is a different category, but the run-stop of our PR2 runs for about a year on 4 AA cells. It's a bit clumsy, but I would never trade safety for more battery time, especially when running a tractor.

We plan to put the code, parts list, and 3D-printable model on Thingiverse or an open git repo once our student is done building it.

Heartbeat: obviously required, in case the battery runs out or the signal goes out of range.
Active kill signal: in case someone duplicates the heartbeat signal, the active kill signal blocks the heartbeat; a reset switch will be placed on the receiver. This way, multiple kill switches can be used with a single receiver, as well as multiple receivers with one kill switch (the signal will carry an ID to avoid interference between different setups).
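
The receiver logic boils down to something like this (a rough Python sketch of the idea; the actual Feather firmware will of course differ, and PAIRED_IDS is a placeholder):

    import time

    HEARTBEAT_TIMEOUT = 0.5      # seconds without a heartbeat before we kill
    PAIRED_IDS = {"switch-01"}   # IDs we accept, to ignore other setups

    latched_kill = False         # set by an active kill signal
    last_heartbeat = time.monotonic()

    def on_packet(kind, switch_id):
        """Handle one received radio packet."""
        global latched_kill, last_heartbeat
        if switch_id not in PAIRED_IDS:
            return                          # someone else's switch
        if kind == "kill":
            latched_kill = True             # blocks the heartbeat until reset
        elif kind == "heartbeat" and not latched_kill:
            last_heartbeat = time.monotonic()

    def reset_button():
        """Physical reset switch on the receiver clears the latch."""
        global latched_kill
        latched_kill = False

    def motors_allowed():
        return (not latched_kill and
                time.monotonic() - last_heartbeat < HEARTBEAT_TIMEOUT)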

I didn’t know about the n-vector representation - cool!

It seems that averaging positions in n-vector (i.e. finding the center between the two antennas) is just the normalized mean of the two vectors: http://www.navlab.net/nvector/#example_7

I would use that to calculate the position of a virtual antenna in the middle of the two.
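
In code, that midpoint is just a few lines with the nvector package's functional API (untested sketch; the coordinates are made up):

    import numpy as np
    import nvector as nv

    # Made-up antenna fixes; substitute the two RTK solutions
    lat1, lon1 = np.deg2rad(40.00010), np.deg2rad(-88.99990)
    lat2, lon2 = np.deg2rad(40.00000), np.deg2rad(-89.00010)

    n1 = nv.lat_lon2n_E(lat1, lon1)  # n-vector of antenna 1
    n2 = nv.lat_lon2n_E(lat2, lon2)  # n-vector of antenna 2

    # Normalized mean of the two n-vectors = horizontal midpoint (example 7)
    n_mid = nv.unit(n1 + n2)

    lat_mid, lon_mid = nv.n_E2lat_lon(n_mid)
    print(np.rad2deg(lat_mid), np.rad2deg(lon_mid))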

As @tfoote suggested, it is extremely unusual to have GPS antennas on a movable part of any robot. You really want an exactly known offset between your antenna(s) and the point at which you control your tractor (probably the center of the rear axle, i.e. base_link). If you use a very precise GPS (< 5 cm), then you can very well continue using it alone. That is how we did navigation here: https://www.youtube.com/watch?v=GhPAgYRJ5l8.

You can use this function to convert your lat/lon into UTM: https://github.com/cra-ros-pkg/robot_localization/blob/kinetic-devel/include/robot_localization/navsat_conversions.h#L185.

Since you have two antennas, you will be able to infer the rotation of the tractor, as you already figured out above. Otherwise, @xqms gave you all the right steps.

If you need some help with the initial URDF and TF transforms I can find some time next weekend.

xqms,

Yes, n-vector has been a lifesaver for me. I can cobble together very useful code from the recipes given.

I was thinking I'd use the interpolation example (#6) so that I could easily handle an offset in case the virtual antenna isn't quite centered.
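
Something along these lines, I think (untested; t = 0.5 is the exact middle, anything else shifts the virtual antenna toward one end):

    import nvector as nv

    def virtual_antenna(n1, n2, t=0.5):
        """Interpolate between two antenna n-vectors (recipe #6) and
        renormalize back onto the sphere. t=0 gives antenna 1, t=1 antenna 2."""
        return nv.unit((1.0 - t) * n1 + t * n2)

    # e.g. virtual_antenna(n1, n2, t=0.48) for a slightly off-center mount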

Kyler, here is the UTM conversion I use. It works to the millimeter, even converting back. If you want the reverse conversion, I can post that too. It's written in C# but can easily be changed to whatever language you need.

    // WGS-84 semi-major and semi-minor axes (meters) and the UTM scale factor
    private double sm_a = 6378137.0;
    private double sm_b = 6356752.314;
    private double UTMScaleFactor = 0.9996;

    // Set latitude/longitude (decimal degrees) before calling DecDeg2UTM()
    private double latitude, longitude, zone;
    private double actualEasting, actualNorthing;

    public void DecDeg2UTM()
    {
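        // 1/6 = 0.1666... selects the 6-degree UTM zone;
        // 0.0174532925... converts degrees to radians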
        zone = Math.Floor((longitude + 180.0) * 0.16666666666666666666666666666667) + 1;
        GeoUTMConverterXY(latitude * 0.01745329251994329576923690766743, 
                            longitude * 0.01745329251994329576923690766743);
    }


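    // Meridian arc length from the equator to latitude phi (radians), in meters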
    private double ArcLengthOfMeridian(double phi)
    {
        double alpha, beta, gamma, delta, epsilon, n;
        double result;
        n = (sm_a - sm_b) / (sm_a + sm_b);
        alpha = ((sm_a + sm_b) / 2.0) * (1.0 + (Math.Pow(n, 2.0) / 4.0) + (Math.Pow(n, 4.0) / 64.0));
        beta = (-3.0 * n / 2.0) + (9.0 * Math.Pow(n, 3.0) / 16.0) + (-3.0 * Math.Pow(n, 5.0) / 32.0);
        gamma = (15.0 * Math.Pow(n, 2.0) / 16.0)
            + (-15.0 * Math.Pow(n, 4.0) / 32.0);
        delta = (-35.0 * Math.Pow(n, 3.0) / 48.0)
            + (105.0 * Math.Pow(n, 5.0) / 256.0);
        epsilon = (315.0 * Math.Pow(n, 4.0) / 512.0);
        result = alpha
            * (phi + (beta * Math.Sin(2.0 * phi))
                + (gamma * Math.Sin(4.0 * phi))
                + (delta * Math.Sin(6.0 * phi))
                + (epsilon * Math.Sin(8.0 * phi)));

        return result;
    }

    private double UTMCentralMeridian(double zone)
    {
        return ((-183.0 + (zone * 6.0)) * 0.01745329251994329576923690766743);
    }

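    // Transverse Mercator series: projects (phi, lambda), in radians, about the
    // central meridian lambda0; returns unscaled easting/northing in meters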
    private double[] MapLatLonToXY(double phi, double lambda, double lambda0)
    {
        double[] xy = new double[2];
        double N, nu2, ep2, t, t2, l;
        double l3coef, l4coef, l5coef, l6coef, l7coef, l8coef;
        ep2 = (Math.Pow(sm_a, 2.0) - Math.Pow(sm_b, 2.0)) / Math.Pow(sm_b, 2.0);
        nu2 = ep2 * Math.Pow(Math.Cos(phi), 2.0);
        N = Math.Pow(sm_a, 2.0) / (sm_b * Math.Sqrt(1 + nu2));
        t = Math.Tan(phi);
        t2 = t * t;
        l = lambda - lambda0;
        l3coef = 1.0 - t2 + nu2;
        l4coef = 5.0 - t2 + 9 * nu2 + 4.0 * (nu2 * nu2);
        l5coef = 5.0 - 18.0 * t2 + (t2 * t2) + 14.0 * nu2 - 58.0 * t2 * nu2;
        l6coef = 61.0 - 58.0 * t2 + (t2 * t2) + 270.0 * nu2 - 330.0 * t2 * nu2;
        l7coef = 61.0 - 479.0 * t2 + 179.0 * (t2 * t2) - (t2 * t2 * t2);
        l8coef = 1385.0 - 3111.0 * t2 + 543.0 * (t2 * t2) - (t2 * t2 * t2);

        /* Calculate easting (x) */
        xy[0] = N * Math.Cos(phi) * l
            + (N / 6.0 * Math.Pow(Math.Cos(phi), 3.0) * l3coef * Math.Pow(l, 3.0))
            + (N / 120.0 * Math.Pow(Math.Cos(phi), 5.0) * l5coef * Math.Pow(l, 5.0))
            + (N / 5040.0 * Math.Pow(Math.Cos(phi), 7.0) * l7coef * Math.Pow(l, 7.0));

        /* Calculate northing (y) */
        xy[1] = ArcLengthOfMeridian(phi)
            + (t / 2.0 * N * Math.Pow(Math.Cos(phi), 2.0) * Math.Pow(l, 2.0))
            + (t / 24.0 * N * Math.Pow(Math.Cos(phi), 4.0) * l4coef * Math.Pow(l, 4.0))
            + (t / 720.0 * N * Math.Pow(Math.Cos(phi), 6.0) * l6coef * Math.Pow(l, 6.0))
            + (t / 40320.0 * N * Math.Pow(Math.Cos(phi), 8.0) * l8coef * Math.Pow(l, 8.0));

        return xy;
    }


    private void GeoUTMConverterXY(double lat, double lon)
    {
        double[] xy = MapLatLonToXY(lat, lon, UTMCentralMeridian(zone));

        xy[0] = xy[0] * UTMScaleFactor + 500000.0;
        xy[1] = xy[1] * UTMScaleFactor;
        if (xy[1] < 0.0)
            xy[1] = xy[1] + 10000000.0;

        //keep a copy of actual easting and northings
        actualEasting = xy[0];
        actualNorthing = xy[1];
    }

In terms of the position of your hitch etc., most robots are nowhere near as large as the project you have here, so there are a couple of considerations.

  1. You need a pivot distance, i.e. the distance from the back wheels (pivot point) to your antenna. Your heading will be based on the movement of the antenna, and since the antenna is not on the pivot point of your vehicle, it will sweep an arc that is larger than the pivot point's arc.

  2. The hitch point of your vehicle should be based on the distance from your pivot point to the hitch pin. In your tractor example, note that the hitch moves in the opposite direction from your antenna.

Some example code to easily determine position based on your antenna heading in UTM…

        //translate world to the pivot axle
        pivotAxleEasting = pn.easting - (Math.Sin(fixHeading) * vehicle.antennaPivot);
        pivotAxleNorthing = pn.northing - (Math.Cos(fixHeading) * vehicle.antennaPivot);

        //determine where the rigid vehicle hitch ends
        hitchEasting = pn.easting + (Math.Sin(fixHeading) * (vehicle.hitchLength - vehicle.antennaPivot));
        hitchNorthing = pn.northing + (Math.Cos(fixHeading) * (vehicle.hitchLength - vehicle.antennaPivot));
  3. The resulting angle of trailing equipment and its position are based on a following algorithm, like a trailer behind a truck. It's a constantly decaying angle as the pulling vehicle moves forward. The trailing hitch length needs to be known, of course; once the heading of your implement is known, and since you know the implement's width, you can find the extremes of the implement using the same sin/cos method as above.

Some example code

                double t = (vehicle.toolTrailingHitchLength) / distanceCurrentStepFix;
                toolEasting = hitchEasting + t * (hitchEasting - toolEasting);
                toolNorthing = hitchNorthing + t * (hitchNorthing - toolNorthing);
                fixHeadingImplement = Math.Atan2(hitchEasting - toolEasting, hitchNorthing - toolNorthing);    

Again, while it's in C#, it's easily translated to whatever you need.

Hope this helps your quest.

Once you've converted both antenna lat/lons to UTM, get your heading by taking the Ant1 and Ant2 eastings and northings, subtracting them, and taking Atan2 of the result. Depending on which antenna you choose as the first one, either add or subtract pi/2 to get your vehicle direction heading; taking right antenna minus left antenna, you add. This assumes your positive northing is directly north and your angle increases clockwise. I've found that doing things navigationally rather than purely Cartesian is much easier.

fixHeading = Math.Atan2(ant1.easting - ant2.easting, ant1.northing - ant2.northing) + Math.PI / 2.0;

Your “virtual single antenna position” then is

eastingVirtual = (ant1.easting + ant2.easting)/2
northingVirtual = (ant1.northing + ant2.northing)/2

This, along with the heading, can now be used for distances to the pivot point, hitch pin, implement, loader bucket, whatever. Just make sure to keep the signs consistent depending on whether a point is ahead of or behind the virtual antenna.

*** Kyler, make sure the code you are using is accurate in the first place. I tried about 10 different UTM converters, none of which could get back to the original lat/lon. The one I posted above was able to return to within a millimeter (8 digits) of the original.

Converting to UTM and doing simple trig calculations would be vastly easier for me. I don't live near the equator or a pole, but I'm still concerned about errors across a 1.5-mile field. I especially want to be accurate enough to work in the same field with machinery running standard (Deere and Precision) equipment.

Here are the disadvantages of UTM that caught my attention: (from Wikipedia)

  1. Inherent distortion (due to the map projection) gives only approximate answers for most calculations
  2. Calculations get complex when crossing the zones

I am fortunate to live smack in the middle of UTM zone 16, so perhaps the distortion I see will be negligible. I would prefer to write code that anyone can use anywhere, but I'm tempted to punt if this satisfies my needs.
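
A back-of-the-envelope check, if I understand the projection right (grid distances near a zone's central meridian are scaled by the factor k0 = 0.9996):

    field_m = 1.5 * 1609.344     # 1.5 miles in meters
    k0 = 0.9996                  # UTM scale factor at the central meridian
    print((1.0 - k0) * field_m)  # about 0.97 m of absolute scale distortion

So absolute distances come out nearly a meter short over 1.5 miles, but the scale is almost uniform across a single field, so the relative geometry (guidance lines, implement offsets) should be affected far less.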

farmerbriantee, it is great to see your code, and I appreciate that you shared it. There are Python libraries for the UTM conversions, but I have been meaning to look for the trailer calculations once I get the basics handled. (It would be nice to know where my towed 31-row bean planter is.) I realize that you are not using ROS (and I barely do). I will probably mirror your approach for now, but I'm hoping for some help with using ROS transformations to do this work. If UTM is sufficient, then I should be all right using the transforms that way.
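
If I mirror that approach in Python, I think it comes out to something like this (untested sketch; the names are mine):

    import math

    def vehicle_pose_from_antennas(e1, n1, e2, n2, antenna_pivot):
        """Heading and rear-axle (pivot) position from two UTM antenna fixes.

        (e1, n1) is the right antenna, (e2, n2) the left, in meters.
        Heading is navigational: 0 = north, increasing clockwise.
        antenna_pivot = distance from the virtual antenna back to the pivot.
        """
        # Right minus left, plus pi/2, per farmerbriantee's convention
        heading = math.atan2(e1 - e2, n1 - n2) + math.pi / 2.0

        # Virtual single antenna midway between the two
        easting = (e1 + e2) / 2.0
        northing = (n1 + n2) / 2.0

        # Translate back along the heading to the pivot point
        pivot_e = easting - math.sin(heading) * antenna_pivot
        pivot_n = northing - math.cos(heading) * antenna_pivot
        return heading, pivot_e, pivot_n

The trailer math would then bolt on after this, the same way as in the C# above.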

As the weather gets nicer here, I get more anxious about writing some usable code. It’s a huge relief to get such great help here. Thanks, all.

I punted and did the position computations in UTM. It was too easy to pass up. It seems to work, although I haven't added checks to ensure that it's accurate. I did get to test it a little just as the weekend drew to a close:


It was nice and boring, except for some connectivity problems which caused it to stop occasionally.

I have a lot of tweaking to do on the steering. I'm using my old PID code for this and it showed: lots of drifting off course and then jerking back.

Hi,

Back to your request for comments:

  1. mounting the GPS units.
    I suggest you try mounting one unit on the nose of the tractor (up high, to avoid the FEL frame) and one on the rollover protection frame. This has the advantage of providing a longer axis, so any errors in GPS position reporting won't appear as a large tractor position error. Taking this a step further, mount another set of GPS units on the outer ends of the planter or implement, giving the best measure of both tractor and implement.

  2. Battery for Kill switch & Kill Switch
    Why not use the on board tractor battery?

This is an awesome project!
For the UTM conversions, have you taken a look at http://wiki.ros.org/geodesy?

Usage is as simple as

from geodesy import utm
lat, lon = 51.47875, 5.43995
utm_coord = utm.fromLatLong(lat, lon, 0)  # Elevation 0 is fine for NL
utm_coord.toPoint()  # out comes a ROS geometry_msgs.Point in the UTM frame

Tray,

  1. I used longitudinally mounted antennas on my previous two tractors. This time I wanted roll sensing, and I wanted to take advantage of the built-in mount points provided by the FEL. At some point I hope to study synchronization of cheap satellite receivers so that I can easily place them on the implements. Tractobot03 will have an integral (3-point mounted) planter, but Tractobot02 will pull a much larger towed planter this year.

  2. The kill switch of Tractobot02 is powered by the tractor. The remote, however, is handheld and portable. I have not put a kill switch in Tractobot03 yet but it will be similar.

Thank you.

Loy,

I used the Bidirectional UTM-WGS84 converter for Python, but I'm certainly interested in using more ROS code. Thank you, I will take a look. I appreciate the code example.

Ideally, what would I do with the Point? I’m thinking…

  1. Publish it (as is).
  2. Use it along with the secondary antenna projections to compute a Point for the secondary antenna.
  3. Publish a Point for the secondary antenna.

I suspect that I should use tf2 for the operations in #2, and it's going to take me a while to get up to speed there. Is having the second point helpful, though? Or should I just use the data to publish a Pose?

I would love to find a simple example that’s similar to what I’m doing. Pointers will be appreciated.
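
In the meantime, here's my rough guess at what publishing the result as a Pose might look like (untested; it assumes ROS1 Python, my navigational heading convention, and a topic name I made up):

    import math

    import rospy
    import tf.transformations as tft
    from geometry_msgs.msg import PoseStamped

    rospy.init_node('gnss_pose')
    pub = rospy.Publisher('tractor_pose', PoseStamped, queue_size=1)

    def publish_pose(easting, northing, heading):
        # Navigational heading (0 = north, clockwise) to ROS yaw
        # (counter-clockwise from +x, which points east in UTM)
        yaw = math.pi / 2.0 - heading
        q = tft.quaternion_from_euler(0.0, 0.0, yaw)

        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'utm'
        msg.pose.position.x = easting
        msg.pose.position.y = northing
        msg.pose.orientation.x = q[0]
        msg.pose.orientation.y = q[1]
        msg.pose.orientation.z = q[2]
        msg.pose.orientation.w = q[3]
        pub.publish(msg)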

My use case for this code is slightly different; I need to define a route for my robot based on lat/lon coordinates and put those in a Path, which is a list of PoseStampeds, each of which contains a Point.

It's good practice to stamp geometric data with its frame of reference (UTM in this case; robot_localization can tell you where the /utm frame is relative to /map). I wanted to link to a tutorial on geometry_msgs, but I was surprised that I couldn't find anything in the limited time I had.

from geometry_msgs.msg import PoseStamped, Quaternion

pose_stamped = PoseStamped()
pose_stamped.header.frame_id = "utm"
pose_stamped.pose.position = utm_coord.toPoint()
# Default zero orientation. You might as well use a PointStamped
# if you only have a position without an orientation.
pose_stamped.pose.orientation = Quaternion(0, 0, 0, 1)
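
Building the full Path is then just a loop over the route (same assumptions as above; route_latlons is a placeholder list of (lat, lon) tuples):

    from geodesy import utm
    from geometry_msgs.msg import PoseStamped, Quaternion
    from nav_msgs.msg import Path

    route_latlons = [(51.47875, 5.43995), (51.47880, 5.44010)]  # placeholder

    path = Path()
    path.header.frame_id = "utm"
    for lat, lon in route_latlons:
        ps = PoseStamped()
        ps.header.frame_id = "utm"
        ps.pose.position = utm.fromLatLong(lat, lon, 0).toPoint()
        ps.pose.orientation = Quaternion(0, 0, 0, 1)
        path.poses.append(ps)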

Not sure why you would want to publish a Point for the second antenna?

I could not find this linked above, but http://docs.ros.org/kinetic/api/robot_localization/html/index.html might help as well.

A little different angle on this topic…

I recently got a couple of Lepton thermal cameras. I'm hoping to use them for avoiding people. I considered LEDDAR, but these are looking like a much better people-oriented sensor.

I’ll welcome suggestions for background subtraction, etc.

Kyler, can you share the pinout of the Delphi sensor?

I did not find a data sheet for the Delphi ride height sensor. Here is what I determined:

A: +5 V
B: signal
C: ground

I’ve been looking at other options like rudder sensors, but damn they’re expensive. It’s hard for me to justify buying a Garmin GRF10 when just the extension cable costs more than everything I’m using. Maybe someday I’ll play with one and see if it’s worth the cost (especially considering that I like to have several on hand as spares).
