Tractobots, my attempts at field robots

I’ve been discussing my field robot projects on Emlid’s forum (https://community.emlid.com/t/tractor-installation-advice-welcome/2126/77) because I initially considered basing my projects on a Navio+. I didn’t end up using Emlid’s products so it’s been a little strange to keep posting there. This looks like a more appropriate place for me.

I am firmly in ROS these days but I don’t use it well. I hope some discussion here might help me standardize and take better advantage of ROS components.

Lots of historic information is in the Emlid forum. I’ll be happy to elaborate on it here but for now, I’ll just start posting what I’m currently doing.

It’s high time for me to do some vertical tillage with Tractobot02 but I’ve been working on my new little planter tractor (Tractobot03). I worked a bit on Tractobot02 yesterday.

I originally set up Tractobot02 using a monolithic Python script. It worked great, even at high speeds (19 MPH). Then I partially converted it to ROS and sent it out to pull a grain cart for harvest and didn’t care much that it couldn’t drive a straight line well.

I tried to get it working last weekend and found it weaving all over. Yesterday I finally decided to rip out my PID code and use the ROS PID. I did it mostly because I wanted to reconfigure it on the fly (via rqt), but I also suspected some windup issues, so I figured I might as well switch to the ROS PID before investing more effort in my own code.

The ROS PID solved the problem! It was a wonderful change. I so appreciate being able to use code that has gone through such rigorous development. Thank you ROS community!

There’s a lot I want to do beyond the dumb functionality I have now. I am confident that working with ROS is the best way to ensure that I make progress.

I added code to make turns.

I did all of my work in the “driver” script today. I’d like to break navigation out into its own node, but I needed to make quick progress today, so I concentrated on just making it work.

I think in terms of sequences: raise the tool, throttle down, turn, wait, stabilize, drop tool, throttle up. (The tractor easily does a better job than I would. And I’m still tweaking it.) I’m not programming it that way, though. Instead, I’m setting the controls based on the current state.

  1. If I am within 14 meters of the end line, increment the path offset and reverse the direction.
  2. If I have a large steering correction to make, raise the tool, set the throttle low.
  3. If I don’t have a large steering correction to make, increment a counter so I know when I’ve stabilized.
  4. If I’ve stabilized for a bit, drop the tool and set the throttle high.

If things go wrong, this all falls apart quickly, but for now it works. I will need to do a lot of re-coding before it goes out on its own.
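
To make that concrete, here’s a minimal sketch of the state logic, not my actual code; the names, the 10° threshold, and the counter limit are placeholders:

    # Hypothetical sketch of the end-of-row state logic described above.
    # "controls" and "path" stand in for whatever interfaces drive the tractor.
    HEADLAND_M = 14.0     # distance from the end line that triggers a turn
    BIG_ERROR_DEG = 10.0  # steering correction considered "large" (made up)
    STABLE_TICKS = 20     # control cycles of small error before "stabilized"

    stable_count = 0

    def update(distance_to_end_m, steering_error_deg, controls, path):
        """Run once per control cycle; sets the controls from the current state."""
        global stable_count
        if distance_to_end_m < HEADLAND_M:           # 1. begin the turn
            path.offset_index += 1
            path.direction *= -1
        if abs(steering_error_deg) > BIG_ERROR_DEG:  # 2. large correction
            controls.raise_tool()
            controls.throttle_low()
            stable_count = 0
        else:                                        # 3. converging on the line
            stable_count += 1
        if stable_count > STABLE_TICKS:              # 4. stabilized
            controls.drop_tool()
            controls.throttle_high()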

That video above was intense during the closing approach to the ditch, just before I got to see your turning radius. I just caught up on your previous posts (awesome work, btw!) and will be eagerly following your progress here.

I agree, this is amazing work. You’ve accomplished more by yourself on a hobbyist’s budget than some agricultural machinery makers I happen to know with half a million Euros to spend. Looking forward to following your saga! :slight_smile:

BTW: I’m usually working with robots that are much, much smaller. And I’ve inadvertently run them into, over, or under things many times. The thought of having a tractor run autonomously without any collision avoidance scares the shit out of me. Since you’re working in an open field, implementing this would be trivial if you just had the right sensor. I see you’re looking into radar or lidar sensors; that would be ideal. If money weren’t an issue, I’d probably go with the SICK LD-MRS. Despite being a lidar, it can scan through dust, rain, and snow. Or perhaps a Velodyne Puck, which would give you 360° vision, but that one’s also about $8k, I think. So perhaps radar is the better option.

That turn was scarier than it looked. I decided to increase the headland for the next field.

I was using a CaryMart long-range kill switch but this winter I misplaced the remotes and I’ve wanted to build my own kill switches anyway. I built a LoRa pair a couple weekends ago and I plan to package them next weekend. Then I’ll feel more comfortable operating Tractobot02 without being in the cab again.

I started with a small (23 HP) tractor because I was concerned about obstacles. It did its share of damage and ran into me a few times. I learned a lot from that, as I expected. I’m not so willing to let the bigger tractors run into anything. Fortunately, operating in an open field under constant supervision makes it fairly easy. I’m hopeful that the auto industry will make robust and reasonably inexpensive pedestrian detectors available soon so I’m waiting for that.

I’ve been spending a lot (to me) on tools and supplies but it’s amazing how little I have invested in Tractobot02’s equipment. It’s basically a GPS (~$1K), antennas, a Raspberry Pi 3, a Verizon USB stick, and an Arduino Nano. $2K easily covers it.

I have learned the importance of a good kill switch that’s independent of the rest of the robot’s control system. I recently built my own using Adafruit LoRa Feathers. I put the remote in a magnetic-mount weatherproof GPS case.

In the robot, I simply wired a Feather to an SSR for the ignition.

Those little wire antennas should work for well over a mile, though I haven’t tested beyond half a mile.

OK…I’m finally butting up against transforms. I’m going to have to do something more complicated for Tractobot03 because of the location of the satellite antennas.

I tried hiring an expert on Upwork but I’m having trouble getting messages there now. So I might have to figure this out on my own. I’m overwhelmed.

I have this very fuzzy understanding that I need to:

  1. Establish a world frame and a base frame.
  2. Define the base, the loader arms on which the antennas are mounted, and the rear hitch.
  3. Translate the satellite navigation data into odometry and pose messages.
  4. Use transforms to determine the hitch location using the satellite data.

Unfortunately, every one of those steps feels insurmountable right now. And I don’t even know if I’m on a reasonable path.

My fallback plan is to do some simple nvector calculations to move the satellite-derived position halfway between the primary antenna and the secondary antenna. (The position of the secondary antenna is given as meters of latitude and meters of longitude from the primary antenna.) Then I can just pretend that my antenna is on the centerline of the tractor (like my others have been).

Of course it’s likely that there will be a slight offset, so I’ll tweak the “halfway” calculation as necessary. And I will have tilt data, so I really should use it to calculate the ground-level position. I can imagine this eventually working well enough, but it would be so nice to have transforms working so that I can do this cleanly.
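
For example, with the nvector package that midpoint (or any interpolated point, if the virtual antenna turns out not to be centered) is just a normalized blend of the two n-vectors. A sketch with made-up coordinates, adapted from the package’s documented recipes:

    import nvector as nv

    wgs84 = nv.FrameE(name='WGS84')

    # Placeholder antenna fixes (degrees); in practice the secondary position
    # comes from the primary fix plus the receiver's baseline projections.
    primary = wgs84.GeoPoint(latitude=40.0000000, longitude=-88.0000000, degrees=True)
    secondary = wgs84.GeoPoint(latitude=40.0000050, longitude=-88.0000120, degrees=True)

    # t = 0.5 is the midpoint; tweak t if the antenna pair isn't centered.
    t = 0.5
    n1 = primary.to_nvector().normal
    n2 = secondary.to_nvector().normal
    n_mid = nv.unit(n1 + t * (n2 - n1))

    lat_rad, lon_rad = nv.n_E2lat_lon(n_mid)  # radians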

I will gratefully welcome assistance/advice. And I’m quite willing to pay for help!

Although I’m sure it’s possible to do, I’d strongly recommend not mounting your GPS antennas on a link that moves relative to your odometry and IMU. Most localization implementations assume fixed offsets for those elements, and I doubt you’ll want to implement your own localization estimator with extra parameters to estimate, latency to deal with, and so on.

From your picture it looks like there’s a reasonable amount of space on the 80/20 at the top of your roll cage for the antennas. Or you could also use the root of the arms instead of the middle point. Or use the front brush guard.

Thank you for the recommendations, tfoote.

I intend for the loader to be in a fixed position during operation (such that the antennas are pointing straight up). I did think about putting the antennas on the extrusion but I want the increased precision of having them farther forward, plus it gets them away from the cameras and antennas I’ll have on the extrusion. The grill guard wobbles a lot and does not offer the spread I need. (It will also be shadowed when the loader is lifted. I’ll have a fertilizer tank up front.)

I only use GPS - no odometry or IMU. That’s part of the problem I bump into whenever I try to use ROS tools instead of the code I’ve written.

Very nice project! One remark on the killswitch: I hope it is implemented in a fail-open way (no wireless connection -> stop) :wink:

I guess you are not using any localization filter at the moment. In that case, you don’t need odometry/pose messages, just direct tf transforms.

Do you get attitude measurements (e.g. compass bearing) from the GPS? I assume so in the following.

The remaining question is how you fuse the two GPS measurements. Naively, I would simply average them in the UTM frame, and then pretend that there is a single GPS antenna in the center. Then, you could even use your old code as you mentioned.

Otherwise, you already identified the steps you have to take to transform the data in ROS properly. Here is how I would fill these steps with more details:

  • Model your robot using URDF, including coordinate frames for base_link (somewhere in the middle of your tractor), gps_link (location of your GPS antenna) and rear hitch. For starters, you can use fixed joints for everything.
  • Start a robot_state_publisher, which takes care of publishing these transforms.
  • Check that the transforms look okay using rviz (fixed frame: base_link, add a TF display).
  • Write a node which converts GPS lat/lon measurements into some Cartesian system (UTM?) and publishes a transform world -> base_link. I’m not aware of a ready-to-use implementation for your use case, but it should not be hard to implement.
    The tricky part is correcting for the mounting offset: ask tf for the transform gps_link -> base_link and multiply it from the right onto the world -> gps_link transform (UTM) to obtain world -> base_link, as sketched below.
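
A rough sketch of that last step in Python (rospy/tf2 with homogeneous matrices; the frame names are from the URDF step above, and the UTM pose here is a placeholder):

    import math
    import numpy as np
    import rospy
    import tf2_ros
    from tf.transformations import quaternion_matrix

    def to_matrix(transform):
        """geometry_msgs/Transform -> 4x4 homogeneous matrix."""
        q = transform.rotation
        m = quaternion_matrix([q.x, q.y, q.z, q.w])
        m[0:3, 3] = [transform.translation.x,
                     transform.translation.y,
                     transform.translation.z]
        return m

    rospy.init_node('gps_to_base_link')
    buf = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(buf)

    # T_gps_base: pose of base_link expressed in gps_link (static, from the URDF)
    t = buf.lookup_transform('gps_link', 'base_link', rospy.Time(0),
                             rospy.Duration(1.0))
    T_gps_base = to_matrix(t.transform)

    # T_world_gps: antenna pose in the world (UTM) frame, built from the GPS fix
    # and heading (placeholder numbers)
    easting, northing, yaw = 400000.0, 4400000.0, math.radians(90.0)
    T_world_gps = np.array([[math.cos(yaw), -math.sin(yaw), 0.0, easting],
                            [math.sin(yaw),  math.cos(yaw), 0.0, northing],
                            [0.0,            0.0,           1.0, 0.0],
                            [0.0,            0.0,           0.0, 1.0]])

    # Multiply from the right to obtain world -> base_link
    T_world_base = np.dot(T_world_gps, T_gps_base)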

I assume the desire for odometry is to allow use of existing ROS packages for navigation or other purposes. Many require odometry as an input. I was working on my slightly smaller robots last year and ran into that requirement. I got distracted by a competition using simulation, so I didn’t follow through with them, although encoded motors were purchased.

A quick search found some GPS-to-odometry discussions, so it’s likely to be available.

-73-
Rud Merriam K5RUD
Mystic Lake Software, http://mysticlakesoftware.com/

Thanks for the help, xqms! I’ll try to respond to your points in order…

One of the best parts about my new kill switch system is that I can program it. Yes, I originally made it so that there was a heartbeat: if the switch was hit, the heartbeat stopped and the tractor was killed. I switched to transmitting only a kill signal for a couple of reasons. The main one is that I wanted to save the battery. I don’t know how long it’ll last if I’m transmitting all of the time, but I sure don’t want to run out of battery at the end of a day and be tempted to bypass the kill switch. I also plan to have several kill switches operating several robots which could be miles apart, and I want kill switches to be able to move in and out of range without killing tractors.

I am considering having one heartbeat switch that stays in the field with the tractor (and is connected to a power source like the command center) and then having other switches that send signals only when activated. I like keeping it simple for now, though.

I do not use a localization filter. I work directly from the GNSS data, which is plenty accurate for me. I would love to fuse an IMU, though, so I can get higher refresh rates and lower latency. Yes, “direct tf transforms” sounds like my current need.

Yes, I get course and pitch from the receivers. Here’s the data I get about the baseline to the second antenna:

  1. Latitude projection of the baseline, m
  2. Longitude projection of the baseline, m
  3. Height projection of the baseline, m
  4. Baseline length (rover-to-base distance), m
  5. Baseline course (angle between the baseline vector and north), degrees
  6. Baseline pitch (angle between the baseline vector and horizontal), degrees

Right now I’m using nvector for all of my calculations. I think about switching to UTM but I’m a little concerned about conversion errors, especially in fields which are over a mile in one dimension. I suppose that if I calculate my lines in nvector and then convert them to UTM, I should be OK.

Alright, modeling the robot in URDF seems to be the place to start. I’ve tried it a few times and not gotten far. I’m sure it’s something I can do. I will give it a shot.
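
As a starting point, here is what I imagine a minimal URDF might look like, using the frame names from your steps (the xyz offsets are placeholders, not my tractor’s real geometry):

    <?xml version="1.0"?>
    <robot name="tractobot03">
      <link name="base_link"/>
      <link name="gps_link"/>
      <link name="hitch_link"/>
      <!-- fixed joints for starters, per the advice above -->
      <joint name="base_to_gps" type="fixed">
        <parent link="base_link"/>
        <child link="gps_link"/>
        <origin xyz="1.8 0 2.5" rpy="0 0 0"/> <!-- antenna: forward and up (placeholder) -->
      </joint>
      <joint name="base_to_hitch" type="fixed">
        <parent link="base_link"/>
        <child link="hitch_link"/>
        <origin xyz="-1.2 0 0.4" rpy="0 0 0"/> <!-- rear hitch (placeholder) -->
      </joint>
    </robot>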

The next steps overwhelm me right now. Good to know I’m on the right path, though.

Thank you!

Rud,

Yes, it seems like the tools I find always want odometry. I have read some discussions about using GPS only but they get complicated quickly. I suspect that the issue is that typical GPS data is very noisy and more input is needed to provide a usable position. I already have very usable position and pose data.

We have a need for a kill switch as well, and found that the only suitable wireless kill switches start close to 1000€. Thus we also decided to build our own (using a heartbeat plus an active kill signal, for the reasons below). We also ordered Feather boards and will use an 868 MHz signal. We calculated (most pessimistic assessment, heartbeat at 100 Hz) that our 2200 mAh battery will last at least 10 hours. Judging from your picture, I would assume that you have a larger battery; you should have no problem sending a heartbeat signal at 10 Hz for the whole day.

Yes, it is a different category, but the run-stop of our PR2 runs for about a year on four AA cells. It’s a bit clumsy, but I would never trade safety for more battery time, especially when running a tractor.

We plan to put the code, parts list, and 3D-printable model on Thingiverse or an open git repo once our student is done building it.

Heartbeat: obviously required in case the battery runs out or the signal is out of range.
Active kill signal: in case someone duplicates the heartbeat signal, the active kill signal blocks the heartbeat. A reset switch will be placed on the receiver. This way, multiple kill switches can be used for a single receiver, as well as multiple receivers for one kill switch (the signal will carry an ID to avoid interference between multiple setups).
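
In rough CircuitPython (assuming an Adafruit LoRa Feather and the adafruit_rfm9x driver; the pins, frequency, and message format are placeholders, not our final design), the receiver logic looks something like this:

    import time
    import board
    import busio
    import digitalio
    import adafruit_rfm9x

    PAIR_ID = b'TB02'        # ID so multiple setups don't interfere (placeholder)
    HEARTBEAT_TIMEOUT = 1.0  # seconds without a heartbeat before stopping

    spi = busio.SPI(board.SCK, MOSI=board.MOSI, MISO=board.MISO)
    cs = digitalio.DigitalInOut(board.D10)     # radio chip select (board-specific)
    reset = digitalio.DigitalInOut(board.D11)  # radio reset (board-specific)
    radio = adafruit_rfm9x.RFM9x(spi, cs, reset, 868.0)

    ignition = digitalio.DigitalInOut(board.D5)  # drives the SSR on the ignition
    ignition.direction = digitalio.Direction.OUTPUT

    reset_button = digitalio.DigitalInOut(board.D6)  # physical reset on the receiver
    reset_button.direction = digitalio.Direction.INPUT
    reset_button.pull = digitalio.Pull.UP

    last_heartbeat = time.monotonic()
    latched_kill = False

    while True:
        packet = radio.receive(timeout=0.1)
        if packet == PAIR_ID + b':BEAT':
            last_heartbeat = time.monotonic()
        elif packet == PAIR_ID + b':KILL':
            latched_kill = True  # the active kill overrides any heartbeat
        if latched_kill and not reset_button.value:  # button pressed (active low)
            latched_kill = False
            last_heartbeat = time.monotonic()
        timed_out = (time.monotonic() - last_heartbeat) > HEARTBEAT_TIMEOUT
        # fail-open: the SSR is energized only while everything checks out
        ignition.value = not (latched_kill or timed_out)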

I didn’t know about the n-vector representation - cool!

It seems that averaging positions in n-vector (i.e. finding the center between the two antennas) is just the normalized mean of the two vectors: http://www.navlab.net/nvector/#example_7

I would use that to calculate the position of a virtual antenna in the middle of the two.

As @tfoote suggested, it is extremely unusual to have GPS antennas on a movable part of any robot. You really want an exactly known offset between your antenna(s) and the point at which you control your tractor (probably the center of the rear axle, i.e. base_link). If you use a very precise GPS (< 5 cm), then you can very well just continue using it alone. That is how we did navigation here: https://www.youtube.com/watch?v=GhPAgYRJ5l8.

You can use this function to convert your lat/lon into UTM: https://github.com/cra-ros-pkg/robot_localization/blob/kinetic-devel/include/robot_localization/navsat_conversions.h#L185.
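
If you prefer to stay in Python, the standalone utm package (a different implementation, but equivalent for this purpose) does the same conversion:

    import utm

    # WGS-84 lat/lon in degrees -> UTM easting/northing in meters, plus the zone
    easting, northing, zone_number, zone_letter = utm.from_latlon(40.1234, -88.5678)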

Since you have two antennas, you will be able to infer the rotation of the tractor, as you already figured out above. Otherwise, @xqms gave you all the right steps above.

If you need some help with the initial URDF and TF transforms I can find some time next weekend.

xqms,

Yes, n-vector has been a lifesaver for me. I can cobble together very useful code from the recipes given.

I was thinking I’d use the interpolation example (#6) so that I could easily handle an offset in case it’s not quite centered.

Kyler, here is the UTM conversion I use. It works to the millimeter, even converting back. If you want the reverse conversion, I can post that too. It’s written in C# but can easily be changed to whatever language.

    //private double pi = 3.141592653589793238462643383279502884197169399375;

    // WGS-84 ellipsoid semi-major/semi-minor axes (meters) and UTM scale factor
    private double sm_a = 6378137.0;
    private double sm_b = 6356752.314;
    private double UTMScaleFactor = 0.9996;
    //private double UTMScaleFactor2 = 1.0004001600640256102440976390556;

    // Fields assumed by this snippet: inputs in decimal degrees, outputs in meters
    private double latitude, longitude;
    private double zone;
    private double actualEasting, actualNorthing;

    public void DecDeg2UTM()
    {
        // UTM zones are 6 degrees wide (0.1666... = 1/6)
        zone = Math.Floor((longitude + 180.0) * 0.16666666666666666666666666666667) + 1;
        // 0.01745... = pi/180 (degrees to radians)
        GeoUTMConverterXY(latitude * 0.01745329251994329576923690766743,
                          longitude * 0.01745329251994329576923690766743);
    }

    // Meridian arc length (meters) from the equator to latitude phi (radians)
    private double ArcLengthOfMeridian(double phi)
    {
        double alpha, beta, gamma, delta, epsilon, n;
        double result;
        n = (sm_a - sm_b) / (sm_a + sm_b);
        alpha = ((sm_a + sm_b) / 2.0) * (1.0 + (Math.Pow(n, 2.0) / 4.0) + (Math.Pow(n, 4.0) / 64.0));
        beta = (-3.0 * n / 2.0) + (9.0 * Math.Pow(n, 3.0) / 16.0) + (-3.0 * Math.Pow(n, 5.0) / 32.0);
        gamma = (15.0 * Math.Pow(n, 2.0) / 16.0)
            + (-15.0 * Math.Pow(n, 4.0) / 32.0);
        delta = (-35.0 * Math.Pow(n, 3.0) / 48.0)
            + (105.0 * Math.Pow(n, 5.0) / 256.0);
        epsilon = (315.0 * Math.Pow(n, 4.0) / 512.0);
        result = alpha
            * (phi + (beta * Math.Sin(2.0 * phi))
                + (gamma * Math.Sin(4.0 * phi))
                + (delta * Math.Sin(6.0 * phi))
                + (epsilon * Math.Sin(8.0 * phi)));

        return result;
    }

    // Longitude (radians) of the central meridian of the given UTM zone
    private double UTMCentralMeridian(double zone)
    {
        return ((-183.0 + (zone * 6.0)) * 0.01745329251994329576923690766743);
    }

    // Transverse Mercator projection of (phi, lambda), in radians, about the
    // central meridian lambda0; returns unscaled x (easting) and y (northing)
    private double[] MapLatLonToXY(double phi, double lambda, double lambda0)
    {
        double[] xy = new double[2];
        double N, nu2, ep2, t, t2, l;
        double l3coef, l4coef, l5coef, l6coef, l7coef, l8coef;
        ep2 = (Math.Pow(sm_a, 2.0) - Math.Pow(sm_b, 2.0)) / Math.Pow(sm_b, 2.0);
        nu2 = ep2 * Math.Pow(Math.Cos(phi), 2.0);
        N = Math.Pow(sm_a, 2.0) / (sm_b * Math.Sqrt(1 + nu2));
        t = Math.Tan(phi);
        t2 = t * t;
        l = lambda - lambda0;
        l3coef = 1.0 - t2 + nu2;
        l4coef = 5.0 - t2 + 9 * nu2 + 4.0 * (nu2 * nu2);
        l5coef = 5.0 - 18.0 * t2 + (t2 * t2) + 14.0 * nu2 - 58.0 * t2 * nu2;
        l6coef = 61.0 - 58.0 * t2 + (t2 * t2) + 270.0 * nu2 - 330.0 * t2 * nu2;
        l7coef = 61.0 - 479.0 * t2 + 179.0 * (t2 * t2) - (t2 * t2 * t2);
        l8coef = 1385.0 - 3111.0 * t2 + 543.0 * (t2 * t2) - (t2 * t2 * t2);

        /* Calculate easting (x) */
        xy[0] = N * Math.Cos(phi) * l
            + (N / 6.0 * Math.Pow(Math.Cos(phi), 3.0) * l3coef * Math.Pow(l, 3.0))
            + (N / 120.0 * Math.Pow(Math.Cos(phi), 5.0) * l5coef * Math.Pow(l, 5.0))
            + (N / 5040.0 * Math.Pow(Math.Cos(phi), 7.0) * l7coef * Math.Pow(l, 7.0));

        /* Calculate northing (y) */
        xy[1] = ArcLengthOfMeridian(phi)
            + (t / 2.0 * N * Math.Pow(Math.Cos(phi), 2.0) * Math.Pow(l, 2.0))
            + (t / 24.0 * N * Math.Pow(Math.Cos(phi), 4.0) * l4coef * Math.Pow(l, 4.0))
            + (t / 720.0 * N * Math.Pow(Math.Cos(phi), 6.0) * l6coef * Math.Pow(l, 6.0))
            + (t / 40320.0 * N * Math.Pow(Math.Cos(phi), 8.0) * l8coef * Math.Pow(l, 8.0));

        return xy;
    }


    // Apply the UTM scale factor, the 500 km false easting, and (in the
    // southern hemisphere) the 10,000 km false northing
    private void GeoUTMConverterXY(double lat, double lon)
    {
        double[] xy = MapLatLonToXY(lat, lon, (-183.0 + (zone * 6.0)) * 0.01745329251994329576923690766743);

        xy[0] = xy[0] * UTMScaleFactor + 500000.0;
        xy[1] = xy[1] * UTMScaleFactor;
        if (xy[1] < 0.0)
            xy[1] = xy[1] + 10000000.0;

        //keep a copy of actual easting and northings
        actualEasting = xy[0];
        actualNorthing = xy[1];
    }

In terms of the position of your hitch etc., most robots are nowhere near as large as the project you have here, so there are a couple of considerations.

  1. You need a pivot distance, i.e. the distance from the back wheels (the pivot point) to your antenna. Your heading will be based on the movement of the antenna, and since your antenna is not on the pivot point of your vehicle, the antenna will sweep in an arc that is larger than your pivot point’s arc.

  2. The hitch point of your vehicle should be based on the distance from your pivot point to the hitch pin. In your tractor example, note that the hitch moves in the opposite direction to your antenna.

Some example code to easily determine position based on your antenna heading in UTM…

        //translate world to the pivot axle
        pivotAxleEasting = pn.easting - (Math.Sin(fixHeading) * vehicle.antennaPivot);
        pivotAxleNorthing = pn.northing - (Math.Cos(fixHeading) * vehicle.antennaPivot);

        //determine where the rigid vehicle hitch ends
        hitchEasting = pn.easting + (Math.Sin(fixHeading) * (vehicle.hitchLength - vehicle.antennaPivot));
        hitchNorthing = pn.northing + (Math.Cos(fixHeading) * (vehicle.hitchLength - vehicle.antennaPivot));

  3. The resulting angle of trailing equipment and its position come from a following algorithm, like a trailer behind a truck. It’s a constantly decaying angle as the pulling vehicle moves forward. The trailing hitch length needs to be known, of course; once the heading of your implement is known, and since you know its width, you can find the extremes of your implement using the same sin/cos method as above.

Some example code

        // trailer-behind-truck following: update the implement position from the hitch motion
        double t = (vehicle.toolTrailingHitchLength) / distanceCurrentStepFix;
        toolEasting = hitchEasting + t * (hitchEasting - toolEasting);
        toolNorthing = hitchNorthing + t * (hitchNorthing - toolNorthing);
        fixHeadingImplement = Math.Atan2(hitchEasting - toolEasting, hitchNorthing - toolNorthing);

Again, while it’s in C#, it’s easily translated into whatever language you need.

Hope this helps your quest.