Tractobots, my attempts at field robots

Once you’ve converted both antenna lat/lons to UTM, getting your heading is straightforward: subtract the Ant2 easting and northing from the Ant1 easting and northing, then take Atan2 of the differences. Depending on which antenna you choose as the first one, either add or subtract pi/2 to get your vehicle heading. If you take right minus left antenna, you add. This assumes positive northing points due north and your angle increases clockwise. I’ve found working in navigational convention rather than pure Cartesian is much easier.

fixheading = Math.Atan2(ant1.easting - ant2.easting, ant1.northing - ant2.northing) + Math.PI / 2

Your “virtual single antenna position” then is

eastingVirtual = (ant1.easting + ant2.easting)/2
northingVirtual = (ant1.northing + ant2.northing)/2

This, along with the heading, can then be used to compute offsets to the pivot point, hitch pin, implement, loader bucket, whatever. Just make sure to keep the signs consistent depending on whether the point is ahead of or behind the virtual antenna.
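In Python, the whole recipe is just a few lines (the function name and tuple layout here are illustrative, and as noted above, flip the pi/2 sign if you swap which antenna is first):

import math

def virtual_antenna(ant1, ant2):
    # ant1, ant2: (easting, northing) in meters, from the UTM conversion.
    # Navigational convention: 0 = grid north, angles increase clockwise.
    heading = math.atan2(ant1[0] - ant2[0], ant1[1] - ant2[1]) + math.pi / 2
    heading %= 2 * math.pi  # normalize to [0, 2*pi)
    easting = (ant1[0] + ant2[0]) / 2
    northing = (ant1[1] + ant2[1]) / 2
    return heading, (easting, northing)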

*** Kyler, make sure the code you are using is accurate in the first place. I tried about 10 different versions of UTM converters, none of which could get back to the original lat/lon. The one I posted above was able to round-trip back to within a millimeter (8 digits) of the original.

Converting to UTM and doing simple trig calculations would be vastly easier for me. I don’t live near the equator or a pole, but I’m still concerned about errors across a 1.5 mile field. I especially want to be accurate enough to work in the same field as machinery running standard (Deere and Precision) equipment.

Here are the disadvantages of UTM that caught my attention: (from Wikipedia)

  1. Inherent distortion (due to the map projection) gives only approximate answers for most calculations
  2. Calculations get complex when crossing the zones

I am fortunate to live smack in the middle of UTM Zone 16. Perhaps the distortion I will see is negligible. I would prefer to write code that anyone can use anywhere but I’m tempted to punt if this will satisfy my needs.
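For a rough sense of that distortion: the UTM scale factor is 0.9996 on a zone’s central meridian, so near the middle of Zone 16 grid distances come out about 4 cm short per 100 m of ground distance. Back-of-the-envelope:

field = 1.5 * 1609.34  # 1.5 mile field, in meters
print(field * (1 - 0.9996))  # ~0.97 m short over the full length
# The scale is nearly uniform across one field, though, so adjacent
# passes shrink together; a 30 ft pass spacing is off by only ~3.7 mm:
print(30 * 0.3048 * (1 - 0.9996))

Since all the machinery in one field sees the same nearly-uniform scale, the relative error between passes should be down in the noise.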

farmerbriantee, it is great to see your code. I appreciate that you shared it. There are Python libraries for the UTM conversions, but I have been meaning to look at the trailer calculations after I get the basics handled. (It would be nice to know where my towed 31-row bean planter is.) I realize that you are not using ROS (and I barely do). I will probably mirror your approach for now, but I’m hoping for some help with using ROS transformations to do this work. If UTM is sufficient, then I should be alright to use the transforms that way.

As the weather gets nicer here, I get more anxious about writing some usable code. It’s a huge relief to get such great help here. Thanks, all.

I punted and did the position computations in UTM. It was too easy to pass up. It seems to work, although I haven’t added checks to ensure that it’s accurate. I did get to test it a little just as the weekend drew to a close:


It was nice and boring, except for some connectivity problems which caused it to stop occasionally.

I have a lot of tweaking to do on the steering. I’m using my old PID code for this and it showed: lots of drifting off course and then jerking back.

Hi,

Back to your request for comments:

  1. mounting the GPS units.
    I suggest you try mounting one unit on the nose of the tractor (up high to avoid the FEL frame) and one on the rollover protection frame. This has the advantage of providing a longer axis, so errors in GPS position reporting won’t appear as a large tractor heading error (see the quick arithmetic after this list). Taking this a step further, mount another set of GPS units on the outer ends of the planter or implement, giving you the best measure of both tractor and implement.

  2. Battery for Kill switch & Kill Switch
    Why not use the on board tractor battery?
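To put rough numbers on the longer-axis point in #1 (the 2 cm error figure is only illustrative):

import math

# Heading error if the two fixes are off ~2 cm in opposite directions:
for baseline in (1.0, 2.0, 3.5):  # antenna spacing in meters
    err = math.degrees(math.atan2(2 * 0.02, baseline))
    print("%.1f m baseline -> ~%.1f deg heading error" % (baseline, err))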

This is an awesome project!
For the UTM conversions, have you taken a look at http://wiki.ros.org/geodesy?

Usage is as simple as

from geodesy import utm
lat, lon = 51.47875, 5.43995
utm_coord = utm.fromLatLong(lat, lon, 0)  # Elevation 0 is fine for NL
utm_coord.toMsg()  # out comes a ROS geometry_msgs.Point in the UTM frame

Tray,

  1. I used longitudinally mounted antennas on my previous two tractors. This time I wanted roll sensing, and I wanted to take advantage of the built-in mounting points provided by the FEL. At some point I hope to study synchronization of cheap satellite receivers so that I can easily place them on implements. Tractobot03 will have an integral (3-point mounted) planter, but Tractobot02 will pull a much larger towed planter this year.

  2. The kill switch of Tractobot02 is powered by the tractor. The remote, however, is handheld and portable. I have not put a kill switch in Tractobot03 yet but it will be similar.

Thank you.

Loy,

I used the “Bidirectional UTM-WGS84 converter” Python package, but I’m certainly interested in using more ROS code. Thank you, I will take a look. I appreciate the code example.
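For anyone following along, that package makes farmerbriantee’s round-trip test easy to run (these coordinates are made up; pick your own):

import utm  # pip install utm

lat, lon = 38.95, -87.30  # an arbitrary point in Zone 16
easting, northing, zone, letter = utm.from_latlon(lat, lon)
lat2, lon2 = utm.to_latlon(easting, northing, zone, letter)
print(abs(lat - lat2), abs(lon - lon2))  # round-trip residual in degrees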

Ideally, what would I do with the Point? I’m thinking…

  1. Publish it (as is).
  2. Use it along with the secondary antenna projections to compute a Point for the secondary antenna.
  3. Publish a Point for the secondary antenna.

I suspect that I should use tf2 for the operations in #2. It’s going to take me a while to get up to speed there. Is having the second point helpful, though? Or should I just use the data to publish a Pose?

I would love to find a simple example that’s similar to what I’m doing. Pointers will be appreciated.

My use-case for this code is slightly different; I need to define a route for my robot based on lat/lon coordinates and put those in a Path, which is a list of PoseStampeds, each of which contains a Point.

It’s good practice to stamp geometric data with a frame of reference (UTM in this case; robot_localization tells you where the /utm frame is with respect to /map). I wanted to link to a tutorial on geometry_msgs, but I was surprised that I couldn’t find anything in the limited time I had.

from geometry_msgs.msg import PoseStamped, Quaternion
pose_stamped = PoseStamped()
pose_stamped.header.frame_id = "utm"
pose_stamped.pose.position = utm_coord.toMsg()
# Default zero orientation. You might as well use a PointStamped
# if you only have a position without orientation.
pose_stamped.pose.orientation = Quaternion(0, 0, 0, 1)

Not sure why you would want to publish a Point for the second antenna?

I could not find this linked above, but http://docs.ros.org/kinetic/api/robot_localization/html/index.html might help as well.

a little different angle on this topic…

I recently got a couple Lepton thermal cameras. I’m hoping to use them for avoiding people. I considered LEDDAR but these are looking like a much better people-oriented sensor.

I’ll welcome suggestions for background subtraction, etc.

Kyler, can you share the pinout of the Delphi sensor?

I did not find a data sheet for the Delphi ride height sensor. Here is what I determined:
A: +5
B: signal
C: ground
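The signal on B is an analog voltage (0–5 V with this supply), so a Pi needs an external ADC to read it. Here’s a sketch assuming an MCP3008 on SPI (just one way to do it; the channel and wiring are arbitrary):

import spidev

spi = spidev.SpiDev()
spi.open(0, 0)  # SPI bus 0, chip select 0
spi.max_speed_hz = 1000000

def read_channel(ch):
    # Single-ended MCP3008 read; returns 0..1023.
    r = spi.xfer2([1, (8 + ch) << 4, 0])
    return ((r[1] & 3) << 8) | r[2]

volts = read_channel(0) * 5.0 / 1023.0  # sensor pin B wired to channel 0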

I’ve been looking at other options like rudder sensors, but damn they’re expensive. It’s hard for me to justify buying a Garmin GRF10 when just the extension cable costs more than everything I’m using. Maybe someday I’ll play with one and see if it’s worth the cost (especially considering that I like to have several on hand as spares).


Thanks for posting the link so we can keep following your progress… this is really interesting stuff. I’m amazed at what you’re doing.

On the latest versions, are you still using the IMU or are you back to dual GPS only?

I picked up a TinkerForge IMU2.0 and it’s looking really interesting on the desk. Haven’t had a chance to get it out in the field. I’m bouncing between using dual GPS antennas for tilt or using single + IMU. My end goal is to provide some level of a smart signal correction for the GPS and be able to feed that into AgOpenGPS. What’s driving the direction you are going with dual GPS?

My goal for this year is to use a Raspberry Pi as the correction box to provide the calculations and transmission of the most accurate GPS location possible. Ultimately, I’d like to use the R-Pi + ROS to do more in the future.

I’ve been using the BNO055 - the same chip as in the TinkerForge - and I’ve found the pitch and roll wander upward a few degrees, even as high as 8 to 10 degrees, while traveling forward. As soon as you stop moving, the numbers return to zero. I have given up on using it as I can’t seem to correct the problem.

Has anyone else noticed this?

87yj,

The IMU vs. dual-RTK situation is a hot-button for me right now. A firmware update went bad on one of my NVS-NV08C-RTK-A receivers. It’s now in “console mode” which NVS refuses to support. So it’s bricked.

The Tersus is delayed so I ordered a couple Piksi Multis and a couple more IMU Brick V2s. I’m hoping to migrate to them over the next week so that I can start planting.

I’m not sure what you mean by “driving the direction you are going” but I use the NV08C heading.

My issue now is that I wander around +/- 10cm from the line. There are a few things I still need to tweak but I’m hoping an IMU will help me stabilize.

I also bought some A/D converters for the Pi. I’m thinking about controlling steering (and everything else) directly from the Pi instead of going through rosserial to an Arduino. It should cut down on latency a bit and simplify the system.

Thanks Kyler…

“Driving the direction…” was a poor choice of words in that context. I was asking what factors are leading to the choices you are making about the number of GPS units and/or IMU usage.

I have some rough Python code that will take two GPS signals with known mounting height and distance from the center-line and calculate the center point and heading, and compensate for tilt. I was starting to move to one GPS and an IMU to simplify the calculation and coordination of the two GPS signals. I’d like to build out the code to handle either configuration.
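The gist of the geometry is something like this (a simplified sketch, not the code verbatim; check the sign conventions against your own mounting):

import math

def dual_gps_fix(left, right, antenna_height):
    # left, right: (easting, northing, altitude) for each antenna.
    # antenna_height: height of the antennas above ground, in meters.
    de, dn = right[0] - left[0], right[1] - left[1]
    baseline = math.hypot(de, dn)  # 2-D antenna spacing
    # Bearing of left->right is heading + 90 deg (clockwise from north):
    heading = (math.atan2(de, dn) - math.pi / 2) % (2 * math.pi)
    # Roll, positive when the right side sits lower:
    roll = math.atan2(left[2] - right[2], baseline)
    mid_e = (left[0] + right[0]) / 2.0
    mid_n = (left[1] + right[1]) / 2.0
    # Unit vector pointing to the vehicle's right in grid coordinates:
    re, rn = math.cos(heading), -math.sin(heading)
    # The antennas lean downhill with the cab; pull the midpoint back uphill:
    shift = antenna_height * math.sin(roll)
    return heading, roll, (mid_e - shift * re, mid_n - shift * rn)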

BTW, here’s the meat of my steering code. (Prepare to be amazed.)

pid_output = self.pid_captured(self.cross_track_distance)  # PID on cross-track error
desired_heading = (self.course + minmax(-90, 90, pid_output)) % 360  # clamped intercept angle
bearing = degdiff(desired_heading - heading)  # wrap to -180..180
steer_for_bearing(bearing)  # horribly uncalibrated
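(The helpers are nothing fancy; minmax clamps and degdiff wraps an angle difference, roughly this:)

def minmax(lo, hi, value):
    # Clamp value into [lo, hi].
    return max(lo, min(hi, value))

def degdiff(angle):
    # Wrap an angle difference into [-180, 180) degrees.
    return (angle + 180.0) % 360.0 - 180.0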

Yup, that’s right; I send the cross-track distance into a PID and then use the PID output to determine how big my intercept angle should be. This was basically my first shot at a steering algorithm and it worked so well that I just kept using it. But it’s not working well enough now and there are so many parts that need improvement.

I’m thinking that if I throw an IMU into the mix, I could still use my algorithm but make the steering changes in response to IMU updates (at a very high rate). So I’d use the old code to say “Steer 305 degrees” but use the IMU to help me direct the steering wheels there.

Hi,

Great project! Looks like you’re making good progress.

I just wanted to comment on your plan of doing steering directly from the Pi. I know some experienced engineers who built their hobby systems using a Pi for logic and an Arduino for motor control, on purpose. Their justification was that the Pi can get bogged down doing other OS-level tasks (accepting user input, etc.) that can interfere with a control loop and hurt performance. To fix the issue for good, they ended up moving to an FPGA-based system, which can run the control law in real time while also running the program logic.


Adam,

Yes, I was excited to do more control directly from the Pi, but I was concerned about delays throwing a wrench into timing-critical operations. I appreciate that you reinforced that issue.

I ended up doing some testing directly from the Pi anyway. That allowed me to tweak the steering settings. I had great success! I found that I need to use PWM at around 40 Hz to get good control of the proportional valve. It cracks open at around a 233/255 duty cycle. I am able to make the wheels steer at a nearly imperceptible rate. It’s so cool.

I have been concerned about proportional valve control since I started. It was never clear to me if I need current control or if I can simply use straight PWM. Until now I’ve been unable to get slow steer speeds (low flow) and I thought that it might be because I’m not controlling current adequately. But now I know I don’t need to do that! It seems that I get the full range of flow with straight PWM at a sufficiently low frequency. It’s easy to find the right frequency - if it’s too high then low flows aren’t possible, and if it’s too low it pulses. This is so important! And yet even when I suspected it was an issue, I didn’t find much mention of frequency adjustments.
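The Pi-side test boiled down to something like this with RPi.GPIO (pin number arbitrary; your wiring will differ):

import RPi.GPIO as GPIO

VALVE_PIN = 18  # BCM numbering

GPIO.setmode(GPIO.BCM)
GPIO.setup(VALVE_PIN, GPIO.OUT)

pwm = GPIO.PWM(VALVE_PIN, 40)  # 40 Hz: low enough to get low flows
pwm.start(0)
pwm.ChangeDutyCycle(100.0 * 233 / 255)  # ~91%, right around the cracking point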

Also, inspired by a video I found (https://www.youtube.com/watch?v=-SE6X6t1HNQ), I chopped I and D from my PID steering (angle) controller. With the ability to go at a low rate, proportional seems to be all that I need.

Once I did all of my testing on the Pi, I re-worked my Arduino code a bit and it seems to be good. I’ll stick with it for now.

I’ve been thinking about how well Tractobot02 (300HP tracked tractor) stuck to my line, even at high speeds (19 MPH). I want that level of control with Tractobot03. Tracked tractors are designed to go straight but I’m sure I can do something similar with an IMU. So tonight I laid a Tinkerforge IMU Brick V2 in Tractobot02’s enclosure. Here’s what I’m considering…

Instead of trying to calibrate the steer angle sensor to steering angles, I’ll simply control the steering valve with a PID fed by rate of turn data. That would allow me to say “Go straight” and it should find the steer angle to do so. That gets me to Tractobot02’s level of control.

Once I can control rate of turn, I can use the PID that’s fed by the cross-track distance to feed the rate of turn PID. This means that if I’m left of my line, the PID would command an appropriate rate of turn to the right to re-capture the line.
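In code, the cascade would look something like this (gains and limits are placeholders, not tuned values):

class PID:
    # Bare-bones PID; P-only has been doing the job for me so far.
    def __init__(self, kp, ki=0.0, kd=0.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev = None

    def update(self, error, dt):
        self.integral += error * dt
        deriv = 0.0 if self.prev is None else (error - self.prev) / dt
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

xtrack_pid = PID(kp=0.5)  # cross-track distance (m) -> rate of turn (deg/s)
rot_pid = PID(kp=2.0)     # rate-of-turn error (deg/s) -> valve command

def steer_step(cross_track, gyro_rate, dt):
    # Outer loop: pick a rate of turn from how far off the line we are.
    desired_rot = max(-10.0, min(10.0, xtrack_pid.update(cross_track, dt)))
    # Inner loop: drive the valve to hold that rate, using the IMU gyro.
    return rot_pid.update(desired_rot - gyro_rate, dt)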

I thought about throwing in a PID to maintain a given heading but I don’t think that’s needed if I can do rate of turn. We’ll see…

–kyler

I installed the IMU. It’s cool! I enjoy watching the steering wheels adjust as they move across rough spots.

Now to figure out how to use it with the GPS. I know there’s lots of ROS fusion goodness but I’m not there yet. I’m still working with my own code.

I’m excited about the possibility of ditching the NVS dual-RTK receivers. I will have a lot more flexibility working with a single receiver plus IMU.

It looks pretty awesome. For the fusion you could use the robot_localization package.

The API is linked above. The point of this package is that it can fuse a NavSatFix message and an IMU message (odometry messages can also be used).
