Agricultural Field Survey Robot

Hi all,

I've started to build a robot!
An Agricultural Field Survey Robot (the surveying sensor is a downward-facing camera).
It has:

Four wheels with skid steering.
A Raspberry Pi 3 running ROS Kinetic on Ubuntu MATE.
A Raspberry Pi Camera v2 running via raspicam_node.
An Arduino collecting data from the wheel encoders and passing it to the Pi 3 using rosserial.
An external Android-based RTK GPS unit giving a 2 cm accurate fix (in optimal conditions), but only at 1 Hz.
A forward-facing Garmin LIDAR-Lite v3 unit - intended to help avoid obstacles.
An IMU is fitted, but I have had trouble getting reliable heading measurements from it.

Progress so far:

The robot drives following Twist commands, with PID controllers using the encoder data to achieve the desired wheel rates. An odometry topic is produced, but skid steering and the slippery field surface mean it quickly becomes inaccurate, especially when turning.
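
For anyone curious, each wheel's control loop is essentially the following (a minimal sketch in Python; the gains and names are placeholders, not my actual values):

    # Minimal per-wheel rate controller. Gains are placeholders.
    class WheelPID:
        def __init__(self, kp, ki, kd):
            self.kp, self.ki, self.kd = kp, ki, kd
            self.integral = 0.0
            self.prev_error = None

        def update(self, target_rate, measured_rate, dt):
            # Rates in rad/s, derived from the encoder counts on the Arduino.
            error = target_rate - measured_rate
            self.integral += error * dt
            derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
            self.prev_error = error
            # Returns a motor command (e.g. a PWM duty value), clamped downstream.
            return self.kp * error + self.ki * self.integral + self.kd * derivative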

The robot drives around nicely under manually issued commands, or under a simple rule set applied to the lidar measurements (e.g. dist < 2 m: stop; 2-5 m: turn right; > 5 m: go straight forward).
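
In code, that rule set is about as simple as it sounds. A sketch (the topic names are placeholders - use whatever your lidar driver and base controller actually publish and subscribe to):

    import rospy
    from sensor_msgs.msg import Range
    from geometry_msgs.msg import Twist

    def on_range(msg):
        cmd = Twist()                 # an all-zero Twist means stop
        if msg.range < 2.0:
            pass                      # obstacle close: stop
        elif msg.range < 5.0:
            cmd.angular.z = -0.5      # something 2-5 m ahead: turn right
        else:
            cmd.linear.x = 0.3        # clear ahead: go straight forward
        pub.publish(cmd)

    rospy.init_node('lidar_rules')
    pub = rospy.Publisher('cmd_vel', Twist, queue_size=1)
    rospy.Subscriber('lidar/range', Range, on_range)  # placeholder topic
    rospy.spin()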

I am now trying to move on to using robot_localization and then navigation to achieve a level of autonomy. This has proven a good deal trickier than I had expected!

The first target is simply to have the robot move from a known start location to a newly specified location, say 100 m away in the field.
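
Concretely, the goal I want to be able to issue looks like this (a sketch using actionlib; the 100 m offset and the map frame are just for illustration):

    import rospy
    import actionlib
    from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

    rospy.init_node('send_survey_goal')
    client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
    client.wait_for_server()

    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = 'map'
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = 100.0   # e.g. 100 m along the map x axis
    goal.target_pose.pose.orientation.w = 1.0  # arbitrary final heading
    client.send_goal(goal)
    client.wait_for_result()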

I have been trying to feed the odometry, IMU and GPS data into robot_localization to get the robot's position and orientation pinned down, then use the ROS navigation stack to navigate across a (for now) empty static map to the target location. I am running the localization and navigation nodes on a laptop over WiFi - just in case the Pi 3 isn't up to the task.

The fusion has been problematic: the robot's heading is a key piece of information, and the IMU seems to be unreliable. I have been around the loop several times getting the IMU orientation and handedness resolved, but have come to the conclusion that the heading data provided by my IMU is simply poor. I suspect it is struggling with interference from the motors or other electronics on board, though it seems to function correctly on the bench.
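
One workaround I may try: rather than fusing the IMU yaw at all (robot_localization can also be configured not to), republish the IMU with the yaw variance inflated so the EKF effectively ignores the heading but can still use the angular velocities. A sketch (topic names are placeholders):

    import rospy
    from sensor_msgs.msg import Imu

    def on_imu(msg):
        cov = list(msg.orientation_covariance)
        cov[8] = 1e6   # element [2][2] is the yaw variance; huge = "don't trust this"
        msg.orientation_covariance = cov
        pub.publish(msg)

    rospy.init_node('imu_deweight_yaw')
    pub = rospy.Publisher('imu/data_deweighted', Imu, queue_size=10)
    rospy.Subscriber('imu/data', Imu, on_imu)
    rospy.spin()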

ROS setup for localisation/navigation
The odometry, IMU and GPS fix data are fed into robot_localization as per the instructions here:
http://docs.ros.org/lunar/api/robot_localization/html/integrating_gps.html
I have two EKFs running plus the navsat_transform_node, as described there. The resulting odometry topic is then fed into a move_base instance along with a map_server.

I am happy that the move_base end is working - when simulated, it sends sensible-looking Twist commands to the wheels and shows appropriate progress on the map. But my localization isn't working.
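
To pin down where things diverge, one thing I plan to try is logging the disagreement between the fused estimate and the GPS-derived odometry. A sketch (topic names follow the robot_localization GPS tutorial above; adjust as needed):

    import math
    import rospy
    from nav_msgs.msg import Odometry

    last = {}

    def make_cb(name):
        def cb(msg):
            last[name] = msg.pose.pose.position
            if len(last) == 2:
                dx = last['fused'].x - last['gps'].x
                dy = last['fused'].y - last['gps'].y
                rospy.loginfo('fused vs GPS offset: %.2f m', math.hypot(dx, dy))
        return cb

    rospy.init_node('localization_compare')
    rospy.Subscriber('odometry/filtered', Odometry, make_cb('fused'))
    rospy.Subscriber('odometry/gps', Odometry, make_cb('gps'))
    rospy.spin()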

So, has anyone got any comments on my setup, or any pointers as to how I can get the localisation to behave? Or perhaps some alternative approaches? Are there any really common 'gotchas' that I might have fallen foul of?

For now, I am going to try ignoring the IMU and encoder odometry and see if I can get by with just GPS. That's no good long term, but it might get me through for now.
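
If that works, one idea (just a sketch, not something I've tested) is to derive heading from successive RTK fixes while the robot is moving - at 2 cm accuracy and 1 Hz, even a metre of travel should give a usable bearing:

    import math

    EARTH_RADIUS = 6371000.0  # metres

    def bearing(lat1, lon1, lat2, lon2):
        # Bearing in radians from fix 1 to fix 2, clockwise from north.
        # Equirectangular approximation - fine over the ~1 m between 1 Hz fixes.
        d_north = math.radians(lat2 - lat1) * EARTH_RADIUS
        d_east = math.radians(lon2 - lon1) * EARTH_RADIUS * math.cos(math.radians(lat1))
        return math.atan2(d_east, d_north)

This would only be valid while the robot is actually moving (and roughly straight); standing still, the bearing is pure noise, so it would need gating on distance travelled.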

Your comments would be appreciated!

Thanks in advance, Joe.

Hello Joe,

I am interested in learning more about this. Do you have an email I can reach you at?

-Will