IMcoders: easy-to-install (and cheap) odometry sensors

Hey there,

We would like to introduce you our project: IMcoders.

A few months ago we started developing, in our spare time, some sensors that provide odometry data for wheeled robots. One of the main objectives of the project is to build sensors that are extremely easy to integrate into already-built devices. The provided odometry data can be used alongside the output of other sensors to navigate autonomously.

We presented this project to the Hackaday Prize under the Robotics Module Challenge category, so if you want more information about the development process, take a look here: https://hackaday.io/project/158496-imcoders/

Here is a short introduction before getting to the problem:

If you want to prototype autonomous navigation on real vehicles, the options are quite limited.

Imagine you want a forklift in a warehouse, a tractor in a field or a wheelchair at a school to navigate autonomously. To prototype and get some first-hand input data, you could easily integrate cameras and compute visual odometry, or attach a GPS device for outdoor use cases. Perfect, you can navigate using that input, but the system is still not reliable enough: maybe there are not enough features to compute reliable visual odometry, or you are inside a building with no GPS signal. Wouldn't it be nice to have some encoder input? Let's add encoders to our vehicle!... hmmm, not so easy, right? If you are good at mechanics you could install some encoders yourself, but on off-the-shelf vehicles the hardware modifications needed to add these sensors are not a realistic option for everyone. At the moment there is nothing mechanically simple and affordable for everybody. The IMcoders were born to meet this need.

For that we are using IMUs, but not in a conventional way. The idea is to attach an IMU to each robot wheel and measure its spatial orientation. By tracking the change in the orientation of the wheel we can infer how fast the wheel is spinning and, if needed, its direction. (Yes, as you probably already noticed, the idea is to provide an output very similar to a traditional encoder, just from a different source, hence the name IMcoder = IMU + Encoder.)

You might think this approach suffers from a lot of error due to the nature of IMUs (and of course it is not the perfect solution for every use case!), but by adding some constraints based on the location of the IMUs on the robot, most of the error can be mitigated, so the output is stable for most use cases.

After some simulations, we developed some wireless IMU boards which publish IMU data over the ROS interface:

[image: sensor1]

This means that by combining several of them and using some differential-drive steering theory, we should be able to compute a reliable odometry. And that is almost what is happening. To focus on the odometry calculations we created a simulation environment in Gazebo and attached one IMU (using the Gazebo IMU plugin) to each wheel of our simulated differential-drive robot. It is almost working as expected: the odometry calculated from our sensors is quite similar to the one provided by the diff_drive plugin for Gazebo. We say almost because there is still a mismatch between the odometry output of the diff_drive plugin and ours. We guess there is something we are not considering in our odometry calculations, so our output is not as good as expected (it is our first time working with quaternions).

Summarizing what we are doing:

  • We get the absolute orientation of the IMU as a quaternion at one time instant and again at the next one.

  • We compute the quaternion that defines the rotation between the first measurement and the second one.

  • The rotation of the sensor is translated into a linear velocity (we know the diameter of the wheels).

With this information (linear velocity of each wheel) and a little bit of theory about differential driving vehicles, we are able to compute the new position of the robot.
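To make the steps above concrete, here is a minimal sketch of the idea (not our exact implementation; the helper names, wheel dimensions and plain-numpy quaternion handling are just assumptions for illustration):

```python
import numpy as np

WHEEL_RADIUS = 0.03  # [m] illustrative value, not the real robot's
TRACK_WIDTH = 0.15   # [m] distance between left and right wheels

def quat_conj(q):
    # q = [w, x, y, z]
    return np.array([q[0], -q[1], -q[2], -q[3]])

def quat_mul(a, b):
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def wheel_linear_velocity(q_prev, q_curr, dt):
    """Wheel surface speed from two consecutive absolute IMU orientations."""
    # Rotation that takes the previous orientation into the current one
    q_rel = quat_mul(q_curr, quat_conj(q_prev))
    # Rotation angle encoded in the relative quaternion (unsigned; the spin
    # direction has to be recovered from the sign of the rotation-axis
    # component along the wheel axle)
    angle = 2.0 * np.arctan2(np.linalg.norm(q_rel[1:]), abs(q_rel[0]))
    return (angle / dt) * WHEEL_RADIUS

def diff_drive_step(x, y, theta, v_left, v_right, dt):
    """Standard differential-drive dead-reckoning update."""
    v = 0.5 * (v_left + v_right)
    omega = (v_right - v_left) / TRACK_WIDTH
    return (x + v * np.cos(theta) * dt,
            y + v * np.sin(theta) * dt,
            theta + omega * dt)
```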

[image: odom]

In the image there are three arrows: the orange one corresponds to the diff_drive Gazebo plugin output, and the red and green ones (which overlap) correspond to our output.

It is easy to see that after some left/right turns the odometry we are computing accumulates some error.

We assume it is because of our calculations. Do you have any idea how we can verify the correctness of our implementation?

Here you can find the link to our code, specifically where the odometry is computed.

In case you want to reproduce the problem, just follow the README in the repository (more precisely, the "Differential wheeled robot with IMcoders" part) and you will be able to play around with our simulation environment.

Once the problem is solved we will continue by integrating the sensors into a commercial RC car (Parrot Jumping Sumo) and testing them with real data:



Hi, thanks for sharing this very interesting and ingenious approach.
Why not directly use the rotation rate provided by the IMU instead of the absolute orientation? The latter requires extra computation at the IMU level, usually involving a 3D compass, which may not work correctly close to wheels and motors.
So my proposal would be to directly use the rotation rate provided by the gyros and, with diff-drive forward kinematics, compute the platform velocities.
This would lead to an even simpler approach, where a single 1D gyro could (ideally) solve the problem.
What do you think?

Hello @andreucm,

Why not directly use the rotation rate provided by the IMU instead of the absolute orientation?

The point is that the gyro provides just a rotational speed measurement, which drifts. The main advantage of our approach is that, using gravity, the computed movement can be corrected. For instance, imagine the algorithm computed that the wheel moved pi/2 but it really moved 3pi/8. At the end of the movement (when the robot is standing still), gravity helps to correct the estimate because it always points the same way. Furthermore, during the movement the measured acceleration always belongs to the lower half-space (its z component is always negative), so we might even be able to make some corrections while moving. But this approach is still to be tested.

That’s the main advantage of having absolute rotation measurements instead of relative ones.
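To illustrate what we mean by the gravity correction (just a sketch of the idea, not the exact code we run): when the robot is standing still the accelerometer measures only gravity, so the in-plane direction of the measured gravity vector gives an absolute, drift-free wheel angle that can be used to reset the accumulated estimate.

```python
import numpy as np

def wheel_angle_from_gravity(accel):
    """Absolute wheel angle from a standstill accelerometer reading.

    Assumes the IMU z axis points along the wheel axle, so when the robot
    is not moving gravity is seen only by the in-plane x/y axes.
    """
    ax, ay, _ = accel
    # Direction of gravity in the wheel plane: an absolute reference that
    # does not drift the way an integrated gyro rate does.
    return np.arctan2(ay, ax)
```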

Thanks for the clarification.
OK, I see you use "imu.orientation" to compute the wheel angular position. This orientation is not a raw measurement of the IMU; it is an estimate, often provided by the sensor itself, computed from the raw measurements (a_xyz and w_xyz). Using this angular position, as you said, there is no position drift due to time integration of velocity noise.
But I wonder: if your accelerometers are not exactly on the axis of rotation, additional linear accelerations will be measured by the accelerometers, not related to gravity, so the wheel orientation computation may be corrupted.
What do you think about that?
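To give a sense of scale for this effect, a quick back-of-the-envelope calculation (the numbers are only illustrative):

```python
wheel_radius = 0.03  # [m] a small RC-car wheel
imu_offset = 0.02    # [m] distance of the accelerometer from the rotation axis
speed = 1.0          # [m/s] vehicle speed

omega = speed / wheel_radius            # wheel rate, ~33 rad/s
a_centripetal = omega**2 * imu_offset   # ~22 m/s^2, i.e. more than 2 g
print(omega, a_centripetal)
```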


You are right. We already considered that, and we have already seen that for quick translations the estimated orientation is erratic. Thus, depending on the use case, our sensors will provide a better or worse output depending on the quality of the IMU integrated on the board (sometimes it is just a matter of calibration). Here we are facing a trade-off between the cost of the sensor and the expected accuracy.

Anyway, for applications like the ones described above, we expect that the accuracy of the sensor we chose for the first prototype (MPU9250) will be more than enough.

I hope I answered your question. Thank you very much for the interest, by the way :slight_smile:


Hey, just a small update:

Yesterday we worked on the mechanism for mounting the sensors on the wheels. Now it is really easy to mount and unmount them:

[gif: gif_opt]

Note that we are not using the compass to compute the orientation of the IMU, so we don't have to worry about the magnets.


We recorded some datasets and we will post an update as soon as we can.


How are you currently calibrating the extrinsics of the IMU sensor's origin with respect to the wheel's center of rotation? After fixing the IMcoder to the wheel in a new position, are you by any chance rotating the wheel at constant velocity, then inferring the rotational speed ω from the sinusoidal frequency of the two IMU accelerometer axes perpendicular to the wheel's axle, additionally using the amplitude to infer the radial distance r from the wheel axis, and using the phase of the peak of the waveform to discern the angular position θ? I suppose the phase between accelerometer axes x and y (assuming z points along the wheel axle) could be used to resolve the rotation of the IMU about the endpoint of the vector (r, θ), if neither x nor y happens to lie along (r, θ).

I'm not sure how level the mounting of your fixture is with respect to the wheel, or even whether the toy's wheels and axles are true (straight); if it's only roughly perpendicular, the off-axis components may need to be accounted for as well, necessitating a full 6-DOF calibration rather than just a 3-DOF one. Additionally, if the positioning is subject to disturbances during reinstallation, as with the rotationally symmetric magnetic clips, then making the calibration online, or running it as a tracking filter, might be appropriate to simplify deployment to arbitrary platforms. A rough sketch of the spin-up calibration I have in mind is below.
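Purely hypothetical code: it assumes a constant-speed spin, takes ω from the dominant FFT frequency, and takes r from the constant centripetal term rather than the sinusoid amplitude:

```python
import numpy as np

def calibrate_wheel_imu(ax, ay, fs):
    """Rough IMU-to-wheel extrinsics from a constant-speed spin.

    ax, ay : accelerometer samples [m/s^2] on the two axes perpendicular
             to the wheel axle, recorded while the wheel spins steadily
    fs     : sample rate [Hz]

    While spinning at constant omega, gravity appears as a sinusoid of
    frequency omega / (2*pi) on both in-plane axes, while the centripetal
    term omega^2 * r is constant in the sensor frame and points from the
    IMU towards the axle.
    """
    ax = np.asarray(ax, dtype=float)
    ay = np.asarray(ay, dtype=float)

    # 1) rotation rate from the dominant frequency of the gravity sinusoid
    spectrum = np.abs(np.fft.rfft(ax - ax.mean()))
    freqs = np.fft.rfftfreq(len(ax), d=1.0 / fs)
    omega = 2.0 * np.pi * freqs[np.argmax(spectrum[1:]) + 1]  # skip the DC bin

    # 2) radial offset and angular position from the constant centripetal term
    dc = np.array([ax.mean(), ay.mean()])   # points from the IMU towards the axle
    r = np.linalg.norm(dc) / omega**2
    theta = np.arctan2(-dc[1], -dc[0])      # direction of the IMU as seen from the axle

    return omega, r, theta
```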


Hey, sorry for the late answer. Given how young the project is and our limited development time, we are not doing any kind of calibration right now. We have already considered what you describe, and the intention is to follow that path, but only if it proves necessary.

The main goal is to have a system that provides odometry good enough to be used for autonomous navigation alongside other inputs (e.g., visual odometry, GPS, UWB sensors…). Thus, our development proceeds iteration by iteration, checking what is really necessary.

So, going back to the topic, regarding the perpendicularity of the robot axles: that's a good point. As a first approximation, we just assumed they are perpendicular. The other day we recorded some datasets, which we will probably check this weekend.

What we are trying to do right now is find a "ground truth" to compare the output of our algorithm against. For that, we thought about computing visual odometry using the robot's camera (but we don't know how good it is) or using ArUco markers to get the position of the robot.

Do you think we are following the right path? What would be your proposal given our time restrictions?

I'm not sure about the scale or distance you'd like to test your odometry against (are we talking about looping around a table or around a building?), but benchmarking against an established SLAM algorithm (as opposed to an odometry method) would usually still be useful, e.g. Cartographer. Try to use a SLAM approach where data association of landmarks is less of an issue and where odometry sensing is optional at runtime. If you don't want to use a LIDAR, or can't fit one on the platform, but only have an onboard camera, you could use something like this:

If the platform is sensor-deprived, i.e. you can't tack a camera onto it, you could flip the problem and go the poor man's motion-capture route, with a fixed camera facing the scene and a printed fiducial taped to the robot:

I've used this AprilTag library before for something related in previous SLAM development and liked the package's features:

The Golem Lab at Georgia Tech used something quite like this, with a six-camera overhead vision system, when a proper mocap system was unavailable. Just be sure to disable any autofocus features if you're using a cheap web camera or something.
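If you go the fiducial route, a minimal sketch of that setup with OpenCV's aruco module (not the AprilTag library above; the intrinsics and marker size below are placeholders you would replace with your own calibration) could look like this:

```python
import cv2
import numpy as np

# Placeholder intrinsics -- replace with your own camera calibration
camera_matrix = np.array([[600.0, 0.0, 320.0],
                          [0.0, 600.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)
marker_length = 0.10  # side length of the printed marker [m]

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)

cap = cv2.VideoCapture(0)  # the fixed camera watching the robot
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Legacy API (OpenCV <= 4.6); newer versions use cv2.aruco.ArucoDetector
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        rvecs, tvecs, _ = cv2.aruco.estimatePoseSingleMarkers(
            corners, marker_length, camera_matrix, dist_coeffs)
        # tvecs[0] is the marker (robot) position in the camera frame;
        # log it with a timestamp to build the ground-truth trajectory
        print(tvecs[0].ravel())
```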


Hey guys, just a small update. After analyzing the datasets and looking at the output, we can confirm what we saw in simulation.

We get a valid odometry! Well, at least the output makes sense. The data is a bit noisy and we still have some things to tune, but it is possible to see that our approach really works.

Here is a gif for a linear trajectory:

[gif: linear_trajectory]

We intend to upload the datasets in case somebody wants to test them for themselves.

Hi there,

just wanted to say that after many hours of work we finally got everything ready, and today we submitted our project to the contest.

Here is the video we used for it:

Hope you like it.

In case you want to know more, all the information is in the project web you can find at the beginning of this post.


This is an old thread, but are these sensors available?

I just bought something very much like it for my bicycle. It attaches to the hub with a heavy-duty rubber band and sends wheel rotation data via both Bluetooth and ANT+:
https://buy.garmin.com/en-US/US/p/641230

A number of companies make these for use on bikes. You just stick them on a wheel and they broadcast data. Prices range from $10 for no-name Chinese units to $35 from top brands.