How does the odometer work in mobile robots like Limo?

For mobile robots, there are three basic questions: Where am I? Where am I going? How do I get there? The first question describes the robot positioning problem, which can be stated in more detail as follows: the mobile robot determines its position and orientation in the world (global or local) frame in real time based on its own state and sensor information.

In this project, we will discuss the details of the odometer in a mobile base such as Limo.

Introduction to the robot wheel odometer and calibration test

The main positioning solutions for driverless cars with Ackermann steering include: wheel odometer, visual odometer, laser odometer, inertial navigation module (IMU + GPS), and multi-sensor fusion. The wheel odometer is the simplest and lowest-cost method. Like other positioning solutions, the wheel odometer also requires sensors to perceive external information, but the motor speed measurement module it uses is a very low-cost sensor. The speed module is shown in the figure below.
[image: motor speed measurement module]

The pose model of a mobile robot is the state of the robot in the world coordinate system. The random variable Xt = (xt, yt, θt) is often used to describe the state of the robot in the world coordinate system at time t, referred to as the pose. Here, (xt, yt) is the position of the robot in the world coordinate system at time t, and θt is its heading. The heading is measured from the positive X-axis of the world coordinate system, and counterclockwise rotation is taken as the positive direction of rotation.
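As a small illustration (not from the original text), this pose can be held in a simple structure; ROS also ships geometry_msgs/Pose2D with the same three fields. The struct below is only a sketch with assumed names:

// Minimal container for the pose Xt = (xt, yt, θt); names are illustrative only.
struct Pose2D
{
    double x;      // position along the world X axis, in meters
    double y;      // position along the world Y axis, in meters
    double theta;  // heading, counterclockwise from the world X axis, in radians
};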

At the initial moment, the robot coordinate system and the world coordinate system coincide. The pose description of the robot at a certain time t is shown in the figure.

The rotational angular velocities of the two wheels can be obtained from the wheel speed odometer. The wheel angular velocities are therefore used to express the x displacement, y displacement, and heading computed by the odometer.

The quantities we need to calibrate are the wheel spacing and the wheel radius. The mathematical model expresses the angular velocity and linear velocity of the vehicle body in terms of the wheel spacing and wheel radius. The wheel spacing diagram is shown below.


The angular velocity of the chassis center about the body’s rotation center is equal to the angular velocity of each wheel about that same rotation center. That is:
[formula image]
Through the relationship between linear velocity and angular velocity, d is introduced:
[formula image]
So we can get r:
[formula image]
The motion model is then solved for w: substituting r back, we find w as:
[formula image]
Solving for v: by simplifying w*r, we get v as:
[formula image]
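The formula images above are not reproduced here. For reference, a sketch of the standard differential-drive relations that this derivation follows is given below; the notation (v_L, v_R for the left/right wheel linear velocities obtained from the wheel angular velocities and the wheel radius, d for the wheel spacing, r for the turning radius of the chassis center, w and v for the body angular and linear velocity) is assumed, so it may differ slightly from the original figures:

w = v_L / (r - d/2) = v_R / (r + d/2)        (equal angular velocity about the rotation center)
v_L = w * (r - d/2),  v_R = w * (r + d/2)    (linear velocity = angular velocity * radius, introducing d)
r = d * (v_R + v_L) / (2 * (v_R - v_L))      (solve for the turning radius r)
w = (v_R - v_L) / d                          (substitute r back to get w)
v = w * r = (v_R + v_L) / 2                  (simplify w * r to get v)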

The calculation of the odometer refers to the cumulative calculation of the robot’s position and orientation in the world coordinate system at any time, starting from the moment the robot is powered on (the robot’s initial heading is taken as the positive X direction of the world coordinate system).

The usual method for calculating the odometer is speed integration: the speeds VL and VR of the left and right wheels are measured by the encoders of the left and right motors. Over a short interval Δt the robot is assumed to move at constant speed, and the increments along the X and Y axes of the world coordinate system during that interval are computed from the robot’s heading angle at the previous moment. The increments are accumulated, and the yaw value of the IMU is used for the heading angle θ. The robot’s odometer can then be obtained as described above.
The specific calculation is shown in the figure below:
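The figure is not reproduced here; as a sketch (notation assumed), the speed-integration update described above is typically written as:

v  = (VR + VL) / 2                  (body linear velocity over the interval)
Δx = v * cos(θ_prev) * Δt           (increment along the world X axis)
Δy = v * sin(θ_prev) * Δt           (increment along the world Y axis)
x  = x + Δx,  y = y + Δy            (accumulate the increments)
θ  = yaw from the IMU               (or θ_prev + w * Δt with w = (VR - VL) / d if no IMU is used)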

Wheel odometer calibration

The three main sources of odometer systematic error are “the deviation between the actual diameter of the left and right wheels and the nominal diameter”, “the deviation between the actual spacing between the left and right wheels and the nominal spacing”, and “the actual average of the diameters of the two wheels is not equal to the nominal average”.

“The deviation between the actual diameter of the left and right wheels and the nominal diameter” will cause the distance error of linear motion. “The deviation between the actual spacing between the left and right wheels and the nominal spacing” will cause the direction error of rotational motion. “The actual average of the diameters of the two wheels is not equal to the nominal average” will affect both linear motion and rotational motion.

We usually assume that the actual position is linearly related to the wheel-odometer reading. By recording the actual position ourselves along with the x and y positions reported by the car’s odometer, we can use least squares to fit a linear equation y = ax + b. The fitted coefficients can then be applied when computing the odometer to correct it.
The code can be viewed in the driver package scout_base/src/scout_messenger.cpp of the robot.

First, data needs to be collected, that is, the actual distance moved by the car and the corresponding distance reported by the car’s odometer.
Running the fitting code in MATLAB gives the following result:
p = [1.0482 -0.0778]
That is, a = 1.0482 and b = -0.0778, which are the calibration parameters in the x direction. Similarly, the calibration parameters for the y direction and the yaw angle can be computed. This calibration is reflected in line 28 of the following code.
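The MATLAB script itself is not shown here. As a rough C++ sketch of the same polyfit step (the function and variable names, and the sample values in main, are made up purely for illustration), the slope a and intercept b can be fitted by ordinary least squares like this:

#include <vector>
#include <cstdio>

// Fit y = a*x + b by ordinary least squares.
// x: odometer readings, y: measured ground-truth positions.
void fitLine(const std::vector<double>& x, const std::vector<double>& y,
             double& a, double& b)
{
    const double n = static_cast<double>(x.size());
    double sx = 0.0, sy = 0.0, sxx = 0.0, sxy = 0.0;
    for (size_t i = 0; i < x.size(); ++i)
    {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx);  // slope
    b = (sy - a * sx) / n;                          // intercept
}

int main()
{
    // Illustrative samples only: odometer distance vs. measured distance.
    std::vector<double> odom = {0.5, 1.0, 1.5, 2.0};
    std::vector<double> real = {0.45, 0.97, 1.50, 2.02};
    double a = 0.0, b = 0.0;
    fitLine(odom, real, a, b);
    std::printf("a = %.4f, b = %.4f\n", a, b);
    return 0;
}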

Detailed explanation of publishing the wheel odometer over ROS

  1. Create package
catkin_create_pkg pub_odom roscpp tf nav_msgs
  2. Create the pub_odom_node.cpp file in the src folder of the pub_odom package and add the following code:
#include <ros/ros.h>
#include <tf/transform_broadcaster.h>
#include <nav_msgs/Odometry.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "odometry_publisher");  // Initialize the ROS node
    ros::NodeHandle n;  // Create the node handle

    // Create publish object to publish odometer message
    ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);

    // Create TransformBroadcaster object to publish transformation
    tf::TransformBroadcaster odom_broadcaster;

    // Initial status of robot
    double x = 0.0;
    double y = 0.0;
    double th = 0.0;
    double vx = 0.1;
    double vy = -0.1;
    double vth = 0.1;

    // Init time
    ros::Time current_time, last_time;
    current_time = ros::Time::now();
    last_time = ros::Time::now();

    // Set loop rate to 1 Hz
    ros::Rate r(1.0);

    // Enter loop
    while(n.ok())
    {
        ros::spinOnce();  

        current_time = ros::Time::now();  // Get current time

        // Compute the robot's displacement
        double dt = (current_time - last_time).toSec();  // calculate time difference
        double delta_x = (vx * cos(th) - vy * sin(th)) * dt;  //Calculate the x-direction displacement
        double delta_y = (vx * sin(th) + vy * cos(th)) * dt;  // Calculate the y-direction displacement
        double delta_th = vth * dt;  // Calculate the angle change

        // Update robot position and angle
        x += delta_x;
        y += delta_y;
        th += delta_th;

        // publish robot transformation
        geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);  // Convert angle to quaternion
        geometry_msgs::TransformStamped odom_trans;
        odom_trans.header.stamp = current_time;
        odom_trans.header.frame_id = "odom";
        odom_trans.child_frame_id = "base_link";
        odom_trans.transform.translation.x = x;
        odom_trans.transform.translation.y = y;
        odom_trans.transform.translation.z = 0.0;
        odom_trans.transform.rotation = odom_quat;
        odom_broadcaster.sendTransform(odom_trans);

        // Publish the odometry message
        nav_msgs::Odometry odom;
        odom.header.stamp = current_time;
        odom.header.frame_id = "odom";
        odom.pose.pose.position.x = x;
        odom.pose.pose.position.y = y;
        odom.pose.pose.position.z = 0.0;
        odom.pose.pose.orientation = odom_quat;
        odom.child_frame_id = "base_link";
        odom.twist.twist.linear.x = vx;
        odom.twist.twist.linear.y = vy;
        odom.twist.twist.angular.z = vth;
        odom_pub.publish(odom);

        last_time = current_time;  // Update timestamp
        r.sleep();  
    }
}

Code Review

ros::Publisher odom_pub = n.advertise<nav_msgs::Odometry>("odom", 50);
tf::TransformBroadcaster odom_broadcaster;

We need to create a ros::Publisher and a tf::TransformBroadcaster to send messages using ROS and tf respectively.

double x = 0.0;
double y = 0.0;
double th = 0.0;

We assume that the robot starts at the origin of the “odom” coordinate system.

double vx = 0.1;
double vy = -0.1;
double vth = 0.1;

Here we will set some velocities which will cause the “base_link” frame to move in the “odom” frame at 0.1m/s in the x direction, -0.1m/s in the y direction, and 0.1rad/s in the th direction. This will more or less cause our simulated robot to go in a circle.
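As a quick sanity check (not part of the original tutorial), the radius of that circle follows directly from the speed and the angular rate:

|v| = sqrt(vx^2 + vy^2) = sqrt(0.1^2 + (-0.1)^2) ≈ 0.141 m/s
radius = |v| / vth ≈ 0.141 / 0.1 ≈ 1.41 m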

ros::Rate r(1.0);

In this example, we will publish the odometry information at a rate of 1 Hz to keep the output easy to follow; most systems publish odometry at a much higher rate.

//compute odometry in a typical way given the velocities of the robot
double dt = (current_time - last_time).toSec();
double delta_x = (vx * cos(th) - vy * sin(th)) * dt;
double delta_y = (vx * sin(th) + vy * cos(th)) * dt;
double delta_th = vth * dt;
x += delta_x;   // then apply the calibration x = a * x + b:
x = 1.0482 * x - 0.0778;
y += delta_y;   // similarly y = m * y + n
th += delta_th; // similarly th = q * th + p

Here we update our odometry estimate based on the constant velocities we set. Of course, a real odometry system would instead integrate the velocities measured from the wheel encoders.

//since all odometry is 6DOF we'll need a quaternion created from yaw
geometry_msgs::Quaternion odom_quat = tf::createQuaternionMsgFromYaw(th);

We generally try to use 3D versions of all messages in our system to allow 2D and 3D components to work together where appropriate and to keep the number of messages to a minimum. Therefore, it is necessary to convert our yaw value to a quaternion. tf provides functions that allow quaternions to be easily created from yaw values, and yaw values to be easily obtained from quaternions.
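For example (a small illustrative snippet, not part of the node above), the reverse conversion can be done with tf::getYaw from tf/transform_datatypes.h:

#include <tf/transform_datatypes.h>

// Round trip: yaw -> quaternion -> yaw.
geometry_msgs::Quaternion q = tf::createQuaternionMsgFromYaw(0.5);
double yaw = tf::getYaw(q);  // yaw is 0.5 rad again, up to floating-point error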

//first, we'll publish the transform over tf
geometry_msgs::TransformStamped odom_trans;
odom_trans.header.stamp = current_time;
odom_trans.header.frame_id = "odom";
odom_trans.child_frame_id = "base_link";

Here, we’ll create a TransformStamped message to send over tf. We want to publish the transform from the “odom” coordinate system to the “base_link” coordinate system at current_time. So, we’ll set the message header and child_frame_id accordingly, making sure to use “odom” as the parent coordinate system and “base_link” as the child coordinate system.

odom_trans.transform.translation.x = x;
odom_trans.transform.translation.y = y;
odom_trans.transform.translation.z = 0.0;
odom_trans.transform.rotation = odom_quat;
//send the transform
odom_broadcaster.sendTransform(odom_trans);

Stuff our odometry data into the transform message and send the transform using the TransformBroadcaster.

//next, we'll publish the odometry message over ROS
nav_msgs::Odometry odom;
odom.header.stamp = current_time;
odom.header.frame_id = "odom";

We also need to publish a nav_msgs/Odometry message type so the navigation package can get velocity information from it. We set the header of the message to the current_time and the “odom” frame.

//set the position
odom.pose.pose.position.x = x;
odom.pose.pose.position.y = y;
odom.pose.pose.position.z = 0.0;
odom.pose.pose.orientation = odom_quat;
//set the velocity
odom.child_frame_id = "base_link";
odom.twist.twist.linear.x = vx;
odom.twist.twist.linear.y = vy;
odom.twist.twist.angular.z = vth;

This populates the message with the odometry data and sends it off. We set the child_frame_id of the message to the “base_link” frame, since that is the frame in which we are sending the velocity information.

  3. Add the following two lines of code to the CMakeLists.txt file:
add_executable(pub_odom_node src/pub_odom_node.cpp)
target_link_libraries(pub_odom_node
${catkin_LIBRARIES}
)
  4. Compile using catkin_make
  5. Run the code
    First open roscore
    Then run the code we wrote
rosrun pub_odom pub_odom_node
  6. After the code runs successfully, use rostopic echo to view the published odom information:
rostopic echo /odom

Test result

After-class Quizzes
● In ROS, how can the robot’s wheel odometer data be used to estimate its pose? Please write a ROS node that subscribes to the robot’s wheel odometer data, uses the odometer data to estimate the robot’s pose, and publishes the estimated pose information.

● How can the robot’s wheel odometer be calibrated? Please write a ROS node that drives the robot along a specific trajectory, records the robot’s wheel odometer data and ground-truth pose information, applies a calibration algorithm to calibrate the wheel odometer, and finally saves the calibration results to the ROS parameter server.

About Limo

If you are interested in Limo or have technical questions about it, feel free to join the AgileX Robotics community. Let’s talk about it!


Hi, am I right in saying your code works if the base_link is in line with the centres of the wheels for a 2-wheeled robot, and for a 4-wheeled robot if the base_link is at the centre of the 4 wheels?

I’m having an issue with my 2-wheeled robot where, using a pair of encoders, I got the “odometry”, but later I realised this doesn’t take into account the transform between the centre of the 2 wheels and the base_link itself.

I know I can change the base_link’s location to the centre of the wheels, but it will take some time, so I’d prefer to handle it in software.

Thanks

Hi Hassan. Thanks for your attention!
Yes, you can do a coordinate transformation in code. But we still recommend setting the base_link to the center of the mobile base. That makes it easier when you need the laser scan, odometry, and other coordinate transformations.
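If you do keep base_link offset from the midpoint between the wheels, one way to handle it in software is to shift the integrated pose by that fixed offset before publishing. A rough sketch (the offset values are assumptions you would measure on your own robot):

// Pose of the wheel-axle midpoint in the odom frame, from encoder integration.
double x_axle = 0.0;   // integrated x
double y_axle = 0.0;   // integrated y
double th     = 0.0;   // integrated heading

// Fixed offset of base_link expressed in the axle frame (example values only).
const double off_x = 0.10;  // base_link 0.10 m ahead of the axle midpoint
const double off_y = 0.00;

// Rotate the offset into the odom frame and add it to the axle pose;
// the heading of base_link equals the axle frame's heading (rigid body).
double x_base = x_axle + off_x * cos(th) - off_y * sin(th);
double y_base = y_axle + off_x * sin(th) + off_y * cos(th);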