
MoveIt 2 journey: sensorless collision detection with ROS 2

First published at https://acutronicrobotics.com/news/ros2-sensorless-collision-detection/. The full article is available there (Discourse here isn't very happy with MathJax syntax, apparently).

The MoveIt motion planning framework is open source software for motion planning, manipulation, 3D perception, kinematics, control and navigation. It is built on top of the Robot Operating System (ROS) and remains the third most popular package in the ROS world.

MoveIt provides an easy-to-use platform for developing advanced robotics applications, evaluating new robot designs, and building integrated robotics products for industrial, commercial, R&D, and other domains. MoveIt 2 is the ROS 2 version of MoveIt.

MoveIt 2 simplified architecture diagram.

This is the fourth in a series of articles that describe our learning experience and contributions to the MoveIt 2 and ROS 2 communities. The complete series is listed below:

  1. why MoveIt 2?
  2. porting and understanding moveit_core
  3. first demonstrator in ROS 2, planning to a joint-space goal
  4. sensorless collision detection with ROS 2 (this article)

In this fourth part we present a demonstration of a sensorless collision detection system for the MARA modular robot using some of the moveit_core submodules of MoveIt 2. The whole system is based on ROS 2 and has been tested with the Dashing Diademata pre-release, while leveraging the real-time capabilities that our team is developing as part of the H-ROS communication bus for robots.

Collision detection demonstration

In this demo we show a simple MoveIt 2 program that moves the second joint of MARA between ±90º and stops and changes direction when a collision is detected.

Technical background

Collision detection is a prerequisite for safety certification. This is especially relevant for collaborative robots, as detailed in ISO 10218-1. The literature describes different methods for detecting collisions so that robots can operate safely. The method selected for collision detection often depends on factors such as available sensors, data, cost, complexity and required sensitivity.

For this demonstration, we will use the Modular Joints from Han’s Robot, one of our partners. Each of these hardware modules has its own position, velocity and torque sensors. All these values are available directly in ROS 2 thanks to the H-ROS SoM, which is attached to each Modular Joint and ensures deterministic, secure and safe robot communications. Moreover, the SoM makes it possible to detect collisions on each individual joint, which further enhances the capabilities of the overall system and reduces the response time.

For the software implementation, we will be reusing existing submodules of MoveIt 2. In particular, MoveIt 2 includes a dynamic solver which provides a way to calculate the torque values of each joint based on the kinematic parameters and the model of the robot.

With all these tools and hardware available, we implement a method for collision detection. In the sections below we explain general concepts of sensorless collision detection (including the importance of friction models) and then describe the selected method: detection by estimating the external torque at every moment in time (comparing the real torque with the estimated one).

Sensorless methods for collision detection

Sensorless collision detection methods consist of estimating the external torques applied to the robot. By monitoring the external torque it is possible to detect whether the estimated torque corresponds to an undesired collision.

Sensorless methods are typically based on measuring the current of each motor and inferring from it the torque of each joint. Depending on the transmission mechanism used by the motor, friction may need to be taken into account. The simplest methods use only the measured torque of the motors, while more complex methods take the robot dynamics into account to achieve more accurate detection.

In this case, a method based on directly estimating the external torque in each joint has been selected. Implementing this method requires the following: current measurements for each motor, a friction model of the harmonic drives, the kinematic parameters and a dynamic model of the robot.
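The pieces listed above can be combined per joint as τ_ext ≈ τ_measured − τ_model − τ_friction. The following is a minimal illustrative sketch of that combination for a single joint, not the actual MARA or MoveIt 2 implementation; the torque constant, gear ratio and friction coefficients are hypothetical placeholder values.

```python
import math

def motor_torque_from_current(current_a, torque_constant=0.1, gear_ratio=100.0):
    """Joint torque (Nm) inferred from motor current (A).
    torque_constant and gear_ratio are illustrative values."""
    return current_a * torque_constant * gear_ratio

def friction_torque(velocity, viscous=0.5, coulomb=1.2):
    """Simple harmonic-drive friction model: viscous + Coulomb terms.
    Coefficients are placeholders, not identified values."""
    if velocity == 0.0:
        return 0.0
    return viscous * velocity + math.copysign(coulomb, velocity)

def external_torque(measured_torque, model_torque, velocity):
    """Estimated external torque on one joint:
    tau_ext ~= tau_measured - tau_model - tau_friction."""
    return measured_torque - model_torque - friction_torque(velocity)
```

With a measured torque of 20 Nm, a model-predicted torque of 3 Nm and a velocity of 1 rad/s, this estimates roughly 15.3 Nm of external torque.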

See https://acutronicrobotics.com/news/ros2-sensorless-collision-detection/ for a short explanation of the math behind our demonstration; refer to the section “Sensorless methods for collision detection”.

Collision detection using ROS 2 and MoveIt 2

MoveIt 2 architecture diagram highlighting moveit_core , wherein the dynamic_solver module used in this demonstrator lives.

To estimate the torque due to the internal state of the robot we make use of the dynamic_solver module inside moveit_core . The dynamic solver calculates the required torques given a joint configuration, velocities, accelerations and external wrenches acting on the links of the robot.
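To make the solver's role concrete, here is a toy inverse-dynamics computation for a single revolute joint (a planar pendulum), standing in for what moveit_core's dynamic solver computes from the full robot model: the torque required to realize a given position, velocity and acceleration. The mass, length, inertia and damping values are illustrative, not MARA's actual parameters.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def required_torque(q, qd, qdd, mass=2.0, length=0.5, inertia=0.2, damping=0.05):
    """Torque (Nm) needed at the joint to realize position q (rad),
    velocity qd (rad/s) and acceleration qdd (rad/s^2) for a point-mass
    link with the given (illustrative) parameters."""
    gravity_term = mass * G * length * math.cos(q)  # gravity load on the link
    return inertia * qdd + damping * qd + gravity_term
```

At q = 0 (link horizontal) with zero velocity and acceleration, the required torque is purely the gravity term; at q = 90º it drops to roughly zero. The real solver does the same kind of computation for every joint of the kinematic chain at once.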

The torques calculated by MoveIt 2 are compared with the measured joint torques (τj). The difference between these torque values can be used to estimate the external torque applied at each joint, and the resulting difference signal is used as a collision detection signal. Different criteria could then be applied for collision detection, for example a filter or a criterion based on the rate of change. In this example we use a simple threshold criterion: if the signal exceeds a certain value, we interpret it as a collision.

Notice that, since there is noise in the signal due to the velocity and acceleration measurements, as well as calculation errors, there is a compromise between avoiding false positives and achieving high sensitivity when setting the threshold. In this case we set the threshold at 15 Nm. If the threshold is exceeded, we call the stop command on the MoveIt 2 move_group .
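The threshold criterion above can be sketched in a few lines. This is an illustrative outline, not the demo's actual code: stop_cb is a placeholder for whatever halts the robot (in the demonstration, the stop command on the MoveIt 2 move_group), and the 15 Nm default matches the threshold used in the text.

```python
THRESHOLD_NM = 15.0  # threshold from the text; trades sensitivity vs. false positives

def check_collision(external_torques, stop_cb, threshold=THRESHOLD_NM):
    """Return True and invoke stop_cb if any joint's estimated external
    torque exceeds the threshold in magnitude; otherwise return False."""
    if any(abs(tau_e) > threshold for tau_e in external_torques):
        stop_cb()  # e.g. issue the stop command to the motion controller
        return True
    return False
```

In a real control loop this check would run at every cycle on the freshly estimated external torques, which is why the real-time behavior of the communication layer matters.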

In the plots below we observe a comparison between the joint torque (τj) and the torque estimated by the MoveIt 2 dynamic model. On the left side is the recorded data for a case without collision when moving joint 2 of the manipulator from −90º to 90º. On the right side is the same case but with a collision. When the external torque (τe) exceeds the configured threshold (15 Nm) we detect the collision and the robot is stopped.

Comparison between real and estimated torques with and without collision.

In conclusion, thanks to the fact that each module provides data such as torque, position, velocity and acceleration, it is straightforward to compute the external torques using the MoveIt 2 dynamic solver. The real-time capabilities of ROS 2 also play an important role here, because the collision detection and stop command must be performed under timing constraints. This approach would greatly simplify the deployment of collaborative robots within the ROS 2 ecosystem.
