Streaming Motion Interfaces

Nice post @Jmeyer.

I believe the way forward here would be ros_control, as it seems to directly support the two technical bullets in your list.

Some history first though: the nodes in industrial_robot_client (and any OEM-specific drivers) were actually supposed to be outfitted with an “on the fly, real-time streaming” interface. Right now only the /joint_path_command topic is subscribed to, but the Motion Control section of the driver spec also includes a /joint_command topic:

  • execute dynamic motion by streaming joint commands on-the-fly
  • used by client code to control robot position in “real-time”
    • the robot implementation may use a small buffer to allow smooth motion between successive points
    • the topic publisher is responsible for providing new commands at sufficient rate to keep up with robot motion
      • denser paths (or faster robot motion) will require higher update rates
    • […]
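
To make the “sufficient rate” bullet a bit more concrete: a toy, ROS-free model of the robot-side buffer the spec hints at (the class and method names here are purely illustrative, not anything from industrial_robot_client):

```python
from collections import deque

class StreamingBuffer:
    """Tiny model of the robot-side buffer from the driver spec: the
    publisher pushes joint commands, the motion loop pops one per control
    cycle and must never find the buffer empty."""

    def __init__(self, depth=5):
        self.points = deque(maxlen=depth)

    def push(self, joint_positions):
        self.points.append(joint_positions)

    def pop(self):
        # An empty buffer means the publisher fell behind: denser paths
        # or faster motion drain the buffer quicker, so they need a
        # higher update rate on the publishing side.
        if not self.points:
            raise RuntimeError("buffer underrun: command rate too low")
        return self.points.popleft()

buf = StreamingBuffer()
buf.push([0.0, 0.1])
assert buf.pop() == [0.0, 0.1]
```
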

This was never implemented (ros-industrial/industrial_core#139): at least back then, the interfaces to motion controllers were so limited that the impression was that a (semi-)downloading approach would always perform better. But the idea seems to at least conceptually correspond to what you have in bullets 1 and 2. The message type would be trajectory_msgs/JointTrajectoryPoint.

However, I don’t believe it makes sense for us to add that to industrial_robot_client or any of our other IRC-based drivers: ros_control can already do this, and it has an additional advantage: it supports on-line trajectory replacement via ros_controllers/joint_trajectory_controller.

I believe trajectory replacement can be a nice way to get both behaviours in a single driver:

  • send a single trajectory and wait for completion: ‘traditional’ trajectory execution
  • send a trajectory and, whenever the need arises, send an updated one: the controller blends the current and new trajectory in a meaningful way (i.e. it keeps smoothness and continuity)
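
To make the second bullet concrete, here is a deliberately simplified, ROS-free sketch of the splice idea. The real joint_trajectory_controller does proper spline blending between the running and the replacement trajectory; `replace_trajectory` and the (time, position) tuples are just for illustration:

```python
def replace_trajectory(old, new, t_now):
    """Illustrative only (not the real controller logic): keep the
    already-executed part of `old` and splice in `new`, shifting its
    points so execution continues from t_now. For continuity, the first
    point of `new` should match the position commanded at t_now.
    Trajectories are lists of (time, position) tuples."""
    kept = [(t, q) for (t, q) in old if t < t_now]
    shifted = [(t_now + t, q) for (t, q) in new]
    return kept + shifted

old = [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
new = [(0.0, 0.5), (1.0, 0.2)]   # starts where the robot is at t=1.0
merged = replace_trajectory(old, new, t_now=1.0)
assert merged == [(0.0, 0.0), (1.0, 0.5), (2.0, 0.2)]
```
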

“In the limit” this can become a single-point streaming interface. We’ve almost used it as such here in the lab, with a system where we sent the controller very short trajectories (a few hundred ms max each) with three points: [current_pose; destination_pose; controlled_stop_pose]. The idea was that if the real-time planner couldn’t generate new destination poses within its deadline, the trajectory controller would make the robot do a controlled stop at the last point in the trajectory ‘segment’. If everything worked correctly, the robot only ever moved to destination_pose with each update of the trajectory.
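
The deadline behaviour of that lab setup can be sketched in a few lines. This is a hypothetical helper, not code from the actual system; it only shows the fail-safe logic of the three-point segments:

```python
def step(segment, got_update, next_segment=None):
    """Sketch of the three-point segment scheme described above:
    segment = [current, destination, controlled_stop] (joint positions).
    If a replacement segment arrives before the deadline, execution
    moves on to the new destination; otherwise the controller simply
    finishes the old segment, i.e. a controlled stop at its last point."""
    if got_update:
        return next_segment[1]   # continue to the new destination
    return segment[2]            # deadline missed: controlled stop

seg = [0.0, 0.3, 0.3]            # current, destination, stop pose
assert step(seg, got_update=True, next_segment=[0.3, 0.6, 0.6]) == 0.6
assert step(seg, got_update=False) == 0.3   # robot stops at last point
```
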

The joint_trajectory_controller is not perfect: iirc it’s not jerk-limited at the moment, for instance (which is probably the cause of ros-industrial/kuka_experimental#126 and others, although that could potentially be fixed by using AIS-Bonn/opt_control), so it could use some work. Still, I believe it would allow us to do what you want with the least amount of work.
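
For anyone wanting to verify the jerk issue on their own setup: a quick finite-difference check over recorded joint positions is enough to see whether the commanded trajectory has unbounded jerk. This helper is my own sketch, not part of any of the packages mentioned:

```python
def max_jerk(positions, dt):
    """Finite-difference estimate of peak jerk (third derivative) of a
    uniformly sampled joint position trajectory; illustrative only."""
    v = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    a = [(b - c) / dt for c, b in zip(v, v[1:])]
    j = [(b - c) / dt for c, b in zip(a, a[1:])]
    return max(abs(x) for x in j)

# A constant-acceleration profile has zero jerk...
assert max_jerk([0.0, 0.5, 2.0, 4.5, 8.0], 1.0) == 0.0
# ...while an abrupt velocity change shows up as a jerk spike.
assert max_jerk([0.0, 0.0, 0.0, 1.0, 2.0, 3.0], 1.0) == 1.0
```
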

The “only” thing we need to do is implement ros_control hardware_interfaces for our robots. And writing such hardware_interfaces becomes easier and easier, now that more and more (industrial) robot controllers provide high-bandwidth external motion interfaces (ABB EGM, KUKA RSI, Mitsubishi MXT, Denso b-CAP, etc.).
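
Conceptually such a hardware_interface boils down to a read/write pair around the vendor’s external motion interface. The sketch below is ROS-free Python with made-up names (the real thing is the C++ hardware_interface::RobotHW API), purely to show the shape of the work involved:

```python
class MinimalHardwareInterface:
    """Conceptual sketch of a ros_control hardware interface: read joint
    state from the controller's external motion interface (EGM, RSI,
    MXT, ...), expose it to the controllers, and write the commanded
    positions back out each cycle. All names here are illustrative."""

    def __init__(self, n_joints):
        self.position = [0.0] * n_joints   # state reported by the robot
        self.command = [0.0] * n_joints    # target set by the controller

    def read(self, robot_state):
        # e.g. parse an incoming EGM/RSI packet into joint positions
        self.position = list(robot_state)

    def write(self):
        # e.g. serialise self.command into the vendor protocol
        return list(self.command)

hw = MinimalHardwareInterface(2)
hw.read([0.1, 0.2])       # state in from the robot
hw.command = [0.15, 0.25] # controller sets new targets
assert hw.write() == [0.15, 0.25]
```
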


Summarising: I would very much be in favour of having this sort of ROS API for all our supported robots, and I believe most of it is already there (although some parts could use some work). What’s missing are the hardware_interfaces we’d need in order to use ros_control for this (we could create one for IRC if we’d like: ros-industrial/industrial_core#176).


Edit: as to the test setup: yes, please. I’ve had something like that on my list for so long, but never found the time for it. A colleague here in the lab has done some preliminary work based on floweisshardt/atf, where robots (or really: their drivers) are made to execute specific motions with all sensor data recorded for later analysis. I’m sure there are other implementations of this idea that go much further.
