Streaming Motion Interfaces

Greetings,

I’ve been stewing on an idea lately and I’d like to submit it to the community at large to see what you think.

Some of our industrial robot drivers have “streaming” interfaces, meaning that joint commands are sent to the controller “on the fly” while execution is happening. This is in contrast to “downloading” interfaces, which try to make the entire plan available to the controller in advance.

A problem on the ROS-I side of things is that all of our interfaces look like download interfaces to user applications: you can only send whole trajectory chunks. Some drivers have alternate interfaces (ros_control in some of the UR variants?) that allow for real-time streaming.

I’d like to:

  1. Create a ROS level API for connecting to, configuring, streaming, and manipulating joint-motion streams to the robot driver.
  2. Create a robot driver that draws inspiration from audio drivers (e.g. ALSA or JACK). Data is sent in “chunks”, or time slices, to the controller, where it is double buffered. The size of the time slice would be configurable, so you can trade off between latency and reliability for your given application. This might enable more robustness on non-real-time platforms, especially when servicing multiple robots or when the computer is under heavy computational load (see the sketch after this list).
  3. Some kind of test setup to quantify our performance on, say, a Motoman or a UR.
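
Point 2 is essentially an audio-style double buffer for joint data. Here’s a minimal sketch of the core data structure in plain C++ (all names here are made up; a real driver would fill chunks from a ROS topic on one side and drain them on the controller side):

```cpp
#include <array>
#include <atomic>
#include <vector>

// One "time slice" worth of joint positions. Six axes and the field names
// are placeholders.
struct JointChunk {
  double slice_length;                        // configurable time slice [s]
  std::vector<std::array<double, 6>> points;  // samples covering the slice
};

// Single-producer/single-consumer double buffer: the ROS side stages the
// next chunk while the controller side executes the current one.
class ChunkDoubleBuffer {
 public:
  // Producer: returns false if the staged slot is still unconsumed
  // (back-pressure toward the planner).
  bool stage(const JointChunk& chunk) {
    if (staged_.load(std::memory_order_acquire)) return false;
    slots_[write_] = chunk;
    write_ ^= 1;
    staged_.store(true, std::memory_order_release);
    return true;
  }
  // Consumer: returns false on underrun (the producer did not keep up).
  bool take(JointChunk& out) {
    if (!staged_.load(std::memory_order_acquire)) return false;
    out = slots_[read_];
    read_ ^= 1;
    staged_.store(false, std::memory_order_release);
    return true;
  }

 private:
  JointChunk slots_[2];
  int write_ = 0;
  int read_ = 0;
  std::atomic<bool> staged_{false};
};
```

A longer time slice buys tolerance to scheduling jitter on the ROS side at the cost of added latency, which is exactly the trade-off described above.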

What in this proposal already exists? Has anyone experimented with point 2? Can ROS-Control fulfill our needs on point 1?


I think because each JointTrajectoryPoint.msg has an associated “time_from_start” it is OK to batch up points and send them at once. They should get buffered if the underlying transport has buffering.

Shouldn’t it work already? Perhaps all that is needed is a buffer node that subscribes to JointTrajectory and then sends the trajectories out again (a minimal sketch below).
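
Something like this minimal relay, assuming ROS1/roscpp (the topic names are examples only; a real buffer node might also re-time or split the incoming points):

```cpp
#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "trajectory_buffer_node");
  ros::NodeHandle nh;

  // Re-publish incoming trajectories unchanged. The queue sizes (here: 10)
  // are what actually provide the buffering if the transport backs up.
  ros::Publisher pub =
      nh.advertise<trajectory_msgs::JointTrajectory>("joint_path_command", 10);
  ros::Subscriber sub = nh.subscribe<trajectory_msgs::JointTrajectory>(
      "buffered_trajectory", 10,
      [&pub](const trajectory_msgs::JointTrajectory::ConstPtr& msg) {
        pub.publish(*msg);
      });

  ros::spin();
  return 0;
}
```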

Nice post @Jmeyer.

I believe the way forward here would be ros_control, as it seems to directly support the two technical bullets in your list.

Some history first though: the nodes in industrial_robot_client (and any OEM specific drivers) were actually supposed to be outfitted with an “on the fly real-time streaming” interface. Right now, only the /joint_path_command topic is subscribed to, but the Motion Control section of the driver spec also includes a /joint_command topic:

  • execute dynamic motion by streaming joint commands on-the-fly
  • used by client code to control robot position in “real-time”
    • the robot implementation may use a small buffer to allow smooth motion between successive points
    • the topic publisher is responsible for providing new commands at sufficient rate to keep up with robot motion
      • denser paths (or faster robot motion) will require higher update rates
    • […]

This is currently not implemented (ros-industrial/industrial_core#139): at least back then, the interfaces to motion controllers were so limited that the impression was that a (semi-)downloading approach would always perform better. But the idea seems to at least conceptually correspond to what you have in bullets 1 and 2. The message type would be trajectory_msgs/JointTrajectoryPoint.
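
To make that spec concrete: a publisher streaming to such a /joint_command topic could look roughly like this. This is hypothetical, since the topic isn’t implemented; it assumes ROS1/roscpp and a 6-axis robot, and the 125 Hz rate and the sine “plan” are placeholders:

```cpp
#include <cmath>
#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectoryPoint.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "joint_command_streamer");
  ros::NodeHandle nh;
  ros::Publisher pub =
      nh.advertise<trajectory_msgs::JointTrajectoryPoint>("joint_command", 1);

  // Per the spec: the publisher is responsible for keeping up with robot
  // motion, so this rate must match path density and robot speed.
  ros::Rate rate(125);
  ros::Time start = ros::Time::now();
  while (ros::ok()) {
    trajectory_msgs::JointTrajectoryPoint pt;
    double t = (ros::Time::now() - start).toSec();
    pt.positions.assign(6, 0.0);
    pt.positions[0] = 0.1 * std::sin(t);  // stand-in for a real-time planner
    pt.time_from_start = ros::Duration(t);
    pub.publish(pt);
    rate.sleep();
  }
  return 0;
}
```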

However, I don’t believe it makes sense for us to add that to industrial_robot_client or any of the other drivers we have that are IRC based: ros_control can already do this, and has one additional advantage: it supports on-line trajectory replacement with the ros_controllers/joint_trajectory_controller.

I believe trajectory replacement can be a nice way to get both behaviours in a single driver:

  • send a single trajectory and wait for completion: ‘traditional’ trajectory execution
  • send a trajectory and, whenever the need arises, send an updated one: the controller blends the current and new trajectory in a meaningful way (ie: keeping smoothness and continuity)

“In the limit” this can become a single-point streaming interface. We’ve almost used it as such here in the lab, with a system where we sent the controller very short trajectories (a few hundred ms max each) with three points: [current_pose; destination_pose; controlled_stop_pose]. The idea was that if the real-time planner couldn’t generate new destination poses within its deadline, the trajectory controller would make the robot do a controlled stop at the last point in the trajectory ‘segment’. If everything worked correctly, the robot only moved to destination_pose with each update of the trajectory (a sketch of such a segment below).
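
A sketch of how such a segment could be constructed, assuming ROS1 and a joint_trajectory_controller subscribed on its standard command topic (joint names, durations and poses are placeholders):

```cpp
#include <vector>
#include <ros/ros.h>
#include <trajectory_msgs/JointTrajectory.h>

trajectory_msgs::JointTrajectory makeSegment(
    const std::vector<double>& current_pose,
    const std::vector<double>& destination_pose,
    const std::vector<double>& controlled_stop_pose) {
  trajectory_msgs::JointTrajectory traj;
  traj.joint_names = {"joint_1", "joint_2", "joint_3",
                      "joint_4", "joint_5", "joint_6"};
  traj.points.resize(3);
  traj.points[0].positions = current_pose;
  traj.points[0].time_from_start = ros::Duration(0.0);
  traj.points[1].positions = destination_pose;
  traj.points[1].time_from_start = ros::Duration(0.1);
  // Only reached if the planner misses its deadline and no replacement
  // segment has been sent: the robot rolls out to a controlled stop.
  traj.points[2].positions = controlled_stop_pose;
  traj.points[2].time_from_start = ros::Duration(0.3);
  return traj;
}

// Each planner cycle would then do something like:
//   pub.publish(makeSegment(current, destination, stop));
// replacing the previous segment before destination_pose is reached.
```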

The joint_trajectory_controller is not perfect: IIRC it’s not jerk-limited at the moment, for instance, which is probably the cause of ros-industrial/kuka_experimental#126 and others, although that could potentially be fixed by using AIS-Bonn/opt_control. So it could use some work, but I believe it would allow us to do what you want with the least amount of effort.

The “only” thing we need to do is implement ros_control hardware_interfaces for our robots. And writing such hardware_interfaces becomes easier and easier now that more and more (industrial) robot controllers provide high-bandwidth external motion interfaces (ABB EGM, KUKA RSI, Mitsubishi MXT, Denso b-CAP, etc). A minimal sketch of what that involves is below.
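
For reference, a minimal (single-joint, stubbed) hardware_interface could look roughly like this; read() and write() would talk to one of the external motion interfaces listed above:

```cpp
#include <hardware_interface/joint_command_interface.h>
#include <hardware_interface/joint_state_interface.h>
#include <hardware_interface/robot_hw.h>

class MinimalRobotHW : public hardware_interface::RobotHW {
public:
  MinimalRobotHW() {
    // Expose joint state...
    hardware_interface::JointStateHandle state_handle(
        "joint_1", &pos_, &vel_, &eff_);
    state_if_.registerHandle(state_handle);
    registerInterface(&state_if_);
    // ...and accept position commands from e.g. joint_trajectory_controller.
    hardware_interface::JointHandle cmd_handle(
        state_if_.getHandle("joint_1"), &cmd_);
    cmd_if_.registerHandle(cmd_handle);
    registerInterface(&cmd_if_);
  }
  void read()  { /* fetch pos_/vel_/eff_ from the robot's motion interface */ }
  void write() { /* send cmd_ to the robot's motion interface */ }

private:
  hardware_interface::JointStateInterface state_if_;
  hardware_interface::PositionJointInterface cmd_if_;
  double pos_{0}, vel_{0}, eff_{0}, cmd_{0};
};
```

A control loop then calls read(), lets the controller manager update, and calls write() at a fixed rate.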


Summarising: I would be very much in favour of having this sort of ROS API for all our supported robots, and I believe most of it is already there (although some parts could use some work), except for the hardware_interfaces we’d need to be able to use ros_control for this (we could create one for IRC if we’d like: ros-industrial/industrial_core#176).


Edit: as to the test setup: yes, please. I’ve had something like that on my list for so long, but never found the time for it. A colleague here in the lab has done some preliminary work based on floweisshardt/atf, where robots (or really: their drivers) are made to execute specific motions with all sensor data recorded for later analysis. I’m sure there are other implementations of this idea available that go much further.


I think real-time control is going to become more important, especially if ROS wants to compete with Boston Dynamics. Clearly we’re behind them at the moment.

It’s not so much that real-time control isn’t possible with ROS1; it just hasn’t been explored much yet. We’ve actually had pretty good success, as long as we can keep network traffic low. (Example)

Velocity control of the joints, versus position control, is also important. When using velocity control, small message delays are less significant.

I think some of the robot manufacturers are catching on to this stuff. UR was way ahead of the game, and HEBI arms can also accept a stream of joint commands.

Slightly off topic, but: yes, velocity control is nice to have. UR was not necessarily ahead of the game though: KUKA LBR/IIWAs have had it for quite some time now. ABB EGM also supports both joint space and Cartesian velocity control. A third example would be Staubli.

An interesting case study where real-time trajectory modification would have been helpful (assuming it wasn’t tried already):

For months, Tesla engineers struggled to get a robot to guide a bolt through a hole accurately to secure part of the rear brake. They found a maddeningly simple solution: Instead of using a bolt with a flat tip on its threaded end, engineers switched to a bolt with a tapered point, known as a “lead-in,” that can be guided through the hole even if the robot is a millimeter off dead center.

https://www.theverge.com/2018/6/30/17520832/tesla-model-3-manufacturing-changes-tent-robots-welds

I don’t think real-time trajectory modification would have helped there: It’s not like the robot could see that it was going wrong, but couldn’t change the trajectory in time. Your “bolt through hole” example is a typical instance of a common class of problems encountered in industrial automation, and the solution is also a very typical solution by automation engineers: Painstakingly re-engineer the environment and the product to solve one specific problem in robotic assembly. Rinse, repeat. This is why automation is prohibitively expensive today.

Humans are just still much better than robots at “seeing with their hands”. What would have helped the robot to put the bolt through the hole without changing the assembled product is tactile feedback, maybe a gripper capable of in-hand manipulation, and above all the AI to compute the correct motions based on that feedback. We’re not there yet, but we’re getting there. :smiley:

What is also often done in these cases is to use some (passive) compliance: either make the EEF compliant in such a way that it can deal with positional variation in the environment (ie: the hole not being in the exact spot), or make the robot control compliant. Various industrial robot OEMs have solutions for this. One example is Fanuc’s Soft Float: it makes the robot compliant in one or more (Cartesian) axes, again allowing it to ‘cope’ with a certain amount of positional variation when needed (of course, a lead-in bolt always helps).

If there is anything that you think could or should be added to ros_control, please feel free to reach out and let’s discuss. I think we have been pretty open in the last few years to new controllers and modifications; the only thing delaying progress these days is my reduced involvement, as I can’t commit as much time as I used to.

Anyone around for ROSCon? We should meet up and could even draft up a roadmap.

As for the specific issue about joint_trajectory_controller: streaming and submitting full/partial trajectories at once are somewhat at odds. You could do it, but you’d pay for it in performance: blending takes a bit of time and changes the beginning of the trajectory slightly, for no big gain. A simpler minimum-jerk position controller should do a better job there (see the sketch below).
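
For completeness: the minimum-jerk profile referred to here is the classic quintic with zero boundary velocity and acceleration. A sketch of just the interpolation (the controller around it, which would re-target xf whenever a new setpoint streams in, is omitted):

```cpp
#include <algorithm>

// Minimum-jerk motion from x0 to xf over duration T, evaluated at time t:
// x(t) = x0 + (xf - x0) * (10*tau^3 - 15*tau^4 + 6*tau^5), tau = t/T.
double minimumJerk(double x0, double xf, double T, double t) {
  double tau = std::min(std::max(t / T, 0.0), 1.0);  // clamp to [0, 1]
  double s = 10 * tau * tau * tau
           - 15 * tau * tau * tau * tau
           +  6 * tau * tau * tau * tau * tau;
  return x0 + (xf - x0) * s;
}
```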

ros_control has always kept its focus on enabling real-time control for those who need to implement their RobotHW that way. In fact, controller input interfaces don’t have to be ROS; they could use any other messaging that provides better performance or a higher-level interface. An example of this is NASA’s Valkyrie, where the ros_control layer can be used with any controller, but when they deploy it for complex tasks, IHMC’s controller takes care of whole-body control with higher-level interfaces.

Sorry if this post was a bit messy, probably tried to reply to everything at once :slight_smile:
