This is a follow-up to the rather misplaced initial listing in https://github.com/ros-planning/moveit.ros.org/pull/371
Thank you so much @mamoll, @mikeferguson, @marip8 for the additional notes!
As I sadly did not attend the workshop, here is some feedback from my side, too.
MoveIt already preempts motions when a new collision is detected: isRemainingPathValid does exactly that, and it is implicitly invoked for every single trajectory we execute and on every change in the planning scene the system observes.
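To make the mechanism concrete, here is a minimal sketch of the idea behind isRemainingPathValid. The types are toy stand-ins, not MoveIt API: only the not-yet-executed waypoints are re-checked against the updated scene, and execution is preempted as soon as one of them collides.

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Hypothetical stand-ins for illustration only, not MoveIt types.
using RobotState = double;  // one joint, to keep the sketch tiny
using Trajectory = std::vector<RobotState>;
using CollisionCheck = std::function<bool(const RobotState&)>;  // true = in collision

// Re-check only the remaining waypoints after a scene update; a false
// return value is the signal to preempt the currently executing motion.
bool remaining_path_valid(const Trajectory& traj, std::size_t current_index,
                          const CollisionCheck& in_collision) {
  for (std::size_t i = current_index; i < traj.size(); ++i)
    if (in_collision(traj[i]))
      return false;  // a newly observed obstacle invalidates the rest of the path
  return true;
}
```

In MoveIt itself this logic lives in the plan-execution layer and is driven by planning scene updates; the sketch only shows the validation loop, not the monitoring and preemption plumbing around it.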
Lifecycle Management of MoveIt nodes
Supporting a plain SHUTDOWN transition for move_group is easy enough, but is this what the point is supposed to mean?
Usually, industry argues against using standalone nodes for MoveIt because they intrude on internal software architectures.
If people are supposed to use classes/libraries in their own node, I don’t see what we can/should do to support lifecycle management of their nodes.
Replace pluginlib with components
In the last maintainer meeting where we discussed this point, it was very fuzzy what this could look like.
Also, there was no agreement at all among the maintainers on whether this would be a good idea: if backtraces with clear call hierarchies are no longer possible, tracing bugs becomes much harder.
Were there more insights / some consensus on possible approaches discussed during the workshop?
Industry Priority 1
Support for dynamically updated robot model (for instance, supporting tool changers, especially when the end effectors are actuated)
This is not just a matter of MoveIt, but the general question of how this should be implemented across ROS 2. Still, MoveIt could pioneer a new “standard” approach for this (and extend other packages to support it).
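To illustrate the core of the problem, here is a toy sketch (not MoveIt API, all names hypothetical): a tool changer amounts to replacing a kinematic subtree at runtime. The tree surgery itself is simple; the hard part in MoveIt would be rebuilding joint groups, allowed-collision matrices, and kinematics solvers afterwards.

```cpp
#include <memory>
#include <string>
#include <utility>
#include <vector>

// Toy kinematic tree: a link with an arbitrary number of child links.
struct Link {
  std::string name;
  std::vector<std::unique_ptr<Link>> children;
};

// Depth-first search for a link by name; returns nullptr if absent.
Link* find_link(Link& root, const std::string& name) {
  if (root.name == name) return &root;
  for (auto& c : root.children)
    if (Link* hit = find_link(*c, name)) return hit;
  return nullptr;
}

// Replace everything below the mounting flange with the new tool subtree.
// A real implementation would also have to update planning groups,
// collision structures, and solvers -- that is the actual challenge.
bool swap_tool(Link& root, const std::string& flange, std::unique_ptr<Link> tool) {
  Link* mount = find_link(root, flange);
  if (!mount) return false;
  mount->children.clear();
  mount->children.push_back(std::move(tool));
  return true;
}
```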
Support for multiple robots in a scene (and actually being able to plan with multiple robots at the same time)
This is already possible right now by putting all robots into one big URDF, and many people have done so in the past. What exactly is new/requested here?
If multiple separate URDF models in one scene are targeted, I would second this request, but this also leads to problems such as
- how/where do you define the geometric relation between the models?
- how/where are joint states for all models reported and merged?
- should MoveIt support any number of CurrentStateMonitors? Should we generalize all interfaces to support any number of robots?
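As a sketch of the merging question above (hypothetical names, not MoveIt API): if each robot reports its own joint states, the combined scene state needs namespacing so that identically named joints of different robots cannot clash.

```cpp
#include <map>
#include <string>

// A joint state as a joint-name -> position map (toy version).
using JointState = std::map<std::string, double>;

// Merge the states of several robots into one scene-wide state by
// prefixing each joint with its robot's name, so that "shoulder" of
// two different robots stays distinguishable.
JointState merge_robot_states(const std::map<std::string, JointState>& robots) {
  JointState merged;
  for (const auto& [robot, state] : robots)
    for (const auto& [joint, pos] : state)
      merged[robot + "/" + joint] = pos;
  return merged;
}
```

Whether such a prefixing convention, multiple CurrentStateMonitors, or a single merged monitor is the right design is exactly the open question.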
Industry Priority 2 - Broadly described as “more moveit task constructor features”
Great! I would appreciate more detailed feedback and further explanation in person so we can actually improve things.
More sensors supported in planning scene, instead of just pointclouds
What does this mean? Sensors as such are not supported in the PlanningScene at all; geometric measurements are. What alternative measurements are relevant to people?
Add scene graph support to represent the relationship between objects
This would make for a great “Milestone 2/3” entry!
It’s an often-discussed shortcoming of MoveIt as of today and it’s clearly achievable.
Add support for Convex Collision checking, to support things like TrajOpt
We noticed that this might mean two entirely different things and I would still like to know what was meant in the workshop discussions:
- Support for declaring/handling meshes as convex, so as to exploit faster collision checking algorithms in this case
- Support for heuristically checking trajectories for collisions by creating the convex hull of the start and end states.
Convex mesh support would make a wonderful MoveIt2 milestone entry.
The second interpretation could improve continuous collision checking, but might be more involved.
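For the second interpretation, here is a 2-D toy sketch (illustrative only, not MoveIt code): the volume swept between a start and an end configuration is over-approximated by the convex hull of the robot's vertices in both configurations, and obstacles are tested against that hull. Any obstacle inside the hull is conservatively treated as a possible collision.

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };

static double cross(const Pt& o, const Pt& a, const Pt& b) {
  return (a.x - o.x) * (b.y - o.y) - (a.y - o.y) * (b.x - o.x);
}

// Andrew's monotone chain; returns the hull in counter-clockwise order.
std::vector<Pt> convex_hull(std::vector<Pt> pts) {
  std::sort(pts.begin(), pts.end(), [](const Pt& a, const Pt& b) {
    return a.x < b.x || (a.x == b.x && a.y < b.y);
  });
  std::vector<Pt> h(2 * pts.size());
  std::size_t k = 0;
  for (std::size_t i = 0; i < pts.size(); ++i) {              // lower hull
    while (k >= 2 && cross(h[k - 2], h[k - 1], pts[i]) <= 0) --k;
    h[k++] = pts[i];
  }
  for (std::size_t i = pts.size() - 1, t = k + 1; i-- > 0;) { // upper hull
    while (k >= t && cross(h[k - 2], h[k - 1], pts[i]) <= 0) --k;
    h[k++] = pts[i];
  }
  h.resize(k > 1 ? k - 1 : k);
  return h;
}

// Conservative test: a point inside the hull of the start and end
// vertex sets counts as a potential collision along the motion.
bool hull_contains(const std::vector<Pt>& hull, const Pt& p) {
  for (std::size_t i = 0; i < hull.size(); ++i)
    if (cross(hull[i], hull[(i + 1) % hull.size()], p) < 0) return false;
  return true;
}
```

This is only sensible for straight-line-ish motions of convex bodies; for long or rotating motions the hull over-approximates badly, which is one reason the second interpretation is the more involved one.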
Visual Impact 1 - Add support for mobile manipulation
As I’m probably one of the very few people who used this in recent years, here’s my experience with the current state:
- The virtual joint can be used to define a single holonomic joint for driving or flying.
- The whole concept of “workspace” in MoveIt currently limits the positional range of these (otherwise unlimited) joints.
- With a holonomic robot base, e.g., the PR2, planning arm&base motions together works. The results are trajectories that, for example, move the robot around while grasping something.
- The trajectories cannot, however, easily be executed, as there are usually no base controllers that can receive a MultiDOF trajectory and actuate it. moveit_simple_controller_manager does not support any ROS interface for forwarding such a trajectory either and needs to be extended/replaced.
- Especially with such base motions, motion planning should typically be biased to prefer moving the arms over moving the entire base. This is not trivially available in the pipeline at the moment, although OMPL probably has some support for it (e.g., optimizing for minimum effort)
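For reference, the virtual joint mentioned above is declared in the SRDF. A planar joint for a holonomic base looks roughly like this (the frame and joint names here are examples, not fixed conventions):

```xml
<!-- SRDF snippet: a planar virtual joint connecting the odometry frame
     to the robot base; "floating" would allow full 6-DOF (e.g. flying). -->
<virtual_joint name="world_joint" type="planar"
               parent_frame="odom" child_link="base_footprint"/>
```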
Add planning support for diff drive robots
This boils down to implementing a new joint type DifferentialJoint and supporting it throughout the pipeline.
I would expect this to be quite challenging when working on the details.
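To show why this joint type is harder than the existing planar joint, here is a toy forward model of a differential drive base (names hypothetical, not a proposed API): the state is (x, y, theta), but the base can only translate along its heading and rotate, never sideways. This nonholonomic constraint is exactly what planar/floating virtual joints ignore and what a DifferentialJoint would have to encode.

```cpp
#include <cmath>

struct Pose2D { double x, y, theta; };

// Wheel speeds (rad/s) to body twist for wheel radius r and track width L:
// v = r * (wl + wr) / 2,  w = r * (wr - wl) / L.
void wheels_to_twist(double wl, double wr, double r, double L,
                     double& v, double& w) {
  v = r * (wl + wr) / 2.0;
  w = r * (wr - wl) / L;
}

// Integrate the body twist (v forward, w yaw rate) over dt. Note that
// the base never gains lateral velocity -- the nonholonomic constraint.
Pose2D step(const Pose2D& p, double v, double w, double dt) {
  return {p.x + v * std::cos(p.theta) * dt,
          p.y + v * std::sin(p.theta) * dt,
          p.theta + w * dt};
}
```

A sampling-based planner that samples (x, y, theta) freely would produce states the base cannot reach directly, so validity checking, interpolation, and time parameterization would all need joint-type-aware treatment.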