Hi all, friendly neighborhood navigator here.
I’m here to let you know about a new tutorial @fmrico has written that explains how to use Navigation2 and our amazingly-reconfigurable-plugin-based-behavior-tree-navigation-system to handle a very common user request: dynamic object following, now with batteries included in the Navigation2 system.
So you want a robot to follow you around in the airport? How about chase your dog? Maybe let someone guide a robot to an area to do a task? All of these and more are possible with the power of friendship – err I mean dynamic object following!
See the tutorial today! It covers:
- The requirements
- Modifying a behavior tree and explaining how it works
- Some cool videos
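To give a flavor of what the behavior-tree modification looks like, here’s a rough sketch of a following-style tree. This is an illustrative outline, not the tutorial’s exact tree: the node names (`GoalUpdater`, `TruncatePath`, `ComputePathToPose`, `FollowPath`, etc.) and parameter values shown here are my best-guess assumptions, so check the tutorial for the authoritative version. The idea is to periodically refresh the goal from the detector, plan to it, truncate the path so the robot stops short of the object, and follow that path indefinitely:

```xml
<root main_tree_to_execute="MainTree">
  <BehaviorTree ID="MainTree">
    <PipelineSequence name="FollowDynamicObject">
      <!-- Replan at a fixed rate so the path tracks the moving object -->
      <RateController hz="1.0">
        <Sequence>
          <!-- Pull the latest detected pose and use it as the goal -->
          <GoalUpdater input_goal="{goal}" output_goal="{updated_goal}">
            <ComputePathToPose goal="{updated_goal}" path="{path}" planner_id="GridBased"/>
          </GoalUpdater>
          <!-- Stop short of the object instead of driving into it -->
          <TruncatePath distance="1.0" input_path="{path}" output_path="{truncated_path}"/>
        </Sequence>
      </RateController>
      <!-- Keep following until something fails; there is no terminal "success" -->
      <KeepRunningUntilFailure>
        <FollowPath path="{truncated_path}" controller_id="FollowPath"/>
      </KeepRunningUntilFailure>
    </PipelineSequence>
  </BehaviorTree>
</root>
```

Note the design choice: unlike point-to-point navigation, following never “succeeds” — the tree runs until the detector loses the object or a failure bubbles up.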
The object following capabilities are BYOD (bring your own detector). We wouldn’t want to assume what type of sensors you have on your robot or the type of detectors you have access to. Rather than lock you into a specific implementation, you can use any detector or segmentation algorithm under the sun, as long as it can give you a detected position of an object. One such option is the pipeline described by @adlarkin in their SubT tutorial post: https://github.com/osrf/subt_hello_world/blob/master/posts/03_perception.md. Any detector based on 2D lidar, 3D lidar, RGB, RGBD, or similar sensors should work great.
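Since the only contract is “give me a detected position,” the glue between your detector and the navigation stack can be tiny. Here’s a minimal sketch, assuming a hypothetical detector that reports range and bearing (say, from a 2D lidar cluster), of converting that detection into an (x, y) goal in the robot frame, with a standoff so the goal sits just short of the object rather than on top of it. The function name and the standoff parameter are my own for illustration; in practice you’d stamp the result into a `geometry_msgs/PoseStamped` and publish it to whatever topic your behavior tree consumes.

```python
import math

def detection_to_goal(range_m: float, bearing_rad: float,
                      standoff_m: float = 0.5) -> tuple[float, float]:
    """Convert a range/bearing detection into an (x, y) goal in the robot
    frame, stopping standoff_m short of the object to avoid colliding
    with the thing we're following."""
    follow_range = max(range_m - standoff_m, 0.0)  # never overshoot the object
    x = follow_range * math.cos(bearing_rad)
    y = follow_range * math.sin(bearing_rad)
    return x, y

# An object 2 m dead ahead yields a goal 1.5 m ahead (0.5 m standoff).
print(detection_to_goal(2.0, 0.0))
```

Any detector that can produce a pose like this — whether from lidar clustering, a neural net bounding box plus depth, or the SubT perception pipeline linked above — slots in the same way.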
Some detectors will also be made available in the navigation system at a future date when we finish up some related dynamic obstacle work.