
Navigation2 Marathon: No college students harmed in the making of this work


I wanted to share with you some preliminary results of ongoing work happening in the Navigation working group, specifically from @ruffsl, @fmrico, and me.

We wanted to stress test the Navigation2 stack and work out all the kinks prior to the Eloquent release. Between Dashing and now, we’ve made a bunch of great progress optimizing performance and fixing the last few bugs that would make it challenging to use Navigation2 in production / extended use. We’re happy to report that we have resolved all those bugs and the performance is extremely stable.

To test this, we ran 2 industrial robots, a Robotnik RB1 and a PAL Robotics Tiago, across a college campus for a little under 24 hours. We traveled over 40 miles in total with these robots with no disengagements, failures, collisions, or safety concerns. This experiment, which we call the Marathon 2, is in the spirit of the Willow Garage experiment that resulted in the original navigation stack paper. We conducted a marathon-like experiment intentionally navigating through the highest-traffic areas of a campus building, such as central staircases, an indoor bridge, and hallways with classrooms during passing periods. To be fair, there’s still work to do (clearly, from the videos and the wiggling, the controllers need some tuning), but this shows that we can navigate safely in a high-traffic, human-filled space with the current state of this project! No college students were harmed in the making of this paper (my lawyers said I have to mention that).

You can see our pre-print version here and the Navigation2 code here. There are more videos from this work on the Navigation2 documentation website, so please check those out as well.

As always, if this is something that excites you, reach out, join the working group, file tickets and PRs, and generally get involved! We have a pretty exciting roadmap of work toward a V1.0 release of Navigation2, supporting new classes of vehicles and applications such as:

  • Ackermann steering (car-like) robots
  • Gradient-based perception to allow running outside of planar environments
  • Integration of visual odometry / positioning
  • Demo applications and interfaces
  • Optimization and feature development
  • More! If you come and join in, you can have a great deal of influence over the direction :wink:

Really nice work!
On the simulation side, did you run many simulations in Ignition Gazebo before the Marathon 2 in order to tune some of the stack’s parameters?

So the Tiago can be switched to ROS2? Nice to hear that; I will tell my colleagues right away.

Hello, I would like to participate. What should I do?

@VimasterP Send a request to the working group invites list: !forum/ros-navigation-working-group-invites

Then check the ROS calendar.

I would like to join too :slight_smile:

Hi @tomlogan501

We mostly used the baseline parameters. We didn’t use Gazebo to tune parameters.

The Tiago doesn’t run ROS2, and our robot ran 16.04/Kinetic, so we executed Nav2 on an external computer. Instead of ros1_bridge, we developed dedicated bridges per topic. If you follow the links, you will find the details to reproduce everything :wink:
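For readers curious what “dedicated bridges per topic” means in spirit, here is a minimal, framework-free Python sketch of the relay pattern: one small bridge object per topic, with an optional per-topic message conversion. The `publish`/`callback` callables and the `/scan` message shape below are hypothetical stand-ins, not the actual ROS1/ROS2 APIs used in the experiment:

```python
# Sketch of the per-topic bridge pattern: one relay per topic, forwarding
# each incoming message to a matching outgoing publisher. The pub/sub
# callables here are hypothetical stand-ins for the real ROS APIs.

class TopicBridge:
    def __init__(self, topic, publish, convert=None):
        self.topic = topic
        self._publish = publish                    # e.g. a ROS2 publisher's publish()
        self._convert = convert or (lambda m: m)   # per-topic message conversion

    def callback(self, msg):
        """Subscriber callback on the source side: convert and republish."""
        self._publish(self._convert(msg))


# Usage: bridge a single '/scan' topic, converting the message on the way.
received = []
bridge = TopicBridge("/scan", publish=received.append,
                     convert=lambda msg: {"ranges": msg["ranges"]})
bridge.callback({"ranges": [1.0, 2.0], "extra": "dropped"})
print(received)  # [{'ranges': [1.0, 2.0]}]
```

Compared to a generic ros1_bridge, a dedicated bridge like this only handles the topics and message fields the application actually needs, which keeps the conversion logic explicit and cheap.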


@tomlogan501 @VimasterP If you add the Navigation working group event from the calendar, you can get the link to join in our meetings!

In the meantime, there are a bunch of tickets on the project repository and a project for the V1 release tickets. If any of those sound like things you’d like to tackle, comment on them and take them on! A great place to start is our documentation website: do an install and test the stack to get an understanding of it.

Was the 24 hours of running contiguous or total?

Total. Night traversals wouldn’t be much help if there are no students around to potentially hit, and I’m not going to take an easy win :wink:



I have one question about the paper, please see the quote below:

Using ROS2, the behavior tree nodes in the navigator can call long running asynchronous servers in other processor cores. Making use of multi-core processors substantially increases the amount of compute resources a navigation system can effectively utilize.

My understanding of the paragraph is that behavior tree nodes can achieve better performance on a multi-core system. It doesn’t mean the navigator is able to assign a particular processor core to a particular asynchronous server, right?

Yes, that is correct.

You could, though (it’s not currently implemented, but if it would be useful to you, I’d love help implementing it), register the nodes as components, cluster specific components together, and then pin each of those clusters to specific cores.
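The clustering idea above can be sketched generically. This is a hedged illustration, not Navigation2’s current API: the component and cluster names are hypothetical, and `os.sched_setaffinity` (the per-process core-pinning call) is Linux-only, so the pinning step is guarded:

```python
import os

# Hypothetical layout: related components are grouped into clusters, and each
# cluster's process is pinned to one core. Names are illustrative only.
CLUSTERS = {
    "perception": {"components": ["costmap", "scan_filter"], "core": 0},
    "planning": {"components": ["planner", "controller", "bt_navigator"], "core": 1},
}

def pin_current_process(core):
    """Pin the calling process to one core, where the platform supports it."""
    if hasattr(os, "sched_setaffinity"):          # Linux-only API
        if core in os.sched_getaffinity(0):       # only pin to a core that exists
            os.sched_setaffinity(0, {core})
            return True
    return False                                  # unsupported platform / absent core

def components_for_core(core):
    """Which components would run on a given core under this layout."""
    return [c for spec in CLUSTERS.values() if spec["core"] == core
            for c in spec["components"]]

print(components_for_core(1))  # ['planner', 'controller', 'bt_navigator']
```

In a real ROS 2 deployment the clustering itself would come from composing nodes into component containers (one process per container), with pinning applied per container process; the sketch only shows the bookkeeping and the affinity call.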


You could call it an easy win, but on the other hand continuously running for so long is not easy in my opinion, and the ability to do that is one of the things that the original ROS marathon demonstrated. I also think that night traversals would be helpful, if only to demonstrate what happens when a major environmental factor changes.

So to rephrase the question based on new data… what was the longest continuous run?

Good question, I’m not sure. @fmrico?

True, and building highly robust 24-7 uptime navigation systems is a bit of a speciality for me. There are robots I deployed in past gigs with many months of continuous uptime. Even when they went down, it was usually because of a power outage at the location or event out of our control unrelated to the robot software. I would personally love to do that with navigation2, but I myself am lacking in robots and specifically robots with docks to do that with. If someone wants to gift me robots with docks, I’ll get that sucker rolling around for a month continuously, or fix the stack til it can.


Surely the OSRF can spare a turtlebot2 and dock. :innocent:

So to rephrase the question based on new data… what was the longest continuous run?


The robots ran the whole time we were at the lab (9:00–13:00 and 15:00–18:00) and only stopped to recharge the computer battery, in the case of the Tiago. The robots charged at night. We usually ran a robot for 4 hours non-stop.


Open call to robot vendors, if you give me a robot, not only will I use it, but I’ll make a bunch of pretty demos with it and advertise all over the place :wink:. I have some visual SLAM projects I’m spooling up in Navigation-land in need of a platform.

Next time, we’ll have to stack LiPo batteries to fill the payload capacity and make it barrel down the hallway at a full 3 m/s to impress @gbiggs. I may need to change the title to “very few college students were harmed in the making of this work”. I’d need to write some safety controllers.


This sounds like a statement we have to put to the test! I’m going to see if we at Nobleo can make a copy of our smallest internal diff-drive robot. :robot:
The battery capacity is large and a lidar is available; that should get you going for some navigation duration testing :slight_smile:
It will, however, take some time to get permission and order all the items, so if anyone has a spare, that would still be the faster track.


And we have a go! :robot:
We won’t create a copy but we can lend you one of our bots! I’ll take the how-when-where up in a private chat :slight_smile:


Thanks for doing that, @Timple! It’s a great contribution to the community.

As a followup: the Navigation2 introduction paper described above was accepted and will be published at IROS 2020!

If you use navigation2 or compare against it, please cite us in your works!

Happy navigating,