
Safety Working Group meeting - 6th October

The Safety Working Group is currently operating on an as-needed basis, meeting when there are agenda items to discuss.

This month we have a proposal from Shawn Schaerer regarding some nodes for safe control of robots: a geofencing node and an obstacle detection node. Shawn’s company is working on a certified safety-critical robot application, so these nodes should be of interest to anyone working on a robot application with safety aspects (which is nearly all mobile robots!). Shawn is considering open sourcing these nodes, so if you want to see that happen come along to the meeting and show your interest!

The meeting will take place at 2021-10-06T14:00:00Z, using Google Meet at this address.


Here are some brief notes from the meeting. Thank you to everyone who joined and contributed to an interesting discussion about the safety certification of autonomous systems.

  • Presentation from Shawn Schaerer, CEO of Northstar Robotics
  • Their focus is the development of perception systems and autonomous off-highway vehicles (specifically a snow plough for airport aprons).
  • An example of the perception difficulties they deal with: driving in snowy conditions where no sensor other than RADAR works (and even a human can barely see).
  • Their platform is based on QNX. It is capable of autonomous navigation in a geofenced area, and lead-follow navigation.
  • Northstar Robotics is a company you can go to for help when you need to certify a robotic application.
  • They have full traceability via a QMS, and their process is compliant with ISO 26262.
  • Their safety system is based on a 3-zone algorithm for slowing down and stopping the vehicle. It is capable of replanning the path (assuming the application allows for that), and combines 3D vision, RADAR, LIDAR, thermal imaging, and GPS. It is single-fault tolerant and designed to meet ISO 26262 and IEC 61508.
  • They are looking at open sourcing two nodes: a geofence node and a 3D perception algorithm.
    • The geofence node is based on specifying geofences and keep-out zones as polygons.
    • The 3D perception algorithm uses a 3D camera and reports which zone an obstacle is in. It is currently based on a ZED camera, but they are looking at also supporting the OAK-D.
  • They are considering open sourcing the entire (safety certified) autonomy stack.
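The internals of the geofence node weren't presented, but "geofences and keep-out zones as polygons" suggests a straightforward containment test. Here's a minimal sketch of that idea, assuming a simple ray-casting point-in-polygon check; the function names and the `allowed` helper are my own illustration, not Northstar's API:

```python
from typing import List, Tuple

Point = Tuple[float, float]

def point_in_polygon(p: Point, poly: List[Point]) -> bool:
    """Ray-casting test: count how many polygon edges a horizontal
    ray from p crosses; an odd count means p is inside."""
    x, y = p
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        # Edge crosses the horizontal line at height y?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:  # crossing is to the right of p
                inside = not inside
    return inside

def allowed(p: Point, fence: List[Point],
            keep_outs: List[List[Point]]) -> bool:
    """A position is allowed if it is inside the geofence polygon
    and outside every keep-out polygon."""
    return point_in_polygon(p, fence) and not any(
        point_in_polygon(p, z) for z in keep_outs
    )
```

In practice a production node would also handle polygon validity, frame transforms, and a margin around the boundary; this only shows the core geometric check.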
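The details of the 3-zone algorithm weren't covered, but a common pattern for this kind of system is distance-based speed limiting: a stop zone halts the vehicle, a slow zone caps its speed, and a free zone permits full speed. A minimal sketch under that assumption (the thresholds, speeds, and names here are illustrative constants, not values from the certified system):

```python
from enum import Enum

class Zone(Enum):
    STOP = "stop"
    SLOW = "slow"
    FREE = "free"

# Illustrative thresholds in metres; a real certified system would
# derive these from its hazard analysis, not hard-coded constants.
STOP_DIST = 2.0
SLOW_DIST = 6.0

def classify(obstacle_dist: float) -> Zone:
    """Map the distance to the nearest obstacle onto one of three zones."""
    if obstacle_dist < STOP_DIST:
        return Zone.STOP
    if obstacle_dist < SLOW_DIST:
        return Zone.SLOW
    return Zone.FREE

def speed_limit(zone: Zone, max_speed: float = 2.0) -> float:
    """Speed cap for each zone: halt, reduced speed, or full speed."""
    return {
        Zone.STOP: 0.0,
        Zone.SLOW: 0.5 * max_speed,
        Zone.FREE: max_speed,
    }[zone]
```

The single-fault tolerance and replanning behaviour mentioned above sit on top of logic like this; the zone check is just the innermost safety layer.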

That was a great session, thanks @gbiggs and @Shawn_Schaerer!