We are pleased to announce the release of Isaac ROS 3.0 at Computex Taipei, with major updates including new packages, performance improvements, and fixes.
The Isaac ROS update for ROS 2 Humble is available at github.com/NVIDIA-ISAAC-ROS, including packages for AI perception, image and lidar processing, and navigation. This release adds:
New! Multi-camera nvBlox 3D reconstruction for costmaps used in motion planning with Nav2 and MoveIt 2.
New! SegFormer and SAM (Segment Anything) DNN packages for transformer-based image segmentation.
New! data_recorder for multi-sensor data capture to rosbag using MCAP, and data_replayer for AI and perception development with time-synchronized real sensor data at 500+ megasamples per second.
New! Software for out-of-the-box ROS 2 development with the Nova Orin Developer Kit, so you stop losing time on system software and drivers.
Updated! ESS 4.0 stereo depth estimation DNN with improved accuracy for mobile robot and manipulation use cases, and higher performance at >100 frames per second.
Updated! Performance improvements with event-based scheduling for NITROS.
[Video: ESS 4.0, AI-based depth estimation for manipulation and AMR applications.]
Isaac ROS 3.0 is available now at github.com/NVIDIA-ISAAC-ROS and is part of our commitment to providing features and hardware acceleration for commercial deployment, development, and research for autonomous robots.
Install pre-built Debian packages, leverage the Isaac ROS Dev container, or clone the repositories you need into your ROS workspace to build from source with colcon alongside your other ROS 2 packages. Please note that this release has been tested on the NVIDIA Jetson AGX Orin with JetPack 6.0. For more details see the release notes.
A minor hotfix will be released this June, with the next major release in October at ROSCon 2024.
Also make sure to check out Nav2’s new Docking Server, released in conjunction with Isaac ROS 3.0 and supported by NVIDIA’s ongoing collaboration with Open Navigation!
Thank you ggrigor and team for making this suite available; this is awesome! Are there any limitations I should immediately be aware of for the Orin Nano on a dev kit?
Review the Jetson Orin Technical Specifications (bottom of the page) to understand the limitations of a Jetson Orin Nano vs. a Jetson AGX Orin. These typically come from hardware not available on Orin Nano, lower compute, or reduced memory.
The performance results have a column for Orin Nano, which provides an easy metrics-based comparison between platforms. For some benchmarks, Orin Nano is not supported due to a lack of hardware or memory limits that prevent the AI models from fitting.
For a concrete example, Orin Nano does not have an H.264 encoder, so the hardware-accelerated image compression that encodes image data into H.264 is not present; H.264 decode, however, is.
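To make that split concrete, a typical pattern is to produce the H.264 stream on a platform that has the encoder (or replay it from recorded data) and only consume and decode it on Orin Nano. Below is a minimal sketch for sanity-checking such a stream before wiring up a decoder. It assumes, purely for illustration, that the encoded frames arrive as sensor_msgs/msg/CompressedImage on a topic named /image_compressed; check the isaac_ros_compression documentation for the actual topic names and message types.

```cpp
// Minimal sketch: verify an incoming H.264 stream before decoding it on Orin Nano.
// Assumptions (not from the announcement): the encoded frames are published as
// sensor_msgs/msg/CompressedImage on a topic named /image_compressed.
#include <memory>
#include <rclcpp/rclcpp.hpp>
#include <sensor_msgs/msg/compressed_image.hpp>

class CompressedStreamCheck : public rclcpp::Node
{
public:
  CompressedStreamCheck() : Node("compressed_stream_check")
  {
    sub_ = create_subscription<sensor_msgs::msg::CompressedImage>(
      "/image_compressed", rclcpp::SensorDataQoS(),
      [this](const sensor_msgs::msg::CompressedImage & msg) {
        // Log the declared format (e.g. "h264") and the payload size per frame.
        RCLCPP_INFO(get_logger(), "format=%s bytes=%zu",
                    msg.format.c_str(), msg.data.size());
      });
  }

private:
  rclcpp::Subscription<sensor_msgs::msg::CompressedImage>::SharedPtr sub_;
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  rclcpp::spin(std::make_shared<CompressedStreamCheck>());
  rclcpp::shutdown();
  return 0;
}
```

The node has no Isaac ROS dependency, so it builds in a regular colcon workspace and runs the same on Orin Nano and AGX Orin.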
Isaac ROS has been updated to 3.0.1 with the release of Isaac Sim 4.0, adding sensor simulation with NITROS to improve performance.
Simulated sensor data for a robot under test is typically passed over DDS from the simulator to the ROS 2 application across processes, using memory copies on the CPU. This is inefficient, as the simulated camera pixels are rendered on the GPU and processed by the Isaac ROS application on the GPU.
NITROS, using type adaptation (REP-2007), keeps simulated camera pixels in GPU memory when passing image topics from the simulator to the Isaac ROS application. For more details, see the tutorial.
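For readers who have not used REP-2007 before, here is a minimal, self-contained sketch of the mechanism in rclcpp (Humble or newer). The GpuImage struct and topic name are hypothetical placeholders, not the NITROS types; Isaac ROS ships its own adapted types, so this only illustrates how type adaptation lets a publisher and subscription exchange a custom, GPU-resident type and fall back to sensor_msgs/Image only when a plain ROS message is actually required.

```cpp
// Minimal sketch of REP-2007 type adaptation in rclcpp (Humble or newer).
// GpuImage is a hypothetical container for pixels resident in GPU memory;
// NITROS ships its own adapted types, this only illustrates the mechanism.
#include <cstdint>
#include <memory>
#include <string>
#include <rclcpp/rclcpp.hpp>
#include <rclcpp/type_adapter.hpp>
#include <sensor_msgs/msg/image.hpp>

struct GpuImage
{
  void * dev_ptr{nullptr};   // device pointer to pixel data (stays on the GPU)
  uint32_t width{0};
  uint32_t height{0};
  std::string encoding;
};

template<>
struct rclcpp::TypeAdapter<GpuImage, sensor_msgs::msg::Image>
{
  using is_specialized = std::true_type;
  using custom_type = GpuImage;
  using ros_message_type = sensor_msgs::msg::Image;

  // Called only when a subscriber actually needs a ROS message (e.g. across DDS).
  static void convert_to_ros_message(const custom_type & src, ros_message_type & dst)
  {
    dst.width = src.width;
    dst.height = src.height;
    dst.encoding = src.encoding;
    // A real adapter would cudaMemcpy the device buffer into dst.data here.
  }

  static void convert_to_custom(const ros_message_type & src, custom_type & dst)
  {
    dst.width = src.width;
    dst.height = src.height;
    dst.encoding = src.encoding;
    // A real adapter would upload src.data to GPU memory here.
  }
};

int main(int argc, char ** argv)
{
  rclcpp::init(argc, argv);
  // Intra-process communication lets adapted messages skip the conversions above.
  auto node = std::make_shared<rclcpp::Node>(
    "gpu_image_demo", rclcpp::NodeOptions().use_intra_process_comms(true));

  using AdaptedImage = rclcpp::TypeAdapter<GpuImage, sensor_msgs::msg::Image>;

  // Publisher and subscription exchange GpuImage directly, so pixels stay in GPU memory.
  auto pub = node->create_publisher<AdaptedImage>("image", 10);
  auto sub = node->create_subscription<AdaptedImage>(
    "image", 10,
    [logger = node->get_logger()](const GpuImage & img) {
      RCLCPP_INFO(logger, "got %ux%u image (%s)",
                  img.width, img.height, img.encoding.c_str());
    });

  GpuImage img;
  img.width = 1920;
  img.height = 1080;
  img.encoding = "rgb8";
  pub->publish(img);

  rclcpp::spin_some(node);
  rclcpp::shutdown();
  return 0;
}
```

With intra-process communication enabled, the conversion callbacks are skipped for in-process subscribers, which is what allows the simulator-to-Isaac-ROS path to keep camera pixels in GPU memory end to end.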