We’re happy to announce the ROS 2 release Dashing Diademata!
We’re especially excited to let you know that Dashing Diademata is the first long(er)-term support (LTS) release for ROS 2. After several years of development, and following a big boost in productivity over the past half year from new contributors, including the TSC membership, we’ve reached a level of maturity with ROS 2 such that we’re extending the support period for Dashing to be two years, through May 2021.
So whether you’re looking for a platform on which to build a new application, or planning to migrate an existing ROS 1 system, Dashing should be your starting point. Over the coming two years, we’ll be providing patches for Dashing. While we can’t guarantee API compatibility between ROS distributions, for the updates to Dashing we aim to maintain API and ABI stability. This matches what we’ve done in the past with ROS 1 LTS distributions.
To get an idea of what’s in this release and how to update existing code from ROS 2 Crystal, be sure to read the Dashing release page.
Here are a few features and improvements we would like to highlight in this release:
- Components are now the recommended way to write your node. They can be used standalone as well as composed within a process, and both approaches are fully supported.
- Intra-process communication (C++ only) has been improved, both in latency and in minimizing copies.
- The Python client library has been updated to match most of its C++ equivalent, and important bug fixes and improvements related to memory usage and performance have landed.
- Parameters are now a complete alternative to dynamic_reconfigure from ROS 1, including constraints such as ranges and read-only settings.
- By relying on (a subset of) IDL 4.2 for the message generation pipeline, it is now possible to use .idl files (besides .action files). This change brings support for optional UTF-8 encoding of ordinary strings as well as UTF-16 encoded multi-byte strings.
- Command line tools related to
- Support for Deadline, Lifespan & Liveliness QoS
- MoveIt 2.0 alpha release
- OpenEmbedded Thud (2.6) / webOS OSE as Tier 3 supported platforms
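The string-encoding change above can be illustrated with plain Python: ordinary `string` fields may now carry UTF-8 data, while `wstring` fields are UTF-16. This sketch only demonstrates the two encodings involved, not the actual rosidl serialization code:

```python
def encode_string(text: str) -> bytes:
    """Ordinary 'string' fields may carry UTF-8 encoded data."""
    return text.encode("utf-8")

def encode_wstring(text: str) -> bytes:
    """'wstring' fields are UTF-16 encoded multi-byte strings."""
    return text.encode("utf-16-le")

greeting = "héllo"  # non-ASCII content
utf8_bytes = encode_string(greeting)
utf16_bytes = encode_wstring(greeting)

# UTF-8 uses 1 byte per ASCII char and 2 for 'é'; UTF-16 uses 2 per char here.
assert len(utf8_bytes) == 6
assert len(utf16_bytes) == 10
assert utf8_bytes.decode("utf-8") == greeting
```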
We’re looking forward to getting your feedback and contributions, and to hearing about your new applications based on Dashing! If you have demonstrations of Dashing from your own work that you can share, feel free to post in this thread.
We also invite you to release your ROS 2 packages in Dashing! A huge thanks to all those who’ve already participated in our pre-release testing and packaging effort.
And finally the name of the next ROS 2 release scheduled for November 2019 will be:
Your friendly ROS 2 Team
P.S. Show your color and get a dashing T-Shirt / Hoodie.
Congratulations on this milestone! Acutronic Robotics would like to thank everyone in the community, and especially the folks from Open Robotics who took part in this release and helped put it together.
Some of us here at Acutronic have been contributing to ROS 2 for about 5 years, and this LTS release makes us especially proud of what’s been achieved. Our latest contributions are summarized in this article. We’d like to celebrate this launch by sharing some of the demos we’ve put together recently, all of them based on ROS 2 and powered by MARA, which uses ROS 2 native hardware. Enjoy!
Sensorless collision detection with ROS 2
Here we present a demonstration of a sensorless collision detection system for the MARA modular robot using some of the moveit_core submodules of MoveIt 2. The whole system is based on ROS 2 and has been tested with the Dashing Diademata pre-release, leveraging the real-time capabilities that our team is developing as part of the H-ROS communication bus for robots.
For more on this, refer to this article.
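As a rough illustration of the idea (not Acutronic's actual implementation, and not the MoveIt 2 API): sensorless collision detection typically compares the torque a dynamics model expects at each joint with the torque actually measured, and a large residual suggests contact. All names and the threshold below are hypothetical:

```python
def detect_collision(expected_torques, measured_torques, threshold=5.0):
    """Flag a collision when any joint's torque residual exceeds threshold.

    expected_torques: model-predicted joint torques (N*m)
    measured_torques: torques reported by the joint drives (N*m)
    """
    residuals = [abs(m - e) for e, m in zip(expected_torques, measured_torques)]
    return max(residuals) > threshold, residuals

# Free motion: measurements track the model closely, so no collision flag.
in_contact, _ = detect_collision([1.0, 2.0, 0.5], [1.2, 2.1, 0.4])
assert not in_contact

# Contact: one joint sees a torque spike the model cannot explain.
in_contact, residuals = detect_collision([1.0, 2.0, 0.5], [1.2, 9.5, 0.4])
assert in_contact
```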
Planning to a joint-space goal (first demonstrator of MoveIt 2)
We present the first demonstrator of the capabilities of MoveIt 2 by showing how to plan to a joint-space goal and how to reproduce it with Dashing. Refer to https://github.com/AcutronicRobotics/moveit2/releases/tag/moveit_2_alpha for the alpha release of MoveIt 2.
Read more in this article.
Reinforcement Learning with Gazebo and ROS 2 in an industrial robot
AI researchers benchmarking formal methods against sub-symbolic ones often hit hurdles at the robot interfaces (real vs. simulated). Here we demonstrate how, through our contributions (the ROS2Learn framework and the gym-gazebo2 toolkit), transferring a learned policy from simulation to a robot has become easier. The video shows how we can accurately replicate the behavior demonstrated in simulation on a real robot, using the same ROS 2-powered interfaces.
Get to know more about RL in ROS 2 here.
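The sim-to-real point above hinges on both the simulated and the real robot exposing the same reset/step interface. Below is a toy stand-in environment with that Gym-style surface; the 1-D reach dynamics and the hand-written policy are purely illustrative, not the real MARA simulation:

```python
class ToyReachEnv:
    """Toy 1-D reach task with a Gym-style reset/step interface."""

    def __init__(self, goal=5.0):
        self.goal = goal
        self.pos = 0.0

    def reset(self):
        self.pos = 0.0
        return self.pos

    def step(self, action):
        self.pos += action
        reward = -abs(self.goal - self.pos)   # closer to the goal is better
        done = abs(self.goal - self.pos) < 0.1
        return self.pos, reward, done, {}

# The same policy code can later run against the real robot because both
# sides speak the same interface -- the core point of the sim-to-real work.
env = ToyReachEnv()
obs = env.reset()
for _ in range(100):
    action = 0.5 if obs < env.goal else -0.5  # trivial hand-written policy
    obs, reward, done, _ = env.step(action)
    if done:
        break
assert done
```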
Multi-robot coordination 1: Distributed millisecond-level control of 30 joints with ROS 2 (with challenging network traffic in the background)
Getting different robots to coordinate precisely is one of the challenges of system integration. It becomes especially tedious when each robot uses proprietary interfaces: a complete integration hell. Here we demonstrate how 5 robots running ROS 2 natively are controlled from a single computer with millisecond-level precision while the network is loaded with simulated depth-sensor traffic. Through ROS 2, the H-ROS robot bus provides real time and synchronization, allowing simultaneous control of 30 joints (5 robots, each with 3 joint modules of 2 DoF).
Read more in this article.
Multi-robot coordination 2: ROS 2 empowered distributed synchronization of industrial robots
Synchronization and repeatability are essential for industrial robots to be reliable. These MARA modular arms coordinate precisely thanks to the H-ROS communication bus, which, powered by ROS 2, provides sub-microsecond, distributed synchronization.
Read more in this article.
We look forward to what ROS 2 is bringing to the overall robotics ecosystem and hope to continue contributing. Cheers to the Dashing release, and cheers to ROS 2!
Congratulations on the LTS release! Thank you very much to all contributors and to the team at Open Robotics!
Micro-ROS uses the ROS 2 stack to bridge the gap between powerful microprocessors and embedded microcontrollers (MCUs). The two major goals are:
- Integrate the different types of computing platforms seamlessly
- Ease the portability of ROS code to microcontrollers
Thanks to the well-designed abstractions in the ROS 2 stack, the rmw and rcl layers can be used essentially unchanged on MCUs. At the middleware level, the upcoming DDS-XRCE standard allows communication from and with MCUs using only a few tens of kilobytes of RAM.
Although micro-ROS started as a joint endeavor by five companies and institutions (eProsima, Acutronic Robotics, ŁUKASIEWICZ - PIAP Institute, Bosch, and the FIWARE Foundation) in the context of a European project, we strive to incorporate the ROS community as early as possible. Join the ROS 2 Embedded SIG to learn more!
As a first community use-case, we brought micro-ROS to an STM32 F4 and created a tiny demo using the Kobuki Turtlebot 2 with it.
Check it out at https://github.com/micro-ROS/micro-ROS_kobuki_demo/.
Further use cases with a modular manipulator, a lawnmower robot, the integration of a drone autopilot, and a robot operating in a smart warehouse will be developed and demonstrated over the next 18 months.
We have also prepared another small demonstration. In this case, we show a micro-ROS temperature publisher.
Check the next video to see how it works:
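To give a feel for why publishing from an MCU is feasible in a few tens of kilobytes: a temperature sample serializes into a handful of bytes. The sketch below uses Python's struct purely to illustrate a compact fixed-size payload; it is not the actual DDS-XRCE wire format, and the field layout is an assumption:

```python
import struct

def pack_temperature(stamp_sec: int, celsius: float) -> bytes:
    """Pack a timestamped reading into 8 bytes (uint32 + float32, little-endian)."""
    return struct.pack("<If", stamp_sec, celsius)

def unpack_temperature(payload: bytes):
    """Recover (stamp_sec, celsius) from an 8-byte payload."""
    return struct.unpack("<If", payload)

payload = pack_temperature(1559000000, 23.5)
assert len(payload) == 8          # tiny, fixed-size message
sec, temp = unpack_temperature(payload)
assert sec == 1559000000 and abs(temp - 23.5) < 1e-6
```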
Congratulations on the release! Thank you to Open Robotics for their continued work on coordinating all the ROS2 efforts.
LG Electronics has been building LGSVL Simulator, an autonomous vehicle simulator that is compatible with ROS2. Our simulator is Unity-based and features photorealistic environments, high-performance sensors, and a Python API to control non-ego vehicles, objects, and configurations. Through our use of ros2-web-bridge, anyone can connect their ROS2-based autonomous driving (AD) stack to our tool to test and speed up development. The simulator provides sensor input to the ROS2-based AD stack, and the AD stack provides control commands back to the simulator.
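For anyone curious what the bridge traffic looks like: ros2-web-bridge speaks the rosbridge v2 JSON protocol, so a client publishes by sending an `"op": "publish"` message over the websocket. A minimal sketch of such a message; the topic name and fields here are illustrative, not the simulator's actual interface:

```python
import json

# A rosbridge-protocol publish message: "op" selects the operation,
# "topic" the destination, and "msg" carries the serialized fields.
control_msg = {
    "op": "publish",
    "topic": "/vehicle_cmd",
    "msg": {"steering": -0.12, "throttle": 0.3},
}
wire = json.dumps(control_msg)      # what actually goes over the websocket

decoded = json.loads(wire)
assert decoded["op"] == "publish"
assert decoded["msg"]["throttle"] == 0.3
```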
One way we have been using LGSVL Simulator is for deep learning training. Here we show a video of a lane following test run after training a deep learning model in the LGSVL Simulator. ROS2 is used to run a simplified autonomous driving stack that publishes steering and throttle commands to the vehicle inside the simulator, taking only images from the camera inside the virtual vehicle as input. The end-to-end lane following model was trained in various environmental conditions within the simulator, and is robust to weather, time-of-day, and visibility conditions.
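The pipeline above can be caricatured in a few lines: camera image in, steering and throttle out. The "model" below is a deliberately dumb stub that steers toward the brighter half of the image; the real system uses a trained deep network, so everything here is illustrative:

```python
def predict_controls(image_rows):
    """Map a 2-D list of pixel intensities (0-255) to (steering, throttle)."""
    width = len(image_rows[0])
    left = sum(sum(row[: width // 2]) for row in image_rows)
    right = sum(sum(row[width // 2:]) for row in image_rows)
    total = (left + right) or 1           # avoid division by zero
    steering = (right - left) / total     # steer toward the brighter side
    throttle = 0.3                        # constant cruise command
    return steering, throttle

# Brightness concentrated on the right half yields positive (rightward) steering.
bright_right = [[0, 0, 200, 200], [0, 0, 200, 200]]
steering, throttle = predict_controls(bright_right)
assert steering > 0 and throttle == 0.3
```

In the actual stack these outputs would be published as ROS 2 messages back into the simulator, closing the loop.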
If you’re interested in getting started with machine learning research using ROS2, the full documentation and guide to this project have been posted here, and you can see the code and trained model on Github.
We also have a tutorial on getting started with creating your own AD stack in ROS2 with LGSVL Simulator. Thanks!
Nice! Is it working faster than with ROS1 bridge?
Congratulations to the ROS community and to Open Robotics on reaching this new milestone!
Here at Apex.AI we are pretty excited about this release and wrote this Blog post to celebrate.
As noted in the Blog post, Apex.AI and Tier IV have now contributed a ROS 2 based 3D perception library for automotive applications to Autoware.Auto, which is shown in this short video.
For anyone who does not know what we do, in brief, Apex.AI is building Apex.OS, which is API-compatible to ROS 2, runs in hard real-time and is being certified to the highest level of the automotive safety norm ISO 26262 (ASIL-D).
In case anyone needs some advice with transitioning from ROS 1 to ROS 2, we previously also wrote this detailed Blog post describing how to transition software from ROS 1 to ROS 2.
Congratulations to Open Robotics on hitting this big milestone. Here at Rover Robotics (www.roverrobotics.com) we are excited to announce our new series of demos that we are creating in conjunction with Open Robotics and Amazon AWS to showcase ROS 2 running on industrial-grade, reliable hardware. Our robots were originally designed for SWAT teams so they are built to be really rugged! We have over 10 years of experience in fielding reliable robots and we are excited to show the world what reliable hardware can do when you have reliable software backing it.
Our demo series will cover how to set up and use common ROS 2 packages such as Google Cartographer, AMCL, RViz, and Gazebo. We will be contributing to the sensor drivers needed for these demos to ensure they are reliable for all to use, and finally we will hook everything up to AWS RoboMaker for a professional workflow, professional-grade security, and a reliable back end. Our first full write-up will be released on June 30th.
Our robots come in 3 different drivetrains. Visit our website to check out more about their features and specifications.
I have a working example for publishing and subscribing directly to ROS2 (crystal) from Unity using a custom C# rcl client library and message generator. This approach should have significantly better performance than rosbridge.
We’re all excited about Dashing Diademata!
btw, the Dashing Diademata logo gets cropped by social media because the image is 718x1000 (portrait). Social media (Twitter, LinkedIn, et al.) need something between square and landscape, so poor Dashing keeps getting his head lopped off.
So I “fixed it”: download here. I’m an engineer, not marketing, so I don’t claim perfection, merely that a square image with space at top and bottom should solve it?
Arghh! … almost, needs to be more landscape
@joespeed I roughly measured the ratio from your LinkedIn screenshot and resized the width of the canvas of your file accordingly:
Here’s a file for sharing: dashing_2449x1280.png
(edit: Tested with LinkedIn and Twitter, updated the file)
ADLINK Technology is very excited about ROS 2 Dashing! We’ve been working with ROS 2 since its inception. Dashing is supported in ADLINK ROScube ROS 2 controllers (HW + SW), the Neuron SDK (HW-optimized ROS 2 Dashing), the open and commercial OpenSplice DDS used in AVs, industrial, and military applications, and the Eclipse Foundation’s small and fast Eclipse Cyclone DDS rmw_cyclonedds, to which we contribute. Here’s an ADLINK ROS 2 Dashing demo from the IEEE ROS 2 Summit last month with AWS RoboMaker and Gazebo.
Here is Bill Wang of ADLINK’s Robotics Group talking about ROS 2 and our products yesterday at Intel AI IoT DevFest IV
ADLINK makes robot controller modules appropriate for ROS 2 Dashing, ranging from credit-card-sized ARM and Atom boards to Intel Core i3/i5/i7 with optional Movidius Myriad X VPUs, rugged industrial NVIDIA GPUs, and NVIDIA Xavier.
Here’s more from last month’s IEEE ROS 2 Summit, put together with the help of many. You’ll notice familiar faces, including @gerkey, as well as ROS 2 Dashing demos. 600 developers and roboticists made it … the biggest ROS 2 event ever?
We are happy to announce TurtleBot3 ROS 2 Dashing Release.
This update includes:
- OpenCR (embedded board) communicates with turtlebot3_node via the Dynamixel protocol (for more details, please refer to the attached picture below)
- hls_lfcd_lds_driver package was ported to ROS 2 Dashing
- Added some services (/sound, /motor_power, /reset)
- Added some parameters
- Applied message filter to calculate
For more details, please refer to this link