AWS recently published a blog post describing how to deploy and manage ROS applications with AWS IoT Greengrass 2.0 and Docker.
AWS IoT Greengrass 2.0 is an open-source edge runtime and cloud service that reduces the complexity of deploying and managing ROS applications on robots. With AWS IoT Greengrass 2.0, developers can add local compute, messaging, and data management capabilities to their robot fleets. This helps developers reliably deploy updates, remotely manage software, and securely connect robots to cloud services. You can use AWS IoT Greengrass 2.0 to:
- Remotely manage the application lifecycle (install, run, and shutdown) on robots
- Deploy software updates over-the-air to fleets of robots
- Securely connect robots to operator-facing dashboards and monitoring tools
- Deploy machine learning (ML) models and run ML inference at the edge for tasks like object detection and image classification
- Ingest large amounts of raw telemetry, sensor data, and logs into cloud-based stream processing services like Amazon Kinesis
The sample application runs three containers using a Docker Compose file. Two of the containers run the ROS 2 demo_nodes_cpp ROS package with a talker and a listener. This demo uses local ROS messaging to send and receive a “Hello World” message over the ROS topic /chatter. A third container uses the AWS IoT Greengrass SDK to bridge messages published over the ROS topic /chatter to a local socket used by AWS IoT Greengrass for interprocess communication between components. AWS IoT Greengrass then relays the message to the cloud over an MQTT topic named chatter.
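The three-container layout described above could be sketched in a Docker Compose file roughly like the following. The image names, service names, and bridge script path are illustrative assumptions, not taken from the actual sample code; the two environment variables are the ones AWS IoT Greengrass 2.0 injects so a component can reach the IPC socket:

```yaml
# Sketch only — service/image names and the bridge script path are assumptions.
version: "3"
services:
  talker:
    image: ros:foxy                      # any ROS 2 image that ships demo_nodes_cpp
    command: ros2 run demo_nodes_cpp talker
  listener:
    image: ros:foxy
    command: ros2 run demo_nodes_cpp listener
  bridge:
    image: chatter-bridge:latest         # hypothetical image with the AWS IoT Greengrass SDK
    command: python3 /app/chatter_bridge.py
    environment:
      # Provided by AWS IoT Greengrass so the component can authenticate
      # to the local IPC socket and publish to the cloud MQTT topic.
      - AWS_GG_NUCLEUS_DOMAIN_SOCKET_FILEPATH_FOR_COMPONENT
      - SVCUID
```

Listing the two environment variables without values passes them through from the Greengrass component environment into the bridge container.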
To check out the open-source sample code:
To read the full blog post:
If you are interested in the solution, feel free to reach out!