
SD-TwizyModel in Autoware.AI

We recently proposed an implementation of the SD-TwizyModel in Autoware.AI in Issue #2.

Find the complete implementation at Merge Request !32

This feature implementation of the SD-TwizyModel in Autoware.AI includes:

  • Kinematic and dynamic robot description
  • Collision model
  • Visual representation
  • Various Gazebo worlds (optional)
  • Model control with a joystick
  • A node to bridge the Autoware.AI framework and the plugin

urdf_graph.pdf (22.9 KB)

Simulated sensors

  • Velodyne VLP-16
  • 8 x Blackfly S 2.3 MP cameras


1. Joystick

The robot supports generic Linux joystick controllers. The sd_control package contains a node that turns joystick commands into control messages driving the throttle and steering of the model.

  • Input: /joy
  • Output: /sd_control
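The mapping performed by the joystick node can be sketched as follows. This is a minimal pure-Python sketch, not the sd_control package's actual code: the axis indices, the deadzone value, and the output field names are all assumptions for illustration.

```python
# Hypothetical sketch of a joystick-to-control mapping: pick throttle and
# steering axes out of a /joy-style axes array and apply a small deadzone.
# Axis indices, deadzone, and field names are assumptions, not the
# sd_control package's actual configuration.

THROTTLE_AXIS = 1   # assumed: left stick, vertical
STEER_AXIS = 3      # assumed: right stick, horizontal
DEADZONE = 0.05     # assumed: ignore tiny stick noise around center

def joy_to_sd_control(axes):
    """Map joystick axes (each in [-1, 1]) to percent throttle/steer."""
    def shaped(value):
        # Zero out values inside the deadzone, otherwise scale to percent.
        return 0.0 if abs(value) < DEADZONE else value * 100.0
    return {"torque": shaped(axes[THROTTLE_AXIS]),
            "steer": shaped(axes[STEER_AXIS])}
```

In the real node this function body would sit inside a subscriber callback on /joy, with the result published on /sd_control.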

2. Autoware.AI

  • Input: /ctrl_cmd
  • Output: /sd_control
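The bridge node's conversion can be sketched in the same way: Autoware.AI publishes a control command carrying a target linear velocity (m/s) and steering angle (rad), which the bridge scales into the percentage-style drive commands the model expects. The constants, clamping, and field names below are assumptions for illustration, not the actual /sd_control interface.

```python
# Hypothetical sketch of the Autoware.AI -> model bridge mapping: convert
# a /ctrl_cmd-style command (linear velocity in m/s, steering angle in rad)
# into percent throttle/steer values. MAX_SPEED_MPS, MAX_STEER_RAD, and the
# output field names are assumptions, not the real bridge's constants.

MAX_SPEED_MPS = 22.0   # assumed top speed of the Twizy (~80 km/h)
MAX_STEER_RAD = 0.61   # assumed maximum road-wheel angle (~35 deg)

def ctrl_cmd_to_sd_control(linear_velocity, steering_angle):
    """Map an Autoware-style control command to percent throttle/steer."""
    def clamp(x, lo, hi):
        return max(lo, min(hi, x))
    throttle = clamp(linear_velocity / MAX_SPEED_MPS, -1.0, 1.0) * 100.0
    steer = clamp(steering_angle / MAX_STEER_RAD, -1.0, 1.0) * 100.0
    return {"torque": throttle, "steer": steer}
```

In the real node this conversion would run in a subscriber callback on /ctrl_cmd, with the result published on /sd_control, so Autoware.AI's planner drives the simulated vehicle exactly as the joystick does.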

We believe it’s important for any self-driving software stack to have a realistic simulation environment. This vehicle model provides a well-documented and comprehensive interface with Autoware.AI, and the node that bridges Autoware.AI with the SD-TwizyModel facilitates implementing and testing Autoware.AI features.

Please report any bugs in the Issues section of the original StreetDrone GitHub repo.

More information on how to use and launch the model:
Blog post