Azure Kinect ROS sensor driver is now available

Hi everyone,

I’m happy to announce that we’ve just released a ROS node adding support for the new Azure Kinect Developer Kit (https://azure.microsoft.com/en-us/services/kinect-dk/).

This node provides compatibility with ROS Melodic Morenia on both Ubuntu 18.04 and Windows 10. It provides access to a number of sensor streams from the Azure Kinect, including:

  • Raw infrared, color, and depth Images
  • Registered depth and color Images
  • A PointCloud2, optionally colored using the color camera
  • An IMU stream
  • Factory-captured intrinsic and extrinsic calibration data for the color and depth cameras, as well as the IMU
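
As a rough illustration of what the point cloud involves under the hood: each depth pixel is back-projected through the camera intrinsics published with the calibration data. A minimal sketch of that pinhole math (the intrinsic values below are hypothetical placeholders, not the Azure Kinect’s real calibration):

```python
# Back-project a depth pixel (u, v) with depth z into a 3D point.
# fx, fy, cx, cy are HYPOTHETICAL placeholders -- the real values come
# from the driver's factory-captured calibration (CameraInfo messages).
fx, fy = 504.0, 504.0   # focal lengths in pixels (hypothetical)
cx, cy = 320.0, 288.0   # principal point (hypothetical)

def back_project(u, v, z):
    """Pinhole model: pixel coordinates + depth -> 3D point (meters, camera frame)."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return (x, y, z)

# A pixel at the principal point maps straight down the optical axis:
print(back_project(320.0, 288.0, 1.5))  # -> (0.0, 0.0, 1.5)
```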

The source code for the node is available here:

We’re looking forward to hearing your feedback!


That looks super cool! Are there any plans to share some bag files created with this sensor? Would love to see it in action!

I don’t have plans to release any bag files at the moment: releasing recorded camera data is a bit tricky. However, we have published some sample recordings in the native Azure Kinect recording format (mkv). It would be possible to add Azure Kinect mkv playback to the ROS node so that it can play Azure Kinect recordings into ROS.

You can find the example recordings here: https://www.microsoft.com/en-us/download/details.aspx?id=58385&WT.mc_id=

And information on the playback API can be found here: https://docs.microsoft.com/en-us/azure/Kinect-dk/record-playback-api

Is that a technical limitation, or a legal one?

@gavanderhoorn The tricky part is due to the group policy on videos. My team will be working with the Kinect in a different environment next week and will capture rosbags and video.

Ok, clear. Thanks Lou.

Looking forward to those bags.

Does it support ROS Kinetic?

Not officially, no. The underlying Azure Kinect SDK only officially supports Ubuntu 18.04. The ROS driver only supports the matching release for Ubuntu 18.04, which is Melodic.

Anecdotally, I’ve heard that some people have had success getting this working on Kinetic, but you’d need to first install the Sensor SDK and get that working before you could try to build the ROS Driver.

Where can I get more information about things like bandwidth requirements, outdoor performance, and whether multiple units can observe the same area?

More information about the Azure Kinect itself can be found on its docs page.

To answer your specific questions:

  • Bandwidth: The Azure Kinect requires USB 3.0 bandwidth. To my knowledge, this is a hard requirement: the camera won’t function at all on USB 2.0.
  • Multiple cameras: There’s support in the SDK for running multiple cameras in the same space by using a sync cable to avoid interference. More information about the various multi-camera setups can be found here and here. There isn’t support for multi-camera mode in the ROS driver, but it would be reasonably simple to add it.
  • Outdoor performance: I’ve never used the Azure Kinect outside, but it’s fundamentally similar to other IR-projector depth cameras, which typically have poor performance outside.
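
To put the USB 3.0 requirement in perspective, here’s a quick back-of-envelope data-rate estimate. The mode sizes below are assumptions (roughly a 640×576 16-bit depth stream plus uncompressed 1080p BGRA color at 30 fps); check the hardware documentation for the exact modes:

```python
# Rough back-of-envelope for why USB 3.0 is needed. The stream dimensions
# are ASSUMPTIONS, not exact Azure Kinect mode specs.
MB = 1024 * 1024

depth = 640 * 576 * 2 * 30      # bytes/s: 16-bit depth at 30 fps
color = 1920 * 1080 * 4 * 30    # bytes/s: uncompressed BGRA color at 30 fps
total_mb_s = (depth + color) / MB

usb2_practical_mb_s = 35        # rough practical USB 2.0 throughput

print(f"depth: {depth / MB:.1f} MB/s, color: {color / MB:.1f} MB/s, "
      f"total: {total_mb_s:.1f} MB/s")
print("fits USB 2.0?", total_mb_s < usb2_practical_mb_s)
```

Even with generous assumptions, the uncompressed streams land well beyond what USB 2.0 can carry.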

I see. That makes a lot of sense. Thank you for the response and the links.

The D435 actually has really good performance outdoors with the projector turned off (as far as RealSenses go, anyway).

Yes, the newer RealSense cameras (like the ZR300 and the D435) work outdoors since they can fall back to a pure-stereo depth approach when their IR projectors get washed out by the sun.

Azure Kinect doesn’t have a stereo fallback mode, so once the IR projector gets washed out there’s no more depth data.

Something I’ve not yet tried is using depth_image_proc/disparity to extract stereo depth from the IR and RGB cameras. If the scene was well lit with both IR and visible light (outdoors in sunlight) you might be able to recover some depth data.
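
For anyone curious what that would buy you, stereo depth recovery boils down to Z = f·B/d. A tiny sketch with hypothetical numbers (the real focal length and IR-to-RGB baseline would come from the factory extrinsics the driver publishes):

```python
# Stereo depth from disparity: Z = f * B / d.
# All values here are HYPOTHETICAL -- the real focal length and
# IR-to-RGB baseline come from the factory calibration.
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Depth in meters for a pixel with the given disparity in pixels."""
    return focal_px * baseline_m / disparity_px

# e.g. f = 500 px, baseline = 0.05 m, disparity = 10 px
print(depth_from_disparity(500.0, 0.05, 10.0))  # -> 2.5
```

Note how small the assumed baseline makes the depth resolution: a one-pixel disparity error at that range shifts the estimate by tens of centimeters, so don’t expect projector-quality depth.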

I’m sure that whatever the Kinect lacks in capability because of being a structured-light sensor, it more than makes up for in accuracy (these point clouds look beautiful!)