Cameras with 360º FoV in ROS


I noticed that more and more consumer-grade cameras with 360º FoV are available (Samsung Gear 360, Nikon KeyMission, Kodak PIXPRO 360, etc.).

I wonder if anyone has successfully managed to do live streaming from these cameras in Ubuntu and to use them in a robotic application.





Disclaimer: self-promotion ^^

A few years back I was playing with 360º imaging using two uEye UI-3240CP cameras (IDS Imaging), each mounted with a 185º FoV fisheye lens.
You can find short videos here.
Since then, plenty of nice cameras have come to the market, as you mentioned; unfortunately I didn’t get to play with any of them.



I’m not sure about those cameras, but e-con Systems makes a rig for the TX1 that supports six cameras to provide a 360º FoV.

I can only talk about the Kodak PIXPRO SP360 (and 4K). They work as normal USB cameras. But connecting them via USB doesn’t give them enough current to keep on forever (they drain more than they charge). Maybe fixable with some kind of powered USB cable?

My little work can be found here:

I have some brief experience with the SP360 (firmware version 1.0.5). I think it is important to highlight the differences between the SP360 and the SP360 4K, since AFAIK only the SP360 4K appears as a V4L2 device.

The USB interface on the SP360 serves only for storage access. The only way I have found to stream the live image is to set the cam to WiFi mode and sniff the MJPEG stream. Some instructions on that can be found here. Since this stream is actually meant for live preview in a mobile app, the quality and resolution are far from what the cam is actually capable of. Although the cam has an HDMI output, it is only enabled in playback mode and does not provide a live feed.
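For reference, once you have that sniffed MJPEG byte stream, pulling individual frames out of it is mostly a matter of splitting on the JPEG start/end markers. A minimal Python sketch (the cam’s WiFi address and any HTTP handshake the mobile app performs are not shown here, and a naive marker scan like this can in principle be fooled by embedded thumbnails):

```python
def extract_jpeg_frames(buffer: bytes):
    """Split a raw MJPEG byte stream into complete JPEG frames.

    JPEG frames start with the SOI marker (FF D8) and end with EOI (FF D9),
    and an MJPEG preview stream is essentially such frames back to back.
    Returns (frames, leftover), where leftover is a trailing partial frame
    to be prepended to the next chunk read from the socket.
    """
    frames = []
    while True:
        start = buffer.find(b"\xff\xd8")
        if start < 0:
            return frames, b""
        end = buffer.find(b"\xff\xd9", start + 2)
        if end < 0:
            return frames, buffer[start:]
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]
```

In practice you would call this repeatedly on data read from the socket, carrying the leftover bytes over between reads.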

The SP360 4K has webcam functionality and also enables live streaming over HDMI (so external capture cards are an option as well).



Does anyone know if the Kodak Orbit 4K is compatible with this ROS node, or at least with Linux?

I think @veiko was being too modest. Not only did he get the Kodak 360 camera streaming, he wrote RViz plugins to blend and display a spherical image (requires 2 cameras, ~1 hemisphere each) and use a VR headset.

Spherical image:



Hi, we are using the RICOH Theta S.
We can use the libuvc_camera package to obtain images via USB streaming.



Is the RICOH Theta S working well with the UVC driver on Linux?
We bought a RICOH Theta V only to realize that it uses UVC 1.5, which is not supported. I can’t seem to find any confirmation besides your post about the compatibility of the “S” model, which uses UVC 1.1, and it would be great if you could confirm that it is working fine before we buy another camera :slight_smile:


We are using the RICOH Theta S with libuvc_camera, using UVC 1.1 (Motion JPEG mode). The frame is a combined 1280x720 image at 14 fps. I think you can also use other packages like gscam or usb_cam.
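For anyone wondering what to do with that combined frame: the two fisheye images sit side by side in the single 1280x720 image, so the first processing step is usually just slicing it in half. A minimal NumPy sketch (the actual stitching into an equirectangular panorama needs per-lens calibration and blending, which is not shown):

```python
import numpy as np

def split_dual_fisheye(frame: np.ndarray):
    """Split a side-by-side dual-fisheye frame into its two per-lens halves.

    Assumes the two fisheye images sit left/right in one combined frame,
    as in the Theta S's 1280x720 Motion JPEG output described above.
    """
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:]

# e.g. on a dummy 720x1280 RGB frame, each half comes out 720x640
left, right = split_dual_fisheye(np.zeros((720, 1280, 3), np.uint8))
```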

I didn’t know about the RICOH Theta V, but do you mean you cannot make the Theta V work with the current driver and packages? I expected the V model to be backward compatible with the Theta S… but checking the spec sheet now, it seems to support only H.264 streaming… (which means we need UVC 1.5 or higher, which is not supported by libuvc :slightly_frowning_face:)

Thank you for the confirmation! We will go ahead with the S model then.

I will check periodically on the status of UVC 1.5 support and post here when I can make the V model work. I assumed compatibility as well, but they dropped support for 1.1, which is my bad; I should have checked before buying.

I seriously doubt my boss will allow me to allocate time to develop the driver, but if it happens I will post the news here.

It would be wonderful if we could use UVC 1.5 and H.264 encoding. I’m waiting for good news!

What do you mean by a combined 1280x720 image, the equirectangular 360º panorama?

Also, how does the integration work? Can I plug it into a computer and have the computer handle everything (turning it on/off, modes, data handling, etc.)?


Has anyone had any success running the Theta S or V with UVC 1.5?

I think live streaming with the Ricoh Theta V is possible now. I will update soon with my results.


Do you have an update, by any chance?
And if it works, what is the maximum resolution for video streaming using UVC?

We use the Point Grey (now FLIR) Ladybug3. It connects via FireWire and we use the camera1394 driver. Reading images works well: you get the six synchronized images as one very tall image in which all cameras are “stacked” together, so afterwards we do our own postprocessing to “saw” this image into six normal images. From there we blend a panorama, or we can emulate a virtual camera looking somewhere inside the captured “sphere”.
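A sketch of that “sawing” step, assuming the driver stacks the six equally sized sensor images vertically (the exact layout and image heights may differ per camera mode, so treat this as illustrative):

```python
import numpy as np

def unstack_ladybug(tall: np.ndarray, n_cams: int = 6):
    """Saw the single tall composite image back into per-camera images.

    Assumes the driver stacks the n_cams sensor images vertically with
    equal heights, matching the "one very tall image" described above.
    """
    h = tall.shape[0] // n_cams
    return [tall[i * h:(i + 1) * h] for i in range(n_cams)]
```

Each returned slice is a view into the original array, so this costs no extra copies before the per-camera processing.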

For processing of the images, we use and . This software is not really released, but if you’re interested, I can share you ZIPs of the package sources for Melodic.

I am now trying to receive Theta V camera data.
Did you succeed in receiving data in an Ubuntu environment?

We at Delivers.AI recently put together a report on this, so I am sharing it here.


Since this report also considers software and explicitly talks about fisheye lenses and the need for projection, I wanted to link

which is a package that can create virtual cameras from any number of real cameras.
Also, most cameras don’t need an explicit ROS driver but work with a generic UVC or GigE ROS driver (I’m saying that because in the report some cameras were noted as not having a specialized ROS driver; chances are they don’t even need one).
