OpenCV AI Kit (OAK)

As far as I know, the die-package sensors are designed for mobile devices, which have strict low-profile requirements. The light enters the lens at a steep angle in order to cover the whole sensor surface, so the sensor’s CRA (Chief Ray Angle, set by the micro-lenses on top of the sensor) should match the CRA of the lens on the module. That’s why the OV9282 is designed with a larger CRA, although its electrical performance is the same as the OV9281.
Actually, the fisheye module uses the OV9282; sorry for the confusion.

2 Likes

Thanks for both! We’ll be trying these out in Altium today.

Fits almost perfectly. It’s just happenstance that the existing mounting hole lines up with the lens-housing mounting hole, but nice! We could make room for this on the design pretty easily, I think. We’d need to move a few passive components and punch a hole (it doesn’t need to be plated or have an annular ring).



This is just the physical M12 housing - the connector/MIPI pinout would still need to be figured out. Likely shorter cables, and then either changing the connector on the OAK-D or, if possible, changing the connector on the ArduCam module.

Thoughts?

1 Like

@ArduCAM @Luxonis-Brandon We’re calling y’all the postmen, because y’all deliver! Expedited shipping.

Now to figure out an open source calibration routine. I think the camera calibration packages could probably use some ROS 2 love.
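For anyone curious what such a calibration routine ultimately estimates, here is a minimal pinhole-projection sketch of the quantities involved (intrinsics K, extrinsics R/t). The intrinsic values below are hypothetical, loosely chosen for a 1280x800 OV9282-class sensor, and distortion is ignored; this is an illustration of the model, not any particular package’s implementation:

```python
import numpy as np

def project(K, R, t, pts3d):
    """Project 3D points (N,3) into pixel coordinates with a pinhole model."""
    cam = R @ pts3d.T + t.reshape(3, 1)  # world frame -> camera frame
    uv = K @ cam                         # apply intrinsics
    return (uv[:2] / uv[2]).T            # perspective divide -> (N,2) pixels

# Hypothetical intrinsics (illustrative only, not a measured calibration)
K = np.array([[570.0,   0.0, 640.0],
              [  0.0, 570.0, 400.0],
              [  0.0,   0.0,   1.0]])
R, t = np.eye(3), np.zeros(3)

pts = np.array([[0.0, 0.0, 1.0],   # on the optical axis
                [0.1, 0.0, 1.0]])  # 10 cm to the side, 1 m away
px = project(K, R, t, pts)
# the on-axis point lands at the principal point (640, 400)
```

A calibration routine runs this projection in reverse: given many observed pixel/3D correspondences (e.g. checkerboard corners), it solves for the K, R, t that minimize the reprojection error.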

4 Likes

Heh. Thanks, and agreed WRT ArduCam. Really looking forward to working with them on this. The NDVI applications alone, enabled by lenses with filters, will be super cool.

1 Like

Hi guys, I’ve checked in a working first draft of ROS2 support for the OpenCV AI Kit. It can be found here along with some setup instructions:

It’s essentially a ROS2 wrapper for the Python interface defined here: https://github.com/luxonis/depthai. It’ll broadcast a topic for each stream specified with the cliArgs parameter, and it will accept any input parameter that depthai-demo.py takes. The included demoListen component can be used as an example of how to receive those topics in your own ROS2 nodes.

For a more detailed list of the arguments that can be passed, see the depthai-demo.py help, or add ‘help’ as a cliArg to the depthai_wrapper talker component.

I’ll be offline for about a week but am interested in any comments and opinions you guys might have. I’m pretty new to both ROS2 and Python so I’d very much welcome suggestions and criticism.

Thanks for reading!

2 Likes

Hi @FPSychotic,

Thanks for reaching out and for the kind words.

To summarize the requests:

  1. Dynamic calibration, where intrinsics are calibrated at the factory but extrinsics can be recalibrated without a calibration target. We will work to support this, though timing is TBD.
  2. Color camera that matches the grayscale camera resolution exactly. We are already working on this with ArduCam: the OV9782 is color, global shutter, and the exact same resolution etc. as the OV9282 grayscale, so we plan to make it an option. In that permutation, all 3 cameras would be the exact same resolution, view angle, etc., and all global shutter. We will likely offer this in both integrated-camera-module and M12-mount variants (so you can use your own view angle/fisheye/etc.).
  3. Synchronization between multiple OAK-Ds. We have I2C, UART, and SPI brought out on our System on Module (which OAK-D is built around) for this (and other) purposes: https://shop.luxonis.com/collections/all/products/bw1099 That said, we haven’t investigated this yet. The Gen2 Pipeline Builder we are making is quite relevant, though, as it will give user code inside the Myriad X access to the SPI/UART/I2C interfaces: https://github.com/luxonis/depthai/issues/136 Note that this Gen2 pipeline builder functionality is planned for December, and perhaps it could be used for this timing/sync. We’d need to build and test it first, and then see whether it is precise enough.
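On point 1, one common way a target-free extrinsics check can work (a sketch of the general idea, not necessarily Luxonis’s implementation): in a correctly rectified stereo pair, matched features lie on the same image row, so residual vertical disparity is a cheap drift signal that can trigger recalibration. The keypoint values below are hypothetical:

```python
import numpy as np

def rectification_error(pts_left, pts_right):
    """Mean vertical disparity (pixels) of matched features in a rectified pair.

    In a well-rectified stereo pair this is near zero; mechanical drift in the
    extrinsics shows up as growth in this metric.
    """
    return float(np.mean(np.abs(pts_left[:, 1] - pts_right[:, 1])))

# Hypothetical matched keypoints, (x, y) in pixels
left = np.array([[100.0, 50.0], [300.0, 120.0]])
right = np.array([[80.0, 50.5], [285.0, 119.0]])
err = rectification_error(left, right)  # 0.75 px of vertical drift
```

When this metric exceeds a threshold, the extrinsics can be re-estimated from the same feature matches (e.g. via essential-matrix estimation) without ever showing the camera a calibration target.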

Thoughts?

Thanks,
Brandon

1 Like

Thanks @FPSychotic,

Yes, really like the idea of IR versions of the cameras. In fact, we have tested with IR-capable and IR-only modules; here are some examples. Those modules had a 15,000-unit MOQ, so the ArduCam solution will be much better. And we are working with ams to make an active-illumination version, starting with the BELICE-SD (see here).

In terms of TF and URDF, yes, great idea. We have STEP and Blender files available here; I’m guessing it should be straightforward to make a URDF and TF from these. We will look into it, but we are currently slammed with some additional support (e.g. see the laundry list here), so it might be a while until we get to it.

And thanks for the kind words. :slight_smile:

WRT VSLAM, yes, with the wide-angle optics afforded by ArduCam, I think this will work well. We do already have object tracking. See here.

In terms of performing more functions than the Myriad X can handle, OAK-D (and OAK-1, etc.) all run cleanly on the Jetson/Xavier series (and presumably on the Edge TPU, although we haven’t yet tested that).

And sounds great on Rtabmap, Kimera, and Open3D. Would love intros (even on this thread if they’re on here) to strike up making this happen. Once our feature-tracking support (here) is out and ArduCam wide-angle OV9282 and OV9782 cameras are available, I’m thinking this could be extremely useful for the use-cases you describe.

And you could pair it with Jetson/etc. if additional accelerated AI/CV is needed on the host.

You know, asking is free, XD? That is what I would ask for if I were building robots, or if I could develop a camera.

Invaluable feedback - thank you!

Sorry for such a long post with such basic English.

French and German are the only ones I can take a stab at - and I think your English is better than both of mine. :slight_smile:

Thanks again,
Brandon

Thanks for your detailed answer; it’s amazing to learn everything that will be included in the ArduCam version, as well as on the software side.

I’m very happy to know it will work with Jetsons and probably with the Coral TPU. My first prototype, which is already done, incorporates a USB TPU, and my next prototype will use a Jetson NX instead.

I will start working on an OAK version of them and follow the development more closely, starting with the links you kindly gave me.
I think that if I have a mobile robot that performs SLAM, ready at (or shortly after) the release of the new SLAM version, it could help me get some customers while the company is starting up.
Maybe something very small like this

Or a little bigger like this


1 Like

Thanks @FPSychotic. I love seeing robotics like this. We are actually working with the University of Maui on their autonomous racing (evGrandPrix and the Indy Autonomous Challenge) with OAK-D (multiple of them).


Looking forward to this being on your platform!

Thanks again,
Brandon

1 Like

Ohh, that is a cool and very nice platform. Very high level indeed. I don’t know much about these kinds of competitions, but I know there is a lot of talent in them, and from what I’m seeing they’re well funded too. XD. Really nice, I love it. Thanks for the chat and pictures. Very interesting.

Thanks!

So these will very likely use the Xavier NX as the base system for OAK-D. In fact, likely several Xavier NX units.

And the whole project will be released open source, so folks will be able to rebuild the whole thing if interested, or repurpose parts for a 1/10-scale application. :slight_smile:

1 Like

Thanks! And those look great!

1 Like

Hi, we are also working on similar project. Do you think we can connect?

Hi @heroku_use ,

Yes, definitely. And sorry about the delay; I missed this message. Please feel free to email me at brandon {at} luxonis [dot] com.

Thanks,
Brandon

To update everyone on the thread: the wide field-of-view cameras discussed here, in both M12-mount and fully integrated variants, are now available to purchase from ArduCam for DepthAI/OAK.



More details here:

Reach out to admin@arducam.com for now to purchase. We are working to get these onto our webstores etc. over the coming month or so; we’re refining the permutations first to make purchasing and use clearer and easier. So for now we handle purchases via email, so we can manually check that the requested permutation works correctly.

There are even more options here as well.

Thoughts?

Thanks,
Brandon

3 Likes

@Luxonis-Brandon
It’s great to see that you are seriously taking users’ feedback into account!
Thank you

I would be very interested in using the OAK-D for SLAM applications, similar to the T265. However, I would like to make sure that it’s equipped with the right sensors (wide-FOV cameras and an IMU) and that they are hardware-synchronized. Is the OAK-D there yet, or is that still in progress?

Cheers,
Mo AK

1 Like

Hi @mzahana ,

Sorry, somehow I missed this! Yes, we now have everything that you mention, and we’re soft-launching it: wide FOV and an IMU included, hardware-synced. (We soft-launch by offering the product for sale in our “Beta Store”.)

We have 4 (main) options actually (and a bunch of other permutations as well, but I’ll focus on the main 4):

  1. OAK-D-W. Like OAK-D, but with 150° DFOV global shutter cameras instead of ~80° DFOV.
  2. OAK-D-Pro-W. Same as OAK-D-W, but with laser dot projector and blanket illuminator for no-light depth and night computer vision.
  3. OAK-D-W-PoE. OAK-D-W but with PoE interface instead of USB.
  4. OAK-D-Pro-W-PoE. OAK-D-Pro-W but with PoE interface instead of USB.
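To put those DFOV numbers in perspective, the pinhole relation f = d / (2·tan(DFOV/2)) shows how much shorter the focal length of the wide optics must be for the same sensor. The sensor diagonal below is a hypothetical illustrative value, not a quoted OV9282/OV9782 spec:

```python
import math

def focal_length_mm(dfov_deg, sensor_diag_mm):
    """Pinhole-model focal length for a given diagonal FOV and sensor diagonal."""
    return sensor_diag_mm / (2.0 * math.tan(math.radians(dfov_deg) / 2.0))

diag = 4.5  # hypothetical ~1/4"-class sensor diagonal in mm (illustrative)
narrow = focal_length_mm(80.0, diag)   # ~2.7 mm for the ~80° DFOV optics
wide = focal_length_mm(150.0, diag)    # ~0.6 mm for the 150° DFOV optics
```

The roughly 4x shorter focal length is why the wide variants need different lens stacks (or M12 mounts) rather than just a firmware change.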

OAK-D-W and OAK-D-Pro-W photos (they use the same enclosure):


OAK-D-W-PoE and OAK-D-Pro-W-PoE (they use the same enclosure):

And the following figure is useful, as all 4 of these versions support production-time variability in which sensors are used. For example, you can choose 3x OV9782 global shutter color if you want, or 3x OV9282 global shutter grayscale, or 3x IMX378 wide FOV, or any permutation in between.

And note that the Pro and non-Pro use the exact same enclosure. So it’s only the USB or PoE that impacts the size. And here’s the size comparison:

These are selling like hotcakes; we’re currently selling out of our next production run before it even finishes (and we’re not blocked by the supply chain, just typical production time).

So if you’d like to get your hands on any of these, please order from our Beta Store below:

Thanks - and sorry about the delay.

2 Likes

Oh, and it’s worth saying specifically: when OAK launched, it wasn’t really ideal for SLAM. Now OAK-D is quite good for SLAM, and it is well suited for VIO as well. Below is an example from Spectacular AI:

And in fact Spectacular AI has a bunch of cool demos: some on LinkedIn (which are the coolest), and a couple on YouTube as well:

Thanks,
Brandon

2 Likes

Cross-posting New Platform discussion link here for visibility :slight_smile:

2 Likes