Hi,
I’m trying to develop an rviz2 plugin for a special piece of display hardware, the Looking Glass. It is a TRUE naked-eye 3D display. You may have seen it in Linus’ review.
There are some officially supported tools, such as ParaView, Unity, Unreal, and VTK. I’ve modified the Unity Robotics Demos and got them working on the Looking Glass.
However, our commonly used robot visualization tool is rviz, so I want to make an rviz2 plugin for it. Understanding how it works is not difficult; the official documentation explains it well. The challenge comes from the implementation. After working on it for a few days, I found it to be more difficult than I expected and beyond my knowledge base. I’m a robotics engineer, after all, and I haven’t systematically studied computer graphics.
It seems that the only developer tool that can be used with Qt is the HoloPlay Core SDK. Access has to be applied for, and it is available on GitHub. It includes functions to obtain the calibration information from the hardware, as well as dedicated shaders, which is the part I don’t quite understand. I think I need to configure multi-view cameras (typically 45) and combine their rendered outputs into a quilt (which should be a texture). Then I can use the official shaders to draw the lenticular output. I really don’t know how to use GLSL shaders in OGRE; it seems to be related to materials. I referenced oculus_rviz_plugins, but that project seems to have been discontinued. To make the question concrete, here is roughly how I currently picture those two steps; both sketches are untested guesses.
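First, the quilt: one big render texture in Ogre (the renderer rviz2 uses), with one viewport tile per view camera. The tile sizes, the cone angle, and all the names here are my assumptions, not values read from a real device:

```cpp
// Untested sketch of the quilt step, assuming Ogre 1.x as shipped with rviz2.
#include <cmath>
#include <string>

#include <OgreCamera.h>
#include <OgreHardwarePixelBuffer.h>
#include <OgreRenderTexture.h>
#include <OgreSceneManager.h>
#include <OgreSceneNode.h>
#include <OgreTextureManager.h>
#include <OgreViewport.h>

Ogre::TexturePtr createQuilt(Ogre::SceneManager * scene)
{
  const int cols = 5, rows = 9;          // 5x9 = 45 views, a common layout
  const int tile_w = 819, tile_h = 455;  // per-view resolution (assumed)
  const float view_cone = 0.61f;         // ~35 deg total view cone (assumed)
  const float radius = 2.0f;             // camera distance to the focal point

  // One big render texture; every view renders into its own tile of it.
  Ogre::TexturePtr quilt = Ogre::TextureManager::getSingleton().createManual(
    "LookingGlassQuilt",
    Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
    Ogre::TEX_TYPE_2D, cols * tile_w, rows * tile_h, 0,
    Ogre::PF_R8G8B8, Ogre::TU_RENDERTARGET);
  Ogre::RenderTexture * rt = quilt->getBuffer()->getRenderTarget();

  for (int i = 0; i < cols * rows; ++i) {
    // Spread the cameras across the view cone, all looking at the origin.
    // (The HoloPlay docs actually recommend sheared, off-axis frustums;
    // Ogre::Frustum::setFrustumOffset() looks like the hook for that.)
    const float t = static_cast<float>(i) / (cols * rows - 1) - 0.5f;
    const float angle = t * view_cone;

    Ogre::Camera * cam = scene->createCamera("quilt_cam_" + std::to_string(i));
    cam->setNearClipDistance(0.05f);
    cam->setAspectRatio(static_cast<float>(tile_w) / tile_h);
    Ogre::SceneNode * node = scene->getRootSceneNode()->createChildSceneNode();
    node->setPosition(radius * std::sin(angle), 0.0f, radius * std::cos(angle));
    node->lookAt(Ogre::Vector3::ZERO, Ogre::Node::TS_WORLD);
    node->attachObject(cam);

    // Tile the viewports. The quilt convention puts view 0 at the bottom
    // left, while Ogre viewports use a top-left origin, hence the flip.
    const int col = i % cols, row = i / cols;
    Ogre::Viewport * vp = rt->addViewport(
      cam, i,
      static_cast<float>(col) / cols,
      1.0f - static_cast<float>(row + 1) / rows,
      1.0f / cols, 1.0f / rows);
    vp->setClearEveryFrame(true);
    vp->setOverlaysEnabled(false);
  }
  return quilt;
}
```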
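Second, the lenticular pass, which is where I’m most unsure: my guess is to wrap the SDK’s GLSL fragment shader in an Ogre HighLevelGpuProgram, attach it to a material pass that samples the quilt, and draw a fullscreen quad with that material. The uniform names and calibration numbers below are placeholders and would have to match whatever the HoloPlay shader source actually declares:

```cpp
// Untested sketch: the SDK's lenticular fragment shader wrapped in an Ogre
// material. Uniform names and calibration values below are placeholders.
#include <OgreHighLevelGpuProgramManager.h>
#include <OgreMaterialManager.h>
#include <OgreTechnique.h>

Ogre::MaterialPtr makeLenticularMaterial(const Ogre::String & fragment_src)
{
  // Compile the GLSL source shipped with HoloPlay Core as a fragment program.
  Ogre::HighLevelGpuProgramPtr fp =
    Ogre::HighLevelGpuProgramManager::getSingleton().createProgram(
      "lkg_lenticular_fp",
      Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME,
      "glsl", Ogre::GPT_FRAGMENT_PROGRAM);
  fp->setSource(fragment_src);
  fp->load();

  // A material whose single pass runs that program over the quilt texture.
  Ogre::MaterialPtr mat = Ogre::MaterialManager::getSingleton().create(
    "lkg_lenticular_mat",
    Ogre::ResourceGroupManager::DEFAULT_RESOURCE_GROUP_NAME);
  Ogre::Pass * pass = mat->getTechnique(0)->getPass(0);
  pass->setFragmentProgram("lkg_lenticular_fp");
  pass->setDepthCheckEnabled(false);
  pass->setDepthWriteEnabled(false);
  pass->createTextureUnitState("LookingGlassQuilt");  // RTT from the quilt step

  // Calibration read from the device would be fed in as uniforms; the names
  // and numbers here are made up and must match the actual shader source.
  Ogre::GpuProgramParametersSharedPtr params =
    pass->getFragmentProgramParameters();
  params->setNamedConstant("pitch", 47.6f);
  params->setNamedConstant("tilt", -0.12f);
  params->setNamedConstant("center", 0.04f);
  return mat;
}

// The material would then go on a fullscreen quad, e.g. an Ogre::Rectangle2D
// with corners (-1, 1, 1, -1), rendered in the window on the Looking Glass.
```

Is this roughly the right mental model, or is a compositor the more idiomatic way to do a fullscreen post-processing pass in Ogre?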
So far I have created a display plugin, added an associated render panel to it, undocked it, and shown it fullscreen on the second display. But I don’t know what to do next, or what I should learn.
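For completeness, the second-display step currently looks roughly like this (plain Qt5; assuming the second screen is the Looking Glass is obviously crude):

```cpp
// Move a widget (here, the undocked render panel) onto the second screen
// and show it fullscreen.
#include <QGuiApplication>
#include <QScreen>
#include <QWidget>

void showOnSecondScreen(QWidget * panel)
{
  const QList<QScreen *> screens = QGuiApplication::screens();
  if (screens.size() < 2) {
    return;  // no second display attached; leave the panel where it is
  }
  panel->setParent(nullptr);                      // make it a top-level window
  panel->setGeometry(screens.at(1)->geometry());  // move it onto that screen
  panel->showFullScreen();
}
```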
This is a purely personal project, so I can take my time. But I also don’t want to waste time heading in the wrong direction.
Thanks