Thanks! For the multi-drone software architecture I just used Unreal Engine’s built-in multiplayer. There was a steep learning curve, but it was still easier than writing my own system. I haven’t used Unity, but I’m sure it’s similar. My ROS/game connection is just like yours (when you said “with some annoying coordinate frame transforms along the way” I really felt that :wink: ). Each drone has its own ROS namespace, and the game server creates a new virtual drone whenever it sees a new namespace appear.
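
In case it helps to see the shape of that, here’s a minimal sketch of the namespace-watching side, assuming ROS 1 and that each drone publishes something like `/<namespace>/pose` (the topic name and the spawn hook are placeholders, not my actual code; the real spawning happens inside the Unreal server):

```python
import rospy

known_namespaces = set()

def spawn_virtual_drone(ns):
    # Placeholder: in my setup this would tell the game server to spawn
    # a new drone actor for this namespace.
    rospy.loginfo("spawning virtual drone for %s", ns)

def scan_for_drones(event):
    # Look for any /<namespace>/pose topic we haven't seen yet.
    for topic, _type in rospy.get_published_topics():
        parts = topic.strip("/").split("/")
        if len(parts) >= 2 and parts[-1] == "pose":
            ns = parts[0]
            if ns not in known_namespaces:
                known_namespaces.add(ns)
                spawn_virtual_drone(ns)

rospy.init_node("drone_watcher")
rospy.Timer(rospy.Duration(1.0), scan_for_drones)  # rescan once a second
rospy.spin()
```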

The biggest challenge was pose accuracy. A coherent MR experience really requires full RTK-fixed “centimeter-level” precision, which was not easy to get consistently. Even with good position data, orientation still relied on a magnetometer, and a small angular offset there can produce a large apparent position error when you line up the virtual and physical drones. One other caveat is that errors compound with multiple drones, since each drone contributes its own pose error to their relative alignment. I expect pose errors will be less of an issue for you, using lighthouses indoors over small distances.
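
To give a feel for the scale (illustrative numbers, not measurements from my system): the apparent offset grows with viewing distance roughly as d·sin(θ), so even a couple of degrees of heading error is very visible at range.

```python
import math

# Illustrative only: how a small heading error becomes an apparent
# position offset when the virtual drone is overlaid at distance d.
heading_error_deg = 2.0
distance_m = 30.0
offset_m = distance_m * math.sin(math.radians(heading_error_deg))
print(f"{offset_m:.2f} m apparent offset")  # ~1.05 m at 30 m
```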

I’m curious if you plan on expanding this beyond the lighthouse system. I’ve looked into using WiFi or Bluetooth radios for both the radio link and localization at building scale, but the accuracy is not quite there yet. Another approach would be to do localization on the server with visual SLAM from the camera feed.

I’ve seen a few projects connect game engines and ROS, but unfortunately not much else in the niche of “multiplayer mixed-reality drones with ROS”. It may not be necessary for you, but I found it valuable to use a simulator; for quick testing it’s a lot easier than working with a physical drone (rough example below).
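
For example, something as simple as this (topic name and trajectory are made up, but it follows the per-drone-namespace convention I described) can stand in for a real drone, so the server spawns and tracks it without any hardware powered on:

```python
import math
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node("fake_drone")
pub = rospy.Publisher("pose", PoseStamped, queue_size=1)  # resolves to /<ns>/pose

rate = rospy.Rate(30)
t0 = rospy.Time.now()
while not rospy.is_shutdown():
    t = (rospy.Time.now() - t0).to_sec()
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = "map"
    msg.pose.position.x = 5.0 * math.cos(0.2 * t)  # fly a slow circle
    msg.pose.position.y = 5.0 * math.sin(0.2 * t)
    msg.pose.position.z = 2.0
    msg.pose.orientation.w = 1.0
    pub.publish(msg)
    rate.sleep()
```

Run it with `ROS_NAMESPACE=drone1 python fake_drone.py` (or an `ns` attribute in a roslaunch file) and it shows up to the server as just another drone namespace.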