Mixed Reality Drone Racing Game with Unity and ROS [Work in Progress]

Update Video

Hello!

I wanted to share a project I’m working on: an FPV drone racing game built with Unity and ROS. I’m still pretty early in the process, but the goal I’m working towards is a PC game (maybe VR) that lets you build custom virtual racetracks in an indoor environment and then race actual physical drones (and eventually autonomous ones) on them.

Why I’m building this

I like working on robotics projects in my spare time, and one project I’ve wanted to do for a while is building my own autonomous drone. I’ve worked on some systems like that in the past, and they’ve been really cool to see in person. Along the way, I also started getting into flying FPV drones and realized that flying them manually is just as fun as watching them fly themselves, so I wanted to see if I could combine the two in some way, possibly by making a game out of it!

How does it work?

I put together a quick demo video (linked at the top of the post) just to document the current state of my prototype.

I’m very early in the process, and honestly, I’ve kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I’ve done so far is connecting pieces together using off-the-shelf hardware/software. Right now, the prototype basically just proves out the concept of rendering the real-time position of a drone inside a Unity game and getting all the “piping” set up to move data into the right place. Currently, the information flow is all one-directional, from the drone to the PC.

On the hardware side, I’m using Bitcraze’s Crazyflie drone with its Lighthouse positioning deck and SteamVR base stations for estimating the drone’s 3D position. State estimation is pretty hard, but thanks to all the hard work done by the Crazyflie open-source community, this just kind of works out of the box and in real time (i.e. one of the big reasons why it kind of feels like cheating lol). Communication between the Crazyflie and the PC is done using the Crazyradio USB dongle.
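
For anyone curious about that piece, here’s a minimal sketch (Python, using Bitcraze’s cflib) of what streaming the onboard state estimate over the radio looks like. The URI is the stock default, and the callback is just a stand-in for wherever the data goes next:

```python
# Minimal sketch: stream the Crazyflie's onboard pose estimate over the
# Crazyradio using Bitcraze's cflib. URI/rate are defaults, not tuned values.
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # default Crazyflie radio address

def pose_callback(timestamp, data, logconf):
    # Position in meters in the lighthouse world frame; orientation variables
    # (e.g. stateEstimate.qx..qw) can be logged the same way.
    print(data['stateEstimate.x'], data['stateEstimate.y'], data['stateEstimate.z'])

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    log_conf = LogConfig(name='pose', period_in_ms=20)  # ~50 Hz telemetry
    log_conf.add_variable('stateEstimate.x', 'float')
    log_conf.add_variable('stateEstimate.y', 'float')
    log_conf.add_variable('stateEstimate.z', 'float')
    scf.cf.log.add_config(log_conf)
    log_conf.data_received_cb.add_callback(pose_callback)
    log_conf.start()
    input('Streaming pose; press Enter to stop.\n')
    log_conf.stop()
```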

On the software side, I’m using ROS to handle all the intermediate messaging, and Unity for the user interface, game logic, and visualization.

Challenges I’ve run into so far

Getting the state estimate data from the Crazyflie into Unity was somewhat interesting to figure out. Basically, the Crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS publisher node that listens for these messages and publishes them onto the ROS network. To get the data into Unity, I’m using Unity’s ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially just forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone that takes the data, transforms it into Unity’s coordinate frame, and uses it to set the GameObject’s position. Overall, it’s just a lot of forwarding of information (with some annoying coordinate frame transforms along the way).
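
To make the piping a bit more concrete, here’s a hedged sketch of the bridge node. The topic and frame names are placeholders I made up; my real Unity-side conversion is a C# script (and I believe ROS-TCP-Connector’s ROSGeometry helpers can also handle the axis swap), but the mapping boils down to the same thing:

```python
#!/usr/bin/env python
# Sketch of the PC-side bridge: publish pose values (e.g. from the telemetry
# callback above) as a ROS topic that ROS-TCP-Endpoint forwards into Unity.
# '/crazyflie/pose' and 'world' are placeholder names.
import rospy
from geometry_msgs.msg import PoseStamped

rospy.init_node('crazyflie_pose_publisher')
pub = rospy.Publisher('/crazyflie/pose', PoseStamped, queue_size=10)

def publish_pose(x, y, z, qx, qy, qz, qw):
    msg = PoseStamped()
    msg.header.stamp = rospy.Time.now()
    msg.header.frame_id = 'world'
    msg.pose.position.x = x
    msg.pose.position.y = y
    msg.pose.position.z = z
    msg.pose.orientation.x = qx
    msg.pose.orientation.y = qy
    msg.pose.orientation.z = qz
    msg.pose.orientation.w = qw
    pub.publish(msg)

# The annoying frame change, in a nutshell: ROS is right-handed with z up,
# Unity is left-handed with y up, so positions map as below (my version of
# this lives in the Unity C# script).
def ros_to_unity_position(x, y, z):
    return (-y, z, x)
```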

Another important piece of the puzzle (as far as rendering the drone inside a 3D virtual replica of my room) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I basically just picked a point in my room to be the origin in both the physical and virtual rooms, put the Crazyflie there (aligned with the axes I picked for the origin), and used the cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.
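
The math underneath that alignment step is just inverting a rigid transform; cfclient does the equivalent internally when you re-center the geometry, but here’s a small numpy sketch of the idea for illustration:

```python
# Idea behind the origin alignment: the drone physically sits at the chosen
# room origin, but reports some pose (R_meas, t_meas) in the lighthouse frame.
# Inverting that pose gives a correction that maps lighthouse estimates into
# the room frame. Purely illustrative; cfclient handles this for you.
import numpy as np

def invert_rigid(R, t):
    # Inverse of the transform [R | t]: rotation transposes, t becomes -R^T t
    R_inv = R.T
    return R_inv, -R_inv @ t

def to_room_frame(p, R_corr, t_corr):
    # Map a position estimate from the lighthouse frame into the room frame
    return R_corr @ p + t_corr

# Example: the drone at the origin reports a 1 cm offset and no rotation
R_corr, t_corr = invert_rigid(np.eye(3), np.array([0.01, 0.0, 0.0]))
print(to_room_frame(np.array([0.01, 0.0, 0.0]), R_corr, t_corr))  # ~[0 0 0]
```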

What’s next?

I think the next few major steps are to add the FPV view into the game (streaming video data from the drone and rendering it in Unity), which involves more data piping (and calibration). In addition, I need to add input controls so you can actually fly the drone (a rough sketch of that piece is below). The bigger goals in store are building out proper gameplay, integrating autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality might look like as opposed to just using a flat display on a PC monitor.
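
On the input-controls front, nothing is built yet, but sending manual setpoints through cflib should look roughly like this (the thrust value is a made-up placeholder, not a tested flight profile):

```python
# Hedged sketch of manual control via cflib's commander: roll/pitch in
# degrees, yaw rate in deg/s, thrust as a raw 0-65535 value. Values are
# placeholders; gamepad/keyboard mapping would feed in here eventually.
import time
import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'

cflib.crtp.init_drivers()
with SyncCrazyflie(URI, cf=Crazyflie()) as scf:
    cf = scf.cf
    cf.commander.send_setpoint(0, 0, 0, 0)  # zero setpoint unlocks the motors
    for _ in range(100):  # ~2 s of a constant (placeholder) setpoint
        cf.commander.send_setpoint(0.0, 0.0, 0.0, 38000)
        time.sleep(0.02)  # setpoints must be re-sent continuously
    cf.commander.send_stop_setpoint()  # cut the motors
```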

I would really love to hear any feedback or questions on this or anything else. It would most likely help me figure out what some additional next steps should be, and I’d be super interested to learn if there are other cool directions I could take this project!


Sounds like an amazing project! I think the trickiest part will be getting latency low enough for the video to be usable; even a small amount of lag can cause problems, so I’d suggest concentrating on that.

Thanks for the feedback! Yeah, I’m expecting the video piece to be more complicated than it looks. I know in the FPV drone world, analog video is commonly used exactly for the low-latency reasons you mention, so that’s kind of where I was gonna start…but we’ll see how well that works out. I’m also sure calibration overall will be a challenge too, especially when getting rendered objects to overlay accurately on a live video stream. I’m definitely open to any suggestions on approaches to deal with these things, or even other challenges to watch out for :slight_smile:

Very awesome work! At one point I was experimenting with the Crazyflie and the Quest 2 in Unity (using the passthrough interface and controlling the drone with hand detection), but syncing the positioning with the lighthouse system was a bit challenging, and the passthrough interface was still a bit limited, so I dropped it to pursue other projects. I’m happy that more people are interested in this as well!

I guess if you were able to watch the Crazyflie in VR, overlaid on the real one, that would be pretty amazing.

Btw, you should send Bitcraze an email once you’ve advanced far enough for a guest blogpost :wink:


Awesome! Your project sounds really interesting too, and I’m getting the feeling that I’ll end up running into similar issues with synchronization/alignment of multiple sensors and systems. I’m not quite there yet so I’ll see where I end up exactly. Nice to know that I wouldn’t be the first down that path though :slight_smile:

I also do like the idea of watching the crazyflie in VR…I already have a few ideas I want to test out with that now.

And doing a guest post would be super cool! I feel like I might be too early, but I’ll definitely reach out and see what they think.

Thanks for all the great feedback! I have a bunch of new things to think about now :slight_smile:

Hello! It’s been a little while since my last update on this project, but I’ve made some meaningful progress that I wanted to share:

If anyone is interested in more of the details, I’ve also been keeping a project log on Hackaday: Mixed Reality Drone Racing | Hackaday.io

And I’m always open to any feedback, thoughts, or suggestions if there are any :slight_smile:


Whoa. This is cool as heck!


Great to see the improvements you’ve made! This looks awesome.

About your last comments: you should get better positioning now with 2 to 4 base stations, but I guess placement is perhaps challenging? You could use those flexible camera mounts to place them on top of the doors if you don’t want something permanently fixed to the wall.

Also, there should be some ready-made 3D prop guard models available online. I don’t have a URL right now, but I’ll take a look!


Thanks! Yeah, placement for the second base station was part of the problem, but I think I found a potential solution involving one of those floor-to-ceiling contractor poles (I’m in an apartment right now and don’t want to install any permanent fixtures). But I’m very interested to see how much better the positioning is with additional base stations. Ideally, I would like to be able to do the positioning without any external hardware…but the lighthouse deck is just such an easier solution for now haha :slight_smile:

Yeah, I did see a few designs for 3D-printable guards, but I would also really like some protection for the camera/antenna, so I think I’ve gotta brush up on my CAD a bit and do a little customization (I also need to find access to a 3D printer lol).

Thanks for all the input and the interest though! I’m happy to see other people enjoying what I’m working on.

I built something similar for a school project using Unreal, with the idea of a multiplayer Quidditch-themed game. It’s in no way polished, but it does show the basic proof of concept. We used a DJI F450 frame with a Pixhawk 4 flight controller and RTK GPS (Here3) for precise positioning. The hardware integration team couldn’t get multiple drones running before I graduated, but the software supports interactions between multiple drones in the MR environment! We also have an administrator interface for viewing drone status and commanding automatic takeoff/landing of all drones. Check it out:
short demo video
GitHub
conference paper

If anyone is interested in using this I can help you get started with development.

Hey! That’s awesome, super cool project! I skimmed through the stuff you linked, and there are some interesting ideas in there. When I have some more time, I specifically wanna look more at the multi-drone architecture you implemented, because that’s definitely a direction I’d like to go in (though I’ve still got a little ways to go there). I know you didn’t really get that fully implemented with physical drones, but I’d be curious to know what sorts of issues you ended up having to work through. I’m also interested to know if you’re aware of any similar or related projects/resources out there.

Thanks for sharing all this awesome work!

Thanks! For the multi-drone software architecture I just used Unreal Engine’s built-in multiplayer. There’s a steep learning curve but it was certainly easier than writing my own system. I haven’t used Unity but I’m sure it is similar. My ROS/game connection is just like yours (when you said “with some annoying coordinate frame transforms along the way” I really felt that :wink: ). Each drone has its own ROS namespace and the game server creates a new virtual drone whenever it sees a namespace appear.
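
In ROS terms, the spawning pattern is basically the following (our real implementation is in Unreal/C++, and the topic naming here is just an example):

```python
# Poll the ROS master for pose topics and treat each new namespace as a new
# drone to spawn in the game. Topic pattern is an example, not our exact one.
import re
import rospy

known = set()

def scan(_event):
    for topic, _type in rospy.get_published_topics():
        match = re.match(r'^/(drone\w+)/pose$', topic)  # e.g. /drone1/pose
        if match and match.group(1) not in known:
            known.add(match.group(1))
            rospy.loginfo('New namespace %s -> spawn a virtual drone', match.group(1))

rospy.init_node('drone_spawner')
rospy.Timer(rospy.Duration(1.0), scan)  # re-scan once per second
rospy.spin()
```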

The biggest challenge was pose accuracy. A coherent MR experience really requires full RTK-fixed “centimeter-level” precision, which was not easy to get consistently. Even with good position, the system relied on a magnetometer for orientation. Any small angular offset could lead to large apparent position errors when lining up the virtual and physical drones. One other caveat is that errors compound with multiple drones. I expect pose errors will be less of an issue for you, using lighthouses indoors over small distances.
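
To put a rough number on the heading-error point (the distances below are illustrative, not measurements from our tests):

```python
# Back-of-the-envelope: apparent position error from a heading error grows
# with distance, which is why outdoor RTK scale hurts more than a room.
import math

for dist_m, label in [(10.0, 'outdoor drone ~10 m out'),
                      (2.0, 'indoor drone ~2 m out')]:
    for err_deg in (1.0, 5.0):
        offset_cm = 100 * dist_m * math.tan(math.radians(err_deg))
        print(f'{label}: {err_deg:.0f} deg error -> ~{offset_cm:.0f} cm offset')
# A 1 deg heading error at 10 m already shifts things by ~17 cm.
```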

I’m curious if you plan on expanding this beyond the lighthouse system. I’ve looked into using WiFi or Bluetooth radios for both the radio link and localization at building scale, but the accuracy is not quite there yet. Another approach would be to do localization on the server with visual SLAM from the camera feed.

I’ve seen a few projects connect game engines and ROS but unfortunately not much else in the niche of “multiplayer mixed-reality drones with ROS”. It may not be necessary for you, but I found it valuable to use a simulator. It can be easier than working with a physical drone for quick testing.

Oh nice, yeah, I’ve heard Unreal has pretty good multiplayer support out of the box (or more so than Unity), but I guess I’ll cross that bridge when I get there. Ahh yes, you know the pain then lol - my favorite part was forgetting that Unity uses a left-handed coordinate system (I think Unreal does too)…but hey, at least it’s never boring working out coordinate transforms :stuck_out_tongue:

Yes, 100% to pose accuracy; it’s the foundation of the entire system, and it really has to be pretty good. I agree that in your case, dealing with much larger distances really tightens your error tolerances, whereas in my case the scale is a lot smaller. I hadn’t considered the fact that error would compound with multiple drones, but that makes sense when I think about it…so I guess I’m glad I asked! I’ll be keeping that in mind for when I do get to the point of working on multiplayer.

And yeah, I appreciate the question - I definitely want to explore ways to expand beyond the lighthouse system. My initial design was actually an offboard vSLAM approach like you mentioned (and that was one of the reasons I picked ROS in the first place for this project)…but the lighthouse system was just such an easier path forward and sets a very good baseline of performance (again, I need to shout out the Crazyflie ecosystem for this). But my ideal state for this project would be to eliminate, or severely limit, the reliance on external infrastructure as much as possible. I think it would be incredible to be able to just bring a laptop, your drone, and a controller and start flying around (maybe mapping the space out beforehand), but without having to depend on hardware being set up first. I know there are certainly limitations there, and I’m not even sure a fully infrastructure-free system would be practical or enjoyable to use given the data quality, transmission artifacts, latency, etc., but it’s definitely something I’m looking into.

It actually ties into one of the very next steps I’m looking at (improving position tracking): I’m expecting that adding another base station will improve things for now, but depending on how much/little it helps, I was going to start looking at applying vision-based corrections as a potential precursor to a SLAM system down the line.

Yeah, I didn’t see a lot of similar projects either when I checked last year (but then again, I also missed your project…so who knows). I appreciate the link; I’m not using a sim right now, but I think it will be useful as the project gets more complex.

Also thanks for all the great info and input! :slight_smile:
