Mixed Reality Drone Racing Game with Unity and ROS [Work in Progress]

Update Video


I wanted to share a project I’m working on: an FPV drone racing game built with Unity and ROS. I’m still pretty early in the process, but the goal I’m working towards is a PC game (maybe VR) that lets you build custom virtual racetracks in an indoor environment and then race actual physical drones (and eventually autonomous ones) on them.

Why I’m building this

I like working on robotics projects in my spare time, and one project I’ve wanted to do for a while is building my own autonomous drone. I’ve worked on some systems like that in the past, and they’ve been really cool to see in person. Along the way, I also started getting into flying FPV drones and realized that flying them manually is just as fun as watching them fly themselves, so I wanted to see if I could combine the two in some way, possibly by making a game out of it!

How does it work?

I put together a quick demo video (linked at the top of the post) just to document the current state of my prototype.

I’m very early in the process, and honestly, I’ve kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I’ve done so far is connecting pieces together using off-the-shelf hardware/software. Right now, the prototype proves out the concept of rendering the real-time position of a drone inside a Unity game and gets all the “piping” set up to move data into the right place. Currently, the information flow is all one-directional, from the drone to the PC.

On the hardware side, I’m using Bitcraze’s Crazyflie drone with its Lighthouse positioning deck and SteamVR base stations for estimating the drone’s 3D position. State estimation is pretty hard, but thanks to all the hard work done by the Crazyflie open source community, this just kind of works out of the box and in real time (one of the big reasons why it feels like cheating lol). Communication between the Crazyflie and the PC is done using the Crazyradio dongle.

On the software side, I’m using ROS to handle all the intermediate messaging and, obviously, Unity for the user interface, game logic, and visualization.

Challenges I’ve run into so far

Getting the state estimate data from the Crazyflie into Unity was somewhat interesting to figure out. The Crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS node that listens for these messages and publishes them onto the ROS network. To get the data into Unity, I’m using Unity’s ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone; it takes the data, transforms it into Unity’s coordinate frame, and uses it to set the GameObject’s position. Overall, it’s just a lot of forwarding of information (with some annoying coordinate frame transforms along the way).
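To make the coordinate frame part concrete: ROS uses a right-handed convention with X forward, Y left, and Z up, while Unity is left-handed with Y up and Z forward. My Unity script is C#, but here’s a rough Python sketch of the mapping I mean (the function names are my own, just for illustration):

```python
# Sketch of the ROS -> Unity frame conversion described above.
# ROS: right-handed, X forward, Y left, Z up ("FLU").
# Unity: left-handed, X right, Y up, Z forward.

def ros_to_unity_position(x, y, z):
    """Map a ROS position (x forward, y left, z up) into Unity's
    frame (x right, y up, z forward)."""
    return (-y, z, x)

def ros_to_unity_rotation(qx, qy, qz, qw):
    """Map a ROS orientation quaternion into Unity's frame.
    The vector part gets the same axis swap as positions plus a sign
    flip for the change of handedness; the scalar part is unchanged.
    (Quaternion sign conventions vary between libraries, so treat this
    as one consistent choice rather than the only one.)"""
    return (qy, -qz, -qx, qw)

# Example: a point 1 m forward and 0.5 m up in ROS lands 1 m along
# Unity's +Z (forward) and 0.5 m along +Y (up).
print(ros_to_unity_position(1.0, 0.0, 0.5))
```

In practice the ROS-TCP-Connector package ships helpers for exactly this kind of conversion, so the hand-rolled version above is mostly useful for sanity-checking which axis ended up where.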

Another important piece of the puzzle (for rendering the drone inside a 3D virtual replica of my room) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I picked a point in my room to be the origin in both the physical and virtual rooms, put the Crazyflie there (aligned with the axes I picked for the origin), and then used the Crazyflie cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.
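For anyone curious, the alignment step boils down to picking a rigid transform between the raw lighthouse frame and my room frame. The real adjustment happens inside cfclient’s base station geometry tools, but here’s a toy Python sketch of the underlying idea (the function name and the translation-plus-yaw-only assumption are mine, purely illustrative):

```python
import math

def align_to_room(point, origin, yaw_deg):
    """Map a raw lighthouse-frame point into room coordinates by
    subtracting the chosen origin, then undoing a yaw misalignment
    (a rotation about the vertical z axis).

    Assumes the only errors are a translation (where the origin was
    placed) and a yaw offset, i.e. the floor planes already agree."""
    x = point[0] - origin[0]
    y = point[1] - origin[1]
    z = point[2] - origin[2]
    c = math.cos(math.radians(-yaw_deg))
    s = math.sin(math.radians(-yaw_deg))
    return (c * x - s * y, s * x + c * y, z)
```

With a calibration like this, the chosen physical point maps to (0, 0, 0) and the room axes line up with the base station estimates, which is what lets the virtual room model sit on top of the real one.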

What’s next?

I think the next few major steps are to add the FPV view into the game (streaming video data from the drone and rendering it in Unity), which involves more data piping (and calibration), and to add input controls so you can actually fly the drone. The bigger goals after that are building out proper gameplay, integrating autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality could look like as opposed to just using a flat display on a PC monitor.

I would really love to hear any feedback or questions on this or anything else. It would most likely help me figure out what some additional next steps should be, and I’d be super interested to learn if there are other cool directions I could take this project!


Sounds like an amazing project! I think the trickiest part will be getting latency low enough for the video to be usable; even a small amount of lag can cause problems, so I’d suggest concentrating on that.

Thanks for the feedback! Yeah, I’m expecting video to be one of the more complicated pieces. I know in the FPV drone world, analog video is commonly used exactly for the low-latency reasons you mention, so that’s kind of where I was gonna start…but we’ll see how well that works out. I’m also sure calibration overall will be a challenge, especially when getting rendered objects to overlay accurately on a live video stream. I’m definitely open to any suggestions on approaches to deal with these things, or even other challenges to watch out for :slight_smile:

Very awesome work! At one point I was experimenting with the Crazyflie and the Quest 2 in Unity (using the passthrough interface and controlling the drone with hand detection), but syncing the positioning with the lighthouse system was a bit challenging and the passthrough interface was still pretty limited, so I dropped it to pursue other projects. Happy to see more people interested in this as well!

I guess if you are able to watch the Crazyflie in VR, overlaid on the real one, that would be pretty amazing.

Btw, you should send Bitcraze an email once you’ve advanced far enough for a guest blogpost :wink:


Awesome! Your project sounds really interesting too, and I’m getting the feeling that I’ll end up running into similar issues with synchronization/alignment of multiple sensors and systems. I’m not quite there yet so I’ll see where I end up exactly. Nice to know that I wouldn’t be the first down that path though :slight_smile:

I also do like the idea of watching the crazyflie in VR…I already have a few ideas I want to test out with that now.

And doing a guest post would be super cool! I feel like I might be too early, but I’ll definitely reach out and see what they think.

Thanks for all the great feedback! I have a bunch of new things to think about now :slight_smile: