Hello!
I wanted to share a project I’m working on: an FPV drone racing game built with Unity and ROS. I’m still pretty early in the process, but the goal I’m working towards is a PC game (maybe VR) that lets you build custom virtual racetracks in an indoor environment and then race actual physical drones on them (and eventually autonomous ones).
Why I’m building this
I like working on robotics projects in my spare time, and one project I’ve wanted to do for a while has been building my own autonomous drone. I’ve worked on some systems like that in the past and they’ve been really cool to see in person. Along the way, I also started getting into flying FPV drones and realized that flying them manually is just as fun as watching them fly themselves, so I wanted to see if I could combine the two, possibly by making a game out of it!
How does it work?
I put together a quick demo video (linked at the top of the post) just to document the current state of my prototype.
I’m very early in the process, and honestly, I’ve kind of cheated a bunch just to get something up and running and feel out the concept. Most of what I’ve done has just been connecting pieces together using off-the-shelf hardware/software. Right now, the prototype basically just proves out the concept of rendering the real-time position of a drone inside a Unity game and getting all the “piping” set up to move data into the right place. Currently, the information flow is all one-directional, from the drone to the PC.
On the hardware side, I’m using Bitcraze’s Crazyflie drone with its Lighthouse positioning deck and SteamVR base stations to estimate the drone’s 3D position. State estimation is pretty hard, but thanks to all the hard work done by the Crazyflie open source community, this just kind of works out of the box and in real time (i.e. one of the big reasons why it kind of feels like cheating lol). Communication between the Crazyflie and the PC goes over Bitcraze’s Crazyradio USB dongle.
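For anyone curious what reading that telemetry looks like, here’s a minimal sketch using Bitcraze’s cflib Python library. It’s not my exact code, just the standard cflib logging pattern; the radio URI is the default one, so you’d swap in whatever your Crazyflie is configured for.

```python
# Minimal sketch: read the onboard state estimate from a Crazyflie over the
# Crazyradio using cflib. Not my exact code, just the usual logging pattern.
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie

URI = 'radio://0/80/2M/E7E7E7E7E7'  # default address, adjust to your setup


def on_pose(timestamp, data, logconf):
    # Position (in meters) from the onboard estimator, which is already fused
    # with the Lighthouse deck measurements.
    print(f"[{timestamp}] x={data['stateEstimate.x']:.3f} "
          f"y={data['stateEstimate.y']:.3f} z={data['stateEstimate.z']:.3f}")


if __name__ == '__main__':
    cflib.crtp.init_drivers()

    with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
        log_conf = LogConfig(name='pose', period_in_ms=20)  # ~50 Hz telemetry
        log_conf.add_variable('stateEstimate.x', 'float')
        log_conf.add_variable('stateEstimate.y', 'float')
        log_conf.add_variable('stateEstimate.z', 'float')

        scf.cf.log.add_config(log_conf)
        log_conf.data_received_cb.add_callback(on_pose)
        log_conf.start()
        time.sleep(10)
        log_conf.stop()
```

The stateEstimate log group is the output of the onboard Kalman filter, so there’s basically no extra math needed on the PC side, which is exactly why this part feels like cheating.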
On the software side, I’m using ROS to handle all the intermediate messaging and, obviously, Unity for the user interface, game logic, and visualization.
Challenges I’ve run into so far
Getting the state estimate data from the Crazyflie into Unity was somewhat interesting to figure out. Basically, the Crazyflie computes its 6DoF pose (position and orientation) onboard, then transmits this telemetry over radio to the PC. On the PC, I wrote a simple ROS publisher node that listens for these messages and then publishes them on the ROS network (sketched below). To get the data into Unity, I’m using Unity’s ROS-TCP-Connector package (and ROS-TCP-Endpoint), which essentially just forwards the messages from the ROS network into Unity. Inside Unity, I wrote a simple script tied to a GameObject representing the drone that takes the data, transforms it into Unity’s coordinate frame, and uses it to set the GameObject’s position. Overall, it’s just a lot of forwarding of information (with some annoying coordinate frame transforms along the way).
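To make the forwarding step concrete, here’s a rough sketch of what the publisher node could look like with rospy. It isn’t my actual node; the topic name (/crazyflie/pose) and the frame id are just placeholders, and on_pose would be hooked into the cflib LogConfig callback from the sketch above.

```python
# Rough sketch of the "forwarding" node: republish the Crazyflie pose as a ROS
# topic that ROS-TCP-Endpoint can then forward into Unity. Topic name and
# frame id are placeholders, not my actual setup.
import rospy
from geometry_msgs.msg import PoseStamped


class CrazyfliePosePublisher:
    def __init__(self):
        self.pub = rospy.Publisher('/crazyflie/pose', PoseStamped, queue_size=10)

    def on_pose(self, timestamp, data, logconf):
        # Called by cflib whenever a new telemetry packet arrives; repackage
        # it as a stamped pose in ROS's right-handed, z-up world frame.
        msg = PoseStamped()
        msg.header.stamp = rospy.Time.now()
        msg.header.frame_id = 'world'
        msg.pose.position.x = data['stateEstimate.x']
        msg.pose.position.y = data['stateEstimate.y']
        msg.pose.position.z = data['stateEstimate.z']
        # Orientation left as identity here; the onboard attitude estimate
        # would be filled into msg.pose.orientation the same way.
        msg.pose.orientation.w = 1.0
        self.pub.publish(msg)


if __name__ == '__main__':
    rospy.init_node('crazyflie_pose_publisher')
    node = CrazyfliePosePublisher()
    # ...hook node.on_pose into the cflib LogConfig callback as above...
    rospy.spin()
```

On the Unity side, the coordinate frame transform is conceptually just mapping ROS’s right-handed, z-up frame into Unity’s left-handed, y-up one (roughly unity (x, y, z) = (-ros_y, ros_z, ros_x) for positions); if I remember right, ROS-TCP-Connector ships ROSGeometry helpers for exactly this kind of conversion.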
Another important piece of the puzzle (as far as rendering the drone inside a 3D virtual replica of my room) was building the room model and calibrating it to my actual room. I can go into more detail for sure, but at a high level I basically just picked a point in my room to be the origin of both the physical and virtual rooms, put the Crazyflie there (aligned with the axes I picked for the origin), and used the cfclient tool to center the base station position estimates there. My process was pretty rough as a first pass, and it will very likely have to improve, especially as I move in the mixed reality direction and start rendering virtual objects on a live camera feed.
What’s next?
I think the next few major steps would be to add the FPV view into the game (streaming video from the drone and rendering it in Unity), which means more data piping (and calibration), and to add input controls so you can actually fly the drone. The bigger goals in store are building out proper gameplay, integrating autonomy (and figuring out where it makes sense), and maybe exploring what VR functionality might look like, as opposed to just using a flat display on a PC monitor.
I would really love to hear any feedback or questions on this or anything else. It would definitely help me figure out some additional next steps, and I’d be super interested to learn if there are other cool directions I could take this project!