
MSCHF mounted a remote-control paintball gun to Spot

Hey ROS folk,

I just wanted to know your thoughts on MSCHF's provocative art project. In my opinion, their project shows very clearly that a powerful tool such as the Spot robot can be turned into a weapon:

Do you think this danger is exaggerated? What do you think about the reaction of Boston Dynamics?

My personal opinion is that we have to be aware that an autonomous system like Spot can be armed and thus has the potential to be used as a weapon. Any high-performance autonomous machine with significant mass that operates outdoors can easily be misused this way. That's why I think it's necessary to equip autonomous vehicles with additional systems that prevent such misuse. Unfortunately, the "anti-scenarios" described here are considered far too rarely in engineering. In my view, however, the experience of recent years in IT security has shown that this consideration is indispensable.

That's why I think networking autonomous systems over the internet is a very dangerous development. As soon as a system is fully remote-controllable and cannot recognize whether it is being used as a weapon, misuse cannot be ruled out and is, sooner or later, quite likely. What are your thoughts?

Kind regards


Reminds me of the ROS Answers question where the OP was asking for help with mounting his AK47 on a helicopter model in a Gazebo simulation …

That question was quickly shot down

Edit: found it again:


I think if an art collective wants to do something art-collective-y, let them. It’s not a novel idea to put a gun on a robot. Google “paintball robot”, you’ll find a dozen or more examples of this already. Boston Dynamics responding to it puts fuel on the fire of something that would have been otherwise soon forgotten.

If Boston Dynamics were actually angry about this, where was the outraged response to the viral parody videos of Atlas pulling a gun on a human or armed with machine guns, shotguns, and pistols? Those are far more aggressive violations of "violence, harm, or intimidation".

From the article (via BD representative):

Had this concept been brought to us while we were in the initial sales discussions, we probably would have said, 'there's an Arduino quadruped that you could easily put this application together with. Go do that. This isn't representative of how we view our technology being used.'

A real bold, cowardly, and arrogant thing to say, considering they work with police forces that use Spot as an intimidation tool.

This argument is weak at best from Boston Dynamics. They don't meet those criteria themselves in any sense. I'm not going to say any company is perfect, but this is jaw-droppingly contradictory to their own current practices and history.

Edit: And the bad PR just keeps on coming now from US elected officials!


Note that I am speaking on behalf of myself here:

There has always been a fraught relationship between robotics and national defense. My preference would be that within each of our home countries we push for legislation to outlaw / criminalize the integration of weapons with autonomous or semi-autonomous systems. I know there is this group working on just that. Hopefully that would boil up to an international treaty to do the same. I've seen some organizations / companies claim they have internal codes of ethics, and while I respect the sentiment, I think legislation is a much more powerful way of codifying those desires. This general sentiment is lightly broached in an article that came out in IEEE Spectrum earlier in the week.

More pragmatically, I am less concerned about Spot and more concerned with our current use of military drones that has been going on for some time. This genie has been out of the bottle for well over a decade.

Without actual laws in place I think it is on the individual to make informed decisions about where and how they work. I have personally quit jobs because I believed their efforts were ethically beyond the pale. I would encourage others to do the same if they see injustice happening.

@smac Here is an example of Spot being used by the NYPD, and more recently in full-on NYPD branding (note that these aren't the most reliable publications). I am still a little on the fence about robots being used for police work, as there is some precedent for bomb removal. However, for day-to-day policing, a ~$75k robot seems mighty expensive, intimidating, and not particularly useful.

Anyway, I don't want to be a total Spot hater; I think there are real applications out there for legged robots. I just think we need legislation to make it illegal to arm an autonomous / semi-autonomous robot.


Consider that ROS Military already exists… :face_with_monocle:

A legal barrier is certainly desirable, but I think technical barriers are also desirable. When I read that a highly automated Tesla continued to drive after a crash, I ask myself why such vehicles do not recognize that a collision has just occurred and bring themselves to an emergency stop. One should not underestimate the extent to which a high-performance autonomous vehicle can be misused as a weapon in itself.
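The collision-triggered emergency stop described above can be sketched as a simple latch between the planner and the base controller: an acceleration spike from the IMU latches an e-stop flag, after which only zero-velocity commands pass until a human explicitly resets it. This is a minimal, framework-free illustration of the idea, not any vendor's actual safety system; the class name and the ~4 g threshold are assumptions for the sketch.

```python
class CollisionEStop:
    """Latching e-stop: a collision-level acceleration spike blocks all
    further motion commands until a human operator resets the latch."""

    def __init__(self, accel_threshold_ms2=40.0):
        # ~40 m/s^2 (~4 g) is an assumed, illustrative proxy for
        # "the platform just hit something" -- tune per vehicle.
        self.accel_threshold = accel_threshold_ms2
        self.estopped = False

    def on_accel_sample(self, accel_ms2):
        """Feed peak acceleration magnitude from the IMU; latch on a spike."""
        if abs(accel_ms2) >= self.accel_threshold:
            self.estopped = True

    def command_allowed(self, cmd):
        """Gate drive commands: once latched, only zero velocity passes."""
        if self.estopped:
            return {"vx": 0.0, "vy": 0.0, "wz": 0.0}
        return cmd

    def reset(self):
        """Require an explicit human action before motion resumes."""
        self.estopped = False
```

In a ROS system, something like this would subscribe to the IMU topic and sit in front of the velocity command publisher, so that the interlock holds even if a remote operator keeps sending motion commands.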