MSCHF mounted a remote-control paintball gun to Spot

Hey ROS folk,

I just wanted to know your thoughts on MSCHF's provocative art project. In my opinion, their project shows very clearly that powerful tools such as the Spot robot can be turned into weapons:

Do you think this danger is exaggerated? What do you think about the reaction of Boston Dynamics?

My personal opinion is that you have to be aware that an autonomous system like Spot can also be armed and thus has the potential to be used as a weapon. Any high-performance, high-mass autonomous machine built for outdoor operation can easily be misused this way. That's why I think it's necessary to equip autonomous vehicles with additional systems that prevent such misuse. Unfortunately, the "anti-scenarios" described here are considered far too rarely in engineering. In my view, however, the experience of recent years in IT security has shown that this consideration is indispensable.

That's why I think networking autonomous systems over the internet is a very dangerous development. As soon as a system is fully remote-controllable and is unable to recognize that it is being used as a weapon, misuse cannot be ruled out and is very likely, sooner or later. What are your thoughts?

Kind regards
Tobias


Reminds me of the ROS Answers question where the OP was asking for help with mounting his AK47 on a helicopter model in a Gazebo simulation …

That question was quickly shot down


Edit: found it again:


I think if an art collective wants to do something art-collective-y, let them. It’s not a novel idea to put a gun on a robot. Google “paintball robot”, you’ll find a dozen or more examples of this already. Boston Dynamics responding to it puts fuel on the fire of something that would have been otherwise soon forgotten.

If Boston Dynamics was actually angry about this, where was the outraged response to the viral parody videos of Atlas pulling a gun on a human or armed with machine guns, shotguns, and pistols? Those are far more aggressive violations of "violence, harm, or intimidation".

From the article (via BD representative):

Had this concept been brought to us while we were in the initial sales discussions, we probably would have said, 'there's an Arduino quadruped that you could easily put this activation together on. Go do that. This isn't representative of how we view our technology being used.'

A really bold, cowardly, and arrogant thing to say, considering they put Spot to work with police forces, where it serves as an intimidation tool.

This argument is weak at best from Boston Dynamics. They don't meet those criteria themselves in any sense. I'm not going to say any company is perfect, but this is jaw-droppingly contradictory to their own current practices and history.

Edit: And the bad PR just keeps on coming now from US elected officials!


Note that I am speaking for myself here:

There has always been a fraught relationship between robotics and national defense. My preference would be that within each of our home countries we push for legislation to outlaw / criminalize the integration of weapons with autonomous or semi-autonomous systems. I know there is this group working on just that. Hopefully that would boil up into an international treaty to do the same. I've seen some organizations / companies claim they have internal codes of ethics, and while I respect the sentiment, I think legislation is a much more powerful way of codifying those desires. This general sentiment is lightly broached in this article that came out in IEEE Spectrum earlier in the week.

More pragmatically, I am less concerned about Spot and more concerned about our current use of military drones, which has been going on for some time. This genie has been out of the bottle for well over a decade.

Without actual laws in place I think it is on the individual to make informed decisions about where and how they work. I have personally quit jobs because I believed their efforts were ethically beyond the pale. I would encourage others to do the same if they see injustice happening.

@smac Here is an example of Spot being used by the NYPD, and more recently in full-on branding (note that these aren't the most reliable publications). I am still a little on the fence about robots being used for police work, as there is some precedent in bomb disposal. However, for day-to-day policing, a ~$75k robot seems mighty expensive, intimidating, and not particularly useful.

Anyway, I don’t want to be a total Spot hater, I think there are applications out there for legged robots. I just think we need legislation to make it illegal to arm an autonomous / semi-autonomous robot.


Consider that ROS-M (ROS Military) already exists… :face_with_monocle:


A legal barrier is certainly desirable, but I think technical barriers are also desirable. When I read that a highly automated Tesla continued to drive after a crash, I ask myself why such vehicles do not recognize that a collision has just occurred and bring themselves to an emergency stop. One should not underestimate the extent to which a high-performance autonomous vehicle can be misused as a weapon in itself.
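The kind of technical barrier described here could start as something as simple as a latched collision watchdog: once the accelerometer reports an impact-level spike, drive power is cut and only a human can re-enable it. A minimal, hypothetical Python sketch — class name, threshold, and interface are all invented for illustration, and this is not any vendor's actual safety logic:

```python
# Hypothetical collision-triggered e-stop latch (illustrative only).
# Assumes we receive accelerometer magnitude readings in m/s^2.
from dataclasses import dataclass

G = 9.81  # standard gravity, m/s^2


@dataclass
class CollisionEStop:
    impact_threshold_g: float = 4.0  # assumed: spikes above 4 g indicate a collision
    estopped: bool = False           # latched: once set, software alone cannot clear it

    def on_imu_sample(self, accel_magnitude: float) -> bool:
        """Feed one accelerometer magnitude reading (m/s^2).

        Returns True while the e-stop is latched (drive power should be cut).
        """
        if accel_magnitude > self.impact_threshold_g * G:
            self.estopped = True  # collision detected: latch the emergency stop
        return self.estopped

    def human_reset(self) -> None:
        """Clear the latch only via an explicit operator action,
        so a remote controller cannot silently resume driving after a crash."""
        self.estopped = False


watchdog = CollisionEStop()
print(watchdog.on_imu_sample(1.0 * G))  # normal driving -> False
print(watchdog.on_imu_sample(6.0 * G))  # impact spike -> True (latched)
print(watchdog.on_imu_sample(1.0 * G))  # still latched after the crash -> True
```

The important design choice is the latch: detection can be fully automatic, but recovery deliberately requires a human in the loop, which is exactly the property a remote-controlled weaponized use would have to defeat.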

ROS-M is even funnier when you consider that Clearpath Robotics is basically participating in the Robotic Combat Vehicle (RCV) program as a supplier of test mules, or that Open Robotics signed the Boston Dynamics letter alongside them, yet is considered a "partner" in ROS-M development by NAMC. The "Robotic Technology Kernel" (RTK) is basically ROS-M for use with the Warfighter Management Interface (WMI). Surprise: Spot has RTK support. The elephant in the room is there for sure; many folks act like they don't see it, or don't care, while watering and feeding it.




RTK-Lite support on Spot, as used by West Point, should not go unnoticed.

Especially when West Point flat-out puts its "face shooter" ROS examples right out there: face_shooter/gun_fire.ino at master · westpoint-robotics/face_shooter · GitHub

Some of the intent is hard to hide, and it is negligent of many of us to ignore it while we chastise more playful things like paintball guns.


Some of the public stances these companies, and the people who work for them, are taking quickly present moments of internal conflict:
https://www.ge.com/news/reports/everywhere-you-go-ge-researchers-are-building-ai-brain-army’s-unmanned-vehicles



Everywhere you look, it's clear that Clearpath products are being used for RCV logic…


The product line is being used with a “Direct Transition to Robotic Combat Vehicle”




Yet I'm supposed to take their signature on the Boston Dynamics letter seriously.

I can't, with a straight face, figure out how Clearpath has written so many letters while pretty much directly participating in the RCV program and letting folks develop logic for WMI + RTK (ROS-M) on their platforms.

This is like the 4th letter since 2014?

Something fishy is going on… I suspect following the money will tell the story.

I know this might sound weird, but you really don't sign any contract like "we won't ever use the platform for military purposes" when you're buying a Husky. Maybe Clearpath asks about the intended use of the robot and declines sales when it is military? I don't actually know how their commitment under the open letter is meant to be performed in practice…
