I just wanted to know your thoughts on MSCHF's provocative art project. In my opinion, their project shows very clearly that powerful tools such as the Spot robot can be turned into a weapon:
Do you think this danger is exaggerated? What do you think about the reaction of Boston Dynamics?
My personal opinion is that you have to be aware that an autonomous system like Spot can also be armed and thus has the potential to be used as a weapon. Any high-performance, high-mass autonomous machine built for outdoor operation can easily be misused as a weapon. That's why I think it's necessary to equip autonomous vehicles with additional systems that prevent this misuse. Unfortunately, the "anti-scenarios" described here are far too rarely considered in engineering. In my view, however, the experience of recent years in IT security has shown that this consideration is indispensable.
That's why I think networking autonomous systems over the internet is a very dangerous development. As soon as a system is fully remote-controllable and cannot recognize whether it is being used as a weapon, misuse cannot be ruled out and is quite likely in both the short and long term. What are your thoughts and your opinion?
I think if an art collective wants to do something art-collective-y, let them. It's not a novel idea to put a gun on a robot. Google "paintball robot" and you'll find a dozen or more examples of this already. Boston Dynamics responding to it pours fuel on the fire of something that would otherwise have soon been forgotten.
Had this concept been brought to us while we were in the initial sales discussions, we probably would have said, "there's an Arduino quadruped that you could easily put this together with. Go do that. This isn't representative of how we view our technology being used."
This argument from Boston Dynamics is weak at best. They don't meet those criteria themselves in any sense. I'm not going to say any company is perfect, but this is jaw-droppingly contradictory to their own current practices and history.
There has always been a fraught relationship between robotics and national defense. My preference would be that, within each of our home countries, we push for legislation to outlaw / criminalize the integration of weapons with autonomous or semi-autonomous systems. I know there is this group working on just that. Hopefully that would boil up to an international treaty to do the same. I've seen some organizations / companies claim they have internal codes of ethics, and while I respect the sentiment, I think legislation is a much more powerful way of codifying those desires. This general sentiment is lightly broached in this article that came out in IEEE Spectrum earlier in the week.
More pragmatically, I am less concerned about Spot and more concerned with our current use of military drones, which has been going on for some time. This genie has been out of the bottle for well over a decade.
Without actual laws in place I think it is on the individual to make informed decisions about where and how they work. I have personally quit jobs because I believed their efforts were ethically beyond the pale. I would encourage others to do the same if they see injustice happening.
@smac Here is an example of Spot being used by the NYPD and, more recently, in full-on branding (note that these aren't the most reliable publications). I am still a little on the fence about robots being used for police work, as there is some precedent from bomb disposal. However, for day-to-day policing, a ~$75k robot seems mighty expensive, intimidating, and not particularly useful.
Anyway, I don't want to be a total Spot hater; I think there are applications out there for legged robots. I just think we need legislation to make it illegal to arm an autonomous / semi-autonomous robot.
A legal barrier is certainly desirable, but I think technical barriers are also desirable. When I read that a highly automated Tesla continued to drive after a crash, I ask myself why such vehicles do not recognize that a collision has just occurred and bring themselves to an emergency stop. One should not underestimate the extent to which a high-performance autonomous vehicle can be misused as a weapon in itself.
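To make the idea concrete, here is a minimal sketch of what such a technical barrier could look like: a latching emergency stop triggered by a crash-level acceleration spike. The threshold value and the class/method names are entirely hypothetical (not any vendor's API); the point is only the design choice that, once tripped, the stop stays engaged until a human resets it out-of-band.

```python
# Illustrative sketch only -- threshold and interface are invented for this example.
CRASH_ACCEL_THRESHOLD = 40.0  # m/s^2; a spike this large is treated as a probable collision


class CollisionEstop:
    """Latching emergency stop: once a crash-level acceleration is observed,
    driving stays disabled until a human deliberately resets the latch."""

    def __init__(self, threshold: float = CRASH_ACCEL_THRESHOLD):
        self.threshold = threshold
        self.latched = False

    def on_imu_sample(self, accel_magnitude: float) -> bool:
        """Called for each IMU reading; returns True if driving is allowed."""
        if accel_magnitude >= self.threshold:
            self.latched = True
        return not self.latched

    def manual_reset(self) -> None:
        """Deliberately requires out-of-band human action, not remote software."""
        self.latched = False


estop = CollisionEstop()
print(estop.on_imu_sample(3.2))   # normal driving -> True
print(estop.on_imu_sample(55.0))  # crash spike   -> False (latched)
print(estop.on_imu_sample(3.2))   # still stopped -> False, until manual_reset()
```

The latch is the important part: a system that simply resumes driving once readings look normal again is exactly the behavior being criticized above.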
ROS-M is even funnier when you consider that Clearpath Robotics is basically participating in the Robotic Combat Vehicle (RCV) program as test mules. Or that Open Robotics signed the Boston Dynamics letter alongside them, yet is considered a "partner" in ROS-M development by NAMC. The "Robotic Technology Kernel" is basically ROS-M for use with the Warfighter Management Interface. Surprise: Spot has RTK support. The elephant in the room is there for sure; many folks act like they don't see it, or don't care, and water and feed it.
Yet, I'm supposed to take their signature on the Boston Dynamics letter seriously.
I can't, with a straight face, figure out how Clearpath has written so many letters, pretty well directly participates in the RCV program, and still lets folks develop logic for WMI + RTK (ROS-M) on their platforms.
This is like the 4th letter since 2014?
Something fishy is going on… I suspect following the money will tell the story.
I know this might sound weird, but you really don't sign any contract like "we won't ever use the platform for military purposes" when you're buying a Husky. Maybe Clearpath asks about the intended use of the robot and declines when it is military? I don't actually know how their commitment under the open letter is meant to be enforced in practice…