A Twitter post featuring a video of a robot dog firing a gun, which has racked up nearly 120,000 likes since July. Videos of Ukrainian soldiers apparently modifying off-the-shelf drones to airdrop weapons. An art project featuring Spot, the Boston Dynamics robot best known for its viral dancing videos, outfitted with a paintball gun.
These kinds of videos are all over the internet. They demonstrate the scary scenarios that six top robot manufacturers, including Boston Dynamics, likely had in mind when they published a letter last week promising not to weaponize their products. As robots become increasingly accessible to consumers, these companies warned, people might try to turn them into weapons. To head off that risk, the companies promised to review what customers want to do with their commercial robots before selling them (“when possible”) and to look into developing technologies that might make weaponization harder in the first place.
“[W]e do not support the weaponization of our advanced-mobility general-purpose robots,” the companies wrote. “[W]e now feel renewed urgency in light of the increasing public concern in recent months caused by a small number of people who have visibly publicized their makeshift efforts to weaponize commercially available robots.”
The robots available to the general public are still somewhat expensive, and they’re not as common as the other kinds of commercial technologies that people can buy (namely, drones). Still, this letter serves as a reminder that the risk of weaponization isn’t exactly zero, and that it’s an issue that robot makers are already worried about. At the same time, though, these companies left plenty of caveats in their declaration, and they’ve kept the door open to continue selling robots to law enforcement and the military. They’re also far from the only manufacturers making these kinds of technologies, which are slowly entering the mainstream.
“You don’t necessarily want to be seen by the public as producing a good and then intentionally having it be used for military purposes,” explains Erik Lin-Greenberg, an MIT professor who studies how emerging military technologies affect conflict. “Whether or not that kind of statement is actually going to have any impact on how these systems are being used I think is another question. These are essentially just off-the-shelf technologies.”
An international campaign called Stop Killer Robots has urged people to push back against the development of autonomous weapons, and has highlighted how racism, sexism, and dehumanization can be built into these technologies. One former New York City Council member, Ben Kallos, proposed a law last year banning police from acquiring any kind of armed robot after the New York Police Department began trialing a Boston Dynamics robot. (That pilot was called off after backlash.) The Electronic Frontier Foundation, a digital rights organization, has called for banning law enforcement from using autonomous or remote-controlled robots.
Even some unconventional efforts have caught the attention of robot manufacturers. In 2021, an art collective called MSCHF purchased a Spot, Boston Dynamics’ nearly $75,000 robot dog, for a demonstration the group called “Spot’s Rampage.” The project involved attaching a paintball gun to the robot and inviting people from around the world to remotely operate the “weapon.” Though the robot eventually broke down, Boston Dynamics — which has offered the same robot model to police departments and militaries in the past — was not happy, and said the project misrepresented how its robot “is being used to benefit our daily lives.”
Though off-the-shelf robots are still somewhat rare, drones have become more commonplace, and they demonstrate how consumer technologies can be weaponized. Amid the war in Ukraine, some soldiers have turned to off-the-shelf drones, using them to drop ammunition, including grenades and weapons meant to target tanks. Cartels in Mexico have similarly used drones to carry and detonate explosives. Terrorist groups and other non-state actors can also retrofit these relatively simple technologies to their advantage, explains Kerry Chávez, an instructor at Texas Tech University and a project administrator at the university’s peace, war, and social conflicts laboratory.
“A lot of them are just the hobbyist and commercial models, even some homemade ones,” Chávez told Recode. “Even if you cut off a supply chain from one vector, they can just activate another one.”
The US Bureau of Alcohol, Tobacco, Firearms, and Explosives did not respond to a request for comment on how common weaponized drones are in the US, but we do know that it’s happened. In 2015, an 18-year-old in Connecticut stirred global outrage, and prompted an investigation, after he fired a handgun attached to a homemade drone. In 2020, a Pennsylvania man was sentenced to prison after, among other crimes, using a drone to drop explosives to “terrorize” a woman he used to date. The Federal Aviation Administration has pursued legal action in at least one case involving a weaponized drone. Operating a drone with a dangerous weapon attached is illegal and carries a fine of up to $25,000, according to agency spokesperson Rick Breitenfeldt.
Given what we’ve already seen with drones, it might seem like a good idea for companies to get ahead of the weaponization of more advanced consumer robots. But there are critical caveats. For one thing, the companies seem to acknowledge that they alone won’t be able to stop the misuse of their tech, and they’re already asking the government for help. At the same time, these technologies can still be used for other types of harm, like surveilling people or smuggling weapons across borders. Earlier this year, Canadian police recovered a drone carrying almost a dozen handguns from the US after it crashed into a tree in southern Ontario.
And there’s the biggest caveat of all: These companies limited their pledge to “general-purpose” robots, but noted that “we are not taking issue with existing technologies that nations and their government agencies use to defend themselves and uphold their laws.”
Source: www.vox.com