Yesterday, a group of 116 specialists from across 26 countries released a letter demanding a ban on autonomous weapons. It was presented at the opening of the International Joint Conference on Artificial Intelligence (IJCAI 2017) in Melbourne, the world’s pre-eminent gathering of top experts in AI and robotics. It stated:
"Once developed, lethal autonomous weapons will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close."
The open letter marks the first time that AI and robotics companies have taken a joint stance on the issue. The signatories included many notable technologists whose companies employ thousands of researchers in robotics and AI, including:
- Elon Musk, founder of Tesla, SpaceX, and OpenAI (USA).
- Mustafa Suleyman, Founder and Head of Applied AI at Google’s DeepMind (UK).
- Esben Østergaard, Founder and CTO of Universal Robots (Denmark).
- Jerome Monceaux, Founder of Aldebaran Robotics, makers of Nao and Pepper robots (France).
- Jürgen Schmidhuber, Leading Deep Learning Expert and Founder of Nnaisense (Switzerland).
- Yoshua Bengio, Leading Deep Learning Expert and Founder of Element AI (Canada).
It's not the first time that the issue of killer robots has been raised on a global platform. The UN recently voted to begin formal discussions on such weapons, including drones, tanks, and automated machine guns.
How Close Are We to Autonomous Killer Robots? Depends on Where You Live
On the technology front, there have been several recent developments in the use of robotics for public policing and weaponry.
In July last year, Dallas police used a robot to kill Micah Johnson, who had shot and killed five police officers and wounded seven others. The police attached a pound of C4 explosive to the robot and detonated it, turning a machine never intended to harm people into a makeshift weapon. Notably, the robot was remote-controlled, not autonomous.
There are also companies building robots for public-order and crowd-control scenarios. One example is Desert Wolf, maker of the Skunk Riot Control Copter, a drone designed to "control unruly crowds without endangering the lives of the protestors or the security staff." It's equipped with four high-capacity paintball barrels and can fire up to 80 pepper balls per second, stopping a crowd in its tracks. It also carries blinding strobe lights and lasers, and onboard speakers allow warnings to be issued to the crowd. The makers claim the operator and his team are under full video and audio surveillance, but the potential for abuse remains significant.
But this is not to suggest that no one is working on autonomous weaponry or robotics. In April this year, Russia's Deputy Prime Minister Dmitry Rogozin announced work on the weapons capabilities of the military's humanoid robot FEDOR (Final Experimental Demonstration Object Research). He shared a video on Twitter showing that the robot had been taught to dual-wield pistols and could hit targets with a gun in each hand.
In 2015, the BBC reported on the development of autonomous weaponry in South Korea, including machine-gun turrets technically capable of firing without human intervention through an auto-firing system. However, at customers' request, a human operator was required to first enter a password into the computer system to unlock the turret's firing ability.
What about the U.S. military? It's certainly not unreasonable to speculate that DARPA is in step with, if not ahead of, its rivals.
What About the Law?
Notably, armed drones (effectively flying robots) have been legalized in North Dakota as long as they are "less than lethal", meaning that the Skunk Riot Control Copter would theoretically be legal. Legislation was also introduced in Connecticut's House of Representatives in April this year to allow law enforcement agencies to use drones equipped with deadly weapons. While neither example suggests that autonomous robots or MVAs will be marshaling the streets anytime soon, ready to shoot to kill, it shows that robots in defense roles, traditionally limited to the battlefield, are moving into the suburbs.
The Campaign to Stop Killer Robots, a coalition of international NGOs, was established in 2012 to provide a coordinated civil-society response to the challenges that fully autonomous weapons pose to humanity. The coalition believes that giving machines the power to decide who lives and dies on the battlefield is an unacceptable application of technology, and it urges all countries to consider and publicly state their policy on fully autonomous weapons, particularly with respect to the ethical, legal, policy, technical, and other concerns that have been raised.
The fact that some of the makers of such technology are uniting to voice their opposition to "killer autonomous robots" tells us they believe such machines will be possible, and that we're only beginning to understand what role future AI discoveries might play in robotics. Hopefully, their stance serves as a deterrent to future roboticists.