
A group of leading robotics and Artificial Intelligence experts have become so concerned about the potential dangers of lethal autonomous weapons that they have written an open letter to the United Nations (UN), asking it to ban the development and subsequent use of this technology.

The UN is soon to begin discussing the possible dangers and merits of lethal autonomous weapons, dubbed ‘killer robots’, a category that can include drones, tanks and automated machine guns. The prospect of these sorts of weapons being used in battle is certainly a frightening one and something that requires serious consideration going forward.

Experts are worried

The release of the open letter coincides with the International Joint Conference on Artificial Intelligence (IJCAI) in Melbourne on August 21.

Elon Musk, CEO of Tesla, has already warned society not to be too complacent when it comes to the development of Artificial Intelligence, saying “AI is a rare case where we need to be proactive in regulation instead of reactive because if we’re reactive in AI regulation it’s too late.”

So, rather than leaving things until it is too late and facing the use of lethal autonomous weapons by armed forces from countries that have decided not to ban them, the signatories of this letter are calling on the UN to add them to the list of weapons banned under the UN Convention on Certain Conventional Weapons (CCW), which came into force in 1983. Weapons already on the list include chemical weapons and high-powered lasers designed to intentionally blind people. Nineteen countries within the UN have already agreed in principle to amend the ban to include lethal autonomous weapons, with formal discussions due to start this week but now delayed.

A stark but necessary warning

Despite, or perhaps because of, the fact that the signatories of this letter are themselves involved in the development of AI, the warning is stark. As experts in this technology, they perhaps know better than most of us that its advantages are balanced by potential dangers. In the letter, they state:

“Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s Box is opened, it will be hard to close.”

The “third revolution in warfare” being referred to follows the development of first gunpowder, and then nuclear arms as key advances in warfare.

Certain countries have already taken steps to limit the use of lethal autonomous weapons within their own armed forces, with the UK government banning their development in 2015, saying that all weapons deployed by UK armed forces would be “under human oversight and control.”

The full list of signatories is available here, but it includes Tesla’s Elon Musk and Google’s Mustafa Suleyman along with 114 other founders and Chief Operating Officers of AI and robotics companies from 26 countries around the world. With developments in AI carrying on apace, we certainly hope that the UN listens carefully to this warning from those who know and understand this technology best.

Top image: Terminator Exhibition: T-800 (CC BY 2.0)


Emma Stenhouse, MSc

Emma qualified with a BSc (Hons) in Equine Science in 2003 and has had a passion for horses since a young age. She continued her academic career with an MSc in Applied Marine Science, gained in 2004. Emma’s main scientific focus was the navigational techniques of sea turtles and whether they use the acoustics of the surf-zone as a cue for nesting. She then worked for a sea turtle conservation project on the Pacific coast of Costa Rica before travelling to New Zealand where she worked as a Mari...
