Hawking, Musk, Wozniak Warn About Artificial Intelligence's Trigger Finger

Big names have been voicing big fears about a future in which artificial intelligence plays a larger role, and now those same figures -- including Stephen Hawking and Elon Musk -- have been joined by Apple co-founder Steve Wozniak and others in signing an open letter calling for a ban on offensive autonomous weapons.

The letter is hosted by the Future of Life Institute, which received a $10 million donation from Musk in January, at the same time that it released an earlier open letter outlining research priorities meant to keep A.I. beneficial to humanity while avoiding potential pitfalls like, you know, creating Skynet. While that earlier letter is all about potential and giant leaps forward, the new follow-up, set to be officially announced Tuesday at the start of the International Joint Conference on Artificial Intelligence in Buenos Aires, is mostly a dire warning.

"If any major military power pushes ahead with AI weapon development, a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow. Unlike nuclear weapons, they require no costly or hard-to-obtain raw materials, so they will become ubiquitous and cheap for all significant military powers to mass-produce. It will only be a matter of time until they appear on the black market and in the hands of terrorists, dictators wishing to better control their populace, warlords wishing to perpetrate ethnic cleansing, etc. Autonomous weapons are ideal for tasks such as assassinations, destabilizing nations, subduing populations and selectively killing a particular ethnic group."

The letter is signed by Hawking, Musk, Wozniak, Skype co-founder Jaan Tallinn, Noam Chomsky, and a host of AI and robotics researchers, including Google's Geoff Hinton, Microsoft's Eric Horvitz and more than a thousand others. It begins by noting that its primary concern is autonomous weapons that can make their own targeting decisions without approval from a human controller; notably, this definition specifically excludes cruise missiles and unmanned drones under remote human control.

This is just the latest in more than a year's worth of high-profile warnings about the potential dangers of A.I., coming from people such as Musk, Hinton and Bill Gates, who stand to profit from less-threatening forms of smart computers.
