AI and robotics are playing an increasingly prominent role in the field of weapon design. An open letter signed by Elon Musk, Stephen Hawking and Steve Wozniak, among others, made the case that weaponized robots could lead to “a global AI arms race” that turns self-directed drones into “the Kalashnikovs of tomorrow.”
“We believe that AI has great potential to benefit humanity in many ways and that the goal of the field should be to do so,” the open letter reads. “Starting a military AI arms race is a bad idea and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”
This isn’t the first time these technologists have warned of the dangers of artificial intelligence. Musk has said before that there “needs to be a lot more work on AI safety,” and a previous open letter from Musk, Hawking, Wozniak, and others spoke of the “pitfalls” that lay in wait if the research wasn’t done carefully. The solution, according to Musk and others, is a ban on autonomous weapons, similar to the kind that governs chemical weapons.
History suggests that such a ban could be hard to approve, let alone enforce: Despite many major powers signing the 1925 Geneva Protocol banning the use of chemical and biological weapons, countries such as Japan and the United States did not become signatories until as late as the 1970s, according to the Arms Control Association. And even then, claims are still made about the use of such weapons in violation of the ban.
Today we mostly rely on the fear of mutually assured destruction and on voluntary “no-first-use” commitments by countries such as China and India, which permit the firing of nuclear weapons only in response to a nuclear attack.
In addition, once humans perfect artificial intelligence, the AI thus created might seek to improve upon what it perceives as its creators’ flaws. Can humans, limited by the pace of biological evolution, compete with robots? The question we all face is this: Do we use new research to save mankind or to destroy it? The decision rests in the hands of researchers and governments. It is up to us to make sure that technology saves us, rather than destroying everything we hold dear.