The Role of Artificial Intelligence and Machine Learning in the Military
Almost every aspect of human life is influenced by science and technology, and every technology, the smartphone included, has two sides. As science and technology have advanced, they have fundamentally changed our perspective on life. Among these advancements, robotics is one of the most significant, steadily moving closer to human life. Although robots have made our daily lives easier, they can still create problems.
As we are undoubtedly aware, machine learning algorithms build a mathematical model from sample data, known as “training data”, in order to make predictions or decisions without being explicitly programmed to perform the task.
Let us assume, for now, that there are errors in the training data. What will happen? When machine learning systems fail, it is rarely because of problems with the machine learning algorithm itself. More likely, human error has been introduced into the training data, creating bias or some other systematic error. It is therefore better to always be skeptical, and to approach machine learning with the same discipline you apply to software engineering.
"It is better to always be skeptical, and approach machine learning with the discipline you apply to software engineering"
Use in Defense
If you want to gauge a country’s innovative technology, look at the weaponry used by its defense forces. Although many of these technologies are sensitive and secret, it is worth understanding some of the issues surrounding artificial intelligence in defense. According to Elon Musk, the founder of Tesla and SpaceX, if a third world war happens, robotics will play the main role. Just as the creation of humans through DNA cloning has been banned, more and more individuals are calling for curbs on the use of AI in the military.
Elon Musk, Stephen Hawking and Steve Wozniak on AI
AI and robotics are playing an increasingly prominent role in the field of weapon design. An open letter signed by Elon Musk, Stephen Hawking and Steve Wozniak, among others, made the case that weaponized robots could lead to “a global AI arms race” that turns self-directed drones into “the Kalashnikovs of tomorrow.”
“We believe that AI has great potential to benefit humanity in many ways and that the goal of the field should be to do so,” the open letter reads. “Starting a military AI arms race is a bad idea and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”
This is not the first time these technologists have warned of the dangers of artificial intelligence. Musk has warned before that there “needs to be a lot more work on AI safety,” and a previous open letter from Musk, Hawking, Wozniak, and others spoke of the “pitfalls” that lie in wait if the research is not done carefully. The solution, according to Musk and others, is a ban on autonomous weapons, similar to the kind that governs chemical weapons.
History suggests that such a ban could be hard to approve, let alone enforce: Despite many major powers signing the 1925 Geneva Protocol banning the use of chemical and biological weapons, countries such as Japan and the United States did not become signatories until as late as the 1970s, according to the Arms Control Association. And even then, claims are still made about the use of such weapons in violation of the ban.
Today we mostly rely on the fear of mutual destruction and on voluntary commitments by countries such as China and India to a “no-first-use” policy, which permits the firing of nuclear weapons only in response to a nuclear attack.
In addition, once humans perfect artificial intelligence, the AI thus created might be tempted to improve upon its perceived flaws. Can humans, limited by biological evolution, compete with robotics? The question we all face is: do we use new research to save mankind or to destroy it? The decision rests in the hands of researchers and governments. It is up to us to make sure that technology saves us rather than destroying everything we hold dear.