Scientists in attendance at the World Economic Forum in Davos, Switzerland have issued a warning that would've sounded funny even ten years ago: beware the coming killer robot menace. They're deadly serious.
The Agence France-Presse reported that Angela Kane, the German former UN High Representative for Disarmament Affairs, argued during a debate that it could already be "too late" to avoid battlefields full of artificially intelligent, fully independent battlebots. We're not talking about remotely operated machines like the U.S. military's drones, but about armed and armored machines making their own decisions, choosing targets and executing deadly missions.
That's the big fear regarding our very real future: Do we want machines that will be able to decide who lives and who dies in battle? That sounds like some serious Skynet business. Can machines even follow the rules of war?
UC Berkeley computer science professor Stuart Russell doesn't think so, reported the AFP:
"We are talking about autonomous weapons, which means that there is no one behind it. AI: artificial intelligence weapons," he told a forum in Davos. "Very precisely, weapons that can locate and attack targets without human intervention."
"I am against robots for ethical reasons but I do not believe ethical arguments will win the day. I believe strategic arguments will win the day," Russell said.
The United States renounced biological weapons because of the risk that one day they could be deployed by "almost anybody," he said. "I hope this will happen with robots."
While battle-ready autonomous robots aren't primed to start plowing through ISIS encampments today, the brains at Davos are in accord with the many top tech and science luminaries who signed an open letter in the summer of 2015 warning that such machines are just a few years away, and making clear they do not want them built.
Alan Winfield, an electronics engineering professor from the United Kingdom, put a fine point on why, noting that even though they sound like awesome science fiction come true, battle robots deprive humans of "moral responsibility" for battlefield action.
Worse, robots probably couldn't handle the unpredictability of warfare. "When you put a robot in a chaotic environment," said Winfield, "it behaves chaotically."