(Natural News) An international conference in Geneva, Switzerland, this week will consider whether to ban so-called killer robots powered by artificial intelligence.
Representatives from an advocacy group known as the Campaign to Stop Killer Robots will be on hand at the United Nations Convention on Conventional Weapons, the forum that will discuss the dangers posed by lethal, miniaturized autonomous weapons.
At the conference, the group, a coalition of non-governmental organizations, is showing a short film called “Slaughterbots,” which dramatizes the risks posed to civilians by mini-drones operating with minimal human supervision. The Campaign seeks a ban on the development, production, and use of autonomous weapons.
The Guardian summarizes what the compelling, futuristic video depicts: it opens with a product presentation at a technology conference, but the scene then shifts to one of terror. A swarm of palm-sized drones, equipped with explosives and facial-recognition capabilities, hunts down and kills college students with shots to the head as they attempt to flee a classroom.
The short, disturbing film is the latest attempt by campaigners and concerned scientists to highlight the dangers of developing autonomous weapons that can find, track and fire on targets without human supervision. They warn that a preemptive ban on the technology is urgently needed to prevent terrible new weapons of mass destruction.
Although AI technology has already gained currency in military applications, the scenario could be even more dire should autonomous drones (i.e., those operating independently of human oversight) fall into the wrong hands and be deployed in terrorist attacks or acts of aggression by rogue regimes.
Computer science professor Stuart Russell of the University of California, Berkeley, suggests that the technology is closer to science fact than science fiction. “It’s not the Terminator that experts in AI and robotics like myself are worried about but much simpler technologies currently under development,” warns Russell. The Guardian goes on to explain that “The manufacture and use of autonomous weapons, such as drones, tanks and automated machine guns, would be devastating for human security and freedom, and the window to halt their development is closing fast.”
The Daily Mail echoed this assessment, reporting that technology allowing a pre-programmed robot to shoot to kill, or a tank to fire at a target with no human involvement, is only years away, according to experts.
Tesla and SpaceX CEO Elon Musk has repeatedly warned that rapidly advancing artificial intelligence could give rise to self-replicating machines that might threaten humanity. Health Ranger Mike Adams, the founding editor of Natural News, has similarly cautioned that once AI technology develops into highly evolved, self-aware systems, the human race will have a big Terminator problem on its hands. (Related: Read more about artificial intelligence at Robotics.news.)
Musk was one of about 100 robotics experts who signed an open letter in August recommending that the U.N. prohibit the use of AI weaponry.
This week’s conference is only an initial step in complex negotiations that may eventually lead to a ban on autonomous weapons by international treaty, so it remains to be seen what the future holds for mini-drones and similar weapons, which may or may not fit the definition of “conventional.” A second U.N. conference is scheduled for later this month as diplomats and bureaucrats tackle the issue. Nineteen countries currently support the call to ban AI weapons.
Separately, an Australian scientist joined with colleagues in Canada to urge their respective governments to regulate AI devices to the same extent as chemical, biological, and nuclear weapons, the Independent reported.
Without a ban, they warn, there will be an arms race to develop increasingly capable autonomous weapons. These will be weapons of mass destruction: one programmer will be able to control a whole army.
Watch the “Slaughterbots” video, which has been viewed about 90,000 times on YouTube as of this writing, and draw your own conclusions.