Experts warn that UN failure to ban slaughterbots could spell the end of humanity
01/01/2022 // Cassie B.

Experts in military strategy and artificial intelligence are raising the alarm after a recent UN conference in Geneva failed to reach an agreement on banning the use of so-called slaughterbots.

"Slaughterbots" is the name given to weapons that can select and engage targets without any human intervention. These weapons make their decisions using artificial intelligence algorithms. Capable of hunting and striking targets without input from human controllers, the technology is advancing so fast that many fear societies and governments have not taken the time to fully consider the dangers.

This year, for the first time, most of the 125 nations in the UN Convention on Certain Conventional Weapons called for new laws governing the killer robots. However, some countries opposed the measure, such as the U.S. and Russia, both of which are known to be working on developing such weapons. Other nations that objected included India, Australia and the UK, with some arguing that continuing the development of these killer robots is vital to avoiding a strategic disadvantage.

Emilia Javorsky, who leads the Future of Life Institute's advocacy program on autonomous weapons, described the conference's inability to reach an agreement as an "epic failure."

She added: “It is now blatantly clear this forum — whose unanimity requirement makes it easily derailed by any state with a vested interest — is utterly incapable of taking seriously, let alone meaningfully addressing, the urgent threats posed by emerging technologies such as artificial intelligence.”

Unfortunately, time appears to be running out, as slaughterbots are already being used on some battlefields. For example, a UN report published this spring showed that STM Kargu drones have been used in the Libyan civil war. These small, portable rotary-wing attack drones have precision strike capabilities and were used to hunt down retreating soldiers.

The companies developing these drones are working on AI systems that can find a human target's thermal signature or even identify faces using a camera. However, the systems still lack the accuracy needed to reliably distinguish between a combatant and a non-combatant.

These weapons could be easy for anyone to obtain

The STM drones are among the most worrying for many officials, not least because they resemble ordinary consumer drones. They are fairly inexpensive, easy to mass-produce, and can be equipped with guns. Some experts have warned that this accessibility means gangs and other criminals could try to use them.

Massachusetts Institute of Technology Professor Max Tegmark believes we’re headed for the “worst possible outcome.” He said: “That’s going to be the weapon of choice for basically anyone who wants to kill anyone. A slaughterbot would basically be able to anonymously assassinate anybody who’s pissed off anybody.”

Tegmark told The Sun some of the ways this technology could be used. For example, he pointed out that if slaughterbots cost the same as AK-47s, drug cartels would use them to kill without being caught. He also said that even a judge protected by bodyguards could be killed if one of these drones were flown through their bedroom window while they slept.

Macalester College Professor James Dawes said: “It is a world where the sort of unavoidable algorithmic errors that plague even tech giants like Amazon and Google can now lead to the elimination of whole cities.”

“The world should not repeat the catastrophic mistakes of the nuclear arms race. It should not sleepwalk into dystopia,” he added.

There is no way it can end well when machines prone to unpredictable errors are allowed to make their own decisions about whom to kill. If these artificial intelligence weapons were equipped with chemical, biological or nuclear warheads, they could even wipe out humanity.
