Losing Humanity: The threat of autonomous weapons systems (aka "killer robots")

“Lethal autonomous weapons threaten to become the third revolution in warfare [after gunpowder and nuclear weapons]. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at time scales faster than humans can comprehend.”

“These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways.”

This dire warning comes from a 2017 open letter signed by 115 tech experts, including SpaceX CEO Elon Musk and Mustafa Suleyman, Alphabet’s artificial intelligence expert.

Pledge against killer robots

Since the publication of that letter, more than 2,400 tech professionals have formally pledged “neither [to] participate in nor support the development, manufacture, trade, or use of lethal autonomous weapons.”

Their statement, issued in July 2018, also calls on private firms and governments to follow suit.

High-tech militaries

Countries with high-tech militaries, particularly the United States, China, Israel, South Korea, Russia, and the United Kingdom, continue to develop autonomous systems for military applications.

12 countries

At present, at least 381 partly autonomous weapons systems have been deployed or are under development in 12 countries.

Technological advances in weaponry are far outpacing international law.

Over the past decade, the expanded use of armed unmanned vehicles has dramatically changed warfare.

Rapid advancements in artificial intelligence and robotics are bringing militaries ever closer to developing fully autonomous weapons that would be able to choose and fire on targets on their own, without any human intervention.

This capability would pose a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law.

Campaign to Stop Killer Robots

Project Ploughshares has joined the Campaign to Stop Killer Robots, a growing coalition of 76 nongovernmental organizations in 32 countries working to pre-emptively ban weapons systems that, once activated, would select and attack targets without human intervention.

At the most recent meeting of states parties to the United Nations Convention on Certain Conventional Weapons (CCW) in Geneva this August, the Campaign to Stop Killer Robots issued a statement urging states parties to heed calls “to begin negotiations on a new treaty to retain human control over weapons systems or prohibit lethal autonomous weapons.”

Thus far, 26 countries, including Austria, Brazil, and China, have called for a ban.

What is a fully autonomous weapons system?

In a fully autonomous weapons system (AWS), there is no meaningful human control over critical decisions, such as the decision to target and kill people. Once programmed and activated, the weapons system acts on its own.

Even when the system is not yet fully autonomous, the push toward autonomy can render human control essentially meaningless.

What are the legal and ethical concerns?

Because humanitarian law was created to apply to human beings, it is not at all clear who would be held legally responsible for an attack by an autonomous weapons system: the manufacturer, the programmer, the commander, or the robot itself.

Additionally, moral and ethical questions arise with the use of an autonomous weapons system:

  • What are appropriate levels of human involvement?
  • How much human control is being exercised over critical decisions such as targeting and killing?

For more information visit www.ploughshares.ca
