Report Cites Dangers of Autonomous Weapons


By John Markoff


A new report written by a former Pentagon official who helped establish United States policy on autonomous weapons argues that such weapons could be uncontrollable in real-world environments where they are subject to design failure as well as hacking, spoofing and manipulation by adversaries.


In recent years, low-cost sensors and new artificial intelligence technologies have made it increasingly practical to design weapons systems that make killing decisions without human intervention. The specter of so-called killer robots has touched off an international protest movement and a debate within the United Nations about limiting the development and deployment of such systems.


The new report was written by Paul Scharre, who directs a program on the future of warfare at the Center for a New American Security, a policy research group in Washington, D.C. From 2008 to 2013, Mr. Scharre worked in the office of the Secretary of Defense, where he helped establish United States policy on unmanned and autonomous weapons. He was one of the authors of a 2012 Defense Department directive that set military policy on the use of such systems.


In the report, titled “Autonomous Weapons and Operational Risk,” set to be published on Monday, Mr. Scharre warns about a range of real-world risks associated with weapons systems that are completely autonomous.


The report contrasts these completely autonomous systems, which have the ability to select and kill targets without human intervention, with weapons that keep humans “in the loop” in the process of selecting and engaging targets.


Mr. Scharre, who served as an Army Ranger in Iraq and Afghanistan, focuses on the potential types of failures that might occur in completely automated systems, as opposed to the way such weapons are intended to work. To underscore the military consequences of technological failures, the report enumerates a history of the types of failures that have occurred in military and commercial systems that are highly automated.


“Anyone who has ever been frustrated with an automated telephone call support helpline, an alarm clock mistakenly set to ‘p.m.’ instead of ‘a.m.,’ or any of the countless frustrations that come with interacting with computers, has experienced the problem of ‘brittleness’ that plagues automated systems,” Mr. Scharre writes.


