Beyond drone warfare: Prof warns of ‘automated killing machines’

Posted May 29, 2015

The acronym may sound comforting, but LAWS stands for lethal autonomous weapons systems — systems that, unlike remotely piloted drones, select and engage targets without human intervention. Such artificial-intelligence weaponry is “feasible within years, not decades,” warns Stuart Russell, a UC Berkeley professor of computer science, and “the stakes are high: LAWS have been described as the third revolution in warfare, after gunpowder and nuclear arms.”

In an op-ed piece for the science journal Nature, Russell, an expert in artificial intelligence, outlines the debate over the use of AI weapons systems, and notes widespread agreement on the need for “meaningful human control” over targeting and engagement decisions. “Unfortunately,” he adds, “the meaning of ‘meaningful’ is still to be determined.”

“Some argue that the superior effectiveness and selectivity of autonomous weapons can minimize civilian casualties by targeting only combatants,” writes Russell, one of four leading researchers invited by Nature to weigh in on the question. “Others insist that LAWS will lower the threshold for going to war by making it possible to attack an enemy while incurring no immediate risk, or that they will enable terrorists and non-state-aligned combatants to inflict catastrophic damage on civilian populations.”

MQ-9 Reaper in flight. Image credit: U.S. Air Force

“The capabilities of autonomous weapons will be limited more by the laws of physics — for example, by constraints on range, speed and payload — than by any deficiencies in the AI systems that control them,” Russell writes. Moreover, “LAWS could violate fundamental principles of human dignity by allowing machines to choose whom to kill — for example, they might be tasked to eliminate anyone exhibiting ‘threatening behavior.’ The potential for LAWS technologies to bleed over into peacetime policing functions is evident to human-rights organizations and drone manufacturers.”

Such decisions, he argues, are far too important to be left to machines. “The AI and robotics science communities, represented by their professional societies, are obliged to take a position,” he writes, “just as physicists have done on the use of nuclear weapons, chemists on the use of chemical agents and biologists on the use of disease agents in warfare.”

Accompanying the op-ed is an audio interview in which Russell stresses the urgency of human oversight. “It’s a very easy step to go from remotely piloted drones to fully autonomous weapons,” he says. “The AI community may be too late. We might decide that we don’t like our technology being used to kill people, but we may not have a say in it.”

Source: UC Berkeley
