Human Rights Watch has issued a plea for governments to preemptively ban autonomous drones capable of killing.
In a report written in conjunction with Harvard Law School, “Losing Humanity: The Case Against Killer Robots,” Human Rights Watch has called for preemptive action to halt the development and deployment of autonomous drones capable of killing without intervention by human operators.
Within the 50-page study, the rights group calls autonomous drones “killer robots” and says that tackling the issue now is the only way to stop the future abuse of military force — and the unnecessary deaths of civilians. The report urges governments to consider “an international treaty that would absolutely prohibit the development, production, and use of fully autonomous weapons.”
There are currently no fully independent drones equipped with killing capabilities. Instead, a human operator is required to take responsibility for the final push of a button. However, advances in technology — mainly stemming from the U.S. — have produced autonomous surveillance drones and trackers. If military technology continues down the same road, it is reasonable to view these systems as precursors to robots with greater autonomy on the battlefield.
According to the rights group, some experts predict that this change could take place within 20 to 30 years.
“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, Arms Division director at Human Rights Watch. “Human control of robotic warfare is essential to minimizing civilian deaths.”
A major concern is the lack of accountability that fully autonomous weapons would create. Not only could their use violate international law, but Human Rights Watch says that trying to hold military personnel responsible for a robot’s actions would “present significant challenges.”
Robotic soldiers might also be unable to distinguish between civilians and combatants, and might lack the judgment to assess a situation properly — making it difficult to weigh the military advantage of an attack against the risk to civilian safety.
“Action is needed now, before killer robots cross the line from science fiction to feasibility,” Goose said.
Image credit: HRW/Chief Fire Controlman Brian Kirkwood, U.S. Navy