Tuesday, 20 November 2012 11:22

Human Rights Watch calls for autonomous drone ban

Written by Peter Scott



Only humans should kill humans, not Skynet


Human Rights Watch has called for an international ban on autonomous robots capable of shooting people without intervention from human operators.

In a report co-produced with the Harvard Law School, the rights group warns about the dangers of employing autonomous robotic weapons on the battlefields of tomorrow. The report describes the contraptions as “killer robots” and calls for an international treaty that would ban their development, production and deployment.

The defense industry has been in love with tech for decades, and various levels of automation have featured in countless weapons systems dating back to World War II. What separates those systems from “killer robots,” however, is the degree of autonomy.

Current-generation systems are not entirely autonomous; in most cases they still rely on a human operator to squeeze the trigger or push the button. The report states that fully autonomous robots that decide for themselves when to fire could be developed within 20 to 30 years, possibly sooner.

Purely defensive autonomous weapons, such as anti-missile close-in weapon systems (CIWS), have been around for years, but the report focuses on robots that would shoot actual people rather than sea-skimming missiles.

Human Rights Watch arms division director Steve Goose argued that it would be best to preempt the development of “killer robots” before they get off the drawing board, just in case.

Robotics professor Noel Sharkey raises another problem – accountability.

“If a robot goes wrong, who’s accountable? It certainly won’t be the robot,” he said. “The robot could take a bullet in its computer and go berserk, so there’s no way of really determining who’s accountable and that’s very important for the laws of war.”

Well, to be honest, we're not doing a good job of prosecuting real flesh-and-blood war criminals, so why should robots be any different?

More here.

