Rise Of The Machines: Human Judgment Required

Nov 24, 2012


As custom government malware becomes an increasingly common international weapon with real-world effects—breaking a centrifuge, shutting down a power grid, scrambling control systems—do we need legal limits on the automated decision-making of worms and rootkits? Do we, that is, need to keep a human in charge of their spread, or of when they attack? According to the US government, no, we do not.

A recently issued Department of Defense directive, signed by Deputy Secretary of Defense Ashton Carter, sets military policy for the design and use of autonomous weapon systems in combat. The directive is intended to minimize "unintended engagements"—weapon systems attacking targets other than enemy forces, or causing collateral damage. But it specifically exempts autonomous cyber weapons.

Most weapon systems, the policy states, "shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force," regardless of whether the system is using lethal "kinetic" weapons or some form of non-lethal force. If bullets, rockets, or missiles are to be fired, tear gas is to be launched, or systems are to be jammed, a human needs to make the final decision on when they are used and at whom they are aimed.