This is not good. It sounds like a sci-fi movie in real life. Please let me know what you think. Andrew
Scientists, engineers and policymakers are all figuring out how drones can be used better and more smartly: more precisely, with less harm to civilians, and with longer range and greater staying power. One method under development is increasing the autonomy of the drone itself.
Eventually, drones may have the technical ability to make even lethal decisions autonomously: to respond to a programmed set of inputs, select a target and fire their weapons without a human reviewing or checking the result. Yet the idea of the U.S. military deploying a lethal autonomous robot, or LAR, is sparking controversy. Though autonomy might address some of the current downsides of how drones are used, it introduces new downsides that policymakers are only just learning to grapple with.
The basic conceit behind a LAR is that it can outperform and outthink a human operator. “If a drone’s system is sophisticated enough, it could be less emotional, more selective and able to provide force in a way that achieves a tactical objective with the least harm,” said Purdue University Professor Samuel Liles. “A lethal autonomous robot can aim better, target better, select better, and in general be a better asset with the linked ISR [intelligence, surveillance, and reconnaissance] packages it can run.”
Read more at http://www.nationaljournal.com/national-security/soon-drones-may-be-able-to-make-lethal-decisions-on-their-own-20131008
- Ready for This? Lethal Autonomous Robot Drones. (defenseone.com)
- U.N. Wants to Stop All Production, Testing of Killer Robots (theblaze.com)
- The U.S. Military Wants Autonomous “Killer Robots” (warnewsupdates.blogspot.com)
- NPS challenges students to consider the ethics of unmanned systems (dvidshub.net)