By Heather M. Roff
Over the last few years, global leaders have started debating how to handle the prospect of autonomous weapons—aka killer robots—capable of selecting and engaging targets without human intervention. The implications here are, of course, enormous: Such a system would be able to identify a potential target and decide to fire upon it without a human telling it exactly what to do or perhaps even knowing what it’s going to do. While no military has announced that it possesses autonomous weapons, some countries’ armed forces do possess systems capable of loitering in an area “hunting for a target” and then firing upon it, such as the Israeli Harpy and Harop. Others have systems that can navigate by themselves, communicate with other weapons, and “decide” which target to fire upon from a preselected area or class of targets, like the U.S. Long-Range Anti-Ship Missile. Such systems are halfway to true autonomous weapons.
Read on Slate.com
By Brad Allenby
The world is in a confused and dangerous state. Russia, a nuclear power, invades Ukraine and threatens the Baltic states, all the while spouting casual nuclear threats. ISIS recruits by posting videos of its brutal murders. Portions of both the Middle East and sub-Saharan Africa have degraded into weak or failed states, exhibiting what some have called neomedievalism, which is characterized by violence, polycentric governance, and warring ideologies. Camps within the American and European right and left reject science as an authoritative source of truth, accepting only that which accords with their belief systems. It seems chaotic—what American military author and historian Sean McFate calls “durable disorder”—but it has at least one unifying underlying theme: the rejection of the modern, technologically sophisticated, complex, multicultural, and multipolar world.
Read on Slate.com