Currently there are no fully autonomous weapons, but that statement will soon cease to be true. Autonomous weapons are weapons that can select and fire on targets without human intervention; some have dubbed them "killer robots". The United States, the U.K., South Korea, Germany, and other countries are all engaged in research on autonomous weapons, and many experts believe fully autonomous weapons could be fielded in less than 20 years.

Recently, Human Rights Watch and the Harvard Law School International Human Rights Clinic published a 50-page report entitled "Losing Humanity: The Case Against Killer Robots". This extensively researched report outlines the legal and non-legal concerns raised by "killer robots". These range from the absence of human accountability under the law when autonomous weapons use deadly force, to the undermining of non-legal checks on the killing of civilians, such as compassion.
Some argue that the use of autonomous weapons will save the lives of our military servicemen and servicewomen. In a country that has been at war for over ten years, with thousands of lives lost, a technology that could reduce that toll in the next war should not be easily overlooked.
Human Rights Watch and the International Human Rights Clinic have called for a treaty that would ban the development, production, and use of these weapons. They have made it very clear, and I agree, that if this weapon system is to be banned in any serious way, it must be prohibited now, before the technology exists and its use becomes too likely to stop.
My question to you is this:

1) If you agree with the use of this technology, whom, and under what currently enacted treaty or law, would you hold accountable for crimes against humanity if one of these machines kills a civilian? The computer programmer? If the answer is no one, what solution would you offer?
Human Rights Watch