By Sandra Song. (Originally posted March 24, 2016 on The NATO Association of Canada. Reprinted with permission)
The advancement of military technology is inching towards a robotic revolution that was once only imaginable in science fiction.
In recent years, the development of lethal autonomous weapons systems (LAWS), often called ‘killer robots’, has sparked a number of debates. The main concern focuses on the ethical considerations, followed by their legality and efficacy in warfare. Although these fully autonomous weapons have not yet been deployed, as far as we know, the possibility is not far on the horizon.
LAWS are weapons that would act on the basis of artificial intelligence (AI), capable of selecting and firing upon targets without any human intervention. An example of one of these weapons would be an armed quadcopter that could chase enemy combatants through a city and eliminate them according to its programming.
Human Rights Watch has noted that precursors to these weapons are in the process of being developed and deployed by China, Israel, South Korea, Russia, the United Kingdom and the United States.
This would be the first time a human operator could be removed from the battlefield altogether, a prospect that has raised many concerns about the structural transformation of warfare.
The consensus shared among the international community, prominent academics and scientists is that ‘killer robots’ need to be banned because of their indiscriminate effects. In the wrong hands, they could be overused, leading to a LAWS arms race. The other consequence is the further perpetuation of warfare, by removing the element of respect between human combatants and lowering the threshold for going to war.
According to the Women’s International League for Peace and Freedom (WILPF), fully autonomous weapons systems “lack every feature of human intelligence and human judgment that make humans subject and accountable to rules and norms. The use of artificial intelligence in armed conflict poses a fundamental challenge to the protection of civilians and to compliance with international human rights and humanitarian law.”
To date, the United Nations (UN) has held three major meetings under the Convention on Certain Conventional Weapons (CCW) to move towards drafting a treaty to ban LAWS. For the time being, there is general agreement that human beings should be kept in the loop when it comes to controlling the deadly application of force. In addition, under customary international humanitarian law, the use of weapons that are by nature indiscriminate is prohibited. However, the question remains as to whether LAWS qualify as indiscriminate by nature.
US Army Strategist Major Daniel Sukman explains that unmanned systems in the US, including unmanned autonomous weapons systems, have high functional accuracy; however, “they do not possess the ability to determine the second and third order effects of killing another human being.”
As there is always the possibility of human error, there can be flaws within an autonomous weapons system’s AI program. Consequently, such a system would not be able to apply the same common sense that a human soldier would.
Although LAWS present many frightening threats, they can be useful if applied with caution, since they are bound by their programming. These ‘killer robots’ are completely different from human combatants; they would therefore need to serve a different function.
Autonomous weapons can add physical distance between soldiers and highly contentious theatres of operation, reducing one of the largest costs of deploying soldiers and equipment. The use of LAWS could also reduce the number of war fatalities, as well as psychological effects such as post-traumatic stress disorder (PTSD).
In addition, what is often dismissed is the capacity for a ‘shoot-second’ approach to warfare. Human combatants exist in a state of ‘kill or be killed’, while an AI-controlled weapons system does not. By not taking the first shot, the system can evaluate the information it gathers before engaging a target. This could yield two benefits: dramatically lower collateral damage, and potentially less conflict, as enemy combatants may hesitate to fight a soldier they cannot exterminate.
The pros may not outweigh the cons, but it must also be considered that there is no such thing as a fault-free war. Keeping a human being in the loop to manage all weapons systems would not be an absolute fail-safe, nor would it be a universal remedy for all of the ethical quandaries in the LAWS debate.
To quote Major Sukman, “in complex war, the objective is not just to win but to do so while minimizing our losses and minimizing collateral damage, in addition to other goals… Autonomous systems may prove to be more adept at distinguishing between combatant and non-combatants on the battlefield, thereby enabling better and more precise targeting. Human decision-making may be in error due to fear, anger, or fatigue, which narrows the differences with autonomous systems.”
Advanced military technologies are changing the traditional structure of how wars are fought. LAWS may have the potential to reduce the number of losses during a time of conflict, but it is uncertain at what cost. With or without human operators in the loop as a fail-safe, the future of autonomous weapons systems could still have deadly results.
Canada’s Defence Perspectives 2020-2050: Recapitalization and the Canadian Forces is an event hosted in partnership with the Mackenzie Institute, with support from the Department of National Defence. The conference will engage security and defence academics, military representatives and professionals in various panels discussing Canadian defence perspectives. These articles have been composed to highlight some of the important talking points that will be raised throughout the conference, and to give further insight into the important work being done to continue the discussion on Canadian security and defence.
This article relates to the panel that will take place on day 2 of the conference and features:
- (Ret’d) D. Mike Day, CMM, MSC, CD
- Andrew Johnston, Program Leader, Security Materials Technology, National Research Council
- Alex Wilner, Carleton University (Moderator)
If this topic interests you, please attend our conference from March 29-30, 2016 at the Fairmont Château Laurier in Ottawa. For further information and registration, please click here.
Sandra Song is a Research Analyst at the NATO Association of Canada. She was formerly the Editor for the Canadian Armed Forces program, and previously a Junior Research Fellow for the Strategic Reserve Program in 2013. Sandra has a BA Bilingual Hons. in International Studies from Glendon College, York University. She recently completed her MA in International Conflict & Security at the University of Kent, Brussels School of International Studies. Her dissertation examined the political and legal perspectives of balancing security and liberty in cases where hijacked civilian aircraft would be used as weapons of terrorism.