12 August 2021
Lethal Autonomous Weapons: No Law for LAWS?
SYNOPSIS
Representatives from 50 countries have gathered in Geneva to resume discussions on the challenges posed by lethal autonomous weapons (LAWS). Earlier discussions have yielded little in terms of a commitment to regulate LAWS. An outright ban desired by international rights groups and some countries is unlikely to materialise.
COMMENTARY
ON 2 AUGUST 2021, the international rights group Human Rights Watch (HRW) launched a renewed effort to urge governments to negotiate a treaty that would impose binding regulations, or even an outright ban, on the use of lethal autonomous weapons (LAWS).
The NGO released a 17-page report entitled “Areas of Alignment: Common Visions for a Killer Robots Treaty”. It describes how dozens of countries stated their objections to “delegating life-and-death decisions to machines” at meetings of the Convention on Certain Conventional Weapons (CCW) last held in September 2020. The report came a day before representatives from 50 countries were expected to take up the matter in Geneva after a months-long hiatus.
Military Utility of LAWS
Many military systems and weapons in use today are already semi-autonomous: they rely on autonomy for certain functions but retain a physical or communications link to a human operator, who makes the ultimate decision on any action.
One example is integrated air defence systems that can automatically identify, track, and calculate optimal responses to multiple incoming threats. Another is unmanned vehicles that can self-navigate and execute a range of actions using pre-determined directives set by an operator.
In both instances, the command to employ lethal force, such as launching a surface-to-air missile or firing an unmanned aircraft’s onboard weapons, remains firmly the province of human control.
In contrast, an autonomous weapon system can identify, select, and engage a target with lethal force without an operator’s intervention, independently responding to a dynamic environment and determining the optimal course of action to achieve its pre-programmed goals.
Advantages of LAWS
Because autonomous weapons do not require constant interaction with one or more human operators, these systems would offer military forces several advantages.
These include reduced cognitive load on personnel, much shorter sensor-to-shooter cycles, and even immunity to traditional countermeasures such as jamming. Large numbers of these weapons could also be deployed simultaneously, since human operators, with their limited cognitive capacity, would no longer be needed to direct them.
Indeed, notwithstanding the ongoing debate and growing alarm over LAWS, military forces across the world have demonstrated a growing interest in procuring unmanned aerial, ground, and maritime vehicles that are often highly automated and capable of carrying weapons.
While these unmanned vehicles remain largely under human control, there are serious concerns that the line between present-day platforms and LAWS will blur over time as military forces continue to seek and field increasingly advanced systems.
Moreover, the emerging breed of unmanned combat systems being developed to fight alongside or replace crewed platforms in the air, land, and maritime domains will necessarily feature high levels of autonomy, both to operate in spectrum-contested environments and to independently assess and react to fluid battlefield situations.
Paradigm Shift Into the Unknown?
The proliferation of such weapon systems could result in a paradigm shift in the conduct of military operations.
Some observers believe that LAWS present an opportunity to make combat more humane by reducing the number of civilian and military casualties on the battlefield. Others are concerned about unintended escalation and potential global instability, as well as the risks of having such potentially deadly weapons fall into the hands of non-state actors.
Among the main concerns of international watchdogs and countries demanding tight control or a total ban on LAWS is that the use of such systems is incompatible with international humanitarian law. That body of law rests on the fundamental rules of distinction and proportionality, which depend on human judgement to serve as checks and balances and to compel most actors to apply force only in a limited range of situations.
Another pressing concern is how little data on, and understanding of, such systems we currently possess. For example, the AI models that underpin autonomous weapons must be trained, tested, and validated to ensure that they function as expected.
But such efforts are presently carried out in controlled environments such as laboratories, which are hardly representative of rapidly changing and often highly ambiguous real-world conditions. Without rigorous testing under realistic conditions, the risk of unintended and disastrous results is very real.
Moreover, while individual operators and military commanders can be held responsible for breaching international treaties such as the Geneva Conventions, it would be legally challenging and morally questionable to bring charges against such individuals for the unintended or unforeseeable actions of an autonomous weapon system.
Most of these groups therefore believe that an unambiguous and total ban on LAWS is necessary, as opposed to current approaches that centre on implementing guiding principles and best practices or, worse, non-binding political declarations.
Banning LAWS: An Unlikely Prospect?
According to HRW, there are presently 31 countries calling for a ban on LAWS: Algeria, Argentina, Austria, Bolivia, Brazil, Chile, China, Colombia, Costa Rica, Cuba, Djibouti, Ecuador, Egypt, El Salvador, Ghana, Guatemala, the Holy See, Iraq, Jordan, Kazakhstan, Mexico, Morocco, Namibia, Nicaragua, Pakistan, Panama, Peru, the State of Palestine, Uganda, Venezuela, and Zimbabwe.
The outlier is China, which in April 2018 surprisingly called upon other governments “to negotiate and conclude a succinct protocol to ban the use of fully autonomous weapons systems”, becoming the first permanent member of the UN Security Council, and the 26th country overall, to call for a ban on LAWS. However, Beijing’s stated intent appears to have little substance, given that its military apparatus continues to develop such systems today.
Despite a growing pool of international voices supporting this motion, it is clear that prospects for a legally binding instrument on LAWS remain dim for the foreseeable future for several reasons.
First, CCW discussions on the matter have so far produced little, because the forum’s consensus-based process slows decision-making and often yields lowest-common-denominator outcomes.
Second, most countries supporting a ban on LAWS lack the means to pursue indigenous development of these systems, while countries that possess the financial and technical wherewithal to do so remain opposed to such limitations.
For instance, leading military powers such as Russia and the United States have consistently rejected proposals for a treaty, calling such moves “premature”.
About the Author
Kelvin Wong is a lead technology analyst and editor with defence intelligence provider Janes, specialising in unmanned systems and emerging military technologies. He was formerly with the Military Studies Programme at the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore. The views expressed are his own.