Editorials

Principles needed for autonomous lethal weapons

While much of the progress in artificial intelligence has unfolded in plain view (machines that have mastered games such as chess and Go, and driverless cars, are two examples), important developments are taking place far from such scrutiny. The most important of these is the creation of lethal autonomous weapons systems (LAWS), which militaries are developing with speed and intensity. These weapons raise new legal and ethical concerns. Yet despite growing attention to this troubling phenomenon, there is precious little consensus on the appropriate response. A legally binding framework for these weapons must be developed, and soon, to ensure that both regulations and norms can guide efforts in this field.

There is no single agreed definition of LAWS. In general, the term refers to weapons that can detect, select and engage targets with little or no human intervention. It encompasses a wide range of weapons systems, some of which require no human action and others no human authorization. In the most fervid examples, critics condemn "killer robots" capable of unleashing extraordinary lethality and destruction without human input. Supporters of these weapons systems counter that their processing capabilities reduce the chance of human error and increase accuracy. While fully autonomous weapons are a distant prospect, it is estimated that global spending on robotics and drones reached $95.9 billion last year and will top $200 billion by 2022.

The United Nations has, under the Convention on Certain Conventional Weapons (which has 125 state parties), for three years convened a Group of Governmental Experts (GGE) to examine and discuss developments in this field. The name is misleading: the group includes more than governmental experts, as international organizations, nongovernmental organizations and academic groups also participate. Last week, the group met for the third time, and efforts to propose a framework for limiting, if not banning, the development and use of LAWS were again stymied.

Advocates of a ban argue that LAWS violate the Martens Clause in the preamble to the 1899 Hague Convention (II) on the Laws and Customs of War on Land. That clause states that even without an explicit ban, all states must ensure that their weapons comply with "principles of humanity" and "the dictates of public conscience." Killer robots would not. Since autonomous machines would act according to their programming rather than principles of humanity, critics argue they fail this test; they also point to public opinion surveys showing that majorities in many countries oppose LAWS, and that opposition has grown in recent years.

At a ceremony marking the 100th anniversary of the end of World War I, U.N. Secretary-General Antonio Guterres backed a ban on LAWS. He encouraged his audience to "imagine the consequences of an autonomous system that could, by itself, target and attack human beings." For him the conclusion was clear: "I call upon states to ban these weapons, which are politically unacceptable and morally repugnant." He repeated the point in a message to the GGE, calling "machines with the power and discretion to take lives without human involvement … politically unacceptable, morally repugnant and should be prohibited by international law."

Japan walks a fine line in this debate. While it has said it does "not intend to develop any lethal weapon that is completely autonomous and functions without human control," officials and experts acknowledge that autonomous weapons make sense as the pool of individuals who can serve in the Self-Defense Forces shrinks, and because such systems can reduce collateral damage. Japan's working paper on the issue submitted to the GGE noted that the country "intends to develop and operate such systems" that "are designed to ensure meaningful human control." The paper argues that while there is no agreement about key terms such as "autonomous," "lethal" and "the forms of human control," there is an effective consensus that "meaningful human control is essential."

A large majority of states are prepared to endorse a legally binding framework to ban the development of LAWS, even if particular terms must be better defined. There are fears that a small group of countries is content to let the debate drag on while continuing to develop LAWS in the meantime. As that debate continues, individuals in the private sector are voicing their opposition and refusing to work on such weapons systems. That may prove to be the most powerful obstacle to the development of LAWS. This is encouraging, but not all scientists share these scruples and reservations. A legal framework for states is needed. It's time to start declaring where consensus exists and begin to articulate the principles that can govern state behavior even if a treaty remains beyond reach. Japan can and should help lead in this effort.