By akademiotoelektronik, 21/03/2023

Killer robots: The United Nations is struggling to establish a legal framework for the use of lethal autonomous weapons

As early as 2013, lethal autonomous weapons systems (SALA) were discussed at the United Nations under the Convention on Certain Conventional Weapons (CCAC). At the end of 2021, the subject was taken up again by the 125 states gathered in Geneva but, for lack of agreement, the file was referred to a group of governmental experts and will be addressed again this year.

SALA

At the cutting edge of high technology and relying on artificial intelligence, SALA are not yet covered by a common definition or by any international oversight. They raise strong concerns, notably of an ethical and legal nature with regard to international humanitarian law (IHL), even though discussions began in 2013 in Geneva within the framework of the Convention on Certain Conventional Weapons (CCAC) of 1980, amended in 2001.

For many of us, killer robots belong to science fiction, and global public opinion does not seem concerned by the problems they are already raising. Yet organizations such as "Stop Killer Robots" and "Human Rights Watch" launched a campaign in 2013 for the prohibition of fully autonomous weapons, which many countries have since joined and which calls for a law to that effect. Mary Wareham, coordinator of the campaign, said:

The race for SALA

While some countries have declared themselves against the development of SALA and joined the "Stop Killer Robots" campaign, others are more ambivalent, or even in favour ... The ethics committee of the French armed forces, for its part, condemns "robots, boosted by artificial intelligence, which themselves use integrated software to find and strike targets", but is not really opposed to robotic weapons piloted by humans.


All the major powers are investing in artificial intelligence for their armed forces. Vladimir Putin said of AI in 2017:

Semi-autonomous weapons systems have been used in recent "modern" wars: drones, missiles, tanks, submarines ... but they remain supervised by humans. According to experts, whether a killer robot can decide on its own to attack, using cameras, sensors, visual-recognition software and algorithms, now depends only on a decision by the authorities: the technology is practically operational. The United States, China, Russia, South Korea, Australia, India, Turkey and Israel are investing in the development of autonomous weapons systems capable of identifying, targeting and killing a person, even though no international law regulates their use.

Taking the human out of the decision

Handing a decision of life and death to a technology driven by data obviously poses an ethical problem. A machine will never understand the context of an action or its consequences. Without human intervention, the proportionality of an attack and the distinction between civilians and soldiers will not be taken into account by the AI system. The laws of war would be flouted: a group of men trying to surrender, for example, would be identified as attackers and killed ... Noel Sharkey, chairman of the International Committee for Robot Arms Control and an expert in AI and robotics at the British University of Sheffield, said:

A necessary legal framework

An arms race in SALA is already under way, the protection of civilians is not genuinely taken into account, and killer robots could themselves fall victim to cyberattacks capable of triggering global conflicts. The danger is real, and a law to guard against it is essential. Even though more and more governments, institutions, AI experts and scientists are calling for the regulation of SALA, some countries oppose it, with Russia, the United States, India and Israel at the forefront, making the adoption of such a law impossible to date. If the CCAC cannot produce tangible recommendations, a new international treaty could nevertheless be adopted through an independent process.
