By akademiotoelektronik, 09/06/2022

"Killer robots": between legend and reality, what should we be afraid of?

Yannick Smaldore, August 12, 2021 at 3:07 p.m. A Kargu-2 drone of this type is said to have been used to track down and eliminate targets without direct human control. Credits: STM

For the past few weeks, the world press has been stirred up over the issue of "killer robots". At issue is a UN report on the Libyan conflict, dating from March, which appears to describe, for the first time, a drone attack carried out without human intervention. Enough to revive every fantasy surrounding "killer robots", and an opportunity to recall the need for international regulation.

However, while the militarization of drones remains a legitimate concern, "killer robots" should stay confined to science fiction for a long time to come.

See also: Killer robots worry Musk and the high-tech world

How drones became autonomous

The use of military drones dates back to the Vietnam War, even if this type of aircraft only became widespread from the 1980s onward. Initially designed to gather intelligence without risking a pilot's life, some models were gradually given more offensive capabilities. In other words, since the drone is already observing the enemy and relaying its video feed in real time to an observer on the ground, why not use it to attack the adversary directly?

Two families of systems were then developed. The first, found on the American Predator and Reaper drones, carries weapons under the aircraft's wings, turning it into an "attack drone". With the other system, the approach favored by Israel with the Harop and Harpy, the observation drone itself can become a weapon by diving onto a target. We then speak of a "suicide drone" or "loitering munition".

But in all these cases, these flying death machines remain remotely controlled. While modern drones can generally fly, or even take off and land, automatically, the on-board optronic sensors and armaments are still controlled in real time by human operators. For smaller drones, the range of radio waves limits the field of action to a few kilometers, but the largest models carry satellite communication antennas. In every case, what matters to users is keeping a "human in the loop". After all, the drone's primary mission remains to serve as a remote sensor for its operators.

Reaper drones are not "killer robots", since they remain remotely piloted. However, their intensive use for targeted assassinations had already drawn sharp criticism around the world. Credits: General Atomics

However, the drone attack mentioned in the UN report published last March would, for the first time, have been carried out without human intervention. In other words, a Kargu-2 quadcopter of Turkish origin, after losing the radio link with its human operator, reportedly continued its mission, selecting and attacking a human target, without anyone knowing whether the attack caused casualties.

For many observers, a line has therefore been crossed. Drones would no longer be strictly subject to the (already questionable) control of human operators. Their lethal capabilities could now be governed by AI alone, raising enormous ethical and legal problems. But, on closer inspection, is this really new? Can drones really become "killer robots"?

See also: International concern is mounting over hypothetical "killer robots"

Autonomy and decision-making: what are the technological limits?

In recent years, the spread of "kamikaze drones", or "loitering munitions", has tended to blur the line between missiles and drones. This new category of munition has developed alongside new consumer technologies, exploiting both the miniaturization of cameras and increasingly secure digital data links. From then on, a simple commercial drone fitted with a small military charge can become a cheap mini-missile.

It was therefore only a matter of time before certain "kamikaze drones" acquired a mode of operation that has existed on some missiles for decades: "lock-on after launch". This mode allows a missile to be fired toward a predetermined area without the human operator knowing precisely where the target is. Using its radar or optronic seeker head, target recognition algorithms, and a library of images recorded before launch, the missile finds and destroys its target on its own. We can then speak of autonomous guidance, even if the designation of the target indirectly remains the operators' doing, since it is their responsibility to set the missile's parameters correctly.
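To give a feel for the principle (and not any actual seeker's software), "lock-on after launch" can be caricatured as template matching: the guidance logic slides a pre-recorded image of the target across each sensor frame and locks onto the best match. A minimal sketch with NumPy, using normalized cross-correlation on entirely synthetic data — every name and the scene itself are invented for illustration:

```python
import numpy as np

def best_match(scene: np.ndarray, template: np.ndarray) -> tuple:
    """Slide the pre-recorded template over the sensor frame and return
    the (row, col) offset with the highest normalized cross-correlation:
    a toy stand-in for a seeker's lock-on logic."""
    th, tw = template.shape
    t = template - template.mean()
    best_score, best_pos = -np.inf, (0, 0)
    for r in range(scene.shape[0] - th + 1):
        for c in range(scene.shape[1] - tw + 1):
            patch = scene[r:r + th, c:c + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p ** 2).sum() * (t ** 2).sum())
            score = (p * t).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Synthetic "sensor frame": low-amplitude noise with a bright
# cross-shaped target pasted at offset (12, 20).
rng = np.random.default_rng(0)
scene = rng.normal(0.0, 0.1, size=(32, 48))
target = np.zeros((5, 5))
target[2, :] = 1.0
target[:, 2] = 1.0
scene[12:17, 20:25] += target

print(best_match(scene, target))  # locks onto the pasted target
```

The brute-force double loop is obviously nothing like a real-time seeker, but it captures the article's point: the "autonomy" here is a fixed pattern-matching routine against operator-supplied reference imagery, not any decision-making of the machine's own.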

More than its guidance mode, it is the use of a drone as an anti-personnel weapon (targeted assassination) that raises an enormous ethical problem. Credits: STM

In the case of "kamikaze drones", the situation is different still. By definition, the drone's value lies in its data link, which makes it a remote sensor for the human operator. But with the arrival of facial recognition algorithms, freely available on GitHub and compatible with certain commercial mini-drones, military drones such as the Kargu-2 were quickly turned into "anti-personnel missiles".

Technically, this type of machine has neither the computing power nor the algorithms needed to choose its target on its own, let alone to decide to attack it. Any attack, with or without a permanent connection, with or without facial recognition, remains the result of a direct human order. There is therefore no "reflection" or "independent decision-making" on the drone's part.

What could have happened in Libya?

In the Libyan case, according to some rumors, a Kargu-2 targeted an entrenched fighter, potentially via facial recognition. In any case, it is very unlikely that the drone was deployed in fully autonomous mode, the risk of friendly fire or collateral damage being too great.

On the other hand, it is possible that during the flight the operator assigned a target to the drone, either by "simple" visual identification or via facial recognition, two modes of attack that do not require a human in the loop. In the first case, it would not be a world first.

And if the second case seems far more worrying in experts' eyes, it is mainly because current automatic facial recognition technologies are much less reliable than the automatic tracking, via the data link, of a target previously identified by a human.

It is also possible that the case described in the UN report was not an autonomous attack at all. For now, the reported facts lack precision, and could perfectly well correspond to a conventional drone attack, with a reckless, or particularly tenacious, operator at the other end of the data link.

The proliferation of armed drones: a real international problem

Should we then dismiss the threat highlighted by the UN report? Certainly not! However, it would be quite counterproductive to focus solely on banning "killer robots", when the problem is much broader, and the very definition of a "killer robot" is still far from settled.

Regardless of the technologies carried on board drones, their low cost today allows a genuine proliferation that extends far beyond the armed forces alone. Credits: Shutterstock

Indeed, the real problem is not intrinsically the drones' level of autonomy. Far more destructive missiles, carrying far more advanced AI, have been capable of automatic target recognition for decades, without any panic over "killer missiles". And when some drones are equipped with facial recognition algorithms, it is more an imprecise "poor man's" tool than the high-tech equipment of missiles, which on the contrary strives to be as precise and reliable as possible.

For specialists, the real concern is not so much the technological sophistication of drones as the lack of perspective on, and maturity of, the on-board technologies, and above all the proliferation of these affordable new devices, both in armed forces and in terrorist groups. Recent conflicts have already shown that low-cost "kamikaze drones" can destroy surface-to-air missile batteries worth several million dollars, and that DIY quadcopters can destroy tanks. The idea that light drones with rudimentary facial recognition could fall into the hands of any insurgent group is then far more worrying than the development of high-tech robots worthy of the most frightening works of science fiction.

Robots will not become killers

In such a context, the threat is therefore not really the "killer robot", capable of understanding the tactical situation and choosing its target on its own, but rather the proliferation of drones, plain and simple. While science fiction abounds with examples of mad machines revolting or robots turning against their creators, the chances of seeing such phenomena materialize in the short or medium term are close to zero.

Even if a drone were to escape its operator's control, it would have no reason to disobey its programming. And even if that were to happen, or if it were hacked, it would be an isolated event that operators would not allow to happen again.

Still from the film Terminator. Flying "killer robots" that track and attack their targets fully autonomously are commonplace in science fiction. Despite technological advances, they should remain confined to works of fiction for some years to come.

After all, more than anyone, the military need absolute control over their equipment and must avoid any ambiguity on the battlefield. Arms manufacturers are also, on the whole, opposed to the emergence of truly autonomous weapons. In addition, drones and other terrestrial robots, like "smart missiles", have extremely limited energy and logistical autonomy. Even in the event of mass hacking, the resulting "machine revolt" would run out of steam within just a few hours.

In fact, in the near future, humanity as a whole does not have to worry about "killer robots" strictly speaking. On the other hand, we may collectively need to impose some control over this new type of munition, the "kamikaze drone". Whether its guidance is autonomous or not, its proliferation in armed forces multiplies the risk of it falling into the wrong hands, but also the risk of mishandling. Hacking also remains a concern, though not one specific to drones, at a time of rampant digitization of weapons systems.

See also: Discover the face of the robot that wants killer robots banned via the United Nations

Can we legislate?

The good news is that it is possible, internationally, to limit the proliferation of certain weapons. In the past, the convention on anti-personnel mines was ratified in 1997 by 131 countries, while the convention on cluster munitions entered into force in 2010, with 108 signatory countries.

The semantic question will be at the heart of any regulatory conventions. Defining what a "killer robot" is, or what "autonomous decision-making" consists of, is far more complex than imposing limits on certain technical features. Credits: USAF

In the future, one can imagine a moratorium being proposed to limit the level of autonomy of drones and robots. Their anti-personnel use, which relies on facial recognition algorithms whose flaws and limitations are well known, could also be restricted. In general, the idea of such a convention would be to prevent the spread of a new type of weapon capable of striking without discrimination.

Unfortunately, even if such legislation were put in place, it is a safe bet that the main producers would decide not to sign it. To date, the USA, China and Russia have never ratified the conventions on mines and cluster munitions. And everything suggests that they will likewise refuse to limit their use of "kamikaze drones".

Sources: New York Times, New Scientist, Areion, UN Doc, Le Monde
