The piece below is an article written by Diane Francis which originally appeared on her website on June 24, 2021. Check out the original here
Quietly, without fanfare, a grim milestone has been reached in what has been called the “Third Revolution in Warfare”. The first revolution was the invention of gunpowder and the second, nuclear weapons. The third is the specter of lethal autonomous weapons, or, in other words, killer robots capable of identifying physical targets or people and then destroying them without human involvement or supervision.
The Terminator: cyborg from the future
The world’s first “killer robot” has just been sighted, and it is airborne — a tiny drone made in Turkey with a difference. Of course, drones, or unmanned aerial vehicles, have been used by the military for years, but they are controlled by pilots in remote ground control stations called cockpits. Pilots identify, approve, and verify targets, then deploy their weapons. They are also required to heed the rules of combat, which require avoiding civilian casualties whenever possible.
But on May 8, 2021, the UN Security Council issued a report claiming that a swarm of small Turkish-made drones (called Kargu-2) may have “hunted down” and killed humans in Libya in 2020 without any human piloting or control whatsoever. The mainstream media has ignored this, but it marks a turning point in military history as well as in geopolitics. Large-scale versions with nuclear or biological warheads, with minds of their own, could wipe out cities.
Already these small-scale drones are upending warfare. For instance, they helped turn the tide in the civil war in Libya and were used to great effect by Turkey in its war against Russian-backed Syrians, as well as by Azerbaijan in pushing back the Russian-backed incursion by Armenia. The Turks are selling fleets of them and gearing up with partners like Ukraine for more production. (They sell bigger, traditional drones as well.) So far, customers are smallish nations, notably those beset or bullied by Russia’s or Iran’s large-scale militaries, ranging from Ukraine and Poland to Qatar, Azerbaijan, and Libya.
The Kargu-2 looks harmless, like a toy or something that Amazon or Walmart will eventually roll out to deliver packages. But it’s loaded with deadly software.
A showroom displaying Turkey’s Kargu-2 drones. Anadolu Agency
In Turkish, Kargu means “mountain observation tower” because these drones were initially designed as an airborne sentry or surveillance tool. Built by Turkey’s STM, they hover over warzones to spot targets and gaps in defense shields, helping to guide artillery and jet-fighter attacks. Like other drones, they are linked to satellites or command headquarters but, unlike the others, can “think” for themselves, thanks to facial recognition, computer vision, and navigational software. Kargu-2s can swarm to attack or aim themselves at their targets and then blow up, which is why their nickname is the “Kargu Kamikaze” military drone.
Versions are secretly under development by militaries around the world, but the United Nations Security Council report claims that the Kargu-2 is the world’s first operating lethal autonomous weapon to hunt humans. “Logistics convoys and retreating [Haftar-affiliated forces in Libya] were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or lethal autonomous weapons systems … and other loitering munitions. They were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability.”
“Fire, forget and find” is military jargon for a weapon that once fired can guide itself to its target, form a swarm, and cannot be stopped once deployed.
Brochure showing Kargu-2 swarm published by the maker, STM
This month, the engineering industry bible Popular Mechanics declared: “this is the first recorded case of using a self-hunting drone against people. Drone experts say this extremely dangerous development could be dangerous to people far beyond the traditional battlefield.”
For several years, the UN Secretary-General, 30 nations, and 4,500 robotics scientists and technologists, including the late Stephen Hawking and Elon Musk, have been warning that such weapons were coming and have tried at the UN to hammer out a pre-emptive ban and non-proliferation treaty. Behind the campaign has been the Future of Life Institute, which specializes in warning the world about looming man-made technological disasters. The admonition on its website minces no words:
“Unlike any weapon seen before, they could also allow for the selective targeting of a particular group based on parameters like age, gender, ethnicity, or political leaning (if such information was available). Because lethal autonomous weapons would greatly decrease personnel costs and could be easy to obtain at a low cost (in the case of small drones), small groups of people could potentially inflict disproportionate harm, making autonomous weapons a new class of weapon of mass destruction,” the Institute warned several years ago.
Nations in red are firmly against a ban on lethal autonomous weapons: US, UK, France, Israel, Turkey, Russia, China, and Australia. Future of Life Institute
The existence of affordable lethal autonomous weapons undermines the world’s two superpowers and the nuclear “club”. Worse, Turkey has demonstrated that there is a cost-effective way to repurpose the world’s known exponential technologies into weapons that can assassinate or destroy anything on land, on water, underwater, or in space, and that could be built in a garage by a mad genius. “The implications are game-changing,” said U.K. Defense Secretary Ben Wallace in a speech last year, citing Syria’s heavy losses due to actions by these drones.
The Kargu-2 can stay aloft for up to 24 hours, finding gaps in air-defense systems to assist warplanes and artillery. But if conditions permit, it can dive into a target and explode, or fire its own missiles from above. Fortunately, as Zachary Kallenborn, a US Army national security consultant, wrote, the drones “are not like the movie ‘Terminator’. They have nowhere near that level of sophistication, which might be decades away.”
Back in 2018, a senior Chinese official forecast that “in future battlegrounds, there will be no people fighting” and that the use of autonomous weapons is “inevitable”. Now we learn that they are likely already here, which means the world is less safe than before, barring an enforceable global ban or, alternatively, an evolutionary change in human nature for the better.
German protest against killer robots. Reuters
There’s little doubt that Turkey’s innovation was discussed behind closed doors by Joe Biden and Vladimir Putin, if for no other reason than that their hegemony of terror is threatened. What’s required now is a global movement to ban such weapons, or we risk the possibility that life will imitate “art”, which means, to quote The Terminator, “hasta la vista, baby”.