AUTONOMOUS robots have transformed modern warfare as nations race to develop technologies that can fire on targets without human input, alarming experts.

AI-controlled weapons could be more destructive than nukes and mistake civilians for combatants, experts warn.

[Image caption: Ukrainian soldiers taking part in a military exercise – AFP/Getty]
[Image caption: Russian President Vladimir Putin has publicly expressed interest in growing Russia’s AI sector – Getty]

An AI system powering a drone could scan a battlefield and select targets for destruction, like a gun that aims and fires itself.

Autonomous weapons could reduce the amount of risk human soldiers are exposed to – providing an obvious strategic benefit for a country at war.

But James Dawes, an expert on the weaponization of AI, offered a harsh assessment of autonomous weapons and their potential in a piece for The Conversation:

“When selecting a target, will autonomous weapons be able to distinguish between hostile soldiers and 12-year-olds playing with toy guns? Between civilians fleeing a conflict site and insurgents making a tactical retreat?”


Worse yet, a robot cannot be held accountable for mistakes in battle – the charge of responsibility has nowhere to go, Dawes argues.

Fortunately, killer robots do not appear to have been widely utilized yet, and there are no confirmed deaths caused by an autonomous weapon.

But a UN report last year said an autonomous killer drone was deployed on a battlefield in March 2020 during the Second Libyan Civil War, NPR reports.

The battle was between the UN-recognized Government of National Accord and the forces of army general Khalifa Haftar.

The UN report stated: “The lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition.”

It is unknown if the drones killed anyone.

An Obama-era policy position on autonomous weapons has such strict parameters that no killer robots have been submitted for review, the New Scientist reports.

But other nations have been less restrictive.

A US intelligence report found that Russia has more than 150 AI-powered military systems in different stages of development.

“The Russian military seeks to be a leader in weaponizing AI technology,” an intelligence officer told National Defense.

Russia boycotted a February 2022 conference on autonomous weapons regulation and will abstain from discussions continuing this month.

Experts say the likeliest trigger for an autonomous weapons arms race would be the use of killer robots against the Ukrainians.

“I can guarantee, if Russia deploys these weapons, some people in the US government will ask ‘do we now, or will we later, need comparable capabilities for effective deterrence?’,” diplomacy expert Gregory Allen told New Scientist.

Attempts to regulate autonomous weapons have hit walls put up by the nations developing them, even as human rights organizations plead for a ban.

Meanwhile, the Campaign to Stop Killer Robots conducted a survey and found that 61% of respondents from 26 countries oppose the use of lethal autonomous weapons.

[Image caption: A United States policy position on autonomous weapons set during the Obama administration is due for a planned 10-year review – Getty]