Autonomous weapons reduce moral agency and devalue human life

Using autonomous weapons systems (AWS) to target humans will erode moral agency and lead to a general devaluing of life, according to military technology experts.

Speaking at the Vienna Conference on Autonomous Weapons Systems on 30 April 2024 – a forum set up by the Austrian government to discuss the ongoing moral, ethical, legal and humanitarian challenges presented by artificial intelligence (AI)-powered weapons – experts talked about the impact of AWS on human dignity, and how algorithmically enabled killing will ultimately dehumanise both its targets and operators.

Specific concerns raised by experts throughout the conference included the potential for dehumanisation when people on the receiving end of lethal force are reduced to data points and numbers on a screen; the risk of discrimination during target selection due to biases in the programming or criteria used; as well as the emotional and psychological detachment of operators from the human consequences of their actions.

Speakers also touched on whether there can ever be meaningful human control over AWS, due to the combination of automation bias and how such weapons increase the velocity of warfare beyond human cognition.

The ethics of algorithmic killing

Highlighting his work on the ethics of autonomous weaponry with academic Elke Schwarz, Neil Renic, a researcher at the Centre for Military Studies in Copenhagen, said a major concern with AWS is how they could further intensify the broader systems of violence they are already embedded within.

“Autonomous weapons and the systematic killing they’ll enable and accelerate are likely to pressure human dignity in two different ways, firstly by incentivising a moral devaluation of the targeted,” he said, adding that the “extreme systematisation” of human beings under AWS will directly impose, or at least incentivise, the adoption of pre-fixed and overly broad targeting categories.

“This crude and total objectification of humans leads very easily to a loss of essential restraints, so the stripping away of basic rights and dignity from the targeted. And we can observe these effects by examining the history of systematic killing.”

For Fan Yang, an assistant professor of international law in the Law School of Xiamen University, the problem of bias in AWS manifests in terms of both the data used to train the systems and how humans interact with them.

Noting that bias in AWS will likely manifest in direct human casualties, Yang said this is orders of magnitude worse than, for example, being the victim of price discrimination in a retail algorithm.

“Technically, it’s impossible to eradicate bias from the design and development of AWS,” he said. “The bias would likely endure even if there is an element of human control in the final code because psychologically the human commanders and operators tend to over-trust whatever option or decision is recommended by an AWS.”
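Yang’s point that bias survives human oversight can be illustrated with a toy sketch. The groups, labels and rates below are entirely invented for illustration: a naive frequency-based model fitted to historically skewed labels simply reproduces the skew, and a human signing off on its recommendations does not remove it.

```python
from collections import Counter

# Toy example with invented data: past identifications whose labels
# over-represent group "B" as positive. Any model fitted to these labels
# inherits the skew -- the bias lives in the training data itself.
training_labels = (
    [("A", 0)] * 90 + [("A", 1)] * 10 +   # group A: 10% labelled positive
    [("B", 0)] * 60 + [("B", 1)] * 40     # group B: 40% labelled positive
)

def positive_rate(group: str) -> float:
    """Fraction of training examples in `group` carrying a positive label."""
    counts = Counter(training_labels)
    pos, neg = counts[(group, 1)], counts[(group, 0)]
    return pos / (pos + neg)

print(positive_rate("A"))  # 0.1
print(positive_rate("B"))  # 0.4, four times the rate, purely from the labels
```

A reviewer who only approves or rejects individual recommendations never sees this underlying disparity in aggregate, which is exactly the over-trust dynamic Yang describes.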

Yang added that any discriminatory targeting – whether the result of biases baked into the data or of human operators’ tendency to trust the outputs of machines – will likely exacerbate conflict by further marginalising certain groups or communities, which could ultimately escalate violence and undermine peaceful solutions.

An erosion of human agency

Renic added that the systemic and algorithmic nature of the violence inflicted by AWS also has the potential to erode the “moral agency” of the operators using the weapons.

“Within intensified systems of violence, humans are often disempowered, or disempower themselves, as moral agents,” he said. “They lose or surrender their capacity to self-reflect, to exercise meaningful moral judgement on the battlefield, and within systems of algorithmic violence, we are likely to see those involved cede more and more of their judgement to the authority of algorithms.”

Autonomous weapons aren’t going to bring an end to human involvement [but] they will rearrange and distort the human relationship with violence, potentially for the worse
Neil Renic, Centre for Military Studies

Renic further added that, through the “processes of routinisation” encouraged by computerised killing, AWS operators lose both the capacity and inclination to morally question such systems, leading to a different kind of dehumanisation.

Commenting on the detrimental effects of AWS on its operators, Amal El Fallah Seghrouchni, executive president of the International Centre of Artificial Intelligence of Morocco, said there is a dual problem of “virtuality” and “velocity”.

Highlighting the physical distance between a military AWS user and the operational theatre where the tech is deployed, she noted that the consequences of automated lethal decisions are not visible in the same way, and that the sheer speed at which these systems make decisions can leave operators with little awareness of what is happening on the ground.

On the question of whether targets should be autonomously designated by an AWS based on their characteristics, no speaker came out in favour.

Anja Kaspersen, director for global markets development and frontier technologies at the Institute of Electrical and Electronics Engineers (IEEE), for example, said that with AI and machine learning in general, systems will often have an acceptable error rate.

“You have 90% accuracy, that’s okay. But in an operational [combat] theatre, losing 10% means losing many, many, many lives,” she said. “Accepting targeting means that you accept this loss in human life – that is unacceptable.”
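Kaspersen’s arithmetic is easy to make concrete. A minimal sketch, using a purely hypothetical engagement count (only the 90% figure comes from her remarks):

```python
# Hypothetical figures: at a 90% per-engagement accuracy rate, scale
# turns a "small" error rate into a large absolute number of
# misidentifications, each a potential wrongful use of lethal force.
def expected_misidentifications(engagements: int, accuracy: float) -> int:
    """Expected number of wrongly identified targets, as a whole number."""
    return round(engagements * (1.0 - accuracy))

# 1,000 autonomous engagements at the 90% accuracy Kaspersen cites:
print(expected_misidentifications(1_000, 0.90))  # 100
```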

Renic added that while there may be some less problematic scenarios where an AWS can more freely select its targets – such as a maritime setting away from civilians, where the characteristics being identified are those of a uniformed sailor on the deck of a ship – there are innumerable scenarios where ill-defined or contestable characteristics can be computed to form the category of “targeted person” with horrendous results.

“Here, I think about just how much misery and unjust harm has been produced by the characteristic of ‘military-age male’,” he said. “I worry about that characteristic of military-age male, for example, being hard coded into an autonomous weapon. I think that’s the kind of moral challenge that should really discomfort us in these discussions.”
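Renic’s objection to hard coding a category like “military-age male” can be restated as a base-rate problem: where combatants are a small minority, a category that matches much of the male population has almost no precision as a proxy for “combatant”. A toy calculation with invented figures:

```python
# Toy base-rate calculation with invented numbers: in a district with
# 500 combatants and 20,000 civilian military-age males, the category
# "military-age male" matches mostly civilians.
def category_precision(combatants_matching: int, civilians_matching: int) -> float:
    """Fraction of people matched by the category who are actually combatants."""
    return combatants_matching / (combatants_matching + civilians_matching)

precision = category_precision(500, 20_000)
print(f"{precision:.1%}")  # 2.4% -- over 97% of matches would be civilians
```

However accurate the downstream sensor or model, a proxy category with 2.4% precision guarantees that almost every engagement it licenses is against a civilian.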

The consensus among Renic and other speakers was that the systematised approach to killing engendered by AWS, and the ease with which various actors will be able to deploy such systems, will ultimately lower the threshold for resorting to violence.

“Our issue here is not an erasure of humanity – autonomous weapons aren’t going to bring an end to human involvement,” said Renic. “What they will do, however, along with military AI more broadly, is rearrange and distort the human relationship with violence, potentially for the worse.”

In terms of regulating the technology, the consensus was that fully autonomous weapons should be completely prohibited, while every other type and aspect of AWS should be heavily regulated, including the target selection process, the scale of force deployed in a given instance, and the ability of humans to meaningfully intervene.




