Google drops pledge not to develop AI weapons

Google parent company Alphabet has dropped its pledge not to use artificial intelligence (AI) in weapons systems or surveillance tools, citing a need to support the national security of “democracies”.

Google CEO Sundar Pichai previously outlined how the company would “not pursue” AI applications that “cause or are likely to cause overall harm”, and specifically committed to not developing AI for use in “weapons or other technologies whose principal purpose or implementation is to cause or directly facilitate injury to people”.

He added that Google would also not pursue “technologies that gather or use information for surveillance violating internationally accepted norms”.

Google – whose company motto ‘Don’t be Evil’ was replaced in 2015 with ‘Do the right thing’ – defended the decision to remove these goals from its AI principles in a blogpost co-authored by Demis Hassabis, CEO of Google DeepMind, and James Manyika, the company’s senior vice-president for technology and society.

“There’s a global competition taking place for AI leadership within an increasingly complex geopolitical landscape. We believe democracies should lead in AI development, guided by core values like freedom, equality and respect for human rights,” they wrote on 4 February.

“And we believe that companies, governments and organisations sharing these values should work together to create AI that protects people, promotes global growth and supports national security.”

They added that Google’s AI principles will now focus on three core tenets: “bold innovation”, which aims to “assist, empower, and inspire people in almost every field of human endeavour” and address humanity’s biggest challenges; “responsible development and deployment”, which means pursuing AI responsibly throughout a system’s entire lifecycle; and “collaborative progress, together”, which is focused on empowering “others to harness AI positively”.

Commenting on Google’s policy change, Elke Schwarz – a professor of political theory at Queen Mary University London and author of Death machines: The ethics of violent technologies – said that while it is “not at all surprising” given the company has already been supplying militaries and governments with cloud services, she is still concerned about the shifting mood among big tech firms towards military AI, many of which are now arguing it is “unethical not to get stuck in” developing AI applications for this context.

“That Google now feels comfortable enough to make such a substantial public change without having to face a significant backlash or repercussions gives you a sense of where we are with ethical concerns about profiteering from violence (to put it somewhat crudely). It indicates a worrying acceptance of building out a war economy,” she told Computer Weekly, adding that Google’s policy change highlights a clear shift: the global tech industry is now also a global military industry.

“It suggests an encroaching militarisation of everything. It also signals that there is a significant market position in making AI for military purposes and that there is a significant share of financial gains up for grabs for which the current top companies compete. How useful this drive is toward AI for military purposes is still very much speculative.”

Experts on military AI have previously raised concerns about the ethical implications of algorithmically enabled killing, including the potential for dehumanisation when people on the receiving end of lethal force are reduced to data points and numbers on a screen; the risk of discrimination during target selection due to biases in the programming or criteria used; as well as the emotional and psychological detachment of operators from the human consequences of their actions.

There are also concerns over whether there can ever be meaningful human control over autonomous weapons systems (AWS), due to the combination of automation bias and how such weapons increase the velocity of warfare beyond human cognition.

Throughout 2024, a range of other AI developers – including OpenAI, Anthropic and Meta – walked back their own AI usage policies to allow US intelligence and defence agencies to use their AI systems, while still claiming they do not allow their AI to harm humans.

Computer Weekly contacted Google about the change – including how it intends to approach AI development responsibly in the context of national security, and whether it intends to place any limits on the kinds of applications its AI systems can be used in – but received no response.

‘Don’t be Evil’

The move by Google has attracted strong criticism, including from human rights organisations concerned about the use of AI for autonomous weapons or mass surveillance. Amnesty International, for example, has called the decision “shameful” and said it would set a “dangerous” precedent.

“AI-powered technologies could fuel surveillance and lethal killing systems at a vast scale, potentially leading to mass violations and infringing on the fundamental right to privacy,” said Matt Mahmoudi, a researcher and adviser on AI and human rights at Amnesty.

“Google’s decision to reverse its ban on AI weapons enables the company to sell products that power technologies including mass surveillance, drones developed for semi-automated signature strikes, and target generation software that is designed to speed up the decision to kill.

“Google must urgently reverse recent changes in AI principles and recommit to refraining from developing or selling systems that could enable serious human rights violations. It is also essential that state actors establish binding regulations governing the deployment of these technologies grounded in human rights principles. The facade of self-regulation perpetuated by tech companies must not distract us from the urgent need to create robust legislation that protects human rights.”

Human Rights Watch similarly highlighted the problematic nature of self-regulation through voluntary principles.

“That a global industry leader like Google can suddenly abandon self-proclaimed forbidden practices underscores why voluntary guidelines are not a substitute for regulation and enforceable law. Existing international human rights law and standards do apply in the use of AI, and regulation can be crucial in translating norms into practice,” it said, noting that while it is unclear to what extent Google was adhering to its previous principles, Google workers have at least been able to cite them when pushing back on irresponsible AI practices.

For example, in September 2022, Google workers and Palestinian activists called on the tech giant to end its involvement in the secretive Project Nimbus cloud computing contract, which involves the provision of AI and machine learning (ML) tools to the Israeli government.

They specifically accused the tech giant of “complicity in Israeli apartheid”, and said they feared how the technology would be used against Palestinians, citing Google’s own AI principles. A Google spokesperson told Computer Weekly at the time: “The project includes making Google Cloud Platform available to government agencies for everyday workloads such as finance, healthcare, transportation and education, but it is not directed to highly sensitive or classified workloads.”

Human Rights Watch added: “Google’s pivot from refusing to build AI for weapons to stating an intent to create AI that supports national security ventures is stark. Militaries are increasingly using AI in war, where their reliance on incomplete or faulty data and flawed calculations increases the risk of civilian harm. Such digital tools complicate accountability for battlefield decisions that may have life-or-death consequences.”

While the vast majority of countries are in favour of multilateral controls on AI-powered weapons systems, European foreign ministers and civil society representatives noted during an April 2024 conference in Vienna that a small number of powerful players – including the UK, US and Israel – are holding back progress by being part of the select few countries to oppose binding measures.

Timothy Musa Kabba, the minister of foreign affairs and international cooperation in Sierra Leone, said at the time that for multilateralism to work in the modern world, there is a pressing need to reform the UN Security Council, which is dominated by the interests of its five permanent members (China, France, Russia, the UK and the US).

“I think with the emergence of new realities, from climate change to autonomous weapons systems, we need to look at multilateralism once again,” he said, noting any new or reformed institutions will need to be inclusive, democratic and adaptable.


