
Denmark’s AI-powered welfare system fuels mass surveillance



Artificial intelligence (AI) tools used by the Danish welfare authority violate individual privacy, risk discrimination and breach the European Union’s (EU) AI Act regulations on social scoring systems, according to analysis from Amnesty International.

Udbetaling Danmark (UDK, or Payout Denmark) – established in 2012 to centralise the payment of various welfare benefits across five municipalities – uses AI-powered algorithms to flag individuals who are considered at the highest risk of committing social benefits fraud for further investigation. These were developed in partnership with ATP, Denmark’s largest pensions processing company, and various private multinational corporations.

The report details how UDK’s fraud control algorithms breach the human rights of social security benefits recipients, including their rights to privacy, equality and social security. It also concludes that the system creates a barrier to accessing social benefits for certain marginalised groups, including people with disabilities, low-income individuals and migrants.

“This mass surveillance has created a social benefits system that risks targeting, rather than supporting, the very people it was meant to protect,” said Hellen Mukiri-Smith, Amnesty International’s researcher on artificial intelligence and human rights.

“The way the Danish automated welfare system operates is eroding individual privacy and undermining human dignity. By deploying fraud control algorithms and traditional surveillance methods to identify social benefits fraud, the authorities are enabling and expanding digitised mass surveillance.”

Amnesty argues that UDK’s fraud detection system likely falls under the ban on “social scoring” systems in the EU’s AI Act, which came into force on 1 August 2024.

The act defines AI social scoring systems as those that “evaluate or classify” individuals or groups based on social behaviour or personal traits, causing “detrimental or unfavourable treatment” of those people.

Mukiri-Smith said: “The information that Amnesty International has collected and analysed suggests that the system used by the UDK and ATP functions as a social scoring system under the new EU Artificial Intelligence law – and should therefore be banned.”

UDK and ATP provided Amnesty with redacted documentation on the design of certain algorithmic systems, and allegedly rejected Amnesty’s requests for a collaborative audit, refusing to provide full access to the code and data used in their algorithms.

The Danish authority also rejected Amnesty’s assessment that its fraud detection system likely falls under the AI Act’s social scoring ban, but did not explain its reasoning.

In response, Amnesty has called on the European Commission to clarify in its AI Act guidance which practices constitute a social scoring system. The organisation has also requested that the Danish authorities stop using the system until it can be confirmed that it does not fall under this ban.

Mukiri-Smith added: “The Danish authorities must urgently implement a clear and legally binding ban on the use of data related to ‘foreign affiliation’ or proxy data in risk scoring for fraud control purposes. They must also ensure robust transparency and adequate oversight in the development and deployment of fraud control algorithms.”

Computer Weekly contacted UDK about the claims made by Amnesty International but received no response by the time of publication.

Violation of privacy

Alongside ATP, UDK uses a system of up to 60 algorithms to identify fraudulent social benefit applications and flag individuals for further investigation by Danish authorities.

To power these models, Danish authorities have enacted laws enabling the extensive collection and merging of personal data from public databases covering millions of Danish residents. This includes information on residency status, citizenship, and other data that can also serve as proxies for a person’s race, ethnicity or sexual orientation.

Mukiri-Smith added: “This expansive surveillance machine is used to document and build a panoramic view of a person’s life that is often disconnected from reality. It tracks and monitors where a social benefit claimant lives, works, their travel history, health records, and even their ties to foreign countries.”
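
The following Python sketch is a purely hypothetical illustration of the kind of data merging described above. UDK and ATP did not give Amnesty access to their code or data, so every registry name, field and identifier below is an assumption rather than a description of the real pipeline.

# Hypothetical illustration only: merging personal data from several public
# registries into a single claimant profile of the sort a risk model could
# consume. All registry names, fields and the join key are assumptions.
from dataclasses import dataclass

@dataclass
class ClaimantProfile:
    person_id: str
    residency_status: str
    citizenship: str
    travel_days_abroad: int
    foreign_ties: str  # the kind of proxy-like attribute Amnesty criticises

def merge_registries(civil: dict, residency: dict, travel: dict) -> list[ClaimantProfile]:
    """Join hypothetical registry extracts on a shared personal identifier."""
    profiles = []
    for pid, record in civil.items():
        profiles.append(ClaimantProfile(
            person_id=pid,
            residency_status=residency.get(pid, "unknown"),
            citizenship=record.get("citizenship", "unknown"),
            travel_days_abroad=travel.get(pid, 0),
            foreign_ties=record.get("foreign_ties", "none"),
        ))
    return profiles

Even in a toy version like this, attributes such as citizenship or ties to foreign countries end up sitting directly in the profile a model scores, which is the proxy problem the report highlights.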

Individuals interviewed by Amnesty described the psychological impact of being subjected to surveillance by fraud investigators and case workers. Describing the feeling of being investigated for benefits fraud, Stig Langvad of Dansk Handicap Foundation told Amnesty that it is like “sitting at the end of a gun”.

UDK stated that its collection and merging of personal data to detect social benefits fraud is “legally grounded”.

Exacerbation of structural marginalisation

The report also reveals that the benefits fraud control system developed by UDK and ATP is built on inherently discriminatory structures in Denmark’s legal and social systems, which categorise people and communities based on difference.

According to the report, Danish law already creates a “hostile environment for migrants and people who have been granted refugee status”, with residency requirements for those seeking to claim benefits that disproportionately affect people from non-Western countries, from which many refugees in Denmark originate, including Syria, Afghanistan and Lebanon.

The Really Single fraud control algorithm predicts a person’s family or relationship status to assess the risk of benefit fraud in pensions and childcare schemes. One of the parameters employed by the algorithm is “unusual” or “atypical” living patterns or family arrangements, but there is no clarity on what constitutes such situations, leaving room for dangerously arbitrary decision-making.

Mukiri-Smith added: “People in non-traditional living arrangements – such as those with disabilities who are married but live apart due to their disabilities; older people in relationships who live apart; or those living in a multi-generational household, a common arrangement in migrant communities – are all at risk of being targeted by the Really Single algorithm for further investigation into social benefits fraud.”
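
A minimal, hypothetical sketch of how an “atypical living arrangement” parameter could feed such a risk flag is shown below. The arrangement categories, weights and threshold are all assumptions, since the actual algorithm, and its definition of “atypical”, has not been disclosed.

# Hypothetical sketch of a relationship-status risk flag. The categories,
# weights and threshold are assumptions; per the report, the real algorithm
# never defines what counts as an "unusual" or "atypical" arrangement.
ATYPICAL_ARRANGEMENTS = {
    "married_living_apart",
    "partners_living_apart",
    "multi_generational_household",
}

def really_single_style_score(claimant: dict) -> float:
    """Return a crude risk score for a claim made as a single person."""
    score = 0.0
    if claimant.get("living_arrangement") in ATYPICAL_ARRANGEMENTS:
        score += 0.6  # arbitrary weight: the "unusual" pattern drives the score
    if claimant.get("frequent_address_changes"):
        score += 0.2
    return score

def flag_for_investigation(claimant: dict, threshold: float = 0.5) -> bool:
    """Flag the claimant once the score crosses an equally arbitrary threshold."""
    return really_single_style_score(claimant) >= threshold

Because the set of “atypical” arrangements is never defined, whether any of the households described above gets flagged depends entirely on how that set happens to be drawn, which is the arbitrariness Amnesty warns about.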

Gitte Nielsen, the chairperson of the social and labour market policy committee at Dansk Handicap Foundation, described the feeling of being constantly scrutinised and reassessed: “It is eating you up. A lot of our members … have depression because of this interrogation.”

UDK and ATP additionally use inputs related to “foreign affiliation” in their algorithmic models. For example, the Model Abroad algorithm identifies groups of beneficiaries deemed to have “medium and high-strength ties” to non-EEA countries and prioritises these groups for further investigation.
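
As a hypothetical illustration of what prioritising groups by “foreign affiliation” could look like, the sketch below orders claimant groups by an assumed tie-strength label. Only the existence of “medium and high-strength ties” categories comes from the report; the labels, ordering and data layout are assumptions.

# Hypothetical sketch of prioritising claimant groups by ties to non-EEA
# countries. Only the idea of "medium and high-strength ties" comes from the
# report; the labels, ordering and data layout here are assumptions.
TIE_PRIORITY = {"high": 0, "medium": 1, "low": 2, "none": 3}

def prioritise_for_investigation(groups: list[dict]) -> list[dict]:
    """Order groups so those with stronger non-EEA ties are investigated first."""
    return sorted(groups, key=lambda g: TIE_PRIORITY.get(g.get("non_eea_ties"), 3))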

Amnesty’s research found that algorithms such as these discriminate against people based on factors such as national origin and migration status.

In a response to Amnesty, UDK stated that the use of “citizenship” as a parameter in its algorithms does not constitute the processing of sensitive personal information.


