Algorithms that predict crime are watching, and judging us by the cards we’ve been dealt



Credit: Pavel Danilyuk from Pexels

Your money, postcode, friends and family can make all the difference to how the criminal justice system treats you.

The New South Wales police recently scrapped a widely condemned program known as the Suspect Target Management Plan. It used algorithmic risk scores to single out “targets,” some as young as 10 years old, for police surveillance.

But similar programs remain in place. For instance, Corrective Services NSW uses a statistical risk assessment tool to predict whether prisoners will reoffend.

“High risk” prisoners receive “high intensity interventions,” and may be denied parole. The risk scores are derived from facts such as “criminal friends,” family involvement in crime or drugs, financial problems, living in a “high crime neighborhood” and frequent changes of address.
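
To make the mechanics concrete, here is a minimal sketch of how a checklist-style risk score of this kind can work. The factor names, weights and thresholds below are invented for illustration; the actual items and weightings used by Corrective Services NSW are not described here.

# A hypothetical checklist-style risk score: each recorded "fact" about a
# person's circumstances adds a fixed number of points. Names and weights
# are invented for illustration only.
FACTOR_WEIGHTS = {
    "criminal_friends": 2,
    "family_crime_or_drugs": 2,
    "financial_problems": 1,
    "high_crime_neighborhood": 1,
    "frequent_address_changes": 1,
}

def risk_score(facts):
    # Sum the weights of every factor recorded as true for this person.
    return sum(weight for name, weight in FACTOR_WEIGHTS.items() if facts.get(name))

def risk_band(score):
    # Map the raw score onto bands that drive decisions (thresholds invented).
    if score >= 5:
        return "high"      # e.g. "high intensity interventions", parole may be denied
    if score >= 3:
        return "medium"
    return "low"

person = {
    "criminal_friends": True,
    "financial_problems": True,
    "frequent_address_changes": True,
}
score = risk_score(person)
print(score, risk_band(score))   # -> 4 medium

Note that every input in this sketch is a fact about the person’s circumstances and associations, not anything they have done; that is exactly the feature of such tools this article objects to.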

A predictive algorithm is a set of rules for computers (and sometimes people) to follow, based on patterns in data. Lots has been written about how algorithms can encode and reinforce discrimination, from biased search engines to health databases.

In my newly published book, I argue the use of tools that predict our behavior based on factors like poverty or family background should worry us, too. If we are punished at all, it should be only for what we have done wrong, not for the cards we have been dealt.

Algorithms are watching us

Algorithms generate risk scores used in criminal justice systems all over the world. In the United Kingdom, the OASys (Offender Assessment System) is used as part of the pre-sentence information given to judges; it shapes bail, parole and sentencing decisions. In the United States, a tool called COMPAS does something similar.

Risk scores are used beyond criminal justice, too, and they don’t always need computers to generate them. A short survey known as the Opioid Risk Tool helps doctors in Australia and across the world decide whether to prescribe pain relief for acute and chronic illness, by predicting whether patients will misuse their medications.

Predictive algorithms literally save lives: they are used to allocate donor organs, triage patients and inform other high-stakes decisions. But they can also create and sustain unjustified inequalities.

Imagine that we develop an algorithm—“CrimeBuster”—to help police patrol crime “hot spots.” We use data that links crime to areas populated by lower income families. Since we cannot measure “crime” directly, we instead look at rates of arrest.

Yet the fact that arrest rates are high in these areas may just tell us that police spend more time patrolling them. If there is no justification for this practice of intensive policing, rolling out CrimeBuster would give these prejudices the status of policy.
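
A toy simulation makes this feedback loop explicit. Everything below is invented for illustration: two suburbs with the same underlying rate of offending but different historical levels of police presence. It is a sketch of the reasoning above, not a model of any real policing system.

# Hypothetical data: identical offending, unequal patrols.
offending_rate = {"suburb_A": 10, "suburb_B": 10}   # true offences per week (unobserved)
patrol_hours   = {"suburb_A": 40, "suburb_B": 10}   # where police historically spend time

def observed_arrests(area):
    # Arrests rise with police presence, not just with offending:
    # more patrol hours means more of the same offending gets detected.
    detection_rate = min(1.0, patrol_hours[area] / 50)
    return offending_rate[area] * detection_rate

for week in range(5):
    # "CrimeBuster": flag the area with the most recorded arrests as the hot spot,
    # then shift extra patrols there for the following week.
    arrests = {area: observed_arrests(area) for area in patrol_hours}
    hot_spot = max(arrests, key=arrests.get)
    patrol_hours[hot_spot] += 10
    print(f"week {week}: arrests={arrests}, extra patrols sent to {hot_spot}")

Because arrests track police presence as much as offending, the suburb that starts with more patrols is flagged as the hot spot every week and attracts still more patrols, while the identical suburb next door barely registers.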

Algorithms are judging us

The trouble deepens when we use statistics to make predictions about intentional action—the things that we choose to do.

This might be a prediction about how someone will perform as an employee, or whether they will commit crimes or abuse drugs.

The factors that influence these predictions are rarely publicized. For the British sentencing algorithm OASys, they include whether someone has been the victim of domestic violence.

The American COMPAS system captures parental divorce and family criminal history. The Opioid Risk Tool asks whether the patient’s family has a history of substance abuse, and whether the patient (if female) has a history of “preadolescent sexual abuse.”

In each case, these facts make it more likely that someone will go to prison, miss out on medical treatment, and so on.

We all want to have the chance to make choices true to who we are, and meet our needs and goals. And we want to be afforded the same choices as other people, rather than be singled out as incapable of choosing well.

When we punish someone because of facts they can’t easily influence, we do just this: we treat that person as if they simply cannot help but make bad choices.

We can’t lock people up just in case

The problem isn’t the use of algorithms per se. In the 19th century, Italian physician Cesare Lombroso argued we could identify “the born criminal” from physical characteristics: a misshapen skull, wide jaw, long limbs or big ears.

Not long after, British criminologist Charles Goring ran with this idea and argued that certain “defective” mental characteristics made “the fate of imprisonment” inevitable.

Algorithms simply make it much easier to see what’s going on in the world of crime risk assessment.

But when we look, it turns out what’s going on is something pretty similar to the Lombroso-Goring vision: we treat people as if they are fated to do wrong, and lock them up (or keep them locked up) just in case.

Public bodies should be required to publish the facts that inform the predictions behind such decisions. Machine learning should only be used if and to the extent that these publication requirements can be met. This makes it easier to have meaningful conversations about where to draw the line.

In the context of criminal justice, that line is clear. We should only deal out harsher penalties for bad behavior, not for other physical, mental or social characteristics. There are existing frameworks that take this approach, and this is the line that Australian institutions should toe.

Once penalties for their crimes have been applied, prisoners should not be treated differently or locked up for longer because of their friends and family, their financial status or the way in which they’ve been treated at the hands of others.

Provided by
The Conversation


This article is republished from The Conversation under a Creative Commons license. Read the original article.





