New iMessage feature allows children to report nudity to Apple


Recommended Posts

  • Diamond Member

This is the hidden content, please

New iMessage feature allows children to report nudity to Apple | Apple

Apple is introducing a new feature to iMessage in Australia that will allow children to report nude images and videos being sent to them directly to the company, which could then report the messages to police.

The change comes as part of Thursday’s beta releases of the new versions of Apple’s operating systems for Australian users. It extends the communication safety measures that have been turned on by default since iOS 17 for Apple users under 13, and that are available to all users. Under the existing safety features, an iPhone automatically detects images and videos containing nudity that children might receive or attempt to send in iMessage, AirDrop, FaceTime and Photos. The detection happens on the device to protect privacy.

If a sensitive image is detected, the young user is shown two intervention screens before they can proceed, and is offered resources or a way to contact a parent or guardian.

The screen from which users can report content to Apple. Photograph: Apple/Apple Corps Ltd

With the new feature, when the warning comes up, users will also have the option to report the images and videos to Apple.

The device will prepare a report containing the images or videos, as well as messages sent immediately before and after the image or video. It will include the contact information from both accounts, and users can fill out a form describing what happened.

The report will be reviewed by Apple, which can take action on an account – such as disabling that user’s ability to send messages over iMessage – and also report the issue to law enforcement.
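Apple has not published the format of these reports. Purely as an illustration of the pieces the article describes, here is a minimal sketch in Python; every field and name is a hypothetical assumption, not Apple's actual schema or API:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the report described in the article. Apple has not
# published a schema, so every field name here is an assumption.
@dataclass
class AbuseReport:
    flagged_media: list        # the images/videos that triggered the warning
    context_messages: list     # messages sent immediately before and after
    reporter_contact: str      # contact information for the reporting account
    sender_contact: str        # contact information for the sending account
    user_description: str = "" # free-text form the user can fill out

    def is_complete(self) -> bool:
        # Per the article, a report bundles the flagged media plus the
        # contact information from both accounts; the description is optional.
        return bool(self.flagged_media and self.reporter_contact
                    and self.sender_contact)

# Example usage with placeholder values:
report = AbuseReport(
    flagged_media=[b"<image-bytes>"],
    context_messages=["message before", "message after"],
    reporter_contact="reporter@example.com",
    sender_contact="sender@example.com",
)
print(report.is_complete())  # True
```

This only mirrors the reporting steps the article lists; the real pipeline, data types and transport are not public.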

Apple said the plan was to roll out the feature initially in Australia in the latest beta update, and to release it globally in the future.

The timing of the announcement, and the choice of Australia as the first region to receive the feature, coincides with new regulatory codes coming into force there. By the end of 2024, tech companies will be required to police child abuse and terror content on cloud and messaging services operating in Australia.

Apple had warned that the draft of the code would not protect end-to-end encryption and would leave the communications of everyone who uses the services vulnerable to mass surveillance. The Australian eSafety commissioner ultimately watered down the code, allowing companies that believe compliance would break end-to-end encryption to demonstrate alternative actions they would take to tackle child abuse and terror content instead.

Apple has faced strong criticism from regulators and law enforcement across the globe over its reluctance to compromise end-to-end encryption in iMessage for law enforcement purposes. Apple abandoned plans to scan photos and videos stored in its iCloud product for child sexual abuse material (CSAM) in late 2022, eliciting further rebuke. Apple, WhatsApp and other advocates for encryption say that any backdoor to encryption endangers users’ privacy globally.


The UK’s National Society for the Prevention of Cruelty to Children (NSPCC) had accused Apple of vastly undercounting how often CSAM appears in its products, the Guardian revealed in July.

In 2023, Apple made just 267 reports of suspected CSAM on its platforms worldwide to the National Center for Missing & Exploited Children (NCMEC), vastly fewer than other tech giants in the industry: Google reported more than 1.47m and Meta more than 30.6m, according to the NCMEC’s annual report.

In the US, call or text the Childhelp abuse hotline on 800-422-4453 for more resources and to report child abuse, or DM for help. In the UK, the NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331. In Australia, children, young adults, parents and teachers can contact the Kids Helpline on 1800 55 1800, or Bravehearts on 1800 272 831, and adult survivors can contact the Blue Knot Foundation on 1300 657 380. Other sources of help can be found at Child Helplines International.



