Apple’s plan to scan iPhone photos for child abuse material is over

Apple’s proposed CSAM detection feature

While Apple’s controversial plan to track down child sexual abuse material using iPhone scans has been abandoned, the company has other plans in mind to stop it at the source.

Apple announced two initiatives in late 2021 aimed at protecting children from abuse. The first, which is already in effect today, warns minors before they send or receive nude images. It relies on algorithmic, on-device detection of nudity and warns only the child; parents are not notified.
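To make that mechanism concrete, here is a minimal Swift sketch of the decision the paragraph above describes. The type, function, score input, and threshold are illustrative assumptions rather than Apple’s Communication Safety API; the point the sketch captures is that detection and the warning both stay on the device and no one else is notified.

```swift
// Hypothetical sketch of the on-device warning flow described above; none of
// these names come from Apple's SDK, and the nudity score is assumed to be
// produced by an on-device classifier.
enum IncomingImageAction {
    case deliverNormally   // show the image as usual
    case blurAndWarnChild  // blur the image and warn the child; nobody else is told
}

func action(forNudityScore score: Double,
            userIsMinor: Bool,
            warningThreshold: Double = 0.8) -> IncomingImageAction {
    // Both the detection and the warning happen locally; no notification goes
    // to Apple or to the parents, matching the behavior described above.
    guard userIsMinor, score >= warningThreshold else {
        return .deliverNormally
    }
    return .blurAndWarnChild
}

// Example: a high-scoring image received by a minor is blurred with a warning.
print(action(forNudityScore: 0.93, userIsMinor: true))  // blurAndWarnChild
```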

The second, more controversial feature would have analyzed photos being uploaded to iCloud for known CSAM content. The analysis was to be performed locally, on the user’s iPhone, using a hashing system.
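As a rough illustration of what on-device matching against known fingerprints can look like, consider the Swift sketch below. It is a simplified stand-in, not Apple’s NeuralHash design: it uses a plain SHA-256 digest where the real proposal used a perceptual hash and cryptographic vouchers, but it shows the basic shape of fingerprinting photos locally and comparing them to a known set before upload, with a match threshold deciding whether anything is reported at all.

```swift
import Foundation
import CryptoKit

// Hypothetical sketch only: Apple's abandoned design used a perceptual hash
// (NeuralHash) plus cryptographic matching, not a plain SHA-256 lookup.

struct PhotoFingerprint: Hashable {
    let digest: String
}

// Reduce a photo to a fingerprint. SHA-256 is a stand-in here; it would break
// on any re-encoding, which is why a perceptual hash is needed in practice.
func fingerprint(of photoData: Data) -> PhotoFingerprint {
    let digest = SHA256.hash(data: photoData)
        .map { String(format: "%02x", $0) }
        .joined()
    return PhotoFingerprint(digest: digest)
}

// Compare every photo in an upload batch against a set of known fingerprints
// and report only whether the number of matches crosses a threshold, so a
// single false positive never triggers anything on its own.
func uploadCrossesThreshold(photos: [Data],
                            knownFingerprints: Set<PhotoFingerprint>,
                            threshold: Int) -> Bool {
    let matchCount = photos
        .map(fingerprint(of:))
        .filter { knownFingerprints.contains($0) }
        .count
    return matchCount >= threshold
}
```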

“After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first introduced in December 2021,” Apple said in a statement.

“We have also decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”

The statement came moments after Apple announced new features that will bring end-to-end encryption to more iCloud data, including iMessage content and photos. These enhanced protections would have made the server-side flagging system impossible, and that system was an essential part of Apple’s CSAM detection feature.

Taking a different approach

Amazon, Google, Microsoft, and others perform server-side scanning as a legal requirement, but end-to-end encryption will prevent Apple from doing so.

Instead, Apple hopes to tackle the problem at its source: creation and distribution. Rather than targeting those who store the material on cloud servers, Apple hopes to educate users and stop the content from being created and sent in the first place.

Apple provided additional details about this initiative to Wired. While there is no timeline for the features, the effort will start with expanding the algorithmic nudity detection behind Communication Safety to cover video. Apple then plans to extend that protection to its other communication tools, and later to give developers access as well.

“Potential child exploitation can be stopped before it happens by providing parents with opt-in tools to help protect their children from unsafe communications,” Apple also said in a statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat child sexual abuse material and protect children, while addressing the unique privacy needs of personal communications and data storage.”

Other on-device protections are already in place in Siri, Safari, and Spotlight to detect when users search for CSAM; those searches are redirected to resources that offer the person help.

Features that educate users while preserving privacy have been Apple’s goal for decades. All of the existing child safety features aim to inform rather than to report, and Apple never knows when one of them has been triggered.
