Apple holds off on plans to scan for child porn in iOS

On Monday 13 December 2021 Apple released the final version of iOS 15.2. The company closed a few security vulnerabilities with the update and brought some new features, including new protections for children that include a warning about explicit content in iMessage and protection for searches via Siri, Spotlight and Safari. These changes were announced by the company earlier in 2021.

The protections mean that in iOS 15.2 parents can activate warnings should their children receive or send explicit content.

The changes are described in detail on Apple’s child safety webpage, but a Reddit post notes that a section has disappeared from this page: the announcement of a CSAM review of iCloud photos.

Apple’s page on the upcoming child safety features initially went live on 5 August. The latest version dates from 13 December and has removed all reference to the CSAM review. The corresponding paragraph was previously the most comprehensive of the three security features described on the page. Apple had also referred to technical documentation, such as white papers and reviews of the technology by researchers. Those PDFs themselves remain online, but the links to them from the announcement page are now gone.

Apple had received a backlash for its announcement that it would check iCloud photos for images constituting Child Sexual Abuse Material. Many users were clearly more concerned about the potential invasion of their own privacy than about the feature’s value in protecting children, not least because not every Apple user is a parent. Apple’s Craig Federighi said in an August interview with the Wall Street Journal that the photo-scanning plans had been “widely misunderstood”.

The company then updated its child safety web page in September 2021 to say:

“Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”

Apple’s plans in this respect are now unclear. Are they still merely postponed, or have they been dropped altogether?

Hopefully Apple can find a way to identify this kind of criminal material that its users do not perceive as an invasion of their own privacy.

This article originally appeared on Macwelt. Translation by Karen Haslam.
