CSAM: Apple will not implement child pornography detection

Apple has finally stopped development of the CSAM detection feature it presented in summer 2021. The feature, which was supposed to recognize child pornography via special hashes when photos were uploaded to iCloud, was widely misunderstood when it was announced, then discussed controversially, and its rollout was later paused.

CSAM (child sexual abuse material) detection was supposed to compare hashes of photos uploaded to iCloud against hashes of known child sexual abuse material provided by the National Center for Missing & Exploited Children (NCMEC). According to the original plans, Apple would have notified law enforcement once a threshold of around 30 matches was exceeded.
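The basic idea of threshold-based hash matching can be illustrated with a minimal sketch. This is not Apple's actual design, which relied on a perceptual NeuralHash and a private set intersection protocol; the SHA-256 placeholder, the database contents, and the type names below are illustrative assumptions only.

```swift
import Foundation
import CryptoKit

// Minimal sketch of threshold-based hash matching, NOT Apple's actual
// NeuralHash / private set intersection protocol. The database contents,
// the hash function, and the helper names are assumptions for illustration.
struct HashMatchSketch {
    /// Hypothetical database of hashes of known material (hex strings).
    let knownHashes: Set<String>
    /// Threshold of roughly 30 matches mentioned in Apple's original plans.
    let threshold = 30

    /// Hash the raw image bytes. Apple's design used a perceptual hash;
    /// SHA-256 stands in here purely as a placeholder.
    func hash(of imageData: Data) -> String {
        SHA256.hash(data: imageData)
            .map { String(format: "%02x", $0) }
            .joined()
    }

    /// Count how many uploads match the database and report whether
    /// the threshold would have been reached.
    func thresholdReached(by uploads: [Data]) -> Bool {
        let matches = uploads.filter { knownHashes.contains(hash(of: $0)) }.count
        return matches >= threshold
    }
}
```

The point of the threshold in the original plans was that no single match would trigger anything; only an accumulation of matches would have led to a report.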

Although Apple only wanted to work with hashes and did not intend to search through all of a user's photos on the device or in the cloud, that is exactly how the feature was perceived in many places after its presentation in August 2021. Critics argued that Apple wanted to build a backdoor into its devices and automatically scan users' entire photo libraries. About a month later, Apple responded to the criticism by postponing the feature, which was originally planned for iOS, iPadOS, and macOS.

CSAM detection will not be implemented

As part of the announcement of expanded end-to-end encryption, which from 2023 will also cover iCloud Backup, iCloud Drive, Photos, Notes, Reminders, Safari bookmarks, Siri Shortcuts, Voice Memos, and passes in Apple Wallet, Apple has now finally abandoned CSAM detection. Software chief Craig Federighi confirmed the decision to the Wall Street Journal, and Apple also issued an official statement to Wired. According to Apple, children can be protected without combing through their data, for example by working with the relevant authorities. However, Apple is sticking with the Communication Safety feature.

After extensive consultation with experts to gather feedback on child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021. We have further decided not to move forward with our previously proposed CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.

Apple

iMessage can warn about nudity

Since December last year, the Messages app has given parents with family accounts the option to warn children and point them to helpful resources when they receive or attempt to send nude images. The Messages app uses on-device machine learning to analyze image attachments and determine whether a photo is likely to contain nudity. According to Apple, the feature is designed so that the company has no access to the photos in question. Parents can enable Communication Safety in their child's Screen Time settings.
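The on-device flow described here can be sketched roughly as follows. Everything in the sketch (the NudityClassifier protocol, the StubClassifier, the 0.8 confidence cutoff) is a hypothetical illustration of a local check; Apple's actual Messages implementation and its model are not public.

```swift
import Foundation

// Hypothetical abstraction over an on-device model that scores an image
// for nudity. A real implementation would wrap a local Core ML model;
// the image never leaves the device.
protocol NudityClassifier {
    /// Returns a confidence score in 0...1, computed entirely on the device.
    func nudityScore(for imageData: Data) -> Double
}

/// Placeholder conformance so the sketch compiles; not a real model.
struct StubClassifier: NudityClassifier {
    func nudityScore(for imageData: Data) -> Double { 0.0 }
}

struct AttachmentScreener {
    let classifier: NudityClassifier
    let confidenceCutoff = 0.8   // assumed value, for illustration only

    enum Decision {
        case showImmediately
        case blurAndWarn   // the child is warned and offered resources; the photo is not sent to Apple or parents
    }

    func decision(for imageData: Data) -> Decision {
        // Only the local classification result drives the decision.
        classifier.nudityScore(for: imageData) >= confidenceCutoff
            ? .blurAndWarn
            : .showImmediately
    }
}
```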