Rumor: 'Apple develops tool to recognize child abuse in users' photos'

In the Photos app, Apple uses the machine learning capabilities of the iPhone and iPad to recognize the subjects in photos. That makes it easy to search your library for certain photos, for example photos of animals or landscapes. Security expert Matthew Green says he has heard from multiple independent sources that Apple's next step is to introduce CSAM scanning. CSAM stands for Child Sexual Abuse Material; the scanning checks photos for possible child abuse imagery.
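To illustrate the kind of on-device recognition Photos already performs, here is a minimal Swift sketch using Apple's public Vision framework to label the subjects in an image. This is only an approximation of the idea; the file path is a placeholder, and it is not Apple's internal Photos implementation.

```swift
import Foundation
import Vision

// Minimal sketch: label the subjects in one image on-device,
// similar in spirit to the tagging that powers search in Photos.
let url = URL(fileURLWithPath: "/path/to/photo.jpg") // placeholder path
let handler = VNImageRequestHandler(url: url, options: [:])
let request = VNClassifyImageRequest()

do {
    try handler.perform([request])
    // Print only the labels the model is reasonably confident about,
    // e.g. "dog" or "landscape", which search can then match against.
    for observation in request.results ?? [] where observation.confidence > 0.5 {
        print(observation.identifier, observation.confidence)
    }
} catch {
    print("Classification failed: \(error)")
}
```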

‘Apple is going to scan your photos for child abuse’

Such a tool would give Apple the ability to scan your photos and check them against specific markers that may indicate child abuse imagery. Although the exact details are not yet clear, photos that match on several of those markers could be sent to Apple's servers for a manual check. Apple would reportedly apply this to photos stored in the cloud, which in Apple's case means iCloud Photo Library.
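Descriptions of systems like this usually talk about matching image fingerprints (hashes) against a database of known material, with a threshold before anything is escalated. Purely as a hedged sketch of that idea, and not Apple's actual method, the Swift snippet below uses a stand-in hash function and made-up names (perceptualHash, knownFingerprints, matchThreshold):

```swift
import Foundation
import CryptoKit

// Hedged sketch only; none of this reflects Apple's real system.
// A stand-in fingerprint: SHA-256 over the raw bytes. A real system
// would use a perceptual hash that survives resizing and recompression.
func perceptualHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData).map { String(format: "%02x", $0) }.joined()
}

// Hypothetical database of fingerprints of known abuse imagery.
let knownFingerprints: Set<String> = ["<fingerprint-1>", "<fingerprint-2>"]

// A single match is not enough; several matches are required
// before anything is escalated, reducing false positives.
let matchThreshold = 3

// Returns true when enough photos match known fingerprints that
// the account would be flagged for a manual check on the server.
func shouldFlagForReview(photos: [Data]) -> Bool {
    let matches = photos.filter { knownFingerprints.contains(perceptualHash($0)) }.count
    return matches >= matchThreshold
}
```

The threshold is the key detail in the rumor: only once several matches accumulate would photos be handed off for human review.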

Like many similar cloud photo storage services, iCloud Photo Library is not end-to-end encrypted. The photos are encrypted, but Apple also holds the key to unlock them. Apple does this at the request of governments, which can request data this way when there is cause to do so.
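The difference is easiest to see in code. In the hedged sketch below, the provider generates and retains the symmetric key, so it can decrypt a user's data itself, for example when compelled to; with end-to-end encryption, only the user would hold that key.

```swift
import Foundation
import CryptoKit

// Sketch of provider-held-key encryption (not end-to-end):
// the service generates and keeps the key itself.
let providerKey = SymmetricKey(size: .bits256)

do {
    let photo = Data("user photo bytes".utf8) // placeholder data
    let sealed = try AES.GCM.seal(photo, using: providerKey)

    // Because the provider still holds providerKey, it can always
    // open the box again, with or without the user's involvement.
    let recovered = try AES.GCM.open(sealed, using: providerKey)
    assert(recovered == photo)
} catch {
    print("Crypto error: \(error)")
}
```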

Implementing CSAM scanning raises privacy concerns, partly because such a system can make mistakes and flag innocent photos as child abuse imagery. A system like this could also be misused to analyze other kinds of photos, certainly if governments start relying on such tools and demand that other parties introduce them as well.

Apple resists backdoors
Apple has resisted security backdoors for years. Chat services such as iMessage and WhatsApp are end-to-end encrypted, so that governments, for example, cannot simply read messages. Governments would therefore prefer that companies such as Apple build tools into their security so that authorized parties can still gain access when there is cause to do so. The disadvantage is that such systems can also be abused: if they fall into the wrong hands, the privacy implications could be serious.

It is not clear how and when Apple will announce the use of CSAM scanning. According to the security expert, Apple could release it as soon as tomorrow. The question is whether Apple will also announce it publicly.

