Apple delays CSAM detection by a few months

Apple postpones CSAM detection after outcry

In a statement on its website, Apple says that CSAM detection will not be part of the major iOS 15 release. The announcement is not entirely unexpected: security experts and critics around the world, including whistleblower Edward Snowden, had expressed concern about the plan.

Apple says it is basing its decision on feedback from customers, advocates, researchers and others. The company does not say when CSAM detection will be released, but indicates that it will use the coming months to improve the feature. This is Apple's statement:

“Update on September 3, 2021: We announced plans last month to protect children from abusers who use communication tools to lure and exploit children and to limit the distribution of sexually explicit images of children. Following feedback from customers, advocates, researchers and others, we have decided to take more time in the coming months to collect feedback and make improvements before rolling out this very important child safety feature.”

You can view the statement in English on Apple's website.

Earlier controversy over CSAM

The CSAM feature caused plenty of controversy as soon as it was announced. Apple insisted it had no intention of taking the technology any further, but experts warned that the ability to detect child abuse in photos could be just the beginning. Apple already applied the technology in iCloud Mail and wanted to extend it to iCloud Photos.

In an interview, Craig Federighi acknowledged that Apple had not communicated the feature well. He once again explained how CSAM detection is supposed to work and why Apple considers it a safe system.

It is still completely unclear in which iOS version we can expect CSAM detection. Although we consider the chance very small, there is also the possibility that Apple will shelve the feature until further notice. With the company saying only that it will take “months”, Apple is hard to predict.