EU wants tech companies to scan for child abuse

Last year Apple was heavily criticized for its plan to implement CSAM detection, with which the company wanted to intercept material depicting child abuse. The EFF urged Apple to abandon the plan completely, and that appears to have happened: since December 2021 all references to CSAM have been removed from Apple's website. Apple initially said the feature would be postponed by “a few months”, but it now seems to have been called off altogether. A new EU proposal puts the subject back on the agenda: the EU wants tech companies to install a backdoor to scan for child pornography.

Privacy experts have already called the EU's proposal “unworkable”, saying it infringes too much on the lives of ordinary citizens. It concerns apps such as WhatsApp, Facebook Messenger and iMessage, in which it must become possible to selectively scan users' private messages for CSAM (child sexual abuse material). According to experts, the plan is similar to Apple's and even goes a few steps further.

Cryptography professor Matthew Green stated:

This document is the most terrifying thing I've ever seen. It describes the most advanced mass surveillance machine ever deployed outside of China and the USSR. I'm not exaggerating.

Jan Penfrat of European Digital Rights (EDRi) is also concerned and calls the proposal scandalous, saying it does not fit into any free democracy. Online service providers such as app stores, hosting companies and communication services would be required to scan selected users' messages for CSAM content when the EU requests it. Attention must also be paid to ‘grooming’ practices. Using image recognition and AI, the content of both images and texts must be scanned. The EU proposal also covers previously unknown CSAM content, whereas Apple only wanted to recognize known images, which reduces the chance of false accusations.
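The distinction between recognizing known images and detecting unknown ones can be sketched as follows. This is a minimal, illustrative toy: it uses a simple “average hash” where real systems use far more robust perceptual hashes such as PhotoDNA or Apple's NeuralHash, and all function names and the threshold value are invented for illustration.

```python
def average_hash(pixels):
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image's mean. `pixels` is a flat list of
    grayscale values (e.g. a downscaled thumbnail)."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_known(pixels, known_hashes, threshold=3):
    """Flag an image if its hash lies within `threshold` bits of any
    entry in a database of known hashes (here just a set of ints)."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Tiny demo with 4-pixel "images": a near-duplicate of a known image
# matches, while an unrelated image does not.
known = {average_hash([10, 200, 30, 220])}
print(matches_known([12, 198, 29, 221], known))  # True (near-duplicate)
print(matches_known([200, 10, 220, 30], known))  # False (different image)
```

The point of the sketch: matching against a fixed database of known hashes is a narrow lookup with a controllable false-positive rate, whereas detecting *unknown* material requires a classifier judging arbitrary content, which is where experts see a much higher risk of false accusations.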

EU Member States can order such an investigation. According to the European Commission, these orders target specific individuals, in order to keep the impact on privacy limited. However, the law does not say on what basis such orders can be issued, or whether they are aimed at individuals or at groups; it opens the door to general surveillance of what citizens are doing. Another problem is that the proposal undermines end-to-end encryption, as used by WhatsApp and Signal: it is virtually impossible to detect CSAM content while end-to-end encryption remains intact. Privacy experts also fear that the EU legislation could serve as a model for totalitarian regimes elsewhere in the world. Beyond that, according to some experts, the entire proposal is not technically feasible at all and was drafted by bureaucrats, not IT experts.

A comparison of the leaked proposal and the final proposal can be seen here. It shows that no substantive changes were made to the content.