Open letter: civil rights activists criticize Apple's surveillance plans

The protest against Apple's surveillance plans continues. More than 90 civil rights organizations demand in an open letter that the company refrain from deploying its system for detecting child sexual abuse material on users' devices. Their concern: the monitoring could easily be expanded.

The open letter (PDF) states: “Although this system is designed to protect children and curb the distribution of child sexual abuse material, we are concerned that it will be used to censor free speech, threaten the privacy and security of people around the world, and have catastrophic consequences for many children.” Signatories include the Center for Democracy & Technology (CDT), the ACLU, PEN America, the Electronic Frontier Foundation (EFF), Privacy International and other groups from Europe, Asia, Australia and America.

Apple plans to use two systems

Specifically, Apple plans to use two methods. To track down child sexual abuse material (CSAM), Apple wants to analyze the hash values of images that users are about to upload to iCloud. These values are compared against databases containing the hash values of known child sexual abuse content; such databases are operated by child protection organizations. If there is a match, the image is flagged and encrypted in such a way that Apple can still inspect and open it after it has been uploaded.
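To illustrate the matching step, here is a minimal Swift sketch. It is an assumption-laden simplification: Apple's published technique computes a perceptual hash ("NeuralHash") on the device and matches it against a blinded database, whereas this stand-in simply compares plain hash strings against a set.

```swift
// Hypothetical sketch of hash-based matching against a database of known
// CSAM hashes supplied by child protection organizations. Not Apple's
// actual implementation; hash values below are placeholders.
struct HashMatcher {
    // Hashes of known abuse imagery.
    let knownHashes: Set<String>

    // True if the hash of an upload candidate appears in the database.
    func isMatch(_ imageHash: String) -> Bool {
        knownHashes.contains(imageHash)
    }
}

// Usage: an image whose hash matches is flagged before upload.
let matcher = HashMatcher(knownHashes: ["a3f1-placeholder", "9bd0-placeholder"])
let flagged = matcher.isMatch("a3f1-placeholder")   // true -> image gets marked
```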

The system only raises an alarm once it detects around 30 matching images for a user. In that case, Apple employees examine the matter and decide how to proceed. For the time being, the technology is to be introduced in the USA only.
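The reporting threshold can be sketched in a few lines, again as an illustrative simplification rather than Apple's code; only the approximate value of 30 comes from the reports cited above.

```swift
// Per-account counter: individual matches alone do not trigger a report,
// only crossing the threshold hands the case to human reviewers.
struct AccountMatchCounter {
    let threshold = 30                 // approximate figure reported for Apple's system
    private(set) var matchCount = 0

    // Called whenever an uploaded image matches the hash database.
    mutating func recordMatch() {
        matchCount += 1
    }

    // Only past the threshold would Apple employees review the case.
    var requiresHumanReview: Bool {
        matchCount >= threshold
    }
}
```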

With the second method, Apple wants to offer family accounts the option of warning children when they send sexually explicit content. Apple's software is supposed to determine whether that is the case using machine learning. For children under 13, the parents are also to receive a notification (but not Apple); 13 to 17 year olds only receive the warning themselves.
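The age-dependent notification rule, as described above, can be expressed as a small Swift sketch. The function name and types are hypothetical; only the age cutoffs reflect what Apple has announced.

```swift
// Under 13: warn the child and additionally notify the parents (never Apple).
// 13 to 17: only the child sees the warning.
enum MessageAction {
    case warnChildOnly
    case warnChildAndNotifyParents
}

func action(forChildAge age: Int, flaggedAsExplicit: Bool) -> MessageAction? {
    guard flaggedAsExplicit else { return nil }   // unflagged content: nothing happens
    return age < 13 ? .warnChildAndNotifyParents : .warnChildOnly
}
```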

Concern about a gateway to further surveillance

Civil rights activists and privacy advocates have been criticizing the systems since Apple unveiled them in early August. The open letter now adds that algorithms meant to recognize sexually explicit content are “notoriously unreliable”: “These tend to erroneously flag art, health information, educational content, supportive messages and other images.” The system also assumes a healthy relationship between parents and children, which is not the case, for example, when an LGBTQ+ adolescent grows up in an environment that does not accept their sexual orientation.

Scanning hash values is just as problematic, the letter argues: governments around the world would put Apple under pressure to search for specific content they consider undesirable. This threatens to expand censorship and surveillance mechanisms, because even end-to-end encryption can be circumvented this way.

Apple continues to defend its plans. According to the company's FAQ, it has ensured that governments cannot smuggle images onto the hash list that do not depict child sexual abuse material. In addition, not all images on a device are analyzed, only those that users upload to iCloud. The encryption mechanisms would also remain intact, because the analysis is carried out on the users' devices.