In August 2021, Apple announced plans to scan users’ photos stored in iCloud for child sexual abuse material (CSAM). The system was designed to preserve privacy: it could flag potentially illegal and abusive content without revealing anything else. But the plan was controversial and soon drew fire from privacy and security researchers and digital rights groups, who warned that the surveillance capability could be abused to compromise the privacy and security of iCloud users around the world. In early September 2021, Apple said it would pause the rollout of the feature to “gather input and make changes before releasing these critically important child safety features.” In other words, a launch was still coming. Now the company says that, in response to the feedback and guidance it received, the CSAM detection tool for iCloud Photos is dead.
Instead, Apple told WIRED this week, it is focusing its anti-CSAM efforts and investments on the “Communication Safety” features the company announced in August 2021 and launched last December. Parents and guardians can opt into the protections through family iCloud accounts. The features work in Siri, Apple’s Spotlight search, and Safari Search to warn if someone is looking at or searching for child sexual abuse material, and they provide resources on the spot to report the content and seek help. At the core of the protections is Communication Safety for Messages, which caregivers can set up to provide warnings and resources to children if they receive or attempt to send photos that contain nudity. The goal is to stop child exploitation before it happens or becomes entrenched, and to reduce the creation of new CSAM.
“After extensive consultation with experts to gather feedback on the child protection initiatives we proposed last year, we are deepening our investment in the Communication Safety feature that we first made available in December 2021,” the company told WIRED in a statement. “We have also decided not to move forward with our CSAM detection tool for iCloud Photos. Children can be protected without companies combing through personal data, and we will continue working with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”
Apple’s CSAM update comes alongside its announcement today that the company is vastly expanding its end-to-end encryption offerings for iCloud, including adding protection for backups and photos stored in the cloud. Child safety experts and technologists working to combat CSAM have often opposed broader deployment of end-to-end encryption because it renders user data inaccessible to tech companies, making it more difficult for them to scan and flag CSAM. Law enforcement agencies around the world have similarly cited the dire problem of child sexual abuse in opposing the use and expansion of end-to-end encryption, though many of these agencies have historically been hostile toward the technology in general because it can make some investigations more challenging. Research has consistently shown, however, that end-to-end encryption is a vital safety tool for protecting human rights and that the downsides of its implementation do not outweigh the benefits.
Communication Safety for Messages is opt-in and analyzes the image attachments that users send and receive on the device itself to determine whether a photo contains nudity. The feature is designed so that Apple never gets access to the messages, the end-to-end encryption that Messages offers is never broken, and Apple never even learns that a device has detected nudity.
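Apple hasn’t published how this on-device check works under the hood, but the SensitiveContentAnalysis framework it later shipped to third-party developers (iOS 17 and macOS 14) exposes the same pattern and gives a sense of its shape. The sketch below is illustrative of that pattern, not Apple’s actual Communication Safety implementation; the function name and the fail-open error handling are assumptions.

```swift
import Foundation
import SensitiveContentAnalysis

// Illustrative sketch only: Apple has not published Communication Safety's
// internals. SCSensitivityAnalyzer performs the same kind of check: the image
// is classified entirely on device, and no verdict is ever sent to Apple.
@available(iOS 17.0, macOS 14.0, *)
func shouldBlurImage(at url: URL) async -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // The system only permits analysis when the user (or a parent, for a
    // child account) has enabled the relevant safety setting.
    guard analyzer.analysisPolicy != .disabled else { return false }

    do {
        // Runs a local classifier over the image; nothing leaves the device.
        let analysis = try await analyzer.analyzeImage(at: url)
        return analysis.isSensitive // true -> blur the photo, show a warning
    } catch {
        // Assumption: treat analysis failures as non-sensitive rather than
        // blocking the image outright.
        return false
    }
}
```

The key design point is that both the classification and the decision stay local. The only observable effect is in the interface, such as a blurred image and a warning, which is why this kind of check can coexist with end-to-end encryption.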
The company told WIRED that while it is not ready to announce a specific timeline for expanding its Communication Safety features, it is working on adding the ability to detect nudity in videos sent through Messages when the protection is enabled. The company also plans to expand the offering beyond Messages to its other communication applications. Ultimately, the goal is to make it possible for third-party developers to incorporate the Communication Safety tools into their own apps. The more the protections can proliferate, Apple says, the more likely it is that children can get the information and support they need before they are exploited.
“Potential child exploitation can be interrupted before it happens by providing opt-in tools for parents to help protect their children from unsafe communications,” the company said in its statement. “Apple is dedicated to developing innovative privacy-preserving solutions to combat child sexual abuse material and protect children, while addressing the unique privacy needs of personal communications and data storage.”
Similar to other companies that have grappled publicly with how to address CSAM, including Meta, Apple told WIRED that it also plans to continue working with child safety experts to make it as easy as possible for its users to report exploitative content and situations to advocacy organizations and law enforcement.
Countering CSAM is a complicated and nuanced endeavor with extremely high stakes for children around the world, and it is still unknown how much traction Apple’s bet on proactive intervention will get. But tech giants are walking a fine line as they work to balance CSAM detection and user privacy.