In a statement to Wired, Apple is quietly burying a measure announced in the middle of summer 2021 to fight child sexual abuse, one that did not go unnoticed. The manufacturer is abandoning its plan to detect child sexual abuse images contained in the photo libraries on its devices.
This CSAM detection (for Child Sexual Abuse Material) was to be carried out both locally on the device and then on Apple’s servers, but the manufacturer handled communication around the project so poorly that it quickly put it on hold, before now giving it up entirely.
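For context, here is a deliberately simplified sketch of what "local" detection can mean in practice: an on-device check of each photo's hash against a list of known-image hashes. The type names and the use of a plain SHA-256 digest are assumptions for illustration only; Apple's actual proposal relied on a perceptual hash (NeuralHash) combined with private set intersection and a threshold of "safety vouchers," none of which is reproduced here.

```swift
import Foundation
import CryptoKit

// Simplified illustration, not Apple's actual design. All names are hypothetical.
struct LocalPhotoScanner {
    /// Stand-in for the database of known-image hashes that would ship with the OS.
    let knownHashes: Set<String>

    /// Returns true when the photo's SHA-256 digest appears in the known list.
    /// (The real proposal used a perceptual hash robust to resizing and re-encoding.)
    func matches(photoData: Data) -> Bool {
        let digest = SHA256.hash(data: photoData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }
}
```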
One of Apple’s missteps in this story was charging ahead without consulting the specialists and researchers who work on technologies for detecting these images, which raised serious concerns about the security and privacy of user data.
“We decided not to go ahead with our CSAM detection tool that we had proposed for iCloud Photos. Children can be protected without companies combing through personal data,” explains the manufacturer. “We will continue to work with governments, child protection organizations and other businesses to help protect children, uphold their right to privacy and make the internet a safer place for all.”
One measure from the “package” announced in August 2021 has been implemented without controversy: the blurring of nude images in Messages for young users. This feature, which runs entirely on-device, has been available in France since iOS 16.
The abandonment of CSAM detection certainly goes hand in hand with the announcement of end-to-end encryption for photos backed up to iCloud.