Apple Inc. has confirmed that it will delay the deployment of its controversial CSAM (Child Sexual Abuse Material) scanning policy. The company indicated it will “take additional time over the coming months to collect input and make improvements”.
After largely negative feedback and significant backlash, Apple Inc. has paused its plan to scan the photo libraries of iPhones and iPads for images of child exploitation.
Apple Inc. planned to scan individual devices and iCloud Photos for CSAM:
Last month, Apple Inc. announced a rather controversial policy. Although well-intentioned, it involved scanning personal iPhone and iPad devices for material matching entries in a global database of child sexual abuse and exploitation imagery.
Breaking: Apple has delayed rolling out its CSAM detection technology in iOS 15, citing feedback from customers, advocacy groups, researchers and others — which has been largely negative since the technology was announced in August. https://t.co/q1woBhvHuM
— Zack Whittaker (@zackwhittaker) September 3, 2021
Besides locally scanning Apple-branded smartphones and tablets, Apple Inc. would also scan its remote cloud-hosted service, iCloud Photos. The feature was set to go live with iOS 15, iPadOS 15, and macOS Monterey later this year.
Apple Inc. has repeatedly clarified that it would not be looking at actual photos. Instead, the company would hunt for digital “fingerprints” to match against the CSAM database. Technically, it would check whether images carried CSAM image hashes provided by NCMEC (the National Center for Missing & Exploited Children).
Simply put, Apple would examine non-visual characteristics of the content, such as its hashes, to confirm whether the imagery matched the global CSAM database. Many security agencies and technology companies regularly deploy similar matching technology and rely on the well-established CSAM database.
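The matching described above can be sketched in a few lines of Python. This is a deliberately simplified illustration, not Apple's actual system: the function names and the seeded database are hypothetical, and Apple's implementation uses a perceptual hash (NeuralHash) plus cryptographic matching protocols rather than a plain SHA-256 lookup. The key idea survives the simplification: only fingerprints are compared, and the photo's pixels are never inspected.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital 'fingerprint'.

    Real CSAM scanners use perceptual hashes so that visually similar
    images still match; SHA-256 stands in here for simplicity.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical database seeded with one known fingerprint, purely for
# illustration; the real database is maintained by NCMEC.
known_fingerprints = {fingerprint(b"example-flagged-image-data")}

def matches_database(image_bytes: bytes) -> bool:
    """Check an image's fingerprint against the known set.

    Only the hash is compared; the image content itself is never viewed.
    """
    return fingerprint(image_bytes) in known_fingerprints

print(matches_database(b"example-flagged-image-data"))  # True: fingerprint matches
print(matches_database(b"my-vacation-photo"))           # False: no match
```

In the simplified sketch only an exact byte-for-byte copy matches, which is precisely why production systems prefer perceptual hashing over cryptographic hashing.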
If Apple Inc. discovered evidence strongly suggesting CSAM, it would involve a human reviewer. This person would visually confirm the imagery and, if necessary, pass the information to law enforcement.
Although clearly well-intentioned, the controversial policy drew unified opposition from several privacy advocacy groups. Apple Inc. has now announced that it will gather more input and conduct more research before deploying it.
Apple Inc. delays but does not cancel its plan to scan individual devices:
Needless to say, the backlash to the controversial policy was huge. The Electronic Frontier Foundation reportedly amassed more than 25,000 signatures from consumers, and more than 100 groups called on Apple Inc. to abandon the technology.
Apple delays CSAM tools, statement from company: pic.twitter.com/CD5CaotV53
— Mark Gurman (@markgurman) September 3, 2021
Presumably taking the criticism into consideration, Apple Inc. issued a statement:
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Apple decided not to roll out client-side scanning, for now. Activism works, guys. Great work everyone who helped to get our concerns heard. https://t.co/Gw7rrHV1CH
— banteg (@bantg) September 3, 2021
Incidentally, alongside CSAM scanning, Apple Inc. will reportedly also delay the communication safety features in Messages and the expanded guidance for Siri and Search.