This Could Be the End of Apple’s CSAM Photo Detection, a Hotly Debated Feature

Photo: Nicholas Kamm (Getty Images) – On the company’s Child Safety page, all mention of the function has been removed.

Apple has quietly removed all traces of its contentious tool to scan iPhones for child sexual abuse material (CSAM) from the Child Safety section of its website, months after the feature was announced.

Apple announced the set of CSAM-detection features in August; the removal of their mention from the website was first spotted by MacRumors. Of the proposed features, the on-device CSAM detection function stood out, drawing objections from security experts, policy organizations, and ordinary Apple users, who argued it could undermine customer privacy.

The CSAM detection tool relied on an algorithm called NeuralHash, which generated unique hashes of users’ photos and compared them against a database of hashes of known CSAM images maintained by the National Center for Missing and Exploited Children (NCMEC). If a user’s iPhone was flagged for holding such photographs, the matches would be escalated to human reviewers, who could then notify law enforcement.
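The matching flow described above can be sketched roughly as follows. This is a minimal illustration only: NeuralHash itself is a proprietary perceptual hash produced by a neural network (designed to survive resizing and re-encoding), so a plain cryptographic hash stands in for it here just to keep the sketch self-contained, and the threshold value reflects Apple’s public statement that an account needed on the order of 30 matches before human review. The hash database and photo bytes are made up for illustration.

```python
import hashlib

def image_hash(data: bytes) -> str:
    # Stand-in for NeuralHash: an exact SHA-256 of the raw bytes.
    # The real algorithm is a perceptual hash, not a cryptographic one.
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of hashes of known images
# (in the real system, supplied by NCMEC).
KNOWN_DB = {image_hash(b"known-image-1"), image_hash(b"known-image-2")}

# Apple said roughly 30 matches were required before human review.
MATCH_THRESHOLD = 30

def library_flagged(photos: list[bytes]) -> bool:
    """Return True if enough photos match the database to trigger review."""
    matches = sum(1 for p in photos if image_hash(p) in KNOWN_DB)
    return matches >= MATCH_THRESHOLD
```

In the sketch, a single innocuous match never triggers review; only crossing the threshold would escalate an account to human reviewers.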

Criticism of Apple granting itself the capacity to scan users’ private data was fierce, with some arguing that the move could set dangerous surveillance precedents and lead to misidentifications of CSAM (would a picture of your kid in the bathtub land you on an FBI watchlist?).

Apple’s early efforts to calm suspicion about the planned feature included sending top executives to interviews with the Wall Street Journal, in which they described the plan as truly “an advancement of the state of the art in privacy.” But the uproar continued, and in September Apple said it was reversing course to fine-tune the feature before making it public.

At the time, Apple told Gizmodo that it had decided to take additional time to gather input and make improvements before releasing these critically important child safety features, citing feedback from customers, advocacy groups, researchers, and others.

iOS 15.2 does ship with some new Child Safety features, including improvements to Siri, Spotlight, and Safari that surface new safety warnings to help children avoid danger while browsing the web, but the CSAM image detection tool is absent. Given Apple’s silent removal of any mention of the feature from its website, we can safely conclude that it won’t be arriving on our phones for some time, if ever.