https://www.wired.com/story/apple-csam-scanning-heat-initiative-letter/

Apple's Decision to Kill Its CSAM Photo-Scanning Tool Sparks Fresh Controversy

Child safety group Heat Initiative plans to launch a campaign pressing Apple on child sexual abuse material scanning and user reporting. The company issued a rare, detailed response on Thursday.

Lily Hay Newman | Security | Aug 31, 2023 3:32 PM

[Photo: Illuminated Apple logo is seen in the distance on a building through foliage. Photograph: Leonardo Munoz/Getty Images]

In December, Apple said that it was killing an effort to design a privacy-preserving iCloud photo-scanning tool for detecting child sexual abuse material (CSAM) on the platform. Originally announced in August 2021, the project had been controversial since its inception. Apple had first paused it that September in response to concerns from digital rights groups and researchers that such a tool would inevitably be abused and exploited to compromise the privacy and security of all iCloud users. This week, a new child safety group known as Heat Initiative told Apple that it is organizing a campaign to demand that the company "detect, report, and remove" child sexual abuse material from iCloud and offer more tools for users to report CSAM to the company.
Today, in a rare move, Apple responded to Heat Initiative, outlining its reasons for abandoning the development of its iCloud CSAM scanning feature and instead focusing on a set of on-device tools and resources for users known collectively as Communication Safety features.

The company's response to Heat Initiative, which Apple shared with WIRED this morning, offers a rare look not just at its rationale for pivoting to Communication Safety, but at its broader views on creating mechanisms to circumvent user privacy protections, such as encryption, to monitor data. This stance is relevant to the encryption debate more broadly, especially as countries like the United Kingdom weigh passing laws that would require tech companies to be able to access user data to comply with law enforcement requests.

"Child sexual abuse material is abhorrent and we are committed to breaking the chain of coercion and influence that makes children susceptible to it," Erik Neuenschwander, Apple's director of user privacy and child safety, wrote in the company's response to Heat Initiative. He added, though, that after collaborating with an array of privacy and security researchers, digital rights groups, and child safety advocates, the company concluded that it could not proceed with development of a CSAM-scanning mechanism, even one built specifically to preserve privacy.

"Scanning every user's privately stored iCloud data would create new threat vectors for data thieves to find and exploit," Neuenschwander wrote. "It would also inject the potential for a slippery slope of unintended consequences. Scanning for one type of content, for instance, opens the door for bulk surveillance and could create a desire to search other encrypted messaging systems across content types."

Heat Initiative is led by Sarah Gardner, former vice president of external affairs for the nonprofit Thorn, which works to use new technologies to combat child exploitation online and sex trafficking.
In 2021, Thorn lauded Apple's plan to develop an iCloud CSAM scanning feature. Gardner said in an email to CEO Tim Cook on Wednesday, August 30, which Apple also shared with WIRED, that Heat Initiative found Apple's decision to kill the feature "disappointing."

"Apple is one of the most successful companies in the world with an army of world-class engineers," Gardner wrote in a statement to WIRED. "It is their responsibility to design a safe, privacy-forward environment that allows for the detection of known child sexual abuse images and videos. For as long as people can still share and store a known image of a child being raped in iCloud we will demand that they do better."

In the email to Cook, Gardner wrote that Apple's photo-scanning tool "not only positioned Apple as a global leader in user privacy but also promised to eradicate millions of child sexual abuse images and videos from iCloud. ... Child sexual abuse is a difficult issue that no one wants to talk about, which is why it gets silenced and left behind. We are here to make sure that doesn't happen."

Apple maintains that, ultimately, even its own well-intentioned design could not be adequately safeguarded in practice, and that on-device nudity detection for features like Messages, FaceTime, AirDrop, and the Photo picker are safer alternatives.
Apple has also begun offering an application programming interface (API) for its Communication Safety features so third-party developers can incorporate them into their apps. Apple says that the communication platform Discord is integrating the features and that app makers broadly have been enthusiastic about adopting them.

"We decided to not proceed with the proposal for a hybrid client-server approach to CSAM detection for iCloud Photos from a few years ago," Neuenschwander wrote to Heat Initiative. "We concluded it was not practically possible to implement without ultimately imperiling the security and privacy of our users."

On Heat Initiative's request that Apple create a CSAM reporting mechanism for users, the company told WIRED that its focus is on connecting its vulnerable or victimized users directly with local resources and law enforcement in their region that can assist them, rather than positioning itself as an intermediary for processing reports. The company says that offering this intermediary service may make sense for interactive platforms like social networks.

The need to protect children from online sexual abuse is urgent, though, and as these concerns intersect with the broader encryption debate, Apple's refusal to implement data scanning will continue to be tested.

Read the full exchange between Heat Initiative and Apple below. WIRED has redacted sensitive personal information for the privacy of senders and recipients:

Updated 9:40 am ET, September 1, 2023: Added statement from Heat Initiative's Sarah Gardner.
Lily Hay Newman is a senior writer at WIRED focused on information security, digital privacy, and hacking. She previously worked as a technology reporter at Slate magazine and was the staff writer for Future Tense, a publication and project of Slate, the New America Foundation, and Arizona State University.