The EU Wants Big Tech to Scan Your Private Chats for Child Abuse


All your WhatsApp photos, iMessage texts, and Snapchat videos could be scanned to check for child sexual abuse images and videos under newly proposed European rules. The plans, experts warn, may undermine the end-to-end encryption that protects billions of messages sent every day and hamper people’s online privacy.

The European Commission today revealed long-awaited proposals aimed at tackling the huge volumes of child sexual abuse material, also known as CSAM, uploaded to the web each year. The proposed law creates a new EU Centre to deal with child abuse content and introduces obligations for tech companies to “detect, report, block and remove” CSAM from their platforms. The law, announced by Europe’s commissioner for home affairs, Ylva Johansson, says tech companies have failed to voluntarily remove abuse content; it has been welcomed by child protection and safety groups.

Under the plans, tech companies—ranging from web hosting services to messaging platforms—can be ordered to “detect” both new and previously discovered CSAM, as well as potential instances of “grooming.” The detection could take place in chat messages, files uploaded to online services, or on websites that host abusive material. The plans echo an effort by Apple last year to scan photos on people’s iPhones for abusive content before it was uploaded to iCloud. Apple paused its efforts after a widespread backlash.

If passed, the European legislation would require tech companies to conduct risk assessments for their services, gauging the levels of CSAM on their platforms and their existing prevention measures. If necessary, regulators or courts may then issue “detection orders” requiring tech companies to start “installing and operating technologies” to detect CSAM. These detection orders would be issued for specific periods of time. The draft legislation doesn’t specify what technologies must be installed or how they will operate—these will be vetted by the new EU Centre—but says they should be used even when end-to-end encryption is in place.

The European proposal to scan people’s messages has been met with frustration from civil rights groups and security experts, who say it’s likely to undermine the end-to-end encryption that’s become the default on messaging apps such as iMessage, WhatsApp, and Signal. “Incredibly disappointing to see a proposed EU regulation on the internet fail to protect end-to-end encryption,” WhatsApp head Will Cathcart tweeted. “This proposal would force companies to scan every person’s messages and put EU citizens’ privacy and security at serious risk.” Any system that weakens end-to-end encryption could be abused or expanded to look for other types of content, researchers say.

“You either have E2EE or you don’t,” says Alan Woodward, a cybersecurity professor from the University of Surrey. End-to-end encryption protects people’s privacy and security by ensuring only the sender and receiver of messages can see their content. For example, Meta, the owner of WhatsApp, doesn’t have any way to read your messages or mine their contents for data. The EU’s draft regulation says solutions shouldn’t weaken encryption and says it includes safeguards to ensure this doesn’t happen; however, it doesn’t include specifics of how this would work.

“That being so there is only one logical solution: client-side scanning where the content is examined when it is decrypted on the user’s device for them to view/read,” Woodward says. Last year, Apple announced it would introduce client-side scanning—scanning done on people’s iPhones rather than Apple’s servers—to check photos for known CSAM being uploaded to iCloud. The move prompted warnings of potential surveillance from civil rights groups and figures including Edward Snowden, and led to Apple pausing its plans a month after initially announcing them. (Apple declined to comment for this story.)
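Conceptually, detecting known abuse material comes down to comparing a fingerprint of each file against a database of fingerprints of previously identified content. A minimal sketch in Python, using a plain cryptographic hash and an invented hash set purely for illustration (deployed systems such as PhotoDNA or Apple's NeuralHash use perceptual hashes instead, which also catch resized or re-encoded copies of an image):

```python
import hashlib

# Illustrative stand-in for a database of known-content fingerprints.
# Real systems use perceptual hashes; plain SHA-256, used here for
# simplicity, only matches byte-identical files.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_image(file_bytes: bytes) -> bool:
    """Return True if the file's fingerprint is in the known-hash set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

In a client-side scanning design, a check like this would run on the user's device after decryption but before display or upload, which is exactly why critics argue it sidesteps, rather than preserves, end-to-end encryption.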

For tech companies, detecting CSAM on their platforms and scanning some communications is not new. Companies operating in the United States are required to report any CSAM they find, or that users report to them, to the National Center for Missing and Exploited Children (NCMEC), a US-based nonprofit. More than 29 million reports, containing 39 million images and 44 million videos, were made to NCMEC last year alone. Under the new EU rules, the EU Centre will receive CSAM reports from tech companies.