“The information from Facebook was absolutely crucial to this case,” Cook said. Overall, the company made 90 referrals about Wilson to the US National Center for Missing and Exploited Children (NCMEC), a US non-profit organisation that helps find missing children and collects reports of online exploitation and abuse material. Under US law, technology companies have to report child sexual abuse material they find on their platforms to the NCMEC, although they are not obliged to proactively track down illegal content. The NCMEC then passes tips to law enforcement bodies around the world who investigate and build cases. In the case of Wilson, Facebook provided data about his behaviour, including the IP address of the phone he was using, and the content of his messages.
But the system that allowed Facebook to spot Wilson, and helped police build a case against him, is about to be torn down. Since early 2019 Facebook has been working to add end-to-end encryption to Instagram and Messenger. The move, which is likely to happen in 2022, has reignited the debate around how to balance the importance of individual privacy with protecting the most vulnerable people in society. When the rest of Facebook joins WhatsApp, which turned on end-to-end encryption by default in 2016, the daily communications of billions of people will, law enforcement officials argue, vanish. Investigators in the Wilson case say it’s unlikely he would have been caught if Facebook had already been using end-to-end encryption.
The scale of online child sexual abuse is huge. Year after year, child protection agencies report increases in the amount of abuse found online, and say things have got worse as more children have been at home during the pandemic. Last year, the NCMEC received 21.4 million reports of online child sexual abuse material. Across all of the companies that reported content, Facebook accounted for 20.3 million, or almost 95 per cent, of that total. Reports of child sexual abuse material have swelled in recent years, partly because the technology used to detect it has improved. And Facebook has been more aggressive at detecting and finding child sexual abuse material than many other tech firms, experts say. But the impact of turning on end-to-end encryption across Instagram and Messenger is still likely to be significant.
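Much of that detection relies on hash-matching: platforms fingerprint uploaded files and compare them against databases of digests of known abuse material. The sketch below shows the idea in its simplest, exact-match form; the names and the sample hash set are illustrative, and real systems use perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-compression, which plain SHA-256 does not.

```python
import hashlib

# Hypothetical stand-in for a database of known-material digests.
# Real deployments use perceptual hashes (e.g. PhotoDNA), not SHA-256,
# so that re-encoded copies of an image still match.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def matches_known_material(file_bytes: bytes) -> bool:
    """Return True if this file's digest appears in the known-hash set."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_HASHES
```

The crucial point for the encryption debate is that this check runs on the platform's servers, against content the platform can see; once messages are end-to-end encrypted, the server only relays ciphertext and has nothing to hash.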
The NCMEC estimates that “more than half” of the tips it receives will vanish when Facebook rolls out end-to-end encryption more widely. Rob Jones, threat leadership director at the UK’s National Crime Agency (NCA), said the move would “take away” the “crown jewels from the online child protection response”. And earlier this year Facebook executives admitted that the introduction of end-to-end encryption will make it harder for the company to find abusive and harmful content being shared on its platforms.
In response, politicians in Europe, the UK, India and the US have restarted the same arguments that defined the cryptowars of the 1990s. A few years ago they raised the spectre of terrorism to attack encryption; now the detection of child sexual abuse is being used to make their case. Demands have been made for technical “solutions” to encryption, and Facebook has been encouraged to abandon its planned rollout. Laws are being drafted that could rein in the use of encryption. Meanwhile, civil liberties groups and technologists say that any technical compromises made to end-to-end encryption will weaken the security it provides to billions of people. They fear that damaging encryption will open the door to blanket surveillance of entire nations and undermine the universal right to privacy.
At the heart of the debate is an alarming claim: that turning on end-to-end encryption on all messaging platforms and social networks by default would stop law enforcement from being able to catch people like Wilson. But would it? And what, if anything, can be done about it without breaking encryption? The issue is set to define the future of online communication but, after decades of debate, there remains no easy answer, no magic bullet. So what happens next?
END-TO-END ENCRYPTION HAS come a long way since Phil Zimmermann created Pretty Good Privacy, an early and once-dominant form of end-to-end encryption, in the early 1990s. Now the technology, which scrambles messages so only the sender and intended recipient can see them, is everywhere. It’s become mainstream, with every big technology company using it in some way. Signal’s open-source encryption protocol has become the de facto standard, making it possible for companies to use end-to-end encryption at scale. There are no central databases of people’s messages that can be hacked. And, best of all, when it’s turned on you don’t need to do anything. You’re protected even if you’ve never heard of cryptography before.
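The core trick is key agreement: the two endpoints derive a shared secret key without ever sending it over the network, so the server in the middle only shuttles material it cannot decrypt. A textbook Diffie-Hellman exchange with deliberately tiny toy numbers sketches the idea; real protocols such as Signal's use elliptic-curve exchange (X25519) plus a double ratchet, not this bare form.

```python
import secrets

# Textbook Diffie-Hellman with toy parameters (p=23, g=5), purely to
# illustrate the principle. Each side keeps a private exponent and
# publishes only g^x mod p; both then derive the same key, which the
# relaying server never sees.
p, g = 23, 5

alice_secret = secrets.randbelow(p - 2) + 1   # never leaves Alice's device
bob_secret = secrets.randbelow(p - 2) + 1     # never leaves Bob's device

alice_public = pow(g, alice_secret, p)        # sent via the server
bob_public = pow(g, bob_secret, p)            # sent via the server

# Each side combines its own secret with the other's public value.
shared_by_alice = pow(bob_public, alice_secret, p)
shared_by_bob = pow(alice_public, bob_secret, p)
assert shared_by_alice == shared_by_bob       # same key, never transmitted
```

This is why server-side scanning and end-to-end encryption are in tension: the platform holds only the public values and the ciphertext, neither of which reveals the messages.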