Apple fixes one problem, creates another
Take Apple, for example. The brouhaha surrounding its decision to develop a technology to scan user images for CSAM has apparently “surprised” the company.
To my cynical eyes, the fact that Apple announced the move in a note quietly published to its website at the end of the weekly news cycle speaks volumes. As I see it, every PR person on the planet knows that making announcements at the end of the week is a way to bury bad news.
This makes me think it wasn’t actually surprised. It just failed to manage the reaction – and is now in damage control as it appends additional explanations to the original announcement. The company’s senior vice president for software, Craig Federighi, has even been wheeled out to try to explain things better.
I am glad criticism of the move is now taking place inside the company. I think Apple’s motivation was to create a solution that enabled it to scan image libraries while defending user privacy, but I also see that it wound up building a technology framework that can easily be twisted to undermine privacy.
It wanted to protect privacy, but instead invented a system that could undermine it. That Apple now just wants us to trust it not to extend the system into other domains stretches credulity. Now that the system has been invented and the company has confirmed its existence, there’s no way back.
By accident or design, Apple has opened Pandora’s box. Trust is a currency, but at this level it must be backed up by regulation.
The ethics of a hacker
It’s the same for the NSO Group, which offers to invade almost anybody’s privacy for a very high price. While the company promises that if you have nothing to hide, you have nothing to fear, and says it only works with governments, you only have to look at how its hacks have recently been used to see the problem.
The lack of respect for human rights evidenced in how NSO’s tech has already been used highlights the challenge Apple now faces if it really wants to keep its promise not to extend its CSAM scanning system into other domains.
We need regulation
The problem is that now we know the system exists, there is no way to roll it back — and governments that want such systems in your devices know it’s possible. So the pressure is on.
That’s why a United Nations call for a moratorium on the sale of surveillance tech such as the NSO Group’s Pegasus seems well timed. “It is highly dangerous and irresponsible to allow the surveillance technology and trade sector to operate as a human rights-free zone,” the UN warns.
“International human rights law requires all States to adopt robust domestic legal safeguards to protect individuals from unlawful surveillance, invasion of their privacy or threats to their freedom of expression, assembly and association,” the agency said.
What’s required is an internationally agreed legal framework that regulates use of tech-based surveillance across the board, from the kind of surveillance-based advertising Apple has pushed so hard against to the egregious use of tech, such as Cambridge Analytica, the NSO Group, and the on-device snooping Apple just revealed.
Anyone using any device should have a reasonable expectation of how their use of that device is protected. And this should be an internationally agreed-upon set of standards, likely built around principles of freedom of speech and association.
Where’s Tim Cook?
It is upsetting, given his leadership on privacy, that Apple CEO Tim Cook has remained silent on this matter. It was only in 2019 that he wrote, “It’s time to stand up for the right to privacy – yours, mine, all of ours,” in Time magazine.
In 2018 he had said: “Rogue actors and even governments have taken advantage of user trust to deepen divisions, incite violence, and even undermine our shared sense of what is true and what is false.”
That last point is one to which Cook often returns. In Canada earlier this year, he warned of the need to protect freedom of expression, and recently discussed the need to give “users peace of mind by strengthening that control and the freedom to use their technology without worrying about who is looking over their shoulder.”
Just over a week ago, the slow but steady process toward agreeing on such rules seemed acceptable. Things have changed.
Apple is not a small entity. Macs, iPhones, and iPads have over a billion users. By deciding to enable on-device surveillance across its platforms, Apple has made it critical to put in place an international bill of digital rights.
In order to keep its promise to keep our privacy safe, Apple should — morally, I think — now put the full extent of its corporate might behind the development of such a set of rights. Nothing less will do.