The surveillance-as-a-service industry needs to be brought to heel

Here we go again: another example of government surveillance involving smartphones from Apple and Google has emerged, and it shows how sophisticated government-backed attacks can become and why there’s justification for keeping mobile platforms utterly locked down.

What has happened?

I don’t intend to focus too much on the news, but in brief it is as follows:

  • Google’s Threat Analysis Group has published information revealing the hack.
  • Italian surveillance firm RCS Labs created the attack.
  • The attack has been used in Italy and Kazakhstan, and possibly elsewhere.
  • Some generations of the attack are wielded with help from ISPs.
  • On iOS, attackers abused Apple’s enterprise certification tools that enable in-house app deployment.
  • Around nine different attacks were used.

The attack works like this: The target is sent a unique link that aims to trick them into downloading and installing a malicious app. In some cases, the spooks worked with an ISP to disable data connectivity to trick targets into downloading the app to recover that connection.

Apple has fixed the zero-day exploits used in these attacks. The company had previously warned that bad actors were abusing its systems that let businesses distribute apps in-house. The revelations tie in with recent news from Lookout Labs of enterprise-grade Android spyware called Hermit.

What’s at risk?

The problem here is that surveillance technologies such as these have been commercialized. It means capabilities that historically have only been available to governments are also being used by private contractors. And that represents a risk, as highly confidential tools may be revealed, exploited, reverse-engineered and abused.

As Google said: “Our findings underscore the extent to which commercial surveillance vendors have proliferated capabilities historically only used by governments with the technical expertise to develop and operationalize exploits. This makes the Internet less safe and threatens the trust on which users depend.”

Not only this, but these private surveillance companies are enabling dangerous hacking tools to proliferate while making these high-tech snooping facilities available to governments — some of which seem to enjoy spying on dissidents, journalists, political opponents, and human rights workers.

An even bigger danger: Google is already tracking at least 30 spyware makers, which suggests the commercial surveillance-as-a-service industry is robust. It also means it is now theoretically possible for even the least credible government to acquire tools for such purposes — and given that so many of the identified threats rely on exploits found by cybercriminals, it seems logical to think this income stream encourages malicious research.

What are the risks?

The problem is that these apparently close links between purveyors of privatized surveillance and cybercrime won't always work in one direction. Those exploits — at least some of which appear difficult enough to discover that only governments would have the resources to find them — will eventually leak.

And while Apple, Google, and everyone else remain committed to a cat-and-mouse game to prevent such criminality, closing exploits where they can, the risk is that any government-mandated back door or device security flaw will eventually slip into the commercial markets, from which it will reach the criminal ones.

Europe’s data protection regulator, the European Data Protection Supervisor, warned: “Revelations made about the Pegasus spyware raised very serious questions about the possible impact of modern spyware tools on fundamental rights, and particularly on the rights to privacy and data protection.”

That’s not to say there aren’t legitimate reasons for security research. Flaws exist in any system, and we need people to be motivated to identify them; security updates wouldn’t exist at all without the efforts of security researchers of various kinds. Apple pays up to six figures to researchers who identify vulnerabilities in its systems.

What happens next?

The EU’s data protection supervisor called for a ban on the use of NSO Group’s infamous Pegasus software earlier this year. In fact, the call went further, outright seeking a “ban on the development and deployment of spyware with the capability of Pegasus.”

NSO Group is now apparently up for sale.

The EU also said that if such exploits were ever used in exceptional situations, companies such as NSO should be made subject to regulatory oversight. As part of that, they would have to respect EU law, judicial review, and criminal procedural rights, and agree not to import illegal intelligence, not to abuse national security for political ends, and to support civil society.

In other words, these companies need bringing into line.

What you can do

Following revelations about NSO Group last year, Apple published the following best-practice recommendations to help mitigate such risks.

  • Update devices to the latest software, which includes the latest security fixes.
  • Protect devices with a passcode.
  • Use two-factor authentication and a strong password for Apple ID.
  • Install apps only from the App Store.
  • Use strong and unique passwords online.
  • Don’t click on links or attachments from unknown senders.

Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
