London is buying heaps of facial recognition tech

The UK’s biggest police force is set to significantly expand its facial recognition capabilities before the end of this year. New technology will enable London’s Metropolitan Police to process historic images from CCTV feeds, social media and other sources in a bid to track down suspects. But critics warn the technology has “eye-watering possibilities for abuse” and may entrench discriminatory policing.

In a little-publicised decision made at the end of August, the Mayor of London’s office approved a proposal allowing the Met to boost its surveillance technology. The proposal says that in the coming months the Met will start using Retrospective Facial Recognition (RFR) as part of a £3 million, four-year deal with Japanese tech firm NEC Corporation. The system examines images of people obtained by the police before comparing them against the force’s internal image database to try to find a match.

“Those deploying it can in effect turn back the clock to see who you are, where you’ve been, what you have done and with whom, over many months or even years,” says Ella Jakubowska, policy advisor at European Digital Rights, an advocacy group. Jakubowska says the technology can “suppress people’s free expression, assembly and ability to live without fear”.

The purchase of the system is one of the first times the Met’s use of RFR has been publicly acknowledged. Previous versions of its facial recognition web page on the Wayback Machine show that references to RFR were added at some stage between November 27, 2020, and February 22, 2021. The technology is currently used by six police forces in England and Wales, according to a report published in March. “The purchase of a modern, high-performing facial recognition search capability reflects an upgrade to capabilities long used by the Met as well as a number of other police forces,” a spokesperson for the Met says.

Critics argue that the use of RFR encroaches on people’s privacy, is unreliable and could exacerbate racial discrimination. “In the US, we have seen people being wrongly jailed thanks to RFR,” says Silkie Carlo, director of civil liberties group Big Brother Watch. “A wider public conversation and strict safeguards are vital before even contemplating an extreme technology like this, but the Mayor of London has continued to support expensive, pointless and rights-abusive police technologies.”

A spokesperson for the Mayor of London defended the use of the technology, saying it will shorten the time it takes to identify suspects and help reduce crime in the capital. “Whilst this is clearly an important policing tool, it’s equally important that the Met Police are proportionate and transparent in the way it is used to retain the trust of all Londoners,” the spokesperson says. The London Policing Ethics Panel, an independent scrutiny group set up by the Mayor’s office, has been tasked with reviewing and advising the Met on its use of RFR, although this review did not take place before the purchase of the technology was approved. The Ethics Panel did not respond to a request for comment.

Political support for the use of facial recognition remains contested in the UK, with MPs from Labour, the Liberal Democrats and the Green Party all calling for regulations on the use of the technology. “I’m disappointed to see this latest development in the Met’s use of Retrospective Facial Recognition software,” says Sarah Olney, Liberal Democrat MP for Richmond Park. “It comes despite the widespread concerns as to its accuracy, along with its clear implications on human rights. Better policing ought to start from a foundation of community trust. It’s difficult to see how RFR achieves this.”

The expansion of the Met’s facial recognition technology, which also includes Live Facial Recognition (LFR) systems used in public places, comes at a time when the legality of such systems remains in question and serious concerns are being raised about their deployment. Lawmakers around the world are considering how to regulate facial recognition systems, and multiple cities have banned the use of the technology.

The UK’s data regulator, the Information Commissioner’s Office, has not published official guidance on the use of RFR. “Police forces wishing to use RFR technology must comply with data protection law before, during and after its use,” an ICO spokesperson says, adding that organisations must put in place robust policies and complete a Data Protection Impact Assessment (DPIA) prior to processing people’s data. “These are crucial steps to take so public trust is not lost,” the spokesperson says.