Two other UK police forces have tried LFR in recent months, raising concerns about how the technology will be used going forward and who ends up on the watchlists the systems search against. More than 30 police forces have not used LFR, although the government has said it is “very supportive” of more forces adopting it.
Around 380,000 people had their faces scanned by Northamptonshire Police over three days at the British Grand Prix in July, according to public records requests. No arrests were made. A Freedom of Information Act request from Big Brother Watch revealed the force had placed 790 people on the watchlist for the event, of whom only 234 were wanted for arrest. “It was largely individuals who were not wanted for any criminal reasons, which leads us to believe that these were protesters who were being put on watchlists,” Stone says. The previous year’s British Grand Prix was disrupted by protesters.
A statement from Northamptonshire Police says the watchlist “included a range of offences from organized crime to aggravated trespass, which had the potential of leading to the death or serious injury of the public, racing drivers and staff, especially if there was a repeat of the 2022 track incursion. We were not targeting those taking part in lawful, peaceful protests.”
Police forces are also being encouraged to ramp up the use of after-the-fact, retrospective face recognition. All police forces across the UK have the ability to run searches for faces against the Police National Database, which has more than 16 million photos and includes millions of images that should have been deleted years ago. In 2022, according to data from freedom of information requests, there were 85,158 face searches—up 330 percent on the previous year. Policing minister Chris Philp said in October that he wants the number of searches to double by May 2024.
Fussey, the University of Essex professor, says retrospective face recognition is often thought to be more “benign” than live face recognition, but he doesn’t believe that is the case. “The issues and harms and human rights implications are the same, whether it’s live or retrospective,” he says, adding that there is a lot of ambiguity around how the technology is being used.
In August 2020, the UK’s Court of Appeal ruled that South Wales Police’s use of LFR was unlawful. Since then, police forces using the technology say they have changed their procedures in response to the court decision, and the Home Office spokesperson says there is a “comprehensive legal framework in the UK” that requires police to use the technology only when it is “necessary, proportionate, and fair.”
Many disagree. A wide-ranging review from the Ada Lovelace Institute, a nonprofit, says there is “legal uncertainty” about the use of LFR. Another report, by University of Cambridge academics at the Minderoo Centre for Technology and Democracy, concluded that three examined police deployments of face recognition “failed to meet the minimum ethical and legal standards.”
“It’s wholly legitimate for police to use technology to keep the public safe,” says Fussey, who recently completed a report on proposals to change the oversight of biometrics in the UK. “The question is about how lawful and necessary it is,” he says. “At the moment, we’re in a situation where the legal basis isn’t clear. There’s no external oversight of how it’s used, how it’s authorized, who’s on the watchlist.”