Indian law enforcement is starting to place huge importance on facial recognition technology. Delhi police, looking into identifying people involved in civil unrest in northern India in the past few years, said that they would consider 80 percent accuracy and above as a “positive” match, according to documents obtained by the Internet Freedom Foundation through a public records request.
Facial recognition’s arrival in India’s capital region marks the expansion of Indian law enforcement officials using facial recognition data as evidence for potential prosecution, ringing alarm bells among privacy and civil liberties experts. There are also concerns about the 80 percent accuracy threshold, which critics say is arbitrary and far too low, given the potential consequences for those marked as a match. India’s lack of a comprehensive data protection law compounds these concerns.
The documents further state that even if a match is under 80 percent, it would be considered a “false positive” rather than a negative, which would make that individual “subject to due verification with other corroborative evidence.”
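The policy described in the documents can be sketched in a few lines. This is purely illustrative: the Delhi Police system's internals are not public, so the scores and function names here are hypothetical, modeled only on the reported rule that 80 percent and above counts as "positive" while anything below is a "false positive" still subject to further verification rather than a negative.

```python
# Hypothetical sketch of the reported triage policy, not the actual system.
THRESHOLD = 0.80  # the 80 percent cutoff reported in the documents

def triage(similarity_score: float) -> str:
    """Classify a face-match score under the policy described above."""
    if similarity_score >= THRESHOLD:
        return "positive match: empirical investigation before legal action"
    # Below the threshold is NOT treated as a non-match:
    return "false positive: subject to verification with corroborative evidence"

print(triage(0.86))  # treated as a positive match
print(triage(0.55))  # still investigated, not discarded
```

The notable design choice, and the one critics object to, is the second branch: a score below the threshold does not clear the individual but keeps them in the investigative pipeline.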
“This means that even though facial recognition is not giving them the result that they themselves have decided is the threshold, they will continue to investigate,” says Anushka Jain, associate policy counsel for surveillance and technology with the IFF, who filed for this information. “This could lead to harassment of the individual just because the technology is saying that they look similar to the person the police are looking for.” She added that this move by the Delhi Police could also result in harassment of people from communities that have been historically targeted by law enforcement officials.
In response to the IFF’s records request, police said they are using convict photographs and dossier photographs to run facial recognition. They added that these could be used as evidence but declined to share more details. They clarified, however, that in a case of a positive match, police officials would conduct further “empirical investigation” before taking any kind of legal action. Delhi Police did not respond to WIRED’s emailed requests for comment.
Divij Joshi, who has spent time researching the legality of facial recognition systems, says the threshold of an 80 percent match is virtually meaningless. Joshi explains that accuracy numbers are highly contingent upon the conditions for testing facial recognition technology models against particular benchmark data sets.
“Normal accuracy with facial recognition or machine learning systems is determined by comparing a model developed on training data and validation data with a benchmarking data set,” says Joshi, a doctoral candidate at University College London. “Once the training data is tweaked, it has to be benchmarked against a third-party data set or a slightly different data set.” This benchmarking, he says, is what is typically used to calculate the predictive accuracy percentage.
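The benchmarking step Joshi describes can be sketched as follows: a model trained and validated on one set of data is scored against a separate benchmark set, and the share of correct predictions becomes the quoted accuracy figure. The model, labels, and benchmark below are hypothetical stand-ins, not a real face-recognition system; the point is only that the resulting percentage depends entirely on which benchmark set is chosen.

```python
# Illustrative accuracy calculation against a held-out benchmark set.
def accuracy(predictions: list, ground_truth: list) -> float:
    """Fraction of benchmark items the model labeled correctly."""
    correct = sum(p == t for p, t in zip(predictions, ground_truth))
    return correct / len(ground_truth)

# Hypothetical benchmark: true identities vs. the model's predictions.
truth = ["alice", "bob", "carol", "dave", "erin"]
preds = ["alice", "bob", "carol", "dave", "frank"]  # one mistake

print(accuracy(preds, truth))  # 0.8 on this particular benchmark
```

A different benchmark set, say, one with harder lighting conditions or different demographics, could yield a very different number from the same model, which is why Joshi calls a bare "80 percent" figure virtually meaningless.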
Evidence of racial bias in facial recognition models has long made the technology’s use problematic. And while many variables affect the accuracy of facial recognition systems, widespread police use of a system with an overall 80 percent accuracy threshold appears to be highly unusual. A 2021 US National Institute of Standards and Technology study found that systems used to match a single scan of travelers’ faces to a database containing their photos had an accuracy rate of 99.5 percent or better. Other studies, however, have found error rates as high as 34.7 percent when such systems were used to identify women with darker complexions.