Can Face Recognition Technology Accurately Identify Delhi Rioters?

In 2018, Delhi Police told the Delhi High Court that its face recognition system had an accuracy rate of 2 percent.

Sushovan Sircar
India

Editor: Ashutosh Bhardwaj | Camera: Abhishek Ranjan

Union Home Minister Amit Shah revealed in the Lok Sabha on the evening of Wednesday, 11 March, that law enforcement agencies had deployed facial recognition software to identify over 1,100 individuals who allegedly took part in the communal violence in northeast Delhi on 24-25 February.

Shah added that, using the software, law enforcement found that over 300 of those individuals had come from neighbouring Uttar Pradesh, and that this revelation “makes it clear that this was deep conspiracy.”

In a first-of-its-kind admission, the home minister, speaking about the communally-charged violence that left at least 52 dead and over 500 injured, said, “We have used facial recognition software to initiate the process of identifying faces.”

While Shah did not specify the kind of facial recognition software used – whether biometrics were involved or which law enforcement agency deployed it – he stated that the software was fed images from voter ID cards and driving licence databases, among others.

“We have fed voter ID data into it, we have fed driving licence and all other government data into it. More than 1,100 people have already been identified through this software,” Shah said.

Decoding Shah’s Statement

1. LEGALITY

“We have used facial recognition software to initiate the process of identifying faces.”

A key concern with evolving technologies like facial recognition is the absence of an underlying legal framework to guide their use while ensuring adequate protection of the fundamental right to privacy.

Apar Gupta, executive director of the Internet Freedom Foundation, which has raised concerns over the dangers of facial recognition systems, said, “All of this is being done without any clear underlying legal authority and is in clear violation of the Right to Privacy judgment.”

2. TRANSPARENCY

“It only sees the face and through the face the person is caught. We have fed voter ID data into it, we have fed driving licence and all other government data into it.”

The question to ask is – what face recognition software are these databases being fed into? As of now, the software, its developer, its training process and its datasets are all unknown.

Shah did not specify which agency deployed the technology.

Moreover, the uptick in the use of facial recognition among law enforcement agencies in India comes at a time when governments in the West, in the US and the EU, have taken drastic steps to ban or limit the use of the technology.

In May 2019, San Francisco, the heart of the US tech industry and gateway to Silicon Valley, banned the use of facial recognition by law enforcement agencies owing to the potential for abuse and amid fears of the technology pushing the country towards an overt surveillance state.


3. BIAS

“Owaisi sahab also expressed concern that innocents should not be caught with this. Owaisi sahab, this is a software. It does not see faith; as a software, it does not see clothes.”
Amit Shah, Union Home Minister

What Shah is implying in his response to Owaisi’s concerns is that algorithms are agnostic or blind to the biases inherent in humans. However, this assumption has been thoroughly debunked by extensive research.

On the contrary, the FRT algorithms in a number of commercial products have shown racial, ethnic and gender biases.

At least one study carried out at the Massachusetts Institute of Technology has revealed that FRT from giants like IBM and Microsoft is less accurate when identifying women. In the US, many reports have discussed how such software is particularly poor at accurately recognising African-American women.

Even Amazon cannot get it right. Yes, Amazon! In a major embarrassment to the company, a test of its software, Rekognition, incorrectly identified 28 members of the US Congress as people who had been arrested for crimes.

4. ACCURACY

“More than 1,100 people have already been identified through this software. And I want to say that over 300 people from Uttar Pradesh came in to cause riots here. The facial data that we had ordered from UP makes it clear that this was deep conspiracy.”
Amit Shah, Union Home Minister

There is no information on the specific software deployed and hence no knowledge of the accuracy of the algorithm being used. The accuracy of FRTs has been one of the primary concerns in deploying the technology for security and law enforcement purposes.

Among the risks posed by low-accuracy software are false negatives and false positives in identifying individuals. The former means failing to identify people who were involved in the violence; the latter means incorrectly identifying innocent people.
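
The trade-off between the two errors can be made concrete with a toy example. The following Python sketch is purely illustrative – the similarity scores and thresholds are invented, and nothing here describes the software Delhi Police actually used:

```python
# Toy illustration of the false negative / false positive trade-off.
# All scores are invented; no real face recognition system is modelled.

# Each pair: (similarity score between two face images, whether the
# two images really show the same person).
comparisons = [
    (0.92, True), (0.71, True), (0.55, True),    # genuine pairs
    (0.80, False), (0.48, False), (0.33, False), # impostor pairs
]

for threshold in (0.5, 0.7, 0.9):
    false_negatives = sum(1 for score, same in comparisons
                          if same and score < threshold)
    false_positives = sum(1 for score, same in comparisons
                          if not same and score >= threshold)
    print(f"threshold {threshold}: {false_negatives} false negatives, "
          f"{false_positives} false positives")
```

Raising the threshold suppresses false positives but lets more genuine matches slip through; lowering it catches more faces but sweeps in innocent people. A single headline ‘accuracy’ figure reveals little unless both error rates are disclosed.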

This risk is exacerbated in situations that demand ‘1:many’ identification – picking individuals out of a large pool of unknown persons rather than verifying one face against one known record.

Smriti Parsheera, a fellow at the National Institute of Public Finance and Policy (NIPFP), authored a report on the adoption and regulation of facial recognition technologies in India, in which she describes the risks of ‘1:many’ identification.

“1:many identification, on the other hand, is more complicated and also more prone to errors, given the larger set of unknowns in the system,” the report states.

“For instance, trying to identify a person based on CCTV footage is complicated because of the conditions in which such images are captured as well as the uncertainty regarding whether that person is actually present in the database that is being used for matching,” it adds.
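
The arithmetic behind that warning is straightforward: every record in the database being searched is one more opportunity for a false match. The sketch below assumes, purely for illustration, a 0.1 percent false-match rate per comparison and independent errors; it is not a measured property of any deployed system:

```python
# Why 1:many search magnifies errors: each enrolled record is another
# chance for a false match. The 0.1% per-comparison rate is assumed
# purely for illustration, and errors are treated as independent.

false_match_rate = 0.001

for database_size in (1, 1_000, 100_000):
    p_at_least_one = 1 - (1 - false_match_rate) ** database_size
    print(f"database of {database_size:>7,} faces: "
          f"P(at least one false match) = {p_at_least_one:.3f}")
```

Against databases the size of a state’s voter rolls or driving licence records, even a per-comparison error rate that sounds tiny makes some false matches all but certain.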

