
Face Recognition Tender Extended Four Times; NCRB Won’t Say Why

Prospective bidders for the NCRB’s nationwide facial recognition project found the Rs 40 lakh deposit requirement too steep


Editor: Purnendu Pritam | Camera: Mukul Bhandari

In a series of curious developments, the Ministry of Home Affairs’ tender for a controversial nationwide facial recognition system has been postponed a fourth consecutive time.

The National Crime Records Bureau, which has “conceptualised the Automated Facial Recognition System (AFRS)” as an effort towards “modernising the police force, information gathering, criminal identification,” had issued the Request for Proposal on 4 July and was supposed to open the technical bids on 19 August.

However, there have now been a total of five “last dates” for the submission of bids, including the original 16 August deadline.

The last date has been pushed back roughly a month at a time, first to September, then to October and subsequently to 8 November. Most recently, it was pushed once again to 3 January 2020, with the opening of bids scheduled for 6 January.

Officially, the NCRB has stated that the extensions have been caused by “administrative reasons” but has remained silent on a number of questions about the bidding process, the concerns of prospective bidders and the legal framework within which the exercise is being carried out.


According to the 172-page Request for Proposal document, the stakeholders of this project are the NCRB, the Ministry of Home Affairs and all state police forces.

Despite repeated attempts by The Quint, the NCRB director and deputy director in-charge of the face recognition project have refused to offer comments on the reasons for the delay. A senior official had stated that all queries regarding the AFRS and the tender process would have to be directed to the Ministry of Home Affairs.

The Quint will update the story once it receives a response from the MHA.

Following is the list of all the scheduled last dates for the submission of bids and the corresponding dates for opening the bids:

  • Last date: 16 August; bid opening: 19 August
  • Last date: 13 September; bid opening: 16 September
  • Last date: 11 October; bid opening: 14 October
  • Last date: 8 November; bid opening: 11 November
  • Last date: 3 January 2020; bid opening: 6 January 2020

Issues Raised by Bidders

During a pre-bid conference organised at the NCRB headquarters on 25 July, many prospective bidders had requested the nodal agency to extend the last date for submissions.

The Quint was present at the two-hour long session chaired by Deputy Director A Mohan Krishna, where over 80 representatives of vendors were present.

Following are some of the key issues highlighted by prospective bidding companies.

Allow Consortiums: NCRB should allow consortiums comprising three companies to bid for the tender.

Deposit Amount Too High: A number of participants requested an exemption from the Earnest Money Deposit (EMD) of Rs 40 lakh required to qualify for the bidding process.

“Essentially, only the usual international big players will remain eligible and competent startups who fit the requirements of the tender will lose out,” an executive of a prospective bidder told The Quint.

Confusion Over Past Experience Criteria: While the qualification criteria specify that bidders must submit proof of “at least three AFRS installations with at least 10 lakh database across the world,” some felt this to be too restrictive. Two questions were raised in this regard.

What Is a Facial Recognition System?

Automatic facial recognition (AFR) is an advanced way of recognising people by using computers to scan their faces. According to a research paper, “It aims to identify people in images or videos using sophisticated pattern recognition techniques.”

The NCRB is seeking to implement a system in which a police officer can take a photograph of an individual, say at a protest or a crime scene, and match it against the photographs in its database to identify the person and pull up basic information about them.
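
At its core, this is a one-to-many (1:N) search: the “probe” photograph is converted into a numerical representation and compared against every image enrolled in the database. The sketch below is a minimal, illustrative Python example of that matching step, assuming face embeddings (fixed-length vectors) have already been extracted by a separate face recognition model; the function names, similarity threshold and toy data are assumptions for illustration only and are not drawn from the NCRB tender.

```python
# Minimal, illustrative sketch of 1:N face matching. Assumes face
# "embeddings" (fixed-length numeric vectors) have already been
# extracted from photographs by a separate face recognition model.
# All names, the threshold and the toy data below are hypothetical.
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def match_face(probe, gallery, threshold=0.6):
    """Compare a probe embedding against every enrolled embedding and
    return the best-scoring record if it clears the threshold."""
    best_id, best_score = None, -1.0
    for record_id, enrolled in gallery.items():
        score = cosine_similarity(probe, enrolled)
        if score > best_score:
            best_id, best_score = record_id, score
    if best_score >= threshold:
        return best_id, best_score
    return None, best_score  # no confident match found


# Toy "database" of two enrolled identities and one noisy probe image.
rng = np.random.default_rng(0)
gallery = {
    "record_001": rng.normal(size=128),
    "record_002": rng.normal(size=128),
}
probe = gallery["record_001"] + rng.normal(scale=0.1, size=128)
print(match_face(probe, gallery))  # expected: ("record_001", score close to 1.0)
```

In any real deployment, the quality of the embedding model and the choice of threshold determine the trade-off between false matches and missed matches, which is where the misidentification and bias concerns discussed below arise.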

Some intended objectives of the AFRS:

  • The AFRS will be a centralised web application hosted at the NCRB Data Centre in Delhi
  • The repository shall act as a foundation for a national level searchable platform of facial images
  • Capture face images from CCTV feed and generate alerts if a blacklist match is found
  • The system should have an option to upload bulk images of an individual
  • The solution should be compatible with other biometric solutions such as iris and fingerprint for the generation of comprehensive biometric authentication reports
  • The database should be able to store 1.5 crore images and accommodate 2,500 simultaneous users

Is The Project Backed By Law?

The Internet Freedom Foundation (IFF) mentions in a post on its site that it had sent a legal notice to the National Crime Records Bureau (NCRB) on 18 August 2019 “to halt and recall the invitation for bids for the implementation of a centralised Automated Facial Recognition System (AFRS).”

The NCRB responded to the legal notice on 5 November 2019. However, the IFF says it subsequently “received emails from NCRB officials stating that they wished to recall the email response sent to us”.

1. No Legal Basis

Among the biggest problems in launching such a massive project is the absence of a law to supervise and inform how a government agency can go about using and processing our images.

“There exists a complete lack of legality in undertaking a massive government program which infringes the right to privacy especially with conditions set out by the Apex Court of India,” IFF had stated in its legal notice.

NCRB RESPONSE: The NCRB appears to be locating the legality of the project in a 2009 cabinet note on the Crime and Criminal Tracking Network and Systems (CCTNS), which had envisaged the creation of an automated facial recognition system.

“A cabinet note is not a statutory enactment but a record of proceedings, and hence, AFRS continues to lack legality,” IFF’s rejoinder on NCRB’s response has pointed out.

2. Compatibility With Biometric Solutions

Point 21 under Section 2.2 (Functional Requirements of the AFRS System) states that the solution “should be compatible with other biometric solutions such as iris and fingerprints for generating comprehensive biometric authentication reports.”

NCRB RESPONSE: “There will not be any integration of AFRS with Aadhaar database whatsoever,” NCRB’s response to a legal notice sent by the Internet Freedom Foundation had stated.


3. Dangers of Misidentification

Even Amazon cannot get it right. Yes, Amazon! In a major embarrassment to the company, a test of its “Rekognition” software incorrectly identified 28 members of the US Congress as other people who had been arrested for crimes.

In May, San Francisco, at the heart of Silicon Valley’s technology revolution, became the first American city to ban the use of facial recognition technology by the police.

The ban came amid growing fears that the technology could be abused by the government and push the city towards overt surveillance.

NCRB RESPONSE: The response appears to suggest that the AFRS would not be affected by such issues because it will comply with NIST (National Institute of Standards and Technology) standards, and that “Procurement of AFRS will be done only after satisfactory test run and proposed solution meeting all stringent evaluation criteria.”

4. Dangers of Discriminatory Profiling & Bias Against Women

At least one study carried out at the Massachusetts Institute of Technology has revealed that facial recognition systems from giants like IBM and Microsoft are less accurate when identifying women. In the US, many reports have discussed how such software is particularly poor at accurately recognising African-American women.

NCRB RESPONSE: The response simply says “AFRS solution will not discriminate on the bases of religion, geography, class, and caste etc.”

Why? Because “Adequate precautions/ safeguards will also form part of SOP.”

(At The Quint, we question everything. Play an active role in shaping our journalism by becoming a member today.)
