Use of Facial Recognition Technology by Delhi Police

When was FRT first introduced in Delhi? What are the concerns in using this technology at scale?

The story so far: According to a Right to Information (RTI) response received by the Internet Freedom Foundation, a New Delhi-based digital rights organisation, the Delhi Police treats matches with more than 80% similarity generated by its facial recognition technology (FRT) system as positive results.

Why is Delhi Police using facial recognition technology?

The Delhi Police first obtained FRT for the purpose of tracing and identifying missing children. According to RTI responses received from the Delhi Police, the procurement was authorised as per a 2018 direction of the Delhi High Court in Sadan Haldar vs NCT of Delhi. However, in 2018 itself, the Delhi Police submitted to the Delhi High Court that the accuracy of the technology it had procured was only 2% and “not good”.

Things took a turn after multiple reports emerged that the Delhi Police was using FRT to surveil the anti-CAA protests in 2019. The direction in Sadan Haldar related specifically to finding missing children, yet the police was using FRT in wider police investigations. This broadening of FRT's purpose is a clear example of 'function creep', wherein a technology or system gradually widens in scope from its original purpose to encompass and fulfil wider functions. According to available information, the Delhi Police has since used FRT for investigative purposes, including during the 2020 northeast Delhi riots, the 2021 Red Fort violence, and the 2022 Jahangirpuri violence.

What is face recognition?

Facial recognition is an algorithm-based technology that creates a digital map of a person's face by identifying and mapping their facial features. This map is then matched against a database to which the system has access. FRT can be used for two purposes. The first is 1:1 verification of identity, wherein the face map is matched against the person's own photograph on a database to authenticate them. For example, 1:1 verification is used to unlock phones; increasingly, it is also being used to provide access to benefits and government schemes. The second is 1:n identification, wherein the face map is obtained from a photograph or video and then matched against the entire database to identify the person in the photograph or video. Law enforcement agencies such as the Delhi Police usually procure FRT for 1:n identification.

For 1:n identification, FRT generates a probability or match score between the suspect to be identified and the available database of identified criminals. A list of possible matches is generated on the basis of their likelihood of being the correct match, with corresponding match scores. Ultimately, however, it is a human analyst who selects the final probable match from the list generated by FRT. According to Project Panoptic, the Internet Freedom Foundation's initiative tracking the spread of FRT in India, there are at least 124 government-authorised FRT projects in the country.
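The 1:n process described above can be sketched in a few lines of code. This is a toy illustration, not any police system's actual implementation: real FRT derives face maps (embeddings) from a neural network, whereas the vectors, names, and similarity measure below are assumptions chosen for clarity.

```python
# Illustrative sketch of 1:n identification: a probe face embedding is
# compared against a gallery of known identities, and candidates are
# ranked by match score. The embeddings here are toy vectors; real
# systems compute them with a face-recognition neural network.
import math

def cosine_similarity(a, b):
    """Match score in [-1, 1]; higher means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(probe, gallery, top_k=3):
    """Return the top_k gallery identities ranked by match score.

    Note: the system only produces a ranked list of candidates;
    a human analyst selects the final probable match from it.
    """
    scored = [(name, cosine_similarity(probe, embedding))
              for name, embedding in gallery.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Hypothetical gallery of enrolled identities.
gallery = {
    "person_a": [0.9, 0.1, 0.3],
    "person_b": [0.2, 0.8, 0.5],
    "person_c": [0.4, 0.4, 0.7],
}
probe = [0.85, 0.15, 0.35]  # embedding of the face to be identified
candidates = identify(probe, gallery)
```

Here `candidates` ranks `person_a` first, since its toy vector is closest to the probe; the point of the sketch is that 1:n identification yields a ranked shortlist with scores, not a single definitive answer.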

Why is using FRT harmful?

India has seen the rapid deployment of FRT in recent years by both the Union and State governments, without any law in place to regulate its use. The use of FRT presents two sets of issues: those related to misidentification due to the inaccuracy of the technology, and those related to mass surveillance due to the misuse of the technology. Extensive research into the technology has revealed that its accuracy rates fall starkly based on race and gender. This can result in a false positive, where a person is misidentified as someone else, or a false negative, where a person is not verified as themselves. Cases of false positives can lead to bias against the individual who has been misidentified. In 2018, the American Civil Liberties Union revealed that Rekognition, Amazon's facial recognition technology, incorrectly identified 28 members of the U.S. Congress as people who had been arrested for a crime; a disproportionate number of the 28 were people of colour. Also in 2018, researchers Joy Buolamwini and Timnit Gebru found that facial recognition systems had higher error rates in identifying women and people of colour, with the error rate being highest for women of colour. The use of this technology by law enforcement agencies has already led to three documented wrongful arrests in the U.S.

A false negative, on the other hand, can exclude an individual from accessing essential schemes that use FRT as a means of providing access. One example of such exclusion is the failure of biometric-based authentication under Aadhaar, which has prevented many people from obtaining essential government services and has been linked to starvation deaths.

However, even if accurate, the technology can cause irreparable harm, as it can be used as a tool to facilitate state-sponsored mass surveillance. India has no FRT-specific regulation to guard against misuse. In such a legal vacuum, there are no safeguards to ensure that authorities use FRT only for the purposes for which they have been authorised, as is the case with the Delhi Police. FRT enables the continuous surveillance of individuals and can thereby violate the fundamental right to privacy.

What did Delhi Police’s 2022 RTI response reveal?

The RTI response, dated July 25, 2022, was shared by the Delhi Police after the Internet Freedom Foundation filed an appeal before the Central Information Commission, the information having been repeatedly denied by the Delhi Police. In its response, the Delhi Police stated that matches above 80% similarity are treated as positive results, while matches below 80% similarity are treated as false positive results which require additional “corroborative evidence”. It is unclear why 80% has been chosen as the threshold between positive and false positive. There is no justification to support the Delhi Police's assertion that a match above 80% is sufficient to assume the result is correct. Secondly, the categorisation of results below 80% as false positives rather than negatives indicates that the Delhi Police may investigate such results further. Thus, people who share facial features, such as members of an extended family or a community, could end up being targeted, including communities that have been historically over-policed.
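The classification rule disclosed in the RTI response can be made concrete with a short sketch. The function name and structure below are illustrative assumptions, not the Delhi Police's actual implementation; only the 80% threshold and the two labels come from the disclosure.

```python
# Sketch of the classification rule described in the RTI response:
# matches above 80% similarity are treated as positive, while those
# at or below the threshold are labelled "false positive" rather than
# "negative", and may still be pursued with corroborative evidence.
# The function and its wording are illustrative, not an official API.

THRESHOLD = 80.0  # similarity percentage, per the RTI response

def classify_match(similarity: float) -> str:
    if similarity > THRESHOLD:
        return "positive"
    # Crucially, a sub-threshold result is not discarded as a
    # "negative" -- it is kept as a lead needing more evidence.
    return "false positive (requires corroborative evidence)"
```

The design choice worth noticing is in the second branch: because sub-threshold matches are labelled "false positive" instead of "negative", they remain open leads, which is precisely why people who merely resemble a suspect could come under investigation.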

The response also mentions that the Delhi Police matches the photographs/videos against photographs collected under Sections 3 and 4 of the Identification of Prisoners Act, 1920. This Act has since been replaced by the Criminal Procedure (Identification) Act, 2022, which allows broader categories of data to be collected from a wider section of people, i.e., “convicts and other persons for the purposes of identification and investigation in criminal matters”. There are concerns that this law will lead to the excessive collection of personal data in violation of internationally recognised best practices for the collection and processing of data. These revelations raise multiple concerns, as the use of facial recognition can lead to wrongful arrests and mass surveillance, resulting in privacy violations. Delhi is not the only city where this is a problem: multiple cities, including Kolkata, Bengaluru, Hyderabad, Ahmedabad and Lucknow, are rolling out 'Safe City' programmes that deploy surveillance infrastructure to reduce gender-based violence, in the absence of any regulatory legal framework that could act as a safeguard.

Anushka Jain is an Associate Policy Counsel and Gyan Prakash Tripathi is a Policy Trainee at the Internet Freedom Foundation, New Delhi.

THE GIST

An RTI response received by the Internet Freedom Foundation reveals that the Delhi Police treats similarity matches above 80% generated by its facial recognition technology system as positive results. Facial recognition is an algorithm-based technology that creates a digital map of the face by identifying and mapping an individual's facial features. This map is then matched against a database to which the system has access.

The Delhi Police first obtained FRT for the purpose of tracing and identifying missing children, as per a direction of the Delhi High Court in Sadan Haldar vs NCT of Delhi.

Extensive research into FRT has revealed that its accuracy rates fall starkly based on race and gender. This can result in a false positive, where a person is misidentified as someone else, or a false negative, where a person is not verified as themselves. The technology can also be used as a tool to facilitate state-sponsored mass surveillance.
