
Decrypting Automated Facial Recognition Systems (AFRS) and Delineating Related Privacy Concerns

Posted by Arindrajit Basu, Siddharth Sonkar at Jan 02, 2020 02:00 PM
Arindrajit Basu and Siddharth Sonkar have co-written this blog as the first of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems?

The use of aggregated Big Data by governments has the potential to exacerbate power asymmetries and erode civil liberties like few technologies of the past. In order to guard against the aggressive aggregation and manipulation of the data generated by individuals who are branded as suspects, it is critical that our firmly established constitutional rights protect human dignity in the face of this potential erosion.

The increasing ubiquity of Automated Facial Recognition Systems (AFRS) serves as a prime example of the rising willingness of governments to push fundamental rights to the brink. With AFRS, the core fundamental right in question is privacy, although questions have also been raised regarding the potential violation of other related rights, such as the Right to Equality and the Right to Free Speech and Expression.

There is a rich corpus of literature (see here, here, and an excellent recent paper by Smriti Parsheera here) from a diverse coterie of scholars that calls out the challenges posed by AFRS, particularly with respect to its proportionality as a restriction on the right to privacy. Our contribution to this discourse focuses on a very specific question around a ‘reasonable expectation of privacy’, the standard identified for the protection of privacy in public spaces across jurisdictions, including in India. This is because, at this juncture, the precise nature of the AFRS that will eventually be used and the regulations to which it will be subject are not clear.

In Justice K.S. Puttaswamy (Retd.) v. Union of India (Puttaswamy I), the Indian Supreme Court was concerned with the question of whether there exists a fundamental right to privacy under the Indian Constitution. A nine-judge bench of the Court, with the plurality opinion authored by Justice Chandrachud, recognized that the right to privacy is a fundamental right implicit inter alia in the right to life under Article 21 of the Constitution.

The right to privacy protects people and not places. Every person is, however, entitled to a reasonable expectation of privacy. This expectation is twofold. First, the person must show that the alleged act could inflict some harm; such harm must be real, not speculative or imaginary. Second, society must recognize this expectation as reasonable. The test of reasonable expectations is contextual, i.e., the extent to which it safeguards privacy depends on the place at which the individual is.

In order to pass any constitutional test, therefore, AFRS must satisfy the ‘reasonable expectation’ test articulated in Puttaswamy. However, in this context, the test itself has multiple contours. Do we have a right to privacy in a public place? Is AFRS collecting any data that specifically violates a right to privacy? Is the aggregation of that data a potential violation?

After providing a brief introduction to the use cases of AFRS in India and across the world, we embark upon answering all these questions.

Primer on Automated Facial Recognition Systems (AFRS)

Facial recognition is a biometric technology that uses cameras to match stored or live footage of individuals (both stills and moving images) against images or video in an existing database. Some systems may also be used to analyze broader demographic trends or conduct sentiment analysis through crowd scanning.
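To make the matching step concrete, below is a minimal Python sketch of the 1:N comparison at the core of such systems: each face is reduced to a numerical embedding, and a probe embedding is scored against every enrolled embedding, with the best score above a threshold treated as a match. The embed_face stub and the 0.6 threshold are illustrative assumptions on our part, not a description of any deployed system.

```python
import numpy as np

def embed_face(image):
    """Hypothetical stand-in for a trained face-embedding model: a real
    AFRS would use a deep neural network to map a face image to a
    fixed-length vector."""
    raise NotImplementedError("placeholder for a trained embedding model")

def cosine_similarity(a, b):
    # Score in [-1, 1]; higher means the two faces look more alike
    # under the embedding model.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(probe_embedding, gallery, threshold=0.6):
    """1:N search: compare a probe embedding against every enrolled
    embedding and return the best-matching identity, or None if no
    score clears the (assumed) threshold."""
    best_id, best_score = None, threshold
    for identity, reference in gallery.items():
        score = cosine_similarity(probe_embedding, reference)
        if score > best_score:
            best_id, best_score = identity, score
    return best_id
```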

While photographs and video footage have long been core components of police investigation, the use of algorithms to process vast volumes of Big Data (characterized by ‘Volume, Velocity, and Variety’) and to compare disparate and discrete data points allows for the derivation of hitherto unfeasible insights about the subjects of that data.
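To see why it is the aggregation, rather than any single frame, that yields such insights, consider the following illustrative Python sketch (our own construction, with invented sample data, not any deployed system): individually innocuous sighting records, once linked by a matched identity, compose into a chronological movement profile.

```python
from collections import defaultdict
from datetime import datetime

# Each sighting: (identity matched by AFRS, camera location, timestamp).
# Sample records are hypothetical.
sightings = [
    ("person_42", "metro_station", datetime(2020, 1, 2, 8, 5)),
    ("person_42", "protest_square", datetime(2020, 1, 2, 9, 30)),
    ("person_42", "clinic_entrance", datetime(2020, 1, 2, 17, 45)),
]

def movement_profiles(records):
    """Group individually innocuous sightings by identity and sort them
    in time, yielding a per-person trail: the aggregated insight that no
    single camera frame reveals on its own."""
    profiles = defaultdict(list)
    for identity, location, ts in records:
        profiles[identity].append((ts, location))
    return {who: sorted(trail) for who, trail in profiles.items()}

print(movement_profiles(sightings))
```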

The utilisation of AFRS for law enforcement is rapidly spreading around the world. A Global AI Surveillance Index compiled by the Carnegie Endowment for International Peace found that at least sixty-four countries are incorporating facial recognition systems into their AI surveillance programs.

Chinese technology company Yitu has entered into a partnership with security forces in Malaysia to equip police officers with facial recognition body cameras, allowing images captured by the live body cameras to be compared with images from several central databases.

In England and Wales, London Metropolitan Police, South Wales Police, and Leicestershire Police are all in the process of developing technologies that allow for the identification and comparison of live images with those stored in a database.

The technology is being developed by the Japanese firm NEC, and the police forces have limited ability to oversee or modify the software, given its proprietary nature. The Deputy Chief of South Wales Police stated that “the tech is given to [them] as a sealed box… [and the police force themselves] have no input – whatever it does, it does what it does.”

In the US, Baltimore’s police set up facial recognition cameras to track and arrest protestors, a practice that reached its zenith during the 2015 riots in the city.

It is suspected that authorities in Hong Kong are also using AFRS to clamp down on the ongoing pro-democracy protests.

In India, the Ministry of Home Affairs, through the National Crime Records Bureau (NCRB), put out a tender for a new AFRS, whose stated objective is to “act as a foundation for national level searchable platform of facial images.” The AFRS will pull facial image data from CCTV feeds and compare it with existing records across databases, including the Crime and Criminal Tracking Networks and Systems (CCTNS), the Inter-operable Criminal Justice System (ICJS), Immigration Visa Foreigner Registration Tracking (IVFRT), and Passport, Prisons, and state police records.

Plans are also afoot to integrate this with the yet-to-be-deployed National Automated Fingerprint Identification System (NAFIS), thereby creating a multi-faceted surveillance system.

Despite raising eyebrows due to its potentially all-pervasive scope, this tender is not the first instance of AFRS being used by Indian authorities. Punjab Police, in partnership with the Gurugram-based start-up Staqu, has launched and commenced implementation of the Punjab Artificial Intelligence System (PAIS), which uses digitised criminal records and automated facial recognition to retrieve information on suspected criminals and, in effect, track their public whereabouts, raising potential constitutional questions.

This was published by AI Policy Exchange.