
Automated Facial Recognition Systems (AFRS): Responding to Related Privacy Concerns

Posted by Arindrajit Basu, Siddharth Sonkar at Jan 02, 2020 02:09 PM |
Arindrajit Basu and Siddharth Sonkar have co-written this blog as the second of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems?


The Supreme Court of India, in Puttaswamy I, recognized that the right to privacy is not surrendered merely because the individual is in a public place. Privacy is linked to the individual as it is an essential facet of human dignity. Justice Chelameswar further clarified that privacy is contextual. Even in a public setting, people trying to converse in whispers would signal a claim to the right to privacy. Speaking on a loudspeaker would naturally not signal the same claim.

The Supreme Court of Canada has also affirmed the notion of contextual privacy. On 7 March 2019, in a landmark decision, it defined privacy rights in public areas, implicitly applying Helen Nissenbaum’s theory of contextual integrity. Nissenbaum uses this theory to explain the extent to which the right to privacy is eroded in public spaces.

Nissenbaum suggests that labelling information as exclusively public or private fails to take into account the context which rationalises the individual’s desire to exercise her privacy in public. For instance, there exists a reasonable expectation of privacy in the restroom of a restaurant, even though the restaurant is a public space.

In R v Jarvis (Jarvis), the Court overruled a decision of the Court of Appeal for Ontario to hold that people can have a reasonable expectation of privacy even in public spaces. In this case, Jarvis was charged with the offence of voyeurism for secretly recording his students. The primary issue before the Supreme Court of Canada was whether the students filmed by Mr. Jarvis enjoyed a reasonable expectation of privacy at their school.

The Court in this case unanimously held that the students did indeed have a reasonable expectation of privacy. It identified nine contextual factors relevant in determining whether a reasonable expectation of privacy arises. The listed factors were:

“1. The location the person was in when he or she was observed or recorded,

2. The nature of the impugned conduct (whether it consisted of observation or recording),

3. Awareness of or consent to potential observation or recording,

4. The manner in which the observation or recording was done,

5. The subject matter or content of the observation or recording,

6. Any rules, regulations or policies that governed the observation or recording in question,

7. The relationship between the person who was observed or recorded and the person who did the observing or recording,

8. The purpose for which the observation or recording was done, and

9. The personal attributes of the person who was observed or recorded.” (paragraph 29 of the judgement).

The Court emphasized that these factors are not an exhaustive list but a guiding tool for determining whether a reasonable expectation of privacy exists in a given context. It is not necessary that each of these factors be present in a given situation to give rise to an expectation of privacy.

Compared to the above-mentioned factors in Jarvis, the Indian Supreme Court, per Justice Sikri, in Justice K.S. Puttaswamy (Retd.) v. Union of India (Puttaswamy II), the case which upheld the constitutionality of the Aadhaar project, relied on the following factors to determine a reasonable expectation of privacy in a given context:

“(i) What is the context in which a privacy claim is set up?

(ii) Does the claim relate to private or family life, or a confidential relationship?

(iii) Is the claim a serious one or is it trivial?

(iv) Is the disclosure likely to result in any serious or significant injury, and what is the nature and extent of disclosure?

(v) Does the disclosure relate to personal and sensitive information of an identified person?

(vi) Does disclosure relate to information already disclosed publicly? If so, its implication?”

These factors (acknowledged in Puttaswamy II in paragraph 292) are very similar to the ones laid down in Jarvis, i.e., there is a strong reliance on context in both cases. While there is no explicit mention of the personal attributes of the individual claiming a reasonable expectation, the holding that children should be given an opt-out indicates that the Court implicitly takes personal attributes (e.g. age) into account as well.

The Court in Jarvis further (in paragraph 39) took the example of a woman in a communal change room at a public pool. She may expect other users to incidentally observe her undress, but she would expect only other women in the change room to observe her, reserving her rights against the general public. She would also expect not to be video recorded or photographed while undressing, whether by other users of the pool or by the general public.

If it were later discovered that the change room had one-way glass allowing pool staff to watch users change, or that a concealed camera was recording persons while they changed, she could claim a breach of her reasonable expectation of privacy under such circumstances.

So, in the context of an AFRS, an individual walking down a public road may still signal that they wish to avail themselves of their right to privacy. In such contexts, a concerted surveillance mechanism may come up against constitutional roadblocks.

What is the nature of information being collected?

The second big question is the nature of the information being collected, which plays a role in determining the extent to which a person can exercise their reasonable expectation of privacy. Puttaswamy II laid down that collection of core biometric information such as fingerprints and iris scans in the context of Aadhaar-Based Biometric Authentication (‘ABBA’) is constitutionally permissible. The basis of this conclusion is that the Aadhaar Act does not deal with the individual’s intimate or private sphere.

The judgement of the Supreme Court in Puttaswamy II is confined to a very specific context (i.e. the ABBA). It does not explain or identify the contextual factors which determine the extent to which privacy may be reasonably expected over biometrics generally. In this judgment, the Court observed that demographic information and photographs do not raise a reasonable expectation of privacy under Article 21 unless there exist special circumstances, such as the disclosure of the identity of a juvenile in conflict with the law or of a rape victim.

Most importantly, the Court held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. The Court distinguished face photographs from intimate photographs or those photographs which concern confidential situations.

Face photographs, according to the Court, are shared by individuals in the ordinary course of conduct for the purpose of obtaining a driving licence, voter ID, passport, examination admit card, employment card, and so on. Face photographs by themselves reveal no information.

Naturally, this pronouncement of the Apex Court is a huge boost for the introduction of AFRS in India.

Abroad, however, on 4 September 2019, in Edward Bridges v. Chief Constable of South Wales Police, a Divisional Court of the High Court of England and Wales heard a challenge against an AFRS introduced by law enforcement (see Endnote 1). The Court rejected a claim for judicial review, holding that the AFRS in question did not violate, inter alia, the right to privacy under Article 8 of the European Convention on Human Rights (‘ECHR’).

According to the Court, the AFRS was used for specific and limited purposes, i.e., it retained data only when the image of a member of the public matched a person on an existing watchlist. The use of the AFRS was therefore considered a lawful and fair restriction.

The Court, however, acknowledged that extracting biometric data through AFRS is “well beyond the expected and unsurprising”. This seems to be a departure from the Indian Supreme Court’s observation in Puttaswamy II that there is no reasonable expectation of privacy over biometric data in the context of ABBA, and may be a wiser approach for the Indian courts to adopt.

Endnote

1. The challenge was put forth by Edward Bridges, a civil liberties campaigner from Cardiff, for being caught on camera in two particular deployments of the AFRS: a) when he was at Queen Street, a busy shopping area in Cardiff, and b) when he was at the Defence Procurement, Research, Technology and Exportability Exhibition held at the Motorpoint Arena.


This was published by AI Policy Exchange.