<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">




    



<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>These are the search results for the query, showing results 1 to 3.</description>
  
  
  
  
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india"/>
        
        
            <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/india-is-falling-down-the-facial-recognition-rabbit-hole"/>
        
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf">
    <title>Facial Recognition Technology in India</title>
    <link>http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf</link>
    <description>
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf'&gt;http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facial Recognition</dc:subject>
    

   <dc:date>2021-09-02T16:17:44Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india">
    <title>Facial Recognition Technology in India</title>
    <link>http://editors.cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india</link>
    <description>
        &lt;b&gt;The Human Rights, Big Data and Technology Project at the University of Essex, UK, and the Centre for Internet &amp; Society (CIS) have jointly published a research paper on facial recognition technology. The authors, Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha, examine technological tools such as CCTV and FRT that are increasingly being deployed by the government.&lt;/b&gt;
        &lt;h3&gt;Executive Summary&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Over the past two decades there has been a sustained effort at digitising India’s governance structure in order to foster development and innovation. The field of law enforcement and safety has seen significant change in that direction, with technological tools such as Closed Circuit Television (CCTV) and Facial Recognition Technology (FRT) increasingly being deployed by the government.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Yet for all its increased use, there is still no coherent legal and regulatory framework governing FRT in India. Towards informing such a framework, this paper seeks to document present uses of FRT in India, specifically by law enforcement agencies and central and state governments, understand the applicability of existing legal frameworks to the use of FRT, and define key areas that need to be addressed when using the technology in India. We also briefly look at how the coverage of FRT has expanded beyond law enforcement: it now extends to educational institutions and employment, and is being used in the delivery of Covid-19 vaccines.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We begin by examining use cases of FRT systems by various divisions of central and state governments. In doing so, it becomes apparent that there is a lack of uniform standards or guidelines at either the state or central level, leading to different FRT systems having differing standards of applicability and scope of use. And while the use of such systems seems to be growing at a rapid rate, questions around their legality persist.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;It is unclear whether the use of FRT is compliant with the fundamental right to privacy as affirmed by the Supreme Court in 2017 in &lt;i&gt;Puttaswamy&lt;/i&gt;. While the right to privacy is not an absolute right, any state restriction of it must satisfy a three-fold test, the first limb of which requires an explicit legislative mandate for the curtailment. The FRT systems we have analysed, however, have no such mandate: they are often the product of administrative or executive decisions, with no legislative blessing or judicial oversight.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;We further locate the use of FRT within the country’s wider legislative, judicial and constitutional frameworks governing surveillance, and briefly articulate comparative perspectives on the use of FRT in other jurisdictions. We also analyse the impact of the proposed Personal Data Protection Bill on the deployment of FRT. Finally, we propose a set of recommendations to develop a path forward for the technology’s use, including the need for a comprehensive legal and regulatory framework that governs the use of FRT. Such a framework must take into consideration necessity of use, proportionality, consent, security, retention, redressal mechanisms, purpose limitation, and other such principles. Since the use of FRT in India is still at a nascent stage, it is imperative that there is greater public research and dialogue on its development and use, to ensure that any harms that may arise are mitigated.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;Click to download the entire &lt;a href="http://editors.cis-india.org/internet-governance/facial-recognition-technology-in-india.pdf" class="external-link"&gt;research paper here&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india'&gt;http://editors.cis-india.org/internet-governance/blog/hrbdt-and-cis-august-31-2021-facial-recognition-technology-in-india&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Elonnai Hickok, Pallavi Bedi, Aman Nair and Amber Sinha</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Privacy</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Facial Recognition</dc:subject>
    

   <dc:date>2021-09-02T16:21:24Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/india-is-falling-down-the-facial-recognition-rabbit-hole">
    <title>India is falling down the facial recognition rabbit hole</title>
    <link>http://editors.cis-india.org/internet-governance/blog/india-is-falling-down-the-facial-recognition-rabbit-hole</link>
    <description>
        &lt;b&gt;Its use as an effective law enforcement tool is overstated, while the underlying technology is deeply flawed.&lt;/b&gt;
        
&lt;p&gt;The article by Prem Sylvester and Karan Saini was published in &lt;a href="https://thewire.in/tech/india-is-falling-down-the-facial-recognition-rabbit-hole"&gt;the Wire&lt;/a&gt; on July 23, 2019.&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;div class="grey-text"&gt;
&lt;p&gt;In a 
discomfiting reminder of how far technology can be used to intrude on 
the lives of individuals in the name of security, the Ministry of Home 
Affairs, through the National Crime Records Bureau, &lt;a href="http://ncrb.gov.in/TENDERS/AFRS/RFP_NAFRS.pdf"&gt;recently put out a tender&lt;/a&gt; for a new Automated Facial Recognition System (AFRS).&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The stated objective of this system is to “act as a foundation for a national level searchable platform of facial images,” and to “[improve]
 outcomes in the area of criminal identification and verification by 
facilitating easy recording, analysis, retrieval and sharing of 
Information between different organizations.”&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The system will pull facial image 
data from CCTV feeds and compare these images with existing records in a
 number of databases, including (but not limited to) the Crime and 
Criminal Tracking Networks and Systems (or CCTNS), Interoperable 
Criminal Justice System (or ICJS), Immigration Visa Foreigner 
Registration Tracking (or IVFRT), Passport, Prisons, Ministry of Women 
and Child Development (KhoyaPaya), and state police records.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Furthermore, this system of facial 
recognition will be integrated with the yet-to-be-deployed National 
Automated Fingerprint Identification System (NAFIS) as well as other 
biometric databases to create what is effectively a multi-faceted system
 of biometric surveillance.&lt;/p&gt;
&lt;p&gt;It is rather unfortunate, then, that 
the government has called for bids on the AFRS tender without any form 
of utilitarian calculus that might justify its existence. The tender 
simply states that this system would be “a great investigation 
enhancer.”&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This confidence is misplaced at best.
 There is significant evidence that not only is a facial recognition 
system, as has been proposed, &lt;a href="https://www.nytimes.com/2019/07/01/us/facial-recognition-san-francisco.html"&gt;ineffective in its application as a crime-fighting tool&lt;/a&gt;, but it is a significant &lt;a href="https://www.independent.co.uk/news/uk/home-news/facial-recognition-uk-police-london-trials-inaccurate-legal-results-ethics-a8938851.html"&gt;threat to the privacy rights and dignity of citizens&lt;/a&gt;.
 Notwithstanding the question of whether such a system would ultimately 
pass the test of constitutionality – on the grounds that it affects 
various freedoms and rights guaranteed within the constitution – there 
are a number of faults in the issued tender.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;Let us first consider the mechanics of a facial recognition system itself. Facial recognition systems &lt;a href="https://medium.com/@ageitgey/machine-learning-is-fun-part-4-modern-face-recognition-with-deep-learning-c3cffc121d78"&gt;chain together a number of algorithms to identify&lt;/a&gt;
 and pick out specific, distinctive details about a person’s face – such
 as the distance between the eyes, or shape of the chin, along with 
distinguishable ‘facial landmarks’. These details are then converted 
into &lt;a href="https://www.eff.org/pages/face-recognition"&gt;a mathematical representation known as a face template&lt;/a&gt;&amp;nbsp;for
 comparison with similar data on other faces collected in a face 
recognition database. There are, however, several problems with facial 
recognition technology that employs such methods.&amp;nbsp;&lt;/p&gt;
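The pipeline just described – distinctive details distilled into a numeric template, then compared against a database – can be sketched in miniature. This is a hypothetical illustration, not the AFRS design: real systems derive templates with deep networks, and the record labels, vectors and values below are invented.

```python
import math

# Hypothetical sketch of template matching: a "face template" is modelled
# as a fixed-length feature vector; matching finds the enrolled template
# nearest to the probe by Euclidean distance. All numbers are invented.
def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, gallery):
    # gallery maps an identity label to its enrolled template
    return min(gallery.items(), key=lambda item: distance(probe, item[1]))

gallery = {
    "record_001": [0.10, 0.90, 0.30],
    "record_002": [0.80, 0.20, 0.55],
}
probe = [0.12, 0.88, 0.31]  # template extracted from, say, a CCTV frame
label, template = best_match(probe, gallery)
```

A real matcher would also apply a distance threshold and report no match when even the nearest template is too far away – which is exactly the point at which false positives enter.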
&lt;p&gt;Facial recognition technology depends on machine learning – the tender itself mentions that the AFRS is expected to work on neural networks “or similar technology” – which is far from perfect. At a relatively trivial level, there are several ways to fool facial recognition systems, including wearing &lt;a href="https://www.theguardian.com/technology/2016/nov/03/how-funky-tortoiseshell-glasses-can-beat-facial-recognition"&gt;eyewear&lt;/a&gt;, or &lt;a href="https://theoutline.com/post/5172/juggalo-juggalette-facepaint-makeup-hack-beat-facial-recognition-technology?curator=MusicREDEF&amp;amp;zd=4&amp;amp;zi=s7q4e3fe"&gt;specific types of makeup&lt;/a&gt;. The training sets for the algorithm itself can be deliberately poisoned to recognise objects incorrectly, &lt;a href="https://www.theregister.co.uk/2017/11/06/mit_fooling_ai/"&gt;as observed by students at MIT&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;More consequentially, these systems 
often throw up false positives, such as when the face recognition system
 incorrectly matches a person’s face (say, from CCTV footage) to an 
image in a database (say, a mugshot), which might result in innocent 
citizens being identified as criminals. In a &lt;a href="https://www.bka.de/SharedDocs/Downloads/EN/Publications/Other/photographBasedSearchesFinalReport.pdf?__blob=publicationFile&amp;amp;v=1"&gt;real-time experiment&lt;/a&gt; set in a train station in Mainz, Germany,
 facial recognition accuracy ranged from 17-29% – and that too only for 
faces seen from the front – and was at 60% during the day but 10-20% at 
night, indicating that environmental conditions play a significant role 
in this technology.&lt;/p&gt;
&lt;p&gt;Facial recognition software used by the UK’s Metropolitan Police &lt;a href="https://www.independent.co.uk/news/uk/home-news/met-police-facial-recognition-success-south-wales-trial-home-office-false-positive-a8345036.html" rel="noopener" target="_blank"&gt;has returned false positives in more than 98% of match alerts generated&lt;/a&gt;.&lt;/p&gt;
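Alert rates like these are less surprising once base rates are taken into account. The back-of-the-envelope calculation below uses invented numbers (they are not from the Met Police trials) to show how even a seemingly accurate matcher produces overwhelmingly false alerts when persons of interest are rare in the scanned crowd.

```python
# Invented illustrative numbers: 1,000,000 faces scanned, 1 in 10,000 is a
# genuine person of interest, and the matcher alerts on 99% of genuine
# targets (true-positive rate) and 1% of everyone else (false-positive rate).
crowd = 1_000_000
targets = crowd // 10_000            # 100 genuine persons of interest
tpr, fpr = 0.99, 0.01

true_alerts = targets * tpr                # about 99 correct alerts
false_alerts = (crowd - targets) * fpr     # about 9,999 false alerts
share_false = false_alerts / (true_alerts + false_alerts)
# share_false comes out near 0.99: roughly 99% of alerts point at innocent people
```

The ratio is driven almost entirely by the rarity of genuine targets, not by the matcher's headline accuracy.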
&lt;p&gt;When the American Civil Liberties Union (ACLU) &lt;a href="https://www.aclu.org/blog/privacy-technology/surveillance-technologies/amazons-face-recognition-falsely-matched-28"&gt;used&lt;/a&gt;
 Amazon’s face recognition system, Rekognition, to compare images of 
legislative members of the American Congress with a database of 
mugshots, the results included 28 incorrect matches.&lt;/p&gt;
&lt;p&gt;There is another uncomfortable reason
 for these inaccuracies – facial recognition systems often reflect the 
biases of the society they are deployed in, leading to problematic 
face-matching results. Technological objectivity is largely a myth, and 
facial recognition offers a stark example of this.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;&lt;a href="http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf"&gt;An MIT study&lt;/a&gt; shows that existing facial recognition technology routinely misidentifies
 people of darker skin tone, women and young people at high rates, 
performing better on male faces than female faces (8.1% to 20.6% 
difference in error rate), lighter faces than darker faces (11.8% to 
19.2% difference in error rate) and worst on darker female faces (20.8% 
to 34.7% error rate). In the aforementioned ACLU study, the false 
matches were disproportionately people of colour, particularly 
African-Americans. The bias rears its head when the parameters of 
machine-learning algorithms, derived from labelled data during a 
“supervised learning” phase, adhere to socially-prejudiced ideas of who 
might commit crimes.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The implications for facial 
recognition are chilling. In an era of pervasive cameras and big data, 
such prejudice can be applied at unprecedented scale through facial 
recognition systems. By replacing biased human judgment with a machine 
learning technique that embeds the same bias, and more reliably, we 
defeat any claims of technological neutrality. Worse, because humans 
will assume that the machine’s “judgment” is not only consistently fair 
on average but independent of their personal biases, they will read 
agreement of its conclusions with their intuition as independent 
corroboration.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;In the Indian context, consider that Muslims, Dalits, Adivasis and other SC/STs are &lt;a href="https://www.newsclick.in/how-caste-plays-out-criminal-justice-system"&gt;disproportionately targeted&lt;/a&gt; by law enforcement. The NCRB, in its 2015 report on prison statistics in India, recorded that over 55% of undertrial prisoners in India are Dalits, Adivasis or Muslims – grossly disproportionate to the combined population of these communities, which amounts to just 39% of the total population according to the 2011 Census.&lt;/p&gt;
&lt;p&gt;If the AFRS is thus trained on these records, it would clearly reinforce socially-held prejudices against these communities, however inaccurately representative those records may be of who actually carries out crimes. The tender gives no indication that the developed system would need to eliminate or even minimise these biases, nor whether the results of the system would be human-verifiable.&lt;/p&gt;
&lt;p&gt;This could lead to a runaway effect 
if subsequent versions of the machine-learning algorithm are trained 
with criminal convictions in which the algorithm itself played a causal 
role. Taking such a feedback loop to its logical conclusion, law 
enforcement may use machine learning to allocate police resources to 
likely crime spots – which would often be in low income or otherwise 
vulnerable communities.&lt;/p&gt;
&lt;p&gt;Adam Greenfield writes in &lt;em&gt;Radical Technologies&lt;/em&gt; on the idea of ‘over-transparency,’ which combines the “bias” of the system’s designers as well as of the training sets – based as these systems are on machine learning – with the “legibility” of the data from which patterns may be extracted. The “meaningful question,” then, isn’t limited to whether facial recognition technology works in identification – “[i]t’s whether someone believes that they do, and acts on that belief.”&lt;/p&gt;
&lt;p&gt;The question thus arises as to why the MHA/NCRB believes this is an effective tool for law enforcement. We’re led, then, to another, larger concern with the AFRS – that it deploys a system of surveillance that oversteps its mandate of law enforcement. The AFRS ostensibly circumvents the fundamental right to privacy, as affirmed by the Supreme Court in 2017, by sourcing its facial images from CCTV cameras installed in public locations, where the citizen may expect to be observed.&lt;/p&gt;
&lt;p&gt;The extent of this surveillance is made even clearer when one observes that the range of databases mentioned in the tender for the purposes of matching with suspects’ faces extends to “any other image database available with police/other entity” besides the previously mentioned CCTNS, ICJS et al. The choice of these databases makes overreach extremely viable.&lt;/p&gt;
&lt;p&gt;This is compounded when we note that the tender expects the system to “[m]atch suspected criminal face[sic] from pre-recorded video feeds obtained from CCTVs deployed in various critical identified locations, or with the video feeds received from private or other public organization’s video feeds.” There further arises a concern with regard to the process of identifying such “critical […] locations,” and whether there would be any mechanisms in place to prevent this from being turned into an unrestrained system of surveillance, particularly given the stated access to private organisations’ feeds.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://www.perpetuallineup.org/sites/default/files/2016-12/The%20Perpetual%20Line-Up%20-%20Center%20on%20Privacy%20and%20Technology%20at%20Georgetown%20Law%20-%20121616.pdf"&gt;The Perpetual Lineup report&lt;/a&gt;
 by Georgetown Law’s Center on Privacy &amp;amp; Technology identifies 
real-time (and historic) video surveillance as posing a very high risk 
to privacy, civil liberties and civil rights, especially owing to the 
high-risk factors of the system using real-time dragnet searches that 
are more or less invisible to the subjects of surveillance.&lt;/p&gt;
&lt;p&gt;It is also designated a “Novel Use” system of criminal identification, i.e., one with little to no precedent as compared to fingerprint or DNA analysis – the latter of which was responsible for numerous wrongful convictions during its nascent application in forensic identification, many of which have since been overturned.&lt;/p&gt;
&lt;p&gt;In the &lt;em&gt;Handbook of Face Recognition&lt;/em&gt;, Andrew W. Senior and Sharathchandra Pankanti identify a more serious threat that may be born out of automated facial recognition, assessing that “these systems also have the potential […] to make judgments about [subjects’] actions and behaviours, as well as aggregating this data across days, or even lifetimes,” making video surveillance “an efficient, automated system that observes everything in front of any of its cameras, and allows all that data to be reviewed instantly, and mined in new ways” that allow constant tracking of subjects.&lt;/p&gt;
&lt;p&gt;Such “blanket, omnivident surveillance networks” are a serious possibility through the proposed AFRS. &lt;a href="https://jis-eurasipjournals.springeropen.com/track/pdf/10.1155/2009/865259"&gt;Ye et al, in their paper on “Anonymous biometric access control”&lt;/a&gt;, show how automatically captured location and facial image data obtained from cameras designed to track the same can be used to learn graphs of social networks in groups of people.&lt;/p&gt;
&lt;p&gt;Consider those charged with sedition or similar &lt;em&gt;crimes&lt;/em&gt;,
 given that the CCTNS records the details as noted in FIRs across the 
country. Through correlating the facial image data obtained from CCTVs 
across the country – the tender itself indicates that the system must be
 able to match faces obtained from two (or more) CCTVs – this system 
could easily be used to target the movements of dissidents moving across
 locations.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Constantly watched&lt;/strong&gt;&lt;/p&gt;
&lt;p&gt;Further, something which has not been
 touched upon in the tender – and which may ultimately allow for a 
broader set of images for carrying out facial recognition – is the 
definition of what exactly constitutes a ‘criminal’. Is it when an FIR 
is registered against an individual, or when s/he is arrested and a 
chargesheet is filed? Or is it only when an individual is convicted by a
 court that they are considered a criminal?&lt;/p&gt;
&lt;p&gt;Additionally, does a person cease to be recognised by the tag of a &lt;em&gt;criminal&lt;/em&gt; once s/he has served their prison sentence and paid their dues to society? Or are they instead marked as higher-risk individuals who may potentially commit crimes again? It could be argued that such a definition is not warranted in a tender document; however, these are legitimate questions which should be answered prior to commissioning and building a &lt;em&gt;criminal&lt;/em&gt; facial recognition system.&lt;/p&gt;
&lt;p&gt;Senior and Pankanti note the generalised metaphysical consequences of pervasive video surveillance in the &lt;em&gt;Handbook of Face Recognition:&lt;/em&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;“the 
feeling of disquiet remains [even if one hasn’t committed a major 
crime], perhaps because everyone has done something “wrong”, whether in 
the personal or legal sense (speeding, parking, jaywalking…) and few 
people wish to live in a society where all its laws are enforced 
absolutely rigidly, never mind arbitrarily, and there is always the 
possibility that a government to which we give such powers may begin to 
move towards authoritarianism and apply them towards ends that we do not
 endorse.”&lt;/p&gt;
&lt;p&gt;Such a seemingly apocalyptic scenario
 isn’t far-fetched. In the section on ‘Mandatory Features of the AFRS’, 
the system goes a step further and is expected to integrate “with other 
biometric solution[sic] deployed at police department system like 
Automatic Fingerprint identification system (AFIS)[sic]” and “Iris.” 
This form of linking of biometric databases opens up possibilities of a 
dangerous extent of profiling.&lt;/p&gt;
&lt;p&gt;While the Aadhaar Act, 2016, 
disallows Aadhaar data from being handed over to law enforcement 
agencies, the AFRS and its linking with biometric systems (such as the 
NAFIS) effectively bypasses the minimal protections from biometric 
surveillance the prior unavailability of Aadhaar databases might have 
afforded. The fact that India does not have a data protection law yet – 
and the Bill makes no references to protection against surveillance 
either – deepens the concern with the usage of these integrated 
databases.&amp;nbsp;&lt;/p&gt;
&lt;p&gt;The Perpetual Lineup report warns that the government could use biometric technology “to identify multiple people in a continuous, ongoing manner […] from afar, in public spaces,” allowing identification “to be done in secret”. Senior and Pankanti warn of “function creep,” where the public grows uneasy as “silos of information, collected for an authorized process […] start being used for purposes not originally intended, especially when several such databases are linked together to enable searches across multiple domains.”&lt;/p&gt;
&lt;p&gt;This, as Adam Greenfield points out, 
could very well erode “the effectiveness of something that has 
historically furnished an effective brake on power: the permanent 
possibility that an enraged populace might take to the streets in 
pursuit of justice.”&lt;/p&gt;
&lt;p&gt;What the NCRB’s AFRS amounts to, 
then, is a system of public surveillance that offers little demonstrable
 advantage to crime-fighting, especially as compared with its costs to 
fundamental human rights of privacy and the freedom of assembly and 
association. This, without even delving into its implications with 
regard to procedural law. To press on with this system, then, would be 
indicative of the government’s lackadaisical attitude towards protecting
 citizens’ freedoms.&amp;nbsp;&lt;/p&gt;
&lt;hr /&gt;
&lt;p&gt;&lt;em&gt;The views expressed by the authors in this article are 
personal.&lt;/em&gt;&lt;/p&gt;
&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/india-is-falling-down-the-facial-recognition-rabbit-hole'&gt;http://editors.cis-india.org/internet-governance/blog/india-is-falling-down-the-facial-recognition-rabbit-hole&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>Prem Sylvester and Karan Saini</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Cyber Security</dc:subject>
    
    
        <dc:subject>Facial Recognition</dc:subject>
    

   <dc:date>2019-07-25T13:40:00Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
