The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 81 to 95.
‘Future of Work’ in India’s IT/IT-es Sector
http://editors.cis-india.org/internet-governance/blog/future-of-work-in-india-it-it-es-sector
<b>The Centre for Internet and Society has recently undertaken research into the impact of Industry 4.0 on work in India. Industry 4.0, for the purposes of this research, is conceptualised as the technical integration of cyber-physical systems (CPS) into production and logistics, and the use of the ‘internet of things’ (connections between everyday objects) and services in (industrial) processes. Through this research, CIS seeks to complement and contribute to the discourse and debates in India around the impact of Industry 4.0. In furtherance of the same, this report explores several key themes underpinning the impact of Industry 4.0, specifically in the IT/IT-eS sector and more broadly on the nature of work itself.</b>
<p> </p>
<h4>Read the complete case-study here: <a href="http://editors.cis-india.org/internet-governance/2018future-of-work2019-in-india2019s-it-it-es-sector-pdf" class="internal-link" title="‘Future of Work’ in India’s IT/IT-eS Sector pdf">Download</a> (PDF)</h4>
<hr />
<h3><strong>Introduction</strong></h3>
<p>Scholarship on 'Industry 4.0' that has emerged globally has sought to address the challenges of technological forecasting as it relates to work in varied forms. For instance, the Frey-Osborne method examines the characteristic tasks of each occupation and suggests that almost half of all jobs in the United States and other advanced countries are at risk of being substituted by computers or algorithms within the next 10 to 20 years. [1] On the other hand, scholars such as Autor and Handel, as well as research produced by the OECD on this subject, argue that occupations as a whole are unlikely to be automated, as there is great variability in the tasks within each occupation. [2] Existing literature on the impact on jobs in India's IT sector has likewise arrived at mixed conclusions. Reports have raised concerns about job loss in the sector as a result of automation, [3] while it has also been reported that employment in the IT sector reached 3.86 million in 2016-17, with around 105,000 jobs added in FY18 alone. [4]</p>
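Frey and Osborne's occupation-level approach lends itself to a simple worked illustration: each occupation is assigned a single probability of computerisation, and the headline figure is the share of total employment in occupations above a high-risk threshold (0.7 in their study). The sketch below is a toy example with invented occupations and figures, not their actual data or model:

```python
# Toy illustration of the Frey-Osborne style of estimate.
# Employment shares and automation probabilities below are invented.
occupations = {
    # name: (employment share, probability of computerisation)
    "telemarketing": (0.02, 0.99),
    "software dev":  (0.05, 0.04),
    "data entry":    (0.03, 0.98),
    "management":    (0.10, 0.15),
}

HIGH_RISK = 0.7  # threshold used in the Frey-Osborne study

def share_at_high_risk(occs):
    """Share of total employment in occupations above the risk threshold."""
    total = sum(share for share, _ in occs.values())
    risky = sum(share for share, p in occs.values() if p > HIGH_RISK)
    return risky / total

print(round(share_at_high_risk(occupations), 2))  # 0.25 for these toy figures
```

Autor and Handel's critique, by contrast, is precisely that such a single per-occupation score hides the variability of tasks within each occupation.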
<p>In this context, it is crucial to start by developing an understanding of which technologies are at the forefront of bringing in Industry 4.0. Such an understanding will, in turn, help identify which jobs, and more specifically which job functions, are at the greatest risk of being replaced by automation technologies. To further contextualise the impact, it is imperative to develop a comprehensive understanding of how job functions are organised within the sector itself. This becomes especially relevant given the emphasis Industry 4.0 places on the horizontal and vertical integration of its constituent technologies. [5]</p>
<p>It is anticipated that staying ahead of the curve of ‘technological unemployment’ will pose significant skilling and re-skilling challenges in adding new talent to emerging job roles. [6] The skilling challenge gains added importance in the broader context of nurturing an inclusive digital economy. [7] This is particularly relevant for female labour force participation, since it has been predicted that job creation will be concentrated in sectors where women are underrepresented and difficult to retain, while sectors with higher female participation, such as secretarial work, will undergo job losses. [8]</p>
<p>However, it is not clear how these trends will play out in the future, particularly because other structural changes are taking place simultaneously (such as globalisation and protectionism, demographic change, policymaking, technological adoption, etc.).</p>
<h3><strong>Objective and Scope</strong></h3>
<p>This research seeks to contribute to existing studies and dialogue on the impact of Industry 4.0 on work in the Information Technology services (IT) sector in India. Though the research focuses on the impact of the technologies that comprise Industry 4.0, such technologies are frequently referred to interchangeably as ‘automation’ and ‘digitisation’. Thus, the desk research also examines the impact of ‘automation’ and ‘digitisation’ on the IT sector in India. The case study looks at the IT sector broadly and, where applicable, calls out information specific to sub-sectors such as IT enabled services (IT-eS) or Business Process Management (IT-BPM). The IT sector in India is uniquely placed: it produces the technologies that are disrupting work in other industries while also implementing them internally. This report focuses on the latter, but brings in the former when relevant to work in the sector.</p>
<p>By drawing out trends and analysing contextual, quantitative and qualitative data on changes to work and labour markets in India resulting from technological uptake, this research aims to enable comparative study by creating a framework that can be replicated in other contexts, particularly developing ones.</p>
<p> </p>
<h3><strong>References</strong></h3>
<p>[1] Carl Benedikt Frey and Michael A. Osborne, 2013. The future of employment: How susceptible are jobs to computerisation?, Oxford Martin School, September.</p>
<p>[2] See David H. Autor & Michael J. Handel, 2013. “Putting Tasks to the Test: Human Capital, Job Tasks, and Wages,” Journal of Labor Economics, University of Chicago Press, Vol. 31(S1), pp. S59-S96. See also: Future of Work and Skills, The Organisation for Economic Co-operation and Development, February 2017.</p>
<p>[3] Business Today, AI, automation will cost 7 lakh IT jobs by 2022, says report (November 7, 2017). Retrieved from <a href="https://www.businesstoday.in/sectors/it/ai-and-automation-to-cost-7-lakh-it-jobs-by-2022-says-report/story/259880.html">https://www.businesstoday.in/sectors/it/ai-and-automation-to-cost-7-lakh-it-jobs-by-2022-says-report/story/259880.html</a></p>
<p>[4] Advantage India, India Brand Equity Foundation. Retrieved from <a href="https://www.ibef.org/download/IT-ITeS-Report-Apr-2018.pdf">https://www.ibef.org/download/IT-ITeS-Report-Apr-2018.pdf</a></p>
<p>[5] Embracing Industry 4.0 and Rediscovering Growth, Boston Consulting Group. Retrieved from <a href="https://www.bcg.com/capabilities/operations/embracing-industry-4.0-rediscovering-growth.aspx">https://www.bcg.com/capabilities/operations/embracing-industry-4.0-rediscovering-growth.aspx</a></p>
<p>[6] India’s Readiness for Industry 4.0: A Focus on Automotive Sector, Grant Thornton and the Confederation of Indian Industry. Retrieved from <a href="http://www.nasscom.in/sites/default/files/NASSCOM_Annual_Guidance_Final_22062017.pdf">http://www.nasscom.in/sites/default/files/NASSCOM_Annual_Guidance_Final_22062017.pdf</a></p>
<p>[7] G20 Insights, Bridging the digital divide: Skills for the new age. Retrieved from <a href="http://www.g20-insights.org/policy_briefs/bridging-digital-divide-skills-new-age/">http://www.g20-insights.org/policy_briefs/bridging-digital-divide-skills-new-age/</a></p>
<p>[8] World Economic Forum, The Future of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution (January 2016).</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/future-of-work-in-india-it-it-es-sector'>http://editors.cis-india.org/internet-governance/blog/future-of-work-in-india-it-it-es-sector</a>
</p>
No publisher | Aayush Rathi and Elonnai Hickok | Future of Work, internet governance, Internet Governance | 2020-04-28T09:52:59Z | Blog Entry
RBI Ban on Cryptocurrencies not backed by any data or statistics
http://editors.cis-india.org/internet-governance/blog/rbi-ban-on-cryptocurrencies-not-backed-by-any-data-or-statistics
<b>In March 2020, the Supreme Court of India quashed the RBI order passed in 2018 that banned financial services firms from trading in virtual currency or cryptocurrency. Keeping this policy window in mind, the Centre for Internet & Society will be releasing a series of blog posts and policy briefs on cryptocurrency regulation in India.</b>
<p id="docs-internal-guid-9ddef591-7fff-b8f5-3c20-c4a78d53d066" style="text-align: justify;" dir="ltr"> </p>
<p style="text-align: justify;" dir="ltr">On April 6, 2018, <a href="https://www.rbi.org.in/Scripts/NotificationUser.aspx?Id=11243&Mode=0">the RBI issued a circular</a> prohibiting all Commercial and Co-operative Banks, Payments Banks, Small Finance Banks, NBFCs, and Payment System Providers from dealing in virtual currencies themselves, and directing them to stop providing services to any entity dealing in virtual currencies. The RBI had issued a Press Release cautioning the public against dealing in virtual currencies, including Bitcoin, as early as 2013. The growing popularity of cryptocurrencies and their adoption by large numbers of Indian users may have prompted the RBI to issue another Press Release in February 2017, reiterating the concerns raised in its 2013 Press Release. In December 2017, both the RBI and the Ministry of Finance issued Press Releases cautioning the general public about the dangers and risks associated with cryptocurrencies, culminating in the circular dated April 6, 2018 banning financial institutions from dealing with cryptocurrency traders. As a result of this circular, the operations of cryptocurrency exchanges took a severe hit and the number of transactions on these exchanges fell substantially. The cryptocurrency market in India all but disappeared, with only a few extremely determined enthusiasts still dealing in cryptocurrencies at the risk of being deprived of banking services altogether.</p>
<p style="text-align: justify;" dir="ltr">The RBI circular was challenged in the Supreme Court by the Internet and Mobile Association of India; final arguments in the case concluded in the last week of January 2020, and the judgment of the Supreme Court is awaited. Generally, when such policy decisions of the executive branch are challenged in the courts, a well-accepted defense for executive authorities, particularly in highly complicated fields such as finance, is that the decision was taken by an expert body using its expertise in the field. The underlying rationale is that the authority relied on verifiable data and used its expertise to analyse that data in order to arrive at its decision.</p>
<p style="text-align: justify;" dir="ltr">However, the RBI's response to an RTI query by the Centre for Internet and Society, which requested copies of all reports, papers, opinions and advice relied upon in issuing the April 6, 2018 circular, suggests that the RBI did not rely on any such data in concluding that banking services should be denied to all entities dealing in cryptocurrencies. The response indicates that the RBI's own previous circulars and press releases formed the basis for the April 6, 2018 circular. This completely undermines the argument that the RBI's decision was taken after an analysis of all the facts and statistics concerning cryptocurrency trading.</p>
<p style="text-align: justify;" dir="ltr">Not only does the RTI response weaken the commonly accepted defense of an expert body making a well-reasoned decision, it also strengthens another legal ground for challenging the RBI's decision, viz. arbitrariness. One of the grounds on which executive decisions can be challenged is that the decision was made without taking relevant material into account and without application of mind. The RBI's admission in its RTI response that it relied on no material other than its own previous Press Releases only strengthens the argument that the decision was made in an arbitrary manner.</p>
<p style="text-align: justify;" dir="ltr">Such an admission by the RBI regarding the process followed before issuing the April 6, 2018 circular reduces the credibility of the decision itself. However, it remains to be seen whether the Supreme Court of India agrees with the arguments of the petitioners challenging the circular, even though the petitioners may not have been able to produce this RTI response from the RBI to further bolster their case.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rbi-ban-on-cryptocurrencies-not-backed-by-any-data-or-statistics'>http://editors.cis-india.org/internet-governance/blog/rbi-ban-on-cryptocurrencies-not-backed-by-any-data-or-statistics</a>
</p>
No publisher | vipul | Cybersecurity, internet governance, Bitcoin, Internet Governance, Cryptocurrencies, Cyber Security | 2020-03-05T18:35:48Z | Blog Entry
Cryptocurrency Regulation in India – A brief history
http://editors.cis-india.org/internet-governance/blog/cryptocurrency-regulation-in-india-2013-a-brief-history
<b>In March 2020, the Supreme Court of India quashed the RBI order passed in 2018 that banned financial services firms from trading in virtual currency or cryptocurrency. Keeping this policy window in mind, the Centre for Internet & Society will be releasing a series of blog posts and policy briefs on cryptocurrency regulation in India.</b>
<p id="docs-internal-guid-18286fb9-7fff-c656-6a5b-a01a2e2b3682" style="text-align: justify;" dir="ltr"> </p>
<p style="text-align: justify;" dir="ltr">The story of cryptocurrencies began in 2008, when a paper titled “Bitcoin: A Peer to Peer Electronic Cash System” was published by a pseudonymous developer (or group of developers) using the name Satoshi Nakamoto. The network itself took some time to start, with the first transactions taking place only in January 2009. The first actual sale of an item using Bitcoin took place a year later, when a user swapped 10,000 Bitcoin for two pizzas in 2010, attaching a cash value to the cryptocurrency for the first time. By 2011, other cryptocurrencies began to emerge, with Litecoin, Namecoin and Swiftcoin all making their debut. Meanwhile Bitcoin, the cryptocurrency that started it all, drew criticism after claims emerged that it was being used on the so-called “dark web”, particularly on sites such as Silk Road, as a means of payment for illegal transactions. Over the next five years cryptocurrencies steadily gained traction, with a growing number of transactions, and the price of Bitcoin, the most popular cryptocurrency, shot up from around 5 dollars at the beginning of 2012 to nearly 20,000 dollars at the end of 2017.</p>
<p style="text-align: justify;" dir="ltr">Riding this wave of popularity, a number of cryptocurrency exchanges started operating in India between 2012 and 2017, providing much-needed depth and volume to the Indian cryptocurrency market. These included popular exchanges such as Zebpay, Coinsecure, Unocoin, Koinex, Pocket Bits and Bitxoxo. With the price of cryptocurrencies shooting up, and with their increased popularity and adoption by users outside their traditional cult following, regulators worldwide began to take notice of this new technology; in India, the RBI issued a Press Release cautioning the public against dealing in virtual currencies, including Bitcoin, as far back as 2013. However, transaction volumes and the adoption of cryptocurrencies in India picked up in earnest only after the demonetisation of high-value currency notes in November 2016, when the government’s emphasis on digital payments pushed alternatives to traditional online banking, such as cryptocurrencies, into the public consciousness. Indian cryptocurrency exchanges started acquiring users at a much higher pace, which drove up the volume of cryptocurrency transactions on all Indian exchanges. The growing popularity of cryptocurrencies and their adoption by large numbers of Indian users prompted the RBI to issue another Press Release in February 2017, reiterating the concerns regarding cryptocurrencies raised in its earlier Press Release of 2013.</p>
<p style="text-align: justify;" dir="ltr">In October and November 2017, two Public Interest Petitions were filed in the Supreme Court of India, one by Siddharth Dalmia and another by Dwaipayan Bhowmick, the former asking the Supreme Court to restrict the sale and purchase of cryptocurrencies in India, and the latter asking for cryptocurrencies in India to be regulated. Both petitions are currently pending before the Supreme Court.</p>
<p style="text-align: justify;" dir="ltr">In November 2017, the Government of India constituted a high-level Inter-ministerial Committee under the chairmanship of Shri Subhash Chandra Garg, Secretary, Department of Economic Affairs, Ministry of Finance, and comprising Shri Ajay Prakash Sawhney (Secretary, Ministry of Electronics and Information Technology), Shri Ajay Tyagi (Chairman, Securities and Exchange Board of India) and Shri B.P. Kanungo (Deputy Governor, Reserve Bank of India). The Committee's mandate was to study various issues pertaining to Virtual Currencies and to propose specific actions that may be taken in relation thereto. The Committee submitted its report in July 2019, recommending a ban on private cryptocurrencies in India.</p>
<p style="text-align: justify;" dir="ltr">In December 2017, both the RBI and the Ministry of Finance issued Press Releases cautioning the general public about the dangers and risks associated with cryptocurrencies, with the Ministry of Finance Press Release likening cryptocurrencies to Ponzi schemes and declaring that they are not currencies or coins. It should be noted that although the RBI and the Finance Ministry had issued various Press Releases cautioning people against the risks of cryptocurrencies until the end of March 2018, none of them took any legal action or gave any enforceable directions against cryptocurrencies. All of this changed with the RBI circular dated April 6, 2018, whereby the RBI prohibited Commercial and Co-operative Banks, Payments Banks, Small Finance Banks, NBFCs, and Payment System Providers from dealing in virtual currencies themselves, and directed them to stop providing services to all entities dealing in virtual currencies.</p>
<p style="text-align: justify;" dir="ltr">The effect of the circular was that cryptocurrency exchanges, which relied on normal banking channels for sending and receiving money to and from their users, could not access any banking services within India. This essentially crippled their business operations, since converting cash to cryptocurrencies and vice versa was an essential part of their business. Even pure cryptocurrency exchanges, which did not deal in fiat currency, were unable to carry out regular operations such as paying for office space, staff salaries, server space and vendor payments without access to banking services.</p>
<p>As a result, the operations of cryptocurrency exchanges took a severe hit and the number of transactions on these exchanges fell substantially. People who had bought cryptocurrencies on these exchanges as an investment were forced to sell their crypto assets and cash out before they lost access to banking facilities. The cryptocurrency exchanges themselves found it hard to sustain operations in the face of the dual hit of reduced transaction volumes and loss of access to banking services. Faced with such an existential threat, a number of exchanges that were members of the Internet and Mobile Association of India (IAMAI) filed a writ petition in the Supreme Court on May 15, 2018, titled Internet and Mobile Association of India v. Reserve Bank of India; final arguments were heard by the Supreme Court of India in January 2020 and the judgment is awaited. If the Supreme Court agrees with the arguments of the petitioners, cryptocurrency exchanges would be able to restart operations in India; as a result, the cryptocurrency ecosystem in India may be revived and cryptocurrencies may once again become a viable investment alternative.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cryptocurrency-regulation-in-india-2013-a-brief-history'>http://editors.cis-india.org/internet-governance/blog/cryptocurrency-regulation-in-india-2013-a-brief-history</a>
</p>
No publisher | vipul | Cybersecurity, internet governance, Bitcoin, Internet Governance, Cryptocurrencies, Cyber Security | 2020-03-05T18:36:09Z | Blog Entry
A Compilation of Research on the PDP Bill
http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection
<b>The most recent step in India’s initiative to create an effective and comprehensive Data Protection regime was the call for comments to the Personal Data Protection Bill, 2019, which closed last month. Leading up to the comments, CIS has published numerous research pieces with the goal of providing a comprehensive overview of how this legislation would place India within the global scheme, and how the local situation has developed, as well as analysing its impacts on citizens’ rights.</b>
<p> </p>
<p>In addition to general and clause-by-clause comments and recommendations, we
have compiled an annotated version of the Personal Data Protection
Bill, which lays out our <a class="external-link" href="https://cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019">commentary</a> in an easy-to-follow format.</p>
<p> </p>
<p><img src="https://cis-india.org/internet-governance/pdp-bill-compilation-post-image/" alt="null" width="100%" /></p>
<p> </p>
<p>Below, you can find our other recent research on Data Protection:</p>
<p> </p>
<ul><li>Pallavi Bedi has put together a <a class="external-link" href="https://cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019">note</a> on the Divergence between EU’s General Data Protection Regulation (GDPR) and the Personal Data Protection Bill.</li></ul>
<div> </div>
<ul><li>In addition, Pallavi has also <a class="external-link" href="https://cis-india.org/internet-governance/blog/comparison-of-the-personal-data-protection-bill-with-the-general-data-protection-regulation-and-the-california-consumer-protection-act-2">contrasted</a> the Personal Data Protection Bill with the GDPR and California Consumer Protection Act, in the contexts of jurisdiction and scope, rights of the data principal, obligations of data fiduciaries, exemptions, data protection authority, and breach of personal data. </li></ul>
<div> </div>
<ul><li>On IAPP’s blog <em>Privacy Perspectives</em>, D. Shweta Reddy has <a class="external-link" href="https://iapp.org/news/a/grade-sheet-for-indias-adequacy-status/">assessed</a> whether the Personal Data Protection Bill 2019 is sufficient for India to receive adequacy status from the EU.</li></ul>
<div> </div>
<ul><li>Along with Justin Sherman, Arindrajit Basu has <a class="external-link" href="https://www.lawfareblog.com/key-global-takeaways-indias-revised-personal-data-protection-bill">outlined</a> the key global takeaways from the Personal Data Protection Bill 2019 on <em>Lawfare</em>.</li></ul>
<div> </div>
<ul><li>On <em>The Diplomat</em>, Arindrajit has also <a class="external-link" href="https://thediplomat.com/2020/01/the-retreat-of-the-data-localization-brigade-india-indonesia-and-vietnam/">traced</a> the narrowing localization provisions in India, as well as Vietnam and Indonesia, and studied the actors and geopolitical tussle that has shaped these provisions.</li></ul>
<div> </div>
<ul><li>Through a string of publicly available submissions, press statements, and other media reports, Arindrajit and Amber Sinha have <a class="external-link" href="https://www.epw.in/engage/article/politics-indias-data-protection-ecosystem">tracked</a> the political evolution of the data protection ecosystem in India, and how this has, and will continue to impact legislative and policy developments on <em>EPW Engage</em>.</li></ul>
<div> </div>
<ul><li>Gurshabad Grover and Tanaya Rajwade have <a class="external-link" href="https://thewire.in/tech/indias-privacy-bill-regulates-social-media-platforms">written</a> on <em>The Wire</em> about how the Personal Data Protection Bill regulates social media.</li></ul>
<div> </div>
<ul><li>Amber was also a guest on <em>Suno India’s <a class="external-link" href="https://www.sunoindia.in/cyber-democracy/personal-data-protection-bill-what-does-it-mean-for-your-right-to-privacy/">Cyber Democracy podcast</a></em>, with Srinivas Kodali, to discuss how the latest version of the Personal Data Protection Bill will impact the right to privacy.
</li></ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection'>http://editors.cis-india.org/internet-governance/blog/compilation-of-research-on-data-protection</a>
</p>
No publisher | pranav | internet governance, Internet Governance, Data Protection, Privacy | 2020-03-05T08:04:24Z | Blog Entry
Governing ID: Kenya’s Huduma Namba Programme
http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme
<b></b>
<p>In our fourth case-study, we use our Evaluation Framework for Digital ID to examine the use of Digital ID in Kenya.</p>
<p>Read the <a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/kenya.html">case-study</a> or download as <a href="http://editors.cis-india.org/internet-governance/digital-id-kenya-case-study" class="internal-link" title="Digital ID Kenya Case Study">PDF</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme'>http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme</a>
</p>
No publisher | amber | internet governance, Internet Governance, Digital ID, Digital Identity | 2020-03-02T13:19:15Z | Blog Entry
Governing ID: Use of Digital ID in the Healthcare Sector
http://editors.cis-india.org/internet-governance/blog/governing-id-use-of-digital-id-in-the-healthcare-sector
<b></b>
<p>In our third case-study, we use our Evaluation Framework for Digital ID to examine the use of Digital ID in the healthcare sector.</p>
<p><img src="https://cis-india.org/internet-governance/image-digital-id-healthcare-case-study/" alt="null" width="100%" /></p>
<p>Read the <a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/healthcare.html">case-study</a> or download as <a href="http://editors.cis-india.org/internet-governance/digital-id-healthcare-case-study" class="internal-link" title="Digital ID Healthcare Case Study">PDF</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/governing-id-use-of-digital-id-in-the-healthcare-sector'>http://editors.cis-india.org/internet-governance/blog/governing-id-use-of-digital-id-in-the-healthcare-sector</a>
</p>
No publisher | Shruti Trikanad | internet governance, Internet Governance, Digital ID, Digital Identity | 2020-03-02T13:21:22Z | Blog Entry
Governing ID: India’s Unique Identity Programme
http://editors.cis-india.org/internet-governance/governing-id-india2019s-unique-identity-programme
<b></b>
<div class="content">
<p>In our second case-study, we use our Evaluation Framework for Digital ID to assess India’s Unique Identity Programme.</p>
<p>Read the <a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/india.html">case-study</a> or download as <a href="http://editors.cis-india.org/internet-governance/digital-id-india-case-study" class="internal-link" title="Digital ID India Case Study">PDF</a>.</p>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/governing-id-india2019s-unique-identity-programme'>http://editors.cis-india.org/internet-governance/governing-id-india2019s-unique-identity-programme</a>
</p>
No publisher | Vrinda Bhandari | internet governance, Internet Governance, Digital ID, Digital Identity | 2020-03-02T11:38:51Z | Blog Entry
Governing ID: Use of Digital ID for Verification
http://editors.cis-india.org/internet-governance/blog/governing-id-2028use-of-digital-id-for-verification
<b></b>
<p>This is the first in a series of case studies, using our recently-published <a href="https://digitalid.design/evaluation-framework-02.html">Evaluation Framework for Digital ID</a>. It looks at the use of digital identity programmes for the purpose of verification, often using the process of deduplication.</p>
<p><img src="https://cis-india.org/internet-governance/image-governing-id-use-of-digital-id-for-verification/" alt="null" width="100%" /></p>
<p>Read the <a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/verification.html">case-study</a> or download as <a href="http://editors.cis-india.org/internet-governance/use-of-digital-id-for-verification" class="internal-link" title="Use of Digital ID for Verification">PDF</a>.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/governing-id-2028use-of-digital-id-for-verification'>http://editors.cis-india.org/internet-governance/blog/governing-id-2028use-of-digital-id-for-verification</a>
</p>
No publisher | Shruti Trikanad | internet governance, Internet Governance, Digital ID, Digital Identity | 2020-03-02T11:16:19Z | Blog Entry
Governing ID: A Framework for Evaluation of Digital Identity
http://editors.cis-india.org/internet-governance/blog/governing-id-a-framework-for-evaluation-of-digital-identity
<b></b>
<p>As governments across the globe implement new and foundational digital identification systems (Digital ID), or modernize existing ID programs, there is an urgent need for more research and discussion about appropriate uses of Digital ID systems. The significant momentum behind creating Digital ID has been accompanied by concerns about the privacy, surveillance and exclusion harms of state-issued Digital IDs in several parts of the world, resulting in campaigns and litigation in countries such as the UK, India, Kenya, and Jamaica. Given the sweeping range of considerations required to evaluate Digital ID projects, it is necessary to formulate evaluation frameworks that can be used for this purpose.</p>
<p>This work began with the question of what the appropriate uses of Digital ID can be, but through the research process it became clear that the question of use cannot be divorced from the fundamental attributes of Digital ID systems and their governance structures. This framework provides tests that can be used to evaluate the governance of Digital ID across jurisdictions, as well as to determine whether a particular use of Digital ID is legitimate. Through three kinds of checks — Rule of Law tests, Rights based tests, and Risks based tests — this scheme is a ready guide for the evaluation of Digital ID.</p>
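The framework's overall shape, three groups of yes/no tests applied to a Digital ID programme, can be sketched as a simple data structure; the questions below are made-up placeholders for illustration, not CIS's actual test wording:

```python
# A minimal sketch of the framework's three test groups.
# Question texts are invented placeholders, not the published tests.
framework = {
    "rule_of_law": [
        "Is the governing law enacted by the legislature?",
        "Is the law clear and accessible to the public?",
    ],
    "rights_based": [
        "Are privacy protections adequate?",
    ],
    "risk_based": [
        "Are exclusion harms mitigated?",
    ],
}

def evaluate(answers):
    """answers: {question: bool}. Returns the count of tests passed per group."""
    return {
        group: sum(answers.get(q, False) for q in questions)
        for group, questions in framework.items()
    }
```

A structure like this makes the framework's point concrete: a Digital ID programme is assessed question by question within each group, rather than by a single overall verdict.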
<p><img src="https://cis-india.org/internet-governance/image-governing-id-principles-for-evalution/" alt="null" width="100%" /></p>
<p> </p>
<p>View the <a class="external-link" href="https://digitalid.design/evaluation-framework-02.html">framework</a> or download as <a href="http://editors.cis-india.org/internet-governance/governing-id-principles-for-evalution" class="internal-link" title="Governing ID: Principles for Evalution">PDF</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/governing-id-a-framework-for-evaluation-of-digital-identity'>http://editors.cis-india.org/internet-governance/blog/governing-id-a-framework-for-evaluation-of-digital-identity</a>
</p>
No publisher | Vrinda Bhandari, Shruti Trikanad, and Amber Sinha | internet governance, Internet Governance, Digital ID, Digital Identity | 2020-03-02T13:22:43Z | Blog Entry
Governing ID: Introducing our Evaluation Framework
http://editors.cis-india.org/internet-governance/blog/governing-id-introducing-our-evaluation-framework
<b></b>
<div class="content">
<p>With the rise of national digital identity systems (Digital ID) across the world, there is a growing need to examine their impact on human rights. In several instances, national Digital ID programmes started with a specific scope of use, but have since been deployed for different applications, and in different sectors. This raises the question of how to determine appropriate and inappropriate uses of Digital ID. In April 2019, our research began with this question, but it quickly became clear that a determination of the legitimacy of uses hinged on the fundamental attributes and governing structure of the Digital ID system itself. Our evaluation framework is intended as a series of questions against which Digital ID may be tested. We hope that these questions will inform the trade-offs that must be made while building and assessing identity programmes, to ensure that human rights are adequately protected.</p>
<h4>Rule of Law Tests</h4>
<p>Foundational Digital ID must only be implemented along with a
legitimate regulatory framework that governs all aspects of Digital ID,
including its aims and purposes, the actors who have access to it, etc.
In the absence of this framework, there is nothing that precludes
Digital IDs from being leveraged by public and private actors for
purposes outside the intended scope of the programme. Our rule of law
principles mandate that the governing law should be enacted by the
legislature, be devoid of excessive delegation, be clear and accessible
to the public, and be precise and limiting in its scope for discretion.
These principles are substantiated by the criticism that the Kenyan
Digital ID, the Huduma Namba, faced when it was legalized through
a Miscellaneous Amendment Act, a vehicle meant only for small or negligible
amendments and typically passed without any deliberation. This set of
tests responds to the haste with which Digital ID has been implemented,
often in the absence of an enabling law which adequately addresses its
potential harms.</p>
<h4>Rights based Tests</h4>
<p>Digital ID, because of its collection of personal data and
determination of eligibility and rights of users, intrinsically involves
restrictions on certain fundamental rights. The use of Digital ID for
essential functions of the State, including the delivery of benefits and
welfare, and the maintenance of civil and sectoral records, enhances the
impact of these restrictions. Accordingly, the entire identity
framework, including its architecture, uses, actors, and regulators,
must be evaluated at every stage against the rights it is potentially
violating. Only then will we be able to determine if such violation is
necessary and proportionate to the benefits it offers. In Jamaica, the
National Identification and Registration Act, which mandated citizens’
biometric enrolment at the risk of criminal sanctions, was held to be a
disproportionate violation of privacy, and therefore unconstitutional.</p>
<h4>Risk based Tests</h4>
<p>Even with a valid rule of law framework that seeks to protect
rights, the design and use of Digital ID must be based on an analysis of
the risks that the system introduces. This could take the form of
choosing between a centralized and federated data-storage framework,
based on the effects of potential failure or breach, or of restricting
the uses of the Digital ID to limit the actors that will benefit from
breaching it. Aside from the design of the system, the regulatory
framework that governs it should also be tailored to the potential risks
of its use. The primary rationale behind a risk assessment for an
identity framework is that it should be tested not merely against
universal metrics of legality and proportionality, but also against an
examination of the risks and harms it poses. Implicit in a risk based
assessment is also the requirement of implementing a responsive
mitigation strategy to the risks identified, both while creating and
governing the identity programme.</p>
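<p>The design reasoning above can be made concrete with a back-of-the-envelope model (our own illustration, not taken from the framework): the choice between centralised and federated data storage changes how many records a single breach exposes. The population figure and the even-split assumption are hypothetical.</p>

```python
# Hypothetical toy model: worst-case records exposed by a single breach
# under centralised vs federated storage, assuming records are split
# evenly across independent stores and exactly one store is compromised.

def records_exposed_by_one_breach(total_records: int, shards: int) -> int:
    """Records exposed when one of `shards` equal, independent stores is breached."""
    return total_records // shards

population = 1_000_000_000  # illustrative scale of a national ID database

# A single central database puts every record behind one breach.
print(records_exposed_by_one_breach(population, shards=1))   # 1000000000

# A federated design with 50 independent stores limits the blast radius.
print(records_exposed_by_one_breach(population, shards=50))  # 20000000
```

<p>The model is deliberately crude: real risk assessment would also weigh breach likelihood, correlated failures, and the value of the data in each store, but even this sketch shows why storage architecture belongs in a risk-based test.</p>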
<p>Digital ID programmes create an inherent power imbalance
between the State and its residents because of the personal data they
collect and the consequent determination of significant rights,
potentially creating risks of surveillance, exclusion, and
discrimination. The accountability and efficiency gains they promise
must not lead to hasty or inadequate implementation.</p>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/governing-id-introducing-our-evaluation-framework'>http://editors.cis-india.org/internet-governance/blog/governing-id-introducing-our-evaluation-framework</a>
</p>
No publisher · Shruti Trikanad · Internet Governance · Digital ID · Digital Identity · 2020-03-02T08:03:49Z · Blog Entry
Divergence between the General Data Protection Regulation and the Personal Data Protection Bill, 2019
http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019
<b></b>
<p>Our note on the divergence between the General Data Protection Regulation and the Personal Data Protection Bill can be downloaded as a PDF <a href="http://editors.cis-india.org/internet-governance/divergence-between-the-gdpr-and-pdp-bill-2019" class="internal-link" title="Divergence between the GDPR and PDP Bill 2019">here</a>.</p>
<p>The European Union’s General Data
Protection Regulation (GDPR), replacing the 1995 EU Data Protection Directive,
came into effect in May 2018. It harmonises the data protection regulations
across the European Union. In India, the Ministry of Electronics and
Information Technology had constituted a Committee of Experts (chaired by
Justice Srikrishna) to frame recommendations for a data protection framework in
India. The Committee submitted its report and a draft Personal Data Protection
Bill in July 2018 (2018 Bill). Public comments were sought on the bill till
October 2018. The Central Government revised the Bill and introduced the
revised version of the Personal Data Protection Bill (PDP Bill) on December 11,
2019 in the Lok Sabha.</p>
<p>The PDP Bill has incorporated certain
aspects of the GDPR, such as requirements for notice to be given to the data
principal, consent for processing of data, establishment of a data protection
authority, etc. However, there are some differences, and in this note we have highlighted
the areas of divergence between the two. The note only covers
provisions which are common to the GDPR and the PDP Bill; it does not cover
the provisions on (i) the Appellate Tribunal, (ii) Finance, Account and Audit; and
(iii) Non-Personal Data. </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019'>http://editors.cis-india.org/internet-governance/blog/divergence-between-the-general-data-protection-regulation-and-the-personal-data-protection-bill-2019</a>
</p>
No publisher · Pallavi Bedi · Internet Governance · Data Protection · Privacy · 2020-02-21T11:08:50Z · Blog Entry
Content takedown and users' rights
http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1
<b>After Shreya Singhal v Union of India, commentators have continued to question the constitutionality of the content takedown regime under Section 69A of the IT Act (and the Blocking Rules issued under it). There has also been considerable debate around how the judgement has changed this regime: specifically about (i) whether originators of content are entitled to a hearing, (ii) whether Rule 16 of the Blocking Rules, which mandates confidentiality of content takedown requests received by intermediaries from the Government, continues to be operative, and (iii) the effect of Rule 16 on the rights of the originator and the public to challenge executive action. In this opinion piece, we attempt to answer some of these questions.</b>
<p style="text-align: justify;" class="normal"> </p>
<p style="text-align: justify;" class="normal">This article was first <a class="external-link" href="https://theleaflet.in/content-takedown-and-users-rights/">published</a> at the Leaflet. It has subsequently been republished by <a class="external-link" href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content">Scroll.in</a>, <a class="external-link" href="https://kashmirobserver.net/2020/02/15/content-takedown-and-users-rights/">Kashmir Observer</a> and the <a class="external-link" href="https://cyberbrics.info/content-takedown-and-users-rights/">CyberBRICS blog</a>. </p>
<p style="text-align: justify;" class="normal"><strong><br /></strong></p>
<p style="text-align: justify;" class="normal"><strong>Introduction</strong></p>
<p style="text-align: justify;" class="normal">Last year, several Jio users from different states <a href="https://www.medianama.com/2019/03/223-indiankanoon-jio-block/">reported</a> that sites like Indian Kanoon, Reddit and Telegram were inaccessible through their connections. While attempting to access these websites, the users were presented with a notice that the websites were blocked on orders from the Department of Telecommunications (DoT). When contacted by the founder of Indian Kanoon, Reliance Jio <a href="https://in.reuters.com/article/us-india-internet-idINKCN1RF14D">stated</a> that the website had been blocked on orders of the government, and that the order had been rescinded the same evening. However, in response to a Right to Information (RTI) request, the DoT <a href="https://twitter.com/indiankanoon/status/1218193372210323456">said</a> they had no information about orders relating to the blocking of Indian Kanoon.</p>
<p style="text-align: justify;" class="normal">Similarly, consider that the Committee to Protect Journalists (CPJ) <a href="https://cpj.org/blog/2019/10/india-opaque-legal-process-suppress-kashmir-twitter.php">expressed concern</a> last year that the Indian government was forcing Twitter to suspend accounts or remove content relating to Kashmir. It reported that over the last two years, the Indian government had suppressed a substantial amount of information coming from the area, and prevented Indians from accessing more than five thousand tweets.</p>
<p style="text-align: justify;" class="normal">These instances are <a href="https://www.hindustantimes.com/analysis/to-preserve-freedoms-online-amend-the-it-act/story-aC0jXUId4gpydJyuoBcJdI.html">symptomatic</a> of a larger problem of opaque and arbitrary content takedown in India, enabled by the legal framework under the Information Technology (IT) Act. The Government derives its powers to order intermediaries (entities storing or transmitting information on behalf of others, a definition which includes internet service providers and social media platforms alike) to block online resources through <a href="https://indiankanoon.org/doc/10190353/">section 69A</a> of the IT Act and the <a href="https://meity.gov.in/writereaddata/files/Information%20Technology%20%28%20Procedure%20and%20safeguards%20for%20blocking%20for%20access%20of%20information%20by%20public%29%20Rules%2C%202009.pdf">rules</a> [“the blocking rules”] notified thereunder. Apart from this, <a href="https://indiankanoon.org/doc/844026/">section 79</a> of the IT Act and its allied rules also prescribe a procedure for content removal. <a href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">Conversations</a> with one popular intermediary revealed that the government usually prefers to use its powers under section 69A, possibly because of the opaque nature of the procedure that we highlight below.</p>
<p style="text-align: justify;" class="normal">Under section 69A, a content removal request can be sent by authorised personnel in the Central Government not below the rank of a Joint Secretary. The grounds for issuance of blocking orders under section 69A are: “<em>the interest of the sovereignty and integrity of India, defence of India, the security of the state, friendly relations with foreign states or public order or for preventing incitement to the commission of any cognisable offence relating to the above.</em>” Specifically, the blocking rules envisage the process of blocking to be largely executive-driven, and require strict confidentiality to be maintained around the issuance of blocking orders. This shrouds content takedown orders in a cloak of secrecy, and makes it impossible for users and content creators to ascertain the legitimacy or legality of the government action in any instance of blocking.</p>
<p style="text-align: justify;" class="normal"><strong>Issues</strong></p>
<p style="text-align: justify;" class="normal">The Supreme Court was called upon to determine the constitutional validity of section 69A and the allied rules in <a href="https://indiankanoon.org/doc/110813550/"><em>Shreya Singhal v Union of India</em></a>. The petitioners had contended that as per the procedure laid down by these rules, there was no guarantee of a pre-decisional hearing afforded to the originator of the information. Additionally, the petitioners pointed out that the safeguards built into sections 95 and 96 of the Code of Criminal Procedure (CrPC), which respectively allow state governments to ban publications and persons to initiate legal challenges to those actions, were absent from the blocking procedures. Lastly, the petitioners assailed rule 16 of the blocking rules, which mandated confidentiality of blocking procedures, on the grounds that it was affecting their fundamental rights.</p>
<p style="text-align: justify;" class="normal">The Court, however, found little merit in these arguments. Specifically, the Court found that section 69A was narrowly drawn and had sufficient procedural safeguards, which included the grounds for issuance of a blocking order being specifically drawn, and the mandate that the reasons for website blocking be recorded in writing, thus making it amenable to judicial review. Further, the Court also found that the provision for setting up a review committee saved the law from constitutional infirmity. In the Court’s opinion, the mere absence of additional safeguards, such as the ones built into the CrPC, did not mean that the law was unconstitutional.</p>
<p style="text-align: justify;" class="normal">But do the ground realities align with the Court’s envisaged implementation of these principles? Apar Gupta, a counsel for the petitioners, <a href="https://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">pointed</a> out that there was no recorded instance of pre-decisional hearing being granted to show that this safeguard contained in the rules was actually being implemented. However, Gautam Bhatia <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">read</a> <em>Shreya Singhal </em>to make an important advance: that the right of hearing be mandatorily extended to the ‘originator’, i.e. the content creator.</p>
<p style="text-align: justify;" class="normal">Additionally, Bhatia also noted that the Court, while upholding the constitutionality of the procedure under section 69A, held that the “<em>reasons have to be recorded in writing in such blocking order so that they may be assailed in a writ petition under Article 226 of the Constitution.</em>”</p>
<p style="text-align: justify;" class="normal">There are two important takeaways from this. <em>Firstly</em>, he argued that the broad contours of the judgment invoke an established constitutional doctrine — that the fundamental right under Article 19(1)(a) does not merely include the right of expression, but also the <em>right of access to information. </em>Accordingly, the right of challenging a blocking order was not only vested in the originator or the concerned intermediary, but may rest with the general public as well. And <em>secondly</em>, by the doctrine of necessary implication, it followed that for the general public to challenge any blocking order under Article 226, the blocking orders must be made public. While Bhatia concedes that public availability of blocking orders may be an over-optimistic reading of the judgment, recent events suggest that even the commonly-expected result, i.e. that the content creators having the right to a hearing, has not been implemented by the Government.</p>
<p style="text-align: justify;" class="normal">Consider the <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">blocking</a> of the satirical website DowryCalculator.com in September 2019 on orders from the government. The website displayed a calculator that suggests a ‘dowry’ depending on the salary and education of a prospective groom: even if someone misses the satire, the contents of the website do not obviously relate to any of the grounds for removal listed under section 69A of the IT Act.</p>
<p style="text-align: justify;" class="normal">Tanul Thakur, the creator of the website, was not granted a hearing despite the fact that he had publicly claimed ownership of the website at various times and that the website had been covered widely by the press. The information associated with the domain name also publicly lists Thakur’s name and contact information. Clearly, the government made no effort to contact Thakur when passing the order. Perhaps even more worryingly, when he <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">tried</a> to access a copy of the blocking order by filing an RTI request, MeitY cited the confidentiality rule to deny him the information.</p>
<p style="text-align: justify;" class="normal">This incident documents a fundamental problem plaguing the rules: the confidentiality clause is still being used to deny disclosure of key information on content takedown orders. The government has also used the provision to deny citizens a list of blocked websites, as responses to RTI requests have proven <a href="https://cis-india.org/internet-governance/blog/rti-application-to-bsnl-for-the-list-of-websites-blocked-in-india">time</a> and <a href="https://sflc.in/deity-provides-list-sites-blocked-2013-withholds-orders">again</a>.</p>
<p style="text-align: justify;" class="normal">Clearly, the Supreme Court’s rationale in considering section 69A and the blocking rules as constitutional is not one that is implemented in reality. The confidentiality clause is preventing legal challenges to content blocking in totality: content creators are unable to access the orders, and hence are unable to understand the executive’s reasoning in ordering their content to be blocked from public access.</p>
<p style="text-align: justify;" class="normal">As we noted earlier, the grounds of issuing a blocking order under section 69A pertain to certain reasonable restrictions on expression permitted by Article 19(2), which are couched in broad terms. The government’s implementation of section 69A and the rules make it impossible for any judicial review or accountability on the conformity of blocking orders with the mentioned grounds under the rules, or any reasonable restriction at all.</p>
<p style="text-align: justify;" class="normal"><strong>The Way Forward</strong></p>
<p style="text-align: justify;" class="normal">From the opacity of proceedings under the law, to the lack of information regarding the same on public domain, the Indian content takedown regime leaves a lot to be desired from both the government and intermediaries at play. </p>
<p style="text-align: justify;" class="normal"><em>First</em>, we believe the Supreme Court’s decision in <em>Shreya Singhal v. Union of India</em> casts an obligation on the government to attempt to contact the content creator when passing a content takedown order to an intermediary. <em>Second</em>, even if the content creator is unavailable for a hearing at that instance, the confidentiality clause should not be used to prevent future disclosure of information to the content creator, so that affected citizens can access and challenge these orders.</p>
<p style="text-align: justify;" class="normal">While we wait for legal reform, intermediaries can also step up to ensure the rights of users online are upheld. On receiving formal orders, intermediaries should <a href="https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass">assess</a> the legality of the received request. This should involve ensuring that only authorised agencies and personnel have sent the content removal orders, that the order specifically mentions what provision the government is exercising the power under, and that the content removal requests relate to the grounds of removal that are permissible under section 69A. For instance, intermediaries should refuse to entertain content removal requests under section 69A of the IT Act if they relate to obscenity, a ground not covered by the provision.</p>
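<p>The checks described above can be sketched as a simple validation routine. This is our own illustration, not an existing tool: the <code>TakedownOrder</code> structure and its field names are hypothetical, while the list of grounds is taken from the text of section 69A quoted earlier in this article.</p>

```python
# Hypothetical sketch of an intermediary's legality checklist for a
# section 69A blocking order; the TakedownOrder structure and field
# names are illustrative, not a real API.
from dataclasses import dataclass

# Grounds enumerated in section 69A of the IT Act (as quoted above).
PERMISSIBLE_GROUNDS = {
    "sovereignty and integrity of india",
    "defence of india",
    "security of the state",
    "friendly relations with foreign states",
    "public order",
    "incitement to a cognisable offence",
}

# Orders may be issued by officials not below the rank of Joint Secretary.
AUTHORISED_RANKS = {"joint secretary", "additional secretary", "secretary"}

@dataclass
class TakedownOrder:
    issuer_rank: str      # rank of the issuing official
    legal_provision: str  # provision cited, e.g. "section 69A"
    stated_ground: str    # ground invoked for blocking

def check_order(order: TakedownOrder) -> list:
    """Return a list of objections; an empty list means the basic checks pass."""
    objections = []
    if order.issuer_rank.lower() not in AUTHORISED_RANKS:
        objections.append("issuer not authorised under the blocking rules")
    if "69a" not in order.legal_provision.lower():
        objections.append("order does not cite the provision invoked")
    if order.stated_ground.lower() not in PERMISSIBLE_GROUNDS:
        objections.append("ground not among those permitted by section 69A")
    return objections

# Example: obscenity is not a ground covered by section 69A.
order = TakedownOrder("Joint Secretary", "section 69A", "obscenity")
print(check_order(order))  # ['ground not among those permitted by section 69A']
```

<p>A real assessment would of course involve legal judgment rather than string matching, but the sketch captures the three mechanical checks: authorised issuer, cited provision, and a permissible ground.</p>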
<p style="text-align: justify;" class="normal">The representatives of the intermediary should also push for the committee to grant a hearing to the content creator. Here, the intermediary can act as a liaison between the uploader and the governmental authorities.</p>
<p style="text-align: justify;" class="normal">The Supreme Court’s recent decision in <a href="https://indiankanoon.org/doc/82461587/"><em>Anuradha Bhasin v. Union of India</em></a><em> </em>offers a glimmer of hope for user rights online<em>. </em>While the case primarily challenged the orders imposing section 144 of the CrPC and a communication blockade in Jammu and Kashmir, the final decision does affirm the fundamental principle that government-imposed restrictions on the freedom of expression and assembly must be made available to the public and affected parties to enable challenges in a court of law.</p>
<p style="text-align: justify;" class="normal"> The judiciary has yet another opportunity to consider the provision and the rules: late last year, Tanul Thakur <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">approached</a> the Delhi High Court to challenge the orders passed by the government to ISPs to block his website. One hopes that the future holds robust reforms to the content takedown regime.</p>
<p style="text-align: justify;" class="normal"> We live in an era where the ebb and flow of societal discourse is increasingly channeled through intermediaries on the internet. In the absence of a mature, balanced and robust framework that enshrines the rule of law, we risk arbitrary modulation of the marketplace of ideas by the executive.</p>
<p style="text-align: justify;" class="normal"><em> </em></p>
<p style="text-align: justify;" class="normal"><em>Torsha Sarkar and Gurshabad Grover are researchers at the Centre for Internet and Society.</em></p>
<p style="text-align: justify;" class="normal"><em>Disclosure: The Centre for Internet and Society is a recipient of research grants from Facebook and Google.</em></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1'>http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1</a>
</p>
No publisher · Torsha Sarkar, Gurshabad Grover · Internet Freedom · Internet Governance · Intermediary Liability · Censorship · 2020-02-17T05:18:25Z · Blog Entry
Comments to the Personal Data Protection Bill 2019
http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019
<b>The Personal Data Protection Bill, 2019 was introduced in the Lok Sabha on December 11, 2019. </b>
<p> </p>
<h4>Please view our general comments below, or download as PDF <a href="http://editors.cis-india.org/accessibility/blog/cis-general-comments-to-the-pdp-bill-2019" class="internal-link" title="CIS' General Comments to the PDP Bill 2019">here</a>.</h4>
<h4>Our comments and recommendations can be downloaded as PDF <a href="http://editors.cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019" class="internal-link" title="CIS Comments PDP Bill 2019">here</a>.</h4>
<h4>We have also prepared an annotated version of the Bill, where our detailed comments and recommendations can be viewed alongside the Bill, available as PDF <a href="http://editors.cis-india.org/accessibility/blog/annotated-ver-pdp-bill-2019" class="internal-link" title="Annotated ver PDP Bill 2019">here</a>.</h4>
<hr />
<h2>General Comments</h2>
<h3>1. Executive notification cannot abrogate fundamental rights <br /></h3>
<p>In 2017, the Supreme Court in K.S. Puttaswamy v Union of India [1] held the right to privacy to be a fundamental right. While this right is subject to reasonable restrictions, the restrictions have to meet a three-fold requirement, namely (i) the existence of a law; (ii) a legitimate state aim; and (iii) proportionality. Under the 2018 Bill, the exemption of government agencies from the provisions of the Bill, for processing of personal data in the ‘interest of the security of the State’ [2], was subject to a law being passed by Parliament. However, under Clause 35 of the present Bill, the Central Government is merely required to pass a written order exempting the government agency from the provisions of the Bill. Any restriction on the right to privacy will have to comply with the conditions prescribed in Puttaswamy I. An executive order issued by the central government authorising any agency of the government to process personal data does not satisfy the first requirement laid down by the Supreme Court in Puttaswamy I — as it is not a law passed by Parliament. The Supreme Court, while deciding upon the validity of Aadhaar in K.S. Puttaswamy v Union of India [3], noted that “an executive notification does not satisfy the requirement of a valid law contemplated under Puttaswamy. A valid law in this case would mean a law passed by Parliament, which is just, fair and reasonable. Any encroachment upon the fundamental right cannot be sustained by an executive notification.”</p>
<p> </p>
<h3>2. Exemptions under Clause 35 do not comply with the legitimacy and proportionality test</h3>
<p>The lead judgement in Puttaswamy I, while formulating the three-fold test, held that the restraint on privacy emanates from the procedural and content-based mandate of Article 21 [4]. The Supreme Court in Maneka Gandhi v Union of India [5] had clearly established that “mere prescription of some kind of procedure cannot ever meet the mandate of Article 21. The procedure prescribed by law has to be fair, just and reasonable, not fanciful, oppressive and arbitrary” [6]. The existence of a law is the first requirement; the second requirement is that of a ‘legitimate state aim’. As per the lead judgement, this requirement ensures that “the nature and content of the law which imposes the restriction falls within the zone of reasonableness mandated by Article 14, which is a guarantee against arbitrary state action” [7]. It is established that for a provision which confers discretionary powers upon the executive or an administrative authority to be regarded as non-arbitrary, the provision should lay down clear and specific guidelines for the exercise of that power [8]. The third test to be complied with is that the restriction should be ‘proportionate,’ i.e. the means adopted by the legislature must be proportional to the object and needs sought to be fulfilled by the law. The Supreme Court in Modern Dental College & Research Centre v State of Madhya Pradesh [9] specified the components of proportionality standards —</p>
<ol><li>A measure restricting a right must have a legitimate goal;</li>
<li>It must be a suitable means of furthering this goal;</li>
<li>There must not be any less restrictive, but equally effective alternative; and</li>
<li>The measure must not have any disproportionate impact on the right holder</li></ol>
<p>Clause 35 provides extensive grounds for the Central Government to exempt any agency from the requirements of the Bill, but does not specify the procedure to be followed by the agency while processing personal data under this provision. It merely states that the ‘procedure, safeguards and oversight mechanism to be followed’ will be prescribed in the rules. The wide powers conferred on the central government without clearly specifying the procedure may be contrary to the three-fold test laid down in Puttaswamy I, as it is difficult to ascertain whether a legitimate or proportionate objective is being fulfilled [10].</p>
<p> </p>
<h3>3. Limited powers of Data Protection Authority in comparison with the Central Government</h3>
<p>In comparison with the previous version, the Personal Data Protection Bill, 2018 prepared by the Committee of Experts led by Justice Srikrishna, this Bill curtails the powers of the Data Protection Authority (Authority) that is to be created. Powers and functions that were originally intended to be performed by the Authority have now been allocated to the Central Government. For example:</p>
<ol><li>In the 2018 Bill, the Authority had the power to notify further categories of sensitive personal data. Under the present Bill, the Central Government, in consultation with the sectoral regulators, has been conferred this power.</li>
<li>Under the 2018 Bill, the Authority had the sole power to determine and notify significant data fiduciaries; however, under the present Bill, the Central Government, in consultation with the Authority, has been given the power to notify social media intermediaries as significant data fiduciaries.</li></ol>
<p>In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate and resources. The political nature of personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Authority, be independent of the Executive.</p>
<p> </p>
<h3>4. No clarity on data sandbox</h3>
<p>The Bill contemplates a sandbox for “innovation in artificial intelligence, machine-learning or any other emerging technology in public interest.” A data sandbox is a non-operational environment where an analyst can model and manipulate data inside the data management system. Data sandboxes have been envisioned as a secure area where only a copy of the company’s or participant companies’ data is located [11]. In essence, it is a scalable platform which can be used to explore an enterprise’s information sets. Regulatory sandboxes, on the other hand, are controlled environments where firms can introduce innovations to a limited customer base within a relaxed regulatory framework, after which they may be allowed entry into the larger market after meeting certain conditions. This purportedly encourages innovation by lowering entry barriers and protecting newer entrants from unnecessary and burdensome regulation. Regulatory sandboxes can be interpreted as a form of responsive regulation by governments that seek to encourage innovation: selected companies are allowed to experiment with solutions in an environment relatively free of most of the cumbersome regulations they would ordinarily be subject to, while still remaining subject to some appropriate safeguards and regulatory requirements. Ordinarily, however, the burdens relaxed in such sandboxes relate to high barriers to entry (such as capital requirements for financial and banking companies) or regulatory costs. In this Bill, by contrast, the relaxing of data protection provisions for data fiduciaries would lead to restrictions on the privacy of individuals. Limiting a fundamental right on the ground of ‘fostering innovation’ is not a constitutionally tenable position, and contradicts the primary objectives of a data protection law.</p>
<p> </p>
<h3>5. The primacy of ‘harm’ in the Bill ought to be reconsidered</h3>
<p>While a harms-based approach is necessary for data protection frameworks, such approaches should be restricted to the positive obligations, penal provisions and responsive regulation of the Authority. The Bill does not provide any guidance on either the interpretation of the term ‘harm’ [12] or on the various activities covered within the definition of the term. Terms such as ‘loss of reputation or humiliation’ and ‘any discriminatory treatment’ are subjective standards and are open to varied interpretations. This ambiguity in the definition will make it difficult for the data principal to demonstrate harm and for the DPA to take necessary action, as several provisions are based upon harm being caused or likely to be caused. Some of the significant provisions where ‘harm’ is a precondition for the provision to come into effect are —</p>
<ol><li>Clause 25: Data Fiduciary is required to notify the Authority about the breach of personal data processed by the data fiduciary, if such breach is likely to cause harm to any data principal. The Authority after taking into account the severity of the harm that may be caused to the data principal will determine whether the data principal should be notified about the breach.</li>
<li>Clause 32 (2): A data principal can file a complaint with the data fiduciary for a contravention of any of the provisions of the Act which has caused or is likely to cause ‘harm’ to the data principal.</li><li>Clause 64 (1): A data principal who has suffered harm as a result of any violation of the provisions of the Act by a data fiduciary has the right to seek compensation from the data fiduciary.</li><li>Clause 16 (5): The guardian data fiduciary is barred from profiling, tracking or undertaking targeted advertising directed at children, and from undertaking any other processing of personal data that can cause significant harm to the child.</li></ol>
<p> </p>
<h3>6. Non personal data should be outside the scope of this Bill</h3>
<p>Clause 91 (1) states that the Act does not prevent the Central Government from framing a policy for the digital economy, in so far as such policy does not govern personal data. The Central Government can, in consultation with the Authority, direct any data fiduciary to provide any anonymised personal data or other non-personal data to enable better targeting of delivery of services or formulation of evidence-based policies, in any manner as may be prescribed. It is concerning that the data protection bill has specifically carved out an exception for the Central Government to frame policies for the digital economy; this seems to indicate that the government plans to freely use any and all anonymised and/or non-personal data that rests with any data fiduciary falling under the ambit of the bill to support the digital economy, including for its growth, security, integrity, and prevention of misuse. It is unclear how the government, in practice, will be able to compel organisations to share this data. Further, there is a lack of clarity on the contours of the definition of non-personal data, and the Bill does not define the term. It is also unclear whether the Central Government can compel the data fiduciary to transfer or share all forms of non-personal data, and what rights and obligations the data fiduciaries and data principals have over such forms of data. Anonymised data refers to data which has ‘irreversibly’ been converted into a form in which the data principal cannot be identified. However, as several instances have shown, ‘irreversible’ anonymisation is not possible. In the United States, the home addresses of taxi drivers were uncovered, and in Australia individual health records were mined from anonymised medical bills [13]. 
In September 2019, the Ministry of Electronics and Information Technology constituted an expert committee under the chairmanship of Kris Gopalakrishnan to study various issues relating to non-personal data and to deliberate over a data governance framework for the regulation of such data. The provision should be deleted, and the scope of the bill should be limited to the protection of personal data and to providing a framework for the protection of individual privacy. Until the report of the expert committee is published, the Central Government should not frame any law or regulation on the access and monetisation of non-personal/anonymised data, nor should it create a blanket provision allowing it to request such data from any data fiduciary that falls within the ambit of the bill. If the government wishes to use data resting with a data fiduciary, it must do so on a case-by-case basis and under formal and legal agreements with each data fiduciary.</p>
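<p>The re-identification incidents cited above follow a common pattern, sometimes called a linkage attack: a dataset stripped of direct identifiers is joined with a public auxiliary dataset on shared quasi-identifiers. The sketch below is purely illustrative; every record, field and name in it is invented.</p>

```python
# Illustrative linkage attack: "anonymised" records (names removed) are
# re-identified by joining them with a public dataset on quasi-identifiers.
# All data here is fabricated for illustration.

# "Anonymised" medical billing records: names stripped, quasi-identifiers kept.
anonymised_bills = [
    {"zip": "560001", "birth_year": 1984, "sex": "F", "diagnosis": "diabetes"},
    {"zip": "110017", "birth_year": 1990, "sex": "M", "diagnosis": "asthma"},
]

# Public auxiliary data (e.g. an electoral roll) sharing the same fields.
public_roll = [
    {"name": "A. Kumar", "zip": "110017", "birth_year": 1990, "sex": "M"},
    {"name": "R. Devi", "zip": "560001", "birth_year": 1984, "sex": "F"},
]

def reidentify(bills, roll):
    """Join the two datasets on (zip, birth_year, sex) to recover identities."""
    matches = []
    for bill in bills:
        for person in roll:
            if all(bill[k] == person[k] for k in ("zip", "birth_year", "sex")):
                matches.append({"name": person["name"],
                                "diagnosis": bill["diagnosis"]})
    return matches

print(reidentify(anonymised_bills, public_roll))
# Each "anonymised" record is linked back to a named individual.
```

<p>With only three quasi-identifiers, each "anonymised" record maps uniquely to a named person, which is why removing names alone does not make anonymisation irreversible.</p>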
<p> </p>
<h3>7. Steps towards greater decentralisation of power</h3>
<p>We propose the following steps towards greater decentralisation of powers and devolved jurisdiction —</p>
<ol><li>Creation of State Data Protection Authorities: A single centralised body may not be the appropriate form of such a regulator. We propose that on the lines of central and state commissions under the Right to Information Act, 2005, state data protection authorities are set up which are in a position to respond to local complaints and exercise jurisdiction over entities within their territorial jurisdictions.</li>
<li>More involvement of industry bodies and civil society actors: In order to lessen the burden on the data protection authorities, it is necessary that there is active engagement with industry bodies, sectoral regulators and civil society bodies engaged in privacy research. Currently, the Bill provides for the involvement of industry or trade associations, associations representing the interests of data principals, sectoral regulators or statutory authorities, or departments or ministries of the Central or State Government in the formulation of codes of practice. However, it would be useful to also have more active participation of industry associations and civil society bodies in activities such as promoting awareness among data fiduciaries of their obligations under this Act, promoting measures and undertaking research for innovation in the field of protection of personal data.</li></ol>
<p> </p>
<h3>8. The Authority must be empowered to exercise responsive regulation</h3>
<p>In a country like India, the challenge is to move rapidly from a state of little or no data protection law, and consequently an abysmal state of data privacy practices, to a strong data protection regulation and a powerful regulator capable of enabling robust data privacy practices. This requires a system of supportive mechanisms for the stakeholders in the data ecosystem, as well as systemic measures which enable the proactive detection of breaches. Further, keeping in mind the limited regulatory capacity in India, there is a need for the Authority to make use of different kinds of inexpensive and innovative strategies. We recommend the following additional powers for the Authority to be clearly spelt out in the Bill —</p>
<ol><li>Informal Guidance: It would be useful for the Authority to set up a mechanism on the lines of the Security and Exchange Board of India (SEBI)’s Informal Guidance Scheme, which enables regulated entities to approach the Authority for non-binding advice on the position of law. Given that this is the first omnibus data protection law in India, and there is very little jurisprudence on the subject from India, it would be extremely useful for regulated entities to get guidance from the regulator.</li>
<li>Power to name and shame: When a DPA makes public the names of organisations that have seriously contravened data protection legislation, this is a practice known as “naming and shaming.” The UK ICO and other DPAs recognise the power of publicity, as evidenced by their willingness to co-operate with the media. The ICO does not simply post monetary penalty notices (MPNs, or fines) on its website for journalists to find, but frequently issues press releases, briefs journalists and uses social media. The ICO’s publicity statement on communicating enforcement activities states that the “ICO aims to get media coverage for enforcement activities.”</li>
<li>Undertakings: The UK ICO has also leveraged the threat of fines into an alternative enforcement mechanism, seeking contractual undertakings from data controllers to take certain remedial steps. Undertakings have significant advantages for the regulator. Since an undertaking is a more “co-operative” solution, it is less likely that a data controller will challenge it. An undertaking is simpler and easier to put in place, and the Authority can put one in place quickly, as opposed to legal proceedings, which take longer.</li></ol>
<p> </p>
<h3>9. No clear roadmap for the implementation of the Bill</h3>
<p>The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified [14]. It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. The present Bill does not specify any such blueprint; it provides no details on either when the Bill will be notified or the time period within which the Authority shall be established and specific rules and regulations notified. Considering that 25 provisions have been deferred to rules to be framed by the Central Government, and a further 19 provisions to regulations to be notified by the Authority, the absence or delayed notification of such rules and regulations will impact the effective functioning of the Bill. The absence of any sunrise or sunset provision may also disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud, but was a virtually defunct body from 2011 onwards, when the last chairperson retired; it was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. We recommend that the Bill clearly lay out a time period for the implementation of its different provisions, especially a time frame for the establishment of the Authority. This is important to give full and effective effect to the right to privacy of the individual. 
It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. For offences, we suggest a system of mail boxing, where provisions and punishments are enforced in a staggered manner until fiduciaries are aligned with the provisions of the Act. The Authority must ensure that data principals and fiduciaries have sufficient awareness of the provisions of this Bill before the provisions for punishment are brought into force. This will allow data fiduciaries to align their practices with the new legislation, and will also give the Authority time to define and determine the provisions that the Bill has left to it. Additionally, penalties for offences should initially be enforced in a staggered process, combined with measures such as warnings, so that first-time and mistaken offenders do not pay a high price. This will relieve the fears of smaller companies and startups that might otherwise avoid processing data for fear of paying penalties.</p>
<p> </p>
<h3>10. Lack of interoperability</h3>
<p>In its current form, a number of the provisions in the Bill will make it difficult for India’s framework to be interoperable with other frameworks globally and in the region. For example, differences between the draft Bill and the GDPR can be found in the grounds for processing, data localization frameworks, the framework for cross border transfers, definitions of sensitive personal data, inclusion of the undefined category of ‘critical data’, and the roles of the authority and the central government.</p>
<p> </p>
<h3>11. Legal Uncertainty</h3>
<p>In its current structure, there are a number of provisions in the Bill that, when implemented, run the risk of creating an environment of legal uncertainty. These include: lack of definition of critical data, lack of clarity in the interpretation of the terms ‘harm’ and ‘significant harm’, ability of the government to define further categories of sensitive personal data, inclusion of requirements for ‘social media intermediaries’, inclusion of ‘non-personal data’, framing of the requirements for data transfers, bar on processing of certain forms of biometric data as defined by the Central Government, the functioning between a consent manager and another data fiduciary, the inclusion of an AI sandbox and the definition of state. To ensure the greatest amount of protection of individual privacy rights and the protection of personal data while also enabling innovation, it is important that any data protection framework is structured and drafted in a way to provide as much legal certainty as possible.</p>
<p> </p>
<h3>Endnotes</h3>
<p>1. (2017) 10 SCC 641 (“Puttaswamy I”).</p>
<p>2. Clause 42(1) of the 2018 Bill states that “Processing of personal data in the interests of the security of the State shall not be permitted unless it is authorised pursuant to a law, and is in accordance with the procedure established by such law, made by Parliament and is necessary for, and proportionate to such interests being achieved.”</p>
<p>3. (2019) 1 SCC 1 (“Puttaswamy II”)</p>
<p>4. Puttaswamy I, supra, para 180.</p>
<p>5. (1978) 1 SCC 248.</p>
<p>6. Ibid para 48.</p>
<p>7. Puttaswamy I supra para 180.</p>
<p>8. State of W.B. v. Anwar Ali Sarkar, 1952 SCR 284; Satwant Singh Sawhney v A.P.O AIR 1967 SC1836.</p>
<p>9. (2016)7 SCC 353.</p>
<p>10. Dvara Research “Initial Comments of Dvara Research dated 16 January 2020 on the Personal Data Protection Bill, 2019 introduced in Lok Sabha on 11 December 2019”, January 2020, https://www.dvara.com/blog/2020/01/17/our-initial-comments-on-the-personal-data-protection-bill-2019/ (“Dvara Research”).</p>
<p>11. “A Data Sandbox for Your Company”, Terrific Data, last accessed on January 31, 2019, http://terrificdata.com/2016/12/02/3221/.</p>
<p>12. Clause 3(20) — “harm” includes (i) bodily or mental injury; (ii) loss, distortion or theft of identity; (iii) financial loss or loss of property; (iv) loss of reputation or humiliation; (v) loss of employment; (vi) any discriminatory treatment; (vii) any subjection to blackmail or extortion; (viii) any denial or withdrawal of service, benefit or good resulting from an evaluative decision about the data principal; (ix) any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; or (x) any observation or surveillance that is not reasonably expected by the data principal.</p>
<p>13. Alex Hern “Anonymised data can never be totally anonymous, says study”, July 23, 2019 https://www.theguardian.com/technology/2019/jul/23/anonymised-data-never-be-anonymous-enough-study-finds.</p>
<p>14. Clause 97 of the 2018 Bill states: “(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2) The notified date shall be any date within twelve months from the date of enactment of this Act. (3) The following provisions shall come into force on the notified date: (a) Chapter X; (b) Section 107; and (c) Section 108. (4) The Central Government shall, no later than three months from the notified date, establish the Authority. (5) The Authority shall, no later than twelve months from the notified date, notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall, no later than twelve months from the notified date, issue codes of practice on the following matters: (a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45; (h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7) Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section. (8) The remaining provisions of the Act shall come into force eighteen months from the notified date.”</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019'>http://editors.cis-india.org/internet-governance/blog/comments-to-the-personal-data-protection-bill-2019</a>
</p>
Authors: Amber Sinha, Elonnai Hickok, Pallavi Bedi, Shweta Mohandas, Tanaya Rajwade · Internet Governance · Data Protection · Privacy · 2020-02-21T10:13:35Z · Blog Entry

How to Shut Down Internet Shutdowns
http://editors.cis-india.org/internet-governance/events/how-to-shutdown-internet-shutdowns
<b>This talk will focus on the challenges and opportunities for research on internet shutdowns after the judgement of the Supreme Court in Anuradha Bhasin v. Union of India. Stepping beyond the judgement, there will be a wider discussion on the practice of whitelists and the blocking powers of the central government.</b>
<p> </p>
<p> </p>
<h3><strong>About the Speaker</strong> </h3>
<p>Apar Gupta is the Executive Director of the Internet Freedom Foundation.</p>
<p>Apar has been fighting the good fight for digital rights. While in law school almost 20 years ago, he wrote a legal commentary on the IT Act that is now in its third edition. As a lawyer in the Supreme Court, he worked on landmark cases such as those on Section 66A, Intermediary Liability, Internet Shutdowns and the Right to Privacy.</p>
<p>He also helped create public campaigns to advance net neutrality, reform defamation laws, fight Internet shutdowns and create a privacy statute. Apar previously ran his own successful law firm, was profiled in Outlook Magazine and listed in Forbes India's list of 30 under 30. He has also worked as a commercial litigator and partner in top law firms, written papers cited widely in local and international publications and taught courses at NLS and NLU.</p>
<p>RSVP <a class="external-link" href="https://forms.gle/CGei6wNUbR4t92549">here</a>, or by sending an email to Torsha (torsha@cis-india.org).</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/events/how-to-shutdown-internet-shutdowns'>http://editors.cis-india.org/internet-governance/events/how-to-shutdown-internet-shutdowns</a>
</p>
Posted by: pranav · Internet Governance · Event · 2020-02-03T11:13:12Z

Automated Facial Recognition Systems and the Mosaic Theory of Privacy: The Way Forward
http://editors.cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward
<b> Arindrajit Basu and Siddharth Sonkar have co-written this blog as the third of their three-part blog series on AI Policy Exchange under the parent title: Is there a Reasonable Expectation of Privacy from Data Aggregation by Automated Facial Recognition Systems? </b>
<p> </p>
<p><strong>The Mosaic Theory of Privacy</strong></p>
<p>Whether the data collected by the AFRS should be treated similarly to
face photographs taken for the purposes of ABBA is not clear in the
absence of judicial opinion. The AFRS would ordinarily collect
significantly more data than facial photographs during authentication.
This can be explained with the help of the <em><a href="https://www.lawfareblog.com/defense-mosaic-theory" rel="noreferrer noopener" target="_blank">mosaic theory of privacy</a></em>.</p>
<p>The mosaic theory of privacy suggests that data about an individual
collected over long durations can be qualitatively different from single
instances of observation. It argues that aggregating data from different
instances can create a picture of an individual which affects her
reasonable expectation of privacy. This is because a mere slice of
information reveals a lot less than the same information contextualised
within a broader pattern: a mosaic. </p>
<p>The mosaic theory of privacy does not find explicit reference in
Puttaswamy II. The petitioners had argued that seeding of Aadhaar data
into existing databases would bridge information across silos so as to
make real time surveillance possible. This is because information when
integrated from different silos becomes more than the sum of its parts.</p>
<p>The Court, however, dismissed this argument, accepting UIDAI’s
submission that the data collected remains in different silos and
merging is not permitted within the Aadhaar framework. Therefore, the
Court did not examine whether it is constitutionally permissible to
integrate data from different silos; it simply rejected the possibility
of surveillance as a result of Aadhaar authentication.</p>
<p>Jurisprudence in other jurisdictions is more advanced. In <em>United States v. Jones</em>,
the United States Supreme Court had observed that the insertion of a
global positioning system into Antoine Jones’ Jeep in the absence of a
warrant and without his consent invaded his privacy, entitling him to
Fourth Amendment Protection. In this case, the movement of Jones’
vehicle was monitored for a period of twenty-eight days. Five concurring
opinions in Jones acknowledge that aggregated and extensive
surveillance is capable of violating the reasonable expectation of
privacy irrespective of whether or not surveillance has taken place in
public.</p>
<p>The Court distinguished between prolonged surveillance and short term
surveillance. Surveillance in the short run does not reveal what a
person repeatedly does, as opposed to sustained surveillance which can
reveal significantly more about a person. The Court takes the example of
how a sequence of trips to a bar, a bookie, a gym or a church can tell a
lot more about a person than the story of any single visit viewed in
isolation.</p>
<p>Most recently, in<a href="https://www.supremecourt.gov/opinions/17pdf/16-402_h315.pdf" rel="noreferrer noopener" target="_blank"> <em>Carpenter v. United States</em></a>,
the Supreme Court of the United States held that the collection of
historical cell data by the government exposes the physical movements
of an individual to potential surveillance, and an individual holds a
reasonable expectation of privacy against such collection. The Court
admitted that historical cell-site information allows the government to
go back in time to retrace the exact whereabouts of a person.</p>
<p>Judicial decisions have not addressed specifically whether facial
recognition through law enforcement constitutes a search under the
Fourth Amendment or a “mere visual observation”.</p>
<p>The common thread linking CCTV footages and cellular data is the
unique ability to track the movement of an individual from one place to
another, enabling extreme forms of surveillance. It is perhaps this
crucial link that would make ARFS-enabled CCTVs prejudicial to
individual privacy.</p>
<p> The mosaic theory as understood in <em>Carpenter</em> helps one
understand the extent to which an AFRS can augment the capacities of law
enforcement in India. This in turn can help in understanding whether it
is constitutionally permissible to install such systems across the
country.</p>
<p>AFRS-enabled CCTV footage from different cameras, if viewed in
conjunction, could reveal a sequence of movements of an individual,
enabling long-term surveillance of a nature that is qualitatively
distinct from isolated observations across unrelated CCTV
footage.</p>
<p>Subsequent to <em>Carpenter</em>, <a href="https://www.lawfareblog.com/four-months-later-how-are-courts-interpreting-carpenter" rel="noreferrer noopener" target="_blank">federal district courts</a>
in the United States have declined to apply Carpenter to video
surveillance cases since the judgement did not “call into question
conventional surveillance techniques and tools, such as security
cameras.”</p>
<p>The extent of processing that an AFRS-enabled CCTV exposes an
individual to would be significantly greater. This is because every time
an individual is in the zone of an AFRS-enabled CCTV, the facial image
will be compared to a common database. Snippets from different CCTVs
capturing the individual’s physical presence in two different locations
may not be meaningful per se. When observed together, the AFRS will make
it possible to identify the individual’s movement from one place to
another.</p>
<p>For instance, the AFRS will be able to identify the person when they
are on Street A at a particular time and when they are on Street B in the
immediately subsequent hour recorded by respective CCTV cameras,
indicating the person’s physical movement from A to B. While a CCTV
camera only records movement of an individual in video format, AFRS
translates that digital information into individualised data with the
help of a comparison of facial features with a pre-existing database.</p>
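<p>The aggregation step described above can be sketched in code. The example below is a hypothetical illustration, not any deployed system: toy face "embeddings" from two cameras are matched against a common reference database, so that isolated sightings collapse into a single movement trail. All vectors, identities and thresholds are invented.</p>

```python
# Hypothetical sketch of how an AFRS links sightings from different
# cameras: each sighting's face embedding is compared against a common
# reference database, so separate frames resolve to one identity.
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# Common reference database: identity -> face embedding (toy 3-d vectors).
database = {"person_42": [0.9, 0.1, 0.3]}

# Sightings from two different cameras, each with its own embedding.
sightings = [
    {"camera": "Street A", "time": "18:00", "embedding": [0.88, 0.12, 0.31]},
    {"camera": "Street B", "time": "19:00", "embedding": [0.91, 0.09, 0.28]},
]

def build_trail(sightings, database, threshold=0.99):
    """Match each sighting to the database; matched sightings form a trail."""
    trails = {}
    for s in sightings:
        for identity, ref in database.items():
            if cosine_similarity(s["embedding"], ref) >= threshold:
                trails.setdefault(identity, []).append((s["time"], s["camera"]))
    return trails

print(build_trail(sightings, database))
# Two unrelated video frames become one record of movement from A to B.
```

<p>It is this joining step, rather than the recording itself, that turns footage into individualised movement data of the kind the mosaic theory is concerned with.</p>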
<p>Through data aggregation, which appears to be the aim of the Indian
government in their tender that links three databases, it is apparent
that the right to privacy is in danger. Yet, at present, there exists
no case law or legislation that can render such efforts illegal.</p>
<p><strong>Conclusions and The Way Forward</strong></p>
<p>Despite a lack of judicial recognition of the potential
unconstitutionality of deploying AFRS, it is clear that the introduction
of these systems pose a clear and present danger to civil rights and
human dignity. Algorithmic surveillance alters a human being’s life in
ways that even the subject of this surveillance cannot fully comprehend.
As an individual’s data is manipulated and aggregated to derive a
pattern about that individual’s world, the individual and their data no
longer exist for themselves but are massaged into various categories.</p>
<p>Louise Amoore terms this a ‘<a href="https://journals.sagepub.com/doi/abs/10.1177/0263276411417430?journalCode=tcsa" rel="noreferrer noopener" target="_blank">data-derivative</a>’,
which is an abstract conglomeration of data that continuously shapes
our futures without us having a say in their framing. The branding of an
individual as a criminal and the aggregation of their data cause
emotional distress, as individuals move about in fear of the state gaze
and of association with activities branded as potentially dangerous,
thereby suppressing the right to dissent, as exemplified by the reported
use of such systems during the recent protests in Hong Kong.</p>
<p>Case law both in India and abroad has clearly suggested that a right
to privacy is contextual and is not surrendered merely because an
individual is in a public place. However, the jurisprudence protecting
public photography or videography under the umbrella of privacy remains
less clear globally and non-existent in India.</p>
<p>The mosaic theory of privacy is useful in this regard as it prevents
mass ‘data-veillance’ of individual behaviour and accurately identifies
the unique power that the volume, velocity and variety of Big Data
provides to the state. Therefore, it is imperative that the judiciary
recognise safeguards from data aggregation as an essential component of a
reasonable expectation of privacy. At the same time, legislation could
also provide the required safeguards.</p>
<p>In the US, Senators Coons and Lee recently introduced a draft Bill titled ‘<a href="https://www.coons.senate.gov/imo/media/doc/ALB19A70.pdf" rel="noreferrer noopener" target="_blank">The Facial Recognition Technology Warrant Act of 2019’</a>.
The Bill aims to impose reasonable restrictions on the use of facial
recognition technology by law enforcement. The Bill creates safeguards
against sustained tracking of physical movements of an individual in
public spaces. The Bill terms such tracking ‘ongoing surveillance’ when
it occurs over a period of more than 72 hours, whether in real time or through
application of technology to historical records. The Bill requires that
ongoing surveillance only be conducted for law enforcement purposes <em>and</em> in pursuance of a Court Order (unless it is impractical to do so).</p>
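<p>The 72-hour trigger lends itself to a simple illustration. The sketch below is an assumption-laden toy, not the Bill's text: it merely checks whether the span of timestamped sightings of one individual exceeds 72 hours, the point at which the Bill's safeguards for 'ongoing surveillance' would apply. All timestamps are invented.</p>

```python
# Toy check for the Bill's 72-hour "ongoing surveillance" threshold:
# tracking of one person counts as ongoing once the window between the
# first and last sighting exceeds 72 hours.
from datetime import datetime, timedelta

ONGOING_THRESHOLD = timedelta(hours=72)

def is_ongoing_surveillance(timestamps):
    """True if the observation window for one individual exceeds 72 hours."""
    if len(timestamps) < 2:
        return False
    return max(timestamps) - min(timestamps) > ONGOING_THRESHOLD

sightings = [
    datetime(2020, 1, 1, 9, 0),
    datetime(2020, 1, 2, 14, 0),
    datetime(2020, 1, 5, 9, 1),   # more than 72 hours after the first sighting
]
print(is_ongoing_surveillance(sightings))  # under the Bill, a court order applies
```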
<p>While the Bill has its textual problems, it is worth considering
as a model going forward, to ensure that AFR systems are deployed in
line with a rights-respecting reading of a reasonable expectation of
privacy. <a href="http://datagovernance.org/report/adoption-and-regulation-of-facial-recognition-technologies-in-india" rel="noreferrer noopener" target="_blank">Parsheera</a>
suggests that legislation should provide narrow tailoring of the objects
and purposes for deployment of AFRS, restrictions on the persons whose
images may be scanned from the databases, judicial approval for its use
on a case-by-case basis, and effective mechanisms of oversight, analysis
and verification.</p>
<p>Appropriate legal intervention is crucial. A failure to implement
this effectively jeopardizes the expression of our true selves and the
core tenets of our democracy.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward'>http://editors.cis-india.org/internet-governance/automated-facial-recognition-systems-and-the-mosaic-theory-of-privacy-the-way-forward</a>
</p>
Authors: Arindrajit Basu, Siddharth Sonkar · Cybersecurity · Internet Governance · 2020-01-02T14:12:38Z · Blog Entry