The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 1 to 15.
Reconfiguring Data Governance: Insights from India and the EU
http://editors.cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu
<b>This policy paper is the result of a workshop organised jointly by the Tilburg Institute for Law, Technology, and Society, Netherlands; the Centre for Communication Governance at the National Law University Delhi, India; and the Centre for Internet & Society, India, in January 2023. The workshop brought together a number of academics, researchers, and industry representatives in Delhi to discuss a range of issues at the core of data governance theory and practice.</b>
<p style="text-align: justify; "><img src="http://editors.cis-india.org/home-images/ReconfiguringDataGovernance.png/@@images/70165fe1-cc66-4cac-9f99-b7485c87218a.png" alt="Reconfiguring Data Governance" class="image-inline" title="Reconfiguring Data Governance" /></p>
<p style="text-align: justify; ">The workshop aimed to compare and assess lessons from data governance from India and the European Union, and to make recommendations on how to design fit-for-purpose institutions for governing data and AI in the European Union and India.</p>
<p style="text-align: justify; ">This policy paper collates key takeaways from the workshop by grounding them across three key themes: how we conceptualise data; how institutional mechanisms as well as community-centric mechanisms can work to empower individuals, and what notions of justice these embody; and finally a case study of enforcement of data governance in India to illustrate and evaluate the claims in the first two sections.</p>
<p style="text-align: justify; ">This report was a collaborative effort between researchers Siddharth Peter De Souza, Linnet Taylor, and Anushka Mittal at the Tilburg Institute for Law, Technology, and Society (Netherlands); Swati Punia, Srishti Joshi, and Jhalak M. Kakkar at the Centre for Communication Governance at the National Law University Delhi (India); and Isha Suri and Arindrajit Basu at the Centre for Internet & Society, India.</p>
<hr />
<p>Click to download the <a class="external-link" href="http://cis-india.org/internet-governance/files/reconfiguring-data-governance.pdf"><b>report</b></a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu'>http://editors.cis-india.org/internet-governance/blog/reconfiguring-data-governance-insights-from-india-and-eu</a>
</p>
Swati Punia, Srishti Joshi, Siddharth Peter De Souza, Linnet Taylor, Jhalak M. Kakkar, Isha Suri, Arindrajit Basu, and Anushka Mittal · Internet Governance · Data Governance · Data Protection · Data Management · 2024-02-20T00:30:00Z · Blog Entry

Unpacking Algorithmic Infrastructures: Mapping the Data Supply Chain in the Healthcare Industry in India
http://editors.cis-india.org/raw/unpacking-algorithmic-infrastructures
<b>The Unpacking Algorithmic Infrastructures project, supported by a grant from the Notre Dame-IBM Tech Ethics Lab, studies the AI data supply chain infrastructure in healthcare in India and critically analyses the auditing frameworks used to develop and deploy AI systems in healthcare. It will map the prevalence of AI auditing practices within the sector to arrive at an understanding of frameworks that may be developed to check for ethical considerations – such as algorithmic bias and harm – within healthcare systems, especially against marginalised and vulnerable populations.</b>
<p style="text-align: justify; ">There has been increased interest in health data in India in recent years, with health data policies encouraging the sharing of data with different entities. At the same time, there has been growing interest in the deployment of AI in healthcare from startups, hospitals, and multinational technology companies.</p>
<p style="text-align: justify; ">Given the invisibility of the algorithmic infrastructures that underlie the digital economy, and the important decisions these technologies can make about patients' health, it is important to look at how these systems are developed, how data flows within them, how they are tested and verified, and what ethical considerations inform their deployment.</p>
<p style="text-align: justify; "><img src="http://editors.cis-india.org/home-images/ResearchersWork.png/@@images/00a848c7-b7f7-41b4-8bd9-45f2928fd44e.png" alt="Researchers at Work" class="image-inline" title="Researchers at Work" /></p>
<p style="text-align: justify; "><strong>The </strong><strong>Unpacking Algorithmic Infrastructures</strong> project, supported by a grant from the Notre Dame-IBM Tech Ethics Lab, studies the AI data supply chain infrastructure in healthcare in India and critically analyses the auditing frameworks used to develop and deploy AI systems in healthcare. It will map the prevalence of AI auditing practices within the sector to arrive at an understanding of frameworks that may be developed to check for ethical considerations – such as algorithmic bias and harm – within healthcare systems, especially against marginalised and vulnerable populations.</p>
<h3 style="text-align: justify; ">Research Questions</h3>
<ol>
<li style="text-align: justify; ">To what extent do organisations take ethical principles into account when developing AI, managing training and testing datasets, and deploying AI in the healthcare sector?</li>
<li style="text-align: justify; ">What best practices for auditing can be put in place, based on our critical understanding of the AI data supply chains and auditing frameworks employed in the healthcare sector?</li>
<li style="text-align: justify; ">What auditing framework is best suited to organisations in the majority world?</li>
</ol>
<h3>Research Design and Methods</h3>
<p>For this study, we will use a comprehensive mixed methods approach. We will survey professionals working towards designing, developing and deploying AI systems for healthcare in India, across technology and healthcare organizations. We will also undertake in-depth interviews with experts who are part of key stakeholder groups.</p>
<p>We hereby invite researchers, technologists, healthcare professionals, and others working at the intersection of artificial intelligence and healthcare to speak to us and help inform the study. You may contact Shweta Mohandas at <a href="mailto:shweta@cis-india.org">shweta@cis-india.org</a></p>
<hr />
<p>Research Team: Amrita Sengupta, Chetna V. M., Pallavi Bedi, Puthiya Purayil Sneha, Shweta Mohandas and Yatharth.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/raw/unpacking-algorithmic-infrastructures'>http://editors.cis-india.org/raw/unpacking-algorithmic-infrastructures</a>
</p>
Amrita Sengupta, Chetna V. M., Pallavi Bedi, Puthiya Purayil Sneha, Shweta Mohandas and Yatharth · Health Tech · RAW Blog · Research · Data Protection · Healthcare · Researchers at Work · Artificial Intelligence · 2024-01-05T02:38:22Z · Blog Entry

CoWIN Breach: What Makes India's Health Data an Easy Target for Bad Actors?
http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution
<b>Recent health data policies have failed to even mention the CoWIN platform.</b>
<p style="text-align: justify; ">The article was <a class="external-link" href="https://www.thequint.com/opinion/cowin-data-breach-health-sensitive-details-policies-solution#read-more">originally published in the Quint</a> on 19 June 2023.</p>
<hr />
<p style="text-align: justify; ">Last week, it was reported that due to an alleged breach of <a href="https://www.thequint.com/fit/cowin-data-breach-private-information-covid-vaccine-telegram-bot">the CoWIN platform</a>, details such as Aadhaar and passport numbers of Indians were made public via a Telegram bot.</p>
<p style="text-align: justify; ">While Minister of State for Information Technology <a href="https://www.thequint.com/fit/cowin-data-breach-telegram-bot-covid-19-vaccine-unanswered-questions">Rajeev Chandrasekhar</a> put out information acknowledging that there was some form of data breach, there is no information on how the breach took place or when a past breach may have occurred.</p>
<blockquote class="quoted" style="text-align: justify; ">This data leak is yet another example of <a href="https://www.thequint.com/opinion/cowin-breach-shows-us-the-structural-problem-with-digital-indias-infrastructure">our health records</a> being exposed in the recent past – during the pandemic, there were reports of COVID-19 test results being leaked online. The leaked information included patients’ full names, dates of birth, testing dates, and names of centres in which the tests were held.</blockquote>
<p style="text-align: justify; ">In December last year, five servers of the <a href="https://www.thequint.com/fit/aiims-ayushman-bharat-digital-mission-health-data">All India Institute of Medical Sciences</a> (AIIMS) in Delhi were hit by a cyberattack, leaving the sensitive personal data of around 3-4 crore patients compromised.</p>
<p style="text-align: justify; ">In such cases, the Indian Computer Emergency Response Team (CERT-In) is the agency responsible for looking into the vulnerabilities that may have led to the breach. However, to date, CERT-In has not made its technical findings on such attacks <a href="https://www.thequint.com/topic/data-breach">publicly available</a>.</p>
<h3 style="text-align: justify; ">The COVID-19 Pandemic Created Opportunity</h3>
<p style="text-align: justify; ">The pandemic saw a number of digitisation policies being rolled out in the health sector; the most notable one being the National Digital Health Mission (or NDHM, later re-branded as the Ayushman Bharat Digital Mission).</p>
<p style="text-align: justify; ">Mobile phone apps and web portals launched by the central and state governments during the pandemic are also examples of this health digitisation push. The rollout of the COVID-19 vaccinations also saw the deployment of the CoWIN platform.</p>
<p style="text-align: justify; ">Initially, it was mandatory for individuals to register on CoWIN to get a vaccination appointment; there was no option for walk-in registration. The Centre subsequently modified this rule, and walk-in appointments and registrations became permissible from June 2021.</p>
<blockquote>However, a study conducted by the Centre for Internet and Society (CIS) found that states such as Jharkhand and Chhattisgarh, which have low internet penetration, permitted on-site registration for vaccinations from the beginning.</blockquote>
<p>The rollout of the NDHM also saw Health IDs being generated for citizens.</p>
<p style="text-align: justify; ">In several reported cases across states, this rollout happened during the COVID-19 vaccination process – without the informed consent of the concerned person.</p>
<p style="text-align: justify; ">The <b>beneficiaries whose Health IDs were created through the vaccination process were not informed</b> about the creation of such an ID or their right to opt out of the digital health ecosystem.</p>
<h3>A Web of Health Data Policies</h3>
<p>Even before the pandemic, India was working towards a Health ID and a health data management system.</p>
<p style="text-align: justify; ">The components of the umbrella National Digital Health Ecosystem (NDHE) are the National Digital Health Blueprint (NDHB), published in 2019, and the NDHM.</p>
<p style="text-align: justify; ">The Blueprint was created to implement the National Health Stack (published in 2018), which facilitated the creation of Health IDs. The NDHM, in turn, was drafted to drive the implementation of the Blueprint and to promote and facilitate the evolution of the NDHE.</p>
<p>The National Health Authority (NHA), established in 2018, has been given the responsibility of implementing the National Digital Health Mission.</p>
<blockquote style="text-align: justify; ">2018 also saw the draft Digital Information Security in Healthcare Act (DISHA), which was to regulate the generation, collection, access, storage, transmission, and use of Digital Health Data ("DHD") and associated personal data.</blockquote>
<p>However, since its call for public consultation, <b>no progress has been made</b> on this front.</p>
<p style="text-align: justify; ">In addition to documents that chalk out the functioning and the ecosystem of a digitised healthcare system, the NHA has released policy documents such as:</p>
<ul>
<li>
<p>the Health Data Management Policy (which was revised three times; the latest version released in April 2022)</p>
</li>
<li>
<p>the Health Data Retention Policy (released in April 2021)</p>
</li>
<li>
<p>Consultation paper on the Unified Health Interface (UHI) (released in December 2022)</p>
</li>
</ul>
<p style="text-align: justify; ">Along with these policies, in 2022, the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) – India’s state health insurance policy.</p>
<blockquote style="text-align: justify; ">However, these <b>draft guidelines repeat the pattern of earlier policies</b> <b>on health data</b>, wherein there is no reference to the policies that predated them; the PM-JAY’s Data Sharing Guidelines, published in August 2022, did not even refer to the draft National Digital Health Data Management Policy (published in April 2022).</blockquote>
<p style="text-align: justify; "><b>Interestingly, the recent health data policies do not mention CoWIN.</b> Failing to cross-reference or mention preceding policies creates a lack of clarity on which documents are being used as guidelines by healthcare providers.</p>
<h3 style="text-align: justify; ">Can a Data Protection Bill Be the Solution?</h3>
<p>The draft Data Protection Bill, 2021, defined health data as “…the data related to the state of physical or mental health of the data principal and <b>includes records regarding the past, present or future state of the health of such data principal</b>, data collected in the course of registration for, or provision of health services, data associated with the data principal to the provision of specific health services.”</p>
<p>However, this definition as well as the definition of sensitive personal data was removed from the current version of the Bill (Digital Personal Data Protection Bill, 2022).</p>
<blockquote>Omitting these definitions from the Bill removes a set of data which, if collected, warrants increased responsibility and liability. The handling of health data, financial data, government identifiers, and the like needs to come with a higher level of responsibility, as these are sensitive details of a person.</blockquote>
<p style="text-align: justify; ">The threats posed by this data being leaked are not limited to spam messages, fraud, and impersonation; they also include companies getting hold of this coveted data to gather insights and train their systems and algorithms, without needing to seek consent from anyone and without facing consequences for the harm caused.</p>
<p style="text-align: justify; ">While the current version of the draft DPDP Bill states that the data fiduciary shall notify the data principal of any breach, the draft Bill also states that the Data Protection Board “may” direct the data fiduciary to adopt measures that remedy the breach or mitigate harm caused to the data principal.</p>
<p style="text-align: justify; ">The Bill also prescribes penalties of up to Rs 250 crore if the data fiduciary fails to take reasonable security safeguards to prevent a personal data breach, and a penalty of up to Rs 200 crore if the fiduciary fails to notify the Data Protection Board and the data principal of such a breach.</p>
<p style="text-align: justify; ">While <b>these steps, if implemented through legislation, would make organisations processing data take data security more seriously</b>, the removal of sensitive personal data from the Bill's definitions means that data fiduciaries processing health data will not have to take any additional steps beyond reasonable security safeguards.</p>
<p>The <b>absence of a clear indication of security standards</b> will affect data principals and fiduciaries.</p>
<p style="text-align: justify; ">Looking to bring more efficiency to governance systems, the Centre launched the Digital India Mission in 2015. The press release by the central government reporting the approval of the programme by the Cabinet of Ministers speaks of ‘cradle to grave’ digital identity as one of its vision areas.</p>
<p>The ambitious Universal Health ID and health data management policies are an example of this digitisation mission.</p>
<blockquote>However, breaches like this are reminders that without proper data security measures, and without a designated person responsible for data security, the data is always vulnerable to attack.</blockquote>
<p style="text-align: justify; ">While the UK and Australia have also seen massive data breaches in the past, India is at the start of its health data digitisation journey and has the ability to set up strong security measures, employ experienced professionals, and establish legal resources to ensure that data breaches are minimised and swift action can be taken in case of a breach.</p>
<p style="text-align: justify; "><b>The first step</b> towards understanding the vulnerabilities would be to publish the CERT-In reports on this breach, and to guide other institutions to check for similar vulnerabilities so that they are better prepared for future breaches and attacks.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution'>http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution</a>
</p>
Shweta Mohandas and Pallavi Bedi · Internet Governance · Data Protection · Privacy · 2023-07-04T09:39:03Z · Blog Entry

The Centre for Internet and Society’s Comments and Recommendations on the Digital Personal Data Protection Bill, 2022
http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill
<b>The Centre for Internet & Society (CIS) published its comments and recommendations to the Digital Personal Data Protection Bill, 2022, on December 17, 2022.</b>
<div class="WordSection1" style="text-align: justify; ">
<h1><span>High Level Comments</span></h1>
<p class="MsoNormal"><b><span>1.<span> </span></span></b><b><span>Rationale for removing the distinction between personal data and sensitive personal data is unclear.</span></b></p>
<p class="MsoNormal"><span>All the earlier iterations of the Bill, as well as the rules made under Section 43A of the Information Technology Act, 2000,<a href="#_ftn1" name="_ftnref1"><sup><sup><span>[1]</span></sup></sup></a> classified data into two categories: (i) personal data; and (ii) sensitive personal data. The 2022 version of the Bill removes this distinction and clubs all personal data under the single umbrella heading of personal data. The rationale for this is unclear: sensitive personal data is data that could reveal, or is related to, eminently private matters such as financial data, health data, sexual orientation, and biometric data. Given its sensitive nature, such data was accorded higher protection and safeguards in processing. By clubbing all data together as personal data, those higher protections – such as the need for explicit consent to the processing of sensitive personal data, and the bar on processing sensitive personal data for employment purposes – have also been removed. </span></p>
<p class="MsoNormal"><b><span>2.<span> </span></span></b><b><span>No clear roadmap for the implementation of the Bill</span></b></p>
<p class="MsoNormal"><span>The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified.<a href="#_ftn2" name="_ftnref2"><sup><sup><span>[2]</span></sup></sup></a> It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. </span></p>
<p class="MsoNormal"><span>The present Bill does not specify any such blueprint; it provides no details on when the Bill will be notified, or the time period within which the Board shall be established and specific Rules and regulations notified. Considering that certain provisions have been deferred to Rules to be framed by the Central Government, the absence and/or delayed notification of such Rules and regulations will impact the effective functioning of the Act. Provisions such as Section 10(1), which deals with verifiable parental consent for children's data; Section 13(1), which states the manner in which a Data Principal can initiate a right to correction; and the process of selection and functioning of consent managers under </span><span>3(7)</span><span> are a few such examples. When the Act becomes applicable, data principals will have to wait for the Rules for these provisions to take effect, or to get clarity on the entities created by the Act. </span></p>
<p class="MsoNormal"><span>The absence of any sunrise or sunset provision may disincentivise the political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud, but it was virtually defunct from 2011 onwards, when its last chairperson retired. It was eventually merged with the Telecom Disputes Settlement and Appellate Tribunal in 2017. </span></p>
<p class="MsoNormal"><span>We recommend that the Bill clearly lay out a time period for the implementation of its different provisions, especially a time frame for the establishment of the Board. This is important to give full and effective effect to an individual's right to privacy. It is also important to ensure that individuals have an effective mechanism to enforce that right and seek recourse in case of any breach of obligations by data fiduciaries. </span></p>
<p class="MsoNormal"><span>The Board must ensure that Data Principals and Data Fiduciaries have sufficient awareness of the provisions of this Bill before the provisions on punishment are brought into force. This will allow Data Fiduciaries to align their practices with the new legislation, and give the Board time to define and determine the provisions that the Bill has left to it. Additionally, penalties for offences should initially be enforced in a staggered process, combined with provisions such as warnings, so that first-time and mistaken offenders – who could now include data principals as well – are spared from paying a high price. This will relieve the fears of smaller companies, startups, and individuals who might otherwise avoid processing data for fear of incurring penalties.</span></p>
<h3><a name="_kn12ecl3pdrp"></a><span>3.<span> </span></span><span>Independence of Data Protection Board of India.</span></h3>
<p class="MsoNormal"><span>The Bill proposes the creation of the Data Protection Board of India (Board) in place of the Data Protection Authority. Compared with the powers of the Authority in the 2018 and 2019 versions of the Personal Data Protection Bill, the Bill abrogates the powers of the body to be created. Under Clause 19(2), the strength and composition of the Board, the process of selection, the terms and conditions of appointment and service, and the removal of its Chairperson and other Members shall be such as may be prescribed by the Union Government at a later stage. Further, as per Clause 19(3), the Chief Executive of the Board will be appointed by the Union Government, and the terms and conditions of her service will also be determined by the Union Government. The functions of the Board have also not been specified under the Bill; the Central Government may assign the functions to be performed by the Board.</span></p>
<p class="MsoNormal"><span>In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate, the ability to act swiftly, and adequate resources. The political nature of personal data also requires that the governance of data – particularly the rule-making and adjudicatory functions performed by the Board – be independent of the Executive. </span></p>
<h1><a name="_n9jzjnvile8f"></a><span>Chapter Wise Comments and Recommendations </span></h1>
<h2><a name="_chp7y0vgrjqa"></a><span>CHAPTER I- PRELIMINARY</span></h2>
<p class="MsoNormal"><span><span> </span>●<span> </span></span><b><span>Definition:</span></b><span> While the Bill adds a few new definitions, including terms such as gain, loss, and consent manager, a few key definitions have been removed from the earlier versions of the Bill. The removal of certain definitions – e.g. sensitive personal data, health data, biometric data, and transgender status – creates legal uncertainty about the application of the Bill. </span></p>
<p class="MsoNormal"><span>With respect to the existing definitions, the definition of the term ‘harm’ has been significantly narrowed, removing harms such as surveillance from its ambit. In addition, the 2019 version of the Bill, under Clause 2(20), provided a non-exhaustive list of harms by using the phrase “harm includes”; in the new definition, the phrase has been altered to “‘harm’, in relation to a Data Principal, means”, thereby excluding from the purview of the Act harms that are not currently apparent. We recommend that the definition of harm be made a non-exhaustive list.</span></p>
<h2><a name="_nhwnuzprx0ir"></a><span>CHAPTER II - OBLIGATIONS OF DATA FIDUCIARY</span></h2>
<p class="MsoNormal"><b><span>Notice: </span></b><span>The revised clause on notice does away with the comprehensive requirements laid out under Clause 7 of the PDP Bill, 2019. The current clause does not specify in detail what the notice should contain, beyond stating that it should be itemised. While it can be reasoned that the Data Fiduciary can find the contents of the notice throughout the Bill, such as in the rights of the Data Principal, the removal of a detailed list could create uncertainty for Data Fiduciaries. Leaving out the finer details of what a notice should contain could lead Data Fiduciaries to omit key information, in turn providing incomplete information to the Data Principal. Data Fiduciaries might also not know whether they are complying with the provisions of the Bill, and could invariably end up being penalised. Moreover, by requiring less work of the Data Fiduciary and processor, the burden falls on the Data Principal to find out how their data is collected and processed. The purpose of this legislation is to create further rights for individuals and consumers; hence the Bill should strive to put the individual at the forefront.</span></p>
<p class="MsoNormal"><span>In addition, Clause 6(3) of the Bill states: <i>“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English or any language specified in the Eighth Schedule to the Constitution of India.”</i> While the inclusion of regional-language notices is a welcome step, we suggest that the text be revised as follows: <i>“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English<b> and in</b> any language specified in the Eighth Schedule to the Constitution of India.” </i>The main purpose of notice is to inform the person before they give consent; a notice in a language that a person cannot read would not lead to meaningful consent.</span></p>
<p class="MsoNormal"><b><span>Consent <br /> <br /> </span></b><span>Clause 3 of the Bill states that the <i>“request for consent would have the contact details of a Data Protection Officer, where applicable, or of any other person authorised by the Data Fiduciary to respond to any communication from the Data Principal for the purpose of exercise of her rights under the provisions of this Act.” </i>Ideally, this provision should be part of the notice and should be mentioned in the section above. This is similar to Clause 7(1)(c) of the draft Personal Data Protection Bill, 2019, which requires the notice to state <i>“the identity and contact details of the data fiduciary and the contact details of the data protection officer, if applicable”. </i></span></p>
<p class="MsoNormal"><b><span>Deemed Consent</span></b></p>
<p class="MsoNormal"><span>The Bill introduces a type of consent that was absent in the earlier versions. We understand deemed consent to be a redefinition of the non-consensual processing of personal data. The term deemed consent and the provisions under the section, while more concise than the earlier versions, could create more confusion for Data Principals and Fiduciaries alike. The definition and the examples do not shed light on one of the key issues with deemed consent – the absence of notice. The Bill is also silent on whether deemed consent can be withdrawn, and on whether the data principal has the same rights as those arising from the processing of data they have consented to. </span></p>
<p class="MsoNormal"><b><span>Personal Data Protection of Children </span></b></p>
<p class="MsoNormal"><span>The age for determining whether a person can legally consent in the online world has been intertwined with the age of consent under the Indian Contract Act, i.e. 18 years. The Bill makes no distinction between a 5-year-old and a 17-year-old – both are treated in the same manner, assuming the same level of maturity for all persons under the age of 18. It is pertinent to note that the law in the offline world does recognise that distinction and acknowledges changes in the level of maturity. As per Section 82 of the Indian Penal Code, read with Section 83, any act by a child under the age of 12 shall not be considered an offence, while the maturity of those aged between 12 and 18 years is decided by a court (individuals between 16 and 18 years of age can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industries.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>There is a need to evaluate and rethink the idea that children are passive consumers of the internet and hence the consent of the parent is enough. Additionally, the bracketing of all individuals under the age of 18 as children fails to look at how teenages and young people use the internet. This is more important looking at the 2019 data which suggests that two-thirds of India’s internet users are in the 12–29 years age group, with those in the 12–19 age group accounting for about 21.5% of the total internet usage in metro cities. Given that the pandemic has compelled students and schools to adopt and adapt to virtual schools, the reliance on the internet has become ubiquitous with education. Out of an estimated 504 million internet users, nearly one-third are aged under 19. As per the Annual Status on Education Report (ASER) 2020, more than one-third of all schoolchildren are pursuing digital education, either through online classes or recorded videos.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>Instead of setting a blanket age for determining valid consent, we could look at alternative means to determine the appropriate age for children at different levels of maturity, similar to what had been developed by the U.K. Information Commissioner’s Office. The Age Appropriate Code prescribes 15 standards that online services need to follow. It broadly applies to online services "provided for remuneration"—including those supported by online advertising—that process the personal data of and are "likely to be accessed" by children under 18 years of age, even if those services are not targeted at children. This includes apps, search engines, social media platforms, online games and marketplaces, news or educational websites, content streaming services, online messaging services. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>The reservation to definition of child under the Bill has also been expressed by some members of the JPC through their dissenting opinion. MP Ritesh Pandey stated that keeping in mind the best interest of the child the Bill should consider a child to be a person who is less than 14 years of age. This would ensure that young people could benefit from the advances in technology without parental consent and reduce the social barriers that young women face in accessing the internet. Similarly Manish Tiwari in his dissenting note also observed that the regulation of the processing of data of children should be based on the type of content or data. The JPC Report observed that the Bill does not require the data fiduciary to take fresh consent of the child, once the child has attained the age of majority, and it also does not give the child the option to withdraw their consent upon reaching the majority age. It therefore, made the following recommendations:</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>Registration of data fiduciaries, exclusively dealing with children’s data. Application of the Majority Act to a contract with a child. Obligation of Data fiduciary to inform a child to provide their consent, three months before such child attains majority Continuation of the services until the child opts out or gives a fresh consent, upon achieving majority. However, these recommendations have not been incorporated into the provisions of the Bill. In addition to this the Bill is silent on the status of non consensual processing and deemed consent with respect to the data of children.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>We recommend that fiduciaries who have services targeted at children should be considered as significant Data Fiduciaries. In addition to this the Bill should also state that the guardians could approach the Data Protection Board on behalf of the child. With these obligations in place, the age of mandatory consent could be reduced and the data fiduciary could have an added responsibility of informing the children in the simplest manner how their data will be used. Such an approach places a responsibility on Data Fiduciaires when implementing services that will be used by children and allows the children to be aware of data processing, when they are interacting with technology.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>Chapter III-RIGHTS AND DUTIES OF DATA PRINCIPAL</span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span>Rights of Data Principal</span></b></p>
<p class="MsoNormal"><span>Clause 12(3) of the Bill while providing the Data Principal the right to be informed of the identities of all the Data Fiduciaries with whom the personal data has been shared, also states that the data principal has the right to be informed of the categories of personal data shared. However the current version of the Bill provides only one category of data that is personal data. </span></p>
<p class="MsoNormal"><span>Clause 14 of the Bill talks about the Right of Grievance Redressal, and states that the Data Principal has the right to readily available means of registering a grievance, however the Bill does not provide in the Notice provisions the need to mention details of a grievance officer or a grievance redressal mechanism. It is only the additional obligations on significant data fiduciary that mentions the need for a Data Protection officer to be the contact for the grievance redressal mechanism under the provisions of this Bill. The Bill could ideally re-use the provisions of the IT Act SPDI Rules 2011 in which Section 5(7) states <i>“Body corporate shall address any discrepancies and grievances of their provider of the information with respect to processing of information in a time bound manner. For this purpose, the body corporate shall designate a Grievance Officer and publish his name and contact details on its website. The Grievance Officer shall redress the grievances or provider of information expeditiously but within one month ' from the date of receipt of grievance.”<br /> </i><br /> The above framing would not only bring clarity to the data fiduciaries on what process to follow for a grievance redressal, it also would reduce the significant burden of theBoard. </span></p>
<p class="MsoNormal"><b><span>Duties of Data Principals</span></b></p>
<p class="MsoNormal"><span>The Bill while entisting duties of the Data Principal states that the “Data Principal shall not register a false or frivolous grievance or complaint with a Data Fiduciary or the Board”, however it is very difficult for a Data Principal to and even for the Board to determine what constitutes a “frivolous grievance”. In addition to this the absence of a defined notice provision and the inclusion of deemed consent would mean that the Data Fiduciary could have more information about the matter than the Data Principal. This could mean that the fiduciary could prove that a claim was false or frivolous. Clause 21(12) states that “<i>At any stage after receipt of a complaint, if the Board determines that the complaint is devoid of merit, it may issue a warning or impose costs on the complainant.” </i>In addition to this Clause 25(1) states that “ <i>If the Board determines on conclusion of an inquiry that non- compliance by <b>a person </b>is significant, it may, after giving the person a reasonable opportunity of being heard, impose such financial penalty as specified in Schedule 1, not exceeding rupees five hundred crore in each instance.” </i>The use of the term “person” in this case includes data which could mean that they could be penalised under the provisions of the Bill, which could also include not complying with the duties.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>CHAPTER IV- SPECIAL PROVISIONS</span></b></p>
<p class="MsoNormal"><b><span>Transfer of Personal Data outside India</span></b></p>
<p class="MsoNormal"><span>Clause 17 of the Bill has removed the requirement of data localisation which the 2018 and 2019 Bill required. Personal data can be transferred to countries that will be notified by the central government. There is no need for a copy of the data to be stored locally and no prohibition on transferring sensitive personal data and critical data. Though it is a welcome change that personal data can be transferred outside of India, we would highlight the concerns in permitting unrestricted access to and transfer of all types of data. Certain data such as defence and health data do require sectoral regulation and ringfencing of the transfer of data. </span></p>
<p class="MsoNormal"><b><span>Exemptions</span></b></p>
<p class="MsoNormal"><span>Clause 18 of the Bill has widened the scope of government exemptions. Blanket exemption has been given to the State under Clause 18(4) from deleting the personal data even when the purpose for which the data was collected is no longer served or when retention is no longer necessary. The requirement of <i>proportionality, reasonableness and fairness</i> have been removed for the Central Government to exempt any department or instrumentality from the ambit of the Bill.</span><span> </span><span>By doing away with the four pronged test, this provision is not in consonance with test laid down by the Supreme Court and are also incompatible with an effective privacy regulation. There is also no provision for either a prior judicial review of the order by a district judge as envisaged by the Justice Srikrishna Committee Report or post facto review by an oversight committee of the order as laid down under the Indian Telegraph Rules, 1951<a href="#_ftn3" name="_ftnref3"><sup><sup><span>[3]</span></sup></sup></a> and the rules framed under Information Technology Act<a href="#_ftn4" name="_ftnref4"><sup><sup><span>[4]</span></sup></sup></a>. The provision states that such processing of personal data shall be subject to the procedure, safeguard and oversight mechanisms that may be prescribed.</span></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><span> </span></p>
</div>
<div style="text-align: justify; "><br clear="all" />
<hr align="left" size="1" width="100%" />
<div id="ftn1">
<p class="MsoNormal"><a href="#_ftnref1" name="_ftn1"><sup><span><sup><span>[1]</span></sup></span></sup></a><span> Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011</span><span>.</span></p>
</div>
<div id="ftn2">
<p class="MsoNormal"><a href="#_ftnref2" name="_ftn2"><sup><span><sup><span>[2]</span></sup></span></sup></a><span> Clause 97 of the 2018 Bill states<i>“(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2)The notified date shall be any date within twelve months from the date of enactment of this Act. (3)The following provisions shall come into force on the notified date-(a) Chapter X; (b) Section 107; and (c) Section 108. (4)The Central Government shall, no later than three months from the notified date establish the Authority. (5)The Authority shall, no later than twelve months from the notified date notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall no, later than twelve months from the date notified date issue codes of practice on the following matters-(a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45;(h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7)Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section.(8)The remaining provision of the Act shall come into force eighteen months from the notified date.”</i></span></p>
</div>
<div id="ftn3">
<p class="MsoNormal"><a href="#_ftnref3" name="_ftn3"><sup><span><sup><span>[3]</span></sup></span></sup></a><span> </span><span>Rule 419A (16): The Central Government or the State Government shall constitute a Review Committee. </span></p>
<p class="MsoNormal"><span>Rule 419 A(17): The Review Committee shall meet at least once in two months and record its findings whether the directions issued under sub-rule (1) are in accordance with the provisions of sub-section (2) of Section 5 of the said Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above it may set aside the directions and orders for destruction of the copies of the intercepted message or class of messages.</span></p>
<p class="MsoNormal"><span> </span></p>
</div>
<div id="ftn4">
<p class="MsoNormal"><a href="#_ftnref4" name="_ftn4"><sup><span><sup><span>[4]</span></sup></span></sup></a><span> </span><span>Rule 22 of Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009: The Review Committee shall meet at least once in two months and record its findings whether the directions issued under rule 3 are in accordance with the provisions of sub-section (2) of section 69 of the Act and where the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and issue an order for destruction of the copies, including corresponding electronic record of the intercepted or monitored or decrypted information.</span></p>
<p class="MsoNormal"><span> </span></p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill'>http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill</a>
</p>
No publisherShweta Mohandas and Pallavi BediInternet GovernanceDigital GovernanceData ProtectionPrivacy2023-01-20T02:35:30ZBlog EntryDemystifying Data Breaches in India
http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india
<b>Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.
</b>
<p>Edited by Arindrajit Basu and Saumyaa Naidu</p>
<hr />
<p dir="ltr" style="text-align: justify; ">India saw a <a href="https://theprint.in/india/despite-62-drop-in-data-breaches-india-among-top-5-nations-targeted-by-hackers-study-finds/917197/">62% drop in data breaches in the first quarter of 2022</a>. Yet, it ranked fifth on the list of countries most hit by cyberattacks according to a 2022 <a href="https://surfshark.com/blog/data-breach-statistics-by-country">report by Surfshark</a>, a Netherlands-based VPN company. Another report <a href="https://analyticsindiamag.com/the-ridiculous-17-5-cr-for-a-data-breach/">on the cost of data breaches researched by the Ponemon Institute and published by IBM</a> reveals that the breach of about 29500 records between March 2021 and March 2022 resulted in a 25% increase in the average cost from INR 165 million in 2021 to INR 176 million in 2022.</p>
<p style="text-align: justify; "><span>These statistics are certainly a cause for concern, especially in the context of India’s rapidly burgeoning digital economy shaped by the pervasive platformization of private and public services such as welfare, banking, finance, health, and shopping among others. Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.</span></p>
<p style="text-align: justify; "><span>While expert articulations of cybersecurity in general and data breaches in particular tend to predominate the public discourse on data privacy, this post aims to situate broader understandings of data breaches within the historical context of India’s IT revolution and delve into specific concepts and terminology that have shaped the broader discourse on data protection. The late 1990s and early 2000s offer a useful point of entry into the genesis of the data security landscape in India.</span></p>
<h3><span></span><span>Data Breaches and their Predecessor Forms</span></h3>
<p style="text-align: justify; "><span></span><span>The articulation of data security concerns around the late 1990s and early 2000s isn’t always consistent in deploying the phrase, ‘data breach’ to signal cybersecurity concerns in India. The terms such as ‘data/ identity theft’ and ‘data leak’ figure prominently in the public articulation of concerns with the handling of personal information by IT systems, particularly in the context of business process outsourcing (BPO) and e-commerce activities. Other pertinent terms such as “security breach”, “data security”, and ‘“cyberfraud” also capture the specificity of growing concerns around outsourced data to India. At the time, i.e. around mid-2000s regulatory frameworks were still evolving to accommodate and address the complexities arising from a dynamic reconfiguration of the telecommunications and IT landscape in India.</span></p>
<p dir="ltr" style="text-align: justify; ">Some of the formative cases that instantiate the usage of the aforementioned terms are instructive to understand shifts in the reporting of such incidents over time. The earliest case during that period concerns<a href="https://www.stop-source-code-theft.com/source-code-theft-cases-in-india/"> a 2002 case concerning the theft and sale of source code</a> by an IIT Kharagpur student who intended to sell the code to two undercover FBI agents who worked with the CBI to catch the thief. A straightforward case of data theft was framed by media stories around the time as a <a href="https://timesofindia.indiatimes.com/iitian-held-for-stealing-software-source-code/articleshow/20389713.cms">cybercrime involving the illegal sale</a> of the source code of a software package, as <a href="https://economictimes.indiatimes.com/ip-laws-lax-but-us-firm-bets-on-india/articleshow/696197.cms?from=mdr">software theft of intellectual property in the context of outsourcing</a> and as an instance of <a href="https://www.computerworld.com/article/2573515/at-risk-offshore.html">industrial espionage in poor nations without laws protecting foreign companies</a>. This case became the basis of the earliest calls for the protection of data privacy and security in the context of the Indian BPO sector. The Indian IT Act, 2000 at the time only covered <a href="http://pavanduggal.com/wp-content/uploads/2016/01/India-Responds-to-Growing-Concerns-Over-Data-Security.pdf">unauthorized access and data theft from computers and networks without any provisions for data protection, interception or computer forgery</a>. 
The BPO boom in India brought with it <a href="https://blj.ucdavis.edu/archives/vol-6-no-2/offshore-outsourcing-to-india.html">employment opportunities for India’s English-speaking, educated youth but in the absence of concrete data privacy legislation</a>, the country was regarded as an unsafe destination for outsourcing aside from the political ramifications concerning the loss of American jobs.</p>
<p dir="ltr" style="text-align: justify; ">In a major 2005 incident, employees of the Mphasis BFL call centre in Pune extracted sensitive bank account information of Citibank’s American customers to divert INR 1.90 crore into new accounts set up in India. The media coverage of this incident calls it <a href="https://www.indiatoday.in/magazine/economy/story/20050502-pune-call-centre-fraud-rattles-india-booming-bpo-sector-787790-2005-05-01">India’s first outsourcing cyberfraud and a well planned scam</a>, a <a href="https://economictimes.indiatimes.com/mphasis-call-centre-fraud-net-widens/articleshow/1077097.cms">cybercrime in a globalized world</a>, and a case of <a href="https://timesofindia.indiatimes.com/home/sunday-times/deep-focus/indias-first-bpo-scam-unraveled/articleshow/1086438.cms">financial fraud and a scam</a> that required no hacking skills, and a <a href="https://www.infoworld.com/article/2668975/indian-call-center-workers-charged-with-citibank-fraud.html">case of data theft and misuse</a>. Within the ambit of cybercrime, media reports of these incidents refer to them as cases of “fraud”, “scam” and “theft''.</p>
<p dir="ltr" style="text-align: justify; ">Two other incidents in 2005 set the trend for a critical spotlight on data security practices in India. In a <a href="http://news.bbc.co.uk/2/hi/south_asia/4619859.stm">June 2005 incident, an employee of a Delhi-based BPO firm, Infinity e-systems, sold the account numbers and passwords of 1000 bank customers </a>to the British Tabloid, The Sun. The Indian newspaper, Telegraph India, carried an online story headlined, “<a href="https://www.telegraphindia.com/india/bpo-blot-in-british-backlash-indian-sells-secret-data/cid/873737">BPO Blot in British Backlash: Indian Sells Secret Data</a>,” which reported that the employee, Kkaran Bahree, 24, was set up by a British journalist, Oliver Harvey. Harvey filmed Bahree accepting wads of cash for the stolen data. Bahree’s theft of sensitive information is described both as a data fraud and a leak in the above 2005 BBC story by Soutik Biswar. Another story on the incident calls it a “<a href="https://www.rediff.com/money/2005/jun/24bpo3.htm">scam” involving the leakage of credit card information</a>. The use of the term ‘leak’ appears consistently across other media accounts such as a <a href="https://timesofindia.indiatimes.com/city/delhi/esearch-bpo-employee-sacked-still-missing/articleshow/1153017.cms">2005 story on Karan Bahree in the Times of India</a> and another story in the Economic Times about the Australian Broadcasting Corporation’s (ABC) sting operation similar to the one in Delhi, describing the scam by the <a href="https://economictimes.indiatimes.com/hot-links/bpo/karan-bahree-part-ii-shot-in-australia/articleshow/1201347.cms?from=mdr">fraudsters as a leak</a> of the online information of Australians. Another media account of the coverage describes the incident in more generic terms such as an “<a href="https://www.tribuneindia.com/2005/20050625/edit.htm">outsourcing crime</a>”.</p>
<p dir="ltr" style="text-align: justify; ">The other case concerned <a href="https://www.taylorfrancis.com/chapters/mono/10.4324/9781315610689-16/political-economy-data-security-bpo-industry-india-alan-chong-faizal-bin-yahya">four former employees of Parsec technologies who stole classified information and diverted calls from potential customers</a>, causing a sudden drop in the productivity of call centres managed by the company in November 2005. Another call centre <a href="http://news.bbc.co.uk/1/hi/uk/7953401.stm">fraud came to light in 2009 through a BBC sting operation in which British reporters went to Delhi </a>and secretly filmed a deal with a man selling credit card and debit card details obtained from Symantec call centres, which sold software made by Norton. This BBC story uses the term “breach” to refer to the incident.</p>
<p dir="ltr">In the broader framing of these cases generally understood as cybercrime, which received transnational media coverage, the terms “fraud”, “leak”, “scam”, and “theft” appear interchangeably. The term “data breach” does not seem to be a popular or common usage in these media accounts of the BPO-related incidents. A broader sense of breach (of confidentiality, privacy) figures in the media reportage in <a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr">implicitly racial terms of cultural trust</a>, as a matter of <a href="https://www.news18.com/news/business/bpo-staff-need-ethical-training-poll-248442.html">ethics and professionalism</a> and in the <a href="https://www.news18.com/news/business/sting-op-may-spell-doom-for-bpos-248260.html">language of scandal </a>in some cases.</p>
<p dir="ltr" style="text-align: justify; ">These early cases typify a specific kind of cybercrime concerning the theft or misappropriation of outsourced personal data belonging to British or American residents. What’s remarkable about these cases is the utmost sensitivity of the stolen personal information including financial details, bank account and credit/debit card numbers, passwords, and in one case, source code. While these cases rang the alarm bells on the Indian BPO sector’s data security protocols, they also directed attention to concerns around <a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr">the training of Indian employees on the ethics of data confidentiality and vetting through psychometric tests</a> for character assessment. In the wake of these incidents, the National Association of Software and Service Companies (NASSCOM), an Indian non-governmental trade and advocacy group,<a href="https://www.computerworld.com/article/2547959/outsourcing-to-india--dealing-with-data-theft-and-misuse.html"> launched a National Skills Registry for IT professionals to enable employers to conduct background checks</a> in 2006.</p>
<p dir="ltr" style="text-align: justify; ">These data theft incidents earned India a global reputation of an unsafe destination for business process outsourcing, seen to be lacking both, a culture of maintaining data confidentiality and concrete legislation for data protection at the time. Importantly, the incidents of data theft or misappropriation were also traceable back to a known source, a BPO employee or a group of malefactors, who often sold sensitive data belonging to foreign nationals to others in India.</p>
<p dir="ltr" style="text-align: justify; ">The phrase “data leak” also caught on in another register in the context of the widespread use of camera-equipped mobile phones in India. The 2004 Delhi MMS case offers an instance of a date leak, recapitulating the language of scandal in moralistic terms.</p>
<h3 dir="ltr">The Delhi MMS Case</h3>
<p dir="ltr" style="text-align: justify; ">The infamous 2004 incident involved two underage Delhi Public School (DPS) students who recorded themselves in a sexually explicit act on a cellular phone. After a fall out, the male student passed the low-resolution clip on to his friend in which his female friend’s face is seen. The clip, distributed far and wide in India, ended up on the famous e-shopping and auction website, bazee.com leading to <a href="https://indiancaselaw.in/avnish-bajaj-vs-state-dps-mms-scandal-case/">the arrest of the website’s CEO Avinash Bajaj for hosting the listing for sale</a>. Another similar case in 2004 mimicked the mechanics of visual capture through hand-held MMS-enabled mobile phones. A two-minute MMS of a top South-Indian actress <a href="https://timesofindia.indiatimes.com/india/web-of-sleaze-now-nude-video-of-top-actress/articleshow/966048.cms">taking a shower went viral on the Internet in 2004, the year when another MMS of two prominent Bollywood actors kissing</a> had already done the rounds. The <a href="https://www.journals.upd.edu.ph/index.php/plaridel/article/view/2392">MMS case also marked the onset of a national moral panic around the amateur uses of mobile phone technologies</a>, capable of corrupting young Indian minds under a sneaky regime of new media modernity. The MMS case, not strictly the classic case of a data breach - non-visual information generally stored in databases - became an iconic case of a data leak framed in the media as <a href="https://www.telegraphindia.com/india/scandal-in-school-shakes-up-delhi/cid/1667531">a scandal that shocked the country</a>, with calls for the regulation of mobile phone use in schools. 
The case continued its scandalous afterlife in a <a href="https://www.heraldgoa.in/Edit/dev-ds-leni-has-a-dps-mms-scandal-connection-/21344">2009 Bollywood film, Dev D</a> and another <a href="https://indianexpress.com/article/entertainment/entertainment-others/delhi-mms-scandal-inspires-dibakars-love-sex-aur-dhoka/">2010 film, Love, Sex and Dhokha</a>,</p>
<p dir="ltr" style="text-align: justify; ">Taken together, the BPO data thefts and frauds and the data leak scandals prefigure the contemporary discourse on data breaches in the second decade of the 21st century, or what may also be called the Decade of Datafication. The launch of the Indian biometric identity project, Aadhaar, in 2009, which linked access to public services and welfare delivery with biometric identification, resulted in large-scale data collection of the scheme’s subscribers. Such linking raised the spectre of state surveillance as alleged by the critics of Aadhaar, marking a watershed moment in the discourse on data privacy and protection.</p>
<h3 dir="ltr">Aadhaar Data Security and Other Data Breaches</h3>
<p dir="ltr" style="text-align: justify; ">Aadhaar was challenged in the Indian Supreme Court in 2012 when <a href="https://www.outlookindia.com/website/story/worries-about-the-aadhaar-monster/296790">it was made mandatory for welfare and other services such as banking, taxation and mobile telephony</a>. The national debate on the status of privacy as a cultural practice in Indian society and a fundamental right in the Indian Constitution led to two landmark judgments - the <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf">2017 Puttaswamy ruling</a> holding privacy to be a constitutional right subject to limitations and <a href="https://indiankanoon.org/doc/127517806/">the 2018 Supreme Court judgment holding mandatory Aadhaar to be constitutional only for welfare and taxation but no other service</a>.</p>
<p dir="ltr" style="text-align: justify; ">While these judgments sought to rein in Aadhaar’s proliferating mandatory uses, biometric verification remained the most common mode of identity authentication with <a href="https://www.businesstoday.in/latest/trends/story/aadhaar-not-mandatory-yet-organisations-pose-it-as-a-mandatory-document-335550-2022-05-29">most organizations claiming it to be mandatory for various purposes</a>. During the same period from 2010 onwards, a range of data security events concerning Aadhaar came to light. These included <a href="https://www.firstpost.com/tech/news-analysis/aadhaar-security-breaches-here-are-the-major-untoward-incidents-that-have-happened-with-aadhaar-and-what-was-actually-affected-4300349.html">app-based flaws, government websites publishing Aadhaar details of subscribers, third party leaks of demographic data, duplicate and forged Aadhaar cards and other misuses</a>.</p>
<p dir="ltr" style="text-align: justify; ">In 2015, the Indian government launched its ambitious <a href="https://indiancc.mygov.in/wp-content/uploads/2021/08/mygov-10000000001596725005.pdf">Digital India Campaign to provide government services to Indian citizens</a> through online platforms. Yet, data security breach incidents continued to increase, particularly the trade in the sale and purchase of sensitive financial information related to bank accounts and credit card numbers. The online availability of <a href="https://www.livemint.com/Industry/l5WlBjdIDXWehaoKiuAP9J/India-unprepared-to-tackle-online-data-security-report.html">a rich trove of data, accessible via a simple Google search without the use of any extractive software or hacking skills</a> within a thriving shadow economy of data buyers and sellers makes India a particularly vulnerable digital economy, especially in the absence of robust legislation. The lack of awareness around digital crimes and low digital literacy further exacerbate the situation, given that datafication via government portals, e-commerce, and online apps has outpaced the enforcement of legislative frameworks for data protection and cybersecurity.</p>
<p dir="ltr" style="text-align: justify; ">In the context of Aadhaar data security issues, the term “data leak” has gained the most traction in media stories, followed by the term “security breach”. Given the complexity of the myriad ways in which Aadhaar data has been breached, terms such as <a href="https://techcrunch.com/2022/06/13/aadhaar-leak-pm-kisan/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAADvQXtC19Gj80LSKVc5jLwnRsREalvM2f6dV3N9KmCs8be6_1Zbvu3J6abPmBxhLlUooLiOjg4JktYDDCXr0OYYvOZ5XFlXa6DfCJk97TvMXM-cs3uJbCJBA-ePqvAC5K4qGZSyDB4OykMEOIKXJpB0CTOourPRc5dBxFFq5JXlB">data leak and exposure</a> (of <a href="https://zeenews.india.com/personal-finance/aadhaar-data-breach-over-110-crore-indian-farmers-aadhaar-card-data-compromised-2473666.html">11 crore Indian farmers’ sensitive information</a>) add specificity to the description of the data security compromise. The term “fraud” also makes a comeback in the context of <a href="https://www.business-standard.com/article/economy-policy/india-s-aadhaar-id-system-delivers-benefits-but-at-risk-of-widespread-fraud-122062400124_1.html">Aadhaar-related data security incidents</a>. These cases represent a mix of data frauds involving <a href="https://economictimes.indiatimes.com/news/india/alarm-over-fake-id-printing-websites-using-customer-data-for-cyber-fraud/articleshow/94742646.cms">fake identities</a> and <a href="https://indianexpress.com/article/cities/delhi/in-new-age-data-theft-fraudsters-steal-thumb-prints-from-land-registries-7914530/">theft of thumb prints</a>, for instance from land registries, as well as inadvertent data leaks in numerous incidents involving <a href="https://techcrunch.com/2019/01/31/aadhaar-data-leak/">government employees in Jharkhand</a>, <a href="https://www.firstpost.com/india/aadhaar-data-leak-details-of-7-82-cr-indians-from-ap-and-telangana-found-on-it-grids-database-6448961.html">voter ID information of Indian citizens in Andhra Pradesh and Telangana</a> and <a href="https://www.thehindu.com/sci-tech/technology/major-aadhaar-data-leak-plugged-french-security-researcher/article26584981.ece">activist reports of Indian government websites leaking Aadhaar data</a>.</p>
<p dir="ltr" style="text-align: justify; ">Aadhaar-related data security events parallel the increase in corporate data breaches during the decade of datafication. The term “data leak” again alternates with the term “data breach” in most media accounts while other terms such as “theft” and “scam” all but disappear in the media coverage of corporate data breaches.</p>
<p dir="ltr" style="text-align: justify; ">From 2016 onwards, incidents of corporate data breaches in India continued to rise. A massive <a href="https://thewire.in/banking/debit-card-breach-india-banking">debit card data breach involving the YES Bank ATMs and point-of-sale (PoS) machines </a>compromised through malware between May and July of 2016 resulted in the exposure of ATM PINs and non-personal identifiable information of customers. It went <a href="https://www.livemint.com/Industry/Ope7B0jpjoLkemwz6QXirN/SBI-Yes-Bank-MasterCard-deny-data-breach-of-own-systems.html">undetected for nearly three</a> months. Another data leak in 2018 concerned a <a href="https://www.zdnet.com/article/another-data-leak-hits-india-aadhaar-biometric-database/">system run by Indane, a state-owned utility company, which allowed anyone to download private information on all Aadhaar holders </a>including their names, the services they were connected to and the unique 12-digit Aadhaar number. Data breaches continued to be reported in India concurrently with the incidents of data mismanagement related to Aadhaar. Some <a href="https://www.csoonline.com/article/3541148/the-biggest-data-breaches-in-india.html">prominent data breaches</a> between 2019 and 2021 included a cyberattack on the systems of the airline data service provider SITA, resulting in the leak of Air India passenger data; the leakage of personal details of Common Admission Test (CAT) applicants; details of the credit card and order preferences of Domino’s pizza customers on the dark web; COVID-19 patients’ test results leaked by government websites; user data of Juspay and Big Basket put up for sale on the dark web; and an SBI data breach, among others.</p>
<p dir="ltr" style="text-align: justify; ">The media reportage of these data breaches uses the term “cyberattack” to describe the activities of hackers and cybercriminals operating within a<a href="https://www.thehindu.com/sci-tech/technology/internet/most-damaging-cybercrime-services-are-cheap-on-the-dark-web/article37004587.ece"> shadow economy or the dark web</a>. Recent examples of cyberattacks by hackers who leak user data for sale on the dark web include <a href="https://indianexpress.com/article/technology/tech-news-technology/mobikwik-database-leaked-on-dark-web-company-denies-any-data-breach-7251448/">8.2 terabytes of sensitive financial data (KYC details, Aadhaar, credit/debit card and phone numbers) of 110 million users of the payments app MobiKwik</a>, <a href="https://www.firstpost.com/tech/news-analysis/dominos-india-data-breach-name-location-mobile-number-email-of-18-crore-orders-up-for-sale-on-dark-web-9650591.html">180 million Domino’s pizza orders (name, location, emails, mobile numbers),</a> and <a href="https://techcrunch.com/2022/07/18/cleartrip-data-breach-dark-web/">Flipkart’s Cleartrip users’ data</a>. In these incidents again, three terms appear prominently in the media reportage - cyberattack, data breach, and leak. The term “data breach” remains the most frequently used in media coverage of data security lapses. While it alternates with the term “leak” in the stories, the term “data breach” appears consistently across most headlines in the news stories.</p>
<p dir="ltr">The exposure of sensitive, personal, and non-personal data by public and private entities in India is certainly a cause for concern, given the ongoing data protection legislative vacuum.</p>
<p dir="ltr" style="text-align: justify; ">The media coverage of data breaches tends to emphasize the quantum of compromised user data in addition to the types of data exposed. The media framing of these breaches in <a href="https://www.livemint.com/technology/tech-news/indian-firms-lost-176-million-to-data-breaches-last-fiscal-11658914231530.html">quantitative terms of financial loss</a> as well as the <a href="https://www.indiatoday.in/technology/news/story/personal-data-of-3-4-million-paytm-mall-users-reportedly-exposed-in-2020-data-breach-1980690-2022-07-27">magnitude</a> and the <a href="https://www.moneycontrol.com/news/business/banks/indian-banks-reported-248-data-breaches-in-last-four-years-says-government-8940891.html">number of breaches</a> certainly highlights the gravity of these incidents, but the harm to individual users is often not addressed.</p>
<h3 dir="ltr">Evolving Terminology and the Source of Data Harms</h3>
<p dir="ltr" style="text-align: justify; ">The main difference in the media reportage of the BPO cybersecurity incidents during the early aughts and the contemporary context of datafication is the usage of the term “data breach”, which figures prominently in contemporary reportage of data security incidents but hardly at all in coverage of the BPO-related cybercrimes.</p>
<p dir="ltr" style="text-align: justify; ">The BPO incidents of data theft and the attendant fraud must be understood in the context of the anxieties brought on by a globalizing world of Internet-enabled systems and transnational communications. In most of these incidents, regarded as cybercrimes, the language of fraud and scam goes further, attributing the illegal actions of identifiable malefactors to cultural factors such as a lack of ethics and professionalism. The usage of the term “data leak” in these media reports functions more specifically to underscore a broader lapse in data security as well as a lack of robust cybersecurity laws. The broader term “breach” is occasionally used to refer to these incidents, but the term “data breach” does not appear as such.</p>
<p dir="ltr" style="text-align: justify; ">The term “data breach” gains more prominence in media accounts from 2009 onwards in the context of Aadhaar and the online delivery of goods and services by public and private players. The term “data breach” is often used interchangeably with the term “leak” within the broader ambit of cyberattacks in the corporate sector. The media reportage frames Aadhaar-related security lapses as instances of security/data breaches, data leaks, fraud, and occasionally scam.</p>
<p dir="ltr" style="text-align: justify; ">In contrast to the handful of data security cases in the BPO sector, data breaches have abounded in the second decade of the twenty-first century. What further differentiates the BPO-related incidents from the contemporary data breaches is the source of the data security lapse. Most corporate data breaches remain attributable to the actions of hackers and cybercriminals, while the BPO security lapses were traceable back to ex-employees or insiders with access to sensitive data. We also see in the coverage of the BPO-related incidents the attribution of such data security lapses to cultural factors, including a lack of ethics and professionalism, often with racial overtones. The media reportage of the BBC and ABC sting operations suggests that the Indian BPOs’ lack of preparedness to handle and maintain the confidentiality of foreigners’ personal data points to the absence of a privacy culture in India. Interestingly, this transnational attribution recurs in a different form in the national debate on <a href="https://huffpost.netblogpro.com/archive/in/entry/indians-don-t-care-about-privacy-but-thankfully-the-law-will-teach-them-what-it-means_a_23179031">Aadhaar and how Indians don’t care about their privacy</a>.</p>
<p dir="ltr" style="text-align: justify; ">The question of the harms of data breaches to individuals is also an important one. In the discourse on contemporary data breaches, the actual material harm to an individual user is rarely established in media reportage; it is generally framed as potential harm that could be devastating given the sensitivity of the compromised data. The harm is reported to be predominantly a function of organizational cybersecurity weakness or attributed to hackers and cybercriminals.</p>
<p dir="ltr" style="text-align: justify; ">The reporting of harm in collective terms of the number of accounts breached, financial costs of a data breach, the sheer number of breaches and the global rankings of countries with the highest reported cases certainly suggests a problem with cybersecurity and the lack of organizational preparedness. However, this collective framing of a data breach’s impact usually elides an individual user’s experience of harm. Even in the case of Aadhaar-related breaches - a mix of leaking data on government websites and other online portals and breaches - the notion of harm owing to exposed data isn’t clearly established. This is, however, different from the <a href="https://scroll.in/article/1013700/six-types-of-problems-aadhaar-is-causing-and-safeguards-needed-immediately">extensively documented cases of Aadhaar-related issues</a> in which welfare benefits have been denied, identities stolen and legitimate beneficiaries erased from the system due to technological errors.</p>
<h3 dir="ltr">Future Directions of Research</h3>
<p dir="ltr" style="text-align: justify; ">This brief, qualitative foray into the media coverage of data breaches over two decades has aimed to trace the usage of various terms in two different contexts - the Indian BPO-related incidents and the contemporary context of datafication. It would be worth exploring at length the relationship between frequent reports of data breaches and the language used to convey harm, in the continued absence of concrete data protection legislation. It would also be instructive to examine more exhaustively the specific uses of terms such as “fraud”, “leak”, “scam”, “theft” and “breach” in media reporting of such data security incidents. Such analysis would elucidate how media reportage shapes public perception of the safety of user data and the anticipation of attendant harm as data protection legislation continues to evolve.</p>
<p dir="ltr" style="text-align: justify; ">Especially with Aadhaar, which represents a paradigm shift in identity verification through digital means, it would be useful to conduct a sentiment analysis of how biometric identity related frauds, scams, and leaks are reported by the mainstream news media. A study of user attitudes and behaviours in response to the specific terminology of data security lapses such as the terms “breach”, “leak”, “fraud”, “scam”, “cybercrime”, and “cyberattack” would further contribute to an understanding of how lay users perceive the gravity of a data security lapse. Such research would go beyond the expert understandings of data security incidents that tend to dominate media reportage, elucidating the concerns of lay users and further clarifying the cultural meanings of data privacy.</p>
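<p dir="ltr" style="text-align: justify; ">As a rough illustration of the term-tracing exercise proposed above, the following Python sketch counts how often each data-security term appears in a set of headlines. The headlines, term list, and function name here are hypothetical, chosen only to show the method; a real study would work from a systematically collected corpus and a richer lexicon.</p>

```python
from collections import Counter
import re

# Terms whose media usage the proposed study would track (illustrative list)
TERMS = ["breach", "leak", "fraud", "scam", "theft", "cyberattack"]

def term_frequencies(headlines, terms=TERMS):
    """Count occurrences of each data-security term across a list of headlines."""
    counts = Counter({t: 0 for t in terms})
    for headline in headlines:
        # Lowercase and tokenise into alphabetic words
        words = re.findall(r"[a-z]+", headline.lower())
        for t in terms:
            counts[t] += words.count(t)
    return dict(counts)

# Hypothetical headlines, for illustration only
sample = [
    "Massive data breach hits payments app",
    "Aadhaar data leak exposes farmer records",
    "Bank ATM fraud traced to malware",
]
print(term_frequencies(sample))
```

A fuller analysis along these lines could then compare term frequencies across outlets and time periods, which is the kind of evidence the qualitative observations above gesture towards.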
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india'>http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india</a>
</p>
Pawan Singh · Privacy · Internet Governance · Data Governance · Data Protection · Data Management · 2022-10-17 · Blog Entry
NHA Data Sharing Guidelines – Yet Another Policy in the Absence of a Data Protection Act
http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines
<b>In July this year, the National Health Authority (NHA) released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) just two months after publishing the draft Health Data Management Policy.</b>
<p>Reviewed and edited by Anubha Sinha</p>
<hr />
<p style="text-align: justify; ">Launched in 2018, PM-JAY is a public health insurance scheme set to cover 10 crore poor and vulnerable families across the country for secondary and tertiary care hospitalisation. Eligible candidates can use the scheme to avail of cashless benefits at any public/private hospital falling under this scheme. Considering the scale and sensitivity of the data, the creation of a well-thought-out data-sharing document is a much-needed step. However, the document – though only a draft – has certain portions that need to be reconsidered, including parts that are not aligned with other healthcare policy documents. In addition, the guidelines should be able to work in tandem with the Personal Data Protection Act whenever it comes into force. With no prior intimation of the publication of the guidelines, and the provision of a mere 10 days for consultation, there was very little scope for stakeholders to submit their comments and participate in the consultation. While the guidelines pertain to the PM-JAY scheme, it is an important document to understand the government’s concerns and stance on the sharing of health data, especially by insurance companies.</p>
<h3 style="text-align: justify; ">Definitions: Ambiguous and incompatible with similar policy documents</h3>
<p style="text-align: justify; ">The draft guidelines add to the list of health data–related policies that have been published since the beginning of the pandemic. These include three draft health data management policies published within two years, which have already covered the sharing and management of health data. The draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; in this case, the guidelines fail to refer to the draft National Digital Health Data Management Policy (published in April 2022). To add to this, the document – by placing the definitions at the end – is difficult to read and understand, especially when terms such as ‘beneficiary’, ‘data principal’, and ‘individual’ are used interchangeably. In the same vein, the document uses the terms ‘data principal’ and ‘data fiduciary’, and the definitions of health data and personal data, from the 2019 PDP Bill, while also referring to the IT Act SPDI Rules and their definition of ‘sensitive personal data’. While the guidelines state that the IT Act and Rules will be the legislation to refer to for these guidelines, it is to be noted that the SPDI Rules under the IT Act cover ‘body corporates’, which, under Section 43A(1), is defined as “any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;”. It is difficult to assign responsibility and accountability to the organisations under the guidelines when they might not even be covered under this definition.</p>
<p style="text-align: justify; ">With each new policy, civil society organisations have been pointing out the need to have a data protection act before introducing policies and guidelines that deal with the processing and sharing of the data of individuals. Ideally, these policies – even in draft form – should have been published after the Personal Data Protection Bill was enacted, to ensure consistency with the provisions of the law. For example, the guidelines introduce a new category of governance mechanisms under the data-sharing committee headed by a data-sharing officer (DSO). The responsibilities and powers of the DSO are similar to those of the data protection officer under the draft PDP Bill as well as the draft National Digital Health Data Management Policy. This, in turn, raises the question of whether the DSO and the DPOs under both the PDP Bill and the draft health data management policy will have the same responsibilities. Clarity in terms of which of the policies are in force and how they intersect is needed to ensure a smooth implementation. Ideally, having multiple sources of definitions should be addressed at the drafting stage itself.</p>
<h3 style="text-align: justify; ">Guiding Principles: Need to look beyond privacy</h3>
<p style="text-align: justify; ">The guidelines enumerate certain principles to govern the use, collection, processing, and transmission of the personal or sensitive personal data of beneficiaries. These principles are accountability, privacy by design, choice and consent, openness/transparency, etc. While these provisions are much needed, the accompanying explanations at times fail to convey why these principles were added. For example, in the case of accountability, the guidelines state that the ‘data fiduciary’ shall be accountable for complying with measures based on the guiding principles. However, it does not specify who the fiduciaries would be accountable to and what the steps are to ensure accountability. Similarly, in the case of openness and transparency, the guidelines state that the policies and practices relating to the management of personal data will be available to all stakeholders. However, openness and transparency need to go beyond policies and practices and should consider other aspects of openness, including open data and the use of open-source software and open standards. This again will add to transparency, in that it would specify the rights of the data principal, as the current draft looks at the rights of the data principal merely from a privacy perspective. In the case of purpose limitation as well, the guidelines are tied to the privacy notice, which again puts the burden on the individual (in this case, the beneficiary) when the onus should actually be on the data fiduciary. Lastly, under the empowerment of beneficiaries, the guidelines state that the “data principal shall be able to seek correction, amendments, or deletion of such data where it is inaccurate;”. The right to deletion should not be conditional on inaccuracy, especially when entering the scheme is optional and consent-based.</p>
<h3 style="text-align: justify; ">Data sharing with third parties without adequate safeguards</h3>
<p style="text-align: justify; ">The guidelines outline certain cases where personal data can be collected, used, or disclosed without the consent of the individual. One of these cases is when the data is anonymised. However, the guidelines do not detail how this anonymisation would be achieved and ensured through the life cycle of the data, especially when the clause states that the data will also be collected without consent. The guidelines also state that the anonymised data could be used for public health management, clinical research, or academic research. The guidelines should have limited the scope of academic research or added certain criteria to gain access to the data; the use of vague terminology could lead to this data (sometimes collected without consent) being de-anonymised or used for studies that could cause harm to the data principal or even a particular community. The guidelines state that the data can be shared as ‘protected health information’ with a government agency for oversight activities authorised by law, epidemic control, or in response to court orders. With the sharing of data, care should be taken to ensure data minimisation and purpose limitations that go beyond the explanations added in the body of the guidelines. In addition, the guidelines also introduce the concept of a ‘clean room’, which is defined as “a secure sandboxed area with access controls, where aggregated and anonymised or de-identified data may be shared for the purposes of developing inference or training models”. The definition does not state who will be developing these training models; it could be a cause of worry if AI companies or even insurance companies have the potential to use this data to train models that could eventually make decisions based on the results. 
The term ‘sandbox’ is explained under the now-revoked DP Bill 2021 as “such live testing of new products or services in a controlled or test regulatory environment for which the Authority may or may not permit certain regulatory relaxations for a specified period for the limited purpose of the testing”. Neither the 2019 Bill nor the IT Act/Rules defines ‘sandbox’; ideally, the guidelines should have spent more time explaining how the sandbox system in the ‘clean room’ works.</p>
<h3 style="text-align: justify; ">Conclusion</h3>
<p style="text-align: justify; ">The draft Data Sharing Guidelines are a welcome step in ensuring that the entities sharing and processing data have guidelines to adhere to, especially since the Data Protection Bill has not been passed yet. The mention of the best practices for data sharing in annexures, including practices for people who have access to the data, is a step in the right direction, which could be made better with regular training and sensitisation. While the guidelines are a good starting point, they still suffer from the issues that have been highlighted in similar health data policies, including not referring to older policies, adding new entities, and the reliance on digital and mobile technology. The guidelines could have added more nuance to the consent and privacy by design sections to ensure other forms of notice, e.g., notice in audio form in different Indian languages. While PM-JAY aims to reach 10 crore poor and vulnerable families, there is a need to look at how to ensure that consent is given according to the guidelines that are “free, informed, clear, and specific”.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines'>http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines</a>
</p>
Shweta Mohandas and Pallavi Bedi · IT Act · Internet Governance · Data Protection · Privacy · 2022-09-29 · Blog Entry
CCTVs in Public Spaces and the Data Protection Bill, 2021
http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021
<b>This article has been authored by Ms. Anamika Kundu, Research Assistant at the Centre for Internet and Society, and Digvijay S. Chaudhary, Researcher at the Centre for Internet and Society. This blog is a part of RSRR’s Blog Series on the Right to Privacy and the Legality of Surveillance, in collaboration with the Centre for Internet & Society.</b>
<p><span>The article by Anamika Kundu and Digvijay S. Chaudhary was originally </span><a class="external-link" href="https://rsrr.in/2022/04/20/cctv-surveillance-privacy/">published by RGNUL Student Research Review</a><span> on April 20, 2022</span></p>
<p><span><img src="http://editors.cis-india.org/home-images/Surveillance.jpg/@@images/f8fad564-44ab-46e2-bd44-29607ea7fd19.jpeg" alt="Surveillance" class="image-inline" title="Surveillance" /></span></p>
<hr />
<h2>Introduction</h2>
<p style="text-align: justify; ">In recent times, Indian cities have seen an expansion of state-deployed CCTV cameras. According to a recent report, Delhi was considered the most surveilled city in the world in terms of CCTVs deployed, surpassing even the most surveilled cities in China. Delhi was not the only Indian city on that list; Chennai and Mumbai also featured. In Hyderabad as well, the development of a Command and Control Centre aims to link the city’s surveillance infrastructure in real-time. Even though studies have shown that there is little correlation between CCTVs and crime control, the deployment of CCTV cameras has been justified on the basis of national security and crime deterrence. Such an activity brings about the collection and retention of audio-visual/visual information of all individuals frequenting spaces where CCTV cameras are deployed. This information could be used to identify them (directly or indirectly) based on their looks or other attributes. Potential risks associated with the misuse and processing of such personal data also arise. These risks include large-scale profiling, criminal abuse (law enforcement misusing CCTV information for personal gains), and discriminatory targeting (law enforcement disproportionately focusing on a particular group of people). As these devices capture the personal data of individuals, this article examines the data protection safeguards available to data principals against CCTV surveillance employed by the State in a public space under the proposed Data Protection Bill, 2021 (the “DPB”).</p>
<h2>Safeguards Available Under the Data Protection Bill, 2021</h2>
<p style="text-align: justify; ">To use CCTV surveillance, the measures and compliance listed under the DPB have to be followed. Obligations of data fiduciaries available under Chapter II, such as consent (clause 11), notice requirement (clause 7), and fair and reasonable processing (clause 5) are common to all data processing entities for a variety of activities. Similarly, as the DPB follows the principles of data minimisation (clause 6), storage limitation (clause 9), purpose limitation (clause 5), lawful and fair processing (clause 4), transparency (clause 23), and privacy by design (clause 22), these safeguards too are common to all data processing entities/activities. If a data fiduciary processes personal data of children, it has to comply with the standards stated under clause 16.</p>
<p style="text-align: justify; ">Under the DPB, compliance differs on the basis of grounds and purpose of data processing. As such, if compliance standards differ, so do the availability of safeguards under the DPB. Of relevance to this article, there are three standards of compliance under the DPB wherein the standards of safeguards available to a data principal differ. First, cases which would fall under Chapter III and hence, not require consent. Chapter III lists grounds for processing of personal data without consent. Second, cases which would fall under exemption clauses in Chapter VIII. In such cases, the DPB or some of its provisions would be inapplicable. Clause 35 under Chapter VIII gives power to the Central Government to exempt any agency from the application of the DPB. Similarly, Clause 36 under Chapter VIII, exempts certain provisions for certain processing of personal data. Third, cases which would not fall under either of the above Chapters. In such cases, all safeguards available under the DPB would be available to the data principals. Consequently, safeguards available to data principals in each of these standards are different. We will go through each of these separately.</p>
<p style="text-align: justify; ">First, if the grounds of processing of CCTV information are such that they fall under the scope of Chapter III of the DPB, wherein the consent requirement is done away with, then in those cases, the notice requirement has to reflect such purpose, meaning that even if consent is not necessary for certain cases, other requirements under the DPB would still apply. Here, we must note that CCTV deployment by the state on such a large scale may be justified on the basis of conditions stated under clauses 12 and 14 of the DPB – specifically, the condition for the performance of a state function authorised by law, and public interest. The requirement under clause 12 of “authorised by law” simply means that the state function should have legal backing. Deployment of CCTVs is most likely to fall under clause 12 as various states have enacted legislations providing for CCTV deployment in the name of public safety. As a result, even if clause 12 takes away the requirement of consent for certain cases, data principals should be able to exercise all rights accorded to them under the DPB (Chapter V) except the right to data portability under clause 19.</p>
<p style="text-align: justify; ">Second, processing of personal data via CCTVs by government agencies could be exempted from the DPB under clause 35 for certain cases under the clause. Another exemption that is particularly concerning with regard to the use of CCTVs is the exemption provided under clause 36(a). Clause 36(a) says that the provisions of Chapters II-VII would not apply where the data is processed in the interest of prevention, detection, investigation, and prosecution of any offence under the law. Chapters II-VII govern the obligations of data fiduciaries, grounds where consent would not be required, personal data of children, rights of data principals, transparency and accountability measures, and restrictions on transfer of personal data outside India respectively. In these cases, the requirement of fair and reasonable processing under clause 5 would also not apply. As a broad justification provided for CCTV deployment by the government is crime control, it is possible that the clause 36(a) justification can be used to exempt the processing of CCTV footage from the above-mentioned safeguards.</p>
<p style="text-align: justify; ">From the above discussion, the following can be concluded. First, if the grounds of processing fall under Chapter III, then standards of fair and reasonable processing, notice requirement, and all rights except the right to data portability u/s 19 would be available to data principals. Second, if the grounds of processing fall under clause 36, then, in that case, consent requirement, notice requirement, and the rights under DPB would be unavailable as that section mandates the non-application of those chapters. In such a case, even the processing requirements of a fair and reasonable manner stand suspended. Third, if the grounds of processing of CCTV information doesn’t fall under Chapter III, then all obligations listed under Chapter II would have to be followed. Moreover, the data principal would be able to exercise all the rights available under Chapter V of the DPB.</p>
<h2>Constitutional Standards</h2>
<p style="text-align: justify; ">When the Supreme Court recognised privacy as a fundamental right in the case of Puttaswamy v. Union of India (“Puttaswamy”), it located the principles of informed consent and purpose limitation as central to informational privacy. It recognised that privacy inheres not in spaces but in an individual. It also recognised that privacy is not an absolute right and certain restrictions may be imposed on the exercise of the right. Before listing the constitutional standards that activities infringing privacy must adhere to, it’s important to answer whether there exists a reasonable expectation of privacy in CCTV footage deployed in a public space by the State?</p>
<p style="text-align: justify; ">In Puttaswamy, the court recognised that privacy is not denuded in public spaces. Writing for the plurality judgement, Chandrachud J. recognised that the notion of a reasonable expectation of privacy has elements both of a subjective and objective nature. Defining these concepts, he writes, “Privacy at a subjective level is a reflection of those areas where an individual desire to be left alone. On an objective plane, privacy is defined by those constitutional values which shape the content of the protected zone where the individual ought to be left alone…hence while the individual is entitled to a zone of privacy, its extent is based not only on the subjective expectation of the individual but on an objective principle which defines a reasonable expectation.” Note how in the above sentences, the plurality judgement recognises “a reasonable expectation” to be inherent in “constitutional values”. This is important as the meaning of what’s reasonable is to be constituted according to constitutional values and not societal norms. A second consideration that the phrase “reasonable expectation of privacy” requires is that an individual’s reasonable expectation is allied to the purpose for which the information is provided, as held in the case of Hyderabad v. Canara Bank (“Canara Bank”). Finally, the third consideration in defining the phrase is that it is context dependent. For example, in the case of In the matter of an application by JR38 for Judicial Review (Northern Ireland) 242 (2015) (link here), the UK Supreme Court was faced with a scenario where the police published the CCTV footage of the appellant involved in riotous behaviour. 
The question before the court was: “Whether the publication of photographs by the police to identify a young person suspected of being involved in riotous behaviour and attempted criminal damage can ever be a necessary and proportionate interference with that person’s article 8 [privacy] rights?” The majority held that there was no reasonable expectation of privacy in the case because of the nature of the criminal activity the appellant was involved in. However, the majority’s formulation of this conclusion was based on the reasoning that “expectation of privacy” was dependent on the “identification” purpose of the police. The court stated, “Thus, if the photographs had been published for some reason other than identification, the position would have been different and might well have engaged his rights to respect for his private life within article 8.1”. Therefore, as the purpose of publishing the footage was “identification” of the wrongdoer, the reasonable expectation of privacy stood excluded. The Canara Bank case was relied on by the SC in Puttaswamy. The plurality judgement in Puttaswamy also quoted the above paragraphs from the UK Supreme Court judgement.</p>
<p style="text-align: justify; ">Finally, the SC in the Aadhaar case, laid down the factors of “reasonable expectation of privacy.” Relying on those factors, the Supreme Court observed that demographic information and photographs do not raise a reasonable expectation of privacy. It further held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. As this author has recognised, the majority in the Aadhaar case misconstrued the “reasonable expectation of privacy” to lie not in constitutional values as held in Puttaswamy but in societal norms. Even with the misapplication of the Puttaswamy principles by the majority in Aadhaar, it is clear that the exclusion of a “reasonable expectation of privacy” in face photographs is valid only for the purpose of “identification”. For purposes other than “identification”, there should exist a reasonable expectation of privacy in CCTV footage. Having recognised the existence of “reasonable expectation of privacy” in CCTV footage, let’s see how the safeguards mentioned under the DPB stand the constitutional standards of privacy laid down in Puttaswamy.</p>
<p style="text-align: justify; ">The bench in Puttaswamy located privacy not only in Article 21 but the entirety of part III of the Indian Constitution. Where transgression to privacy relates to different provisions under Part III, the tests evolved under those Articles would apply. Puttaswamy recognised that national security and crime control are legitimate state objectives. However, it also recognised that any limitation on the right must satisfy the proportionality test. The proportionality test requires a legitimate state aim, rational nexus, necessity, and balancing of interests. Infringement on the right to privacy occurs under the first and second standard. The first requirement of proportionality stands justified as national security and crime control have been recognised to be legitimate state objectives. However, it must be noted that the EU Guidelines on Processing of Personal Data through video devices state that the mere purpose of “safety” or “for your safety” is not sufficiently specific and is contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject. The second requirement is a rational nexus. As stated above, there is little correlation between crime control and surveillance measures. Even if the state justifies a rational nexus between state aim and the action employed, it is the necessity part of the proportionality test where the CCTV surveillance measures fail (as explained by this author). Necessity requires us to draw a list of alternatives and their impact on an individual, and then do a balancing analysis with regard to the alternatives. Here, judicial scrutiny of the exemption order under clause 35 is a viable alternative that respects individual rights while at the same time, not interfering with the state’s aim.</p>
<h2>Conclusion</h2>
<p style="text-align: justify; ">Informed consent and purpose limitation were stated to be central principles of informational privacy in Puttaswamy. Among the three standards we identified, the principles of informed consent and purpose limitation remain available only in the third standard. In the first standard, even though the requirement of consent has become unavailable, the principle of purpose limitation would still be applicable to the processing of such data. The second standard is of particular concern wherein neither of those principles is available to data principals. It is worth mentioning here that in large scale monitoring activities such as CCTV surveillance, the safeguards which the DPB lists out would inevitably have an implementation flaw. The reason is that in scenarios where individuals refuse consent for large scale CCTV monitoring, what alternatives would the government offer to those individuals? Practically, CCTV surveillance would fall under clause 12 standards where consent would not be required. Even in those cases, would the notice requirement safeguard be diminished to “you are under surveillance” notices? When we talk about exercise of rights available under the DPB, how would an individual effectively exercise their right when the data processing is not limited to a particular individual? These questions arise because the safeguards under the DPB (and data protection laws in general) are based on individualistic notions of privacy. Interestingly, individual use cases of CCTVs have also increased with an increase in state use of CCTVs. Deployment of CCTVs for personal or domestic purposes would be exempt from the above-mentioned compliances as that would fall under the exemption provision of clause 36(d). 
Two additional concerns arise in relation to the processing of data concerning CCTVs – the JPC report’s inclusion of Non-Personal Data (“NPD”) within the ambit of the DPB, and the government’s plan to develop a National Automated Facial Recognition System (“AFRS”). A significant part of the data collected by CCTVs would fall within the ambit of NPD. With the JPC’s recommendation, it will be interesting to follow the processing standards for NPD under the DPB. The AFRS has been imagined as a national database of photographs gathered from various agencies, to be used in conjunction with facial recognition technology. The use of facial recognition technology with CCTV cameras raises concerns surrounding biometric data and risks of large-scale profiling. Indeed, clause 27 of the DPB reflects this risk and mandates a data protection impact assessment by the data fiduciary for processing involving new technologies, large-scale profiling, or the use of biometric data by such technologies; however, the DPB does not define what “new technology” means. Concerns around biometric data are outside the scope of the present article; nevertheless, it would be interesting to examine how the use of facial recognition technology with CCTVs could affect the safeguards under the DPB.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021'>http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021</a>
</p>
Anamika Kundu and Digvijay S Chaudhary | Internet Governance | Data Protection | Privacy | 2022-04-28T02:29:42Z | Blog Entry
Personal Data Protection Bill must examine data collection practices that emerged during pandemic
http://editors.cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic
<b>The PDP Bill is speculated to be introduced during the winter session of Parliament soon. In its current form, the Bill provides wide-ranging exemptions which allow government agencies to process citizens’ data in order to fulfil their responsibilities. The Bill could also ensure that employers have some responsibility towards the data they collect from employees.</b>
<p>The article by Shweta Mohandas and Anamika Kundu was <a class="external-link" href="https://www.news9live.com/technology/personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic-137031?infinitescroll=1">originally published by <strong>news nine</strong></a> on November 29, 2021.</p>
<hr />
<p style="text-align: justify; ">The Personal Data Protection Bill (PDP) is speculated to be introduced during the winter session of the parliament soon, and the report of the Joint Parliamentary Committee (JPC) has already been <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">adopted</a> by the committee on Monday. The Report of the JPC comes after almost two years of deliberation and secrecy over how the final version of the Personal Data Protection Bill will be. Since the publication of the <a class="external-link" href="https://prsindia.org/files/bills_acts/bills_parliament/2019/Personal%20Data%20Protection%20Bill,%202019.pdf">2019 version</a> of the PDP Bill, the Covid 19 pandemic and the public safety measures have opened the way for a number of new organisations and reasons to collect personal data that was non-existent in 2019. Hence along with changes that have been suggested by multiple civil society organisations, the dissent notes submitted by the members of the JPC, the new version of the PDP Bill must also look at how data processing has changed over the span of two years.</p>
<h3 style="text-align: justify; ">Concerns with the bill</h3>
<p style="text-align: justify; ">At the outset there are certain parts of the PDP Bill which need to be revised in order to uphold the spirit of privacy and individual autonomy laid out in the Puttaswamy judgement. The two sections that need to be in line with the privacy judgement are the ones that allow for non consensual processing of data by the government, and by employers. The PDP Bill in its current form provides wide-ranging exemptions which allow government agencies to process citizen's data in order to fulfil its <a class="external-link" href="https://www.livemint.com/news/india/big-brother-on-top-in-data-protection-bill-11576164271430.html">responsibilities</a>.</p>
<p style="text-align: justify; ">In the <a class="external-link" href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf">2018 version</a> of bill, drafted by the Justice Srikrishna Committee exemptions granted to the State with regard to processing of data was subject to a four pronged test which required the processing to be (i) authorised by law; (ii) in accordance with the procedure laid down by the law; (iii) necessary; and (iv) proportionate to the interests being achieved. This four pronged test was in line with the principles laid down by the Supreme Court in the Puttaswamy judgement. The 2019 version of the PDP Bill has diluted this principle by merely retaining the 'necessity principle' and removing the other requirements which is not in consonance with the test laid down by the Supreme Court in Puttaswamy.</p>
<p style="text-align: justify; ">Section 35 was also widely discussed in the panel meetings where members had <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">argued</a> the removal of 'public order' as a ground for exemption. The panel also insisted for '<a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">judicial or parliamentary oversight</a>' to grant such exemptions. The final report did not accept these suggestions stating a need to balance <a class="external-link" href="https://www.thehindu.com/news/national/parliamentary-panel-retains-controversial-exemption-clause-in-personal-data-protection-bill/article37633344.ece">national security, liberty and privacy</a> of an individual. There ought to be prior judicial review of the written order exempting the governmental agency from any provisions of the bill. Allowing the government to claim an exemption if it is satisfied to be "necessary or expedient" can be misused.</p>
<p style="text-align: justify; ">Another clause which gives the data principal a wide berth is with respect to employee data Section 13 of the current version of the bill provides the employer with a leeway into processing employee data (other than sensitive personal data) without consent based on two grounds: when consent is not appropriate, or when obtaining consent would involve disproportionate effort on the part of the employer.</p>
<p style="text-align: justify; ">The personal data so collected can only be collected for recruitment, termination, attendance, provision of any service or benefit, and assessing performance. This covers almost all of the activities that require data of the employee. Although the 2019 version of the bill excludes non-consensual collection of sensitive personal data (a provision that was missing in the 2018 version of the bill), there is still a lot of scope to improve this provision and provide employees further right to their data. At the outset the bill does not define employee and employer, which could result in confusion as there is no one definition of these terms across Indian Labour Laws.</p>
<p style="text-align: justify; ">Additionally, the bill distinguishes between employee and consumer, where the consumer of the same company or service has a greater right to their data than an employee. In the sense that the consumer as a data principal has the option to use any other product or service and also has the right to withdraw consent at any time, in the case of an employee the consequence of refusing consent or withdrawing consent would be being terminated from the employment. It is understood that there is a requirement for employee data to be collected, and that consent does not work the same way as it does in the case of a consumer.</p>
<p style="text-align: justify; ">The bill could ensure that employers have some responsibility towards the data they collect from the employees, such as ensuring that they are only used for the purpose for which they were collected, the employee knows how long their data will be retained, and know if the data is being processed by third parties. It is also worth mentioning that the Indian government is India's largest employer spanning a variety of agencies and public enterprises.</p>
<h3 style="text-align: justify; ">Concerns highlighted by JPC Members</h3>
<p style="text-align: justify; ">Going back to the few members of the JPC who have moved dissent notes, specifically with regard to governmental exemptions. Jairam Ramesh filed a <a href="https://www.news9live.com/india/parliament-panel-adopts-report-on-data-protection-amid-dissent-by-opposition-135591">dissent note</a>, to which many other opposition members followed suit. While Jairam Ramesh praised the JPC's functioning, he disagreed with certain aspects of the Report. According to him, the 2019 bill is designed in a manner where the right to privacy is given importance only in cases of private activities. He raised concerns regarding the unbridled powers given to the government to exempt itself from any of the provisions.</p>
<p style="text-align: justify; ">The amendment suggested by him would require parliamentary approval before exemption would take place. He also added that Section 12 of the bill which provided certain scenarios where consent was not needed for processing of personal data should have been made '<a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html">less sweeping</a>'. Similarly, Gaurav Gogoi's <a href="https://www.hindustantimes.com/india-news/mps-file-dissent-notes-over-glaring-lacunae-in-report-on-data-protection-bill-101637566365637.html">note</a> stated that the exemptions would create a surveillance state and similarly criticised Section 12 and 35 of the bill. He also mentioned that there ought to be parliamentary oversight for the exemptions provided in the bill.</p>
<p style="text-align: justify; ">On the same issue, Congress leader Manish Tiwari noted that the bill creates '<a href="https://timesofindia.indiatimes.com/business/india-business/personal-data-protection-bill-what-is-it-and-why-is-the-opposition-so-unhappy-with-it/articleshow/87869391.cms">parallel universes</a>' - one for the private sector which needs to be compliant and the other for the State which can exempt itself. He has opposed the entire bill stating there exists an "inherent design flaw". He has raised specific objections to 37 clauses and stated that any blanket exemptions to the state goes against the Puttaswamy Judgement.</p>
<p style="text-align: justify; ">In their joint <a href="https://www.news9live.com/india/tmc-congress-mps-submit-dissent-notes-to-joint-panel-on-personal-data-protection-bill-135491">dissent note</a>, Derek O'Brien and Mahua Mitra have said that there is a lack of adequate safeguards to protect the data principals' privacy and the lack of time and opportunity for stakeholder consultations. They have also pointed out that the independence of the DPA will cease to exist with the present provision of allowing the government powers to choose members and the chairman. Amar Patnaik is to object to the lack of inclusion of state level authorities in the bill. Without such bodies, he says, there would be federal override.</p>
<h3 style="text-align: justify; ">Conclusion</h3>
<p style="text-align: justify; ">While a number of issues were highlighted by civil society, the members of the JPC, and the media, the new version of the bill should also need to take into account the shifts that have taken place in view of the pandemic. The new version of the data protection bill should take into consideration the changes and new data collection practices that have emerged during the pandemic, be comprehensive and leave very little provisions to be decided later by the Rules.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic'>http://editors.cis-india.org/internet-governance/blog/news-nine-shweta-mohandas-and-anamika-kundu-personal-data-protection-bill-must-examine-data-collection-practices-that-emerged-during-pandemic</a>
</p>
Shweta Mohandas and Anamika Kundu | Internet Governance | Data Protection | Privacy | 2022-03-30T15:15:21Z | Blog Entry
Nothing to Kid About – Children's Data Under the New Data Protection Bill
http://editors.cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill
<b>The pandemic has forced policymakers to adapt their approach to people's changing practices, from looking at contactless ways of payment to the shifting of educational institutions online.</b>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; ">The article was originally <a class="external-link" href="https://www.ijlt.in/post/nothing-to-kid-about-children-s-data-under-the-new-data-protection-bill">published in the Indian Journal of Law and Technology</a></p>
<hr />
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; ">For children, the internet has shifted from being a form of entertainment to a medium to connect with friends and seek knowledge and education. However, each time they access the internet, data about them and their choices are inadvertently recorded by companies and unknown third parties. The growth of EdTech apps in India has led to growing concerns regarding children's data privacy. This has led to the creation of a <a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank">self-regulatory</a> body, the Indian EdTech Consortium. More recently, the <a class="_1lsz7 _3Bkfb" href="https://economictimes.indiatimes.com/tech/startups/edtech-firms-work-to-get-communication-right-with-the-asci/articleshow/89082308.cms" rel="noopener noreferrer" target="_blank">Advertising Standard Council of India</a><span class="_3zM-5"> has </span>also started looking at passing a draft regulation to keep a check on EdTech advertisements.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; ">The Joint Parliamentary Committee (JPC), tasked with drafting and revising the Data Protection Bill, had to consider the number of changes that had happened after the release of the 2019 version of the Bill. While the most significant change was the removal of the term “personal data” from the title of the Bill, in a move to create a comprehensive Data Protection Bill that includes both personal and non personal data. Certain other provisions of the Bill also featured additions and removals. The JPC, in its revised version of the Bill has removed an entire class of <a class="_1lsz7 _3Bkfb" href="https://prsindia.org/billtrack/the-personal-data-protection-bill-2019#:~:text=Obligations%20of%20data%20fiduciary%3A%20A,specific%2C%20clear%20and%20lawful%20purpose" rel="noopener noreferrer" target="_blank">data fiduciaries</a> – guardian data fiduciary – which was tasked with greater responsibility for managing children's data. While the JPC justified the removal of the guardian data fiduciary stating that consent from the guardian of the child is enough to meet the end for which personal data of children are processed by the data fiduciary. While thought has been given to looking at how consent is given by the guardian on behalf of the child, there was no change in the age of children in the Bill. Keeping the age of consent under the Bill as the same as the age of majority to enter into a contract under the 1872 Indian Contract Act – 18 years – reveals the disconnect the law has with the ground reality of how children interact with the internet.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; ">In the current state of affairs where Indian children are navigating the digital world on their own there is a need to look deeply at the processing of children’s data as well as ways to ensure that children have information about consent and informational privacy. By placing the onus of granting consent on parents, the PDP Bill fails to look at how consent works in a privacy policy–based consent model and how this, in turn, harms children in the long run.</p>
<h3 class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d aujbK _3M0Fe _1FoOD iWv3d _1j-51 mm8Nw">1. Age of Consent</h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 iWv3d b+iTF _78FBa _1FoOD iWv3d _1j-51 mm8Nw" style="text-align: justify; ">By setting the age of consent as 18 years under the Data Protection Bill, 2021, it brings all individuals under 18 years of age under one umbrella without making a distinction between the internet usage of a 5-year-old child and a 16-year-old teenager. There is a need to look at the current internet usage habits of children and assess whether requiring parental consent is reasonable or even practical. It is also pertinent to note that the law in the offline world does make the distinction between age and maturity. For example, it has been <a class="_1lsz7 _3Bkfb" href="https://cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill" rel="noopener noreferrer" target="_blank">highlighted</a> that Section 82 of the Indian Penal Code, read with Section 83, states that any act by a child under the age of 12 years shall not be considered an offence, while the maturity of those aged between 12–18 years will be decided by the court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industries, which would qualify them to fall under Section 13 of the Bill, which deals with employee data.</p>
<p style="text-align: justify; "><span>A 2019 </span><a class="_1lsz7 _3Bkfb" href="https://reverieinc.com/wp-content/uploads/2020/09/IAMAI-Digital-in-India-2019-Round-2-Report.pdf" rel="noopener noreferrer" target="_blank">report</a><span> suggests that two-thirds of India’s internet users are in the 12–29 years age group, accounting for about 21.5% of the total internet usage in metro cities. With the emergence of cheaper phones equipped with faster processing and low internet data costs, children are no longer passive consumers of the internet. They have social media accounts and use several applications to interact with others and make purchases. There is a need to examine how children and teenagers interact with the internet as well as the practicality of requiring parental consent for the usage of applications.</span></p>
<p style="text-align: justify; "><span>Most applications that require age data request users to type in their date of birth; it is not difficult for a child to input a suitable date that would make it appear that they are </span><a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank">over 18</a><span>. In this case they are still children but the content that will be presented to them would be those that are meant for adults including content that might be disturbing or those involving use of </span><a class="_1lsz7 _3Bkfb" href="https://www.theguardian.com/media/2013/jul/26/children-lie-age-facebook-asa" rel="noopener noreferrer" target="_blank">alcohol and gambling. </a><span>Additionally, in their privacy policies, applications sometimes state that they are not suited for and restricted from users under 18. Here, data fiduciaries avoid liability by placing the onus on the user to declare their age and properly read and understand the privacy policy.</span></p>
<p style="text-align: justify; "><span>Reservations about the age of consent under the Bill have also been highlighted by some members of the JPC through their dissenting opinions. </span><a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank">MP Ritesh Pandey </a><span>suggested that the age of consent should be reduced to 14 years keeping the best interest of the children in mind as well as to support children in benefiting from technological advances. Similarly, </span><a class="_1lsz7 _3Bkfb" href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf#page=221" rel="noopener noreferrer" target="_blank">MP Manish Tiwari </a><span>in his dissenting opinion suggested regulating data fiduciaries based on the type of content they provide or data they collect.</span></p>
<h3>2. How is the 2021 Bill Different from the 2019 Bill?</h3>
<p style="text-align: justify; "><span>The </span><a class="_1lsz7 _3Bkfb" href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf" rel="noopener noreferrer" target="_blank">2019 </a><span>draft of the Bill consisted of a class of data fiduciaries called guardian data fiduciaries – entities that operate commercial websites or online services directed at children or which process large volumes of children’s personal data. This class of fiduciaries was barred from profiling, tracking, behavioural monitoring, and running targeted advertising directed at children and undertaking any other processing of personal data that can cause significant harm to the child. In the previous draft, such data fiduciaries were not allowed to engage in ‘profiling, tracking, behavioural monitoring of children, or direct targeted advertising at children’. There was also a prohibition on conducting any activities that might significantly harm the child. As per Chapter IV, any violation could attract a penalty of up to INR 15 crore of the worldwide turnover of the data fiduciary for the preceding financial year, whichever is higher. However, this separate class of data fiduciaries do not have any additional responsibilities. It is also unclear as to whether a data fiduciary that does not by definition fall within such a category would be allowed to engage in activities that could cause ‘significant harm’ to children.</span></p>
<p style="text-align: justify; "><span>The new Bill also does not provide any mechanism for age verification; it only lays down considerations that verification processes should take into account. Furthermore, the JPC has suggested that the consent options available to a child on attaining the age of majority, i.e. 18 years, should be included within the rules framed by the Data Protection Authority rather than through an amendment to the Bill.</span></p>
<h3><span>3. In the Absence of a Guardian Data Fiduciary</span></h3>
<p style="text-align: justify; "><span>The 2018 and 2019 drafts of the PDP Bill consider a child to be any person below the age of 18 years. For a child to access online services, the data fiduciary must first verify the age of the child and obtain consent from their guardian. The Bill does not provide an explicit process for age verification, apart from stating that regulations shall be drafted in this regard; the 2019 Bill states that the Data Protection Authority shall specify codes of practice in this matter. Taking best practices into account, there is a need for ‘</span><a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/project-brief-highlighting-inclusive-and-practical-mechanisms-to-protect-childrens-data.pdf" rel="noopener noreferrer" target="_blank">user-friendly and privacy-protecting age verification techniques</a><span>’ to encourage safe navigation of the internet. This will require </span><a class="_1lsz7 _3Bkfb" href="https://cuts-ccier.org/pdf/bp-global-technological-developments-in-age-verification-and-age-estimation.pdf" rel="noopener noreferrer" target="_blank">looking at </a><span>technological developments and different standards worldwide. There is also a need to hold companies </span><a class="_1lsz7 _3Bkfb" href="https://www.livemint.com/opinion/columns/theres-a-better-way-to-protect-the-online-privacy-of-kids-11615306723478.html" rel="noopener noreferrer" target="_blank">accountable</a><span> for the protection of children’s online privacy and for the harm that their algorithms cause children, and to ensure that such harms do not continue.</span></p>
<p style="text-align: justify; ">In the 2021 version of the Bill, the JPC removed the provisions on guardian data fiduciaries, stating that there was no advantage in creating a separate class of data fiduciary. As per the JPC, even data fiduciaries that did not fall within that classification would still need to comply with the rules pertaining to children’s personal data, i.e. Section 16 of the Bill, which requires the data fiduciary to verify the child’s age and obtain consent from the parent or guardian. The manner of age verification, however, has not been spelt out. Furthermore, since ‘significant data fiduciaries’ already exist as a class, such entities must still comply with the rules on data processing. The JPC also removed the phrases “in the best interests of, the child” and “is in the best interests of, the child” from sub-clause 16(1), reasoning that the entire Bill concerns the rights of the data principal and that the use of such terms dilutes the purpose of the legislation and could give way to manipulation by the data fiduciary.</p>
<h3><span>Conclusion</span></h3>
<p style="text-align: justify; "><span>Over the past two years, there has been a significant increase in applications targeted at children, particularly a proliferation of EduTech apps, which ought to bear greater responsibility since they process children's data. We recommend that, instead of creating a separate category, fiduciaries that collect children's data or provide services to children be treated as ‘significant data fiduciaries’ subject to additional compliance measures.</span></p>
<p style="text-align: justify; "><span>Furthermore, any blanket prohibition on tracking children may obstruct safety measures that data fiduciaries could implement. Similar fears are growing in other jurisdictions, where such prohibitions could restrict data fiduciaries from using software that looks out for content such as </span><a class="_1lsz7 _3Bkfb" href="https://www.unodc.org/e4j/en/cybercrime/module-12/key-issues/online-child-sexual-exploitation-and-abuse.html" rel="noopener noreferrer" target="_blank">Child Sexual Abuse Material</a><span> as well as online predatory behaviour. Additionally, concerning the age of consent under the Bill, the JPC could look at international best practices and devise ways to ensure that children can use the internet and have rights over their data, enabling them to grow up with greater awareness of data protection and privacy. One example is the Children's Online Privacy Protection Rule (COPPA) in the US, whose rules apply to operators of websites and online services that collect personal information from children </span><a class="_1lsz7 _3Bkfb" href="https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance" rel="noopener noreferrer" target="_blank">under 13 </a><span>or that are directed at a general audience but have actual knowledge that they collect personal information from such children. A combination of this system and the significant data fiduciary classification could be one way to ensure that children’s data and privacy are protected online.</span></p>
<hr />
<p>The authors are researchers at the Centre for Internet and Society and thank their colleague Arindrajit Basu for his inputs.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill'>http://editors.cis-india.org/internet-governance/blog/ijlt-shweta-mohandas-and-anamika-kundu-march-6-2022-nothing-to-kid-about-childrens-data-under-the-new-data-protection-bill</a>
</p>
No publisher | Shweta Mohandas and Anamika Kundu | Digitalisation | Digital Knowledge | Internet Governance | Data Protection | Data Management | 2022-03-10T13:19:52Z | Blog Entry

Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study
http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study
<b>In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/">published in Medianama</a> on February 21, 2022. This is the second in a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In the <a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">previous post</a>, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.</p>
<p style="text-align: justify; ">To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank, released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack, which envisions the creation of a federated, application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused on the creation of Electronic Health Records (EHR) Standards for India over the last few years, and has also identified a contractor for the creation of a centralised health information platform (IHIP), the Strategy Paper advocates a completely different approach, described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched, under which a citizen has the option to obtain a digital health ID: a unique identifier that will carry all of a person’s health records.</p>
<h2 style="text-align: justify; ">A Stack Model for Big Data Ecosystem in Healthcare</h2>
<p style="text-align: justify; ">A stack model as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of APIs has the advantage that it allows public and private actors to build solutions on top of it, which are interoperable with all parts of the stack. It is however worth considering both the ‘openness’ and the role that the state plays in it.</p>
<p style="text-align: justify; ">Even though the APIs are themselves open, they are a part of a pre-decided technological paradigm, built by private actors and blessed by the state. Even though innovators can build on it, the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack model to the unique identity, without appropriate processes in place for access control, siloed information, and encrypted communication, the stack model poses tremendous privacy and security concerns. The broad language under Clause 12 of the DPB needs to be looked at in this context.</p>
<p>Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I had highlighted the import of the use of only ‘necessity’ to the exclusion of ‘proportionality’. Now, we need to consider its significance in light of the emerging digital healthcare apparatus being created by the state.</p>
<p style="text-align: justify; ">The National Health Stack and the National Digital Health Mission together envision an intricate system of data collection and exchange which, in a regulatory vacuum, would ensure unfettered access to sensitive healthcare data for both the state and the private actors registered with the platforms. The Stack framework relies on repositories where data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries to facilitate consent-driven interaction between entities that generate health data and entities that want to consume health records to deliver services to the individual. The cast of characters involves the National Health Authority; healthcare providers and insurers who access the National Health Electronic Registries; unified data from programmes such as the National Health Resource Repository (NHRR), the NIN database, NIC, and the Registry of Hospitals in Network of Insurance (ROHINI); and private actors such as Swasth and iSpirt, who assist the Mission as volunteers. The currency that government and private actors are interested in is data.</p>
<p style="text-align: justify; ">The promised benefits of healthcare data in an anonymised and aggregated form range from disease surveillance and pharmacovigilance to health schemes management systems and nutrition management, benefits which have been even more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, without much thought given to much-needed data minimisation practices.</p>
<p style="text-align: justify; ">The potential misuses of healthcare data include greater state surveillance and control, as well as predatory and discriminatory practices by private actors, who can rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private-sector partners to provide any service or benefit.</p>
<p style="text-align: justify; ">Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak or threat to public health. Yet again, the overly-broad language used here is designed to ensure that any annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.</p>
<p style="text-align: justify; ">Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals provide consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate necessity to the exercise of state function, and data must only be processed for those purposes which flow out of this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.</p>
<p style="text-align: justify; ">In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'>http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</a>
</p>
No publisher | amber | Data Governance | Internet Governance | Data Protection | Privacy | 2022-03-01T15:07:44Z | Blog Entry

How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill
http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function
<b>The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">published in Medianama</a> on February 18, 2022. This is the first of a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In 2018, hours after the Committee of Experts led by Justice Srikrishna released its report and draft bill, I wrote <a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html">an opinion piece</a> providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13) which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte-blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash <a href="https://twitter.com/pranesh/status/1023116679440621568">pointed out</a> that this was not a correct interpretation of the provision as I had missed the significance of the word ‘necessary’, which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction, this provision is equivalent to the position in the European General Data Protection Regulation (Article 6(1)(e)), and is perhaps even more restrictive.</p>
<p style="text-align: justify; ">While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack here, why.</p>
<p style="text-align: justify; ">The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —</p>
<p>a) where it is necessary to respond to a medical emergency<br />b) where it is necessary for the state to provide a service or benefit to the individual<br />c) where it is necessary for the state to issue any certification, licence or permit<br />d) where it is necessary under any central or state legislation, or to comply with a judicial order<br />e) where it is necessary for any measures during an epidemic, outbreak or other threat to public health<br />f) where it is necessary for safety measures during a disaster or breakdown of public order</p>
<p>In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.</p>
<h2>Twin restrictions in Clause 12</h2>
<p style="text-align: justify; ">The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be done so. Therefore, while acting under this provision, the state should only process my data if it needs to do so, to provide me with the service or benefit. The second restriction means that this would apply to only those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.</p>
<p style="text-align: justify; ">What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger is how these provisions may be used with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions; all that may be required is that the welfare functions are authorised by law.</p>
<h2 style="text-align: justify; ">Scope of privacy under Puttaswamy</h2>
<p style="text-align: justify; ">It would be worthwhile, at this point, to delve into the nature of restrictions that the landmark Puttaswamy judgement discussed that the state can impose on privacy. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restrictions on privacy in the form of denial of informed consent need to be tested against a constitutional standard. In Puttaswamy, the bench was not required to provide a legal test to determine the extent and scope of the right to privacy, but they do provide sufficient guidance for us to contemplate how the limits and scope of the constitutional right to privacy could be determined in future cases.</p>
<p style="text-align: justify; ">The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.</p>
<p style="text-align: justify; ">In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —<br />a) the existence of a “law”<br />b) a “legitimate State interest”<br />c) the requirement of “proportionality”.</p>
<p style="text-align: justify; ">The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground, functions of the state, thus satisfies the legitimate state interest. We do not dispute this claim.</p>
<h2 style="text-align: justify; ">Proportionality and Clause 12</h2>
<p style="text-align: justify; ">It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement, which is most operative in this context. Unlike Clauses 42 and 43 which include the twin tests of necessity and proportionality, the committee has chosen to only employ one ground in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —</p>
<p>a) the limiting measures must be carefully designed, or rationally connected, to the objective<br />b) they must impair the right as little as possible<br />c) the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.</p>
<p style="text-align: justify; ">The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited to only situations where it may not be possible to obtain consent while providing benefits. My reservations with the sufficiency of this standard stem from observations made in the report, as well as the relatively small amount of jurisprudence on this term in Indian law.</p>
<p style="text-align: justify; ">The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness: in cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at the conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the European Court of Human Rights in Handyside v United Kingdom held that it is neither “synonymous with indispensable” nor has the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.</p>
<p style="text-align: justify; ">However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that there is minimal non-consensual processing. The fear is that as long as there is a reasonable relation between processing data and the object of the function of state, state authorities and other bodies authorised by it, do not need to bother with obtaining consent.</p>
<p style="text-align: justify; ">Similarly, the third test of proportionality is also not represented in this provision. It provides a test between the abridgement of individual rights and legitimate state interest in question, and it requires that the first must not outweigh the second. The absence of the proportionality test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, it need not evaluate the denial of consent against the service or benefit that is being provided.</p>
<p style="text-align: justify; ">The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'>http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</a>
</p>
No publisher | amber | Data Governance | Internet Governance | Data Protection | Privacy | 2022-03-01T14:56:49Z | Blog Entry

CIS Comments and Recommendations on the Data Protection Bill, 2021
http://editors.cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill
<b>This document is a revised version of the comments we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill.</b>
<p style="text-align: justify; ">After nearly two years of deliberations and a few changes in its composition, the Joint Parliamentary Committee (JPC), on 17 December 2021, submitted its report on the Personal Data Protection Bill, 2019 (2019 Bill). The report also contains a new version of the law titled the Data Protection Bill, 2021 (2021 Bill). Although there were no major revisions from the previous version other than the inclusion of all data under the ambit of the bill, some provisions were amended.</p>
<p style="text-align: justify; ">This document is a revised version of the<a href="https://cis-india.org/accessibility/blog/cis-comments-pdp-bill-2019"> comments</a> we provided on the 2019 Bill on 20 February 2020, with updates based on the amendments in the 2021 Bill. Through this document we aim to shed light on the issues that we highlighted in our previous comments that have not yet been addressed, along with additional comments on sections that have become more relevant since the pandemic began. In several instances our previous comments have either not been addressed or only partially been addressed; in such instances, we reiterate them.</p>
<p style="text-align: justify; ">These general comments should be read in conjunction with our previous recommendations for the reader to get a comprehensive overview of what has changed from the previous version and what has remained the same. This document can also be read while referencing the new Data Protection Bill 2021 and the JPC’s report to understand some of the significant provisions of the bill.</p>
<hr />
<p style="text-align: justify; "><strong><a href="http://editors.cis-india.org/internet-governance/general-comments-data-protection-bill.pdf" class="internal-link">Read on to access the comments</a> | </strong><span>Review and editing by Arindrajit Basu. Copy editing: The Clean Copy; Shared under Creative Commons Attribution 4.0 International license</span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill'>http://editors.cis-india.org/internet-governance/blog/pallavi-bedi-and-shweta-mohandas-cis-comments-on-data-protection-bill</a>
</p>
No publisher | Pallavi Bedi and Shweta Mohandas | Internet Governance | Data Protection | Privacy | 2022-02-14T16:07:44Z | Blog Entry

The Competition Law Case Against Whatsapp’s 2021 Privacy Policy Alteration
http://editors.cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration
<b>Having examined the privacy implications of Whatsapp's changes to its privacy policy in 2021, this issue brief is the second output in our series examining the effects of those changes. This brief examines the changes in the context of data sharing between Whatsapp and Facebook as being an anticompetitive action in violation of the Indian Competition Act, 2002. </b>
<span id="docs-internal-guid-2e4a5c52-7fff-f416-6970-948314f0b524">
<h3 style="text-align: justify;">Executive Summary</h3>
<p style="text-align: justify;" dir="ltr">On January 4, 2021, Whatsapp announced a revised privacy policy through an in-app notification. It highlighted that the new policy would affect user interactions with business accounts, including those using Facebook's hosting services. The updated policy presented users with a choice: accept greater data sharing between Whatsapp and Facebook, or lose access to the platform after May 15, 2021. The updated policy resulted in temporarily slowed growth for Whatsapp and increased growth for other messaging apps such as Signal and Telegram. While Whatsapp has chosen to delay the implementation of this policy due to consumer outrage, it is important to unpack what this (and similar policies) mean for the digital economy, and the associated competition law concerns. Competition law is one of the sharpest tools available to policy-makers to fairly regulate and constrain the unbridled power of large technology companies.</p>
<p style="text-align: justify;" dir="ltr">While it is evident that the Indian competition landscape would benefit from revisiting the existing law and policy framework to rein in large technology companies, we argue that the change in Whatsapp’s privacy policy in 2021 can be held anti-competitive under legal provisions as they presently stand. Therefore, in this issue brief, we largely limit ourselves to evaluating the legality of Whatsapp’s privacy policy within the confines of the present legal system.</p>
<p style="text-align: justify;" dir="ltr">First, we articulate the present abuse of dominance framework in Indian competition law. Second, we analyze whether there was an abuse of dominance, bearing in mind an economic analysis of Whatsapp’s role in the relevant market, by using tests laid out in previous rulings of the CCI.</p>
<br />
<p style="text-align: justify;" dir="ltr">The framework for determining abuse of dominance as per The Competition Act is based on three factors:</p>
<p style="text-align: justify;" dir="ltr">1. Determination of relevant market</p>
<p style="text-align: justify;" dir="ltr">2. Determination of dominant position</p>
<p style="text-align: justify;" dir="ltr">3. Abuse of the dominant position</p>
<br />
<p style="text-align: justify;" dir="ltr">In two previous orders, in 2016 and 2020, the CCI held that Whatsapp is dominant in its relevant market based on several factors which we explore. These include:</p>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Advantage in user base, usage and reach,</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Barriers to entry for other competitors</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p style="text-align: justify;" dir="ltr">Power of acquisition over competitors.</p>
</li></ol>
<br />
<p style="text-align: justify;" dir="ltr">However, in both orders, the CCI held that Whatsapp did not abuse its dominance, reasoning that the practices in question allowed for user choice. We critique these judgments for not reflecting the market structures and exploitative practices of large technology companies. We also argue that even under the test of user choice laid down by the CCI in its previous orders concerning Whatsapp and Facebook, the changes made to the privacy policy in 2021 constituted an abuse of dominance and should be held to violate competition law standards.</p>
<p style="text-align: justify;" dir="ltr">Our analysis revolves around examining the explicit and implicit standards of user choice laid out by the CCI in its 2016 and 2020 judgements as the standard for evaluating fairness in an abuse of dominance claim. We demonstrate how the 2021 changes failed to meet these standards.</p>
<p style="text-align: justify;" dir="ltr">Finally, we conclude by noting that the present case offers a crucial opportunity for India to take a giant step forward in its regulation of big tech companies and harmonise its rulings with regulatory developments around the world.</p>
<p style="text-align: justify;" dir="ltr">The full issue brief can be found <a href="https://cis-india.org/internet-governance/whatsapp-privacy-policy-2021-issue-brief-competition-law">here</a>.</p>
</span>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration'>http://editors.cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration</a>
</p>
No publisher | Aman Nair and Arindrajit Basu | Consumer Rights, Digital Economy, Data Protection, Facebook, Competition, WhatsApp, Competition Law | 2021-03-24T16:12:09Z | Blog Entry
A Guide to Drafting Privacy Policy under the Personal Data Protection Bill, 2019
http://editors.cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill
<b>The Personal Data Protection Bill, 2019, (PDP Bill) which is currently being deliberated by the Joint Parliamentary Committee, is likely to be tabled in the Parliament during the winter session of 2021.</b>
<p style="text-align: justify;">The Bill in its current form doesn’t have explicit transitory provisions, i.e., a defined timeline for the enforcement of its provisions after its notification as an enforceable legislation. Since the necessary subject matter expertise may be limited on short notice and out of budget for certain companies, we intend to release a series of guidance documents that will attempt to simplify the operational requirements of the legislation.</p>
<p style="text-align: justify;">Certain news reports had earlier suggested that the Joint Parliamentary Committee reviewing the Bill has proposed <a class="external-link" href="https://economictimes.indiatimes.com/news/politics-and-nation/parliamentary-panel-examining-personal-data-protection-bill-recommends-89-changes/articleshow/80138488.cms">89 new amendments and a new clause</a>. The nature and content of these amendments so far remain unclear. However, we intend to start the series by addressing some frequently asked questions around meeting the requirements of publishing a privacy notice and shall make the relevant changes post notification of the new Bill. The solutions provided in this guidance document are mostly based on international best practices and any changes in the solutions based on Indian guidelines and the revised PDP Bill will be redlined in the future.</p>
<p style="text-align: justify;">The frequently asked questions and other specific examples on complying with the requirements of publishing a privacy policy have been compiled based on informal discussions with stakeholders, unsolicited queries from smaller organizations, and publicly available details from conferences on the impact of the Bill. We intend to conduct extensive empirical analysis of additional queries or difficulties faced by smaller organizations towards achieving compliance after the notification of the new Bill. Regardless, any smaller organizations (NGOs, start-ups, etc.) interested in discussing compliance-related queries can get in touch with us.</p>
<hr />
<p style="text-align: justify;">Click to download the <a href="http://editors.cis-india.org/internet-governance/guide-to-personal-data-protection-bill.pdf" class="internal-link">full report here</a>. The report was reviewed by Pallavi Bedi and Amber Sinha.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill'>http://editors.cis-india.org/internet-governance/blog/shweta-reddy-september-17-2021-a-guide-to-drafting-privacy-policy-under-personal-data-protection-bill</a>
</p>
No publisher | shwetar | Internet Governance, Data Protection, Privacy | 2021-09-20T10:34:40Z | Blog Entry
Beyond the PDP Bill: Governance Choices for the DPA
http://editors.cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill
<b>This article examines the specific governance choices the Data Protection Authority (DPA) in India must deliberate on vis-à-vis its standard-setting function, which are distinct from those it will encounter as part of its enforcement and supervision functions.</b>
<p style="text-align: justify;">The Personal Data Protection Bill, 2019, was introduced in the Lok Sabha on 11 December 2019. It lays down an overarching framework for personal data protection in India. Once revised and approved by Parliament, it is likely to establish the first comprehensive data protection framework for India. However, the provisions of the Bill are only one component of the forthcoming data protection framework. The Bill further proposes setting up the Data Protection Authority (DPA) to oversee enforcement, supervision, and standard-setting. The Bill consciously chooses to vest the responsibility of administering the framework with a regulator instead of a government department. As an independent agency, the DPA is expected to be autonomous from the legislature and the Central Government and capable of making expert-driven regulatory decisions in enforcing the framework.</p>
<p style="text-align: justify;">Furthermore, the DPA is not merely an implementing authority; it is also expected to develop privacy regulations for India by setting standards. As such, it will set the day-to-day obligations of regulated entities under its supervision. Thus, the effectiveness with which it carries out its functions will be the primary determinant of the impact of this Bill (or a revised version thereof) and the data protection framework set out under it.</p>
<p style="text-align: justify;">The final version of the PDP Bill may or may not provide the DPA with clear guidance regarding its functions. In this article, we emphasise the need to look beyond the Bill and instead examine the specific governance choices the DPA must deliberate on vis-à-vis its standard-setting function, which are distinct from those it will encounter as part of its enforcement and supervision functions.</p>
<p style="text-align: justify;"><strong>A brief timeline of the genesis of a distinct privacy regulator for India</strong></p>
<p style="text-align: justify;">The vision of an independent regulator for data protection in India emerged over the course of several intervening processes that set out to revise India’s data protection laws. In fact, the need for a dedicated data protection regulation for India, with enforceable obligations and rights, was debated years before the <a href="https://thewire.in/government/privacy-aadhaar-supreme-court">Aadhaar</a>, <a href="https://www.thehindu.com/news/national/urgent-need-for-data-protection-laws-experts/article23314655.ece">Cambridge Analytica</a>, and <a href="https://www.livemint.com/opinion/online-views/pegasus-has-given-privacy-legislation-a-jab-of-urgency-11628181453098.html">Pegasus</a><sup> </sup>revelations captured the public imagination and mainstreamed conversations on privacy.</p>
<p style="text-align: justify;">The <a href="https://cis-india.org/internet-governance/draft-bill-on-right-to-privacy">Right to Privacy Bill, 2011</a>, which never took off, recognised the right to privacy in line with Article 21 of the Constitution of India, which pertains to the right to life and personal liberty. The Bill laid down express conditions for collecting and processing data and the rights of data subjects. It also proposed setting up a Data Protection Authority (DPA) to supervise and enforce the law and advise the government in policy matters. Upon review by the Cabinet, it was <a href="https://cis-india.org/internet-governance/draft-bill-on-right-to-privacy">suggested</a> that the Authority be revised to an Advisory Council, given its role under the Bill was limited.</p>
<p style="text-align: justify;">Subsequently, in 2012, the AP Shah Committee Report <a href="https://cis-india.org/internet-governance/blog/report-of-group-of-experts-on-privacy.pdf">recommended</a> a principle-based data protection law, focusing on set standards while refraining from providing granular rules, to be enforced through a co-regulatory structure. This structure would consist of central and regional-level privacy commissioners, self-regulatory bodies, and data protection officers appointed by data controllers. There were also a few private members’ bills <a href="https://saveourprivacy.in/media/all/Brief-PDP-Bill-25.12.2020.pdf">introduced</a> between 2011 and 2019.</p>
<p style="text-align: justify;">None of these efforts materialised, and the regulatory regime for data protection and privacy remained embedded within the Information Technology Act, 2000, and the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (SPDI Rules). Though the <a href="https://www.meity.gov.in/writereaddata/files/GSR313E_10511%281%29_0.pdf">SPDI Rules</a> require body corporates to secure personal data, their enforcement is <a href="https://www.indiacode.nic.in/show-data?actid=AC_CEN_45_76_00001_200021_1517807324077&orderno=49">limited</a> to cases of negligence in abiding by this limited set of obligations, which pertain to sensitive personal information only and which have caused wrongful loss or gain – a high threshold for aggrieved individuals to prove. In addition, the <a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%281%29_0.pdf">Intermediary Guidelines</a>, 2011 require all intermediaries to generally follow these Rules under Rule 3(8). The enforcement of these obligations is <a href="https://www.ikigailaw.com/dispute-resolution-framework-under-the-information-technology-act-2000/#acceptLicense">entrusted</a> to adjudicating officers (AOs) appointed by the central government, who are typically bureaucrats serving in an ex-officio capacity.</p>
<p style="text-align: justify;">By 2017, the Aadhaar litigations had provided additional traction to the calls for a dedicated and enforceable data protection framework in India. In its judgement, the Supreme Court <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf">recognised</a> the right to privacy as a fundamental right in India and stressed the need for a dedicated data protection law. Around the same time, the Ministry of Electronics and Information Technology (MeitY) constituted a <a href="https://pib.gov.in/newsite/PrintRelease.aspx?relid=169420">committee of experts</a> under the chairmanship of Justice BN Srikrishna. The Srikrishna Committee undertook public consultations on a 2017 <a href="https://www.meity.gov.in/writereaddata/files/white_paper_on_data_protection_in_india_171127_final_v2.pdf">white paper</a>, which culminated in the nearly comprehensive <a href="https://www.meity.gov.in/writereaddata/files/Personal_Data_Protection_Bill,2018.pdf">Personal Data Protection Bill, 2018</a>, and an accompanying <a href="https://www.meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf">report</a>. This 2018 Bill outlined a regulatory framework of personal data processing for India and defined data processing entities as fiduciaries, which owe a duty of care to individuals to whom personal data relates. The Bill provided for the setting up of an independent regulator that would, among other things, specify further standards for data protection and administer and enforce the provisions of the Bill.</p>
<p style="text-align: justify;">MeitY invited public comments on this Bill and tabled a revised version, the Personal Data Protection <a href="http://164.100.47.4/BillsTexts/LSBillTexts/Asintroduced/373_2019_LS_Eng.pdf">Bill</a>, 2019 (PDP Bill), in the Lok Sabha in December 2019. Following public pressure calling for detailed discussions on the Bill before its passing, it was referred to a <a href="http://loksabhaph.nic.in/Committee/CommitteeInformation.aspx?comm_code=73&tab=1">Joint Parliamentary Committee</a> (JPC) constituted for this purpose. It currently remains under review; the JPC is <a href="https://www.hindustantimes.com/india-news/need-state-level-data-protection-authorities-joint-parliamentary-committee-mp-amar-patnaik-101632679181340.html">reportedly</a> expected to table its report in the 2021 Winter Session of Parliament. Though the Bill is likely to undergo another <a href="https://www.hindustantimes.com/india-news/over-100-drafting-changes-proposed-to-jpc-on-data-protection-bill-101631730726756.html">round of revisions</a> following the JPC’s review, this is the closest India has come to realising its aspirations of establishing a dedicated and enforceable data protection framework.</p>
<p style="text-align: justify;">This Bill carries forward the choice of a distinct regulatory body, though <a href="https://thewire.in/tech/india-data-protection-authority-needs-constitutional-entrenchment">questions remain</a> on the degree of its independence, given the direct control granted to the central government in appointing its members and funding the DPA.</p>
<p style="text-align: justify;"><strong>Conceptualising an Independent DPA</strong></p>
<p style="text-align: justify;">The Srikrishna Committee’s 2017 white paper and its 2018 report on the PDP Bill discuss the need for a regulator in the context of <em>enforcement</em> of its provisions. However, the DPA under the PDP Bill is tasked with extensive powers to frame detailed regulations and codes of conduct to inform the day-to-day obligations of data fiduciaries and processors. To be clear, the standard-setting function for a regulator <a href="https://ssrn.com/abstract=1393647">entails</a> laying down the standards based on which regulated entities (i.e. the data fiduciaries) will be held accountable, and the manner in which they may conduct themselves while undertaking the regulated activity (i.e. personal data processing). This is in addition to its administrative and enforcement, and quasi-judicial functions, as outlined below:</p>
<p style="text-align: justify;"><strong>Functions of the DPA under the PDP Bill 2019</strong></p>
<p style="text-align: justify;"><strong><img src="http://editors.cis-india.org/home-images/PDPBill.png/@@images/93bcf598-962a-48f1-b1b1-78933dac5d27.png" alt="null" class="image-inline" title="PDP" /></strong></p>
<p style="text-align: justify;">At this stage, it is important to note that the choice of regulation via a regulator is distinct from the administration of the Bill by the central or state governments. Creating a distinct regulatory body allows government procedures to be replaced with expert-driven decision-making to ensure sound economic regulation of the sector. At the same time, the independence of the regulatory authority <a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898">insulates it</a> from political processes. The third advantage of independent regulatory authorities is the scope for ‘operational flexibility’, which is embodied in the relative autonomy of its employees and its decision-making from government scrutiny.</p>
<p style="text-align: justify;">This is also the rationale provided by the Srikrishna Committee in explaining its choice to entrust the administration of the data protection law to an independent DPA. The 2017 white paper that preceded the 2018 Srikrishna Committee Report proposed a distinct regulator to provide expert-driven enforcement of laws for the highly specialised data protection sphere. The regulator would also serve as a single point of contact for entities seeking guidance and would ensure consistency by issuing rules, standards, and guidelines. The Srikrishna Committee Report concretised this idea and proposed a sector-agnostic regulator that is expected to <a href="https://www.meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf">undertake</a> expertise-driven standard-setting, enforcement, and adjudication under the Bill. The PDP Bill carries forward this conception of a DPA, which is distinct from the central government.</p>
<p style="text-align: justify;">Conceptualised as such, the DPA has a completely new set of questions to contend with. Specifically, regulatory bodies require additional safeguards to overcome the legitimacy and accountability questions that <a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898">arise</a> when law-making is carried out not by elected members of the legislature, but via the unelected executive. The DPA would need to incorporate democratic decision-making processes to overcome the deficit of public participation in an expert-driven body. Thus, the meta-objective of ensuring autonomous, expertise-driven, and legitimate regulation of personal data processing necessitates that the regulator has sufficient independence from political interference, is populated with subject matter experts and competent decision-makers, and further has democratic decision-making procedures.</p>
<p>Further, the standard-setting role of the regulator does not receive sufficient attention, in terms of distinct procedural or substantive safeguards, in either the legislation or public policy guidance.</p>
<h3>Reconnaissance under the PDP Bill: How well does it guide the DPA?</h3>
<p style="text-align: justify;">At this time, the PDP Bill is the primary guidance document that defines the DPA and its overall structure. India also lacks an overarching statute or binding framework that lays down granular guidance on regulation-making by regulatory agencies.</p>
<p style="text-align: justify;">The PDP Bill, in its current iteration, sets out skeletal provisions to guide the DPA in achieving its objectives. Specifically, the Bill provides guidance limited to the following:</p>
<ol>
<li style="text-align: justify;"><em>Parliamentary scrutiny of regulations:</em> The DPA must table all its regulations before the Parliament. This is meant to accord <a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf">legislative scrutiny</a> to binding legal standards promulgated by unelected officials.</li>
<li style="text-align: justify;"><em>Consistency with the Act:</em> All regulations should be consistent with the Act and the rules framed under it. This integrates a standard of administrative law to a limited extent within the regulation-making process. </li></ol>
<p style="text-align: justify;">However, India’s past track record <a href="https://prsindia.org/theprsblog/how-well-does-parliament-examine-rules-framed-under-various-laws">indicates</a> that regulations, once tabled before the Parliament, are rarely questioned or scrutinised. Judicial review is typically based on ‘thin’ procedural considerations such as whether the regulation is unconstitutional, arbitrary, <em>ultra vires</em>, or goes beyond the statutory obligations or jurisdiction of the regulator. In any event, judicial review is possible only when an instrument is challenged by a litigant, and, therefore, it may not always be a robust <em>ex-ante</em> check on the exercise of this power. A third challenge arises where instruments other than regulations are issued by the regulator. These could be circulars, directions, guidelines, and even FAQs, which are <a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf">rarely bound</a> by even the minimal procedural mandate of being tabled before the Parliament. To be sure, older regulators including the Reserve Bank of India (RBI) and the Securities and Exchange Board of India (SEBI) also face similar issues, which they have attempted to address through various methods including voluntary public consultations, stakeholder meetings, and publication of minutes of meetings. These are useful tools for the DPA to consider as well.</p>
<p>Apart from these, specific guidance is provided with respect to issuing and approving codes of practice and issuing directions as follows:</p>
<ol>
<li style="text-align: justify;">Codes of practice: The DPA is required to (i) ensure transparency,<sup>[1]</sup> (ii) consult with other sectoral regulators and stakeholders, and (iii) follow a procedure to be prescribed by the central government prior to the notification of codes of practice under the Bill.<sup>[2]</sup></li>
<li style="text-align: justify;">Directions: The DPA may issue directions to individual regulated entities or classes of entities from time to time, provided these entities have been given the opportunity to be heard by the DPA before such directions are issued.<sup>[3]</sup></li></ol>
<p style="text-align: justify;">However, the meaning of transparency and the process for engaging with sectoral regulators remains unspecified under the Bill. Furthermore, the central government has been provided vast discretion to formulate these procedures, as the Bill does not specify the principles or outcomes sought to be achieved via these procedures. The Bill also does not specify instances where such directions may be issued and in which form.</p>
<p>Thus, as per its last publicly available iteration, the Bill remains silent on the following:</p>
<ul>
<li>The principles that may guide the DPA in its functioning.</li>
<li>The procedure to be followed for issuing regulations and other subordinate legislation under the Bill.</li>
<li style="text-align: justify;">The relevant regulatory instruments, other than regulations and codes of practice – such as circulars, guidelines, FAQs, etc. – that may be issued by the DPA.</li>
<li>The specifics regarding the members and employees within the DPA who are empowered to make these regulations.</li></ul>
<p style="text-align: justify;">It is unclear whether the JPC will revise the DPA’s structure or recommend statutory guidance for the DPA in executing any of its functions. This is unlikely, given that parent statutes for other regulators typically omit such guidance. As a result, the DPA may be required to make intentional and proactive choices on these matters, much like their regulatory counterparts in India. These are discussed in the section below.</p>
<h3 style="text-align: justify;">Envisaging a Proactive Role for the DPA</h3>
<p>As the primary regulatory body in charge of the enforcement of the forthcoming data protection framework, what should be the role of the DPA in setting standards for data protection?</p>
<p style="text-align: justify;">The complexity of the subject matter, and the DPA’s role as the frontline body to define day-to-day operational standards for data protection for the entire digital economy, necessitates that it develop transparent guiding principles and procedures. Furthermore, given that the DPA’s autonomy and capacity are currently unclear, the DPA will need to make deliberate choices regarding how it conducts itself. In this regard, the skeletal nature of the PDP Bill also allows the DPA to determine its own procedures to carry out its tasks effectively.</p>
<p style="text-align: justify;">This is not uncommon in India: various regulators have devised frameworks to create benchmarks for themselves. The Airports Economic Regulatory Authority (AERA) is <a href="http://aera.gov.in/aera/upload/uploadfiles/files/AERAACT.pdf">obligated</a> to follow a dedicated consultation process as per an explicit transparency mandate under the parent statute. By contrast, the Insolvency and Bankruptcy Board of India (IBBI) has, on its own initiative, <a href="https://ibbi.gov.in/webadmin/pdf/legalframwork/2018/Oct/IBBI(Mechamism%20for%20Issuing%20Regulations)%20Regulations,%202018_2018-10-26%2011:59:43.pdf">formulated regulations</a> to guide its regulation-making functions. In other cases, consultation processes have been integrated into the respective framework through judicial intervention: the Telecom Regulatory Authority of India (TRAI) has been mandated to undertake consultations through <a href="https://clpr.org.in/wp-content/uploads/2018/10/Cellular-Operators-v.-TRAI.pdf">judicial interpretation</a> of the requirement for transparency under the Telecom Regulatory Authority of India Act, 1997 (TRAI Act).</p>
<p style="text-align: justify;">In this regard, we develop a list of considerations that the DPA should look to address while carrying out its standard-setting functions. We also draw on best practices by Indian regulators and abroad, which can help identify feasible solutions for an effective DPA for India.</p>
<p><strong>The choice of regulatory instruments</strong></p>
<p style="text-align: justify;">The DPA is empowered to issue regulations, codes of practice, and directions under the Bill. At the same time, regulators in India routinely issue other regulatory instruments to assign obligations and clarify them. Some commonly used regulatory instruments are outlined below. The terms used for instruments are not standard across regulators, and the list and description set out below outline the main concepts and not fixed labels for the instruments.</p>
<p><strong><em>Overview of regulatory instruments</em></strong><em> </em></p>
<table>
<tbody>
<tr>
<td>
<p> </p>
</td>
<td>
<p><strong>Circulars and Master Circulars</strong></p>
</td>
<td>
<p><strong>Guidelines</strong></p>
</td>
<td>
<p><strong>FAQs</strong></p>
</td>
<td>
<p><strong>Directions</strong></p>
</td>
</tr>
<tr>
<td>
<p><strong>Content</strong></p>
</td>
<td>
<p>Circulars are used to prescribe detailed obligations and prohibitions for regulated entities and can mimic regulations. Master circulars consolidate circulars on a particular topic periodically.</p>
</td>
<td>
<p>These may be administrative or substantive, depending on the practice of the regulator in question.</p>
</td>
<td>
<p>Issued in public interest by regulators to clarify the regulatory framework administered by them. They cannot prescribe new standards or create obligations.</p>
</td>
<td>
<p>Issued to provide focused instructions to individual entities or classes of entities in response to an adjudicatory action or to address a current challenge.</p>
</td>
</tr>
<tr>
<td>
<p><strong>Binding character</strong></p>
</td>
<td>
<p>They are generally <a href="https://indiankanoon.org/doc/1588871/">binding</a> in the <a href="https://indiankanoon.org/doc/1316639/">same manner</a> as regulations and rules. However, if they go beyond the parent Act or existing rules and regulations, they may be <a href="https://indiankanoon.org/doc/15876695/">struck down</a> following a judicial review.</p>
</td>
<td>
<p>They may or may not be binding depending upon the language employed or the regulator’s practice.</p>
</td>
<td>
<p>Unclear whether these are binding and to what extent. However, crucial clarifications on important concepts sometimes emerge from FAQs.</p>
</td>
<td>
<p>Binding in respect of the class of regulated entities to whom this is issued.</p>
</td>
</tr>
<tr>
<td>
<p><strong>Parliamentary scrutiny</strong></p>
</td>
<td colspan="4">
<p>Unlike regulations, these do not have to be laid before the Parliament.</p>
</td>
</tr>
</tbody>
</table>
<p style="text-align: justify;">Thus, all these instruments, to varying degrees, have <a href="https://www.ncaer.org/news_details.php?nID=1399">been used</a> to create binding obligations for regulated entities. The <a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf">choice of regulatory instrument</a> is not made systematically. Indeed, even a <a href="https://www.bis.org/bcbs/publ/d321.pdf">hierarchy of instruments</a> and their functions is not clearly set out by most regulators. The <a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf">rationale</a> for deciding why a circular is issued as against a regulation is also unclear. A study on regulatory performance in India by Burman and Zaveri (2018) has <a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf">highlighted</a> an over-reliance on instruments such as circulars. As per their study, between 2014 and 2016, RBI and SEBI issued 1,016 and 122 circulars, as against 48 and 51 regulations, respectively. These circulars are not bound by the same pre-consultative mandate, nor are they mandated to be laid before the Parliament. While circulars may have been intended to routinely lay down administrative or procedural requirements, the study narrows its frame of reference to circulars which lay down substantive regulatory requirements. In this instance, it is unclear why parliamentary scrutiny is mandated for regulations alone, and not for instruments like circulars and directions, even though they lay down similarly substantive requirements. Furthermore, there have also been <a href="https://indiacorplaw.in/2014/11/are-sebis-faqs-binding-on-partiessebi.html">instances</a> where certain instruments like FAQs have gone beyond their advisory scope to provide new directions or definitions that were not previously shared under binding instruments like regulations or circulars.</p>
<p>The DPA has been provided specific powers to issue regulations, codes of practice, and directions. However, the rationale for issuing one instead of the other has been <a href="https://www.medianama.com/2020/01/223-pdp-bill-2019-data-protection-authority/">absent</a> from the PDP Bill so far. In such a scenario, it is important that the DPA transparently outlines the <em>types</em> of instruments it wishes to use, whether they are binding or advisory, and the procedure to be followed for issuing each.</p>
<p><strong>Pre-legislative consultative rule-making</strong></p>
<p>Participatory and consultative processes have emerged as core components of democratic rule-making by regulators. Transparent consultative mechanisms could also ameliorate capacity challenges in a new regulator (particularly for technical matters) and help enhance public confidence in the regulator.</p>
<p style="text-align: justify;">In India, several regulators have adopted consultation mechanisms even when there is no specific statutory requirement. <a href="https://www.sebi.gov.in/sebiweb/home/HomeAction.do?doListing=yes&sid=4&smid=35&ssid=38">SEBI</a> and <a href="https://ibbi.gov.in/public-comments/comments-on">IBBI</a> routinely issue discussion papers and consultation papers. The RBI also issues draft instruments <a href="https://www.rbi.org.in/Scripts/DraftNotificationsGuildelines.aspx">soliciting comments</a>. As discussed previously, TRAI and AERA have distinct transparency mandates under which they carry out consultations before issuing regulations. However, these processes are not mandated for all forms of subordinate legislation. Taking cognizance of this, the Financial Sector Legislative Reforms Commission (FSLRC) has <a href="https://dea.gov.in/sites/default/files/fslrc_report_vol1_1.pdf">recommended</a> transparency in the regulation-making process. This was <a href="https://dea.gov.in/sites/default/files/Handbook_GovEnhanc_fslrc_2.pdf">carried forward</a> by the Financial Stability and Development Council (FSDC), which recommended that consultation processes should be a prerequisite for all subordinate legislations, including circulars, guidelines, etc. A <a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf">study</a> on regulators’ adherence to these mandates, spanning TRAI, AERA, SEBI, and RBI, demonstrated that this pre-consultation mandate is followed inconsistently, if at all. Predictable consultation practices are therefore critical.</p>
<p style="text-align: justify;">Furthermore, the study stated that it <a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf">could not determine</a> whether the consultation processes yielded meaningful participation, given that regulators are not obligated to disclose how public feedback was integrated into the rule-making process. Subordinate legislations issued in the form of circulars and guidelines also do not typically undergo the same rigorous consultation processes. Thus, an ideal consultation framework would <a href="https://ec.europa.eu/info/sites/default/files/better_regulation_joining_forces_to_make_better_laws_en_0.pdf">comprise</a>:</p>
<ul>
<li style="text-align: justify;">Publication of the draft subordinate legislation along with a detailed explanation of the policy objectives. Further, the regulator should publish the internal or external studies conducted to arrive at the proposed legislation to <a href="https://legalinstruments.oecd.org/public/doc/669/51f6da97-c198-4c93-922f-1a5d80beae86.pdf">engender</a> meaningful discussion.</li>
<li>Permitting sufficient time for the public and interested stakeholders to respond to the draft.</li>
<li>Publishing all feedback received for the public to assess, and allowing them to respond to the feedback.</li></ul>
<p>However, beyond specifying the manner of conducting consultations, it will be important for the DPA to determine when consultations are mandatory and binding, and for which types of subordinate legislation. These are discussed in the next section.</p>
<p><strong>Choice of consultation mandates for distinct regulatory instruments</strong></p>
<p style="text-align: justify;">While the Bill provides for consultation processes for issuing and approving codes of practice, no such mechanism has been set out for other instruments. Nevertheless, specifying consultation mandates for different regulatory instruments is important to ensure that decision-making is consistent and regulation-making remains bound by transparent and accountable processes. As discussed above, regulatory instruments such as circulars and FAQs are not necessarily bound by the same consultation mandates in India. This distinction has been clarified in more sophisticated administrative law frameworks abroad. For instance, under the Administrative Procedure Act in the United States (US), all substantive rules made by regulatory agencies are <a href="https://www.reginfo.gov/public/reginfo/Regmap/regmap.pdf">bound</a> by a consultation process, which requires notice of the proposed rule-making and public feedback. This does <a href="https://www.federalregister.gov/uploads/2011/01/the_rulemaking_process.pdf">not preclude</a> the regulatory agency from issuing clarifications, guidelines, and supplemental information on the rules issued. These documents do not require the consultation process otherwise required for formal rules. However, they cannot be used to expand the scope of the rules, set new legal standards, or have the effect of amending the rules. Even so, agencies remain free to seek public feedback on such documents.</p>
<p style="text-align: justify;">Similarly, the Information Commissioner’s Office in the United Kingdom (UK) takes into consideration <a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/">public consultations</a> and <a href="https://ico.org.uk/about-the-ico/ico-and-stakeholder-consultations/ico-call-for-views-on-employment-practices/">surveys</a> while issuing toolkits and guidance for regulated entities on how to comply with the data protection framework in the UK.</p>
<p style="text-align: justify;">Here, the DPA may choose to subject strictly binding instruments like regulations and codes of practice to pre-legislative consultation mandates, while softer mechanisms like FAQs may be subject to the publication of a detailed outline of the policy objective or online surveys to invite non-binding, advisory feedback. For each of these, the DPA will nonetheless need to create specific criteria by which it classifies instruments as binding and advisory, and further outline specific pre-legislative mandates for each category.</p>
<p><strong>Framework for issuing regulatory instruments and instructions</strong></p>
<p style="text-align: justify;">While the DPA is likely to issue several instruments, the system based on which these instruments will be issued is not yet clear. Without a clearly thought-out framework, different departments within the regulator <a href="https://www.nipfp.org.in/media/medialibrary/2018/08/WP_237_2018_0ciIwuT.pdf">typically issue</a> a series of directions, circulars, regulations, and other instruments. This raises questions regarding the consistency between instruments. This also requires stakeholders to go through multiple instruments to find the position of law on a given issue. Older Indian regulators are now facing challenges in adapting their ad hoc system into a framework. For example, the RBI currently issues a series of circulars and guidelines that are periodically consolidated on a subject-matter basis as Master Circulars and Master Directions. These are then updated and published on their website. IBBI also publishes <a href="https://ibbi.gov.in/uploads/publication/e42fddce80e99d28b683a7e21c81110e.pdf">handbooks</a> and <a href="https://ibbi.gov.in/publication/information-brochures">information brochures</a> that consolidate instruments in an accessible manner.</p>
<p style="text-align: justify;">While these are useful improvements, these practices cannot keep pace with rapid changes in regulatory instructions and are not complete or user-friendly (for example, the subject-matter based consolidation does not allow for filtering regulatory instructions by entity). Other jurisdictions have developed different techniques such as formal codification processes to consolidate regulations issued by government agencies under one <a href="https://www.govinfo.gov/help/cfr">unified code</a>, <a href="https://www.oaic.gov.au/privacy/privacy-registers/privacy-codes-register/">register</a>, or <a href="https://www.handbook.fca.org.uk/handbook">handbook</a>, websites that allow for searches based on different parameters (subject-matter, type of instrument, chronology, entity-based), and <a href="https://www.handbook.fca.org.uk/handbook-guides">guides</a> tailored to different types of entities. The DPA, as a new regulator, can learn from this experience and adopt a consistent framework right from the beginning.</p>
<p style="text-align: justify;">Further, an ethos of responsive regulation also requires the DPA to evaluate and revise directions and regulations periodically, in response to market and technology trends. A commitment to periodic evaluation of subordinate legislations entrenched in the rules is critical to reducing the dependence on officials and leadership, which may change. For instance, the <a href="https://www.ibbi.gov.in/webadmin/pdf/whatsnew/2018/Oct/Mechanism%20for%20issuing%20regulations%20October%20after%20Board%20meeting%20final_2018-10-22%2020:42:06.pdf">IBBI</a> has set out a mandatory review of regulations issued by it every three years.</p>
<p><strong>Dedicating capacity for drafting subordinate legislations</strong></p>
<p style="text-align: justify;">The DPA has been granted the discretion to appoint experts and staff its offices with the personnel it needs. A <a href="https://www2.deloitte.com/content/dam/Deloitte/nl/Documents/risk/deloitte-nl-risk-reports-resources.pdf">study</a> of European data protection authorities shows that by the time the General Data Protection Regulation, 2016 became effective, most of the authorities increased the number of employees with some even reporting a 240% increase. The annual spending on the authorities also went up for most countries. While these authorities do not necessarily frame subordinate legislations, they nonetheless create guidance toolkits and codes of practice as part of their supervisory functions.</p>
<p style="text-align: justify;">In this regard, the DPA will need to ensure it has dedicated capacity in-house to draft subordinate legislations. Since regulators are generally seen as enforcement authorities, there is inadequate investment in capacity-building for drafting legislations in India.</p>
<p style="text-align: justify;">Moreover, considering the multiplicity of instruments and guidance documents the DPA is expected to issue, it may seek to create templates for these instruments, along with compulsory constituents of different types of instruments. For instance, the Office of the Australian Information Commissioner is required to include a <a href="https://www.oaic.gov.au/privacy/guidance-and-advice/guidelines-for-developing-codes/">mandatory set of components</a> while issuing or approving binding industry codes of practice.</p>
<h3 style="text-align: justify;">Conclusion</h3>
<p style="text-align: justify;">The Personal Data Protection Bill, 2019 (in the final form recommended by the JPC and accepted by the MeitY) will usher in a new chapter in India’s data protection timeline. While the Bill will finally effectuate a nearly comprehensive data protection framework for India, it will also establish a new regulatory framework that sets up a new regulator, the DPA, to oversee the new data protection law. This DPA will be empowered to regulate entities across sectors and is likely to determine the success of the data protection law in India.</p>
<p style="text-align: justify;">Furthermore, the DPA must not only contend with the complexity of markets and the fast pace of technological change, but it must also address <a href="https://blog.theleapjournal.org/2018/02/a-pragmatic-approach-to-data-protection.html">anticipated</a> regulatory capacity deficits, low levels of user literacy, the number and diversity of entities within its regulatory ambit, and the need to secure individual privacy within and outside the digital realm.</p>
<p style="text-align: justify;">Thus, looking ahead, we must account for the questions of governance that the forthcoming DPA is likely to face, as these will directly impact how entities and citizens engage with the DPA. In India, regulatory agencies adopt distinct choices to fulfil their functions. Regulators have also <a href="https://static1.squarespace.com/static/59c0077a9f745650903ac158/t/5cb62147104c7ba2eaf637e4/1555439944606/Burman+V2.pdf">fared variably</a> in ensuring transparent and accountable decision-making driven by demonstrable expertise. Even if the final form of the PDP Bill does not address these gaps, the DPA has the opportunity to integrate benchmarks and best practices as discussed above within its own governance framework from the get-go as it takes on its daunting responsibilities under the PDP Bill.</p>
<p style="text-align: justify;"><em>(<span id="docs-internal-guid-6bf51b9e-7fff-d2ac-d0fb-f42bcdd7f599">The authors are Research Fellow, Law, Technology and Society Initiative and Project Lead, Regulatory Governance Project respectively at the National Law School of India University, Bangalore. Views are personal.)</span></em></p>
<p style="text-align: justify;"><span id="docs-internal-guid-6bf51b9e-7fff-d2ac-d0fb-f42bcdd7f599"><em>This post was reviewed by Vipul Kharbanda and Shweta Mohandas</em><br /></span></p>
<h3 style="text-align: justify;">References</h3>
<ul>
<li style="text-align: justify;">For a discussion on distinct regulatory choices, please see TV Somanathan, <em>The Administrative and Regulatory State</em> in Sujit Choudhary, Madhav Khosla, et al. (eds), <a href="https://www.oxfordhandbooks.com/view/10.1093/law/9780198704898.001.0001/oxfordhb-9780198704898">Oxford Handbook of the Indian Constitution</a> (2016).</li>
<li style="text-align: justify;">On best practices for consultative law-making, see generally <em>European Union Better Regulation </em><a href="https://ec.europa.eu/info/sites/default/files/better_regulation_joining_forces_to_make_better_laws_en_0.pdf"><em>Communication</em></a>, <em>Guidelines for Effective Regulatory Consultations </em>(<a href="https://www.tbs-sct.gc.ca/rtrap-parfa/erc-cer/erc-cer-eng.pdf">Canada</a>), and<em> </em><a href="https://read.oecd-ilibrary.org/governance/the-governance-of-regulators_9789264209015-en#page81"><em>OECD</em></a><em> </em><em>Best Practice Principles for Regulatory Policy: The Governance of Regulators</em>,<em> 2014.</em></li></ul>
<hr align="left" size="1" width="33%" />
<p><sup>[1]</sup> Personal Data Protection Bill 2019, § 50(3).</p>
<p><sup>[2]</sup> Personal Data Protection Bill 2019, § 50(4).</p>
<p><sup>[3]</sup> Personal Data Protection Bill 2019, § 51.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill'>http://editors.cis-india.org/internet-governance/blog/trishi-jindal-and-s-vivek-beyond-the-pdp-bill</a>
</p>
Trishi Jindal and S. Vivek · Internet Governance · Data Protection · Privacy · 2021-11-10T07:32:33Z · Blog Entry