The Centre for Internet and Society
http://editors.cis-india.org
Digital Delivery and Data System for Farmer Income Support
http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support
<b>This report, jointly published by the Centre for Internet & Society and Privacy International, highlights the digital systems deployed by the government to augment farmer income. It analyses the PM-KISAN and KALIA schemes in Odisha and Andhra Pradesh.</b>
<h2>Executive Summary</h2>
<p style="text-align: justify; ">This study provides an in-depth analysis of two direct cash transfer schemes in India – Krushak Assistance for Livelihood and Income Augmentation (KALIA) and Pradhan Mantri Kisan Samman Nidhi (PM-KISAN) – which aim to provide income support to farmers. The paper examines the role of data systems in the delivery and transfer of funds to the beneficiaries of these schemes, and analyses their technological framework and processes.</p>
<p style="text-align: justify; ">We find that the use of digital technologies, such as direct benefit transfer (DBT) systems, can improve the efficiency and ensure timely transfer of funds. However, we observe that the technology-only system is not designed with the last beneficiaries in mind; these people not only have no or minimal digital literacy but are also faced with a lack of technological infrastructure, including internet connectivity and access to the system that is largely digital.</p>
<p style="text-align: justify; ">Necessary processes need to be implemented and personnel on the ground enhanced in the existing system, to promptly address the grievances of farmers and other challenges.</p>
<p style="text-align: justify; ">This study critically analyses the direct cash transfer scheme and its impact on the beneficiaries. We find that despite the benefits of direct benefit transfer (DBT) systems, there have been many instances of failures, such as the exclusion of several eligible households from the database.</p>
<p style="text-align: justify; ">The study also looks at gender as one of the components shaping the impact of digitisation on beneficiaries. We also identify infrastructural and policy constraints, in sync with the technological framework adopted and implemented, that impact the implementation of digital systems for the delivery of welfare. These include a lack of reliable internet connectivity in rural areas and low digital literacy among farmers. We analyse policy frameworks at the central and state levels and find discrepancies between the discourse of these schemes and their implementation on the ground.</p>
<p style="text-align: justify; ">We conclude the study by discussing the implications of datafication, which is the process of collecting, analysing, and managing data through the lens of data justice. Datafication can play a crucial role in improving the efficiency and transparency of income support schemes for farmers. However, it is important to ensure that the interests of primary beneficiaries are considered – the system should work as an enabling, not a disabling, factor. This appears to be the case in many instances since the current system does not give primacy to the interests of farmers. We offer recommendations for policymakers and other stakeholders to strengthen these schemes and improve the welfare of farmers and end users.</p>
<hr />
<p style="text-align: justify; "><a href="http://editors.cis-india.org/internet-governance/files/digital-tools-farmers-report/at_download/file" class="external-link"><b>Click to download the full report</b></a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support'>http://editors.cis-india.org/internet-governance/blog/cis-privacy-international-digital-delivery-and-data-system-for-farmer-income-support</a>
</p>
sameet · Digital Technologies, Data Governance, Internet Governance, Privacy · 2023-10-18 · Blog Entry

Deceptive Design in Voice Interfaces: Impact on Inclusivity, Accessibility, and Privacy
http://editors.cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy
<b>This article was commissioned by the Pranava Institute, as part of their project titled "Design Beyond Deception", supported by the University of Notre Dame - IBM's Tech Ethics Lab. The article examines the design of voice interfaces (VIs) to anticipate potential deceptive design patterns in VIs. It also presents design and regulatory recommendations to mitigate these practices. </b>
<p>The original blog post can be accessed <a class="external-link" href="https://www.design.pranavainstitute.com/post/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy">here</a>.</p>
<hr />
<h3><b>Introduction</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Voice Interfaces (VIs) have come a long way in recent years and are easily available as inbuilt technology with smartphones, downloadable applications, or standalone devices. In line with growing mobile and internet connectivity, there is now an increasing interest in India in internet-based multilingual VIs which have the potential to enable people to access services that were earlier restricted by language (primarily English) and interface (text-based systems). This current interest has seen even global voice applications such as Google Home and Amazon’s Alexa being available in <a class="itht3 TWoY9" href="https://www.businesstoday.in/technology/news/story/now-talk-to-alexa-seamlessly-in-hindi-english-and-hinglish-231469-2019-10-09" rel="noopener noreferrer" target="_blank">Hindi</a> (Singal, 2019) as well as the <a class="itht3 TWoY9" href="https://voice.cis-india.org/#mapping-actors" rel="noopener noreferrer" target="_blank">growth</a> of multilingual voice bots for certain banks, hotels, and hospitals (Mohandas, 2022).</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">The design of VIs can have a significant impact on the behavior of the people using them. Deceptive design patterns or design practices that trick people into taking actions they might otherwise not take (Tech Policy Design Lab, n.d.), have gradually become pervasive in most digital products and services. Their use in visual interfaces has been widely <a class="itht3 TWoY9" href="https://dl.acm.org/doi/pdf/10.1145/3400899.3400901" rel="noopener noreferrer" target="_blank">criticized</a> by researchers (Narayanan, Mathur, Chetty, and Kshirsagar, 2020), along with recent <a class="itht3 TWoY9" href="https://tacd.org/manipulative-design-practices-online-what-policy-solutions-for-the-eu-and-the-u-s/" rel="noopener noreferrer" target="_blank">policy interventions</a> (Schroeder and Lützow-Holm Myrstad, 2022) as well. As VIs become more relevant and mainstream, it is critical to anticipate and address the use of deceptive design patterns in them. This article, based on our learnings from the <a class="itht3 TWoY9" href="http://voice.cis-india.org/index.html" rel="noopener noreferrer" target="_blank">study</a> of VIs in India, examines the various types of deceptive design patterns in VIs and focuses on their implications in terms of linguistic barriers, accessibility, and privacy.</p>
<h3><b>Potential deceptive design patterns in VIs</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Our research findings suggest that VIs in India are still a long way off from being inclusive, accessible and privacy-preserving. While there has been some development in multilingual VIs in India, their compatibility has been limited to a few Indian languages (Mohandas, 2022) (Naidu, 2022)., The potential of VIs as a tool for people with vision loss and certain cognitive disabilities such as dyslexia is widely recognized (Pradhan, Mehta, and Findlater, 2018), but our conversations suggest that most developers and designers do not consider accessibility when conceptualizing a voice-based product, which leads to interfaces that do not understand non standard speech patterns, or have only text-based privacy policies (Mohandas, 2022). Inaccessible privacy policies full of legal jargon along with the lack of regulations specific to VIs, also make people vulnerable to privacy risks.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Deceptive design patterns can be used by companies to further these gaps in VIs. As with visual interfaces, the affordances and attributes of VI can determine the way in which they can be used to manipulate behavior. Kentrell Owens, et.al in their recent <a class="itht3 TWoY9" href="https://homes.cs.washington.edu/~kentrell/static/papers/owensEuroUSEC2022-preprint.pdf" rel="noopener noreferrer" target="_blank">research</a> lay down six unique properties of VIs that may be used to implement deceptive design patterns (Owens, Gunawan, Choffnes, Emami-Naeini, Kohno, and Roesner, 2022). Expanding upon these properties, and drawing from our research, we look at how they can be exacerbated in India.</p>
<h3><b>Making processes cumbersome</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">VIs are often limited by their inability to share large amounts of information through voice. They thus operate in combination with a smartphone app or a website. This can be intentionally used by platforms to make processes such as changing privacy settings or accessing the full privacy notice inconvenient for people to carry out. In India, this is experienced while unsubscribing from services such as Amazon Prime (Owens et al., 2022). Amazon Echo Dot presently allows individuals to subscribe to an Amazon Prime membership using a voice command, but directs them to use the website in order to unsubscribe from the membership. This can also manifest in the form of canceling orders and changing privacy settings.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">VIs follow a predetermined linear structure that ensures a tightly controlled interaction. People make decisions based on the information they are provided with at various steps. Changing their decision or switching contexts could involve going back several steps. People may accept undesirable actions from the VI in order to avoid this added effort (Owens et al., 2022). The urgency to make decisions on each step can also cause people to make unfavorable choices such as allowing consent to third party apps. The VI may prompt advertisements and push for the company’s preferred services in this controlled conversation structure, which the user cannot side-step. For example, while setting up the Google voice assistant on any device, it nudges people to sign into their Google account. This means the voice assistant gets access to their web and app activity and location history at this step. While the data management of Google accounts can be tweaked through the settings, it may get skipped during a linear set-up structure. Voice assistants can also push people to opt into features such as ads personalisation, default news sources, and location tracking.</p>
<h3><b>Making options difficult to find</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Discoverability is another challenge for VIs. This means that people might find it difficult to discover available actions or options using just voice commands. This gap can be misused by companies to trick people into making undesirable choices. For instance, while purchasing items, the VI may suggest products that have been sponsored and not share full information on other cheaper products, forcing people to choose without complete knowledge of their options. Many mobile based voice apps in India use a combination of images or icons with the voice prompts to enable discoverability of options and potential actions, which excludes people with vision loss (Naidu, 2022). These apps comprise a voice layer added to an otherwise touch-based visual platform so that people are able to understand and navigate through all available options using the visual interface, and use voice only for purposes such as searching or narrating. This means that these apps cannot be used through voice alone, making them disadvantageous for people with vision loss.</p>
<h3><b>Discreet integration with third parties</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">VIs can use the same voice for varying contexts. In the case of Alexa, Skills, which are apps on its platform, have the same voice output and invocation phrases as its own in-built features. End users find it difficult to differentiate between an interaction with Amazon and that with Skills which are third-party applications. This can cause users to share information that they otherwise would not have with third parties (Mozilla Foundation, 2022). There are numerous Amazon Skills inHindi and people might not be aware that the developers of these Skills are <a class="itht3 TWoY9" href="https://www.theverge.com/2021/3/5/22315211/amazon-alexa-skills-how-to-remove-security-privacy-problems" rel="noopener noreferrer" target="_blank">not vetted </a>by Amazon. This misunderstanding can create significant privacy or security risks if Skills are linked to contacts, banking, or social media accounts.</p>
<h3><b>Lack of language inclusivity </b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">The lack of local language support, colloquial translations, and accents can lead to individuals not receiving clear and complete information. VI’s failure to understand certain accents can also make people feel isolated (Harwell, 2018). While in India voice assistants and even voice bots are available in few Indic languages, the default initial setup, privacy policies, and terms and conditions are still in English. The translated policies also use literary language which is difficult for people to understand, and miss out on colloquial terms. This could mean that the person might have not fully understood these notices and hence not have given informed consent. Such use of unclear language and unavailability of information in Indic languages can be viewed as a deceptive design pattern.</p>
<h3><b>Making certain choices more apparent </b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">The different dimensions of voice such as volume, pitch, rate, fluency, pronunciation, articulation, and emphasis can be controlled and manipulated to implement deceptive design patterns. VIs may present the more privacy-invasive options more loudly or clearly, and the more privacy-preserving options more softly or quickly. It can use tone modulations to shame people into making a specific choice (Owens et al., 2022). For example, media streaming platforms may ask people to subscribe for a premium account to avoid ads in normal volume and mention the option to keep ads in a lower volume. Companies have also been observed to discreetly integrate product advertisements in voice assistants using tone. SKIN, a neurotargeting advertising strategy business, used a change of tone of the voice assistant to suggest a dry throat to advertise a drink (Chatellier, Delcroix, Hary, and Girard-Chanudet, 2019).</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">The attribution of gender, race, class, and age through stereotyping can create a persona of the VI for the user. This can extend to personality traits, such as an extroverted or an introverted, docile or aggressive character (Simone, 2020). The default use of female voices with a friendly and polite persona for voice assistants has drawn criticism for perpetuating harmful gender stereotypes (Cambre and Kulkarni, 2019). Although there is an option to change the wake word “Alexa” in Amazon’s devices, certain devices and third party apps do not work with another wake word (Ard, 2021). Further, projection of demographics can also be used to employ deceptive design patterns. For example, a VI persona that is constructed to create a perception of intelligence, reliability, and credibility can have a stronger influence on people’s decisions. Additionally, the effort to make voice assistants as human sounding as possible without letting people know they are human, could create a number of <a class="itht3 TWoY9" href="https://www.nytimes.com/2019/05/22/technology/personaltech/ai-google-duplex.html" rel="noopener noreferrer" target="_blank">issues</a> (X. Chen and Metz, 2019). First time users might divulge sensitive information thinking that they are interacting with a person. This becomes more ethically challenging when persons with vision loss are not able to know who they are interacting with.</p>
<h3><b>Recording without notification </b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Owens et al speak about VIs occupying physical domains due to which they have a much wider impact as opposed to a visual interface (Owens et al., 2022). The always-on nature of virtual assistants could result in personal information of a guest being recorded without their knowledge or consent as consent is only given at the setup stage by the owner of the device or smartphone.</p>
<h3><b>Making personalization more convenient through data collection</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">VIs are trained to adapt to the experience and expertise of the user. Virtual assistants provide personalization and the possibility to download a number of skills, save payment information, and phone contacts. In order to facilitate differentiation between multiple users on the same VI, individuals talking to the device are profiled based on their speech patterns and/or voice biometrics. This also helps in controlling or restricting content for children (Naidu, 2022). There is also tracking of commands to identify and list their intent for future use. The increase of specific and verified data can be used to provide better targeted advertisements, as well possibly be shared with law enforcement agencies in certain cases. <a class="itht3 TWoY9" href="https://www.business-standard.com/article/current-affairs/razorpay-shared-donor-data-with-police-claims-alt-news-122070501255_1.html" rel="noopener noreferrer" target="_blank">Recently</a>, a payment gateway company was made to share customer information to the law enforcement without their customer’s knowledge. This included not just the information about the client but also revealed sensitive personal data of the people who had used the gateway for transactions to the customer. While providing such details are not illegal and companies are meant to comply with requests from law enforcement, if more people knew of the possibility of every conversation of the house being accessible to law enforcement they would make more informed choices of what the VI records.</p>
<h3><b>Reducing friction in actions desired by the platform</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">One of the fundamental advantages of VIs is that it can reduce several steps to perform an action using a single command. While this is helpful to people interacting with it, the feature can also be used to reduce friction from actions that the platform wants them to take. These actions could include sharing sensitive information, providing consent to further data sharing, and making purchases. An <a class="itht3 TWoY9" href="http://insider.com/kids-alexa-buy-700-worth-of-toys-moms-credit-card-2019-12" rel="noopener noreferrer" target="_blank"><span class="D-jZk">example</span></a> of this can be seen where children have found it very easy to purchase items using Alexa (BILD, 2019).</p>
<h3><b>Recommendations for Designers and Policymakers</b></h3>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Through these deceptive design patterns, VIs can obstruct and control information according to the preferences of the platform. This can result in a heightened impact on people with less experience with technology. Presently, profitability is a key driving factor for development and design of VI products. There is more importance given to data-based and technical approaches, and interfaces are often conceptualized by people with technical expertise with lack of inputs from designers at the early stages (Naidu, 2022). Designers also focus more on the usability and functionality of the interfaces by enabling personalization, but are often not as sensitive to safeguarding the rights of individuals using them. In order to tackle deceptive design, designers must work towards prioritizing ethical practice, and building in more agency and control for people who use VIs.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Many of the potential deceptive design patterns can be addressed by designing for accessibility and inclusivity in a privacy preserving manner. This includes vetting third-party apps, providing opt-outs, and clearly communicating privacy notices. Privacy implications can also be prompted by the interface at the time of taking actions. There should be clear notice mechanisms such as a prominent visual cue to alert people when a device is on and recording, along with an easy way to turn off the ‘always listening’ mode. The use of different voice outputs for third party apps can also signal to people about who they are interacting with and what information they would like to share in that context.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">Training data that covers a diverse population should be built for more inclusivity. A linear and time-efficient architecture is helpful for people with cognitive disabilities. But, this linearity can be offset by adding conversational markers that let the individual know where they are in the conversation (Pearl, 2016). This could address discoverability as well, allowing people to easily switch between different steps. Speech-only interactions can also allow people with vision loss to access the interface with clarity.</p>
<p class="public-DraftStyleDefault-text-ltr fixed-tab-size public-DraftStyleDefault-block-depth0 bCMSCT yMZv8w lnyWN OZy-3 bCMSCT Y9Dpf xVISr" style="text-align: justify; ">A number of policy documents including the 2019 version of India’s Personal Data Protection Bill, emphasize on the need for privacy by design. But, they do not mention how deceptive design practices could be identified and avoided, or prescribe penalties for using these practices (Naidu, Sheshadri, Mohandas, and Bidare, 2020). In the case of VI particularly, there is a need to look at it as biometric data that is being collected and have related regulations in place to prevent harm to users. In terms of accessibility as well, there could be policies that require not just websites but also apps (including voice based apps) to be compliant with international accessibility guidelines , and to conduct regular audits to ensure that the apps are meeting the accessibility threshold.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy'>http://editors.cis-india.org/internet-governance/blog/deceptive-design-in-voice-interfaces-impact-on-inclusivity-accessibility-and-privacy</a>
</p>
Saumyaa Naidu and Shweta Mohandas · Internet Governance, Privacy · 2023-08-08 · Blog Entry

Health Data Management Policies - Differences Between the EU and India
http://editors.cis-india.org/internet-governance/blog/health-data-management-policies
<b>Through this issue brief, we would like to highlight the differences in the approaches to health data management taken by the EU and India, and look at possible recommendations for India in creating a privacy-preserving health data management policy. </b>
<p>This issue brief was reviewed and edited by Pallavi Bedi</p>
<hr />
<h2>Introduction</h2>
<p style="text-align: justify; ">Health data has seen an increased interest the world over, on account of the amount of information and inferences that can be drawn not just about a person but also about the population in general. The Covid 19 pandemic also brought about an increased focus on health data, and brought players that earlier did not collect health data to be required to collect such data, including offices and public spaces. This increased interest has led to further thought on how health data is regulated and a greater understanding of the sensitivity of such data, because of which countries are in varying processes to get health data regulated over and above the existing data protection regulations. The regulations not only look at ensuring the privacy of the individual but also look at ways in which this data can be shared with companies, researchers and public bodies to foster innovation and to monetise this valuable data. However for a number of countries the effort is still on the digitisation of health data. India has been in the process of implementing a nationwide health ID that can be used by a person to get all their medical records in one place. The National Health Authority (NHA) has also since 2017 been publishing policies that look at the framework and ecosystem of health data, as well as the management and sharing of health data. However these policies and a scattered implementation of the health ID are being carried out without a data protection legislation in place. In comparison, Europe, which already has an established health Id system, and a data protection legislation (GDPR) is looking at the next stage of health data management through the EU Health Data Space (EUHDS). Through this issue brief we would like to highlight the differences in approaches to health data management taken by the EU and India, and look at possible recommendations for India, in creating a privacy preserving health data management policy.</p>
<h2 style="text-align: justify; ">Background</h2>
<h3>EU Health Data Space</h3>
<p style="text-align: justify; "><span>The EU Health Data Space (<b>EUHDS</b>) was proposed by the EU Council as a way to create an ecosystem which combines rules, standards, practices and infrastructure, around health data under a common governance framework. The EUHDS is set to rely on two pillars; namelyMyHealth@EU and HealthData@EU, where MyHealth@EU facilitates easy flow of health data between patients and healthcare professionals within member states, the HealthData@EU,faciliates secondary use of data which allows policy makers,researchers access to health data to foster research and innovation.<a href="#_ftn1" name="_ftnref1"><sup><sup><span>[1]</span></sup></sup></a> The EUHDS aims to provide a trustworthy system to access and process health data and builds up from the General Data Protection Regulation (GDPR), proposed Data Governance Act.<a href="#_ftn2" name="_ftnref2"><sup><sup><span>[2]</span></sup></sup></a></span></p>
<h3><span>India’s health data policies: </span></h3>
<p style="text-align: justify; "><span>The last few years has seen a flurry of health policies and documents being published and the creation of a framework for the evolution of a National Digital Health Ecosystem (NDHE). The components for this ecosystem were the National Digital Health Blueprint published in 2019 (NDHB) and the National Digital Health Mission (NDHM). The BluePrint was created to implement the National Health Stack (published in 2018) which facilitated the creation of Health IDs.<a href="#_ftn3" name="_ftnref3"><sup><sup><span>[3]</span></sup></sup></a> Whereas the NDHM was drafted to drive the implementation of the Blueprint, and promote and facilitate the evolution of NDHE.<a href="#_ftn4" name="_ftnref4"><sup><sup><span>[4]</span></sup></sup></a> </span></p>
<p style="text-align: justify; "><span>The National Health Authority (<b>NHA</b>) established in 2018 has been given the responsibility of implementing the National Digital Health Mission. 2018 also saw the Digital Information Security in Healthcare Act (<b>DISHA</b>) which was to be a legislation that laid down provisions that regulate the generation, collection, access, storage, transmission and use of Digital Health Data ("DHD") and associated personal data.<a href="#_ftn5" name="_ftnref5"><sup><sup><span>[5]</span></sup></sup></a> However since its call for public consultation no progress has been made on this front.</span></p>
<p style="text-align: justify; "><span>Along with these three strategy documents the NHA has also released policy documents more particularly the Health Data Management Policy (which was revised three times; the latest version released in April 2022), the Health Data Retention Policy (released April 2021), and the Consultation Paper on Unified Health Interface (UHI) (released March 2021). Along with this in 2022 the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) India’s state health insurance policy. </span></p>
<p style="text-align: justify; "><span>However these draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; the PM-JAY’s Data Sharing Guidelines published in August 2022 did not even refer to the draft National Digital Health Data Management Policy (published in April 2022). As stated through the examples above these documents do not cross-refer or mention preceding health data documents, creating a lack of clarity of which documents are being used as guidelines by health care providers. </span></p>
<p style="text-align: justify; "><span>In addition to this the Personal Data Protection Bill has been revised three times since its release in 2018. The latest version was published for public comments on November 18, 2022; the Bill has removed the distinction between sensitive personal data and personal data and clubbed all personal data under one umbrella heading of personal data. Health and health data definition has also been deleted; creating further uncertainty with respect to health data as the different policies mentioned above rely on the data protection legislation to define health data. <br /></span></p>
<h2><b><span>Comparison of the Health Data Management Approaches</span></b></h2>
<h3><span>Interoperability with Data Protection Legislations</span></h3>
<p style="text-align: justify; "><b><span><br /></span></b><span>At the outset the key difference between the EU and India’s health data management policies has been the legal backing of GDPR which the EUHDS has. EUHDS has a strong base in terms of rules for privacy and data protection as it follows, draws inference and works in tandem with the General Data Protection Regulation (GDPR). The provisions also build upon legislation such as Medical Devices Regulation and the In Vitro Diagnostics Regulation. With particular respect to GDPR the EUHDS draws from the rights set out for protection of personal data including that of electronic health data.<br /></span></p>
<p style="text-align: justify; "><span>The Indian Health data policies however currently exist in the vacuum created by the multiple versions of the Data Protection Bill that are published and repealed or replaced. The current version called the Digital Personal Data Protection Bill 2022 seems to take a step backward in terms of health data. The current version does away with sensitive personal data (which health data was a part of) and keeps only one category of data - personal data. It can be construed that the Bill currently considers all personal data as needing the same level of protection but it is not so in practice. The Bill does not at the moment mandate more responsibilities on data fiduciaries<a href="#_ftn6" name="_ftnref6"><sup><sup><span>[6]</span></sup></sup></a> that deal with health data (something that was present in all the earlier versions of the Bill) and in other data protection legislation across different jurisdictions and leaves the creation of Significant Data Fiduciaries (who have more responsibilities) to be created by rules, based on the sensitivity of data decided by the government at a later date.<a href="#_ftn7" name="_ftnref7"><sup><sup><span>[7]</span></sup></sup></a> In addition to this the Bill does not define “health data”, the reason why this is a cause for worry is that the existing health data policies also do not define health data often relying on the definition mentioned in the versions of Data Protection Bill. </span></p>
<h3><span>Definitions and Scope</span></h3>
<p><span>The EUHDS defines ‘personal electronic health data’ as data concerning health and genetic data as defined in Regulation (EU) 2016/679<a href="#_ftn8" name="_ftnref8"><sup><sup><span>[8]</span></sup></sup></a>, as well as data referring to determinants of health, or data processed in relation to the provision of healthcare services, processed in an electronic form. Health data by these parameters would then include not just data about the status of health of a person which includes reports and diagnosis, but also data from medical devices. <br /></span></p>
<p style="text-align: justify; "><span>In India the Health Data Management Policy 2022, defines “Personal Health Records” (<b>PHR</b>) as a health record that is initiated and maintained by an individual. The policy also states that a PHR would be able to reveal a complete and accurate summary of the health and medical history of an individual by gathering data from multiple sources and making this accessible online. However there is no definition of health data which can be used by companies or users to know what comes under health data. The 2018, 2019 and 2021 version of the Data Protection Legislation had definitions of the term health data, however the 2022 version of the Bill does away with the definition.<br /></span></p>
<h3><span>Health data and wearable devices</span></h3>
<p style="text-align: justify; "><span>One of the forward looking provisions in the EUHDS is the inclusion of devices that records health data into this legislation. This also includes the requirement of them to be added to registries to provide easy access and scrutiny. The document also requires voluntary labeling of wellness applications and registration of EHR systems and wellness applications. This is not just for the regulation point of view but also in the case of data portability, in order for people to control the data they share. In addition to this in the case where manufacturers of medical devices and high-risk AI systems declare interoperability with the EHR systems, they will need to comply with the essential requirements on interoperability under the EHDS. </span></p>
<p style="text-align: justify; "><span>In India the health data management policy 2022 while stating the applicable entities and individuals who are part of the ABDM ecosystem<a href="#_ftn9" name="_ftnref9"><sup><sup><span>[9]</span></sup></sup></a> mention medical device manufacturers, does not mention device sellers or use terms such as wellness applications or wearable devices. Currently the regulation of medical devices falls under the purview of the Drugs and Cosmetics Act, 1940 (DCA) read along with the Medical Device Rules, 2017 (MDR). However in 2020 possibly due to the pandemic the Indian Government along with the Drugs Technical Advisory Board (DTAB) issued two notifications the first one expanded the scope of medical devices which earlier was limited to only 37 categories excluding medical apps, and second one notified the Medical Device (Amendment) Rules, 2020. These two changes together brought all medical devices under the DCA as well as expanded the categories of medical devices. However it is still unclear whether fitness tracker apps that come with devices are regulated, as the rules and the DCA still rely on the manufacturer to self-identify as a medical device.<a href="#_ftn10" name="_ftnref10"><sup><sup><span>[10]</span></sup></sup></a> However, this regulatory uncertainty has not brought about any change in how this data is being used and insurance companies at times encourage people to sync their fitness tracker data.<a href="#_ftn11" name="_ftnref11"><sup><sup><span>[11]</span></sup></sup></a></span></p>
<h3><span>Multiple use of health data </span></h3>
<p style="text-align: justify; "><span>The EUHDS states two types of uses of data: primary and secondary use of data. In the document the EU states that while there are a number of organisations collecting data, this data is not made available for purposes other than for which it was collected. In order to ensure that researchers, innovators and policy makers can use this data. the EU encourages the data holders to contribute to this effort in making different categories of electronic health data they are holding available for secondary use. The data that can be used for secondary use would also include user generated data such as from devices, applications or other wearables and digital health applications.However, the regulation cautions against using this data for measures and making decisions that are detrimental to the individual, in ways such as increasing insurance premiums. The EUHDS also states that as the data is sensitive personal data care should be taken by the data access bodies, to ensure that while data is being shared it is necessary to ensure that the data will be processed in a privacy preserving manner. This could include through pseudonymisation, anonymisation, generalisation, suppression and randomisation of personal data.</span></p>
<p style="text-align: justify; "><span>While the document states how important it is to have secondary use of the data for public health, research and innovation it also requires that the data is not provided without adequate checks. The EUHDS requires the organisation seeking access to provide several pieces of information and be evaluated by the data access body. The information should include legitimate interest, the necessity and the process the data will go through. In the case where the organisation is seeking pseudonymised data, there is a need to explain why anonymous data would not be sufficient. In order to ensure a comprehensive approach between health data access bodies, the EUHDS states that the European Commission should support the harmonisation of data application, as well as data request. <br /></span></p>
<p style="text-align: justify; "><span>In India, while multiple health data documents state the need to share data for public interest, research and innovation, not much thought has been given to ensuring that the data is not misused and that there is harmonisation between bodies that provide the data. Most recently the PMJay documents states that the NHA shall make aggregated and anonymised data available through a public dashboard for the purpose of facilitating health and clinical research, academic research, archiving, statistical analysis, policy formulation, the development and promotion of diagnostic solutions and such other purposes as may be specified by the NHA. Such data can be accessed through a request to the Data Sharing Committee<a href="#_ftn12" name="_ftnref12"><sup><sup><span>[12]</span></sup></sup></a> for the sharing of such information through secure modes, including clean rooms and other such secure modes specified by NHA. However the document does not mention what clean rooms are in this context. </span></p>
<p style="text-align: justify; "><span>The Health Data Management Policy 2022 states that Data fiduciaries (data controllers/ processors according to the data protection legislation) can themselves make anonymised or de-identified data in an aggregated form available based in technical processes and anonymisation protocols which may be specified by the NDHM in consultation with the MeitY. The purposes mentioned in this policy included health and clinical research, academic research, archiving, statistical analysis, policy formulation, the development and promotion of diagnostic solutions and such other purposes as may be specified by the NDHMP. The policy states that in order to access the anonymised or de-identified data the entity requesting the data would have to provide relevant information such as name, purpose of use and nodal person of contact details. While the policy does not go into details about the scrutiny of the organisations seeking this data, it does state that the data will be provided based on the term as may be stipulated. <br /></span></p>
<p style="text-align: justify; "><span>However the issue arises as both the documents published by the NHA do not have a similar process for getting the data, for example the NDHMP requires the data fiduciary to share the data directly, while the PMJay guidelines requires the data to be shared by the Data Sharing Committee, creating duplicate datasets as well as affecting the quality of the data being shared. </span></p>
<h2><b><span>Recommendations for India</span></b></h2>
<h3><span>Need for a data protection legislation</span></h3>
<p style="text-align: justify; "><span>While the EUHDS is still a draft document and the end result could be different based on the consultations and deliberations, the document has a strong base with respect to the privacy and data protection based on the earlier regulations and the GDPR. The definitions of what counts as health data, and the parameters for managing the data creates a more streamlined process for all stakeholders. More importantly the GDPR and other regulations provide a way of recourse for people. In India the health data related policies and strategy documents have been published and enforced before the data protection legislation is passed. In addition to this India, unlike the EU has just begun looking at a universal health ID and digitisation of the healthcare system, ideally it would be better to take each step at a time, and at first look at the issues that may arise due to the universal health ID. In addition to this, multiple policies, without a strong data protection legislation providing parameters and definitions could mean that the health data management policies only benefit certain people. This also creates uncertainty in terms of where an individual will go in case of harms caused by the processing of their data, and who would be the authority to govern questions around health data. The division of health data management between different documents also creates multiple silos of data management which creates data duplication and issues with data quality. </span></p>
<h3><span>Secondary use of data</span></h3>
<p style="text-align: justify; "><span>While both the EUHDS and India's Health Data Management Policy look at the sharing of health data with researchers and private organisations in order to foster innovation, the division of sharing of data based on who uses the data is a good way to ensure that only interested parties have access to the data. With respect to the health data policies in India, a number of policies talk about the sharing of anonymised data with researchers, however the documents being scattered could cause the same data to be shared by multiple health data entities, making it possible to identify people. For example, the health data management policy could share anonymised data of health services used by a person, whereas the PMJAY policy could share data about insurance covers, and the researcher could probably match the data and be closer to identifying people. It has also been revealed in multiple studies that anonymisation of data is not permanent and that the anonymisation can be broken. This is more concerning since the polices do not put limits or checks on who the researchers are and what is the end goal of the data sought by them, the policies seem to rely on the anonymisation of the data as the only check for privacy. This data could be used to de-anonymise people, could be used by companies working with the researchers to get large amounts of data to train their systems, </span></p>
<p><span>train data that could lead to greater surveillance, increase insurance scrutiny etc. The NHA and Indian health policy makers could look at the restrictions and checks that the EUHDS creates for the secondary use of data and create systems of checks and categories of researchers and organisations seeking data to ensure minimal risks to an individual’s data. </span></p>
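<p style="text-align: justify; ">The matching risk described above can be made concrete with a minimal sketch (all fields and values are hypothetical): two separately "anonymised" releases that share quasi-identifiers can be joined to combine what each release reveals about the same person:</p>
<pre>
# Two hypothetical releases from different health data entities.
health_release = [
    {"pin": "751001", "birth_year": 1980, "gender": "F", "service": "cardiology"},
]
insurance_release = [
    {"pin": "751001", "birth_year": 1980, "gender": "F", "cover": "family floater"},
]

def link(a, b, keys=("pin", "birth_year", "gender")):
    # Join records that agree on all quasi-identifiers.
    return [
        {**ra, **rb}
        for ra in a
        for rb in b
        if all(ra[k] == rb[k] for k in keys)
    ]

# A single match already combines health usage and insurance details for
# what is plausibly the same individual.
print(link(health_release, insurance_release))
</pre>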
<h2><b><span>Conclusion</span></b></h2>
<p style="text-align: justify; "><span>While the EU Health data space has been criticised for facilitating vast amounts of data with private companies and the collecting of data by governments, the codification of the legislation does in some way give some way to regulate the flow of health data. While India does not have to emulate the EU and have a similar document, it could look at the best practices and issues that are being highlighted with the EUHDS. Indian lawmakers have looked at the GDPR for guidance for the draft data protection legislation, similarly it could do so with regard to health data and health data management. One possible way to ensure both the free flow of health data and the safeguards of a regulation could be to re-introduce the DISHA Act which much like the EUHDS could act as a legislation which provides an anchor to the multiple health data policies, including standard definition of health data, grievance redressal bodies, and adjudicating authorities and their functions. In addition a legislation dedicated to the health data would also remove the existing burden on the to be formed data protection authority. </span></p>
<hr />
<div><br />
<div id="ftn1">
<p><a href="#_ftnref1" name="_ftn1"><sup><sup><span>[1]</span></sup></sup></a><span> “</span><span>European Health Data Space</span><span>”, European Commission, 03 May 2022,https://health.ec.europa.eu/ehealth-digital-health-and-care/european-health-data-space_en </span></p>
</div>
<div id="ftn2">
<p><a href="#_ftnref2" name="_ftn2"><sup><sup><span>[2]</span></sup></sup></a><span>“</span><span>European Health Data Space</span><span>”</span></p>
</div>
<div id="ftn3">
<p><a href="#_ftnref3" name="_ftn3"><sup><sup><span>[3]</span></sup></sup></a><span> “National Digital Health Blueprint”, Ministry of Health and Family Welfare Government of India, https://abdm.gov.in:8081/uploads/ndhb_1_56ec695bc8.pdf</span></p>
</div>
<div id="ftn4">
<p><a href="#_ftnref4" name="_ftn4"><sup><sup><span>[4]</span></sup></sup></a><span> “National Digital Health Blueprint”</span></p>
</div>
<div id="ftn5">
<p><a href="#_ftnref5" name="_ftn5"><sup><sup><span>[5]</span></sup></sup></a><span> “Mondaq” “DISHA – India's Probable Response To The Law On Protection Of Digital Health Data” accessed 13 June 2023,https://www.mondaq.com/india/healthcare/1059266/disha-india39s-probable-response-to-the-law-on-protection-of-digital-health-data</span></p>
</div>
<div id="ftn6">
<p><a href="#_ftnref6" name="_ftn6"><sup><sup><span>[6]</span></sup></sup></a><span>“The Digital Personal Data Protection Bill 2022”, accessed 13 June 2023 , https://www.meity.gov.in/writereaddata/files/The%20Digital%20Personal%20Data%20Potection%20Bill%2C%202022_0.pdf</span></p>
</div>
<div id="ftn7">
<p><a href="#_ftnref7" name="_ftn7"><sup><sup><span>[7]</span></sup></sup></a><span>The Digital Personal Data Protection Bill 2022</span></p>
</div>
<div id="ftn8">
<p style="text-align: justify; "><a href="#_ftnref8" name="_ftn8"><sup><sup><span>[8]</span></sup></sup></a><span> Regulation (EU) 2016/679 defines health data as “Personal data concerning health should include all data pertaining to the health status of a data subject which reveal information relating to the past, current or future physical or mental health status of the data subject. This includes information about the natural person collected in the course of the registration for, or the provision of, health care services as referred to in Directive 2011/24/EU of the European Parliament and of the Council (1) to that natural person; a number, symbol or particular assigned to a natural person to uniquely identify the natural person for health purposes; information derived from the testing or examination of a body part or bodily substance, including from genetic data and biological samples; and any information on, for example, a disease, disability, disease risk, medical history, clinical treatment or the physiological or biomedical state of the data subject independent of its source, for example from a physician or other health professional, a hospital, a medical device or an in vitro diagnostic test. </span></p>
<p><span> </span></p>
</div>
<div id="ftn9">
<p style="text-align: justify; "><a href="#_ftnref9" name="_ftn9"><sup><sup><span>[9]</span></sup></sup></a><span> For creating an integrated, uniform and interoperable ecosystem in a patient or individual centric manner, all the government healthcare facilities and programs, in a gradual/phased manner, should start assigning the same number for providing any benefit to individuals.</span></p>
</div>
<div id="ftn10">
<p style="text-align: justify; "><a href="#_ftnref10" name="_ftn10"><sup><sup><span>[10]</span></sup></sup></a><span> For example a manufacturer of a fitness tracker which is capable of monitoring heart rate could state that the intended purpose of the device was fitness or wellness as opposed to early detection of heart disease thereby not falling under the purview of the regulation.</span></p>
</div>
<div id="ftn11">
<p style="text-align: justify; "><a href="#_ftnref11" name="_ftn11"><sup><sup><span>[11]</span></sup></sup></a><span>“</span><span>Healthcare Executive” “GOQii Launches GOQii Smart Vital 2.0, an ECG-Enabled Smart Watch with Integrated Outcome based Health Insurance & Life Insurance, accessed 13 June 2023<br /> </span><a href="https://www.healthcareexecutive.in/blog/ecg-enabled-smart-watch"><span>https://www.healthcareexecutive.in/blog/ecg-enabled-smart-watch</span></a><span> </span></p>
</div>
<div id="ftn12">
<p style="text-align: justify; "><a href="#_ftnref12" name="_ftn12"><sup><sup><span>[12]</span></sup></sup></a><span> The guidelines only state that the Committee will be responsible for ensuring the compliance of the guidelines in relation to the personal data under its control. And does not go into details of defining the Committee.</span></p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/health-data-management-policies'>http://editors.cis-india.org/internet-governance/blog/health-data-management-policies</a>
</p>
shweta · Health Management, Privacy, Internet Governance, Covid19, Digitisation · 2023-07-10 · Blog Entry

CoWIN Breach: What Makes India's Health Data an Easy Target for Bad Actors?
http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution
<b>Recent health data policies have failed to even mention the CoWIN platform.</b>
<p style="text-align: justify; ">The article was <a class="external-link" href="https://www.thequint.com/opinion/cowin-data-breach-health-sensitive-details-policies-solution#read-more">originally published in the Quint</a> on 19 June 2023.</p>
<hr />
<p style="text-align: justify; ">Last week, it was reported that due to an alleged breach of <a href="https://www.thequint.com/fit/cowin-data-breach-private-information-covid-vaccine-telegram-bot">the CoWIN platform</a>, details such as Aadhaar and passport numbers of Indians were made public via a Telegram bot.</p>
<p style="text-align: justify; ">While Minister of State for Information Technology <a href="https://www.thequint.com/fit/cowin-data-breach-telegram-bot-covid-19-vaccine-unanswered-questions">Rajeev Chandrashekar</a> put out information acknowledging that there was some form of a data breach, there is no information on how the breach took place or when a past breach may have taken place.</p>
<blockquote class="quoted" style="text-align: justify; ">This data leak is yet another example of <a href="https://www.thequint.com/opinion/cowin-breach-shows-us-the-structural-problem-with-digital-indias-infrastructure">our health records</a> being exposed in the recent past – during the pandemic, there were reports of COVID-19 test results being leaked online. The leaked information included patients’ full names, dates of birth, testing dates, and names of centres in which the tests were held.</blockquote>
<p style="text-align: justify; ">In December last year, five servers of the <a href="https://www.thequint.com/fit/aiims-ayushman-bharat-digital-mission-health-data">All India Institute of Medical Science</a> (AIIMS) in Delhi were under a cyberattack, leaving sensitive personal data of around 3-4 crore patients compromised.</p>
<p style="text-align: justify; ">In such cases, the Indian Computer Emergency Response Team (CERT-In) is the agency responsible for looking into the vulnerabilities that may have led to them. However, till date, CERT-In has not made its technical findings into such attacks <a href="https://www.thequint.com/topic/data-breach">publicly available</a>.</p>
<h3 style="text-align: justify; ">The COVID-19 Pandemic Created Opportunity</h3>
<p style="text-align: justify; ">The pandemic saw a number of digitisation policies being rolled out in the health sector; the most notable one being the National Digital Health Mission (or NDHM, later re-branded as the Ayushman Bharat Digital Mission).</p>
<p style="text-align: justify; ">Mobile phone apps and web portals launched by the central and state governments during the pandemic are also examples of this health digitisation push. The rollout of the COVID-19 vaccinations also saw the deployment of the CoWIN platform.</p>
<p style="text-align: justify; ">Initially, it was mandatory for individuals to register on CoWIN to get an appointment for vaccination, and there was no option for walk-in-registration or to book an appointment. But, the Centre subsequently modified this rule and walk-in appointments and registrations on CoWIN became permissible from June 2021.</p>
<blockquote>However, a study conducted by the Centre for Internet and Society (CIS) found that states such as Jharkhand and Chhattisgarh, which have low internet penetration, permitted on-site registration for vaccinations from the beginning.</blockquote>
<p>The rollout of the NDHM also saw Health IDs being generated for citizens.</p>
<p style="text-align: justify; ">In several reported cases across states, this rollout happened during the COVID-19 vaccination process – without the informed consent of the concerned person.</p>
<p style="text-align: justify; ">The <b>beneficiaries who have had their Health IDs created through the vaccination process had not been informed</b> about the creation of such an ID or their right to opt out of the digital health ecosystem.</p>
<h3>A Web of Health Data Policies</h3>
<p>Even before the pandemic, India was working towards a Health ID and a health data management system.</p>
<p style="text-align: justify; ">The components of the umbrella National Digital Health Ecosystem (NDHE) are the National Digital Health Blueprint published in 2019 (NDHB) and the NDHM.</p>
<p style="text-align: justify; ">The Blueprint was created to implement the National Health Stack (published in 2018) which facilitated the creation of Health IDs. Whereas the NDHM was drafted to drive the implementation of the Blueprint, and promote and facilitate the evolution of NDHE.</p>
<p>The National Health Authority (NHA), established in 2018, has been given the responsibility of implementing the National Digital Health Mission.</p>
<blockquote style="text-align: justify; ">2018 also saw the Digital Information Security in Healthcare Act (DISHA), which was to regulate the generation, collection, access, storage, transmission, and use of Digital Health Data ("DHD") and associated personal data.</blockquote>
<p>However, since its call for public consultation, <b>no progress has been made</b> on this front.</p>
<p style="text-align: justify; ">In addition to documents that chalk out the functioning and the ecosystem of a digitised healthcare system, the NHA has released policy documents such as:</p>
<ul>
<li>
<p>the Health Data Management Policy (which has been revised three times, with the latest version released in April 2022)</p>
</li>
<li>
<p>the Health Data Retention Policy (released in April 2021)</p>
</li>
<li>
<p>Consultation paper on the Unified Health Interface (UHI) (released in December 2022)</p>
</li>
</ul>
<p style="text-align: justify; ">Along with these policies, in 2022, the NHA released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) – India’s state health insurance policy.</p>
<blockquote style="text-align: justify; ">However these <b>draft guidelines repeat the pattern of earlier policies</b> <b>on health data</b>, wherein there is no reference to the policies that predated it; the PM-JAY’s Data Sharing Guidelines, published in August 2022, did not even refer to the draft National Digital Health Data Management Policy (published in April 2022).</blockquote>
<p style="text-align: justify; "><b>Interestingly, the recent health data policies do not mention CoWIN.</b> Failing to cross-reference or mention preceding policies creates a lack of clarity on which documents are being used as guidelines by healthcare providers.</p>
<h3 style="text-align: justify; ">Can a Data Protection Bill Be the Solution?</h3>
<p>The draft Data Protection Bill, 2021, defined health data as “…the data related to the state of physical or mental health of the data principal and <b>includes records regarding the past, present or future state of the health of such data principal</b>, data collected in the course of registration for, or provision of health services, data associated with the data principal to the provision of specific health services.”</p>
<p>However, this definition, as well as the definition of sensitive personal data, was removed from the current version of the Bill (the Digital Personal Data Protection Bill, 2022).</p>
<blockquote>Omitting these definitions removes a category of data whose collection warrants increased responsibility and liability. The handling of health data, financial data, government identifiers, and the like needs to come with a higher level of responsibility, as these are sensitive details of a person.</blockquote>
<p style="text-align: justify; ">The threats posed as a result of this data being leaked are not limited to spam messages or fraud and impersonation, but also of companies that can get a hand on this coveted data and gather insights and train their systems and algorithms, without the need to seek consent from anyone, or without facing the consequences of harm caused.</p>
<p style="text-align: justify; ">While the current version of the draft DPDP Bill states that the data fiduciary shall notify the data principal of any breach, the draft Bill also states that the Data Protection Board “may” direct the data fiduciary to adopt measures that remedy the breach or mitigate harm caused to the data principal.</p>
<p style="text-align: justify; ">The Bill also prescribes penalties of upto Rs 250 crore if the data fiduciary fails to take reasonable security safeguards to prevent a personal data breach, and a penalty of upto Rs 200 crore if the fiduciary fails to notify the data protection board and the data principal of such breach.</p>
<p style="text-align: justify; ">While <b>these steps, if implemented through legislation, would make organisations processing data take their data security more seriously</b>, the removal of sensitive personal data from the definition of the Bill, would mean that data fiduciaries processing health data will not have to take additional steps other than reasonable security safeguards.</p>
<p>The <b>absence of a clear indication of security standards</b> will affect data principals and fiduciaries alike.</p>
<p style="text-align: justify; ">Looking to bring more efficiency to governance systems, the Centre launched the Digital India Mission in 2015. The press release by the central government reporting the approval of the programme by the Cabinet of Ministers speaks of ‘cradle to grave’ digital identity as one of its vision areas.</p>
<p>The ambitious Universal Health ID and health data management policies are an example of this digitisation mission.</p>
<blockquote>However, breaches like this are reminders that without proper data security measures, and without a designated person responsible for data security, data remains vulnerable to attack.</blockquote>
<p style="text-align: justify; ">While the UK and Australia have also seen massive data breaches in the past, India is at the start of its health data digitisation journey and has the ability to set up strong security measures, employ experienced professionals, and establish legal resources to ensure that data breaches are minimised and swift action can be taken in case of a breach.</p>
<p style="text-align: justify; "><b>The first step</b> to understand the vulnerabilities would be to present the CERT-In reports of this breach, and guide other institutions to check for the same so that they are better prepared for future breaches and attacks.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution'>http://editors.cis-india.org/internet-governance/blog/quint-shweta-mohandas-and-pallavi-bedi-june-19-2023-cowin-data-breach-health-sensitive-details-policies-solution</a>
</p>
Shweta Mohandas and Pallavi Bedi · Internet Governance · Data Protection · Privacy · 2023-07-04 · Blog Entry
The Centre for Internet and Society’s comments and recommendations to the Digital Personal Data Protection Bill, 2022
http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill
<b>The Centre for Internet & Society (CIS) published its comments and recommendations to the Digital Personal Data Protection Bill, 2022, on December 17, 2022.</b>
<div class="WordSection1" style="text-align: justify; ">
<p class="MsoNormal"><span> </span></p>
<p align="center" class="MsoNormal" style="text-align:center; "><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p align="right" class="MsoNormal" style="text-align:right; "><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span> </span></p>
<h1><span>High Level Comments</span></h1>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>1.<span> </span></span></b><b><span>Rationale for removing the distinction between personal data and sensitive personal data is unclear.</span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><span>All the earlier iterations of the Bill as well as the rules made under Section 43A of the Information Technology Act, 2000<a href="#_ftn1" name="_ftnref1"><sup><sup><span>[1]</span></sup></sup></a> had classified data into two categories; (i) personal data; and (ii) sensitive personal data. The 2022 version of the Bill has removed this distinction and clubbed all personal data under one umbrella heading of personal data. The rationale for this is unclear, as sensitive personal data means such data which could reveal or be related to eminently private data such as financial data, health data, sexual orientations and biometric data. Considering the sensitive nature of the data, the data classified as sensitive personal data is accorded higher protection and safeguards from processing, therefore by clubbing all data as personal data, the higher protection such as the need for explicit consent to the processing of sensitive personal data, the bar on processing of sensitive personal data for employment purposes has also been removed. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>2.<span> </span></span></b><b><span>No clear roadmap for the implementation of the Bill</span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><span>The 2018 Bill had specified a roadmap for the different provisions of the Bill to come into effect from the date of the Act being notified.<a href="#_ftn2" name="_ftnref2"><sup><sup><span>[2]</span></sup></sup></a> It specifically stated the time period within which the Authority had to be established and the subsequent rules and regulations notified. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>The present Bill does not specify any such blueprint; it does not provide any details on either when the Bill will be notified or the time period within which the Board shall be established and specific Rules and regulations notified. Considering that certain provisions have been deferred to Rules that have to be framed by the Central government, the absence and/or delayed notification of such rules and regulations will impact the effective functioning of the Bill. Provisions such as Section 10(1) which deals with verifiable parental consent for data of children, Section 13 (1) which states the manner in which a Data Principal can initiate a right to correction, the process of selection and functioning of consent manager under </span><span>3(7)</span><span> are few such examples, that when the Act becomes applicable, the data principal will have to wait for the Rules to Act of these provisions, or to get clarity on entities created by the Act. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>The absence of any sunrise or sunset provision may disincentivise political or industrial will to support or enforce the provisions of the Bill. An example of such a lack of political will was the establishment of the Cyber Appellate Tribunal. The tribunal was established in 2006 to redress cyber fraud. However, it was virtually a defunct body from 2011 onwards when the last chairperson retired. It was eventually merged with the Telecom Dispute Settlement and Appellate Tribunal in 2017. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>We recommend that Bill clearly lays out a time period for the implementation of the different provisions of the Bill, especially a time frame for the establishment of the Board. This is important to give full and effective effect to the right of privacy of the individual. It is also important to ensure that individuals have an effective mechanism to enforce the right and seek recourse in case of any breach of obligations by the data fiduciaries. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>The Board must ensure that Data Principals and Fiduciaries have sufficient awareness of the provisions of this Bill before bringing the provisions for punishment into force. This will allow the Data Fiduciaries to align their practices with the provisions of this new legislation and the Board will also have time to define and determine certain provisions that the Bill has left the Board to define. Additionally enforcing penalties for offenses initially must be in a staggered process, combined with provisions such as warnings, in order to allow first time and mistaken offenders which now could include data principals as well, from paying a high price. This will relieve the fear of smaller companies and startups and individuals who might fear processing data for the fear of paying penalties for offenses.</span></p>
<p class="MsoNormal"><span> </span></p>
<h3><a name="_kn12ecl3pdrp"></a><span>3.<span> </span></span><span>Independence of Data Protection Board of India.</span></h3>
<p class="MsoNormal"><span>The Bill proposes the creation of the Data Protection Board of India (Board) in place of the Data Protection Authority. In comparison with the powers of the Board with the 2018 and 2019 version of Personal Data Protection Bill, we witness an abrogation of powers of the Board to be created, in this Bill. Under Clause 19(2), the strength and composition of the Board, the process of selection, the terms and conditions of appointment and service, and the removal of its Chairperson and other Members shall be such as may be prescribed by the Union Government at a later stage. Further as per Clause 19(3), the Chief Executive of the Board will be appointed by the Union Government and the terms and conditions of her service will also be determined by the Union Government. The functions of the Board have also not been specified under the Bill, the Central Government may assign the functions to be performed by the Board.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>In order to govern data protection effectively, there is a need for a responsive market regulator with a strong mandate, ability to act swiftly, and resources. The political nature of personal data also requires that the governance of data, particularly the rule-making and adjudicatory functions performed by the Board are independent of the Executive. </span></p>
<h1><a name="_n9jzjnvile8f"></a><span>Chapter Wise Comments and Recommendations </span></h1>
<h2><a name="_chp7y0vgrjqa"></a><span>CHAPTER I- PRELIMINARY</span></h2>
<p class="MsoNormal"><span><span> </span>●<span> </span></span><b><span>Definition:</span></b><span> While the Bill has added a few new definitions to the Bill including terms such as gains, loss, consent manager etc. there are a few key definitions that have been removed from the earlier versions of the Bill. The removal of certain definitions in the Bill, eg. sensitive personal data, health data, biometric data, transgender status, creating a legal uncertainty about the application of the Bill. </span></p>
<p class="MsoNormal"><span>With respect to the existing definitions as well the definition of the term ‘harm’ has been significantly reduced to remove harms such as surveillance from the ambit of harms. In addition, with respect of the definition of the term of harms also, the 2019 version of the Bill under Clause 2 (20) the definition provides a non exhaustive list of harms, by using the phrase “harms include”, however in the new definition the phrase has been altered to “harm”, in relation to a Data Principal, means”, thereby removing the possibility of more harms that are not apparent currently from being within the purview of the Act. We recommend that the definition of harms be made into a non-exhaustive list.<br /> <br /> </span></p>
<h2><a name="_nhwnuzprx0ir"></a><span>CHAPTER II - OBLIGATIONS OF DATA FIDUCIARY</span></h2>
<p class="MsoNormal"><b><span>Notice: </span></b><span>The revised Clause on notice does away with the comprehensive requirements which were laid out under Clause 7 of the PDP Bill 2019. The current clause does not mention in detail what the notice should contain, while stating that that the notice should be itemised. While it can be reasoned that the Data Fiduciary can find the contents of the notice throughout the bill, such as with the rights of the Data Principal, the removal of a detailed list could create uncertainty for Data Fiduciaries. By leaving the finer details of what a notice should contain, it could cause Data Fiduciaries from missing out key information from the list, which in turn provide incomplete information to the Data Principal. Even in terms of Data Fiduciaries they might not know if they are complying with the provisions of the bill, and could result in them invariably being penalised. In addition to this by requiring less work by the Data Fiduciary and processor, the burden falls on the Data Principal to make sure they know how their data is processed and collected. The purpose of this legislation is to create further rights for individuals and consumers, hence the Bill should strive to put the individual at the forefront.</span></p>
<p class="MsoNormal"><span>In addition to this Clause 6(3) of the Bill states <i>“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English or any language specified in the Eighth Schedule to the Constitution of India.”</i> While the inclusion of regional language notices is a welcome step, we suggest that the text be revised as follows <i>“The Data Fiduciary shall give the Data Principal the option to access the information referred to in sub-sections (1) and (2) in English<b> and in</b> any language specified in the Eighth Schedule to the Constitution of India.” </i>While the main crux of notice is to let the person know before giving consent, notice in a language that a person cannot read would not lead to meaningful consent.</span></p>
<p class="MsoNormal"><b><span>Consent <br /> <br /> </span></b><span>Clause 3 of the Bill states <i>“request for consent would have the contact details of a Data Protection Officer, where applicable, or of any other person authorised by the Data Fiduciary to respond to any communication from the Data Principal for the purpose of exercise of her rights under the provisions of this Act.” </i>Ideally this provision should be a part of the notice and should be mentioned in the above section. This is similar to Clause 7(1)(c) of the draft Personal Data Protetion Bill 2019 which requires the notice to state <i>“the identity and contact details of the data fiduciary and the contact details of the data protection officer, if applicable;”. </i></span></p>
<p class="MsoNormal"><b><span>Deemed Consent</span></b></p>
<p class="MsoNormal"><span>The Bill introduces a new type of consent that was absent in the earlier versions of the Bill. We are of the understanding that deemed consent is used to redefine non consensual processing of personal data. The use of the term deemed consent and the provisions under the section while more concise than the earlier versions could create more confusion for Data Principals and Fiduciaries alike. The definition and the examples do not shed light on one of the key issues with voluntary consent - the absence of notice. In addition to this the Bill is also silent on whether deemed consent can be withdrawn or if the data principal has the same rights as those that come from processing of data they have consented to. </span></p>
<p class="MsoNormal"><b><span>Personal Data Protection of Children </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><span>The age to determine whether a person has the ability to legally consent in the online world has been intertwined with the age of consent under the Indian Contract Act; i.e. 18 years. The Bill makes no distinction between a 5 year old and a 17 year old- both are treated in the same manner. It assumes the same level of maturity for all persons under the age of 18. It is pertinent to note that the law in the offline world does recognise that distinction and also acknowledges the changes in the level of maturity. As per Section 82 of the Indian Penal Code read with Section 83, any act by a child under the age of 12 shall not be considered as an offence. While the maturity of those aged between 12–18 years will be decided by court (individuals between the age of 16–18 years can also be tried as adults for heinous crimes). Similarly, child labour laws in the country allow children above the age of 14 years to work in non-hazardous industry</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>There is a need to evaluate and rethink the idea that children are passive consumers of the internet and hence the consent of the parent is enough. Additionally, the bracketing of all individuals under the age of 18 as children fails to look at how teenages and young people use the internet. This is more important looking at the 2019 data which suggests that two-thirds of India’s internet users are in the 12–29 years age group, with those in the 12–19 age group accounting for about 21.5% of the total internet usage in metro cities. Given that the pandemic has compelled students and schools to adopt and adapt to virtual schools, the reliance on the internet has become ubiquitous with education. Out of an estimated 504 million internet users, nearly one-third are aged under 19. As per the Annual Status on Education Report (ASER) 2020, more than one-third of all schoolchildren are pursuing digital education, either through online classes or recorded videos.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>Instead of setting a blanket age for determining valid consent, we could look at alternative means to determine the appropriate age for children at different levels of maturity, similar to what had been developed by the U.K. Information Commissioner’s Office. The Age Appropriate Code prescribes 15 standards that online services need to follow. It broadly applies to online services "provided for remuneration"—including those supported by online advertising—that process the personal data of and are "likely to be accessed" by children under 18 years of age, even if those services are not targeted at children. This includes apps, search engines, social media platforms, online games and marketplaces, news or educational websites, content streaming services, online messaging services. </span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>The reservation to definition of child under the Bill has also been expressed by some members of the JPC through their dissenting opinion. MP Ritesh Pandey stated that keeping in mind the best interest of the child the Bill should consider a child to be a person who is less than 14 years of age. This would ensure that young people could benefit from the advances in technology without parental consent and reduce the social barriers that young women face in accessing the internet. Similarly Manish Tiwari in his dissenting note also observed that the regulation of the processing of data of children should be based on the type of content or data. The JPC Report observed that the Bill does not require the data fiduciary to take fresh consent of the child, once the child has attained the age of majority, and it also does not give the child the option to withdraw their consent upon reaching the majority age. It therefore, made the following recommendations:</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>Registration of data fiduciaries, exclusively dealing with children’s data. Application of the Majority Act to a contract with a child. Obligation of Data fiduciary to inform a child to provide their consent, three months before such child attains majority Continuation of the services until the child opts out or gives a fresh consent, upon achieving majority. However, these recommendations have not been incorporated into the provisions of the Bill. In addition to this the Bill is silent on the status of non consensual processing and deemed consent with respect to the data of children.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><span>We recommend that fiduciaries who have services targeted at children should be considered as significant Data Fiduciaries. In addition to this the Bill should also state that the guardians could approach the Data Protection Board on behalf of the child. With these obligations in place, the age of mandatory consent could be reduced and the data fiduciary could have an added responsibility of informing the children in the simplest manner how their data will be used. Such an approach places a responsibility on Data Fiduciaires when implementing services that will be used by children and allows the children to be aware of data processing, when they are interacting with technology.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>Chapter III-RIGHTS AND DUTIES OF DATA PRINCIPAL</span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span>Rights of Data Principal</span></b></p>
<p class="MsoNormal"><span>Clause 12(3) of the Bill while providing the Data Principal the right to be informed of the identities of all the Data Fiduciaries with whom the personal data has been shared, also states that the data principal has the right to be informed of the categories of personal data shared. However the current version of the Bill provides only one category of data that is personal data. </span></p>
<p class="MsoNormal"><span>Clause 14 of the Bill talks about the Right of Grievance Redressal, and states that the Data Principal has the right to readily available means of registering a grievance, however the Bill does not provide in the Notice provisions the need to mention details of a grievance officer or a grievance redressal mechanism. It is only the additional obligations on significant data fiduciary that mentions the need for a Data Protection officer to be the contact for the grievance redressal mechanism under the provisions of this Bill. The Bill could ideally re-use the provisions of the IT Act SPDI Rules 2011 in which Section 5(7) states <i>“Body corporate shall address any discrepancies and grievances of their provider of the information with respect to processing of information in a time bound manner. For this purpose, the body corporate shall designate a Grievance Officer and publish his name and contact details on its website. The Grievance Officer shall redress the grievances or provider of information expeditiously but within one month ' from the date of receipt of grievance.”<br /> </i><br /> The above framing would not only bring clarity to the data fiduciaries on what process to follow for a grievance redressal, it also would reduce the significant burden of theBoard. </span></p>
<p class="MsoNormal"><b><span>Duties of Data Principals</span></b></p>
<p class="MsoNormal"><span>The Bill while entisting duties of the Data Principal states that the “Data Principal shall not register a false or frivolous grievance or complaint with a Data Fiduciary or the Board”, however it is very difficult for a Data Principal to and even for the Board to determine what constitutes a “frivolous grievance”. In addition to this the absence of a defined notice provision and the inclusion of deemed consent would mean that the Data Fiduciary could have more information about the matter than the Data Principal. This could mean that the fiduciary could prove that a claim was false or frivolous. Clause 21(12) states that “<i>At any stage after receipt of a complaint, if the Board determines that the complaint is devoid of merit, it may issue a warning or impose costs on the complainant.” </i>In addition to this Clause 25(1) states that “ <i>If the Board determines on conclusion of an inquiry that non- compliance by <b>a person </b>is significant, it may, after giving the person a reasonable opportunity of being heard, impose such financial penalty as specified in Schedule 1, not exceeding rupees five hundred crore in each instance.” </i>The use of the term “person” in this case includes data which could mean that they could be penalised under the provisions of the Bill, which could also include not complying with the duties.</span></p>
<p class="MsoNormal"><span> </span></p>
<p class="MsoNormal"><b><span>CHAPTER IV- SPECIAL PROVISIONS</span></b></p>
<p class="MsoNormal"><b><span>Transfer of Personal Data outside India</span></b></p>
<p class="MsoNormal"><span>Clause 17 of the Bill has removed the requirement of data localisation which the 2018 and 2019 Bill required. Personal data can be transferred to countries that will be notified by the central government. There is no need for a copy of the data to be stored locally and no prohibition on transferring sensitive personal data and critical data. Though it is a welcome change that personal data can be transferred outside of India, we would highlight the concerns in permitting unrestricted access to and transfer of all types of data. Certain data such as defence and health data do require sectoral regulation and ringfencing of the transfer of data. </span></p>
<p class="MsoNormal"><b><span>Exemptions</span></b></p>
<p class="MsoNormal"><span>Clause 18 of the Bill has widened the scope of government exemptions. Blanket exemption has been given to the State under Clause 18(4) from deleting the personal data even when the purpose for which the data was collected is no longer served or when retention is no longer necessary. The requirement of <i>proportionality, reasonableness and fairness</i> have been removed for the Central Government to exempt any department or instrumentality from the ambit of the Bill.</span><span> </span><span>By doing away with the four pronged test, this provision is not in consonance with test laid down by the Supreme Court and are also incompatible with an effective privacy regulation. There is also no provision for either a prior judicial review of the order by a district judge as envisaged by the Justice Srikrishna Committee Report or post facto review by an oversight committee of the order as laid down under the Indian Telegraph Rules, 1951<a href="#_ftn3" name="_ftnref3"><sup><sup><span>[3]</span></sup></sup></a> and the rules framed under Information Technology Act<a href="#_ftn4" name="_ftnref4"><sup><sup><span>[4]</span></sup></sup></a>. The provision states that such processing of personal data shall be subject to the procedure, safeguard and oversight mechanisms that may be prescribed.</span></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><b><span> </span></b></p>
<p class="MsoNormal"><span> </span></p>
</div>
<div style="text-align: justify; "><br clear="all" />
<hr align="left" size="1" width="100%" />
<div id="ftn1">
<p class="MsoNormal"><a href="#_ftnref1" name="_ftn1"><sup><span><sup><span>[1]</span></sup></span></sup></a><span> Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011</span><span>.</span></p>
</div>
<div id="ftn2">
<p class="MsoNormal"><a href="#_ftnref2" name="_ftn2"><sup><span><sup><span>[2]</span></sup></span></sup></a><span> Clause 97 of the 2018 Bill states<i>“(1) For the purposes of this Chapter, the term ‘notified date’ refers to the date notified by the Central Government under sub-section (3) of section 1. (2)The notified date shall be any date within twelve months from the date of enactment of this Act. (3)The following provisions shall come into force on the notified date-(a) Chapter X; (b) Section 107; and (c) Section 108. (4)The Central Government shall, no later than three months from the notified date establish the Authority. (5)The Authority shall, no later than twelve months from the notified date notify the grounds of processing of personal data in respect of the activities listed in sub-section (2) of section 17. (6) The Authority shall no, later than twelve months from the date notified date issue codes of practice on the following matters-(a) notice under section 8; (b) data quality under section 9; (c) storage limitation under section 10; (d) processing of personal data under Chapter III; (e) processing of sensitive personal data under Chapter IV; (f) security safeguards under section 31; (g) research purposes under section 45;(h) exercise of data principal rights under Chapter VI; (i) methods of de-identification and anonymisation; (j) transparency and accountability measures under Chapter VII. (7)Section 40 shall come into force on such date as is notified by the Central Government for the purpose of that section.(8)The remaining provision of the Act shall come into force eighteen months from the notified date.”</i></span></p>
</div>
<div id="ftn3">
<p class="MsoNormal"><a href="#_ftnref3" name="_ftn3"><sup><span><sup><span>[3]</span></sup></span></sup></a><span> </span><span>Rule 419A (16): The Central Government or the State Government shall constitute a Review Committee. </span></p>
<p class="MsoNormal"><span>Rule 419 A(17): The Review Committee shall meet at least once in two months and record its findings whether the directions issued under sub-rule (1) are in accordance with the provisions of sub-section (2) of Section 5 of the said Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above it may set aside the directions and orders for destruction of the copies of the intercepted message or class of messages.</span></p>
<p class="MsoNormal"><span> </span></p>
</div>
<div id="ftn4">
<p class="MsoNormal"><a href="#_ftnref4" name="_ftn4"><sup><span><sup><span>[4]</span></sup></span></sup></a><span> </span><span>Rule 22 of Information Technology (Procedure and Safeguards for Interception, Monitoring and Decryption of Information) Rules, 2009: The Review Committee shall meet at least once in two months and record its findings whether the directions issued under rule 3 are in accordance with the provisions of sub-section (2) of section 69 of the Act and where the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and issue an order for destruction of the copies, including corresponding electronic record of the intercepted or monitored or decrypted information.</span></p>
<p class="MsoNormal"><span> </span></p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill'>http://editors.cis-india.org/internet-governance/blog/cis-comments-recommendations-to-digital-data-protection-bill</a>
</p>
Shweta Mohandas and Pallavi Bedi · Internet Governance · Digital Governance · Data Protection · Privacy · 2023-01-20 · Blog Entry
Demystifying Data Breaches in India
http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india
<b>Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.
</b>
<p>Edited by Arindrajit Basu and Saumyaa Naidu</p>
<hr />
<p dir="ltr" style="text-align: justify; ">India saw a <a href="https://theprint.in/india/despite-62-drop-in-data-breaches-india-among-top-5-nations-targeted-by-hackers-study-finds/917197/">62% drop in data breaches in the first quarter of 2022</a>. Yet, it ranked fifth on the list of countries most hit by cyberattacks according to a 2022 <a href="https://surfshark.com/blog/data-breach-statistics-by-country">report by Surfshark</a>, a Netherlands-based VPN company. Another report <a href="https://analyticsindiamag.com/the-ridiculous-17-5-cr-for-a-data-breach/">on the cost of data breaches researched by the Ponemon Institute and published by IBM</a> reveals that the breach of about 29500 records between March 2021 and March 2022 resulted in a 25% increase in the average cost from INR 165 million in 2021 to INR 176 million in 2022.</p>
<p style="text-align: justify; "><span>These statistics are certainly a cause for concern, especially in the context of India’s rapidly burgeoning digital economy shaped by the pervasive platformization of private and public services such as welfare, banking, finance, health, and shopping among others. Despite the rate at which data breaches occur and are reported in the media, there seems to be little information about how and when they are resolved. This post examines the discourse on data breaches in India with respect to their historical forms, with a focus on how the specific terminology to describe data security incidents has evolved in mainstream news media reportage.</span></p>
<p style="text-align: justify; "><span>While expert articulations of cybersecurity in general and data breaches in particular tend to predominate the public discourse on data privacy, this post aims to situate broader understandings of data breaches within the historical context of India’s IT revolution and delve into specific concepts and terminology that have shaped the broader discourse on data protection. The late 1990s and early 2000s offer a useful point of entry into the genesis of the data security landscape in India.</span></p>
<h3><span></span><span>Data Breaches and their Predecessor Forms</span></h3>
<p style="text-align: justify; "><span></span><span>The articulation of data security concerns around the late 1990s and early 2000s isn’t always consistent in deploying the phrase, ‘data breach’ to signal cybersecurity concerns in India. The terms such as ‘data/ identity theft’ and ‘data leak’ figure prominently in the public articulation of concerns with the handling of personal information by IT systems, particularly in the context of business process outsourcing (BPO) and e-commerce activities. Other pertinent terms such as “security breach”, “data security”, and ‘“cyberfraud” also capture the specificity of growing concerns around outsourced data to India. At the time, i.e. around mid-2000s regulatory frameworks were still evolving to accommodate and address the complexities arising from a dynamic reconfiguration of the telecommunications and IT landscape in India.</span></p>
<p dir="ltr" style="text-align: justify; ">Some of the formative cases that instantiate the usage of the aforementioned terms are instructive to understand shifts in the reporting of such incidents over time. The earliest case during that period concerns<a href="https://www.stop-source-code-theft.com/source-code-theft-cases-in-india/"> a 2002 case concerning the theft and sale of source code</a> by an IIT Kharagpur student who intended to sell the code to two undercover FBI agents who worked with the CBI to catch the thief. A straightforward case of data theft was framed by media stories around the time as a <a href="https://timesofindia.indiatimes.com/iitian-held-for-stealing-software-source-code/articleshow/20389713.cms">cybercrime involving the illegal sale</a> of the source code of a software package, as <a href="https://economictimes.indiatimes.com/ip-laws-lax-but-us-firm-bets-on-india/articleshow/696197.cms?from=mdr">software theft of intellectual property in the context of outsourcing</a> and as an instance of <a href="https://www.computerworld.com/article/2573515/at-risk-offshore.html">industrial espionage in poor nations without laws protecting foreign companies</a>. This case became the basis of the earliest calls for the protection of data privacy and security in the context of the Indian BPO sector. The Indian IT Act, 2000 at the time only covered <a href="http://pavanduggal.com/wp-content/uploads/2016/01/India-Responds-to-Growing-Concerns-Over-Data-Security.pdf">unauthorized access and data theft from computers and networks without any provisions for data protection, interception or computer forgery</a>. The BPO boom in India brought with it <a href="https://blj.ucdavis.edu/archives/vol-6-no-2/offshore-outsourcing-to-india.html">employment opportunities for India’s English-speaking, educated youth but in the absence of concrete data privacy legislation</a>, the country was regarded as an unsafe destination for outsourcing aside from the political ramifications concerning the loss of American jobs.</p>
<p dir="ltr" style="text-align: justify; ">In a major 2005 incident, employees of the Mphasis BFL call centre in Pune extracted sensitive bank account information of Citibank’s American customers to divert INR 1.90 crore into new accounts set up in India. The media coverage of this incident calls it <a href="https://www.indiatoday.in/magazine/economy/story/20050502-pune-call-centre-fraud-rattles-india-booming-bpo-sector-787790-2005-05-01">India’s first outsourcing cyberfraud and a well planned scam</a>, a <a href="https://economictimes.indiatimes.com/mphasis-call-centre-fraud-net-widens/articleshow/1077097.cms">cybercrime in a globalized world</a>, and a case of <a href="https://timesofindia.indiatimes.com/home/sunday-times/deep-focus/indias-first-bpo-scam-unraveled/articleshow/1086438.cms">financial fraud and a scam</a> that required no hacking skills, and a <a href="https://www.infoworld.com/article/2668975/indian-call-center-workers-charged-with-citibank-fraud.html">case of data theft and misuse</a>. Within the ambit of cybercrime, media reports of these incidents refer to them as cases of “fraud”, “scam” and “theft''.</p>
<p dir="ltr" style="text-align: justify; ">Two other incidents in 2005 set the trend for a critical spotlight on data security practices in India. In a <a href="http://news.bbc.co.uk/2/hi/south_asia/4619859.stm">June 2005 incident, an employee of a Delhi-based BPO firm, Infinity e-systems, sold the account numbers and passwords of 1000 bank customers </a>to the British Tabloid, The Sun. The Indian newspaper, Telegraph India, carried an online story headlined, “<a href="https://www.telegraphindia.com/india/bpo-blot-in-british-backlash-indian-sells-secret-data/cid/873737">BPO Blot in British Backlash: Indian Sells Secret Data</a>,” which reported that the employee, Kkaran Bahree, 24, was set up by a British journalist, Oliver Harvey. Harvey filmed Bahree accepting wads of cash for the stolen data. Bahree’s theft of sensitive information is described both as a data fraud and a leak in the above 2005 BBC story by Soutik Biswar. Another story on the incident calls it a “<a href="https://www.rediff.com/money/2005/jun/24bpo3.htm">scam” involving the leakage of credit card information</a>. The use of the term ‘leak’ appears consistently across other media accounts such as a <a href="https://timesofindia.indiatimes.com/city/delhi/esearch-bpo-employee-sacked-still-missing/articleshow/1153017.cms">2005 story on Karan Bahree in the Times of India</a> and another story in the Economic Times about the Australian Broadcasting Corporation’s (ABC) sting operation similar to the one in Delhi, describing the scam by the <a href="https://economictimes.indiatimes.com/hot-links/bpo/karan-bahree-part-ii-shot-in-australia/articleshow/1201347.cms?from=mdr">fraudsters as a leak</a> of the online information of Australians. Another media account of the coverage describes the incident in more generic terms such as an “<a href="https://www.tribuneindia.com/2005/20050625/edit.htm">outsourcing crime</a>”.</p>
<p dir="ltr" style="text-align: justify; ">The other case concerned <a href="https://www.taylorfrancis.com/chapters/mono/10.4324/9781315610689-16/political-economy-data-security-bpo-industry-india-alan-chong-faizal-bin-yahya">four former employees of Parsec technologies who stole classified information and diverted calls from potential customers</a>, causing a sudden drop in the productivity of call centres managed by the company in November 2005. Another call centre <a href="http://news.bbc.co.uk/1/hi/uk/7953401.stm">fraud came to light in 2009 through a BBC sting operation in which British reporters went to Delhi </a>and secretly filmed a deal with a man selling credit card and debit card details obtained from Symantec call centres, which sold software made by Norton. This BBC story uses the term “breach” to refer to the incident.</p>
<p dir="ltr">In the broader framing of these cases generally understood as cybercrime, which received transnational media coverage, the terms “fraud”, “leak”, “scam”, and “theft” appear interchangeably. The term “data breach” does not seem to be a popular or common usage in these media accounts of the BPO-related incidents. A broader sense of breach (of confidentiality, privacy) figures in the media reportage in <a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr">implicitly racial terms of cultural trust</a>, as a matter of <a href="https://www.news18.com/news/business/bpo-staff-need-ethical-training-poll-248442.html">ethics and professionalism</a> and in the <a href="https://www.news18.com/news/business/sting-op-may-spell-doom-for-bpos-248260.html">language of scandal </a>in some cases.</p>
<p dir="ltr" style="text-align: justify; ">These early cases typify a specific kind of cybercrime concerning the theft or misappropriation of outsourced personal data belonging to British or American residents. What’s remarkable about these cases is the utmost sensitivity of the stolen personal information including financial details, bank account and credit/debit card numbers, passwords, and in one case, source code. While these cases rang the alarm bells on the Indian BPO sector’s data security protocols, they also directed attention to concerns around <a href="https://economictimes.indiatimes.com/hot-links/bpo/cyber-crimes-can-the-west-trust-indian-bpos/articleshow/1157115.cms?from=mdr">the training of Indian employees on the ethics of data confidentiality and vetting through psychometric tests</a> for character assessment. In the wake of these incidents, the National Association of Software and Service Companies (NASSCOM), an Indian non-governmental trade and advocacy group,<a href="https://www.computerworld.com/article/2547959/outsourcing-to-india--dealing-with-data-theft-and-misuse.html"> launched a National Skills Registry for IT professionals to enable employers to conduct background checks</a> in 2006.</p>
<p dir="ltr" style="text-align: justify; ">These data theft incidents earned India a global reputation of an unsafe destination for business process outsourcing, seen to be lacking both, a culture of maintaining data confidentiality and concrete legislation for data protection at the time. Importantly, the incidents of data theft or misappropriation were also traceable back to a known source, a BPO employee or a group of malefactors, who often sold sensitive data belonging to foreign nationals to others in India.</p>
<p dir="ltr" style="text-align: justify; ">The phrase “data leak” also caught on in another register in the context of the widespread use of camera-equipped mobile phones in India. The 2004 Delhi MMS case offers an instance of a date leak, recapitulating the language of scandal in moralistic terms.</p>
<h3 dir="ltr">The Delhi MMS Case</h3>
<p dir="ltr" style="text-align: justify; ">The infamous 2004 incident involved two underage Delhi Public School (DPS) students who recorded themselves in a sexually explicit act on a cellular phone. After a fall out, the male student passed the low-resolution clip on to his friend in which his female friend’s face is seen. The clip, distributed far and wide in India, ended up on the famous e-shopping and auction website, bazee.com leading to <a href="https://indiancaselaw.in/avnish-bajaj-vs-state-dps-mms-scandal-case/">the arrest of the website’s CEO Avinash Bajaj for hosting the listing for sale</a>. Another similar case in 2004 mimicked the mechanics of visual capture through hand-held MMS-enabled mobile phones. A two-minute MMS of a top South-Indian actress <a href="https://timesofindia.indiatimes.com/india/web-of-sleaze-now-nude-video-of-top-actress/articleshow/966048.cms">taking a shower went viral on the Internet in 2004, the year when another MMS of two prominent Bollywood actors kissing</a> had already done the rounds. The <a href="https://www.journals.upd.edu.ph/index.php/plaridel/article/view/2392">MMS case also marked the onset of a national moral panic around the amateur uses of mobile phone technologies</a>, capable of corrupting young Indian minds under a sneaky regime of new media modernity. The MMS case, not strictly the classic case of a data breach - non-visual information generally stored in databases - became an iconic case of a data leak framed in the media as <a href="https://www.telegraphindia.com/india/scandal-in-school-shakes-up-delhi/cid/1667531">a scandal that shocked the country</a>, with calls for the regulation of mobile phone use in schools. The case continued its scandalous afterlife in a <a href="https://www.heraldgoa.in/Edit/dev-ds-leni-has-a-dps-mms-scandal-connection-/21344">2009 Bollywood film, Dev D</a> and another <a href="https://indianexpress.com/article/entertainment/entertainment-others/delhi-mms-scandal-inspires-dibakars-love-sex-aur-dhoka/">2010 film, Love, Sex and Dhokha</a>,</p>
<p dir="ltr" style="text-align: justify; ">Taken together, the BPO data thefts and frauds and the data leak scandals prefigure the contemporary discourse on data breaches in the second decade of the 21st century, or what may also be called the Decade of Datafication. The launch of the Indian biometric identity project, Aadhaar, in 2009, which linked access to public services and welfare delivery with biometric identification, resulted in large-scale data collection of the scheme’s subscribers. Such linking raised the spectre of state surveillance as alleged by the critics of Aadhaar, marking a watershed moment in the discourse on data privacy and protection.</p>
<h3 dir="ltr">Aadhaar Data Security and Other Data Breaches</h3>
<p dir="ltr" style="text-align: justify; ">Aadhaar was challenged in the Indian Supreme Court in 2012 when <a href="https://www.outlookindia.com/website/story/worries-about-the-aadhaar-monster/296790">it was made mandatory for welfare and other services such as banking, taxation and mobile telephony</a>. The national debate on the status of privacy as a cultural practice in Indian society and a fundamental right in the Indian Constitution led to two landmark judgments - the <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf">2017 Puttaswamy ruling</a> holding privacy to be a constitutional right subject to limitations and <a href="https://indiankanoon.org/doc/127517806/">the 2018 Supreme Court judgment holding mandatory Aadhaar to be constitutional only for welfare and taxation but no other service</a>.</p>
<p dir="ltr" style="text-align: justify; ">While these judgments sought to rein in Aadhaar’s proliferating mandatory uses, biometric verification remained the most common mode of identity authentication with <a href="https://www.businesstoday.in/latest/trends/story/aadhaar-not-mandatory-yet-organisations-pose-it-as-a-mandatory-document-335550-2022-05-29">most organizations claiming it to be mandatory for various purposes</a>. During the same period from 2010 onwards, a range of data security events concerning Aadhaar came to light. These included <a href="https://www.firstpost.com/tech/news-analysis/aadhaar-security-breaches-here-are-the-major-untoward-incidents-that-have-happened-with-aadhaar-and-what-was-actually-affected-4300349.html">app-based flaws, government websites publishing Aadhaar details of subscribers, third party leaks of demographic data, duplicate and forged Aadhaar cards and other misuses</a>.</p>
<p dir="ltr" style="text-align: justify; ">In 2015, the Indian government launched its ambitious <a href="https://indiancc.mygov.in/wp-content/uploads/2021/08/mygov-10000000001596725005.pdf">Digital India Campaign to provide government services to Indian citizens</a> through online platforms. Yet, data security breach incidents continued to increase, particularly the trade in the sale and purchase of sensitive financial information related to bank accounts and credit card numbers. The online availability of <a href="https://www.livemint.com/Industry/l5WlBjdIDXWehaoKiuAP9J/India-unprepared-to-tackle-online-data-security-report.html">a rich trove of data, accessible via a simple Google search without the use of any extractive software or hacking skills </a>within a thriving shadow economy of data buyers and sellers makes India a particularly vulnerable digital economy, especially in the absence of robust legislation. The lack of awareness around digital crimes and low digital literacy further exacerbates the situation given that datafication via government portals, e-commerce, and online apps has outpaced the enforcement of legislative frameworks for data protection and cybersecurity.</p>
<p dir="ltr" style="text-align: justify; ">In the context of Aadhaar data security issues, the term “data leak” seems to have more traction in media stories followed by the term “security breach”. Given the complexity of the myriad ways in which Aadhaar data has been breached, terms such as <a href="https://techcrunch.com/2022/06/13/aadhaar-leak-pm-kisan/?guccounter=1&guce_referrer=aHR0cHM6Ly93d3cuZ29vZ2xlLmNvbS8&guce_referrer_sig=AQAAADvQXtC19Gj80LSKVc5jLwnRsREalvM2f6dV3N9KmCs8be6_1Zbvu3J6abPmBxhLlUooLiOjg4JktYDDCXr0OYYvOZ5XFlXa6DfCJk97TvMXM-cs3uJbCJBA-ePqvAC5K4qGZSyDB4OykMEOIKXJpB0CTOourPRc5dBxFFq5JXlB">data leak and exposure</a> (of <a href="https://zeenews.india.com/personal-finance/aadhaar-data-breach-over-110-crore-indian-farmers-aadhaar-card-data-compromised-2473666.html">11 crore Indian farmers’ sensitive information</a>) add to the specificity of the data security compromise. The term “fraud” also makes a comeback in the context of <a href="https://www.business-standard.com/article/economy-policy/india-s-aadhaar-id-system-delivers-benefits-but-at-risk-of-widespread-fraud-122062400124_1.html">Aadhaar-related data security incidents</a>. These cases represent a mix of data frauds involving<a href="https://economictimes.indiatimes.com/news/india/alarm-over-fake-id-printing-websites-using-customer-data-for-cyber-fraud/articleshow/94742646.cms"> fake identities</a>, <a href="https://indianexpress.com/article/cities/delhi/in-new-age-data-theft-fraudsters-steal-thumb-prints-from-land-registries-7914530/">theft of thumb prints </a>for instance from land registries and inadvertent data leaks in numerous incidents involving <a href="https://techcrunch.com/2019/01/31/aadhaar-data-leak/">government employees in Jharkhand</a>, v<a href="https://www.firstpost.com/india/aadhaar-data-leak-details-of-7-82-cr-indians-from-ap-and-telangana-found-on-it-grids-database-6448961.html">oter ID information of Indian citizens in Andhra Pradesh and Telangana</a> and <a href="https://www.thehindu.com/sci-tech/technology/major-aadhaar-data-leak-plugged-french-security-researcher/article26584981.ece">activist reports of Indian government websites leaking Aadhaar data</a>.</p>
<p dir="ltr" style="text-align: justify; ">Aadhaar-related data security events parallel the increase in corporate data breaches during the decade of datafication. The term “data leak” again alternates with the term “data breach” in most media accounts while other terms such as “theft” and “scam” all but disappear in the media coverage of corporate data breaches.</p>
<p dir="ltr" style="text-align: justify; ">From 2016 onwards, incidents of corporate data breaches in India continued to rise. A massive <a href="https://thewire.in/banking/debit-card-breach-india-banking">debit card data breach involving the YES Bank ATMs and point-of-sale (PoS) machines </a>compromised through malware between May and July of 2016 resulted in the exposure of ATM PINs and non-personal identifiable information of customers. It went <a href="https://www.livemint.com/Industry/Ope7B0jpjoLkemwz6QXirN/SBI-Yes-Bank-MasterCard-deny-data-breach-of-own-systems.html">undetected for nearly three</a> months. Another data leak in 2018 concerned a <a href="https://www.zdnet.com/article/another-data-leak-hits-india-aadhaar-biometric-database/">system run by Indane, a state-owned utility company, which allowed anyone to download private information on all Aadhaar holders </a>including their names, services they were connected to and the unique 12-digit Aadhaar number. Data breaches continued to be reported in India concurrent with the incidents of data mismanagement related to Aadhaar. Some <a href="https://www.csoonline.com/article/3541148/the-biggest-data-breaches-in-india.html">prominent data breaches included </a>a cyberattack on the systems of airline data service provider SITA resulting in the leak of Air India passenger data, leakage of the personal details of the Common Admission Test (CAT) applicants, details of credit card and order preferences of Domino’s pizza customers on the dark web, leakage of COVID-19 patients’ test results leaked by government websites, user data of Justpay and Big Basket for sale on the dark web and an SBI data breach among others between 2019 and 2021.</p>
<p dir="ltr" style="text-align: justify; ">The media reportage of these data breaches use the term “cyberattack” to describe the activities of hackers and cybercriminals operating within a<a href="https://www.thehindu.com/sci-tech/technology/internet/most-damaging-cybercrime-services-are-cheap-on-the-dark-web/article37004587.ece"> shadow economy or the dark web</a>. Recent examples of cyberattacks by hackers who leak user data for sale on the dark web include <a href="https://indianexpress.com/article/technology/tech-news-technology/mobikwik-database-leaked-on-dark-web-company-denies-any-data-breach-7251448/">8.2 terabytes of 110 million sensitive financial data (KYC details, Aadhaar, credit/debit cards and phone numbers) of the payments app MobiKwik users</a>, <a href="https://www.firstpost.com/tech/news-analysis/dominos-india-data-breach-name-location-mobile-number-email-of-18-crore-orders-up-for-sale-on-dark-web-9650591.html">180 million Domino’s pizza orders (name, location, emails, mobile numbers),</a> and <a href="https://techcrunch.com/2022/07/18/cleartrip-data-breach-dark-web/">Flipkart’s Cleartrip users’ data</a>. In these incidents again, three terms appear prominently in the media reportage - cyberattack, data breach, and leak. The term “data breach” remains the most frequently used epithet in the media coverage of the lapses of data security. While it alternates with the term “leak” in the stories, the term “data breach” appears consistently across most headlines in the news stories.</p>
<p dir="ltr">The exposure of sensitive, personal, and non-personal data by public and private entities in India is certainly a cause for concern, given the ongoing data protection legislative vacuum.</p>
<p dir="ltr" style="text-align: justify; ">The media coverage of data breaches tends to emphasize the quantum of compromised user data aside from the types of data exposed. The media framing of these breaches in <a href="https://www.livemint.com/technology/tech-news/indian-firms-lost-176-million-to-data-breaches-last-fiscal-11658914231530.html">quantitative terms of financial loss</a> as well as the <a href="https://www.indiatoday.in/technology/news/story/personal-data-of-3-4-million-paytm-mall-users-reportedly-exposed-in-2020-data-breach-1980690-2022-07-27">magnitude</a> and the <a href="https://www.moneycontrol.com/news/business/banks/indian-banks-reported-248-data-breaches-in-last-four-years-says-government-8940891.html">number of breaches</a> certainly highlights the gravity of these incidents but harm to individual users is often not addressed.</p>
<h3 dir="ltr">Evolving Terminology and the Source of Data Harms</h3>
<p dir="ltr" style="text-align: justify; ">The main difference in the media reportage of the BPO cybersecurity incidents during the early aughts and the contemporary context of datafication is the usage of the term, “data breach”, which figures prominently in contemporary reportage of data security incidents but not so much in the BPO-related cybercrimes.</p>
<p dir="ltr" style="text-align: justify; ">THe BPO incidents of data theft and the attendant fraud must be understood in the context of the anxieties brought on by a globalizing world of Internet-enabled systems and transnational communications. In most of these incidents regarded as cybercrimes, the language of fraud and scam ventures further to attribute such illegal actions of the identifiable malefactors to cultural factors such as lack of ethics and professionalism.The usage of the term “data leak” in these media reports functions more specifically to underscore a broader lapse in data security as well as a lack of robust cybersecurity laws. The broader term, “breach”, is occasionally used to refer to these incidents but the term, “data breach” doesn’t appear as such.</p>
<p dir="ltr" style="text-align: justify; ">The term “data breach” gains more prominence in media accounts from 2009 onwards in the context of Aadhaar and the online delivery of goods and services by public and private players. The term “data breach” is often used interchangeably with the term “leak” within the broader ambit of cyberattacks in the corporate sector. The media reportage frames Aadhaar-related security lapses as instances of security/data breaches, data leaks, fraud, and occasionally scam.</p>
<p dir="ltr" style="text-align: justify; ">In contrast to the handful of data security cases in the BPO sector, data breaches have abounded in the second decade of the twenty-first century. What further differentiates the BPO-related incidents to the contemporary data breaches is the source of the data security lapse. Most corporate data breaches remain attributable to the actions of hackers and cybercriminals while the BPO security lapses were traceable back to ex-employees or insiders with access to sensitive data. We also see in the coverage of the BPO-related incidents, the attribution of such data security lapses to cultural factors including a lack of ethics and professionalism often in racial overtones. The media reportage of the BBC and ABC sting operations suggests that the India BPOs lack of preparedness to handle and maintain personal data confidentiality of foreigners point to the absence of a privacy culture in India. Interestingly, this transnational attribution recurs in a different form in the national debate on <a href="https://huffpost.netblogpro.com/archive/in/entry/indians-don-t-care-about-privacy-but-thankfully-the-law-will-teach-them-what-it-means_a_23179031">Aadhaar and how Indians don’t care about their privacy</a>.</p>
<p dir="ltr" style="text-align: justify; ">The question of the harms of data breaches to individuals is also an important one. In the discourse on contemporary data breaches, the actual material harm to an individual user is rarely ever established in the media reportage and generally framed as potential harm that could be devastating given the sensitivity of the compromised data. The harm is reported to be predominantly a function of organizational cybersecurity weakness or attributed to hackers and cybercriminals.</p>
<p dir="ltr" style="text-align: justify; ">The reporting of harm in collective terms of the number of accounts breached, financial costs of a data breach, the sheer number of breaches and the global rankings of countries with the highest reported cases certainly suggests a problem with cybersecurity and the lack of organizational preparedness. However, this collective framing of a data breach’s impact usually elides an individual user’s experience of harm. Even in the case of Aadhaar-related breaches - a mix of leaking data on government websites and other online portals and breaches - the notion of harm owing to exposed data isn’t clearly established. This is, however, different from the <a href="https://scroll.in/article/1013700/six-types-of-problems-aadhaar-is-causing-and-safeguards-needed-immediately">extensively documented cases of Aadhaar-related issues</a> in which welfare benefits have been denied, identities stolen and legitimate beneficiaries erased from the system due to technological errors.</p>
<h3 dir="ltr">Future Directions of Research</h3>
<p dir="ltr" style="text-align: justify; ">This brief, qualitative foray into the media coverage of data breaches over two decades has aimed to trace the usage of various terms in two different contexts - the Indian BPO-related incidents and the contemporary context of datafication. It would be worth exploring at length, the relationship between frequent reports of data breaches, and the language used to convey harm in the contemporary context of a concrete data protection legislation vacuum. It would be instructive to examine the specific uses of the terms such as “fraud”, “leak”, “scam”, “theft” and “breach” in media reporting of such data security incidents more exhaustively. Such analysis would elucidate how media reportage shapes public perception towards the safety of user data and an anticipation of attendant harm as data protection legislation continues to evolve.</p>
<p dir="ltr" style="text-align: justify; ">Especially with Aadhaar, which represents a paradigm shift in identity verification through digital means, it would be useful to conduct a sentiment analysis of how biometric identity related frauds, scams, and leaks are reported by the mainstream news media. A study of user attitudes and behaviours in response to the specific terminology of data security lapses such as the terms “breach”, “leak”, “fraud”, “scam”, “cybercrime”, and “cyberattack” would further contribute to how lay users understand the gravity of a data security lapse. Such research would go beyond expert understandings of data security incidents that tend to dominate media reportage to elucidate the concerns of lay users and further clarify the cultural meanings of data privacy.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india'>http://editors.cis-india.org/internet-governance/blog/demistifying-data-breaches-in-india</a>
</p>
Pawan Singh | Privacy, Internet Governance, Data Governance, Data Protection, Data Management | 2022-10-17 | Blog Entry

Getting the (Digital) Indo-Pacific Economic Framework Right
http://editors.cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right
<b>On the eve of the Tokyo Quad Summit in May 2022, President Biden unveiled the Indo-Pacific Economic Framework (IPEF), visualising cooperation across the Indo-Pacific based on four pillars: trade; supply chains; clean energy, decarbonisation and infrastructure; and tax and anti-corruption. Galvanised by the US, the other 12 founding members of the IPEF are Australia, Brunei Darussalam, India, Indonesia, Japan, Republic of Korea, Malaysia, New Zealand, Philippines, Singapore, Thailand and Vietnam. The first official in-person Ministerial meeting was held in Los Angeles on 9 September 2022.</b>
<p style="text-align: justify; ">The article was <a class="external-link" href="https://directionsblog.eu/getting-the-digital-indo-pacific-economic-framework-right/">originally published in Directions</a> on 16 September 2022.</p>
<hr />
<p style="text-align: justify; ">It is still early days. Given the broad and noncommittal scope of the <a href="http://indiamediamonitor.in/ViewImg.aspx?rfW3mQFhdxZsqXnJzK5Xi5+XYlnW6zXnPDF3Ad56Y/KdgI1zvICzrodtLI85MPKdVO1fIh79GUlPfyXY2/bE2g==" rel="noreferrer noopener" target="_blank">economic arrangement</a>, it is unlikely that the IPEF will lead to a trade deal among members in the short run. Instead, experts believe that this new arrangement is designed to serve as a ‘<a href="https://indianexpress.com/article/opinion/columns/building-on-common-ground-7963518/" rel="noreferrer noopener" target="_blank">framework or starting point</a>’ for members to cooperate on geo-economic issues relevant to the Indo-Pacific, buoyed in no small part by the United States’ desire to make up lost ground and counter Chinese economic influence in the region.</p>
<p style="text-align: justify; ">United States Trade Representative (USTR) Katherine Tai has underscored the relevance of the Indo-Pacific digital economy to the US agenda with the IPEF. She has emphasized the <a href="https://www.whitehouse.gov/briefing-room/press-briefings/2022/05/23/on-the-record-press-call-on-the-launch-of-the-indo-pacific-economic-framework/" rel="noreferrer noopener" target="_blank">importance of</a> collaboratively addressing key connectivity and technology challenges, including standards on cross-border data flows, data localisation and online privacy, as well as the discriminatory and unethical use of artificial intelligence. This is an ambitious agenda given the divergence among members in terms of technological advancement, domestic policy preferences and international negotiating stances at digital trade forums. There is a significant risk that imposing external standards or values on this evolving and politically-contested digital economy landscape will not work, and may even undermine the core potential of the IPEF in the Indo-Pacific. This post evaluates the domestic policy preferences and strategic interests of the Framework’s member states, and how the IPEF can navigate key points of divergence in order to achieve meaningful outcomes.</p>
<h3><strong>State of domestic digital policy among IPEF members</strong></h3>
<p style="text-align: justify; ">Data localisation is a core point of divergence in global digital policymaking. It continues to dominate discourse and trigger dissent at all <a href="https://www.ikigailaw.com/the-data-localization-debate-in-international-trade-law/#acceptLicense" rel="noreferrer noopener" target="_blank">international trade forums</a>, including the World Trade Organization. IPEF members have a range of domestic mandates restricting cross-border flows, which vary in scope, format and rigidity (see table below)<strong>. </strong>Most countries only have a conditional data localisation requirement, meaning data can only be transferred to countries where it is accorded an equivalent level of protection – unless the individual whose data is being transferred consents to said transfer. <a href="https://www.lexology.com/library/detail.aspx?g=ee977f2e-ecfb-45cf-9f63-186a78a49512#:~:text=Australia%20has%20no%20broad%20data,transferred%20or%20processed%20outside%20Australia." rel="noreferrer noopener" target="_blank">Australia </a>and the <a href="https://www.acq.osd.mil/dpap/pdi/docs/FAQs_Network_Penetration_Reporting_and_Contracting_for_Cloud_Services_(01-27-2017).pdf" rel="noreferrer noopener" target="_blank">United States</a> have sectoral localisation requirements for health and defence data respectively. India presently has multiple sectoral data localisation requirements. In particular, a 2018 Reserve Bank of India (RBI) <a href="https://www.rbi.org.in/Scripts/NotificationUser.aspx?Id=11244&Mode=0" rel="noreferrer noopener" target="_blank">directive</a> imposed strict local storage requirements along with a 24-hour window for foreign processing of payments data generated in India. The RBI imposed a <a href="https://theprint.in/economy/what-is-data-localisation-why-mastercard-amex-diners-club-cant-add-more-customers-in-india/703790/" rel="noreferrer noopener" target="_blank">moratorium</a> on the issuance of new cards by several US-based card companies until compliance issues with the data localisation directive were resolved. Furthermore, several iterations of India’s recently <a href="https://www.thehindu.com/sci-tech/technology/internet/explained-why-has-the-government-withdrawn-the-personal-data-protection-bill-2019/article65736155.ece" rel="noreferrer noopener" target="_blank">withdrawn </a>Personal Data Protection Bill contained localisation requirements for some categories of personal data.</p>
<p style="text-align: justify; ">Indonesia and Vietnam have <a href="https://thediplomat.com/2020/01/the-retreat-of-the-data-localization-brigade-india-indonesia-and-vietnam/" rel="noreferrer noopener" target="_blank">diluted</a> the scopes of their data localisation mandates to apply, respectively, only to companies providing public services and to companies not complying with other local laws. These dilutions may have occurred in response to concerted pushback from foreign technology companies operating in these countries. In addition to sectoral restrictions on the transfer of geospatial data, South Korea<a href="https://carnegieendowment.org/2021/08/17/korean-approach-to-data-localization-pub-85165" rel="noreferrer noopener" target="_blank"> retains </a>several procedural checks on cross-border flows, including formalities regarding providing notice to individual users.</p>
<p style="text-align: justify; ">Moving onto another issue flagged by USTR Tai, while all IPEF members recognise the right to information privacy at an overarching or constitutional level, the legal and policy contours of data protection are at different stages of evolution in different countries. <a href="https://www.dlapiperdataprotection.com/index.html?t=law&c=JP#:~:text=Personal%20Information%20Protection%20Commission,-Kasumigaseki%20Common%20Gate&text=Japan%20does%20not%20have%20a%20central%20registration%20system.&text=There%20is%20no%20specific%20legal,(eg%20Chief%20Privacy%20Officer)." rel="noreferrer noopener" target="_blank">Japan</a>, <a href="https://www.dlapiperdataprotection.com/index.html?t=law&c=KR" rel="noreferrer noopener" target="_blank">South Korea</a>, <a href="https://www.pdp.gov.my/jpdpv2/assets/2020/01/Introduction-to-Personal-Data-Protection-in-Malaysia.pdf" rel="noreferrer noopener" target="_blank">Malaysia</a>, <a href="https://www.linklaters.com/en/insights/data-protected/data-protected---new-zealand#:~:text=There%20is%20no%20data%20portability%20right%20in%20New%20Zealand.&text=While%20there%20is%20no%20%22right,a%20correction%20to%20that%20information." rel="noreferrer noopener" target="_blank">New Zealand,</a> <a href="https://www.privacy.gov.ph/data-privacy-act/#:~:text=%E2%80%93%20(a)%20The%20personal%20information,against%20any%20other%20unlawful%20processing." rel="noreferrer noopener" target="_blank">Philippines</a>, <a href="https://www.pdpc.gov.sg/Overview-of-PDPA/The-Legislation/Personal-Data-Protection-Act#:~:text=What%20is%20the%20PDPA%3F,Banking%20Act%20and%20Insurance%20Act." rel="noreferrer noopener" target="_blank">Singapore</a> and <a href="https://www.trade.gov/market-intelligence/thailand-personal-data-protection-act#:~:text=The%20legislation%20mandates%20that%20data,1%20million%20in%20criminal%20fines." rel="noreferrer noopener" target="_blank">Thailand </a>have data protection frameworks in place. Data protection frameworks in India and Brunei are under consultation. Notably, the US does not have a comprehensive federal framework on data privacy, although there are patchworks of data privacy regulations at both the federal and state levels.</p>
<p style="text-align: justify; ">Regulation and strategic thinking on artificial intelligence (AI) are also at varying levels of development among IPEF members. India has produced a slew of policy papers on Responsible Artificial Intelligence. The most recent <a href="https://www.niti.gov.in/sites/default/files/2021-08/Part2-Responsible-AI-12082021.pdf" rel="noreferrer noopener" target="_blank">policy paper</a> published by NITI AAYOG (the Indian government’s think tank) refers to constitutional values and endorses a risk-based approach to AI regulation, much like that adopted by the EU. The US National Security Commission on Artificial Intelligence (NSCAI), chaired by Google CEO Eric Schmidt, expressed concerns about the US ceding AI leadership ground to China. The NSCAI’s final <a href="https://www.nscai.gov/" rel="noreferrer noopener" target="_blank">report </a>emphasised the need for US leadership of a ‘coalition of democracies’ as an alternative to China’s autocratic and control-oriented model. Singapore has also made key strides on trusted AI, launching <a href="https://www.pdpc.gov.sg/news-and-events/announcements/2022/05/launch-of-ai-verify---an-ai-governance-testing-framework-and-toolkit" rel="noreferrer noopener" target="_blank">A.I. verify</a> – the world’s first AI Governance Testing Framework for companies that wish to demonstrate their use of responsible AI through a minimum verifiable product.</p>
<h3><strong>IPEF and pipe dreams of digital trade</strong></h3>
<p style="text-align: justify; ">Some members of the IPEF are signatories to other regional trade agreements. With the exception of Fiji, India and the US, all the IPEF countries are members of the Regional Comprehensive Economic Partnership <a href="https://www.dfat.gov.au/trade/agreements/in-force/rcep#:~:text=RCEP%20entered%20into%20force%20on,Australia%20as%20an%20original%20party." rel="noreferrer noopener" target="_blank">(RCEP)</a>, which also includes China. Five IPEF member countries are also members of the <a href="https://www.dfat.gov.au/trade/agreements/in-force/cptpp/comprehensive-and-progressive-agreement-for-trans-pacific-partnership" rel="noreferrer noopener" target="_blank">Comprehensive and Progressive Trans-Pacific Partnership (CPTPP)</a> that President Trump backed out of in 2017. Several IPEF members also have bilateral or trilateral trading agreements among themselves, an example being the <a href="https://www.mfat.govt.nz/en/trade/free-trade-agreements/free-trade-agreements-in-force/digital-economy-partnership-agreement-depa/" rel="noreferrer noopener" target="_blank">Digital Economic Partnership Agreement (DEPA)</a> between Singapore, New Zealand and Chile.</p>
<p style="text-align: justify; "><img src="http://editors.cis-india.org/home-images/Pie.png" alt="Pie" class="image-inline" title="Pie" /></p>
<p style="text-align: justify; ">All these ‘mega-regional’ trading agreements contain provisions on data flows, including prohibitions on domestic legal provisions that mandate local computing facilities or restrict cross-border data transfers. Notably, these agreements also incorporate <a href="https://publications.clpr.org.in/the-philosophy-and-law-of-information-regulation-in-india/chapter/indias-engagement-with-global-trade-regimes-on-cross-border-data-flows/" rel="noreferrer noopener" target="_blank">exceptions</a> to these rules. The CPTPP includes within its ambit an exception on the grounds of ‘legitimate public policy objectives’ of the member, while the RCEP incorporates an additional exception for ‘essential security interests’.</p>
<p style="text-align: justify; ">IPEF members are also spearheading <a href="https://www.hinrichfoundation.com/research/article/wto/can-the-wto-build-consensus-on-digital-trade/" rel="noreferrer noopener" target="_blank">multilateral efforts </a>related to the digital economy: Australia, Japan and Singapore are working as convenors of the plurilateral Joint Statement Initiative (JSI) at the World Trade Organization (WTO), which counts 86 WTO members as parties. India (along with South Africa) vehemently <a href="https://docs.wto.org/dol2fe/Pages/SS/directdoc.aspx?filename=q:/WT/GC/W819.pdf&Open=True" rel="noreferrer noopener" target="_blank">opposes</a> this plurilateral push on the grounds that the WTO is a multilateral forum functioning on consensus and a plurilateral trade agreement should not be negotiated within the aegis of the WTO. They fear, rightly, that such gambits close out the domestic policy space, especially for evolving digital economy regimes where keen debate and contestation exist among domestic stakeholders. While wary of the implications of the JSI, other IPEF members, such as Indonesia, have cautiously joined the initiative to ensure that they have a voice at the table.</p>
<p style="text-align: justify; ">It is unlikely that the IPEF will lead to a digital trade arrangement in the short run. Policymaking on issues as complex as the digital economy that must respond to specific social, economic and (geo)political realities cannot be steamrolled through external trade agreements. For instance, after the Los Angeles Ministerial India <a href="https://www.business-standard.com/article/economy-policy/india-opts-out-of-joining-ipef-trade-pillar-to-wait-for-final-contours-122091000344_1.html" rel="noreferrer noopener" target="_blank">opted out</a> of the IPEF trade pillar citing both India’s evolving domestic legislative framework on data and privacy as well as a broader lack of consensus among IPEF members on several issues, including digital trade. Commerce Minister Piyush Goyal explained that India would wait for the “<a href="https://pib.gov.in/PressReleasePage.aspx?PRID=1858243" rel="noreferrer noopener" target="_blank">final contours</a>” of the digital trade track to emerge before making any commitments.</p>
<p style="text-align: justify; ">Besides, brokering a trade agreement through the IPEF runs a risk of redundancy. Already, there exists a ‘<a href="https://www.rieti.go.jp/en/columns/a01_0193.html" rel="noreferrer noopener" target="_blank">spaghetti bowl’</a> of regional trading agreements that IPEF members can choose from, in addition to forming bilateral trade ties with each other.</p>
<p style="text-align: justify; ">This is why Washington has been clear about calling the IPEF an ‘<a href="https://theprint.in/diplomacy/india-set-to-join-us-led-indo-pacific-economic-arrangement-next-week-with-aim-to-counter-china/963795/" rel="noreferrer noopener" target="_blank">economic arrangement</a>’ and not a trade agreement. Membership does not imply any legal obligations. Rather than duplicating ongoing efforts or setting unrealistic targets, the IPEF is an opportunity for all players to shape conversations, share best practices and reach compromises, which could feed back into ongoing efforts to negotiate trade deals. For example, several members of RCEP have domestic data localisation mandates that do not violate trade deals because the agreement carves out exceptions that legitimise domestic policy decisions. Exchanges on how these exceptions work in future trade agreements could be a part of the IPEF arrangement and nudge states towards framing digital trade negotiations through other channels, including at the WTO. Furthermore, states like Singapore that have launched AI self-governance mechanisms could share best practices on how these mechanisms were developed as well as evaluations of how they have helped policy goals be met. And these exchanges shouldn’t be limited to existing IPEF members. If the forum works well, countries that share strategic interests in the region with IPEF members, including, most notably, the European Union, may also want to get involved and further develop partnerships in the region.</p>
<h3><strong>Countering China</strong></h3>
<p>Talking shop on digital trade should certainly not be the only objective of the IPEF. The US has made it clear that it wants the message emanating from the IPEF ‘<a href="https://www.business-standard.com/article/international/biden-to-visit-japan-for-quad-summit-to-have-bilateral-meetings-with-modi-122051900128_1.html" rel="noreferrer noopener" target="_blank">to be heard in Beijing</a>’. Indeed, the IPEF offers an opportunity for the reassertion of US economic interests in a region where President Trump’s withdrawal from the CPTPP’s predecessor, the Trans-Pacific Partnership, left a vacuum for China to fill. Accordingly, it is no surprise that the IPEF has representation from several regions of the Indo-Pacific: South Asia, Southeast Asia and the Pacific.</p>
<p>This should be an urgent policy priority for all IPEF members. Since its initial announcement in 2015, the <a href="https://www.cfr.org/china-digital-silk-road/" rel="noreferrer noopener" target="_blank">Digital Silk Road (DSR)</a>, the digital arm of China’s Belt and Road Initiative, has spearheaded <a href="https://www.iiss.org/blogs/research-paper/2021/02/china-digital-silk-road-implications-for-defence-industry" rel="noreferrer noopener" target="_blank">massive investments</a> by the Chinese private sector (allegedly under close control of the Chinese state) in e-commerce, fintech, smart cities, data centres, fibre optic cables and telecom networks. This expansion has also happened in the Indo-Pacific, unhampered by China’s aggressive geopolitical posturing in the region through maritime land grabs in the South China Sea. With the exception of <a href="https://www.scmp.com/news/asia/southeast-asia/article/3024479/vietnam-shuns-huawei-it-seeks-build-aseans-first-5g" rel="noreferrer noopener" target="_blank">Vietnam</a>, which remains wary of China’s economic expansionism, countries in Southeast Asia welcome Chinese investments, extolling their developmental benefits. Several IPEF members – <a href="https://www.iseas.edu.sg/wp-content/uploads/2022/05/ISEAS_Perspective_2022_57.pdf" rel="noreferrer noopener" target="_blank">including</a> Indonesia, Malaysia and Singapore – have associations with Chinese private sector companies, predominantly Huawei and ZTE. A <a href="https://carnegieendowment.org/2022/07/11/localization-and-china-s-tech-success-in-indonesia-pub-87477" rel="noreferrer noopener" target="_blank">study</a> evaluating Indonesia’s response to such investments indicates that while Indonesian policymakers are aware of the risks posed by Chinese infrastructure, their calculus remains unaltered: development and capacity building remain their primary focuses. Furthermore, on the specific question of surveillance, given evidence of other countries such as the US and Australia also using digital infrastructure for surveillance, the threat from China is not perceived as a unique risk.</p>
<h3><strong>Setting expectations and approaches</strong></h3>
<p style="text-align: justify; ">Still, the risks of excessive dependence on one country for the development of digital infrastructure are well known. While the IPEF cannot realistically expect to displace the DSR, it can be utilised to provide countries with alternatives. This can only be done by issuing carrots rather than sticks. A US narrative extolling ‘digital democracy’ is unlikely to gain traction in a region characterised by a diversity of political systems that is focused on economic and development needs. At the same time, an excessive focus on thorny domestic policy issues – such as data localisation and the pipe dream of yet another mega-regional trade deal – could risk derailing the geo-economic benefits of the IPEF.</p>
<p style="text-align: justify; ">Instead, the IPEF must focus on capacity building, training and private sector investment in infrastructure across the Indo-Pacific. The US must position itself as a geopolitically reliable ally, interested in the overall stability of the digital Indo-Pacific, beyond its own economic or policy preferences. This applies equally to other external actors, like the EU, who may be interested in engaging with or shaping the digital economic landscape in the Indo-Pacific.</p>
<p style="text-align: justify; ">Countering Chinese economic influence and complementing security agendas set through other fora – such as the Quadrilateral Security Dialogue – should be the primary objective of the IPEF. It is crucial that unrealistic ambitions seeking convergence on values or domestic policy do not undermine strategic interests and dilute the immense potential of the IPEF in catalysing a more competitive and secure digital Indo-Pacific.</p>
<h3><strong>Table: Domestic policy positions on data localisation and data protection</strong></h3>
<p><img src="http://editors.cis-india.org/home-images/Table.png/@@images/8e9a5192-5f6c-4666-8d78-e0863111534a.png" alt="Table" class="image-inline" title="Table" /></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right'>http://editors.cis-india.org/internet-governance/blog/directions-cyber-digital-europe-arindrajit-basu-september-16-2022-getting-the-digital-indo-pacific-economic-framework-right</a>
</p>
arindrajit | Privacy, Internet Governance, Digital Governance, Digital Economy | 2022-10-03 | Blog Entry

NHA Data Sharing Guidelines – Yet Another Policy in the Absence of a Data Protection Act
http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines
<b>In July this year, the National Health Authority (NHA) released the NHA Data Sharing Guidelines for the Pradhan Mantri Jan Aarogya Yojana (PM-JAY) just two months after publishing the draft Health Data Management Policy.</b>
<p>Reviewed and edited by Anubha Sinha</p>
<hr />
<p style="text-align: justify; ">Launched in 2018, PM-JAY is a public health insurance scheme set to cover 10 crore poor and vulnerable families across the country for secondary and tertiary care hospitalisation. Eligible candidates can use the scheme to avail of cashless benefits at any public/private hospital falling under this scheme. Considering the scale and sensitivity of the data, the creation of a well-thought-out data-sharing document is a much-needed step. However, the document – though only a draft – has certain portions that need to be reconsidered, including parts that are not aligned with other healthcare policy documents. In addition, the guidelines should be able to work in tandem with the Personal Data Protection Act whenever it comes into force. With no prior intimation of the publication of the guidelines, and the provision of a mere 10 days for consultation, there was very little scope for stakeholders to submit their comments and participate in the consultation. While the guidelines pertain to the PM-JAY scheme, it is an important document to understand the government’s concerns and stance on the sharing of health data, especially by insurance companies.</p>
<h3 style="text-align: justify; ">Definitions: Ambiguous and incompatible with similar policy documents</h3>
<p style="text-align: justify; ">The draft guidelines add to the list of health data–related policies that have been published since the beginning of the pandemic. These include three draft health data management policies published within two years, which have already covered the sharing and management of health data. The draft guidelines repeat the pattern of earlier policies on health data, wherein there is no reference to the policies that predated it; in this case, the guidelines fail to refer to the draft National Digital Health Data Management Policy (published in April 2022). To add to this, the document – by placing the definitions at the end – is difficult to read and understand, especially when terms such as ‘beneficiary’, ‘data principal’, and ‘individual’ are used interchangeably. In the same vein, the document uses the terms ‘data principal’ and ‘data fiduciary’, and the definitions of health data and personal data, from the 2019 PDP Bill, while also referring to the IT Act SDPI Rules and its definition of ‘sensitive personal data’. While the guidelines state that the IT Act and Rules will be the legislation to refer to for these guidelines, it is to be noted that the IT Act under the SPDI Rules covers ‘body corporates’, which under Section 43A(1), is defined as “any company and includes a firm, sole proprietorship or other association of individuals engaged in commercial or professional activities;”. It is difficult to add responsibility and accountability to the organisations under the guidelines when they might not even be covered under this definition.</p>
<p style="text-align: justify; ">With each new policy, civil society organisations have been pointing out the need to have a data protection act before introducing policies and guidelines that deal with the processing and sharing of the data of individuals. Ideally, these policies – even in draft form – should have been published after the Personal Data Protection Bill was enacted, to ensure consistency with the provisions of the law. For example, the guidelines introduce a new category of governance mechanisms under the data-sharing committee headed by a data-sharing officer (DSO). The responsibilities and powers of the DSO are similar to that of the data protection officer under the draft PDP Bill as well as the National Data Health Management Policy (NHDMP). This, in turn, raises the question of whether the DSO and the DPOs under both the PDP Bill and the draft NDMP will have the same responsibilities. Clarity in terms of which of the policies are in force and how they intersect is needed to ensure a smooth implementation. Ideally, having multiple sources of definitions should be addressed at the drafting stage itself.</p>
<h3 style="text-align: justify; ">Guiding Principles: Need to look beyond privacy</h3>
<p style="text-align: justify; ">The guidelines enumerate certain principles to govern the use, collection, processing, and transmission of the personal or sensitive personal data of beneficiaries. These principles are accountability, privacy by design, choice and consent, openness/transparency, etc. While these provisions are much needed, their explanation at times misses the mark of why these principles were added. For example, in the case of accountability, the guidelines state that the ‘data fiduciary’ shall be accountable for complying with measures based on the guiding principles However, it does not specify who the fiduciaries would be accountable to and what the steps are to ensure accountability. Similarly, in the case of openness and transparency, the guidelines state that the policies and practices relating to the management of personal data will be available to all stakeholders. However, openness and transparency need to go beyond policies and practices and should consider other aspects of openness, including open data and the use of open-source software and open standards. This again will add to transparency, in that it would specify the rights of the data principal, as the current draft looks at the rights of the data principal merely from a privacy perspective. In the case of purpose limitation as well, the guidelines are tied to the privacy notice, which again puts the burden on the individual (in this case, beneficiary) when the onus should actually be on the data fiduciary. Lastly, under the empowerment of beneficiaries, the guidelines state that the “data principal shall be able to seek correction, amendments, or deletion of such data where it is inaccurate;”. The right to deletion should not be conditional on inaccuracy, especially when entering the scheme is optional and consent-based.</p>
<h3 style="text-align: justify; ">Data sharing with third parties without adequate safeguards</h3>
<p style="text-align: justify; ">The guidelines outline certain cases where personal data can be collected, used, or disclosed without the consent of the individual. One of these cases is when the data is anonymised. However, the guidelines do not detail how this anonymisation would be achieved and ensured through the life cycle of the data, especially when the clause states that the data will also be collected without consent. The guidelines also state that the anonymised data could be used for public health management, clinical research, or academic research. The guidelines should have limited the scope of academic research or added certain criteria to gain access to the data; the use of vague terminology could lead to this data (sometimes collected without consent) being de-anonymised or used for studies that could cause harm to the data principal or even a particular community. The guidelines state that the data can be shared as ‘protected health information’ with a government agency for oversight activities authorised by law, epidemic control, or in response to court orders. With the sharing of data, care should be taken to ensure data minimisation and purpose limitations that go beyond the explanations added in the body of the guidelines. In addition, the guidelines also introduce the concept of a ‘clean room’, which is defined as “a secure sandboxed area with access controls, where aggregated and anonymised or de-identified data may be shared for the purposes of developing inference or training models”. The definition does not state who will be developing these training models; it could be a cause of worry if AI companies or even insurance companies have the potential to use this data to train models that could eventually make decisions based on the results. The term ‘sandbox’ is explained under the now revoked DP Bill 2021 as “such live testing of new products or services in a controlled or test regulatory environment for which the Authority may or may not permit certain regulatory relaxations for a<br />specified period for the limited purpose of the testing”. Neither the 2019 Bill nor the IT Act/Rules defines ‘sandbox’; the guidelines should have ideally spent more time explaining how the sandbox system in the ‘Clean Room’ works.</p>
<h3 style="text-align: justify; ">Conclusion</h3>
<p style="text-align: justify; ">The draft Data Sharing Guidelines are a welcome step in ensuring that the entities sharing and processing data have guidelines to adhere to, especially since the Data Protection Bill has not been passed yet. The mention of the best practices for data sharing in annexures, including practices for people who have access to the data, is a step in the right direction, which could be made better with regular training and sensitisation. While the guidelines are a good starting point, they still suffer from the issues that have been highlighted in similar health data policies, including not referring to older policies, adding new entities, and the reliance on digital and mobile technology. The guidelines could have added more nuance to the consent and privacy by design sections to ensure other forms of notice, e.g., notice in audio form in different Indian languages. While PM-JAY aims to reach 10 crore poor and vulnerable families, there is a need to look at how to ensure that consent is given according to the guidelines that are “free, informed, clear, and specific”.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines'>http://editors.cis-india.org/internet-governance/blog/nha-data-sharing-guidelines</a>
</p>
Shweta Mohandas and Pallavi Bedi | IT Act, Internet Governance, Data Protection, Privacy | 2022-09-29 | Blog Entry

Surveillance Enabling Identity Systems in Africa: Tracing the Fingerprints of Aadhaar
http://editors.cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar
<b>Biometric identity systems are being introduced around the world with a focus on promoting human development and social and economic inclusion, rather than the earlier goals of security. As a result, these systems are being encouraged in developing countries, particularly in Africa and Asia, sometimes with disastrous consequences.</b>
<p style="text-align: justify; ">In this report, we identify the different external actors that influencing this “developmental” agenda. These range from philanthropic organisations, private companies, and technology vendors, to state and international institutions. Most notable among these is the World Bank, whose influence we investigated in the form of case studies of Nigeria and Kenya. We also explored the role played by the “success” of the Aadhaar programme in India on these new ID systems. A key characteristic of the growing “digital identity for development” trend is the consolidation of different databases that record beneficiary data for government programmes into one unified platform, accessed by a unique biometric ID. This “Aadhaar model” has emerged as a default model to be adopted in developing countries, with little concern for the risks it introduces. Read and download the full report <a href="http://editors.cis-india.org/internet-governance/surveillance-enabling-identity-systems-in-africa" class="internal-link">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar'>http://editors.cis-india.org/internet-governance/blog/surveillance-enabling-identity-systems-in-africa-tracing-the-fingerprints-of-aadhaar</a>
</p>
Shruti Trikanad and Vrinda Bhandari | Surveillance, Aadhaar, Internet Governance, Privacy | 2022-08-09 | Blog Entry

Deployment of Digital Health Policies and Technologies: During Covid-19
http://editors.cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19
<b>In the last twenty years or so, the Indian government has adopted several digital mechanisms to deliver services to its citizens. </b>
<p style="text-align: justify; ">Digitisation of public services in India began with taxation, land record keeping, and passport details recording, but it was soon extended to cover most governmental services - with the latest being public health. The digitisation of healthcare system in India had begun prior to the pandemic. However, given the push digital health has received in recent years especially with an increase in the intensity of activity during the pandemic, we thought it is important to undertake a comprehensive study of India's digital health policies and implementation. The project report comprises a desk-based research review of the existing literature on digital health technologies in India and interviews with on-field healthcare professionals who are responsible for implementing technologies on the ground.</p>
<hr />
<p style="text-align: justify; ">The report by Privacy International and the Centre for Internet & Society can be <a href="http://editors.cis-india.org/internet-governance/deployment-of-digital-health-policies-and-technologies" class="internal-link"><strong>accessed here</strong></a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19'>http://editors.cis-india.org/internet-governance/blog/deployment-of-digital-health-policies-and-technologies-during-covid-19</a>
</p>
pallavi | Privacy, Digitalisation, Digital Health, Digital Knowledge, Internet Governance, Digital Media, Digital Technologies, Digitisation | 2022-07-21 | Blog Entry

The Government’s Increased Focus on Regulating Non-Personal Data: A Look at the Draft National Data Governance Framework Policy
http://editors.cis-india.org/internet-governance/blog/national-data-governance-framework-policy
<b>Digvijay Chaudhary and Anamika Kundu wrote an article on the National Data Governance Framework Policy. It was edited by Shweta Mohandas.</b>
<h2>Introduction</h2>
<p style="text-align: justify; ">Non Personal Data (‘NPD’) can be <a href="https://www.taylorfrancis.com/chapters/edit/10.4324/9780429022241-8/regulating-non-personal-data-age-big-data-bart-van-der-sloot">understood</a> as any information not relating to an identified or identifiable natural person. The origin of such data can be both human and non-human. Human NPD would be such data which has been anonymised in such a way that the person to whom the data relates cannot be re-identified. Non-human NPD would mean any such data that did not relate to a human being in the first place, for example, weather data. There has been a gradual demonstrated interest in NPD by the government in recent times. This new focus on regulating non personal data can be owed to the economic incentive it provides. In its report, the Sri Krishna committee, released in 2018 agreed that NPD holds considerable strategic or economic interest for the nation, however, it left the questions surrounding NPD to a future committee.</p>
<h2 style="text-align: justify; ">History of NPD Regulation</h2>
<p dir="ltr" style="text-align: justify; ">In 2020, the Ministry of Electronics and Information Technology (‘MEITY’) constituted an expert committee (‘NPD Committee’) to study various issues relating to NPD and to make suggestions on the regulation of non-personal data. The NPD Committee differentiated NPD into human and non-human NPD, based on the data’s origin. Human NPD would include all information that has been stripped of any personally identifiable information and non-human NPD meant any information that did not contain any personally identifiable information in the first place (eg. weather data). The final report of the NPD Committee is awaited but the Committee came out with a <a href="https://static.mygov.in/rest/s3fs-public/mygov_160922880751553221.pdf">revised draft</a> of its recommendations in December 2020. In its December 2020 report, the NPD Committee proposed the creation of a National Data Protection Authority (‘NPDA’) as it felt this is a new and emerging area of regulation. Thereafter, the Joint Parliamentary Committee on the Personal Data Protection Bill, 2019 (‘JPC’) came out with its <a href="http://164.100.47.193/lsscommittee/Joint%20Committee%20on%20the%20Personal%20Data%20Protection%20Bill,%202019/17_Joint_Committee_on_the_Personal_Data_Protection_Bill_2019_1.pdf">version of the Data Protection Bill </a>where it amended the short title of the PDP Bill 2019 to Data Protection Bill, 2021 widening the ambit of the Bill to include all types of data. The JPC report focuses only on human NPD, noting that non-personal data is essentially derived from one of the three sets of data - personal data, sensitive personal data, critical personal data - which is either anonymized or is in some way converted into non-re-identifiable data.</p>
<p dir="ltr" style="text-align: justify; ">On February 21, 2022, the Ministry of Electronics and Information Technology (‘MEITY’) came out with the <a href="https://www.meity.gov.in/content/draft-india-data-accessibility-use-policy-2022">Draft India Data Accessibility and Use Policy, 2022</a> (‘Draft Policy’). The Draft Policy was strongly criticised mainly due to its aims to monetise data through its sale and licensing to body corporates. The Draft Policy had stated that anonymised and non-personal data collected by the State that has “<a href="https://www.medianama.com/2022/06/223-new-data-governance-policy-privacy/">undergone value addition</a>” could be sold for an “appropriate price”. During the Draft Policy’s consultation process, it had been withdrawn several times and then finally removed from the website.<a href="https://www.meity.gov.in/writereaddata/files/Draft%20India%20Data%20Accessibility%20and%20Use%20Policy_0.pdf"> The National Data Governance Framework Policy</a> (‘NDGF Policy’) is a successor to this Draft Policy. There is a change in the language put forth in the NDGF Policy from the Draft Policy, where the latter mainly focused on monetary growth. The new NDGF Policy aims to regulate anonymised non-personal data (‘NPD’) kept with governmental authorities and make it accessible for research and improving governance. It wishes to create an ‘India Datasets programme’ which will consist of the aforementioned datasets. While MEITY has opened the draft for public comments, is a need to spell out the procedure in some ways for stakeholders to draft recommendations for the NDGF policies in an informed manner. Through this piece, we discuss the NDGF Policy in terms of issues related to the absence of a comprehensive Data Protection Framework in India and the jurisdictional overlap of authorities under the NDGF Policy and DPB.</p>
<h2 dir="ltr" style="text-align: justify; ">What the National Data Governance Framework Policy Says</h2>
<p dir="ltr" style="text-align: justify; ">Presently in India, NPD is stored in a variety of governmental departments and bodies. It is difficult to access and use this stored data for governmental functions without modernising collection and management of governmental data. Through the NDGF Policy, the government aims to build an Indian data storehouse of anonymised non-personal datasets and make it accessible for both improving governance and encouraging research. It imagines the establishment of an Indian Data Office (‘IDO’) set up by MEITY , which shall be responsible for consolidating data access and sharing of non-personal data across the government. In addition, it also mandates a Data Management Unit for every Ministry/department that would work closely with the IDO. IDO will also be responsible for issuing protocols for sharing NPD. The policy further imagines an Indian Data Council (‘IDC’) whose function would be to define frameworks for important datasets, finalise data standards, and Metadata standards and also review the implementation of the policy. The NDGF Policy has provided a broad structure concerning the setting up of anonymisation standards, data retention policies, data quality, and data sharing toolkit. The NDGF Policy states that these standards shall be developed and notified by the IDO or MEITY or the Ministry in question and need to be adhered to by all entities.</p>
<h2 dir="ltr" style="text-align: justify; ">The Data Protection Framework in India</h2>
<p dir="ltr" style="text-align: justify; ">The report adopted by the JPC, felt that it is simpler to enact a single law and a single regulator to oversee all the data that originates from any data principal and is in the custody of any data fiduciary. According to the JPC, the draft Bill deals with various kinds of data at various levels of security. The JPC also recommended that since the Data Protection Bill (‘DPB’) will handle both personal and non-personal data, any further policy / legal framework on non-personal data may be made a part of the same enactment instead of any separate legislation. The draft DPB states that what is to be done with the NDP shall be decided by the government from time to time according to its policy. As such, neither the DPB, 2021 nor the NDGF Policy go into details of regulating NPD but only provide a broad structure of facilitating free-flow of NPD, without taking into account the <a href="https://cis-india.org/internet-governance/cis-comments-revised-npd-report/view">specific concerns</a> that have been raised since the NPD committee came out with its draft report on regulating NPD dated December 2020.</p>
<h2 dir="ltr" style="text-align: justify; ">Jurisdictional overlaps among authorities and other concerns</h2>
<p dir="ltr" style="text-align: justify; ">Under the NDGF policy, all guidelines and rules shall be published by a body known as the Indian Data Management Office (‘IDMO’). The IDMO is set to function under the MEITY and work with the Central government, state governments and other stakeholders to set standards. Currently, there is no sign of when the DPB will be passed as law. According to the JPC, the reason for including NPD within the DPB was because of the impossibility to differentiate between PD and NPD. There are also certain overlaps between the DPB and the NDGF which are not discussed by the NDGF. NDGF does not discuss the overlap between the IDMO and Data Protection Authority (‘DPA’) established under the DPB 2021.</p>
<p dir="ltr" style="text-align: justify; ">Under the DPB, the DPA is tasked with specifying codes of practice under clause 49. On the other hand, the NDGF has imagined the setting up of IDO, IDMO, and the IDC, which shall be responsible for issuing codes of practice such as data retention, and data anonymisation, and data quality standards. As such, there appears to be some overlap in the functions of the to-be-constituted DPA and the NDGF Policy.</p>
<p dir="ltr" style="text-align: justify; ">Furthermore, while the NDGF Policy aims to promote openness with respect to government data, there is a conflict with <a href="https://opengovdata.org/">open government data (‘OGD’) principle</a>s when there is a price attached to such data. OGD is data which is collected and processed by the government for free use, reuse and distribution. Any database created by the government must be publicly accessible to ensure compliance with the OGD principles.</p>
<h2 dir="ltr" style="text-align: justify; ">Conclusion</h2>
<p dir="ltr" style="text-align: justify; ">Streamlining datasets across different authorities is a huge challenge for the government and hence the NGDF policy in its current draft requires a lot of clarification. The government can take inspiration from the European Union which in 2018, came out with a principles-based approach coupled with self-regulation on the framework of the free flow of non-personal data. The <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&from=EN">guidance</a> on the free-flow of non-personal data defines non-personal data based on the origin of data - data which originally did not relate to any personal data (non-human NPD) and data which originated from personal data but was subsequently anonymised (human NPD). The <a href="https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52019DC0250&from=EN">regulation</a> further realises the reality of mixed data sets and regulates only the non-personal part of such datasets and where the datasets are inextricably linked, the GDPR would apply to such datasets. Moreover, any policy that seeks to govern the free flow of NPD ought to make it clear that in case of re-identification of anonymised data, such re-identified data would be considered personal data. The DPB, 2021 and the NGDF, both fail to take into account this difference.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/national-data-governance-framework-policy'>http://editors.cis-india.org/internet-governance/blog/national-data-governance-framework-policy</a>
</p>
Digvijay Chaudhary and Anamika Kundu · Open Data · Open Government Data · Internet Governance · Privacy · 2022-06-30 · Blog Entry

Making Voices Heard
http://editors.cis-india.org/internet-governance/blog/making-voices-heard
<b>We are happy to announce the launch of our final report on the study ‘Making Voices Heard: Privacy, Inclusivity, and Accessibility of Voice Interfaces in India’. The study was undertaken with support from the Mozilla Corporation.</b>
<p style="text-align: center; "><img src="http://editors.cis-india.org/home-images/WebsiteHeader.jpg/@@images/8d8ed2a0-f0e4-44d7-8938-493b186402c5.jpeg" alt="Making Voices Heard" class="image-inline" title="Making Voices Heard" /></p>
<p style="text-align: justify; ">We believe that voice interfaces have the potential to democratise the use of the internet by addressing limitations related to reading and writing on digital text-only platforms and devices. This report examines the current landscape of voice interfaces in India, with a focus on concerns related to privacy and data protection, linguistic barriers, and accessibility for persons with disabilities (PwDs).</p>
<p style="text-align: justify; ">The report features a visual mapping of 23 voice interfaces and technologies publicly available in India, along with a literature survey, a policy brief towards development and use of voice interfaces and a design brief documenting best practices and users’ needs, both with a focus on privacy, languages, and accessibility considerations, and a set of case studies on three voice technology platforms. <span>Read and download the full report <a class="external-link" href="http://voice.cis-india.org/">here</a></span></p>
<hr />
<h3>Credits</h3>
<p><strong>Research</strong>: Shweta Mohandas, Saumyaa Naidu, Deepika Nandagudi Srinivasa, Divya Pinheiro, and Sweta Bisht.</p>
<p><strong>Conceptualisation, Planning, and Research Inputs</strong>: Sumandro Chattapadhyay, and Puthiya Purayil Sneha.</p>
<p><strong>Illustration</strong>: Kruthika NS (Instagram @theworkplacedoodler).</p>
<p><strong>Website Design</strong>: Saumyaa Naidu.</p>
<p><strong>Website Development</strong>: Sumandro Chattapadhyay, and Pranav M Bidare.</p>
<p><strong>Review and Editing</strong>: Puthiya Purayil Sneha, Divyank Katira, Pranav M Bidare, Torsha Sarkar, Pallavi Bedi, and Divya Pinheiro.</p>
<p><strong>Copy Editing</strong>: The Clean Copy</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/making-voices-heard'>http://editors.cis-india.org/internet-governance/blog/making-voices-heard</a>
</p>
shweta · Voice User Interface · Privacy · Accessibility · Internet Governance · Research · Featured · Homepage · 2022-06-27 · Blog Entry

CCTVs in Public Spaces and the Data Protection Bill, 2021
http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021
<b>This article has been authored by Ms. Anamika Kundu, Research Assistant at the Centre for Internet and Society, and Digvijay S. Chaudhary, Researcher at the Centre for Internet and Society. This blog is a part of RSRR’s Blog Series on the Right to Privacy and the Legality of Surveillance, in collaboration with the Centre for Internet & Society.</b>
<p><span>The article by Anamika Kundu and Digvijay S. Chaudhary was originally </span><a class="external-link" href="https://rsrr.in/2022/04/20/cctv-surveillance-privacy/">published by RGNUL Student Research Review</a><span> on April 20, 2022</span></p>
<p><span><img src="http://editors.cis-india.org/home-images/Surveillance.jpg/@@images/f8fad564-44ab-46e2-bd44-29607ea7fd19.jpeg" alt="Surveillance" class="image-inline" title="Surveillance" /></span></p>
<hr />
<h2>Introduction</h2>
<p style="text-align: justify; ">In recent times, Indian cities have seen an expansion of state deployed CCTV cameras. According to a recent report, in terms of CCTVs deployed, Delhi was considered as the most surveilled city in the world, surpassing even the most surveilled cities in China. Delhi was not the only Indian city in that list, Chennai and Mumbai also made it to the list. In Hyderabad as well, the development of a Command and Control Centre aims to link the city’s surveillance infrastructure in real-time. Even though studies have shown that there is little correlation between CCTVs and crime control, deployment of CCTV cameras has been justified on the basis of national security and crime deterrence. Such an activity brings about the collection and retention of audio-visual/visual information of all individuals frequenting spaces where CCTV cameras are deployed. This information could be used to identify them (directly or indirectly) based on their looks or other attributes. Potential risks associated with the misuse, and processing of such personal data also arise. These risks include large scale profiling, criminal abuse (law enforcement misusing CCTV information for personal gains), and discriminatory targeting (law enforcement disproportionately focusing on a particular group of people). As these devices capture personal data of individuals, this article seeks data protection safeguards available to data principals against CCTV surveillance employed by the State in a public space under the proposed Data Protection Bill, 2021 (the “DPB”).</p>
<h2>Safeguards Available Under the Data Protection Bill, 2021</h2>
<p style="text-align: justify; ">To use CCTV surveillance, the measures and compliance listed under the DPB have to be followed. Obligations of data fiduciaries available under Chapter II, such as consent (clause 11), notice requirement (clause 7), and fair and reasonable processing (clause 5) are common to all data processing entities for a variety of activities. Similarly, as the DPB follows the principles of data minimisation (clause 6), storage limitation (clause 9), purpose limitation (clause 5), lawful and fair processing (clause 4), transparency (clause 23), and privacy by design (clause 22), these safeguards too are common to all data processing entities/activities. If a data fiduciary processes personal data of children, it has to comply with the standards stated under clause 16.</p>
<p style="text-align: justify; ">Under the DPB, compliance differs on the basis of grounds and purpose of data processing. As such, if compliance standards differ, so do the availability of safeguards under the DPB. Of relevance to this article, there are three standards of compliance under the DPB wherein the standards of safeguards available to a data principal differ. First, cases which would fall under Chapter III and hence, not require consent. Chapter III lists grounds for processing of personal data without consent. Second, cases which would fall under exemption clauses in Chapter VIII. In such cases, the DPB or some of its provisions would be inapplicable. Clause 35 under Chapter VIII gives power to the Central Government to exempt any agency from the application of the DPB. Similarly, Clause 36 under Chapter VIII, exempts certain provisions for certain processing of personal data. Third, cases which would not fall under either of the above Chapters. In such cases, all safeguards available under the DPB would be available to the data principals. Consequently, safeguards available to data principals in each of these standards are different. We will go through each of these separately.</p>
<p style="text-align: justify; ">First, if the grounds of processing of CCTV information is such that it falls under the scope of Chapter III of the DPB, wherein the consent requirement is done away with, then in those cases, the notice requirement has to reflect such purpose, meaning that even if consent is not necessary for certain cases, other requirements under the DPB would still apply. Here, we must note that CCTV deployment by the state on such a large scale may be justified on the basis of conditions stated under clauses 12 and 14 of DPB – specifically, the condition for the performance of state function authorised by law, and public interest. The requirement under clause 12 of “authorised by law” simply means that the state function should have legal backing. Deployment of CCTVs is most likely to fall under clause 12 as various states have enacted legislations providing for CCTV deployment in the name of public safety. As a result, even if section 12 takes away the requirement of consent for certain cases, data principals should be able to exercise all rights accorded to them under the DPB (chapter V) except the right to data portability under clause 19.</p>
<p style="text-align: justify; ">Second, processing of personal data via CCTVs by government agencies could be exempted from DPB under clause 35 for certain cases under the clause. Another exemption that is particularly concerning with regard to the use of CCTVs is the exemption provided under clause 36(a). Section 36(a) says that the provisions of chapters II-VII would not apply where the data is processed in the interest of prevention, detection, investigation, and prosecution of any offence under the law. Chapters II-VII govern the obligations of data fiduciaries, grounds where consent would not be required, personal data of children, rights of data principals, transparency and accountability measures, and restrictions on transfer of personal data outside India respectively. In these cases, the requirement of fair and reasonable processing under clause 5 would also not apply. As a broad justification provided for CCTVs deployment by the government is crime control, it is possible that section 36(a) justification can be used to exempt the processing of CCTV footage from the above-mentioned safeguards.</p>
<p style="text-align: justify; ">From the above discussion, the following can be concluded. First, if the grounds of processing fall under Chapter III, then standards of fair and reasonable processing, notice requirement, and all rights except the right to data portability u/s 19 would be available to data principals. Second, if the grounds of processing fall under clause 36, then, in that case, consent requirement, notice requirement, and the rights under DPB would be unavailable as that section mandates the non-application of those chapters. In such a case, even the processing requirements of a fair and reasonable manner stand suspended. Third, if the grounds of processing of CCTV information doesn’t fall under Chapter III, then all obligations listed under Chapter II would have to be followed. Moreover, the data principal would be able to exercise all the rights available under Chapter V of the DPB.</p>
<h2>Constitutional Standards</h2>
<p style="text-align: justify; ">When the Supreme Court recognised privacy as a fundamental right in the case of Puttaswamy v. Union of India (“Puttaswamy”), it located the principles of informed consent and purpose limitation as central to informational privacy. It recognised that privacy inheres not in spaces but in an individual. It also recognised that privacy is not an absolute right and certain restrictions may be imposed on the exercise of the right. Before listing the constitutional standards that activities infringing privacy must adhere to, it’s important to answer whether there exists a reasonable expectation of privacy in CCTV footage deployed in a public space by the State?</p>
<p style="text-align: justify; ">In Puttaswamy, the court recognised that privacy is not denuded in public spaces. Writing for the plurality judgement, Chandrachud J. recognised that the notion of a reasonable expectation of privacy has elements both of a subjective and objective nature. Defining these concepts, he writes, “Privacy at a subjective level is a reflection of those areas where an individual desire to be left alone. On an objective plane, privacy is defined by those constitutional values which shape the content of the protected zone where the individual ought to be left alone…hence while the individual is entitled to a zone of privacy, its extent is based not only on the subjective expectation of the individual but on an objective principle which defines a reasonable expectation.” Note how in the above sentences, the plurality judgement recognises “a reasonable expectation” to be inherent in “constitutional values”. This is important as the meaning of what’s reasonable is to be constituted according to constitutional values and not societal norms. A second consideration that the phrase “reasonable expectation of privacy” requires is that an individual’s reasonable expectation is allied to the purpose for which the information is provided, as held in the case of Hyderabad v. Canara Bank (“Canara Bank”). Finally, the third consideration in defining the phrase is that it is context dependent. For example, in the case of In the matter of an application by JR38 for Judicial Review (Northern Ireland) 242 (2015) (link here), the UK Supreme Court was faced with a scenario where the police published the CCTV footage of the appellant involved in riotous behaviour. The question before the court was: “Whether the publication of photographs by the police to identify a young person suspected of being involved in riotous behaviour and attempted criminal damage can ever be a necessary and proportionate interference with that person’s article 8 [privacy] rights?” The majority held that there was no reasonable expectation of privacy in the case because of the nature of the criminal activity the appellant was involved in. However, the majority’s formulation of this conclusion was based on the reasoning that “expectation of privacy” was dependent on the “identification” purpose of the police. The court stated, “Thus, if the photographs had been published for some reason other than identification, the position would have been different and might well have engaged his rights to respect for his private life within article 8.1”. Therefore, as the purpose of publishing the footage was “identification” of the wrongdoer, the reasonable expectation of privacy stood excluded. The Canara Bank case was relied on by the SC in Puttaswamy. The plurality judgement in Puttaswamy also quoted the above paragraphs from the UK Supreme Court judgement.</p>
<p style="text-align: justify; ">Finally, the SC in the Aadhaar case, laid down the factors of “reasonable expectation of privacy.” Relying on those factors, the Supreme Court observed that demographic information and photographs do not raise a reasonable expectation of privacy. It further held that face photographs for the purpose of identification are not covered by a reasonable expectation of privacy. As this author has recognised, the majority in the Aadhaar case misconstrued the “reasonable expectation of privacy” to lie not in constitutional values as held in Puttaswamy but in societal norms. Even with the misapplication of the Puttaswamy principles by the majority in Aadhaar, it is clear that the exclusion of a “reasonable expectation of privacy” in face photographs is valid only for the purpose of “identification”. For purposes other than “identification”, there should exist a reasonable expectation of privacy in CCTV footage. Having recognised the existence of “reasonable expectation of privacy” in CCTV footage, let’s see how the safeguards mentioned under the DPB stand the constitutional standards of privacy laid down in Puttaswamy.</p>
<p style="text-align: justify; ">The bench in Puttaswamy located privacy not only in Article 21 but the entirety of part III of the Indian Constitution. Where transgression to privacy relates to different provisions under Part III, the tests evolved under those Articles would apply. Puttaswamy recognised that national security and crime control are legitimate state objectives. However, it also recognised that any limitation on the right must satisfy the proportionality test. The proportionality test requires a legitimate state aim, rational nexus, necessity, and balancing of interests. Infringement on the right to privacy occurs under the first and second standard. The first requirement of proportionality stands justified as national security and crime control have been recognised to be legitimate state objectives. However, it must be noted that the EU Guidelines on Processing of Personal Data through video devices state that the mere purpose of “safety” or “for your safety” is not sufficiently specific and is contrary to the principle that personal data shall be processed lawfully, fairly and in a transparent manner in relation to the data subject. The second requirement is a rational nexus. As stated above, there is little correlation between crime control and surveillance measures. Even if the state justifies a rational nexus between state aim and the action employed, it is the necessity part of the proportionality test where the CCTV surveillance measures fail (as explained by this author). Necessity requires us to draw a list of alternatives and their impact on an individual, and then do a balancing analysis with regard to the alternatives. Here, judicial scrutiny of the exemption order under clause 35 is a viable alternative that respects individual rights while at the same time, not interfering with the state’s aim.</p>
<h2>Conclusion</h2>
<p style="text-align: justify; ">Informed consent and purpose limitation were stated to be central principles of informational privacy in Puttaswamy. Among the three standards we identified, the principles of informed consent and purpose limitation remain available only in the third standard. In the first standard, even though the requirement of consent has become unavailable, the principle of purpose limitation would still be applicable to the processing of such data. The second standard is of particular concern wherein neither of those principles is available to data principals. It is worth mentioning here that in large scale monitoring activities such as CCTV surveillance, the safeguards which the DPB lists out would inevitably have an implementation flaw. The reason is that in scenarios where individuals refuse consent for large scale CCTV monitoring, what alternatives would the government offer to those individuals? Practically, CCTV surveillance would fall under clause 12 standards where consent would not be required. Even in those cases, would the notice requirement safeguard be diminished to “you are under surveillance” notices? When we talk about exercise of rights available under the DPB, how would an individual effectively exercise their right when the data processing is not limited to a particular individual? These questions arise because the safeguards under the DPB (and data protection laws in general) are based on individualistic notions of privacy. Interestingly, individual use cases of CCTVs have also increased with an increase in state use of CCTVs. Deployment of CCTVs for personal or domestic purposes would be exempt from the above-mentioned compliances as that would fall under the exemption provision of clause 36(d). Two additional concerns arise in relation to processing of data concerning CCTVs – the JPC report’s inclusion of Non-Personal Data (“NPD”) within the ambit of DPB, and the government’s plan to develop a National Automated Facial Recognition System (“AFRS”). A significant part of the data collected by CCTVs would fall within the ambit of NPD.With the JPC’s recommendation, it will be interesting to follow the processing standards for NPD under the DPB. AFRS has been imagined as a national database of photographs gathered from various agencies to be used in conjunction with facial recognition technology. The use of facial recognition technology with CCTV cameras raises concerns surrounding biometric data, and risks of large scale profiling. Indeed, section 27 of the DPB reflects this risk and mandates a data protection impact assessment to be undertaken by the data fiduciary with respect to processing involving new technologies or large scale profiling or use of biometric data by such technologies, however the DPB does not define what “new technology” means. Concerns around biometric data are outside the scope of the present article, however, it would be interesting to look at how the use of facial recognition technology with CCTVs could impact the safeguards under DPB.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021'>http://editors.cis-india.org/internet-governance/blog/rssr-anamika-kundu-digvijay-s-chaudhary-april-20-2022-cctvs-in-public-spaces-and-data-protection-bill-2021</a>
</p>
Anamika Kundu and Digvijay S Chaudhary · Internet Governance · Data Protection · Privacy · 2022-04-28 · Blog Entry

Rethinking Acquisition of Digital Devices by Law Enforcement Agencies
http://editors.cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies
<b>This article has been selected as a part of The Right to Privacy and the Legality of Surveillance series organized in collaboration with the RGNUL Student Research Review (RSRR) Journal.</b>
<p>Read the article originally published in <a class="external-link" href="https://rsrr.in/blog/">RGNUL Student Research Review (RSRR) Journal </a></p>
<hr />
<p><strong>Abstract</strong></p>
<p style="text-align: justify;">The Criminal Procedure Code was created in the 1970s when the concept of the right to privacy was highly unacknowledged. Following the <em>Puttuswamy</em> <em>I </em>(2017) judgement of the Supreme Court affirming the right to privacy, these antiquated codes must be re-evaluated. Today, the police can acquire digital devices through summons and gain direct access to a person’s life, despite the summons mechanism having been intended for targeted, narrow enquiries. Once in possession of a device, the police attempt to circumvent the right against self-incrimination by demanding biometric passwords, arguing that the right does not cover biometric information . However, due to the extent of information available on digital devices, courts ought to be cautious and strive to limit the power of the police to compel such disclosures, taking into consideration the <em>right to privacy</em> judgement.</p>
<p><strong>Keywords: </strong>Privacy, Criminal Procedural Law, CrPC, Constitutional Law</p>
<p><strong>Introduction</strong></p>
<p style="text-align: justify;">New challenges confront the Indian criminal investigation framework, particularly in the context of law enforcement agencies (LEAs) acquiring digital devices and their passwords. Criminal procedure codes delimiting police authority and procedures were created before the widespread use of digital devices and are no longer pertinent to the modern age due to the magnitude of information available on a single device. A single device could provide more information to LEAs than a complete search of a person’s home; yet, the acquisition of a digital device is not treated with the severity and caution it deserves. Following the affirmation of the right to privacy in <em>Puttuswamy I </em>(2017), criminal procedure codes must be revamped, taking into consideration that the acquisition of a person’s digital device constitutes a major infringement on their right to privacy.</p>
<p><strong>Acquisition of digital devices by LEAs through summons</strong></p>
<p style="text-align: justify;"><a href="https://www.indiacode.nic.in/bitstream/123456789/15272/1/the_code_of_criminal_procedure%2C_1973.pdf">Section 91 of the Criminal Procedure Code</a> (CrPc) grants powers to a court or police officer in charge of a police station to compel a person to produce any form of document or ‘thing’ necessary and desirable to a criminal investigation. In <a href="https://indiankanoon.org/doc/1395576/"><em>Rama Krishna v State</em></a>,<em> </em>‘necessary’ and ‘desirable’ have been interpreted as any piece of evidence relevant to the investigation or a link in the chain of evidence. <a href="https://deliverypdf.ssrn.com/delivery.php?ID=040088020003014069081068085012117023096031065012091090091115088031084097097081123000002033027047006112028087095120074083084003037094022080065067076089116106115025106025062083007085091067067124080091064096069093075026018100087109120024076084123086119022&EXT=pdf&INDEX=TRUE">Abhinav Sekhri</a>, a criminal law litigator and writer, has argued that the wide wording of this section allows summons to be directed towards the retrieval of specific digital devices.</p>
<p style="text-align: justify;">As summons are target-specific, the section has minimal safeguards. However, several issues arise in the context of summons regarding digital devices. In the current day, access to a user’s personal device can provide comprehensive insight into their life and personality due to the vast amounts of private and personal information stored on it. In <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley v California</em></a>, the Supreme Court of the United States (SCOTUS) observed that due to the nature of the content present on digital devices, summons for them are equivalent to a roving search, i.e., demanding the simultaneous production of all contents of the home, bank records, call records, and lockers. The <em>Riley</em> decision correctly highlights the need for courts to recognise that digital devices ought to be treated distinctly compared to other forms of physical evidence due to the repository of information stored on digital devices.</p>
<p style="text-align: justify;">The burden the state must surpass in order to issue summons is low as the relevancy requirement is easily provable. As noted in <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley</em></a>, police must identify which evidence on a device is relevant. Due to the sheer amount of data on phones, it is very easy for police to claim that there will surely be some form of connection between the content on the device and the case. Due to the wide range of offences available for Indian LEAs to cite, it is easy for them to argue that the content on the device is relevant to any number of possible offences. LEAs rarely face consequences for slamming the accused with a huge roster of charges – even if many of them are baseless – leading to the system being prone to abuse. The Indian Supreme Court in its judgement in <a href="https://indiankanoon.org/doc/1068532/"><em>Canara Bank</em></a> noted that the burden of proof must be higher for LEAs when investigations violate the right to privacy. <a href="https://www.ijlt.in/_files/ugd/066049_03e4a2b28a5e49f6a59b861aa4554ede.pdf">Tarun Krishnakumar</a> notes that the trickle-down effect of <em>Puttuswamy I</em> will lead to new privacy challenges with regards to a summons to appear in court. <em>Puttuswamy I</em>, will provide the bedrock and constitutional framework, within which future challenges to the criminal process will be undertaken. It is important for the court to recognise the transformative potential within the <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy</em></a> judgement to help ensure that the right to privacy of citizens is safeguarded. The colonial logic of policing – wherein criminal procedure law was merely a tool to maximise the interest of the state at the cost of the people – must be abandoned. Courts ought to devise a framework under Section 91 to ensure that summons are narrowly framed to target specific information or content within digital devices. Additionally, the digital device must be collected following a judicial authority issuing the summons and not a police authority. Prior judicial warrants will require LEAs to demonstrate their requirement for the digital device; on estimating the impact on privacy, the authority can issue a suitable summons. Currently, the only consideration is if the item will furnish evidence relevant to the investigation; however, judges ought to balance the need for the digital device in the LEA’s investigation with the users’ right to privacy, dignity, and autonomy.</p>
<p style="text-align: justify;"><a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a><em> </em>provides a triple test encompassing legality, necessity, and proportionality to test privacy claims. Legality requires that the measure be prescribed by law, necessity analyses if it is the least restrictive means being adopted by the state, and proportionality checks if the objective pursued by the measure is proportional to the degree of infringement of the right. The relevance standard, as mentioned before, is inadequate as it does not provide enough safeguards against abuse. The police can issue summons based on the slightest of suspicions and thus get access to a digital device, following which they can conduct a roving enquiry of the device to find evidence of any other offence, unrelated to the original cause of suspicion.</p>
<p style="text-align: justify;">Unilateral police summons of digital devices cannot pass the triple test as it is grossly disproportionate and lacks any form of safeguard against the police. The current system has no mechanism for overseeing the LEAs; as long as LEAs themselves are of the view that they require the device, they can acquire it. In <a href="https://www.supremecourt.gov/opinions/13pdf/13-132_8l9c.pdf"><em>Riley</em></a>, SCOTUS has already held that warrantless seizure of digital devices constitutes a violation of the right to privacy. India ought to also adopt a requirement of a prior judicial warrant for the procurement of devices by LEAs. A re-imagined criminal process would have to abide by the triple test in particular proportionality wherein the benefit claimed by the state ought not to be disproportionate to the impact on the fundamental right to privacy; and further, a framework must be proposed to provide safeguards against abuse.</p>
<p><strong>Compelling the production of passwords of devices</strong></p>
<p style="text-align: justify;">In police investigations, gaining possession of a physical device is merely the first step in acquiring the data on the device, as the LEAs still require the passcodes needed to unlock the device. LEAs compelling the production of passcodes to gain access to potentially incriminating data raises obvious questions regarding the right against self-incrimination; however, in the context of digital devices, several privacy issues may crop up as well.</p>
<p style="text-align: justify;">In <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a>, the SC held that compelling the production of fingerprints of an accused person to compare them with fingerprints discovered by the LEA in the course of their investigation does not violate the right to protection against self-incrimination of the accused. <a href="https://lawschoolpolicyreview.com/2019/10/16/biometrics-as-passwords-the-slippery-scope-of-self-incrimination/">It has been argued</a> that the ratio in the judgement prohibits the compelling of disclosure of passwords and biometrics for unlocking devices because <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a> only dealt with the production of fingerprints in order to compare the fingerprints with pre-existing evidence, as opposed to unlocking new evidence by utilising the fingerprint. However, the judgement deals with self-incrimination and does not address any privacy issues.</p>
<p style="text-align: justify;">The right against self-incrimination approach alone may not be enough to resolve all concerns. Firstly, there may be varying levels of protection provided to different forms of password protections on digital devices; text- and pattern-based passcodes are inarguably protected under Art. 20(3) of the Constitution. However, the protection of biometrics-based passcodes relies upon the correct interpretation of the <a href="https://main.sci.gov.in/judgment/judis/4157.pdf"><em>Kathi Kalu Oghad</em></a> precedent. Secondly, Art. 20(3) only protects the accused in investigations and not when non-accused digital devices are acquired by LEAs and the passcodes of the devices demanded.</p>
<p style="text-align: justify;">Therefore, considering the aforementioned points, it is pertinent to remember that the right against self-incrimination does not exist in a vacuum separate from privacy. It originates from the concept of decisional autonomy – the right of individuals to make decisions about matters intimate to their life without interference from the state and society. <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a> observed that decisional autonomy is the bedrock of the right to privacy, as privacy allows an individual to make these intimate decisions away from the glare of society and/or the state. This has heightened importance in this context as interference with such autonomy could lead to the person in question facing criminal prosecution. The SC in <a href="https://main.sci.gov.in/jonew/judis/36303.pdf"><em>Selvi v Karnataka</em></a><em> </em>and <a href="https://main.sci.gov.in/supremecourt/2012/35071/35071_2012_Judgement_24-Aug-2017.pdf"><em>Puttuswamy I</em></a> has repeatedly affirmed that the right against self-incrimination and the right to privacy are linked concepts, with the court observing that the right to remain silent is an integral aspect of decisional autonomy.</p>
<p style="text-align: justify;">In <a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"><em>Virendra Khanna</em></a>, the Karnataka High Court (HC) dealt with the privacy and self-incrimination concerns caused by LEAs compelling the disclosure of passwords. The HC brushes aside concerns related to privacy by noting that the right to privacy is not absolute and that an exception to the right to privacy is state interest and protection of law and order (para 5.11), and that unlawful disclosure of material to third parties could be an actionable wrong (para 15). The court’s interpretation of privacy effectively provides a free pass for the police to interfere with the right to privacy under the pretext of a criminal investigation. This conception of privacy is inadequate as the issue of proportionality is avoided, and the court does not attempt to ensure that the interference is proportionate with the outcome.</p>
<p style="text-align: justify;">US courts also see the compelling of production of passcodes as an issue of self-incrimination as well as privacy. In its judgement in <a href="https://casetext.com/case/in-re-application-for-a-search-warrant?__cf_chl_f_tk=lTxiJpZIvKfkIBtGQJtMObSmqhdRUZdjGk5hXeMfprQ-1642253001-0-gaNycGzNCJE"><em>Application for a Search Warrant</em></a>, a US court observed that compelling the disclosure of passcodes existed at an intersection of the right to privacy and self-incrimination; the right against self-incrimination serves to protect the privacy interests of suspects.</p>
<p style="text-align: justify;">Disclosure of passwords to digital devices amounts to an intrusion of the privacy of the suspect as the collective contents on the digital device effectively amount to providing LEAs with a method to observe a person’s mind and identity. Police investigative techniques cannot override fundamental rights and must respect the personal autonomy of suspects – particularly, the choice between silence and speech. Through the production of passwords, LEAs can effectively get a snapshot of a suspect’s mind. This is analogous to the polygraph and narco-analysis test struck down as unconstitutional by the SC in <a href="https://main.sci.gov.in/jonew/judis/36303.pdf"><em>Selvi</em></a> as it violates decisional autonomy.</p>
<p style="text-align: justify;">As <a href="https://theproofofguilt.blogspot.com/2021/03/mobile-phones-and-criminal.html">Sekhri</a> noted, a criminal process that reflects the aspirations of the <em>Puttuswamy </em>judgement would require LEAs to first explain with reasonable detail the material which they wish to find in the digital devices. Secondly, they must provide a timeline for the investigation to ensure that individuals are not subjected to inexhaustible investigations with police roving through their devices indefinitely. Thirdly, such a criminal process must demand, a higher burden to be discharged from the state if the privacy of the individual is infringed upon. These aspirations should form the bedrock of a system of judicial warrants that LEAs ought to be required to comply with if they wish to compel the disclosure of passwords from individuals. The framework proposed above is similar to the <a href="http://karnatakajudiciary.kar.nic.in:8080/repository/rep_judgmentcase.php"><em>Virendra Khanna</em></a><em> </em>guidelines, as they provide a system of checks and balances that ensure that the intrusion on privacy is carried out proportionately; additionally, it would require LEAs to show a real requirement to demand access to the device. The independent eyes of a judicial magistrate provide a mechanism of oversight and a check against abuse of power by LEAs.</p>
<p><strong>Conclusion</strong></p>
<p style="text-align: justify;">The criminal law apparatus is the most coercive power available to the state, and, therefore, privacy rights will become meaningless unless they can withstand it. Several criminal procedures in the country are rooted in colonial statutes, where the rights of the populace being policed were never a consideration; hence, a radical shift is required. However, post-1947 and <em>Puttuswamy</em>, the ignorance and refusal to submit to the rights of the population can no longer be justified and significant reformulation is necessary to guarantee meaningful protections to device owners. There is a need to ensure that the rights of individuals are protected, especially when the motivation for their infringement is the supposed noble intentions of the criminal justice system. Failing to defend the right to privacy in these moments would be an invitation for allowing the power of the state to increase and inevitably become absolute.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies'>http://editors.cis-india.org/internet-governance/blog/rethinking-acquisition-of-digital-devices-by-law-enforcement-agencies</a>
</p>
Harikartik Ramesh · Surveillance · Internet Governance · Privacy · 2022-05-02 · Blog Entry

Comments to the draft Motor Vehicle Aggregators Scheme, 2021
http://editors.cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021
<b>This submission presents a response by researchers at the Centre for Internet and Society, India (CIS) to the draft Motor Vehicle Aggregators Scheme, 2021 published by the Transport Department, Government of National Capital Territory of Delhi, (hereafter “draft Scheme”).</b>
<p style="text-align: justify; "><span>CIS, established in Bengaluru in 2008 as a non-profit organisation, undertakes interdisciplinary research on internet and digital technologies from public policy andacademic perspectives. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and regulatory practices around internet, technology,and society in India, and elsewhere.</span></p>
<p style="text-align: justify; "><span>CIS is grateful for the opportunity to submit its comments to the draft Scheme. Please find below our thematically organised comments.</span></p>
<hr />
<p><a style="text-align: justify; " href="http://editors.cis-india.org/internet-governance/comments-draft-motor-vehicle-aggregators-scheme.pdf" class="internal-link"><strong>Click here</strong></a><span style="text-align: justify; "> to read more.</span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021'>http://editors.cis-india.org/internet-governance/blog/comments-to-the-draft-motor-vehicle-aggregators-scheme-2021</a>
</p>
Chiara Furtado, Aayush Rathi and Abhishek Sekharan · Motor Vehicle · Internet Governance · Privacy · 2022-04-01 · Blog Entry