The Centre for Internet and Society
http://editors.cis-india.org
The Srikrishna Committee Data Protection Bill and Artificial Intelligence in India
http://editors.cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india
<b>Artificial Intelligence in many ways is in direct conflict with traditional data protection principles and requirements including consent, purpose limitation, data minimization, retention and deletion, accountability, and transparency.</b>
<h3 style="text-align: justify; ">Privacy Considerations in AI</h3>
<p style="text-align: justify; ">Other related privacy concerns in the context of AI center around re-identification and de-anonymisation, discrimination, unfairness, inaccuracies, bias, opacity, profiling, misuse of data, and embedded power dynamics.<a href="#_ftn1" name="_ftnref1"><sup>[1]</sup></a></p>
<p style="text-align: justify; ">The need for large amounts of data to improve accuracy, the ability to process vast amounts of granular data, and the present relationship between the explainability and the results of AI systems<a href="#_ftn2" name="_ftnref2"><sup><sup>[2]</sup></sup></a> have raised concerns on both sides of the fence. On one hand, there is concern that heavy-handed or inappropriate regulation will stifle innovation: if developers can only use data for pre-defined purposes, the prospects of AI are limited. On the other hand, individuals are concerned that privacy will be significantly undermined by AI systems that collect and process data in real time and at a personal level not previously possible. Chatbots, home assistants, wearable devices, robot caregivers, facial recognition technology, etc. have the ability to collect data from a person at an intimate level. At the same time, some have argued that AI can work towards protecting privacy by limiting the access that humans working at the respective companies have to personal data.<a href="#_ftn3" name="_ftnref3"><sup><sup>[3]</sup></sup></a></p>
<p style="text-align: justify; ">India is embracing AI. Two national roadmaps for AI were released in 2018 respectively by the Ministry of Commerce and Industry and Niti Aayog. Both roadmaps emphasized the importance of addressing privacy concerns in the context of AI and ensuring that a robust privacy legislation is enacted. In August 2018, the Srikrishna Committee released a draft Personal Data Protection Bill 2018 and the associated report that outlines and justifies a framework for privacy in India. As the development and use of AI in India continues to grow, it is important that India simultaneously moves forward with a privacy framework that addresses the privacy dimensions of AI.</p>
<p style="text-align: justify; ">In this article we attempt to analyse if and how the Srikrishna Committee draft Bill and report have addressed AI, contrast this with developments in the EU and the passing of the GDPR, and identify solutions that are being explored towards developing AI while upholding and safeguarding privacy.</p>
<h3 style="text-align: justify; ">The GDPR and Artificial Intelligence</h3>
<p style="text-align: justify; ">The General Data Protection Regulation became enforceable in May 2018 and establishes a framework for the processing of personal data for individuals within the European Union. The GDPR has been described by the IAPP as taking a ‘risk based’ approach to data protection that pushes data controllers to engage in risk analysis and adopt ‘risk measured responses’.<a href="#_ftn4" name="_ftnref4"><sup><sup>[4]</sup></sup></a> Though the GDPR does not explicitly address artificial intelligence, it does have a number of provisions that address automated decision making and profiling, and a number of provisions that will impact companies using artificial intelligence in their business activities. These have been outlined below:</p>
<ol style="text-align: justify; ">
<li><b>Data rights: </b> The GDPR grants individuals a number of data rights: the right to be informed, right of access, right to rectification, right to erasure, right to restrict processing, right to data portability, right to object, and rights related to automated decision making including profiling. The last of these seeks to address concerns arising out of automated decision making by giving the individual the right not to be subject to a decision based solely on automated processing, including profiling, if the decision would produce legal effects or similarly significantly affect them. There are three exceptions to this right - if the automated decision making is: a. necessary for the performance of a contract, b. authorised by Union or Member State law, c. based on explicit consent.<a href="#_ftn5" name="_ftnref5"><sup><sup>[5]</sup></sup></a> </li>
<li><b>Transparency:</b> Under Articles 13 and 14, data controllers must notify individuals of the existence of automated decision making, including profiling, and provide meaningful information about the logic involved as well as the potential consequences of such processing.<a href="#_ftn6" name="_ftnref6"><sup><sup>[6]</sup></sup></a> Importantly, this requirement has the potential to ensure that companies do not operate complete ‘black box’ algorithms within their business processes.</li>
<li><b>Fairness: </b>The principle of fairness found under Article 5(1) will also apply to the processing of personal data by AI. The principle requires that personal data be processed lawfully, fairly, and in a transparent manner in relation to the data subject. Recital 71 further clarifies that this includes implementing appropriate mathematical and statistical measures for profiling, ensuring that inaccuracies are corrected, and ensuring that processing does not result in discriminatory effects.<a href="#_ftn7" name="_ftnref7"><sup><sup>[7]</sup></sup></a> </li>
<li><b>Purpose Limitation:</b> The principle of purpose limitation (Article 5(1)(b)) requires that personal data must be collected for specified, explicit, and legitimate purposes and not be further processed in a manner incompatible with those purposes. Processing for archiving purposes in the public interest, scientific or historical research purposes, or statistical purposes is not considered to be incompatible with the initial purposes. It has been noted that it is unclear whether research carried out through artificial intelligence would fall under this exception, as the GDPR does not define ‘scientific purposes’.<a href="#_ftn8" name="_ftnref8"><sup><sup>[8]</sup></sup></a> </li>
<li><b>Privacy by Design and Default:</b> Article 25 requires all data controllers to implement technical and organizational measures to meet the requirements of the regulation. This could include techniques like pseudonymisation. Data controllers also are required to implement appropriate technical and organizational measures for ensuring that by default only personal data which are necessary for a specific purpose are processed.<a href="#_ftn9" name="_ftnref9"><sup><sup>[9]</sup></sup></a></li>
<li><b>Data Protection Impact Assessments:</b> Article 35 requires data controllers to undertake impact assessments if they are undertaking processing that is likely to result in a high risk to individuals. This includes where the data controller undertakes systematic and extensive profiling, processes special categories of data or criminal offence data on a large scale, or systematically monitors publicly accessible places on a large scale. In implementation, some jurisdictions like the UK require impact assessments in additional circumstances, including where the data controller: uses new technologies; uses profiling or special category data to decide on access to services; profiles individuals on a large scale; processes biometric data; processes genetic data; matches data or combines datasets from different sources; collects personal data from a source other than the individual without providing them with a privacy notice; tracks individuals’ location or behaviour; profiles children or targets marketing or online services at them; or processes data that might endanger the individual’s physical health or safety in the event of a security breach.<a href="#_ftn10" name="_ftnref10"><sup><sup>[10]</sup></sup></a></li>
<li><b>Security:</b> Article 32 requires data controllers to ensure a level of security appropriate to the risk, including employing methods like encryption and pseudonymisation. </li>
</ol>
<h3 style="text-align: justify; ">Srikrishna Committee Bill and AI</h3>
<p style="text-align: justify; ">The Draft Data Protection Bill and associated report by the Srikrishna Committee was published in August 2018 and recommends a privacy framework for India. The Bill contains a number of provisions that will directly impact data fiduciaries using AI and that try to account for the unintended consequences of emerging technologies like AI. These include:</p>
<ol style="text-align: justify; ">
<li><b>Definition of Harm:</b> The Bill defines harm as including: bodily or mental injury; loss, distortion or theft of identity; financial loss or loss of property; loss of reputation or humiliation; loss of employment; any discriminatory treatment; any subjection to blackmail or extortion; any denial or withdrawal of a service, benefit or good resulting from an evaluative decision about the data principal; any restriction placed or suffered directly or indirectly on speech, movement or any other action arising out of a fear of being observed or surveilled; and any observation or surveillance that is not reasonably expected by the data principal. The Bill also allows for categories of significant harm to be further defined by the data protection authority.</li>
</ol>
<p style="text-align: justify; ">Many of the above are harms that have been associated with artificial intelligence - specifically loss of employment, discriminatory treatment, and denial of service. Enabling the data protection authority to further define categories of significant harm could allow unexpected harms arising from the use of AI to come under the ambit of the Bill.</p>
<ul style="text-align: justify; ">
<li><b>Data Rights:</b> Like the GDPR, the Bill creates a set of data rights for the individual, including the right to confirmation and access, correction, data portability, and the right to be forgotten. At the same time, the Bill is intentionally silent on the rights and obligations that have been incorporated into the GDPR to address automated decision making: the right to object to processing,<a href="#_ftn11" name="_ftnref11"><sup><sup>[11]</sup></sup></a> the right to opt out of automated decision making,<a href="#_ftn12" name="_ftnref12"><sup><sup>[12]</sup></sup></a> and the obligation on the data controller to inform the individual about the use of automated decision making and provide basic information regarding its logic and impact.<a href="#_ftn13" name="_ftnref13"><sup><sup>[13]</sup></sup></a> As justification, the Committee noted the following in its report: the right to restrict processing may be unnecessary in India, as it provides only interim remedies around issues such as inaccuracy of data, and the same can be achieved by a data principal approaching the DPA or the courts for a stay on processing, or by simply withdrawing consent. The objective of protecting against discrimination, bias, and opaque decisions - which the right to object to automated processing and to receive information about that processing seeks to fulfill - would, in the Indian context, be better achieved through an accountability framework requiring data fiduciaries that make evaluative decisions through automated means to set up processes that ‘weed out’ discrimination. If discrimination has nevertheless taken place, individuals can seek remedy through the courts.</li>
</ul>
<p style="text-align: justify; ">By taking this approach, the Bill creates a framework to address harms arising out of AI, but does not empower the individual to decide how their data is processed and remains silent on the issue of ‘black box’ algorithms.</p>
<ul style="text-align: justify; ">
<li><b>Data Quality</b>: Requires data fiduciaries to ensure that personal data that is processed is complete, accurate, not misleading, and updated with respect to the purposes for which it is processed. When taking steps to comply, data fiduciaries must consider whether the personal data is likely to be used to make a decision about the data principal, whether it is likely to be disclosed to other individuals, and whether it is kept in a form that distinguishes personal data based on facts from personal data based on opinions or personal assessments.<a href="#_ftn14" name="_ftnref14"><sup><sup>[14]</sup></sup></a></li>
</ul>
<p style="text-align: justify; ">This principle, while not mandating that data fiduciaries take into account considerations such as biases in datasets, could potentially be interpreted by the data protection authority to include within its scope means of ensuring that data does not contain or result in bias.</p>
<ul style="text-align: justify; ">
<li><b>Principle of Privacy by Design:</b> Requires significant data fiduciaries to have in place a number of policies and measures around several aspects of privacy. These include - (a) measures to ensure managerial, organizational, business practices and technical systems are designed in a manner to anticipate, identify, and avoid harm to the data principal (b) the obligations mentioned in Chapter II are embedded in organisational and business practices (c) technology used in the processing of personal data is in accordance with commercially accepted or certified standards (d) legitimate interests of business including any innovation is achieved without compromising privacy interests (e) privacy is protected throughout processing from the point of collection to deletion of personal data (f) processing of personal data is carried out in a transparent manner (g) the interest of the data principal is accounted for at every stage of processing of personal data.</li>
</ul>
<p style="text-align: justify; ">A number of these (a, d, e, and g) require that the interest of the data principal is accounted for throughout the processing of personal data. This will be significant for systems driven by artificial intelligence, as a number of the harms that have arisen from the use of AI - discrimination, denial of service, and loss of employment - have been brought under the definition of harm within the Bill. Placing the interest of the data principal first is also important in protecting against unintended consequences or harms that may arise from AI.<a href="#_ftn15" name="_ftnref15"><sup><sup>[15]</sup></sup></a> If enacted, it will be important to see what policies and measures emerge in the context of AI to comply with this principle, and what commercially accepted or certified standards companies rely on to comply with (c).</p>
<ul style="text-align: justify; ">
<li><b>Data Protection Impact Assessment:</b> Requires data fiduciaries to undertake a data protection impact assessment when implementing new technologies, undertaking large scale profiling, or using sensitive personal data. Such assessments need to include a detailed description of the proposed processing operation, the purpose of the processing and the nature of personal data being processed, an assessment of the potential harm that may be caused to the data principals whose personal data is proposed to be processed, and measures for managing, minimising, mitigating or removing such risk of harm. If the Authority finds that the processing is likely to cause harm to the data principals, it may direct the data fiduciary to cease such processing or to carry it out only subject to conditions. This requirement applies to all significant data fiduciaries and to any other data fiduciary as required by the DPA.<a href="#_ftn16" name="_ftnref16"><sup><sup>[16]</sup></sup></a></li>
</ul>
<p style="text-align: justify; ">This principle will apply to companies implementing AI systems. For AI systems, it will be important to see how much information the DPA will require under the requirement of data fiduciaries providing detailed descriptions of the proposed processing operation and purpose of processing.</p>
<ul style="text-align: justify; ">
<li><b>Classification of data fiduciaries as significant data fiduciaries</b>: The Authority has the ability to notify certain categories of data fiduciaries as significant data fiduciaries based on: the volume of personal data processed; the sensitivity of personal data processed; the turnover of the data fiduciary; the risk of harm resulting from any processing being undertaken by the fiduciary; the use of new technologies for processing; and any other factor relevant for causing harm to any data principal. If a data fiduciary falls under the ambit of any of these conditions, it is required to register with the Authority. All significant data fiduciaries must undertake data protection impact assessments, maintain records as per the Bill, undergo data audits, and have in place a data protection officer.</li>
</ul>
<p style="text-align: justify; ">As per this provision, companies deploying artificial intelligence would come under the definition of a significant data fiduciary and be subject to the principles of privacy by design etc. articulated in the chapter. The exception to this will be if the data fiduciary comes under the definition of ‘small entity’ found in section 48.<a href="#_ftn17" name="_ftnref17"><sup><sup>[17]</sup></sup></a></p>
<ul style="text-align: justify; ">
<li><b>Restrictions on cross border transfer of personal data: </b>Requires that all data fiduciaries must store a copy of personal data on a server or data centre located in India and notified categories of critical personal data must be processed in servers located in India.</li>
</ul>
<p style="text-align: justify; ">It is interesting to note that in the context of cross border sharing of data, the Bill is creating a new category of data that can be further defined beyond personal and sensitive personal data. For companies implementing artificial intelligence, this provision may prove cumbersome to comply with as many utilize cloud storage and facilities located outside of India for the processing of larger amounts of data.<a href="#_ftn18" name="_ftnref18"><sup><sup>[18]</sup></sup></a></p>
<ul style="text-align: justify; ">
<li><b>Powers and functions of the Authority</b>: The Bill lays down a number of functions of the Authority one being to monitor technological developments and commercial practices that may affect protection of personal data.</li>
</ul>
<p style="text-align: justify; ">Presumably, this will include the monitoring of technological developments in the field of artificial intelligence.<a href="#_ftn19" name="_ftnref19"><sup><sup>[19]</sup></sup></a></p>
<ul style="text-align: justify; ">
<li><b>Fair and reasonable processing: </b>Requires that any person processing personal data owes a duty to the data principal to process such personal data in a fair and reasonable manner that respects the privacy of the data principal. In its report, the Srikrishna Committee explains that the principle of fair and reasonable processing is meant to address: power asymmetries between data principals and data fiduciaries, recognising that data fiduciaries have a responsibility to act in the best interest of the data principal; situations where processing may be legal but not necessarily fair or in the best interest of the data principal; and the development of trust between the data principal and the data fiduciary.<a href="#_ftn20" name="_ftnref20"><sup><sup>[20]</sup></sup></a></li>
</ul>
<p style="text-align: justify; ">This is in contrast to the GDPR which requires processing to simultaneously meet the three conditions of fairness, lawfulness, and transparency.</p>
<ul style="text-align: justify; ">
<li><b>Purpose Limitation: </b>Personal data can only be processed for the purposes specified or any other purpose that the data principal would reasonably expect.</li>
</ul>
<p style="text-align: justify; ">As a note, the Srikrishna Committee Bill does not include ‘scientific purposes’ as an exception to the principle of purpose limitation as found in the GDPR,<a href="#_ftn21" name="_ftnref21"><sup><sup>[21]</sup></sup></a> and instead creates an exception for research, archiving, or statistical purposes.<a href="#_ftn22" name="_ftnref22"><sup><sup>[22]</sup></sup></a> The DPA has the responsibility of developing codes defining research purposes under the act.<a href="#_ftn23" name="_ftnref23"><sup><sup>[23]</sup></sup></a></p>
<ol style="text-align: justify; ">
<li><b>Security Safeguards:</b> Every data fiduciary must implement appropriate security safeguards including the use of methods such as de-identification and encryption, steps to protect the integrity of personal data, and steps necessary to prevent misuse, unauthorised access to, modification, and disclosure or destruction of personal data.<a href="#_ftn24" name="_ftnref24"><sup><sup>[24]</sup></sup></a></li>
</ol>
<p style="text-align: justify; ">Unlike the GDPR, which explicitly refers to the technique of pseudonymisation, the Srikrishna Bill uses the term de-identification. The Srikrishna Report clarifies that this includes techniques like pseudonymisation and masking, and further clarifies that because of the risk of re-identification, de-identified personal data should still receive the same level of protection as personal data. The Bill further gives the DPA the authority to define appropriate levels of anonymisation. <a href="#_ftn25" name="_ftnref25"><sup><sup>[25]</sup></sup></a></p>
<h3 style="text-align: justify; ">Technical perspectives of Privacy and AI</h3>
<p style="text-align: justify; ">There is an emerging body of work looking at solutions to the dilemma of maintaining privacy while employing artificial intelligence, and at ways in which artificial intelligence can support and strengthen privacy. For example, there are AI driven platforms that leverage the technology to help businesses meet regulatory compliance with data protection laws,<a href="#_ftn26" name="_ftnref26"><sup><sup>[26]</sup></sup></a> as well as research into AI privacy enhancing technologies.<a href="#_ftn27" name="_ftnref27"><sup><sup>[27]</sup></sup></a> Standards setting bodies like the IEEE have undertaken work on the ethical considerations in the collection and use of personal data when designing, developing, and/or deploying AI through the standard ‘Ethically Aligned Design’.<a href="#_ftn28" name="_ftnref28"><sup><sup>[28]</sup></sup></a> In the article Artificial Intelligence and Privacy, Datatilsynet - the Norwegian Data Protection Authority<a href="#_ftn29" name="_ftnref29"><sup><sup>[29]</sup></sup></a> - breaks such methods into three categories:</p>
<ol style="text-align: justify; ">
<li>Techniques for reducing the need for large amounts of training data: Such techniques can include</li>
<ol>
<li><b>Generative adversarial networks (GANs):</b> GANs are used to create synthetic data and can address the need for large volumes of labelled data without relying on real data containing personal data. GANs could potentially be useful from a research and development perspective in sectors like healthcare, where most data would qualify as sensitive personal data.</li>
<li><b>Federated Learning:</b> Federated learning allows models to be trained and improved on data from a large pool of users without directly using user data. This is achieved by running a centralised model on a client unit, where it is subsequently improved on local data. Changes from the improvements are shared back with the central server, and an average of the changes from multiple individual client units becomes the basis for improving the centralised model.</li>
<li><b>Matrix Capsules</b>: Proposed by Google researcher Geoffrey Hinton, matrix capsules improve the accuracy of existing neural networks while requiring less data.<a href="#_ftn30" name="_ftnref30"><sup><sup>[30]</sup></sup></a></li>
</ol>
<li>Techniques that uphold data protection without reducing the basic data set</li>
<ol>
<li><b>Differential Privacy</b>: Differential privacy intentionally adds ‘noise’ to data when it is accessed. This allows for personal data to be accessed without revealing identifying information.</li>
<li><b>Homomorphic Encryption:</b> Homomorphic encryption allows for the processing of data while it is still encrypted. This addresses the need to access and use large amounts of personal data for multiple purposes.</li>
<li><b>Transfer Learning</b>: Instead of building a new model, transfer learning builds upon existing models, applying them to new, related purposes or tasks. This has the potential to reduce the amount of training data needed. </li>
<li><b>RAIRD</b>: Developed by Statistics Norway and the Norwegian Centre for Research Data, RAIRD is a national research infrastructure that allows for access to large amounts of statistical data for research while managing statistical confidentiality. This is achieved by allowing researchers access to metadata. The metadata is used to build analyses which are then run against detailed data without giving access to actual data.<a href="#_ftn31" name="_ftnref31"><sup><sup>[31]</sup></sup></a></li>
</ol>
<li>Techniques to move beyond opaque algorithms</li>
<ol>
<li><b>Explainable AI (XAI): </b>DARPA, in collaboration with Oregon State University, is researching how to create explainable models and explanation interfaces while maintaining a high level of learning performance, in order to enable individuals to interact with, trust, and manage artificial intelligence.<a href="#_ftn32" name="_ftnref32"><sup><sup>[32]</sup></sup></a> DARPA identifies a number of entities working on different models and interfaces for analytics and autonomy AI.<a href="#_ftn33" name="_ftnref33"><sup><sup>[33]</sup></sup></a></li>
<li><b>Local Interpretable Model Agnostic Explanations</b>: Developed to enable trust between AI models and humans by generating explainers that highlight the key aspects that were important to the model and its decision, thus providing insight into the rationale behind a model.<a href="#_ftn34" name="_ftnref34"><sup><sup>[34]</sup></sup></a></li>
</ol> </ol>
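<p style="text-align: justify; ">To make the federated learning idea above concrete, here is a minimal sketch, assuming a simple linear model, noiseless toy data, and plain gradient descent; the function names and data are illustrative and not drawn from any real federated framework. Each client improves the shared weights on data that never leaves it, and the server only averages the returned weights (the federated averaging pattern).</p>

```python
import numpy as np

def local_update(global_weights, X, y, lr=0.1, epochs=20):
    """One client: improve the shared linear model using only local data.
    Plain gradient descent on mean squared error; the raw data never
    leaves the client, only the updated weights do."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_weights, clients):
    """Server: average the clients' locally updated weights."""
    updates = [local_update(global_weights, X, y) for X, y in clients]
    return np.mean(updates, axis=0)

# Two hypothetical clients whose private data follows y = 2*x0 + 1*x1.
rng = np.random.default_rng(0)
clients = []
for _ in range(2):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ np.array([2.0, 1.0])))

w = np.zeros(2)
for _ in range(10):
    w = federated_round(w, clients)
```

<p style="text-align: justify; ">After a few rounds the averaged weights converge towards the rule both clients share, even though the server never saw a single data point.</p>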
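<p style="text-align: justify; ">The differential privacy technique above can be illustrated with the classic Laplace mechanism on a counting query; this is a minimal sketch with hypothetical data, not a production mechanism. A count changes by at most 1 when one person's record is added or removed (sensitivity 1), so adding Laplace noise with scale 1/&epsilon; makes the released answer &epsilon;-differentially private.</p>

```python
import numpy as np

def dp_count(values, predicate, epsilon, rng):
    """Differentially private count: the true count has sensitivity 1,
    so Laplace noise with scale 1/epsilon yields epsilon-DP."""
    true_count = sum(1 for v in values if predicate(v))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

ages = [23, 35, 41, 29, 62, 58, 33]   # hypothetical personal data
rng = np.random.default_rng(0)
# "How many people are 40 or older?" answered with privacy noise.
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5, rng=rng)
```

<p style="text-align: justify; ">A smaller &epsilon; means more noise and stronger privacy; the analyst sees an approximately correct count while no individual's presence can be confidently inferred.</p>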
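<p style="text-align: justify; ">Homomorphic encryption can be demonstrated with a toy version of the additively homomorphic Paillier cryptosystem: multiplying two ciphertexts yields a ciphertext of the sum of the plaintexts, so a server can aggregate data it cannot read. The primes and random values below are tiny and fixed purely for illustration; this is in no way a secure implementation.</p>

```python
from math import gcd

def keygen(p, q):
    """Toy Paillier key generation (insecure, illustration only)."""
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    g = n + 1                      # standard choice: simplifies decryption
    mu = pow(lam, -1, n)           # since L(g^lam mod n^2) = lam mod n
    return (n, g), (lam, mu, n)

def encrypt(pub, m, r):
    """Enc(m) = g^m * r^n mod n^2, with r coprime to n."""
    n, g = pub
    n2 = n * n
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(priv, c):
    """Dec(c) = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    lam, mu, n = priv
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n

pub, priv = keygen(1009, 1013)     # toy primes, far too small for real use
c1 = encrypt(pub, 42, r=7)
c2 = encrypt(pub, 100, r=11)
# A server holding only ciphertexts can still compute on the data:
# multiplying ciphertexts adds the plaintexts underneath.
c_sum = (c1 * c2) % (pub[0] ** 2)
```

<p style="text-align: justify; ">Decrypting <code>c_sum</code> recovers 142 without the server ever seeing 42 or 100, which is the property that makes processing encrypted personal data conceivable.</p>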
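<p style="text-align: justify; ">The idea behind Local Interpretable Model Agnostic Explanations can be sketched as follows: perturb the input, query the opaque model, and fit a proximity-weighted linear surrogate whose coefficients indicate which features drove the decision locally. The 'black box' below is a hypothetical stand-in (it approves whenever the first feature exceeds 1), and the function is our own sketch, not the API of the actual LIME library.</p>

```python
import numpy as np

def explain_locally(black_box, x0, n_samples=2000, width=1.0, seed=0):
    """LIME-style local explanation: sample around x0, query the black
    box, and fit a proximity-weighted linear surrogate. The surrogate's
    coefficients approximate each feature's local influence."""
    rng = np.random.default_rng(seed)
    X = x0 + rng.normal(scale=width, size=(n_samples, len(x0)))
    y = np.array([black_box(x) for x in X])
    # Weight samples by closeness to x0 (RBF kernel).
    w = np.exp(-np.sum((X - x0) ** 2, axis=1) / (2 * width ** 2))
    # Weighted least squares with an intercept column.
    A = np.hstack([X, np.ones((n_samples, 1))]) * w[:, None] ** 0.5
    b = y * w ** 0.5
    coef, *_ = np.linalg.lstsq(A, b, rcond=None)
    return coef[:-1]   # per-feature importances (intercept dropped)

# Hypothetical opaque model: approves iff the first feature exceeds 1.
model = lambda x: float(x[0] > 1.0)
weights = explain_locally(model, x0=np.array([1.2, 5.0]))
```

<p style="text-align: justify; ">Near this input the surrogate assigns a large weight to the first feature and a near-zero weight to the second, giving a human-readable hint about the rationale behind the model's decision.</p>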
<h3 style="text-align: justify; ">Public Sector use of AI and Privacy</h3>
<p style="text-align: justify; ">The role of AI in public sector decision making has been gradually growing globally across sectors such as law enforcement, education, transportation, judicial decision making, and healthcare. In India too, the use of automated processing in electronic governance under the Digital India mission, in domestic law enforcement agencies monitoring social media content, and in educational schemes is being discussed and gradually implemented. Much like the potential applications of AI across sub-sectors, the nature of the regulatory issues is also diverse.</p>
<p style="text-align: justify; ">Aside from the accountability framework discussed in the Srikrishna Committee report, the Puttaswamy judgment also provides a basis for the governance of AI with respect to its concerns for privacy, in limited contexts. The sources of the right to privacy as articulated in the Puttaswamy judgments included ‘personal liberty’ under Article 21 of the Constitution. In order to fully appreciate how constitutional principles could apply to automated processing in India, we need to look closely at the origins of privacy under liberty. In the famous case of <i>AK Gopalan</i> there is a protracted discussion on the contents of the rights under Article 21. Opinion was divided even among the majority: while Sastri J. and Mukherjea J. took the restrictive view, limiting the protections to bodily restraint and detention, Kania J. and Das J. took a broader view that included the right to sleep, play, and so on. Through <i>RC Cooper</i><a href="#_ftn35" name="_ftnref35"><sup><sup>[35]</sup></sup></a> and <i>Maneka</i><a href="#_ftn36" name="_ftnref36"><sup><sup>[36]</sup></sup></a>, the Supreme Court took steps to reverse the majority opinion in <i>Gopalan</i>, and it was established that the freedoms and rights in Part III could be addressed by more than one provision. The expansion of ‘personal liberty’ began in <i>Kharak Singh</i>, where the unjustified interference with a person’s right to live in his house was held to be violative of Article 21. The reasoning in <i>Kharak Singh</i> draws heavily from<i> Munn</i> v. <i>Illinois</i>,<a href="#_ftn37" name="_ftnref37"><sup><sup>[37]</sup></sup></a> which held life to be “more than mere animal existence.” Curiously, after taking this position, <i>Kharak Singh</i> fails to recognise a fundamental right to privacy (analogous to the Fourth Amendment protection in the US) under Article 21. 
The position taken in <i>Kharak Singh</i> was to extrapolate to ‘personal liberty’ the same method of wide interpretation as was accorded to ‘life’. <i>Maneka</i>, which evolved the test for unenumerated rights within Part III, says that the claimed right must be an integral part of, or of the same nature as, the named right: the claimed right must be ‘in reality and substance nothing but an instance of the exercise of the named fundamental right’. The clear reading of privacy into ‘personal liberty’ in this judgment is effectively a correction of the inherent inconsistencies in the positions taken by the majority in <i>Kharak Singh</i>.</p>
<p style="text-align: justify; ">The other significant change in constitutional interpretation that occurred in Maneka was with respect to the phrase ‘procedure established by law’ in Article 21. In Gopalan, the majority held that the phrase ‘procedure established by law’ does not mean procedural due process or natural justice. What this meant was that, once a ‘procedure’ was ‘established by law’, Article 21 could not be said to have been infringed. This position was entirely reversed in Maneka. The ratio in Maneka said that ‘procedure established by law’ must be fair, just and reasonable, and cannot be arbitrary and fanciful. Therefore, any infringement of the right to privacy must be through a law which follows the principles of natural justice, and is not arbitrary or unfair. It follows that any instances of automated processing for public functioning by state actors or others, must meet this standard of ‘fair, just and reasonable’.</p>
<p style="text-align: justify; ">While there is a lot of focus internationally on what ethical AI must be, it is important that when we consider the use of AI by the state, we pay heed to the existing constitutional principles which determine how AI must be evaluated against these standards. These principles, however, extend only to limited circumstances, for the protections under Article 21 are not horizontal in nature but applicable only against the state. Whether a party is the state or not is a question that has been considered several times by the Supreme Court and must be determined by functional tests. In our submission to the Justice Srikrishna Committee, we clearly recommended that where automated decision making is used for the discharge of public functions, the data protection law must state that such actions are subject to the constitutional standards, are ‘just, fair and reasonable’, and satisfy the tests for both procedural and substantive due process. To a limited extent, the committee seems to have picked up the standards of ‘fair’ and ‘reasonable’ and made them applicable to all forms of processing, whether public or private. It is as yet unclear whether fairness and reasonableness as inserted in the Bill would draw from the constitutional standard under Article 21. The report makes a reference to the twin principles of acting in a manner that upholds the best interest of the privacy of the individual, and processing within the reasonable expectations of the individual, which do not seem to cover the fullest essence of the legal standard under Article 21.</p>
<h3 style="text-align: justify; ">Conclusion</h3>
<p style="text-align: justify; ">The Srikrishna Committee Bill attempts to create an accountability framework for the use of emerging technologies, including AI, that is focused on placing the responsibility on companies to prevent harm. Though not as robust as those found in the GDPR, the protections have been enabled through requirements such as fair and reasonable processing, ensuring data quality, and implementing principles of privacy by design. At the same time, the Srikrishna Bill does not include provisions that can begin to address the consumer-facing ‘black box’ of AI by ensuring that individuals have information about the potential impact of decisions taken by automated means. In contrast, the GDPR has already taken important steps to tackle this by requiring companies to explain the logic and potential impact of decisions taken by automated means.</p>
<p style="text-align: justify; ">Most importantly, the Bill gives the Data Protection Authority the necessary tools to hold companies accountable for the use of AI through the requirement of data protection audits. If the Bill is enacted, it remains to be seen how these audits and the principle of privacy by design are implemented and enforced in the context of companies using AI. Though the Bill creates a Data Protection Authority consisting of members with significant experience in data protection, information technology, data management, data science, cyber and internet laws, and related subjects, these requirements could be further strengthened by including members with backgrounds in ethics and human rights.</p>
<p style="text-align: justify; ">One of the responsibilities of the DPA under the Srikrishna Bill will be to monitor technological developments and commercial practices that may affect the protection of personal data, and to promote measures and undertake research for innovation in the field of personal data protection. If the Bill is enacted, we hope that AI, and privacy-enhancing solutions in the context of AI such as those described above, will be among the DPA's focus areas. It will also be important to see how the DPA develops impact assessments related to AI and what tools associated with the principle of Privacy by Design emerge to address AI.</p>
<hr style="text-align: justify; " />
<p style="text-align: justify; "><a href="#_ftnref1" name="_ftn1"><sup><sup>[1]</sup></sup></a> https://privacyinternational.org/topics/artificial-intelligence</p>
<p style="text-align: justify; "><a href="#_ftnref2" name="_ftn2"><sup><sup>[2]</sup></sup></a> https://www.wired.com/story/our-machines-now-have-knowledge-well-never-understand/</p>
<p style="text-align: justify; "><a href="#_ftnref3" name="_ftn3"><sup><sup>[3]</sup></sup></a> https://iapp.org/news/a/ai-offers-opportunity-to-increase-privacy-for-users/</p>
<p style="text-align: justify; "><a href="#_ftnref4" name="_ftn4"><sup><sup>[4]</sup></sup></a> https://iapp.org/media/pdf/resource_center/GDPR_Study_Maldoff.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref5" name="_ftn5"><sup><sup>[5]</sup></sup></a> https://gdpr-info.eu/art-22-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref6" name="_ftn6"><sup><sup>[6]</sup></sup></a> https://gdpr-info.eu/art-14-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref7" name="_ftn7"><sup><sup>[7]</sup></sup></a> https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref8" name="_ftn8"><sup><sup>[8]</sup></sup></a> https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref9" name="_ftn9"><sup><sup>[9]</sup></sup></a> https://gdpr-info.eu/art-25-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref10" name="_ftn10"><sup><sup>[10]</sup></sup></a> https://ico.org.uk/for-organisations/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-impact-assessments/</p>
<p style="text-align: justify; "><a href="#_ftnref11" name="_ftn11"><sup><sup>[11]</sup></sup></a> https://gdpr-info.eu/art-21-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref12" name="_ftn12"><sup><sup>[12]</sup></sup></a> https://gdpr-info.eu/art-22-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref13" name="_ftn13"><sup><sup>[13]</sup></sup></a> https://gdpr-info.eu/art-14-gdpr/</p>
<p style="text-align: justify; "><a href="#_ftnref14" name="_ftn14"><sup><sup>[14]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter II section 9</p>
<p style="text-align: justify; "><a href="#_ftnref15" name="_ftn15"><sup><sup>[15]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter VII section 29</p>
<p style="text-align: justify; "><a href="#_ftnref16" name="_ftn16"><sup><sup>[16]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter VII section 33</p>
<p style="text-align: justify; "><a href="#_ftnref17" name="_ftn17"><sup><sup>[17]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter VII section 38</p>
<p style="text-align: justify; "><a href="#_ftnref18" name="_ftn18"><sup><sup>[18]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter VIII section 40</p>
<p style="text-align: justify; "><a href="#_ftnref19" name="_ftn19"><sup><sup>[19]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter X section 60</p>
<p style="text-align: justify; "><a href="#_ftnref20" name="_ftn20"><sup><sup>[20]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter II section 4</p>
<p style="text-align: justify; "><a href="#_ftnref21" name="_ftn21"><sup><sup>[21]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter II section 5</p>
<p style="text-align: justify; "><a href="#_ftnref22" name="_ftn22"><sup><sup>[22]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter IX Section 45</p>
<p style="text-align: justify; "><a href="#_ftnref23" name="_ftn23"><sup><sup>[23]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter XIV section 97</p>
<p style="text-align: justify; "><a href="#_ftnref24" name="_ftn24"><sup><sup>[24]</sup></sup></a> Draft Data Protection Bill 2018 - Chapter VII section 31</p>
<p style="text-align: justify; "><a href="#_ftnref25" name="_ftn25"><sup><sup>[25]</sup></sup></a> Srikrishna Committee Report on Data Protection pg. 36 and 37. Available at: http://www.prsindia.org/uploads/media/Data%20Protection/Committee%20Report%20on%20Draft%20Personal%20Data%20Protection%20Bill,%202018.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref26" name="_ftn26"><sup><sup>[26]</sup></sup></a> https://www.ciosummits.com/Online_Assets_DocAuthority_Whitepaper_-_Guide_to_Intelligent_GDPR_Compliance.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref27" name="_ftn27"><sup><sup>[27]</sup></sup></a> https://jolt.law.harvard.edu/assets/articlePDFs/v31/31HarvJLTech217.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref28" name="_ftn28"><sup><sup>[28]</sup></sup></a> https://standards.ieee.org/content/dam/ieee-standards/standards/web/documents/other/ead_personal_data_v2.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref29" name="_ftn29"><sup><sup>[29]</sup></sup></a> https://www.datatilsynet.no/globalassets/global/english/ai-and-privacy.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref30" name="_ftn30"><sup><sup>[30]</sup></sup></a> https://www.artificial-intelligence.blog/news/capsule-networks</p>
<p style="text-align: justify; "><a href="#_ftnref31" name="_ftn31"><sup><sup>[31]</sup></sup></a> http://raird.no/about/factsheet.html</p>
<p style="text-align: justify; "><a href="#_ftnref32" name="_ftn32"><sup><sup>[32]</sup></sup></a> https://www.darpa.mil/attachments/XAIProgramUpdate.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref33" name="_ftn33"><sup><sup>[33]</sup></sup></a> https://www.darpa.mil/attachments/XAIProgramUpdate.pdf</p>
<p style="text-align: justify; "><a href="#_ftnref34" name="_ftn34"><sup><sup>[34]</sup></sup></a> https://www.oreilly.com/learning/introduction-to-local-interpretable-model-agnostic-explanations-lime</p>
<p style="text-align: justify; "><a href="#_ftnref35" name="_ftn35"><sup><sup>[35]</sup></sup></a> <i>R C Cooper</i> v. <i>Union of India</i>, 1970 SCR (3) 530.</p>
<p style="text-align: justify; "><a href="#_ftnref36" name="_ftn36"><sup><sup>[36]</sup></sup></a> <i>Maneka Gandhi</i> v. <i>Union of India</i>, 1978 SCR (2) 621.</p>
<p style="text-align: justify; "><a href="#_ftnref37" name="_ftn37"><sup><sup>[37]</sup></sup></a> 94 US 113 (1877).</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india'>http://editors.cis-india.org/internet-governance/blog/the-srikrishna-committee-data-protection-bill-and-artificial-intelligence-in-india</a>
</p>
No publisher · Amber Sinha and Elonnai Hickok · Internet Governance · Artificial Intelligence · Privacy · 2018-09-03T13:29:12Z · Blog Entry
Celebrating One Year of the Justice K.S. Puttaswamy v. Union of India Judgment
http://editors.cis-india.org/internet-governance/news/celebrating-one-year-of-the-justice-k-s-puttaswamy-v-union-of-india-judgment
<b>Shweta Mohandas was a panelist at the event, "Celebrating One Year of the Justice K.S. Puttaswamy v. Union of India Judgment", organised by the Indian Council for Research on International Economic Relations and the Centre for Communication Governance at National Law University Delhi. It took place on Friday, 24 August 2018 at the India International Centre, New Delhi.</b>
<p style="text-align: justify; ">The event began with Dr. Usha Ramanathan's opening remarks on the State of Privacy in India and the Challenges to Realising Puttaswamy’s Promise. This was followed by two panel discussions, the first on Data Protection for a Free and Fair Digital Economy and the second on the Legacy of the Justice K.S. Puttaswamy v. Union of India Judgment. Shweta participated in the second panel. More details of the event <a class="external-link" href="https://ccgnludelhi.wordpress.com/2018/08/22/celebrating-one-year-of-the-puttaswamy-judgment-august-24-6-00-pm-iic/">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/celebrating-one-year-of-the-justice-k-s-puttaswamy-v-union-of-india-judgment'>http://editors.cis-india.org/internet-governance/news/celebrating-one-year-of-the-justice-k-s-puttaswamy-v-union-of-india-judgment</a>
</p>
No publisher · Admin · Internet Governance · Privacy · 2018-08-30T02:53:48Z · News Item
20 years of Google: Privacy, fake news and the future
http://editors.cis-india.org/internet-governance/news/hindustan-times-rachel-lopez-august-26-2018-20-years-of-google-privacy-fake-news-and-future
<b>Google once directed you to information. Today, it’s often the source of information, using data you and others have shared, often without you realising it. Public knowledge goes where Google takes it. And 20 years on, not everyone’s happy with the journey.</b>
<p style="text-align: justify; ">The article by Rachel Lopez was published in <a class="external-link" href="https://www.hindustantimes.com/india-news/20-years-of-google-privacy-fake-news-and-the-future/story-0jmwFxnhwz8lWFUCbMxBjM.html">Hindustan Times</a> on August 26, 2018. Pranesh Prakash was quoted.</p>
<hr />
<p style="text-align: justify; ">Happy Birthday, Google. The search engine is 20 this year, and what a ride it’s been! When Sergey Brin and Larry Page were developing software that <a href="https://www.hindustantimes.com/india-news/20-years-of-google-when-information-was-not-just-a-click-away/story-aIDWzxXMQd10ShuhL62vcI.html" target="_blank">searched better and loaded faster </a>than Explorer, Navigator and AltaVista, the web itself consisted of just 1 lakh websites.</p>
<p style="text-align: justify; ">Google’s mission statement was succinct: To organise the world’s information and make it universally accessible. Their corporate code of conduct was even simpler: Don’t be evil.</p>
<p style="text-align: justify; ">Perhaps even Google didn’t realise where its mission would take it. The following decade brought Google News, Gmail, Maps and Chrome. By 2014, the internet had grown to 1 billion websites. The search engine, their core product, had become the default homepage of the Internet.</p>
<p style="text-align: justify; ">In May this year, Google quietly dropped the ‘Don’t be evil’ tag. The same month, its Android operating system crossed 2 billion monthly active devices. <a href="https://www.hindustantimes.com/india-news/20-years-of-google-there-s-something-for-everyone-here/story-eS5rDm76QFNgZIXwY3kGuM.html" target="_blank">Seven products (including YouTube and Google Play</a>) now reach a combined 1 billion users.</p>
<p style="text-align: justify; ">Google once directed you to information. Today, it’s often the source of information (in ads and top-of-the-page blocs), using data you and others have shared, often without you realising it. Public knowledge goes where Google takes it. And 20 years on, not everyone’s happy with the <a href="https://www.hindustantimes.com/india-news/20-years-of-google-the-journey-to-omnipresence/story-Ehr55MBGNOV0j3Jd9XhdyO.html" target="_blank">journey</a>.</p>
<p style="text-align: justify; ">“The key concern is that Google has grown so big,” says Pranesh Prakash, policy director at Bangalore’s Centre for Internet & Society. “It’s like the classic line from [Spiderman’s] Uncle Ben: With great power comes great responsibility. In Google’s case, its great size is what brought great power to begin with.”</p>
<p style="text-align: justify; ">For billions of Google users, the biggest concerns are now of <a href="https://www.hindustantimes.com/india-news/i-believe-the-most-exciting-moment-for-google-in-india-hasn-t-happened-yet-rajan-anandan/story-8goKIyIadDBKit0wyz7xYP.html" target="_blank">privacy and accountability</a>, says Nikhil Pahwa, founder of Medianama, which analyses digital and telecom businesses. “There are few checks on Google’s ability to take, retain and process information from users,” he says.</p>
<h3 style="text-align: justify; ">Hits and misses</h3>
<p style="text-align: justify; ">For Google, all is going according to plan. Its search engine is now smart enough to complete your sentences. It’s learning constantly from what you search for, watch, spend on, share and regret; it knows your commute and your vacation plans. And it’s profiting from this knowledge.</p>
<p style="text-align: justify; ">In the UK, Google is being sued for bypassing iPhone privacy settings to track and collect data from 4.4 million users in 2011 and 2012. Information on race, physical and mental health, political leanings, sexuality, shopping habits and locations was apparently used to build advertising categories. Google also creates products for the US government, and has user data from around the world. “Any entity that has this much insight into us, and is in a position to use it, whether for the government or commercial gain, is cause for worry,” says Prakash. Most users aren’t worried, and that’s worrying too. We don’t realise how much data is being tracked or collected. The more we share, the more useful Google gets, and the greater its potential for misuse, for mapping say, beef-eaters, online dissenters, LGBT supporters or single women who work late.</p>
<p style="text-align: justify; ">The Internet’s other giant, Facebook, recently suspended 400 apps over privacy concerns, admitting that 87 million users may have had data compromised in 2016. Meanwhile, even non-Google apps are capable of hijacking data using software developed by Google. Weather apps look at your photo gallery, ride-sharing software keep tracking you after the ride, games are checking out your texts as you play. Gmail knows your flight timings, how many steps you’ve walked, and your last bank transaction.</p>
<h3 style="text-align: justify; ">Search for tomorrow</h3>
<p style="text-align: justify; ">Perhaps the biggest concerns are with Google’s artificial intelligence technology, the brand’s great leap forward fuelled by its massive data reserves. The tech is already being criticised for being fed biased data, creating global services that mirror the prejudices of an insular, mostly white, mostly male, tech industry.<br /><br />Sara Wachter-Boettcher, author of Technically Wrong, which looks at how technology reflects sexism and the biases of the people that create it, says this creates problems. “Google develops tools that other tech companies rely on to build other products,” she says. So its biases spread to other products too. As machines learn, Google is starting to unlearn too.<br /><br />“Machine unlearning is basically recognising when a machine has learned something inaccurate, or biased, and then erasing that learning,” says Wachter-Boettcher. In Africa, the company (along with Facebook) now funds a Masters course in machine intelligence to improve the industry’s diversity. Last year, Google took its first steps to curb fake news hits on its search engines with tools that allow users to report misleading or offensive content.<br /><br />But perhaps it’s time to work towards a future in which Google will be monitored in real time, in different countries, rather than depending on the company to offer a fix after a misstep. Prakash believes that the way forward is reimagining an Internet where Google isn’t the first and last word on everything. “This doesn’t mean more companies like Google but searching that happens in a more decentralised way,” he says. “We need to save the web from large monopolies in the long run.”</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/hindustan-times-rachel-lopez-august-26-2018-20-years-of-google-privacy-fake-news-and-future'>http://editors.cis-india.org/internet-governance/news/hindustan-times-rachel-lopez-august-26-2018-20-years-of-google-privacy-fake-news-and-future</a>
</p>
No publisher · Admin · Internet Governance · Privacy · 2018-08-30T02:49:06Z · News Item
UNESCAP Google AI Meeting
http://editors.cis-india.org/internet-governance/news/unescap-google-ai-meeting
<b>Arindrajit was a panelist at the event on AI in public service delivery hosted by UNESCAP Bangkok on August 29, 2018. The event was co-organized by Economic and Social Commission for Asia and the Pacific and Google.</b>
<p style="text-align: justify; ">The discussion centered around two questions: (1) Is AI different from other technological advancements in the past? and (2) What should policy-makers do to enhance AI in public service delivery? The other panelists were Dr. Urs Gasser (Berkman), Vidushi Marda (Article 19), Malavika Jayaram (Digital Asia Hub) and Jake Lucchi (Google). The panel was a platform to discuss some of the findings of our case studies on healthcare and agriculture, which will receive comments and be published in November.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/unescap-google-ai-meeting'>http://editors.cis-india.org/internet-governance/news/unescap-google-ai-meeting</a>
</p>
No publisher · Admin · Internet Governance · Artificial Intelligence · Privacy · 2018-09-20T15:47:42Z · News Item
Consumer Care Society: Silver Jubilee Year Celebrations
http://editors.cis-india.org/internet-governance/blog/consumer-care-society-silver-jubilee-year-celebrations
<b>Arindrajit Basu delivered a talk at the Silver Jubilee Celebrations of the Consumer Care Society (CCS) on 'Privacy and Security in the Age of the Internet'.</b>
<p style="text-align: justify; ">The Consumer Care Society (CCS) is an active, volunteer-based not-for-profit organization involved in consumer activities. Established as a registered society in 1994, CCS has since functioned as the voice of the consumer in many forums. Today CCS is widely recognized as a premier consumer voluntary organization (CVO) in Bangalore and Karnataka. CCS is registered with many governmental agencies and regulators such as TRAI, BIS, the Petroleum and Natural Gas Regulatory Board, DOT, and ICMR at the central government level, and with almost all service providers at the state level, such as BWSSB, BESCOM, BDA, and BBMP.</p>
<p style="text-align: justify; ">Shreenivas S. Galgali, ITS, Adviser, TRAI Regional Office, Bangalore, and Aradhana Biradar, User Education and Research Specialist, Google, were the other speakers at the event held at CCS.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/consumer-care-society-silver-jubilee-year-celebrations'>http://editors.cis-india.org/internet-governance/blog/consumer-care-society-silver-jubilee-year-celebrations</a>
</p>
No publisher · Arindrajit Basu · Internet Governance · Privacy · 2018-08-27T13:51:13Z · Blog Entry
An Analysis of the CLOUD Act and Implications for India
http://editors.cis-india.org/internet-governance/blog/an-analysis-of-the-cloud-act-and-implications-for-india
<b>India houses the second largest population in the world, at approximately 1.35 billion individuals. In such a diverse and dense context, law enforcement is a challenging job.</b>
<h3 style="text-align: justify; ">Introduction</h3>
<p style="text-align: justify; ">Networked technologies have changed the nature of crime and will continue to do so. Access to data generated by digital technologies and on digital platforms is important in solving both online and offline crimes. Yet a significant amount of such data is stored predominantly under the control of companies in the United States. Indian law enforcement can access metadata (location data or subscriber information) by sending a request directly to the company. For access to content data, however, law enforcement must follow the MLAT process as a result of requirements under the Electronic Communications Privacy Act (ECPA). ECPA allows service providers to share metadata on request of foreign governments, but requires a judicially issued warrant based on a finding of ‘probable cause’ for a service provider to share content data.</p>
<p style="text-align: justify; ">The challenges associated with accessing data across borders have been an area of concern for India for many years. From data localization requirements and legal decryption mandates to proposed back doors, law enforcement and the government have consistently been trying to find efficient ways to access data across borders.</p>
<p style="text-align: justify; ">Towards finding solutions to the challenges in the MLAT process, Peter Swire and Deven Desai, in the article “A Qualified SPOC Approach for India and Mutual Legal Assistance”, have noted the importance of addressing the hurdles in the India–US MLAT. They suggest that reforms to the MLAT process in India should not start with law enforcement, and instead propose the establishment of a Single Point of Contact designated to handle and process government-to-government requests, with requests emerging from that office receiving special legal treatment.</p>
<p style="text-align: justify; ">Frustrations with cross-border sharing of data are not unique to India, and the framework has been recognized by many stakeholders as outdated, slow, and inefficient, giving rise to calls from governments, law enforcement, and companies for solutions. That said, some research has highlighted that the identified issues with the MLAT system are broadly stated, and that more evidence is needed to support each concern and inform policy responses.</p>
<p style="text-align: justify; ">Towards this, the US and EU have undertaken clear policy steps to address the tensions in the MLAT system by enabling direct government access to content data. On April 17, 2018, the European Union published the E-Evidence Directive and a Regulation that allow a law enforcement agency to obtain electronic evidence from service providers within 10 days of receiving a request (or 6 hours for emergency requests) and to request the preservation or production of data. Production orders for content and transactional records can be issued only for certain serious crimes and must be issued by a judge. No judicial authorisation is required for production orders for subscriber information and access data, which can be sought to investigate any criminal offense, not just serious offenses. Preservation orders can be issued without judicial authorisation for all four types of data and for the investigation of any crime. Further, requests originating from the European Union must be handled by a designated legal representative.</p>
<p style="text-align: justify; ">On the US side, in 2016 the Department of Justice (DoJ) put out draft legislation that would create a framework allowing the US to enter into executive agreements with countries evaluated as meeting criteria defined in the law. Our response to the DoJ draft Bill can be found here. In February 2018, the Microsoft Ireland case was argued before the U.S. Supreme Court. The question central to the case was whether a US warrant issued against a company incorporated in the US was valid if the data was stored in servers outside the US. On March 23, 2018, the United States enacted the “Clarifying Lawful Overseas Use of Data Act”, also known as the CLOUD Act, which resolves the dilemma found in the Microsoft Ireland case. The CLOUD Act amends Title 18 of the United States Code and allows U.S. law enforcement agencies to access data stored abroad by increasing the reach of the U.S. Stored Communications Act, enabling access without requiring the specific cooperation of foreign governments. Under this law, U.S. law enforcement agencies can seek or issue orders that compel companies to provide data regardless of where the data is located, as long as the data is under their “possession, custody or control”. It further allows US communication service providers to intercept or provide the content of communications in response to orders from foreign governments, if the foreign government has entered into an executive agreement with the US, approved by the Attorney General with the concurrence of the Secretary of State. The Act also absolves companies from criminal and civil liability when disclosing information in good faith pursuant to an executive agreement between the US and a foreign country. Such access would be reciprocal, with the US government having similar access rights to data stored in the foreign country.</p>
<p style="text-align: justify; ">Though the E-Evidence Directive is a significant development, this article focuses on the CLOUD Act and its implications for cross-border sharing of data between India and the US.</p>
<hr />
<p>To read more <b><a class="external-link" href="http://cis-india.org/internet-governance/files/analysis-of-cloud-act-and-implications-for-india">download the PDF</a></b></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/an-analysis-of-the-cloud-act-and-implications-for-india'>http://editors.cis-india.org/internet-governance/blog/an-analysis-of-the-cloud-act-and-implications-for-india</a>
</p>
No publisher · Elonnai Hickok and Vipul Kharbanda · Cloud Act · Internet Governance · Privacy · 2018-08-22T14:55:56Z · Blog Entry
DNA ‘Evidence’: Only Opinion, Not Science, And Definitely Not Proof Of Crime!
http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime
<b>On August 9, 2018, the DNA Technology (Use and Application) Regulation Bill, 2018 was introduced in the Lok Sabha and we commented on some key aspects of it earlier. </b>
<p style="text-align: justify; ">The article was published in <a class="external-link" href="https://www.bloombergquint.com/opinion/2018/08/20/dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime#gs.nyAe84A">Bloomberg Quint</a> on August 20, 2018.</p>
<hr />
<p style="text-align: justify; ">Though taking some steps in the right direction, such as formalising the process for lab accreditation, the Bill ignores many potential cases of ‘harm’ that may arise out of the collection, databasing, and use of DNA evidence for criminal and civil purposes.</p>
<p style="text-align: justify; ">DNA evidence is widely touted as the most accurate forensic tool, but what is not widely publicised is that it is not infallible. From crime scene to database, it is extremely vulnerable to a number of unknown variables and outcomes. These variables are only increasing as the technology becomes more precise – profiles can now be developed from only a few cells, and technology exists that generates a profile in 90 minutes. Primary and secondary transfer, contamination, incomplete samples, samples mixed from too many people, and inaccurate or outdated methods of analysis and statistical methodologies are all serious reasons why DNA evidence may paint an innocent person as guilty.</p>
<blockquote class="quoted" style="text-align: justify; ">Importantly, DNA itself is not static and predicting how it may have changed over time is virtually impossible.</blockquote>
<h3 style="text-align: justify; ">Innocent, But Charged</h3>
<p style="text-align: justify; ">In April 2018, <a href="https://www.wired.com/story/dna-transfer-framed-murder/" target="_blank">WIRED carried a story </a>of Lukis Anderson, who was charged with the first-degree murder of Raveesh Kumra, a Silicon Valley investor, after investigators found Anderson’s DNA on Kumra’s nails. Long story short: earlier that day, Anderson had been intoxicated in public and had been attended to by paramedics. The same paramedics later handled Kumra’s body and inadvertently transferred Anderson’s DNA to it. The story quotes some sobering facts that research has found about DNA:</p>
<ol>
<li>Direct contact is not necessary for DNA to be transferred. In an experiment with a group of individuals sharing a bottle of juice, 50 percent of participants had another person’s DNA on their hands, and a third of the glasses contained DNA from individuals who had not directly touched them.</li>
<li>An average person sheds 50 million skin cells a day.</li>
<li>Even when we stand still, our DNA can travel over a yard away, and it can easily be carried for miles on others’ clothing or hair, not very differently from pollen.</li>
<li>In an experiment that tested public items, it was found that a single item can contain DNA from half a dozen people.</li>
<li>A friendly or inadvertent contact can transfer DNA to private regions or clothing.</li>
<li>Different people shed DNA-containing detritus at different rates.</li>
<li>One in five people continuously carries another person’s DNA under their fingernails.</li>
</ol>
<div style="text-align: center; "><img src="http://editors.cis-india.org/home-images/BloombergPic.png/@@images/6eed536e-0142-44b7-a710-60d812d3bc1e.png" alt="Crime Scene Tape in Alexandria" class="image-inline" title="Crime Scene Tape in Alexandria" /></div>
<div style="text-align: center; ">A police office carries crime scene tape in Alexandria, Virginia, U.S. (Photographer: Andrew Harrer/Bloomberg)</div>
<p style="text-align: justify; "><a href="https://www.wired.com/2015/10/familial-dna-evidence-turns-innocent-people-into-crime-suspects/" target="_blank">In another case</a>, the police in Idaho, USA, used a public DNA database to run a familial DNA search – a technique used to identify suspects whose DNA is not recorded in a law enforcement database, but whose close relatives have had their genetic profiles cataloged, just as India's DNA Bill seeks to do. The partial match that resulted implicated Michael Usry, the son of the man whose DNA was in the public database. It took 33 days for Michael to be cleared of the crime. That an innocent man only spent 33 days under suspicion could be considered a positive outcome when compared to the case of <a href="https://www.theatlantic.com/magazine/archive/2016/06/a-reasonable-doubt/480747/" target="_blank">Josiah Sutton</a> who spent four years convicted of rape in prison due to misinterpretation of DNA samples by the Houston Police Department Crime Laboratory, which is among the largest public forensic centers in Texas. The Atlantic called this out as “The False Promise of DNA Testing – the forensic technique is becoming ever more common and ever less reliable”.</p>
<blockquote class="quoted" style="text-align: justify; ">Presently, there is little confidence that such safeguards exist: prosecutors do not share exculpatory evidence with the accused; India does not follow the ‘fruit of the poisonous tree’ doctrine with respect to the admissibility of evidence; and India has yet to develop a robust jurisprudence for evaluating scientific evidence.</blockquote>
<p style="text-align: justify; ">The 2015 Law Commission Report cites four cases that speak to the role of, and reliance on, expert opinion as evidence. Though these cases point to the importance of expert opinion, they differ on the weight that should be given to it.<a href="http://www.genewatch.org/uploads/f03c6d66a9b354535738483c1c3d49e4/BestPractice_Report_plus_cover_final.pdf" target="_blank"> International best practice</a> requires the submission of corroborating evidence, training of law enforcement and court officers, and equal access to forensic evidence for prosecution and defence.</p>
<p style="text-align: justify; ">Consider India: a population of 1.3 billion, the majority residing in rural areas with lower levels of education and a<a href="https://www.weforum.org/agenda/2017/10/india-has-139-million-internal-migrants-we-must-not-forget-them/" target="_blank"> heavy migrant population</a> in urban centres, an overwhelmed police force in the nascent stages of forensic training, an overburdened judiciary, and no concrete laws governing the <a href="http://jlsr.thelawbrigade.com/index.php/2017/06/16/admissibility-of-dna-in-indian-legal-system/" target="_blank">admissibility of forensic techniques</a>.</p>
<p style="text-align: justify; ">In such circumstances, the question is not only how many criminals can be convicted but also how many innocents could be convicted.</p>
<p style="text-align: center; "><img src="http://editors.cis-india.org/home-images/Handcuffs.png/@@images/ada66bb0-965f-404f-b434-bb8d36110544.png" alt="Handcuffs" class="image-inline" title="Handcuffs" /></p>
<p style="text-align: center; ">A pair of standard issue handcuffs sits on a table. (Photographer: Jerome Favre/Bloomberg)</p>
<p style="text-align: justify; ">The DNA Bill seeks to establish DNA databanks at the regional and national levels, but how this will be operationalised is not quite clear. The Bill empowers the DNA Regulatory Board to accredit DNA labs. Will the databases be built from scratch? Will they begin by pulling in existing databases?</p>
<p style="text-align: justify; ">The question is not if the DNA samples match but how they came to match. The greater power that comes from the use of DNA databases requires greater responsibility in ensuring adequate information, process, training, and laws are in place for everyone – those who give DNA, collect DNA, store DNA, process DNA, present DNA, and eventually decide on the use of the DNA. As India matures in its use of DNA evidence for forensic purposes it is important that it keeps at the forefront what is necessary to ensure and protect the rights of the individual.</p>
<div class="story-element-text story-element">
<div>
<hr />
<p style="text-align: justify; "><i>Elonnai Hickok is Chief Operating Officer at The Centre for Internet and Society. Murali Neelakantan is an expert in healthcare laws, and the author of</i> ‘<i>DNA Testing as Evidence - A Judge</i>’<i>s Nightmare</i>’ <i>in the Journal of Law and Medicine.</i></p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime'>http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-elonnai-hickok-and-murali-neelakantan-august-20-2018-dna-evidence-only-opinion-not-science-and-definitely-not-proof-of-crime</a>
</p>
No publisherElonnai Hickok and Murali NeelakantanDNA ProfilingInternet GovernancePrivacy2018-08-22T00:43:54ZBlog EntryUse of Visuals and Nudges in Privacy Notices
http://editors.cis-india.org/internet-governance/blog/use-of-visuals-and-nudges-in-privacy-notices
<b>Nudging in privacy notices can be a privacy-enhancing tool. For example, informing users of how many people would have access to their data would help them make a decision. However, nudges can also be used to influence users towards making choices that compromise their privacy. For example, the visual design of default options on digital platforms currently nudge users to share their data. It is critical to ensure that there is mindful use of nudges, and that it is directed at the well being of the users.</b>
<p><em>Edited by Elonnai Hickok and Amber Sinha</em></p>
<hr />
<p style="text-align: justify;">Justice B.N. Srikrishna, the former Supreme Court judge currently drafting India’s new data-privacy law, was recently quoted by Bloomberg<a name="_ftnref1" href="#_ftn1"><sup>[1]</sup></a>. Acknowledging the ineffectiveness of tech companies’ consent forms, which leads to users’ data being collected and misused, he asked whether consent should carry pictographic warnings, much like those on cigarette packets. His concern is that the average Indian does not realise how much data they generate or how it is used. He attributed this to the inaccessibility of companies’ consent forms, which are presented in English. In the Indian context, Justice Srikrishna pointed out, considerations of literacy and language must be addressed.</p>
<p style="text-align: justify;">The new framework being worked on by Srikrishna and his committee of academics and government officials would make tech companies more accountable for data collection and use, and give users more control over their own data. But alongside this regulatory step towards privacy and data protection, how companies communicate their data practices through consent forms or privacy notices is also critical for users. Currently, cryptic notices are a barrier for users, as are services that rely on blanket consent forms taken at the start of a service instead of providing incremental information - for example, what data is being shared with how many people, or what data is being collected at what point. Visuals can go a long way in making these notices and services accessible to users.</p>
<p style="text-align: justify;">Although Justice Srikrishna chose the extreme example of cigarette-packet warnings, which visually depict the health risks of smoking using repulsive imagery, the underlying intent seems to be to use visuals to give an immediate, clear warning about how people’s data is being used and by whom. It must be noted that the effectiveness of cigarette-packet warnings is debatable. Such warnings are also a way for manufacturers to consider their accountability met, which is a possible danger with privacy notices as well. Most companies treat their accountability as limited to giving users all the information, without ensuring that it is communicated in a way that helps the user understand the risks. Hence, one has to be cautious that visuals in notices serve the primary purpose of meaningful communication and accessibility, and can inform further action. A visual summary of a data practice, in terms of how it will affect the user, will also serve as a warning.</p>
<p style="text-align: justify;">The warning images on cigarette packets are an example of the user-influencing design approach called nudging<a name="_ftnref2" href="#_ftn2"><sup><sup>[2]</sup></sup></a>. While nudging techniques are meant to serve users’ well-being, they raise the question of who decides what is beneficial for users. Moreover, the harm of cigarette smoking is obvious, so the favourable choice for users is also clear. In the context of data privacy, however, the harms are less apparent. It is difficult to demonstrate the harms or benefits of data use, particularly when data is re-purposed or used indirectly. Nor is there a single choice that can be pushed when it comes to the collection and use of data: different users may have different preferences, or different degrees to which they would allow their data to be used. This raises deeper questions about the extent to which privacy law and regulation should be paternalistic.</p>
<p style="text-align: justify;">Nudges are considered to follow the soft, or libertarian, paternalism approach, in which the user is not forbidden any option but is given a push that alters their behaviour in a predictable way<a name="_ftnref3" href="#_ftn3"><sup><sup>[3]</sup></sup></a>. It is crucial to differentiate between the strong paternalistic approach, which allows no choice at all, the usability approach, and the soft paternalistic approach of nudging, as Alessandro Acquisti notes in his paper ‘The Behavioral Economics of Personal Information’<a name="_ftnref4" href="#_ftn4"><sup><sup>[4]</sup></sup></a>. In the usability approach, the design of the system makes it intuitive for users to change settings and secure their data. The soft paternalistic approach of nudging goes a step further and presents secure settings as the default. Usability is often prioritised by designers; however, soft paternalism techniques help enhance choice for users and lead to greater welfare<a name="_ftnref5" href="#_ftn5"><sup><sup>[5]</sup></sup></a>.</p>
<p style="text-align: justify;">Nudging in privacy notices can be a privacy-enhancing tool. For example, informing users of how many people would have access to their data would help them make a decision<a name="_ftnref6" href="#_ftn6"><sup><sup>[6]</sup></sup></a>. However, nudges can also be used to influence users towards making choices that compromise their privacy. For example, the visual design of default options on digital platforms currently nudge users to share their data. It is critical to ensure that there is mindful use of nudges, and that it is directed at the well being of the users.</p>
<p style="text-align: justify;">The design of privacy notices should be re-conceptualised so that they inform users effectively, keeping certain best practices in mind. For instance, a multilayered privacy notice can be used: a very short notice designed for portable digital devices with limited screen space, a condensed notice containing all the key factors in an easy-to-understand way, and a complete notice with all the legal requirements<a name="_ftnref7" href="#_ftn7"><sup><sup>[7]</sup></sup></a>. Along with layering the information, the timing of notices should be designed so that they appear at setup, just in time for the user’s action, or at periodic intervals. In terms of visuals, infographics can be used to depict data flows in a system. Another best practice is to integrate privacy notices with the rest of the system. Designers need to be involved early in the process so that design decisions are not purely visual but also consider information architecture, content design, and research.</p>
<p style="text-align: justify;">Practice-based frameworks should be developed for communication designers in order to build a standardised vocabulary around creating privacy notices. Additionally, multiple user groups and their varied privacy preferences must be taken into account. Finally, an ethical framework must be put in place for design practitioners to ensure that users’ well-being is prioritised and that notices are designed to facilitate informed consent. Further recommendations and concerns regarding the design of privacy notices and the use of visuals can be read <a href="https://cis-india.org/internet-governance/blog/design-concerns-in-creating-privacy-notices">here</a>.</p>
<p style="text-align: justify;">Justice Srikrishna’s statement is an important step towards creating effective privacy notices with visuals. The conversation on the need to design privacy notices can lead to clearer and more comprehensible notices. Combined with the enforcement of fair collection and use of data by companies, well-designed notices will give users more control and a real choice to opt in or out of a service, and to make informed choices as they engage with it. Justice Srikrishna’s analogy seems to recommend using visuals to describe what type of data is being collected, and for what purposes, at the time of taking consent. Though cigarette warnings may not be the most appropriate analogy, this is a good start, and it is worth exploring how visuals and design can be used throughout a service - from beginning to end - to convey and promote awareness and informed choices by users. It is also important to extend this conversation beyond privacy into the realm of security, and to understand how visuals and design can inform users’ awareness and personal choices around security when using a service.</p>
<hr />
<p><a name="_ftn1" href="#_ftnref1"><sup><sup>[1]</sup></sup></a> <a href="https://www.bloomberg.com/news/articles/2018-06-10/tech-giants-nervous-as-judge-drafts-first-data-rules-in-india">https://www.bloomberg.com/news/articles/2018-06-10/tech-giants-nervous-as-judge-drafts-first-data-rules-in-india</a></p>
<p><a name="_ftn2" href="#_ftnref2"><sup><sup>[2]</sup></sup></a> <a href="http://www.ijdesign.org/index.php/IJDesign/article/viewFile/1512/584">http://www.ijdesign.org/index.php/IJDesign/article/viewFile/1512/584</a></p>
<p><a name="_ftn3" href="#_ftnref3"><sup><sup>[3]</sup></sup></a> <a href="https://www.andrew.cmu.edu/user/pgl/psosm2013.pdf">https://www.andrew.cmu.edu/user/pgl/psosm2013.pdf</a></p>
<p><a name="_ftn4" href="#_ftnref4"><sup><sup>[4]</sup></sup></a> <a href="https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf">https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf</a></p>
<p><a name="_ftn5" href="#_ftnref5"><sup><sup>[5]</sup></sup></a> <a href="https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf">https://www.heinz.cmu.edu/~acquisti/papers/acquisti-privacy-nudging.pdf</a></p>
<p><a name="_ftn6" href="#_ftnref6"><sup><sup>[6]</sup></sup></a> <a href="https://cis-india.org/internet-governance/files/rethinking-privacy-principles">https://cis-india.org/internet-governance/files/rethinking-privacy-principles</a></p>
<p><a name="_ftn7" href="#_ftnref7"><sup><sup>[7]</sup></sup></a> <a href="https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf">https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/ten_steps_to_develop_a_multilayered_privacy_notice__white_paper_march_2007_.pdf</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/use-of-visuals-and-nudges-in-privacy-notices'>http://editors.cis-india.org/internet-governance/blog/use-of-visuals-and-nudges-in-privacy-notices</a>
</p>
No publishersaumyaaInternet GovernancePrivacy2018-08-22T13:16:15ZBlog EntryNational Health Stack: Data For Data’s Sake, A Manmade Health Hazard
http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-murali-neelakantan-swaraj-barooah-swagam-dasgupta-torsha-sarkar-august-14-2018-national-health-stack-data-for-datas-sake-a-manmade-health-hazard
<b>On Oct. 5, 2017, an HIV positive woman was denied admission in Hyderabad’s Osmania General Hospital even though she was entitled to free treatment under India’s National AIDS Control Organisation programme. Another incident around the same time witnessed a 24-year-old pregnant woman at Tikamgarh district hospital in Madhya Pradesh being denied treatment by hospital doctors once she tested positive for HIV. The patient reportedly delivered the twins outside the maternity ward after she was turned away by the hospital, but her newborn twin girls died soon after.</b>
<p style="text-align: justify; ">The op-ed was <a class="external-link" href="https://www.bloombergquint.com/opinion/2018/08/14/data-for-datas-sake-a-manmade-health-hazard#gs.bT20zK4">published in Bloomberg Quint</a> on August 14, 2018.</p>
<hr />
<p style="text-align: justify; ">Apart from facing the severity of their condition, patients afflicted with diseases such as HIV, tuberculosis, and mental illnesses are often subject to social stigma, sometimes even leading to the denial of medical treatment. Given this grim reality, would patients want their full medical history in a database?</p>
<p style="text-align: justify; ">The ‘National Health Stack’ as described by the NITI Aayog in its consultation paper, is an ambitious attempt to build a digital infrastructure with a “deep understanding of the incentive structures prevalent in the Indian healthcare ecosystem”. If the government is to create a database of individuals’ health records, then it should appreciate the differential impact that it could have on the patients.</p>
<blockquote>The collection of health data, without sensitisation and accountability, has the potential to deny healthcare to the vulnerable.</blockquote>
<p style="text-align: justify; ">We have innumerable instances of denial of services due to Aadhaar and there is a real risk that another database will lead to more denial of access to the most vulnerable.</p>
<p style="text-align: justify; ">Earlier, we had outlined some key aspects of the NHS, the ‘world’s largest’ government-funded national healthcare scheme. Here we discuss some of the core technical issues surrounding the question of data collection, updating, quality, and utilisation.</p>
<h3>Resting On A Flimsy Foundation: The Unique Health ID</h3>
<p style="text-align: justify; ">The National Health Stack envisages the creation of a unique ID for registered beneficiaries in the system — a ‘Digital Health ID’. Upon the submission of a ‘national identifier’ and completion of the Know Your Customer process, the patient would be registered in the system, and a unique health ID generated.</p>
<p style="text-align: justify; ">This seemingly straightforward process rests on a very flimsy foundation. The base entry in the beneficiary registry would be linked to a ‘strong foundational ID’. Extreme care needs to be taken to ensure that this is not limited to an Aadhaar number. Currently, the unavailability of Aadhaar is not a ground for denying treatment only on a patient’s first visit; thereafter, the patient must provide an Aadhaar number or an Aadhaar enrolment slip to avail treatment. This suggests that the national healthcare infrastructure will be geared towards increasing Aadhaar enrolment, with the unstated implication that healthcare is a benefit or subsidy — a largess of government, and not, as the courts have confirmed, a fundamental right.</p>
<blockquote style="text-align: justify; ">Not only is this project using government-funded infrastructure to deny its citizens the fundamental right to healthcare, it is using the desperate need of the vulnerable for healthcare to push the ‘Aadhaar’ agenda.</blockquote>
<p style="text-align: justify; ">Any pretence that Aadhaar is voluntary is slowly fading with the government mandating it at every step of our lives.</p>
<p style="text-align: justify; "><img alt="Aadhaar Seva kendra. (Source: Aadhaar Official Account/Facebook)&nbsp;" class="qt-image" src="https://images.assettype.com/bloombergquint%2F2018-01%2Fd7f4b53a-b069-484d-8c28-511c516aa4d5%2F3a192ed0-8a18-4518-95be-ac5234239e94.jpg?w=480&auto=format%2Ccompress" /></p>
<div class="visualClear" style="text-align: justify; ">Aadhaar Seva Kendra. (Source: Aadhaar Official Account/Facebook)</div>
<div class="visualClear" style="text-align: justify; "></div>
<h3>Is The Health ID An Effective And Unique Identifier?</h3>
<p style="text-align: justify; ">Even if we choose to look past the fact that the validity of Aadhaar is still pending the test of legality before the apex court, a foundational ID should mean that the data contained within that ID is unique, accurate, incorruptible, and cannot be misused. These principles, unfortunately, have been compromised by the UIDAI in the Aadhaar project: a lack of uniqueness of identity (i.e., fake and duplicate IDs), failures to authenticate identity, numerous alleged data leaks (‘alleged’ because UIDAI maintains that there have been none), a lack of connectivity to authenticate identity, and numerous instances of inaccurate information that cannot be corrected.</p>
<p>Linking something as crucial and basic as healthcare data with such a database is a potential disaster.</p>
<p>There is a real risk that incorrect linking could cause deaths or inappropriate medical care.</p>
<h3>The High Risk Of Poor Quality Data</h3>
<p style="text-align: justify; ">The NITI Aayog paper envisages several expansive databases that are capable of being updated by different entities. It includes enrollment and updating processes but seems to assume that all these extra steps will be taken by all the relevant stakeholders and does not explain the motivation for stakeholders to do so.</p>
<p style="text-align: justify; ">In a country where government doctors, hospitals, wellness centres, etc., are overburdened and understaffed, this reliance is simply not credible. For instance, all attributes within the registries are to be digitally signed by an authorised updater, there must be an audit trail for all changes made to the registries, and surveyors will be tasked with visiting providers in person to validate the data. Identifying these precautions as measures to assure accurate data is a great step towards building a national health database, but carrying them out seems an impossible task.</p>
<blockquote>Who are these actors and what will incentivise them to ensure the accuracy and integrity of data?</blockquote>
<p style="text-align: justify; ">In other words, what incentive and accountability structures will ensure that data entry and updating are accurate, and not approached with the ‘<i>jugaad</i>’, let’s-just-get-this-done-for-the-sake-of-it attitude that permeates much of the country? How will patients have access to the database to check its accuracy? Is it possible for a patient (who will presumably be ill) to gain easy access to an updater to change their data? If so, how? It is worth noting that the patient’s ‘right’ to check their data assumes access to a computer connected to the internet, as well as a good level of digital literacy, which is not the case for a significant section of India’s population. Even data portability loses its potential benefits if the quality of data in these registries is not reliable. In that case, healthcare providers will need to verify their patients’ health history using physical records instead, rendering the stack redundant.</p>
<p>Who will be liable to the patient for misdiagnosis based on the database?</p>
<p><img alt="A sonographic image is displayed on a monitor as a patient undergoes an ultrasound scan in Bikaner, Rajasthan, India. (Photographer: Prashanth Vishwanathan/Bloomberg)" class="qt-image" src="https://images.assettype.com/bloombergquint%2F2018-08%2Fe1659408-49ba-4188-b57e-aef377c69eb0%2Fm1291107.jpg?w=480&auto=format%2Ccompress" /></p>
<div class="visualClear">A sonographic image is displayed on a monitor as a patient undergoes an ultrasound scan in Bikaner, Rajasthan, India. (Photographer: Prashanth Vishwanathan/Bloomberg)</div>
<p style="text-align: justify; ">Leaving the question of accountability vague opens updaters to the possibility of facing dangerous and unnecessarily punitive measures in the future. The NITI Aayog paper fails to address this key issue, which arose recently: although tuberculosis is a notifiable disease, numerous doctors in the private sector reportedly failed to notify or update TB cases to the Ministry of Health and Family Welfare, ostensibly on the grounds that they had not received their patients’ consent to share their information with the government. This was met with a harsh response from the government, which stated that clinical establishments failing to notify tuberculosis patients would face jail time. According to a few doctors, the government’s new move would coerce patients into going to ‘underground clinics’ to receive treatment discreetly, and hence would not solve the issue of TB.</p>
<blockquote>The document also offers no specific recommended procedures regarding how inaccurate entries will be corrected or deleted.</blockquote>
<p style="text-align: justify; ">It is then perhaps not a stretch to imagine that these scenarios would affect the quality of the data stored; defeating NITI Aayog’s objective of researchers using the stack for high-quality medical data.</p>
<p style="text-align: justify; ">The reason the quality and integrity of data sits at the head of the table is that all the proposed applications of the NHS (analytics, fraud detection, etc.) assume a high-quality, accurate dataset. At the same time, the enrolment process, the updating process, and the disclosed measures to ensure data quality seem likely to produce poor-quality data. If so, applications derived from the NHS dataset should assume an imperfect dataset rather than an accurate one, which should make one wonder whether no data is better than data that is certainly inaccurate.</p>
<h3>Lack Of Data Utilisation Guidelines</h3>
<p style="text-align: justify; ">Issues with data quality are exacerbated depending on how and where the data is used, and who uses it. The paper identifies some users, namely health-sector stakeholders such as healthcare providers (hospitals, clinics, labs, etc.), beneficiaries, doctors, insurers, and accredited social health activists, but does not lay down utilisation guidelines. The foresight to create a dataset that can be utilised by multiple actors for numerous applications is commendable, but potentially problematic, especially if guidelines on how this data is to be used by stakeholders (especially the private sector) are ignored.</p>
<p style="text-align: justify; ">In order to bridge this knowledge gap, India has the opportunity to learn from the legal precedent set by foreign institutions. As an example, one could examine the Health Information Technology for Economic and Clinical Health Act (HITECH) and the Health Insurance Portability and Accountability Act (HIPAA) in the U.S., which set out strict guidelines for how businesses are to handle sensitive health data in order to maintain individual privacy and security. They go a step further and lay down incentive and accountability structures so that business associates necessarily report security breaches to their respective covered entities.</p>
<blockquote>If we do not take necessary precautions now, we not only run the risk of poor security and breach of privacy but of inaccurate data that renders the national health data repository a health risk for the whole patient population.</blockquote>
<p style="text-align: justify; ">There’s also a lack of clarity on who is meant to benefit from using such a database, and on whether the benefits accrue equally to all stakeholders, but more on that in a subsequent piece.</p>
<p style="text-align: justify; "><img alt="A medical team uses a glucometer to check the blood glucose level of a patient at a mobile clinic in Pancharala, on the outskirts of Bengaluru, India. (Photographer: Dhiraj Singh/Bloomberg)" class="qt-image" src="https://images.assettype.com/bloombergquint%2F2018-08%2F5e7e7b41-1513-4161-b195-5b8a77c6e4f1%2F314780590_1_20.jpg?w=480&auto=format%2Ccompress" /></p>
<div class="visualClear" style="text-align: justify; ">A medical team uses a glucometer to check the blood glucose level of a patient at a mobile clinic in Pancharala, on the outskirts of Bengaluru, India. (Photographer: Dhiraj Singh/Bloomberg)</div>
<div class="visualClear" style="text-align: justify; "></div>
<h3>It’s Your Recipe, You Try It First!</h3>
<p style="text-align: justify; ">If the NITI Aayog and the government are sure that there is a need for a national healthcare database, perhaps they can start using the Central Government Health Scheme (which includes all current and retired government employees and their families) as a pilot scheme for this. Once the software, database and the various apps built on it are found to be good value for money and patients benefit from excellent treatment all over the country, it could be expanded to those who use the Employees’ State Insurance system, and then perhaps to the armed forces. After all, these three groups already have a unique identifier and would benefit from the portability of healthcare records since they are likely to be transferred and posted all over the country. If, and only if, it works for these groups and the claimed benefits are observed, then perhaps it can be expanded to the rest of the country’s healthcare systems.</p>
<p><i>Murali Neelakantan is an expert in healthcare laws. Swaraj Barooah is Policy Director at The Centre for Internet and Society. Swagam Dasgupta and Torsha Sarkar are interns at The Centre for Internet and Society.</i></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-murali-neelakantan-swaraj-barooah-swagam-dasgupta-torsha-sarkar-august-14-2018-national-health-stack-data-for-datas-sake-a-manmade-health-hazard'>http://editors.cis-india.org/internet-governance/blog/bloomberg-quint-murali-neelakantan-swaraj-barooah-swagam-dasgupta-torsha-sarkar-august-14-2018-national-health-stack-data-for-datas-sake-a-manmade-health-hazard</a>
</p>
No publisherMurali Neelakantan, Swaraj Barooah, Swagam Dasgupta and Torsha SarkarPrivacyAadhaarInternet GovernanceHealthcare2018-09-16T05:01:18ZBlog EntryIndia's Contribution to Internet Governance Debates
http://editors.cis-india.org/internet-governance/blog/nlud-student-law-journal-sunil-abraham-mukta-batra-geetha-hariharan-swaraj-barooah-and-akriti-bopanna-indias-contribution-to-internet-governance-debates
<b>"India's Contribution to Internet Governance Debates", an article by Sunil Abraham, Mukta Batra, Geetha Hariharan, Swaraj Barooah and Akriti Bopanna, was recently published in the NLUD Student Law Journal, an annual peer-reviewed journal published by the National Law University, Delhi.</b>
<h2>Abstract</h2>
<p style="text-align: justify; ">India is the leader that championed ‘access to knowledge’ and ‘access to medicine’. However, India holds seemingly conflicting views on the future of the Internet and how it will be governed. India’s stance is evolving and is distinct from that of authoritarian states, which do not care for equal footing and multi-stakeholderism.</p>
<hr />
<h2 style="text-align: justify; ">Introduction</h2>
<p style="text-align: justify; ">Despite John Perry Barlow’s defiant and idealistic Declaration of Independence of Cyberspace in 1996, debates about governing the Internet have been alive since the late 1990s. The tug-of-war over its governance continues to bubble among states, businesses, techies, civil society and users. These stakeholders have wondered who should govern the Internet or parts of it: Should it be the Internet Corporation for Assigned Names and Numbers (ICANN)? The International Telecommunications Union (ITU)? The offspring of the World Summit on Information Society (WSIS) - the Internet Governance Forum (IGF) or Enhanced Cooperation (EC) under the UN? Underlying this debate has been the role and power of each stakeholder at the decision-making table. States in both the global North and South have taken various positions on this issue.</p>
<p style="text-align: justify; ">Should all stakeholders have an equal say in governing the unique structure of the Internet, or do states have sovereign public policy authority? India has, in the past, subscribed to the latter view. For instance, at WSIS in 2003, through Arun Shourie, then India’s Minister for Information Technology, India supported the move ‘requesting the Secretary General to set up a Working Group to think through issues concerning Internet Governance,’ offering him ‘considerable experience in this regard... [and] contribute in whatever way the Secretary General deems appropriate’. The United States (US), United Kingdom (UK) and New Zealand have expressed their support for ‘equal footing multi-stakeholderism’, and Australia subscribes to the status quo.</p>
<p style="text-align: justify; ">India’s position has been much followed, discussed and criticised. In this article, we trace and summarise India’s participation in the IGF, UN General Assembly (‘UNGA’), ITU and the NETmundial conference (April 2014) as a representative sample of Internet governance fora. In these fora, India has been represented by one of three arms of its government: the Department of Electronics and Information Technology (DeitY), the Department of Telecommunications (DoT) and the Ministry of External Affairs (MEA). The DeitY was converted to a full-fledged ministry in 2016 known as the Ministry of Electronics and Information Technology (MeitY). DeitY and DoT were part of the Ministry of Communications and Information Technology (MCIT) until 2016 when it was bifurcated into the Ministry of Communications and MeitY.</p>
<p style="text-align: justify; ">DeitY used to be, and DoT still is, within the Ministry of Communications and Information Technology (MCIT) in India. Though India has been acknowledged globally for championing ‘access to knowledge’ and ‘access to medicine’ at the World Intellectual Property Organization (WIPO) and World Trade Organization (WTO), global civil society and other stakeholders have criticised India’s behaviour in Internet governance for reasons such as a lack of continuity and coherence, and for holding policy positions overlapping with those of authoritarian states.</p>
<p style="text-align: justify; ">We argue that even though confusion about the Indian position arises from a multiplicity of views held within the Indian government, India’s position, in totality, is distinct from those of authoritarian states. Since criticism of the Indian government became more strident in 2011, after India introduced a proposal at the UNGA for a UN Committee on Internet-related Policies (CIRP) comprising states as members, we will begin to trace India's position chronologically from that point onwards.</p>
<hr />
<ul>
<li> Download the paper published in NLUD Student Law Journal <a class="external-link" href="http://cis-india.org/internet-governance/files/indias-contribution-to-internet-governance-debates/">here</a></li>
<li>For a timeline of the events described in the article <a class="external-link" href="http://cis-india.org/internet-governance/files/indias-position-on-multi-stakeholderism-vs-multilateralism">click here</a></li>
<li>Read the paper published by NLUD Student Law Journal <a class="external-link" href="https://nludslj.webs.com/archives">on their website</a></li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/nlud-student-law-journal-sunil-abraham-mukta-batra-geetha-hariharan-swaraj-barooah-and-akriti-bopanna-indias-contribution-to-internet-governance-debates'>http://editors.cis-india.org/internet-governance/blog/nlud-student-law-journal-sunil-abraham-mukta-batra-geetha-hariharan-swaraj-barooah-and-akriti-bopanna-indias-contribution-to-internet-governance-debates</a>
</p>
Sunil Abraham, Mukta Batra, Geetha Hariharan, Swaraj Barooah and Akriti Bopanna · Freedom of Speech and Expression · ICANN · Internet Governance · Privacy · 2018-08-16T15:38:02Z · Blog Entry
Spreading unhappiness equally around
http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around
<b>The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually.</b>
<p>The article was published in <a class="external-link" href="https://www.business-standard.com/article/opinion/spreading-unhappiness-equally-around-118073100008_1.html">Business Standard</a> on July 31, 2018.</p>
<hr />
<p style="text-align: justify; ">There is a joke in policy-making circles — you know you have reached a good compromise if all the relevant stakeholders are equally unhappy. By that measure, the B N Srikrishna committee has done a commendable job since there are many with complaints.</p>
<p style="text-align: justify; ">Some in the private sector are unhappy because their demonisation of the European Union’s General Data Protection Regulation (GDPR) has failed. The committee’s draft data protection Bill is closely modelled upon the GDPR in terms of rights, principles, design of the regulator and the design of regulatory tools like impact assessments. With 4 per cent of global turnover as the maximum fine, there is a clear signal that privacy infringements by transnational corporations will be reined in by the regulator. Getting a law that has copied many elements of the European regulation is good news for us because the GDPR is recognised by leading human rights organisations as the global gold standard. But the bad news for us is that the Bill also has unnecessarily broad data localisation mandates for the private sector.</p>
<p style="text-align: justify; ">Some in the fintech sector are unhappy because the committee rejected the suggestion that privacy be regulated as a property right. This is a positive from the human rights perspective, especially because this approach has been rejected across the globe, including in the European Union. Property rights are inappropriate because a natural law framing of the enclosure of the commons into private property through labour does not translate to personal data. Also, in comparison to patents — or “intellectual property” — the scale of possible discrete property holdings in personal information is several orders of magnitude higher, posing unimaginable complexity for regulation and possibly creating a gridlock economy.</p>
<p style="text-align: justify; ">The section of civil society opposed to Aadhaar is unhappy because the UIDAI and all other state agencies that wish to can process data non-consensually. A similar loophole exists in the GDPR. Remember, the definition of processing includes “operations such as collection, recording, organisation, structuring, storage, adaptation, alteration, retrieval, use, alignment or combination, indexing, disclosure by transmission, dissemination or otherwise making available, restriction, erasure or destruction”. This means the UIDAI can collect data from you without your consent and does not have to establish consent for the data it has collected in the past. There is a “necessary” test which is supposed to constrain data collection. But for the last 10-odd years, the UIDAI has deemed it “necessary” to collect biometrics to give the poor subsidised grain. Will those forms of disproportionate non-consensual data collection continue? Most probably, because the report recommends that the UIDAI continue to play the role of the regulator with heightened powers. Which is like trusting the fox with the henhouse.</p>
<p style="text-align: justify; ">Employees should be unhappy because the Bill has an expansive ground under which employers can non-consensually harvest their data. The Bill allows for non-consensual processing of any data necessary for “recruitment, termination, providing any benefit or service, verifying the attendance or any other activity related to the assessment of the performance”. This is permitted when consent is not an appropriate basis or would involve disproportionate effort on the part of the employer. This is basically a surveillance provision for employers. Either this ground should be removed, as in the GDPR, or a “proportionate” test should also be introduced; otherwise, disproportionate mechanisms like spyware on work computers will be installed by employers without providing notice.</p>
<p style="text-align: justify; ">Some free speech activists are unhappy because the law contains a “right to be forgotten” provision. They are concerned that this will be used by the rich and powerful to censor mainstream and alternative media. On the face of it, the “right to be forgotten” in the GDPR is a much more expansive “right to erasure”, whilst the Bill only provides for a more limited “right to restrict or prevent continuing disclosure”. However, the GDPR has a clear exception for “archiving purposes in the public interest, scientific or historical research purposes or statistical purposes”. The Bill, like the GDPR, does identify the two competing human rights imperatives — freedom of expression and the right to information. However, by missing the “public interest” test, it does not sufficiently address social power asymmetries.</p>
<p style="text-align: justify; ">Privacy and security researchers are unhappy because re-identification has been made an offence without a public interest or research exception. It is indeed a positive that the committee has made re-identification a criminal offence. This is because the de-identification standards notified by the regulator would always be catching up with the latest mathematical developments. However, in order to protect the very research that the regulator needs to protect the rights of individuals, the Bill should have granted the formal and non-formal academic community immunity from liability and criminal prosecution.</p>
<p style="text-align: justify; ">Lastly, but most importantly, human rights activists are unhappy because the committee, again like the GDPR, did not include sufficiently specific surveillance law fixes. The European Union has historically handled this separately in the ePrivacy Regulation. Maybe that is the approach we must also follow, or maybe this was a missed opportunity. Overall, the B N Srikrishna committee must be commended for producing a good data protection Bill. The task before us is to make it great and to have it enacted by Parliament at the earliest.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around'>http://editors.cis-india.org/internet-governance/blog/business-standard-july-31-2018-sunil-abraham-spreading-unhappiness-equally-around</a>
</p>
sunil · Aadhaar · Internet Governance · Privacy · 2018-07-31T14:49:52Z · Blog Entry
Lining up the data on the Srikrishna Privacy Draft Bill
http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill
<b>In the run-up to the Justice BN Srikrishna committee report, some stakeholders have advocated that consent be eliminated and replaced with stronger accountability obligations. This was rejected, and the committee has released a draft bill that has consent as its bedrock, just like the GDPR. And like the GDPR, there exists a legal basis for non-consensual processing of data for the “functions of the state”. What does this mean for law-abiding persons?</b>
<p>The article was published in <a class="external-link" href="https://economictimes.indiatimes.com/small-biz/startups/newsbuzz/lining-up-the-data-on-the-srikrishna-privacy-draft-bill/articleshow/65192296.cms">Economic Times</a> on July 30, 2018</p>
<hr />
<p style="text-align: justify; ">Non-consensual processing is permitted in the bill as long as it is “necessary for any function of the” Parliament or any state legislature. These functions need not be authorised by law.</p>
<p style="text-align: justify; ">Or, alternatively, “necessary for any function of the state authorised by law” for the provision of a service or benefit, or the issuance of any certification, licence or permit.</p>
<p style="text-align: justify; ">Fortunately, however, the state remains bound by the eight obligations in chapter two, i.e., fair and reasonable processing, purpose limitation, collection limitation, lawful processing, notice, data quality, data storage limitation and accountability. This ground in the GDPR has two sub-clauses: one, a requirement that the task pass the public interest test; and two, a loophole which, like the Indian bill’s, possibly includes all interactions the state has with all persons.</p>
<p style="text-align: justify; ">The “necessary” test appears both on the grounds for non-consensual processing, and in the “collection limitation” obligation in chapter two of the bill. For sensitive personal data, the test is raised to “strictly necessary”. But the difference is not clarified and the word “necessary” is used in multiple senses.</p>
<p style="text-align: justify; ">Under the “collection limitation” obligation the bill says “necessary for the purposes of processing” which indicates a connection to the “purpose limitation” obligation. The “purpose limitation” obligation, however, only requires the state to have a purpose that is “clear, specific and lawful” and processing limited to the “specific purpose” and “any other incidental purpose that the data principal would reasonably expect the personal data to be used for”. It is perhaps important at this point to note that the phrase “data minimisation” does not appear anywhere in the bill.</p>
<p style="text-align: justify; ">Therefore “necessary” could be broadly understood to mean data Parliament or the state legislature requires to perform some function unauthorised by law, and data the citizen might reasonably expect a state authority to consider incidental to the provision of a service or benefit, or the issuance of a certificate, licence or permit.</p>
<p style="text-align: justify; ">Or, alternatively, it could be more conservatively understood to mean data without which it would be impossible for Parliament and state legislatures to carry out functions mandated by law, and data without which it would be impossible for the state to provide the specific service or benefit, or issue certificates, licences and permits. It is completely unclear, as with the GDPR, why an additional test of “strictly necessary” is — if you will forgive the redundancy — necessary.</p>
<p style="text-align: justify; ">After 10 years of Aadhaar, the average citizen “reasonably expects” the state to ask for biometric data to provide subsidised grain. But it is not impossible to provide subsidised grain in a corruption-free manner without using surveillance technology that can remotely, covertly and non-consensually identify persons. Smart cards, for example, implement privacy by design. Therefore a “reasonable expectation” test is inappropriate, since this is not a question of changing social mores.</p>
<p style="text-align: justify; ">When it comes to persons that are not law abiding the bill has two exceptions — “security of the state” and “prevention, detection, investigation and prosecution of contraventions of law”. Here the “necessary” test is combined with the “proportionate” test.</p>
<p style="text-align: justify; ">The proportionate test further constrains processing. For example, GPS data may be necessary for detecting that someone has jumped a traffic signal, but using it might not be a proportionate response to a minor violation. Along with the requirement for “procedure established by law”, this is indeed a well carved out exception if the “necessary” test is interpreted conservatively. The only points of concern here are the infringement of a fundamental right for minor offences, and also the “prevention” of offences, which implies processing of personal data of innocent persons.</p>
<p style="text-align: justify; ">Ideally, consent should be introduced for law-abiding citizens even if it is merely tokenism, because you cannot revoke consent if you have not granted it in the first place. Or, alternatively, a less protective option would be to admit that all e-governance in India will be based on surveillance; therefore, “necessary” should be conservatively defined and the “proportionate” test should be introduced as an additional safeguard.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill'>http://editors.cis-india.org/internet-governance/blog/economic-times-july-30-2018-sunil-abraham-lining-up-data-on-srikrishna-privacy-draft-bill</a>
</p>
sunil · Internet Governance · Privacy · 2018-07-31T02:52:23Z · Blog Entry
People Should Have Right To Their Data, Not Companies, Says TRAI
http://editors.cis-india.org/internet-governance/news/bloomberg-quint-july-16-2018-people-should-have-right-to-their-data-not-companies-says-trai
<b>Rules for protection of personal data in the telecom space are not sufficient, regulator TRAI said today while suggesting that consumers be given the right to choice, consent and to be forgotten to safeguard their privacy.</b>
<p style="text-align: justify; ">This was published by <a class="external-link" href="https://www.bloombergquint.com/law-and-policy/2018/07/16/people-should-have-right-to-their-data-not-companies-says-trai#gs.soR5VAU">Bloomberg Quint</a> on July 16, 2018. Pranesh Prakash was interviewed.</p>
<hr />
<p style="text-align: justify; ">Recommending a series of measures on "privacy, security and ownership of data in telecom networks", the Telecom Regulatory Authority of India held that consumers are owners of their data and that entities controlling or processing their information are "mere custodians and do not have primary rights over this data".</p>
<p style="text-align: justify; ">"The Right to Choice, Notice, Consent, Data Portability, and Right to be Forgotten should be conferred upon the telecommunication consumers," TRAI recommended to the Department of Telecom. In order to ensure sufficient choices to the users of digital services, granularities in the consent mechanism should be built-in by the service providers, the regulator added.</p>
<p style="text-align: justify; ">TRAI has suggested that all entities in the digital ecosystem including telecom operators should transparently disclose the information about the privacy breaches on their websites along with the actions taken for mitigation, and preventing such breaches in future.</p>
<p style="text-align: justify; ">“This is the first time I’ve seen TRAI being bold enough to venture into this area,” said Pranesh Prakash, a policy director at the Centre for Internet and Society. “There are many positives here in terms of the data protection regime that they want to set up,” he told BloombergQuint in an interview. “It talks about user choice, consent, about notice being mandatory and simplified in language that people understand rather than two hundred pages of legal forms.”</p>
<blockquote>There are many things in it that law and technology nerds will rejoice over, for example, the need for greater amounts of encryption and asks DoT to revisit the limitations it has put on encryption because those limitations actually harm national security and user privacy.</blockquote>
<p>Pranesh Prakash, Policy Director, Centre for Internet and Society</p>
<p>Here are the highlights from the TRAI’s recommendation:</p>
<ul>
<li>All entities in the digital ecosystem, which control or process the data, should be restrained from using meta-data to identify the individual users.</li>
<li>A study should be undertaken to formulate the standards for anonymisation/de-identification of personal data generated and collected in the digital ecosystem.</li>
<li>Till such time a general data protection law is notified by the government, the existing rules/licence conditions applicable to TSPs for protection of users' privacy be made applicable to all the entities in the digital ecosystem.</li>
<li>The Right to Choice, Notice, Consent, Data Portability, and Right to be forgotten should be conferred upon the telecommunication consumers.</li>
<li>Data Controllers should be prohibited from using "pre-ticked boxes" to gain users' consent. Clauses for data collection and purpose limitation should be incorporated in the agreements.</li>
<li>Sharing of information concerning data security breaches should be encouraged and incentivised to prevent/mitigate such occurrences in future.</li>
</ul>
<p>The recommendations from TRAI come at a time when there are rising concerns around privacy and safety of user data, especially through mobile apps and social media platforms.</p>
<p style="text-align: justify; ">The regulator had issued a consultation paper entitled Privacy, Security and Ownership of Data in the Telecom Sector on Aug 9 last year and an open house discussion was held on Feb. 2. The TRAI had also invited comments and counter comments as part of the consultation.</p>
<p><iframe frameborder="0" height="315" src="https://www.youtube.com/embed/G4XxJuY1ySI" width="560"></iframe></p>
<p>(With inputs from PTI)</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/bloomberg-quint-july-16-2018-people-should-have-right-to-their-data-not-companies-says-trai'>http://editors.cis-india.org/internet-governance/news/bloomberg-quint-july-16-2018-people-should-have-right-to-their-data-not-companies-says-trai</a>
</p>
Admin · Internet Governance · Privacy · 2018-07-29T05:44:51Z · News Item
The crown of thorns that awaits Facebook’s India MD hire
http://editors.cis-india.org/internet-governance/news/factor-daily-sunny-sen-and-jayadevan-pk-july-25-2018-the-crown-of-thorns-that-awaits-facebook-india-md-hire
<b>Between 2015 and 2017, Facebook nearly doubled its user base to about 250 million in India. The two other popular Facebook products, WhatsApp and Instagram, became swimmingly popular in the country, too – the messaging platform counts 200 million users here and the photos and videos sharing app some 60 million.</b>
<p style="text-align: justify; ">The article by Sunny Sen and Jayadevan PK was published by <a class="external-link" href="https://factordaily.com/facebook-india-md-problem/">Factor Daily</a> on July 25, 2018. Sunil Abraham was quoted.</p>
<hr />
<p style="text-align: justify; ">By advertising metrics, such a reach – buttressed by usage through the day – is unprecedented and unrivalled. That should make Facebook India the most powerful advertising platform in the country. And, by corollary, its managing director or CEO among the most powerful executives in India, right?</p>
<p style="text-align: justify; ">Yes, except that no such person exists.</p>
<p style="text-align: justify; ">The corner room position at Facebook India has been unoccupied since October last year despite an extensive search (<a href="https://www.linkedin.com/jobs/search/?currentJobId=628099247&keywords=facebook%20managing%20director" rel="noopener nofollow external noreferrer" target="_blank">even on LinkedIn</a>), a $2-million compensation package, and the immense power that comes with the job.</p>
<p style="text-align: justify; ">Long, winding months of search – there have been extensive meetings with more than half a dozen shortlisted candidates – are yet to culminate in an announcement that will tell the Indian advertising and media world who will lead Facebook in India, the social media giant’s second-largest market by several metrics.</p>
<p style="text-align: justify; ">Why? To put it simply, a yawning trust deficit and the difficulty in fixing it. A deficit that Facebook faces with almost all stakeholders in its ecosystem: users, regulators, advertisers, publishers, and agencies.</p>
<p style="text-align: justify; ">In India, the trust gap with regulators began to form with founder Mark Zuckerberg’s pet Free Basics program of early 2015 that <a href="https://www.theguardian.com/technology/2016/may/12/facebook-free-basics-india-zuckerberg" rel="noopener nofollow external noreferrer" target="_blank">ran afoul</a> of net neutrality principles. India’s telecom regulator <a href="https://www.theregister.co.uk/2016/01/21/facebook_india_free_basics_net_neutrality_dispute_escalates/" rel="noopener nofollow external noreferrer" target="_blank">intervened</a> and the project was ultimately shuttered in February 2016.</p>
<blockquote style="text-align: justify; ">
<p>Facebook tried to change public perception of Free Basics by running multi-million advertising campaigns.</p>
</blockquote>
<p style="text-align: justify; ">Facebook tried to change public perception of Free Basics by running multi-million advertising campaigns – billboards, newspaper advertisements, and the works – but the scepticism and opposition from large swathes of the startup ecosystem, proponents of net neutrality, and many Facebook users did it in. Facebook also has an important case in the Supreme Court from last year, where petitioners have challenged the sharing of data between Facebook, WhatsApp, and third parties. If that was not all, the Cambridge Analytica scandal from early 2018 has all but singed the company’s reputation – its actions in the country have been questioned by the government, with one minister even saying he would <a href="https://www.indiatimes.com/technology/news/it-minister-ravi-shankar-prasad-threatens-zuckerberg-with-court-summons-if-indian-user-data-is-leaked-341928.html" rel="noopener nofollow external noreferrer" target="_blank">subpoena Zuckerberg</a> if needed. The recent spate of lynchings, some traced to rumours that spread on WhatsApp, had the government <a href="https://economictimes.indiatimes.com/tech/software/govt-asks-whatsapp-to-immediately-stop-spread-of-irresponsible-explosive-messages/articleshow/64844025.cms" rel="noopener nofollow external noreferrer" target="_blank">asking the messaging platform</a> what it is doing to stop the killings.</p>
<p style="text-align: justify; ">Facebook’s troubles with publishers are well documented. First, it was accused of promoting clickbaity content that pushed people to spend more and more time on the platform. After Facebook changed <a href="https://www.vox.com/2018/1/12/16882536/facebook-news-feed-changes" rel="noopener nofollow external noreferrer" target="_blank">news feed algorithms</a> to show more friends-and-family content and less news, publishers who had dived headlong into the Facebook ecosystem felt jilted. “Media companies are not making much money from Facebook. DB Corp has said that it is not getting enough revenue from social media so it is taking its content off the platforms… it will try to drive traffic directly to its own websites,” said Abneesh Roy, senior vice president at Edelweiss Capital, a Mumbai investment bank.</p>
<blockquote style="text-align: justify; ">
<p>“Media companies are not making much money from Facebook. DB Corp has said that it is not getting enough revenue from social media so it is taking its content off the platforms”</p>
</blockquote>
<p style="text-align: justify; ">Agencies, who often play a cosy role mediating between the buyers of advertisement space or time and the sellers, don’t like digital platforms such as Facebook and Google because both ultimately aim to disintermediate agencies through a set of self-service tools. The suspicion is rooted in commissions that are squeezed by the digital platforms: while print, TV and other media platforms pay a generous 15% or more commission on ad billings, agencies receive only 2% to 4% from Facebook and 8% to 10% from Google. The digital platforms get away – or, at least, have gotten away so far – thanks to the scale and low costs they operate at.</p>
<p style="text-align: justify; ">Overall, all this makes Facebook look more ogreish than it – and, importantly, its people – may be in real life. But American writer Terry Goodkind’s line, “Reality is irrelevant; perception is everything”, holds true more than ever in the times we live in, and public perception is hurting the company in India. At least a dozen people, both from within, close to and around the company, have told FactorDaily that while user metrics continue to grow strongly in India, especially on the back of an upsurge of data use in India in the last two years (<a href="https://factordaily.com/reliance-jio-profit-and-returns/" rel="noopener" target="_blank">thanks to Reliance Jio</a>), Facebook India is a little at sea. “Facebook needs a face like Rajan Anandan is for Google,” is how one person with close knowledge of the situation put it. Anandan is vice president, South East Asia and India for Google, and is the company’s face in this part of the world.</p>
<p style="text-align: justify; ">Facebook did not respond to a request mailed for comments.</p>
<h2 style="text-align: justify; ">Hotshot names all but…</h2>
<p style="text-align: justify; ">Facebook is said to have interviewed – a few of these conversations continue – some of the top names from the India corporate landscape for its India CEO position: Star India MD Sanjay Gupta; Ajit Mohan, CEO, Hotstar; Sameer Nair, CEO, Applause Entertainment, part of the Aditya Birla Group; D Shivakumar, group president, strategy at the Aditya Birla Group; Tata Sky MD Harit Nagpal; Sudhanshu Vats, Viacom18 group CEO; and Sudhir Sitapati, executive director-refreshments at Hindustan Unilever. The hiring conversations even <a href="https://timesofindia.indiatimes.com/business/india-business/3-sr-execs-bureaucrat-in-race-for-fb-india-top-job/articleshow/64361545.cms" rel="noopener nofollow external noreferrer" target="_blank">covered Srivatsa Krishna</a>, an Indian Administrative Service officer who was the Karnataka IT secretary until last year.</p>
<p style="text-align: justify; ">Some of these people confirmed to FactorDaily they had been reached out to by Facebook and the headhunter Spencer Stuart it has engaged for the task, one denied it, and others didn’t respond to requests for comment.</p>
<p style="text-align: justify; ">Mohan and Nair have an edge, according to a hiring firm source and one of the other candidates. “We have heard quite a few names but it seems that Ajit Mohan is a front-runner. He has successfully built Hotstar,” a Facebook insider told FactorDaily, on the condition of anonymity because he is not authorised to speak with the media.</p>
<p style="text-align: justify; ">A person with knowledge of the job position said that Facebook was gravitating towards someone with experience in the media industry. “They believe that they are in the content game and want to build that cachet,” the person said, describing his conversations with David Fischer, Facebook’s vice president of business and marketing partnerships, who is leading the CEO search.</p>
<p style="text-align: justify; ">More details were not immediately available on what Facebook wants in a person for the role. “I’m sorry but Spencer Stuart is under confidentiality agreements and may not talk about its work,” a spokesperson for the headhunter said on email.</p>
<p style="text-align: justify; ">Facebook’s India leadership crisis, ironically, comes from its stupendous success in the country. India was more a development outpost for the social media giant when it started here in 2010 with a centre in Hyderabad. Kirthiga Reddy, its first Indian employee, transitioned into a market-facing India managing director role when Facebook saw its user base here explode a couple of years later. “She did a great job with setting the foundations of relationships with the big advertisers and agencies here,” said the person with knowledge of the open CEO position quoted earlier. Her successor Umang Bedi, too, was into a sales-heavy role with demand for ad inventory going through the roof at Facebook India.</p>
<p style="text-align: justify; ">But, with its growing presence – the company closed calendar 2017 with $700 million in sales, including spots bought by small businesses by swiping a credit card, which typically get registered outside India – Facebook seems to have acknowledged that the role of the India managing director now has to change. When Reddy’s successor Bedi was the managing director, India, he reported to Dan Neary, vice president for Asia Pacific at Facebook. Neary’s boss was Carolyn Everson, vice president, global marketing solutions at Facebook, who, in turn, reported to Fischer.</p>
<p style="text-align: justify; ">“For David, India is a big thing. Sheryl (Sandberg) brought him from Google… He understands India well,” said a second source close to Facebook. Sources say Facebook is thinking of having the India MD report directly to Fischer, cutting two layers from the hierarchy.</p>
<p style="text-align: justify; ">“You need a grown-up to lead the market. The kind of role (of a sales head) didn’t help anymore,” said a third source, close to Facebook. “It was like a merry-go-round, especially with the kind of problems (Facebook) India was facing from FreeBasics to fake news.”</p>
<h2 style="text-align: justify; ">The missing hand at the wheel</h2>
<p style="text-align: justify; ">Without a country head, Facebook India is missing on a lot of things. Like any other country head, the role of the new India head will be that of an ambassador at Facebook’s headquarters in Menlo Park, California. A map-tap approach of a leader achieving numbers isn’t enough. “It is very bad for FB or any company to go headless in a rapidly growing market like India,” said Kavil Ramachandran, Thomas Schmidheiny chair professor of family business and wealth management, Indian School of Business.</p>
<p style="text-align: justify; ">The leader will not only have to lobby for investments but also show that India is not a problem child. The company will have to have a growth story of every app and every product that gets rolled out in India. “Why shouldn’t there be a product coming out of India to fight fake news and why does everything have to go up to Dublin,” the third source said. Dublin is where Facebook does a lot of its development work in Europe.</p>
<p style="text-align: justify; ">Apart from Facebook Lite, there is no other product that is aimed at the Indian user. Google, in contrast, offers a slew of them like YouTube Go and Google Tez and projects such as Google Wifi or Internet Saathis – all initiatives rooted or aimed at India. Even Apple, with all its premium swag, is looking at India to build maps and brought out the iPhone SE to stay relevant among Indian buyers.</p>
<p style="text-align: justify; ">Ramachandran helps put the difficulty of finding someone to fill Facebook’s India MD position – Bedi announced his resignation last October – in context. “Typically, this happens when the job is not attractive for various reasons. In the case of FB, it can’t be money. Then what? Most likely, potential legal implications of any action that may not be under the control of the country head. If the head office does something and the company is breaching the country’s law, the local head will be liable or potentially so. (Cambridge) Analytica is a case in point,” he said.</p>
<blockquote style="text-align: justify; ">
<p>“Headquarters has a lot to learn from the India team in terms of sophistication and honesty in the regulatory debate. The Californian ideology has run its course.”</p>
</blockquote>
<p style="text-align: justify; ">Then, there is the question of building trust in a sullied platform. “Basically Facebook has lost consumer trust over the years because they don’t consistently tell the truth, the whole truth and nothing but the truth. Headquarters has a lot to learn from the India team in terms of sophistication and honesty in the regulatory debate. The Californian ideology has run its course,” said Sunil Abraham, executive director of Bengaluru-based Centre for Internet and Society. The California reference is to the brazen manner in which San Francisco-based platforms have grown unmindful of the law and societal norms at times.</p>
<p style="text-align: justify; ">At the end of the day, Facebook is valuable to customers as it is able to tell brands what customers want and thus help target ads. The internal thinking, some of which finds some takers in the advertising fraternity, is that Facebook has headroom in sales growth waiting to be grabbed. They point to Google’s India revenues of over $1 billion or nearly Rs 6,900 crore, and projections for the Indian <a href="https://www.livemint.com/Consumer/Q4SsRrOP5IpIeFsDTsXkmK/Digital-ad-industry-to-grow-32-to-touch-Rs18986-crore-by-2.html" rel="noopener nofollow external noreferrer" target="_blank">digital ad market</a> of some Rs 19,000 crore by 2020. The real value of the Indian digital ad market is actually a lot more: the estimates understate what is actually made because many companies register their <a href="https://economictimes.indiatimes.com/tech/internet/itat-says-google-india-should-pay-tax-on-advertisement-revenue-sent-to-parent/articleshow/64177638.cms" rel="noopener nofollow external noreferrer" target="_blank">ad revenue in tax havens</a> to lower the incidence of tax on them.</p>
<p style="text-align: justify; ">But signing on potential revenues is easier said than done. “In the past one year, our digital ad spend has grown five times. Almost two-thirds of that increased spending has gone to Google,” said a marketing executive with a large two-wheeler company, hinting that Facebook has lost at least a large portion of the incremental revenue. He did not want his name taken in this story because the company doesn’t disclose how it splits its ad spends.</p>
<p style="text-align: justify; ">The marketing head of a leading carmaker said that Facebook is very good when it comes to narrowly targeting people but search-based advertising is still big in India. Many of his company’s dealers prefer campaigns on Google and “that is why a large portion of digital revenue is being cornered by Google,” this executive said.</p>
<p style="text-align: justify; ">The CEO of a consumer durables company said being on Facebook was “unsexy” now. “There has been so much of trust issues with Facebook that I don’t want my product to be seen there so often… I have scaled down on my Facebook budget,” the CEO said without sharing more details.</p>
<p style="text-align: justify; ">An image makeover, then, will be the new India MD’s biggest task and global bosses don’t want it lost in the hierarchical process that most MNCs operate in. The bosses want someone who can take India from $500 million to $5 billion. Fast.</p>
<p style="text-align: justify; ">Preparing an organisation for that kind of growth means resourcing it with people who have handled scale in the past or have the potential to do so. Take the example of Nokia – now gone and buried as a brand but 10 years ago, it was India’s biggest MNC. When Shivakumar, now with Aditya Birla Group, was hired as its India managing director in 2006, Nokia had understood the potential that the country offered. The goal was to grow operations of half a billion dollars manifold. Nokia India became a company with $4 billion in sales in the 2008-2009 period. One way to assess that performance is to check where the team that delivered the vision is today. Vipul Sabharwal, whose five-year stint with Nokia ended in 2011 as sales director is now managing director of Luminous Power. V Ramnath, who also left Nokia as its sales director in 2013 is managing director, Racold Thermo. Vineet Taneja, head of marketing at Nokia when he is left in 2010, is now CEO of Dyson in India after stints in between at Bharti Airtel and Samsung India. Poonam Kaul, former director of communications at Nokia, is director of marketing at Apple India now.</p>
<p style="text-align: justify; ">Large operations need capable people and Facebook is missing its go-to person in India badly. This is evident in its ask of the CEO candidate here and the changes it is willing to put in place. Gurprriet Siingh, senior client partner with headhunter Korn Ferry, said that there are three reasons why the India head role has been moved closer to the US: to speed up decision-making, to signal the importance of India, and to give context to the individual of what is expected. “A managing director’s role is to manage investors, customers, sales, regulators and government relations,” Siingh added.</p>
<p style="text-align: justify; ">With great powers come great responsibilities. That line, immortalised in Spiderman movies, will be playing on the minds of the person who signs up for the Facebook India job. With one tweak: “With great powers come great responsibilities. And, a lot to do.”</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/factor-daily-sunny-sen-and-jayadevan-pk-july-25-2018-the-crown-of-thorns-that-awaits-facebook-india-md-hire'>http://editors.cis-india.org/internet-governance/news/factor-daily-sunny-sen-and-jayadevan-pk-july-25-2018-the-crown-of-thorns-that-awaits-facebook-india-md-hire</a>
</p>
Bit by byte protecting her privacy
http://editors.cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy
<b>The Srikrishna committee draft law on data protection is days away. Here’s a bucket list of issues that will matter</b>
<p style="text-align: justify; ">The article by Mihir Dalal and Anirban Sen was published in <a class="external-link" href="https://www.livemint.com/Politics/qZg7qJoXhHIwnyLUYVsaxL/Bit-by-byte-protecting-her-privacy.html">Livemint</a> on July 26, 2018. Amber Sinha was quoted.</p>
<hr />
<p style="text-align: justify; ">In an era dominated by “free” platforms such as Google, Facebook and Amazon, among others, data privacy had largely been considered an academic matter. However, in the past one year that notion has changed forever, bringing data privacy to the fore, as one of the defining issues of the internet, both in India and abroad.</p>
<p style="text-align: justify; ">Last August, the Supreme Court ruled that privacy was a fundamental right under the Constitution of India. Concomitantly, the debate over Aadhaar and its potential misuse picked up steam on the back of reports about data breaches in the biometric ID system though these reports were denied by the Unique Identification Authority of India, which built Aadhaar. (The apex Court will deliver its verdict on petitions that have challenged the constitutional validity of Aadhaar and its legal framework)</p>
<p style="text-align: justify; ">Globally, Facebook came under severe criticism after it was revealed that the social media giant had compromised user data in the run up to the US elections. Finally, in May, Europe introduced its landmark data privacy law, General Data Protection Regulation (GDPR), which has put users in control of their data through various measures.</p>
<p style="text-align: justify; ">The stage is now set for the much-delayed draft law on data protection, which is expected to be submitted soon by the 10-member panel headed by former Supreme Court justice B.N. Srikrishna.</p>
<p style="text-align: justify; ">The committee, which had been set up last July, has attracted criticism from some quarters. Earlier this month, more than 150 lawyers, activists and journalists, among others, wrote to the Srikrishna committee, complaining about the lack of transparency in its process, the lack of diversity in the views held by members of the committee, besides other issues. In an earlier letter in November last, activists, lawyers and others had alleged that too many members of the committee held pro-Aadhaar views. Some experts believe that the mandate of the committee was flawed to begin with. “Given that personal information is omnipresent in so many different sectors, it is better to have a light touch legislation that deals mostly with key principles of data privacy and empowers a data commissioner to frame more detailed regulations,” said Stephen Mathias, partner, Kochhar and Co.</p>
<p style="text-align: justify; ">Last week, the Telecom Regulatory Authority of India (Trai) released a set of recommendations on data privacy that favour giving users control of their data and personal information, while severely restricting the ways in which telecom and internet companies can use customer data. Here are the major issues to watch out for in the draft data protection law.</p>
<p class="orangeXh" style="text-align: justify; "><b>Users vs. collectors </b></p>
<p style="text-align: justify; ">This broad umbrella includes mandatory consent of users for data collection, data portability, the right to be forgotten and the right to erasure. Last week, Trai gave its recommendations on some of these issues in what were considered pro-privacy and progressive suggestions. Those recommendations tracked GDPR measures. The Srikrishna committee is also expected to suggest pro-privacy measures, though the details will be all-important. The committee is also expected to define what is ‘sensitive’ or ‘critical’ data. “In India, government agencies, private entities and others collect various forms of data on individuals,” said Chetan Nagendra, partner, AZB Partners. “The committee will have to clarify what category of data is allowed to be collected and whether this should this be standardized across different entities. It will also have to standardize rules on how long is it okay to store such user-collected data.”</p>
<p style="text-align: justify; ">The flip side of user rights is the role of data repositories that collect and process user data. The committee will be required to clarify what data firms and government agencies can gather on users and what will be their responsibilities toward the usage of that data. This includes the principle of privacy by design, that is, companies must ensure by default that their platforms are designed to protect rather than exploit user data and privacy.</p>
<p style="text-align: justify; ">IndusLaw partner Namita Viswanath said that in terms of data repositories, there was a need to distinguish between a data controller and a data processor. A data controller is the user-facing platform that gathers data, whereas a data processor is often a third-party firm that provides infrastructure for the platform. “Responsibilities of user personal data should be shared between a data controller and processor. The nature and extent of liability should depend on the nature of data, the party responsible for handling data and the measures adopted, but ultimately, the data controller should most responsibility,” Viswanath said.</p>
<p class="orangeXh" style="text-align: justify; "><b>Regulation vs. Self-control</b></p>
<p style="text-align: justify; ">Given that data is such a broad-ranging topic, the Srikrishna committee will be expected to recommend who should have oversight of data-related matters. Will there be a new data protection authority? If so, what will be its scope, given that regulators, such as the RBI, Sebi and Trai, will all be affected by a privacy framework in their respective areas? And what will be the punitive measures and fines for offenders on data matters?</p>
<p style="text-align: justify; ">Some experts said the government should appoint a data protection authority. As the recent travails at Facebook show, relying solely on self-regulation of internet platforms, is a disastrous policy. But it’s unlikely that the entire burden of regulation will fall on one authority.</p>
<p style="text-align: justify; ">“Logistical problems are likely, especially in the early days, with having a top-down regulatory approach,” said Kriti Trehan, partner, Panag and Babu. “The process of training, requirement of funding and access to skilled human resources will necessitate organisational and administrative inputs. With this in mind, I believe that a co-regulatory framework for data protection will be efficient. With this approach, established parameters may guide escalation in specific instances.”</p>
<p class="orangeXh" style="text-align: justify; "><b>Data localisation </b></p>
<p style="text-align: justify; ">In April, the RBI had issued norms on the storage of payments system data, which requires digital payment providers to store data in India. That has sparked another debate over the possible stance of the Srikrishna committee. Many start-ups and firms use data servers located in overseas locations because of several reasons, including economies of scale and tax planning. “Data protection should not be confused with data access,” said Kartik Maheshwari, leader, Nishith Desai Associates. “For instance, if a firm is storing user data abroad, that should be fine as long as it is secure and access in India is provided, whenever required. Storing data locally is not necessarily the best solution from the perspective of data security as better infrastructure may be available abroad. However, the government may, in exceptional cases of sensitivity, legitimately require local storage of very narrowly defined streams of data.”</p>
<p class="orangeXh" style="text-align: justify; "><b>Surveillance is key</b></p>
<p style="text-align: justify; ">The law will also need to clearly define the contours of the contentious issue of surveillance and how to ensure that India does not end up replicating the policies in place in countries such as China, which are notorious for mass surveillance practices. Surveillance that has been legally sanctioned is part of the exceptions to regular privacy practices. The committee will have to define the parameters of these exceptions. In the case of surveillance, some experts, including Amber Sinha of Centre for Internet and Society, said that while it needs to be allowed in specific instances such as issues related to national security, a judicial system needs to be in place to protect the rights of the parties that are being put under surveillance. This, in many ways, is the heart of a very important matter.</p>
<p class="orangeXh" style="text-align: justify; "><b>The Aadhaar factor</b></p>
<p style="text-align: justify; ">The most hot-button of all issues for the committee is, of course, Aadhaar. Former UIDAI chairman Nandan Nilekani told <i>Mint </i>this week that “if something needs to be modified in the Aadhaar law, it will be done” by the Srikrishna committee. The changes that the committee will suggest to the Aadhaar law will go a long way in determining whether its draft law is truly pro-privacy.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy'>http://editors.cis-india.org/internet-governance/news/livemint-july-26-2018-mihir-dalal-and-anirban-sen-byte-by-byte-protecting-her-privacy</a>
</p>