The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 1 to 15.
Clause 12 Of The Data Protection Bill And Digital Healthcare: A Case Study
http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study
<b>In light of the state’s emerging digital healthcare apparatus, how does Clause 12 alter the consent and purpose limitation model?</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-digital-healthcare-case-study/">published in Medianama</a> on February 21, 2022. This is the second in a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In the <a href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">previous post</a>, I looked at provisions on non-consensual data processing for state functions under the most recent version of recommendations by the Joint Parliamentary Committee on India’s Data Protection Bill (DPB). The true impact of these provisions can only be appreciated in light of ongoing policy developments and real-life implications.</p>
<p style="text-align: justify; ">To appreciate the significance of the dilutions in Clause 12, let us consider the Indian state’s range of schemes promoting digital healthcare. In July 2018, NITI Aayog, a central government policy think tank in India, released a strategy and approach paper (Strategy Paper) on the formulation of the National Health Stack, which envisions the creation of a federated, application programming interface (API)-enabled health information ecosystem. While the Ministry of Health and Family Welfare has focused in recent years on creating Electronic Health Records (EHR) Standards for India, and has also identified a contractor for a centralised health information platform (IHIP), the Strategy Paper advocates a completely different approach, described as a Personal Health Records (PHR) framework. In 2021, the National Digital Health Mission (NDHM) was launched, under which a citizen has the option to obtain a digital health ID, a unique identifier that will carry all of a person’s health records.</p>
<h2 style="text-align: justify; ">A Stack Model for Big Data Ecosystem in Healthcare</h2>
<p style="text-align: justify; ">A stack model, as envisaged in the Strategy Paper, consists of several layers of open APIs connected to each other, often tied together by a unique health identifier. The open nature of the APIs allows public and private actors to build solutions on top of them that are interoperable with all parts of the stack. It is, however, worth examining both this ‘openness’ and the role that the state plays in it.</p>
<p style="text-align: justify; ">Even though the APIs are themselves open, they are part of a pre-decided technological paradigm, built by private actors and blessed by the state. Innovators can build on it, but the options available to them are limited by the information architecture created by the stack model. When such a technological paradigm is created for healthcare reform and health data, the stack model poses additional challenges. By tying the stack to a unique identity without appropriate processes in place for access control, siloed information, and encrypted communication, the model raises tremendous privacy and security concerns. The broad language of Clause 12 of the DPB needs to be read in this context.</p>
<p>Clause 12 allows non-consensual processing of personal data where it is necessary “for the performance of any function of the state authorised by law” in order to provide a service or benefit from the State. In the previous post, I highlighted the significance of relying on ‘necessity’ alone, to the exclusion of ‘proportionality’. Now, we need to consider this in light of the emerging digital healthcare apparatus being created by the state.</p>
<p style="text-align: justify; ">The National Health Stack and the National Digital Health Mission together envision an intricate system of data collection and exchange which, in a regulatory vacuum, would ensure unfettered access to sensitive healthcare data for both the state and private actors registered with the platforms. The Stack framework relies on repositories whose data may be accessed from multiple nodes within the system. Importantly, the Strategy Paper also envisions health data fiduciaries that facilitate consent-driven interaction between entities that generate health data and entities that want to consume health records to deliver services to the individual. The cast of characters involves the National Health Authority; healthcare providers and insurers who access the National Health Electronic Registries, which unify data from programmes such as the National Health Resource Repository (NHRR), the NIN database, NIC, and the Registry of Hospitals in Network of Insurance (ROHINI); and private actors such as Swasth and iSpirt, who assist the Mission as volunteers. The currency that government and private actors are interested in is data.</p>
<p style="text-align: justify; ">The promised benefits of healthcare data in anonymised and aggregate form range from Disease Surveillance and Pharmacovigilance to Health Schemes Management Systems and Nutrition Management, benefits that have only been more acutely emphasised during the pandemic. However, the pandemic has also normalised the sharing of sensitive healthcare data with a variety of actors, with little thought given to much-needed data minimisation practices.</p>
<p style="text-align: justify; ">The potential misuses of healthcare data include greater state surveillance and control, and predatory and discriminatory practices by private actors, who can rely on Clause 12 to do away with even the pretense of informed consent so long as the processing of data is deemed necessary by the state and its private sector partners to provide any service or benefit.</p>
<p style="text-align: justify; ">Subclause (e) in Clause 12, which was added in the last version of the Bill drafted by MeitY and has been retained by the JPC, allows processing wherever it is necessary for ‘any measures’ to provide medical treatment or health services during an epidemic, outbreak, or other threat to public health. Yet again, the overly broad language used here is designed to ensure that the annoyances of informed consent can be easily brushed aside wherever the state intends to take any measures under any scheme related to public health.</p>
<p style="text-align: justify; ">Effectively, how does the framework under Clause 12 alter the consent and purpose limitation model? Data protection laws introduce an element of control by tying purpose limitation to consent. Individuals consent to specified purposes, and data processors are required to respect that choice. Where there is no consent, the purposes of data processing are sought to be limited by the necessity principle in Clause 12. The state (or authorised parties) must be able to demonstrate that processing is necessary to the exercise of a state function, and data must only be processed for those purposes which flow from this necessity. However, unlike the consent model, this provides an opportunity to keep reinventing purposes for different state functions.</p>
<p style="text-align: justify; ">In the absence of a data protection law, data collected by one agency is shared indiscriminately with other agencies and used for multiple purposes beyond the purpose for which it was collected. The consent and purpose limitation model would have addressed this issue. But, by having a low threshold for non-consensual processing under Clause 12, this form of data processing is effectively being legitimised.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study'>http://editors.cis-india.org/internet-governance/blog/medianama-february-21-2022-amber-sinha-data-protection-bill-digital-healthcare-case-study</a>
</p>
No publisher · amber · Data Governance · Internet Governance · Data Protection · Privacy · 2022-03-01T15:07:44Z · Blog Entry
How Function Of State May Limit Informed Consent: Examining Clause 12 Of The Data Protection Bill
http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function
<b>The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state.</b>
<p>The blog post was <a class="external-link" href="https://www.medianama.com/2022/02/223-data-protection-bill-consent-clause-state-function/">published in Medianama</a> on February 18, 2022. This is the first of a two-part series by Amber Sinha.</p>
<hr />
<p style="text-align: justify; ">In 2018, hours after the Committee of Experts led by Justice Srikrishna released its report and draft bill, I wrote <a href="https://www.livemint.com/Opinion/zY8NPWoWWZw8AfI5JQhjmL/Draft-privacy-bill-and-its-loopholes.html">an opinion piece</a> providing my quick take on what was good and bad about the bill. A section of my analysis focused on Clause 12 (then Clause 13), which provides for non-consensual processing of personal data for state functions. I called this provision a ‘carte blanche’ which effectively allowed the state to process a citizen’s data for practically all interactions between them without having to deal with the inconvenience of seeking consent. My former colleague, Pranesh Prakash, <a href="https://twitter.com/pranesh/status/1023116679440621568">pointed out</a> that this was not a correct interpretation of the provision, as I had missed the significance of the word ‘necessary’, which was inserted to act as a check on the powers of the state. He also pointed out, correctly, that in its construction this provision is equivalent to the position in the European General Data Protection Regulation (Article 6(1)(e)), and is perhaps even more restrictive.</p>
<p style="text-align: justify; ">While I agree with what Pranesh says above (his claims are largely factual, and there can be no basis for disagreement), my view of Clause 12 has not changed. While Clause 35 has been a focus of considerable discourse and analysis, for good reason, I continue to believe that Clause 12 remains among the most dangerous provisions of this bill, and I will try to unpack why here.</p>
<p style="text-align: justify; ">The Data Protection Bill 2021 has a chapter on the grounds for processing personal data, and one of those grounds is consent by the individual. The rest of the grounds deal with various situations in which personal data can be processed without seeking consent from the individual. Clause 12 lays down one of the grounds. It allows the state to process data without the consent of the individual in the following cases —</p>
<p>a) where it is necessary to respond to a medical emergency;<br />b) where it is necessary for the state to provide a service or benefit to the individual;<br />c) where it is necessary for the state to issue any certification, licence, or permit;<br />d) where it is necessary under any central or state legislation, or to comply with a judicial order;<br />e) where it is necessary for any measures during an epidemic, outbreak, or other threat to public health;<br />f) where it is necessary for safety procedures during a disaster or a breakdown of public order.</p>
<p>In order to carry out (b) and (c), there is also the added requirement that the state function must be authorised by law.</p>
<h2>Twin restrictions in Clause 12</h2>
<p style="text-align: justify; ">The use of the words ‘necessary’ and ‘authorised by law’ is intended to pose checks on the powers of the state. The first restriction seeks to limit actions to only those cases where the processing of personal data would be necessary for the exercise of the state function. This should mean that if the state function can be exercised without non-consensual processing of personal data, then it must be. Therefore, while acting under this provision, the state should only process my data if it needs to do so to provide me with the service or benefit. The second restriction means that this ground applies only to those state functions which are authorised by law, meaning only those functions which are supported by validly enacted legislation.</p>
<p style="text-align: justify; ">What we need to keep in mind regarding Clause 12 is that the requirement of ‘authorised by law’ does not mean that legislation must provide for that specific kind of data processing. It simply means that the larger state function must have legal backing. The danger lies in how these provisions may be combined with broad mandates. If the activity in question is non-consensual collection and processing of, say, demographic data of citizens to create state resident hubs which will assist in the provision of services such as healthcare, housing, and other welfare functions, then all that may be required is that those welfare functions are authorised by law.</p>
<h2 style="text-align: justify; ">Scope of privacy under Puttaswamy</h2>
<p style="text-align: justify; ">It would be worthwhile, at this point, to delve into the nature of the restrictions that the state can impose on privacy, as discussed in the landmark Puttaswamy judgement. The judgement clearly identifies the principles of informed consent and purpose limitation as central to informational privacy. As discussed repeatedly during the course of the hearings and in the judgement, privacy, like any other fundamental right, is not absolute. However, restrictions on the right must be reasonable in nature. In the case of Clause 12, the restriction on privacy in the form of denial of informed consent needs to be tested against a constitutional standard. In Puttaswamy, the bench was not required to provide a legal test to determine the extent and scope of the right to privacy, but it does provide sufficient guidance for us to contemplate how the limits and scope of the constitutional right to privacy could be determined in future cases.</p>
<p style="text-align: justify; ">The Puttaswamy judgement clearly states that “the right to privacy is protected as an intrinsic part of the right to life and personal liberty under Article 21 and as a part of the freedoms guaranteed by Part III of the Constitution.” By locating the right not just in Article 21 but also in the entirety of Part III, the bench clearly requires that “the drill of various Articles to which the right relates must be scrupulously followed.” This means that where transgressions on privacy relate to different provisions in Part III, the different tests under those provisions will apply along with those in Article 21. For instance, where the restrictions relate to personal freedoms, the tests under both Article 19 (right to freedoms) and Article 21 (right to life and liberty) will apply.</p>
<p style="text-align: justify; ">In the case of Clause 12, the three tests laid down by Justice Chandrachud are most operative —<br />a) the existence of a “law”<br />b) a “legitimate State interest”<br />c) the requirement of “proportionality”.</p>
<p style="text-align: justify; ">The first test is already reflected in the use of the phrase ‘authorised by law’ in Clause 12. The test under Article 21 would imply that the function of the state should not merely be authorised by law, but that the law, in both its substance and procedure, must be ‘fair, just and reasonable.’ The next test is that of ‘legitimate state interest’. In its report, the Joint Parliamentary Committee places emphasis on Justice Chandrachud’s use of “allocation of resources for human development” in an illustrative list of legitimate state interests. The report claims that the ground of state functions thus satisfies the test of legitimate state interest. I do not dispute this claim.</p>
<h2 style="text-align: justify; ">Proportionality and Clause 12</h2>
<p style="text-align: justify; ">It is the final test of ‘proportionality’ articulated by the Puttaswamy judgement which is most operative in this context. Unlike Clauses 42 and 43, which include the twin tests of necessity and proportionality, the committee has chosen to employ only one of them, necessity, in Clause 12. Proportionality is a commonly employed ground in European jurisprudence and in common law countries such as Canada and South Africa, and it is also an integral part of Indian jurisprudence. As commonly understood, the proportionality test consists of three parts —</p>
<p>a) the limiting measures must be carefully designed, or rationally connected, to the objective<br />b) they must impair the right as little as possible<br />c) the effects of the limiting measures must not be so severe on individual or group rights that the legitimate state interest, albeit important, is outweighed by the abridgement of rights.</p>
<p style="text-align: justify; ">The first test is similar to the test of proximity under Article 19. The test of ‘necessity’ in Clause 12 must be viewed in this context. It must be remembered that the test of necessity is not limited only to situations where it may not be possible to obtain consent while providing benefits. My reservations about the sufficiency of this standard stem from observations made in the report, as well as the relatively small body of jurisprudence on this term in Indian law.</p>
<p style="text-align: justify; ">The Srikrishna Report interestingly mentions three kinds of scenarios where consent should not be required — where it is not appropriate, necessary, or relevant for processing. The report goes on to give an example of inappropriateness: in cases where data is being gathered to provide welfare services, there is an imbalance in power between the citizen and the state. Having made that observation, the committee inexplicably arrives at the conclusion that the response to this problem is to further erode the power available to citizens by removing the need for consent altogether under Clause 12. There is limited jurisprudence on the standard of ‘necessity’ under Indian law. The Supreme Court has articulated this test as ‘having reasonable relation to the object the legislation has in view.’ If we look elsewhere for guidance on how to read ‘necessity’, the European Court of Human Rights in Handyside v United Kingdom held it to be neither “synonymous with indispensable” nor to have the “flexibility of such expressions as admissible, ordinary, useful, reasonable or desirable.” In short, there must be a pressing social need to satisfy this ground.</p>
<p style="text-align: justify; ">However, the other two tests of proportionality do not find a mention in Clause 12 at all. There is no requirement of ‘narrow tailoring’, that the scope of non-consensual processing must impair the right as little as possible. It is doubly unfortunate that this test does not find a place, as unlike necessity, ‘narrow tailoring’ is a test well understood in Indian law. This means that while there is a requirement to show that processing personal data was necessary to provide a service or benefit, there is no requirement to process data in a way that minimises non-consensual processing. The fear is that as long as there is a reasonable relation between the processing of data and the object of the state function, state authorities and other bodies authorised by them do not need to bother with obtaining consent.</p>
<p style="text-align: justify; ">Similarly, the third test of proportionality is also not represented in this provision. That test weighs the abridgement of individual rights against the legitimate state interest in question, and requires that the former must not outweigh the latter. The absence of this test leaves Clause 12 devoid of any such consideration. Therefore, as long as the test of necessity is met under this law, the state need not weigh the denial of consent against the service or benefit being provided.</p>
<p style="text-align: justify; ">The collective implication of leaving out ‘proportionality’ from Clause 12 is to provide very wide discretionary powers to the state, by setting the threshold to circumvent informed consent extremely low. In the next post, I will demonstrate the ease with which Clause 12 can allow indiscriminate data sharing by focusing on the Indian government’s digital healthcare schemes.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function'>http://editors.cis-india.org/internet-governance/blog/medianama-february-18-2021-amber-sinha-data-protection-bill-consent-clause-state-function</a>
</p>
No publisher · amber · Data Governance · Internet Governance · Data Protection · Privacy · 2022-03-01T14:56:49Z · Blog Entry
Beyond Public Squares, Dumb Conduits, and Gatekeepers: The Need for a New Legal Metaphor for Social Media
http://editors.cis-india.org/internet-governance/files/beyond-public-squares-dumb-conduits-and-gatekeepers.pdf
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/files/beyond-public-squares-dumb-conduits-and-gatekeepers.pdf'>http://editors.cis-india.org/internet-governance/files/beyond-public-squares-dumb-conduits-and-gatekeepers.pdf</a>
</p>
No publisher · amber · Internet Governance · 2021-05-31T10:19:33Z · File
Beyond Public Squares, Dumb Conduits, and Gatekeepers: The Need for a New Legal Metaphor for Social Media
http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-beyond-public-squares-dumb-conduits-and-gatekeepers
<b>In the past few years, social networking sites have come to play a central role in intermediating the public’s access to and deliberation of information critical to a thriving democracy. In stark contrast to early utopian visions which imagined that the internet would create a more informed public, facilitate citizen-led engagement, and democratize media, what we see now is the growing association of social media platforms with political polarization and the entrenchment of racism, homophobia, and xenophobia.</b>
<p style="text-align: justify; ">There is a dire need to think of regulatory strategies that look beyond the ‘dumb conduit’ metaphors that justify safe harbor protection for social networking sites. At the same time, it is important to critically analyze the outcomes of regulatory steps so that they do not adversely impact free speech and privacy. By surveying the potential analogies of company towns, common carriers, and editorial functions, this essay provides a blueprint for how we may envision differentiated intermediary liability rules to govern social networking sites in a responsive manner.</p>
<h2>Introduction</h2>
<p style="text-align: justify; ">Only months after Donald Trump’s 2016 election victory — a feat mired in controversy over alleged Russian interference using social media, specifically Facebook — Mark Zuckerberg remarked that his company had grown to serve a role more akin to a government than a corporation. Zuckerberg argued that Facebook was responsible for creating guidelines and rules that governed the exchange of ideas of over two billion people online. Another way to look at the same argument is to acknowledge that, today, a quarter of the world’s population (and of India’s) is subject to the laws of Facebook’s terms and conditions and privacy policies, and public discourse around the globe is shaped within the constraints and conditions they create. Social media platforms like Facebook wield hitherto unimaginable power to catalyze public opinion, causing a particular narrative to gather steam — that Big Tech can pose an existential threat to democracy.</p>
<p style="text-align: justify; "><span>This, of course, is in absolute contrast to the early utopian visions which imagined that the internet would create a more informed public, facilitate citizen-led engagement, and democratize media. Instead, what we see now is the growing association of social media platforms with political polarization and the entrenchment of racism, homophobia, and xenophobia. The regulation of social networking sites has emerged as one of the most important and complex policy problems of this time. In this essay, I will explore the inefficacy of the existing regulatory framework, and provide a blueprint for how to think of appropriate regulatory metaphors to revisit it.</span></p>
<hr />
<ul>
<li><a class="external-link" href="https://itforchange.net/digital-new-deal/2020/11/01/beyond-public-squares-dumb-conduits-and-gatekeepers-the-need-for-a-new-legal-metaphor-for-social-media/">Read the article</a> published by IT for Change</li>
<li><a href="http://editors.cis-india.org/internet-governance/files/beyond-public-squares-dumb-conduits-and-gatekeepers.pdf" class="external-link">Download the PDF</a> (34,328 Kb) to read the full article, pages 126 - 138.</li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-beyond-public-squares-dumb-conduits-and-gatekeepers'>http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-beyond-public-squares-dumb-conduits-and-gatekeepers</a>
</p>
No publisher · amber · Social Media · Internet Governance · 2021-05-31T10:23:36Z · Blog Entry
Regulating Sexist Online Harassment as a Form of Censorship
http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment
<b>This paper is part of a series under IT for Change’s project, Recognize, Resist, Remedy: Combating Sexist Hate Speech Online. The series, titled Rethinking Legal-Institutional Approaches to Sexist Hate Speech in India, aims to create a space for civil society actors to proactively engage in the remaking of online governance, bringing together inputs from legal scholars, practitioners, and activists. The papers reflect upon the issue of online sexism and misogyny, proposing recommendations for appropriate legal-institutional responses. The series is funded by EdelGive Foundation, India and International Development Research Centre, Canada.</b>
<h2>Introduction</h2>
<p style="text-align: justify; ">The proliferation of internet use was expected to facilitate greater online participation of women and <a class="external-link" href="https://ssrn.com/abstract=2039116">other marginalised groups</a>. However, over the past few years, as more and more people have come online, it has become evident that social power in online spaces mirrors offline hierarchies. While identity and security thefts may be universal experiences, women and the LGBTQ+ community, beyond facing structural barriers to access, continue to face barriers to safety that men often do not. Sexist harassment pervades the online experience of women, be it on dating sites, <a class="external-link" href="https://academic.oup.com/bjc/article/57/6/1462/2623986">online forums, or social media</a>.</p>
<p style="text-align: justify; ">In her book, <i><a class="external-link" href="https://yalebooks.yale.edu/book/9780300215120/twitter-and-tear-gas">Twitter and Tear Gas: The Power and Fragility of Networked Protest</a></i>, Zeynep Tufekci argues that the nature and impact of censorship on social media differ sharply from earlier forms. Where censorship was once enacted by restricting speech, it now also works through organised harassment campaigns, which use the qualities of viral outrage to impose a disproportionate cost on the very act of speaking out. Censorship therefore plays out not merely in the removal of speech but through disinformation and hate speech campaigns.</p>
<p style="text-align: justify; ">In most cases, this censorious speech does not necessarily meet the threshold of hate speech, and free speech advocates have traditionally argued for counter speech as the most effective response to such speech acts. However, the structural and organised nature of harassment and extreme speech often renders counter speech ineffective. This paper explores the nature of online sexist hate and extreme speech as a mode of censorship. Online sexualised harassment takes various forms, including doxxing, cyberbullying, stalking, identity theft, and incitement to violence. While some regulatory mechanisms, whether in law or in platform community guidelines, address these acts, this paper argues for the need to evolve a composite framework that examines the impact of such censorious acts on online speech and the regulatory strategies to address them.</p>
<hr />
<p style="text-align: justify; "><a href="http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf/at_download/file" class="external-link">Read the full text</a> [PDF; 495 Kb]</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment'>http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment</a>
</p>
No publisher · amber · Freedom of Speech and Expression · Internet Governance · Censorship · 2021-05-31T09:56:31Z · Blog Entry
Regulating Sexist Online Harassment: A Model of Online Harassment as a Form of Censorship
http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf'>http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf</a>
</p>
No publisher · amber · Freedom of Speech and Expression · Internet Governance · Censorship · 2021-05-31T09:39:14Z · File
Regulating Sexist Online Harassment: A Model of Online Harassment as a Form of Censorship
http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship
<b>Amber Sinha wrote a paper on regulating sexist online harassment, and how online harassment serves as a form of censorship, for the “Recognize, Resist, Remedy: Addressing Gender-Based Hate Speech in the Online Public Sphere” project, a collaborative project between IT for Change, India and InternetLab, Brazil.</b>
<p>Read the full paper <a class="external-link" href="https://itforchange.net/sites/default/files/1883/Amber-Sinha-Rethinking-Legal-Institutional-Approaches-to-Sexist-Hate-Speech-ITfC-IT-for-Change_0.pdf">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship'>http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship</a>
</p>
No publisher · amber · 2021-03-11T04:14:28Z · Blog Entry
Technical Appendix to 'Use of sentiment analysis by law enforcement: An analysis of scrutability for juridical purposes'
http://editors.cis-india.org/internet-governance/technical-appendix-to-use-of-sentiment-analysis-by-law-enforcement-an-analysis-of-scrutability-for-juridical-purposes
<b>This file contains the technical appendix to the paper titled 'Use of sentiment analysis by law enforcement: An analysis of scrutability for juridical purposes' by Dr. Hans Varghese Mathews and Amber Sinha</b>
Publisher: none | Author: amber | Date: 2020-05-03T12:43:05Z | Type: File
Governing ID: Kenya’s Huduma Namba Programme
http://editors.cis-india.org/internet-governance/blog/governing-id-kenya2019s-huduma-namba-programme
<p>In our fourth case-study, we use our Evaluation Framework for Digital ID to examine the use of Digital ID in Kenya.</p>
<p>Read the <a class="external-link" href="https://digitalid.design/evaluation-framework-case-studies/kenya.html">case-study</a> or download as <a href="http://editors.cis-india.org/internet-governance/digital-id-kenya-case-study" class="internal-link" title="Digital ID Kenya Case Study">PDF</a>.</p>
Publisher: none | Author: amber | Tags: internet governance, Internet Governance, Digital ID, Digital Identity | Date: 2020-03-02T13:19:15Z | Type: Blog Entry
The Appropriate Use of Digital Identity
http://editors.cis-india.org/internet-governance/blog/the-appropriate-use-of-digital-identity
<p>As governments across the globe implement new, foundational digital identification systems (“Digital ID”) or modernize existing ID programs, there is a dire need for greater research and discussion about appropriate uses of Digital ID systems. The significant momentum for creating Digital ID in several parts of the world has been accompanied by concerns about the privacy and exclusion harms of state-issued Digital ID systems, resulting in campaigns and litigation in countries such as the UK, India, Kenya, and Jamaica. Given the wide range of considerations required to evaluate Digital ID projects, it is necessary to develop evaluation frameworks that can be used for this purpose.</p>
<p>At RightsCon 2019 in Tunis, we presented <a class="external-link" href="http://bit.ly/CISDigitalIDAppropriateUse">working drafts</a> on the appropriate use of Digital ID by the partner organisations of this <a class="external-link" href="https://www.omidyar.com/blog/appropriate-use-digital-identity-why-we-invested-three-region-research%C2%A0alliance">three-region research alliance</a>: ITS from Brazil, CIPIT from Kenya, and CIS from India.</p>
<p>In the <a class="external-link" href="https://digitalid.design/evaluation-framework-01.html">draft by CIS</a>, we propose a set of principles against which Digital ID may be evaluated. We hope that these draft principles can evolve into a set of best practices that policymakers can use when they create and implement Digital ID systems, guide civil society examinations of Digital ID, and highlight questions for further research on the subject. We have drawn from approaches used in documents such as the Necessary and Proportionate Principles and the OECD privacy guidelines, as well as scholarship on harms-based approaches.</p>
<p>Read and comment on CIS’s Draft framework <a class="external-link" href="https://digitalid.design/evaluation-framework-01.html">here</a>.</p>
<p>Download Working drafts by CIPIT, CIS, and ITS <a class="external-link" href="http://bit.ly/CISDigitalIDAppropriateUse">here</a>.</p>
<p> </p>
Publisher: none | Author: amber | Tags: Digital ID, Privacy, Internet Governance, Appropriate Use of Digital ID, Digital Identity | Date: 2019-08-08T10:24:40Z | Type: Blog Entry
Call for Design Interns
http://editors.cis-india.org/jobs/call-for-design-interns-201906
<b>CIS is seeking graphic design interns to create communication material (information and data visualizations, publication layouts, presentations, etc.) for our projects. The intern will assist our researchers in presenting their research in accessible and easy-to-understand forms, as well as design social media collaterals. They will be working with a multi-disciplinary team across two cities, and be supervised by a designer.</b>
<p> </p>
<h4>Who can apply?</h4>
<p>Students of design or recent design graduates, who are available to work full-time for at least a month, and have experience in editorial design and creating data visualizations. Others who can demonstrate similar skills and aptitude are also welcome to apply. Applicants with an interest in digital technology research would be preferred.</p>
<p>Our work is strengthened by the diversity in background, culture, experience, religion, caste, sexual orientation, gender, gender identity, race, ethnicity, age and disability. We welcome applications from candidates belonging to marginalised communities.</p>
<h4>Skills</h4>
<ul>
<li>Comfortable working with Adobe InDesign, Illustrator, and Photoshop,</li>
<li>Comfortable working with Google Docs and Slides, and</li>
<li>Knowledge of HTML/CSS will be preferred.</li></ul>
<h4>Duration of the internship</h4>
<p>1 – 2 months</p>
<h4>Location</h4>
<p>Bangalore or New Delhi</p>
<h4>Remuneration</h4>
<p>A modest stipend will be paid.</p>
<h4>How to apply?</h4>
<p>To apply, please send –</p>
<ul>
<li>Resumé,</li>
<li>Relevant work samples (less than 5MB), and</li>
<li>Link to online portfolio, if any.</li></ul>
<p>Applications should be sent to Saumyaa Naidu (saumyaa [at] cis-india.org) and Karan Saini (karan [at] cis-india.org) by <strong>June 28, 2019</strong>.</p>
<h4>Organisational policies</h4>
<p>All interns working at CIS must read and abide by CIS' <a href="https://cis-india.org/about/policies" target="_blank">organisational policies</a>.</p>
<p> </p>
Publisher: none | Author: amber | Date: 2019-06-12T06:16:13Z | Type: Blog Entry
Announcement of a Three-Region Research Alliance on the Appropriate Use of Digital Identity
http://editors.cis-india.org/internet-governance/blog/appropriate-use-of-digital-identity-alliance-announcement
<b>Omidyar Network has recently announced its decision to invest in the establishment of a three-region research alliance — to be co-led by the Institute for Technology & Society (ITS), Brazil; the Centre for Intellectual Property and Information Technology Law (CIPIT), Kenya; and CIS, India — on the Appropriate Use of Digital Identity. As part of this alliance, we at CIS will look at the policy objectives of digital identity projects, how technological policy choices can be thought through to meet those objectives, and how legitimate uses of a digital identity framework may be evaluated.</b>
<p> </p>
<p>As governments across the globe implement new, foundational digital identification systems or modernize existing ID programs, there is a dire need for greater research and discussion about appropriate design choices for a digital identity framework. There is significant momentum on digital ID, especially after the adoption of UN Sustainable Development Goal 16.9, which calls for legal identity for all by 2030. Given the importance of this subject, its implications for the development agenda, and its impact on civil, social, and economic rights, there is a need for more focused research that can enable policymakers to make better decisions, guide civil society in different jurisdictions to comment on and raise questions about digital identity schemes, and provide actionable material to industry for creating identity solutions that are privacy-enhancing and inclusive.</p>
<p> </p>
<h4>Excerpt from the <a href="https://www.omidyar.com/blog/appropriate-use-digital-identity-why-we-invested-three-region-research%C2%A0alliance" target="_blank">blog post by Subhashish Bhadra</a> announcing this new research alliance</h4>
<p>...In the absence of any widely-accepted thinking on this issue, we run the risk of digital identity systems suffering from mission creep, that is being made mandatory or being used for an ever-expanding set of services. We believe this creates several risks. First, people may be excluded from services if they do not have a digital identity or because it malfunctions. Second, this approach creates a wider digital footprint that can be used to create a profile of an individual, sometimes without consent. This can increase privacy risk. Third, this approach increases the power of institutions versus individuals and can be used as rationale to intentionally deny services, especially to vulnerable or persecuted groups.</p>
<p>Three exceptional research groups have undertaken the effort of answering this complex and important question. Over the next six months, these think tanks will conduct independent research, as well as involve experts from across the globe. Based in South America, Africa, and Asia, these institutions represent the collective wisdom and experiences of three very distinct geographies in emerging markets. While drawing on their local context, this research effort is globally oriented. The think tanks will create a set of recommendations and tools that can be used by stakeholders to engage with digital identity systems in any part of the world...</p>
<p>This research will use a collaborative and iterative process. The researchers will put out some ideas every few weeks, with the objective of seeking thoughts, questions, and feedback from various stakeholders. They will participate in several digital rights and identity events across the globe over the next several months. They will also organize webinars to seek input from and present their interim findings to interested communities from across the globe. Each of these provide an opportunity for you to provide your thoughts and help this research program provide an independent, rigorous, transparent, and holistic answer to the question of when it’s appropriate for digital identity to be used. We need a diversity of viewpoints and collaborative dissent to help solve the most pressing issues of our times.</p>
<p> </p>
Publisher: none | Author: amber | Tags: Digital ID, Internet Governance, Appropriate Use of Digital ID, Featured, Digital Identity, Homepage | Date: 2019-05-13T09:06:23Z | Type: Blog Entry
Programme Officer - Privacy
http://editors.cis-india.org/jobs/programme-officer-privacy-2019
<b>The Centre for Internet and Society (CIS) is seeking applications for the position of Programme Officer, to undertake public policy research on privacy and related themes. For this position, we will hire one full time researcher, to be based in the Delhi office of CIS, for the duration of one year.</b>
<p> </p>
<h4>To apply for this position, please write to amber@cis-india.org with a CV, two writing samples, and the contact details of two references. Interested candidates are invited to send their applications at the earliest — latest by April 30th.</h4>
<hr />
<h3>Organisation Profile</h3>
<p>The Centre for Internet and Society (CIS) is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with disabilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, open access, open educational resources, and open video), internet governance, telecommunication reform, digital privacy, and cyber-security. The academic research at CIS seeks to understand the reconfiguration of social processes and structures through the internet and digital media technologies, and vice versa. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and practices around internet, technology and society in India, and elsewhere.</p>
<h3>Privacy Research at CIS</h3>
<p>While privacy has been a key subject of study for digital rights and development organisations in India over the last decade, recent and ongoing legal and policy developments have placed the issue at the forefront of human rights and regulatory research. CIS has conducted extensive research in the areas of privacy, data protection, and data security, and was a member of the Committee of Experts constituted under Justice A P Shah. CIS has also been cited multiple times in the Report of the Committee of Experts led by Justice Srikrishna. CIS values the fundamental principles of justice, equality, freedom, and economic development, and strongly advocates the right to privacy.</p>
<p>Over the next year, CIS intends to look at several research questions on data protection, which may include the global experience with privacy enforcement, the need for effective redressal mechanisms, documenting the design of business models and data flows, the regulation of social media big data, and how the data of disadvantaged groups, including children, may be protected. Additionally, while we now have the Supreme Court’s unanimous and emphatic recognition of the fundamental right to privacy, there is a need for research enquiry into several issues, such as clarifying the scope of the Puttaswamy judgment, unpacking the different dimensions of privacy, and how state actions interact with privacy.</p>
<h3>The Role</h3>
<ul>
<li>Research and analysis: Literature review, policy design, detailed analysis of research topics<br /><br /></li>
<li>Knowledge management: Staying up-to-date on developments of interest to the project, and sharing/debating these with the team. Contributing to documentary and knowledge management processes<br /><br /></li>
<li>Policy outreach and stakeholder engagement: Supporting the project manager in the dissemination of research findings in innovative formats. Attending, planning and executing events<br /><br /></li>
<li>Writing op-eds, short notes, policy briefs and longer form academic writing for a range of audiences<br /><br /></li>
<li>Presentations and formal discussions: Preparing and delivering presentations to various audiences<br /><br /></li>
<li>Helping manage communications with stakeholders including international experts, regulators and policy makers<br /><br /></li>
<li>Managing interns and team: Managing work outputs with our interns; coordinating research with team members and the project manager</li></ul>
<h3>Qualifications and Skills</h3>
<p>We are looking for professionals from law, regulatory theory and public policy backgrounds.</p>
<p>We are looking for candidates who are interested in studying the regulatory challenges of notice and consent, state capacity, how business models thwart privacy, and the future of privacy post-Puttaswamy.</p>
<p>This is a full-time position based out of Delhi. The position is for a duration of one year. Salary will be commensurate with qualifications and experience.</p>
<p> </p>
Publisher: none | Author: amber | Tags: Jobs, Internet Governance, Privacy | Date: 2019-04-15T06:53:44Z | Type: Blog Entry
CIS Response to Draft E-Commerce Policy
http://editors.cis-india.org/internet-governance/blog/cis-response-to-draft-e-commerce-policy
<b>CIS is grateful for the opportunity to submit comments to the Department of Industrial Policy and Promotion on the draft national e-commerce policy. This response was authored by Amber Sinha, Arindrajit Basu, Elonnai Hickok and Vipul Kharbanda.</b>
<p> </p>
<h4>Access our response to the draft policy here: <a href="https://cis-india.org/internet-governance/resources/e-commerce-submission">Download</a> (PDF)</h4>
<hr />
<h3>The E-Commerce Policy is a much-needed and timely document that seeks to enable the growth of India's digital ecosystem. Crucially, it backs up India's stance at the WTO, which has been a robust pushback against digital trade policies that would benefit the developed world at the cost of emerging economies. However, to ensure that the benefits of the digital economy are truly shared, the focus must be not only on sellers but also on consumers, which automatically brings individual rights into the question. No right is absolute, but there needs to be a fair trade-off between the mercantilist aspirations of a burgeoning digital economy and the civil and political rights of the individuals who are spurring the economy on. We also appreciate the recognition that the regulation of e-commerce must be an inter-disciplinary effort, and the assertion of the roles of various other departments and ministries. However, we caution against over-reach into policy domains that fall within the mandate of existing laws.</h3>
<p> </p>
Publisher: none | Author: amber | Tags: E-Commerce, Featured, Homepage, Internet Governance | Date: 2019-04-26T06:40:34Z | Type: Blog Entry
Programme Officer - Digital Identity
http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019
<b>The Centre for Internet and Society (CIS) is seeking applications for the position of Programme Officer, to be associated with a two year long research project on digital identity. We may hire up to three Programme Officers as part of this project. The position is full time and will be based in the Delhi office of CIS. </b>
<p> </p>
<h4>To apply for this position please write to amber@cis-india.org along with a CV, two writing samples and contact details of two references. Interested candidates are invited to send their applications at the earliest - latest by April 15th.</h4>
<hr />
<h3>Organisation Profile</h3>
<p>The Centre for Internet and Society (CIS) is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with disabilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, open access, open educational resources, and open video), internet governance, telecommunication reform, digital privacy, and cyber-security. The academic research at CIS seeks to understand the reconfiguration of social processes and structures through the internet and digital media technologies, and vice versa. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and practices around internet, technology and society in India, and elsewhere.</p>
<h3>About Digital Identity Project</h3>
<p>We are embarking on a two year research project on digital identity. As governments across the globe are implementing new, digital foundational identification systems or modernizing existing ID programs, there is a dire need for greater research and discussion about appropriate design choices for a digital identity framework. There is significant momentum on digital ID, especially after the adoption of UN Sustainable Development Goal 16.9, which calls for legal identity for all by 2030. Instances of emerging new digital identity schemes include national projects in Algeria, Belgium (mobile ID), Cameroon, Ecuador, Jordan, Kyrgyzstan, Italy, Iran, Japan, Senegal, Thailand, Turkey, major announcements in Afghanistan, Denmark, the Netherlands, Bulgaria, the Maldives, Norway, Liberia, Poland, Jamaica, Sri Lanka, Zambia and a pilot scheme in Myanmar.</p>
<p>The choices made in creating a digital identity system have significant consequences for the privacy, security, inclusivity, scalability, fraud-detection capabilities, and implementation costs of the framework. These choices exist in the context of a complex set of political, legal, technological, economic, and societal factors. In this project we will be looking at technical policy options and appropriate uses of a digital identity ecosystem.</p>
<h3>The Role</h3>
<p>Your role will require you to work closely with our team on research and policy analysis, and to engage with external researchers from whom we will commission research. Doing so will involve the following activities.</p>
<ul>
<li>Interdisciplinary research and analysis: Literature review, policy design, detailed analysis on topics including technology design options and appropriate uses of digital identity systems;</li>
<li>Policy dissemination and stakeholder engagement: Supporting the Project Manager in the dissemination of research findings in innovative formats, as well as attending, planning, and executing events;</li>
<li>Writing op-eds, short notes, policy briefs and longer form academic writing for a range of audiences;</li>
<li>Presentations and formal discussions: Preparing and delivering presentations to various audiences;</li>
<li>Helping manage communications with stakeholders including international experts, regulators and policy makers;</li>
<li>Knowledge management: Staying up-to-date on developments of interest to the Initiative, and sharing and debating these with the team;</li>
<li>Contributing to documentary and knowledge management processes; and</li>
<li>Managing interns and team: Managing work outputs with our interns, and coordinating research with team members and the Project Manager.</li></ul>
<h3>Qualifications and Skills</h3>
<p>We are looking for up to three professionals who may come from the following backgrounds: law, regulatory theory, public policy, economics, ethics, technology and development studies.</p>
<p>We are looking for candidates who exhibit constructive problem-solving skills, sound analytical and critical thinking skills, and the ability to analyse issues from first principles and develop solutions.</p>
<p>This is a full-time position based out of Delhi. The position is for a duration of two years. Salary will be commensurate with qualifications and experience.</p>
<p> </p>
Publisher: none | Author: amber | Tags: Jobs, Digital ID | Date: 2019-03-29T11:02:42Z | Type: Blog Entry