<?xml version="1.0" encoding="utf-8" ?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:syn="http://purl.org/rss/1.0/modules/syndication/" xmlns="http://purl.org/rss/1.0/">
<channel rdf:about="http://editors.cis-india.org/search_rss">
  <title>Centre for Internet and Society</title>
  <link>http://editors.cis-india.org</link>
  
  <description>These are the search results for the query, showing results 11 to 25.</description>
  <image rdf:resource="http://editors.cis-india.org/logo.png"/>

  <items>
    <rdf:Seq>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/rethinking-national-privacy-principles"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/jobs/programme-officer-privacy-2019"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle"/>
      <rdf:li rdf:resource="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime"/>
    </rdf:Seq>
  </items>

</channel>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments">
    <title>Right to be Forgotten: A Tale of Two Judgments</title>
    <link>http://editors.cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments</link>
    <description>
        &lt;b&gt;In the last few months, there have been contrasting judgments from two Indian high courts, Karnataka and Gujarat, on matters relating to the right to be forgotten. The two high courts heard pleas on issues to do with the right of individuals to have either personal information redacted from the text of judgments available online or such judgments removed from publicly available sources.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;While one High Court (Karnataka) ordered the removal of personal details from the judgment,&lt;a href="#_ftn1" name="_ftnref1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; the other (Gujarat) dismissed the plea&lt;a href="#_ftn2" name="_ftnref2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt;. In this post, we try to understand the global jurisprudence on the right to be forgotten, and how the contrasting judgments in India may be located within it.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Background&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The ‘right to be forgotten’ has gained prominence since a matter was referred to the Court of Justice of the European Union (CJEU) in 2014 by a Spanish court.&lt;a href="#_ftn3" name="_ftnref3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt; In this case, Mario Costeja González had disputed Google search results for his name continuing to lead to an auction notice for his repossessed home. The fact that Google continued to make available in its search results an event in his past, which had long been resolved, was claimed by González to be a breach of his privacy. He filed a complaint with the Spanish Data Protection Agency (AEPD in its Spanish acronym) to have the online newspaper reports about him, as well as related search results appearing on Google, deleted or altered. While the AEPD did not agree to his demand to have the newspaper reports altered, it ordered Google Spain and Google, Inc. to remove the links in question from their search results. The case was brought in appeal before the Spanish High Court, which referred the matter to the CJEU. In a judgment with far-reaching implications, the CJEU held that where information is ‘inaccurate, inadequate, irrelevant or excessive,’ individuals have the right to ask search engines to remove links with personal information about them. The court also ruled that even if the physical servers of the search engine provider are located outside the jurisdiction of the relevant Member State of the EU, these rules would apply if the provider has a branch office or subsidiary in the Member State.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The ‘right to be forgotten’ is a misnomer; when we speak of it in the context of the proposed laws in the EU, we refer essentially to the right of individuals to seek erasure of certain data that concerns them. The basis of what has now evolved into this right is contained in the 1995 EU Data Protection Directive, with Article 12 of the Directive allowing a person to seek deletion of personal data once it is no longer required.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Critical to our understanding of the rationale for how the ‘right to be forgotten’ is being framed in the EU is an appreciation of how European laws perceive the privacy of individuals. Unlike the United States (US), where privacy may be seen as a corollary of personal liberty protecting against unreasonable state intrusions, European laws view privacy as an aspect of personal dignity, and are more concerned with protection from third parties, particularly the media. The most important way in which this manifests itself is in where the burden to protect privacy rights lies. In Europe, privacy policy often dictates intervention from the state, whereas in the US, in many cases it is up to individuals to protect their privacy.&lt;a href="#_ftn4" name="_ftnref4"&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Since the advent of the Internet, both the nature and quantity of information existing about individuals has changed dramatically. This personal information is no longer limited to newspaper reports and official or government records either. Our use of social media, micro-discussions on Twitter, photographs and videos uploaded by us or others tagging us, every page or event we like, favourite or share—all contribute to our digital footprint. Add to this the information created not by us but about us by both public and private bodies storing data about individuals in databases, our digital shadows begin to far exceed the data we create ourselves. It is abundantly clear that we exist in a world of Big Data, which relies on algorithms tracking repeated behaviour by our digital selves. It is in this context that a mechanism which enables the purging of some of this digital shadow makes sense.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Further, it is not only the nature and quantity of information that has changed, but also the means through which this information can be accessed. In the pre-internet era, access to records was often made difficult by procedural hurdles. Permissions or valid justifications were required to access certain kinds of data. Even for information available in the public domain, the process of gaining access was often far too cumbersome. Now digital information not only continues to exist indefinitely, but can also be accessed readily through search engines. It is in this context that, in a 2007 paper, Viktor Mayer-Schönberger pioneered the idea of memory and forgetting for the digital age.&lt;a href="#_ftn5" name="_ftnref5"&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/a&gt; He proposed that all forms of personal data should carry additional metadata specifying an expiration date, switching the default from information existing endlessly to information having a temporal limit after which it is deleted. While this may be a radical suggestion, we have since seen proposals to allow individuals some control over information about them.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In 2016, the EU released the final version of the General Data Protection Regulation. The regulation provides for a right to erasure under Article 17, which would enable a data subject to seek deletion of data.&lt;a href="#_ftn6" name="_ftnref6"&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/a&gt; Notably, except in the heading of the provision, Article 17 makes no reference to the word ‘forgetting.’ Rather, the right made available in this regulation is in the form of making possible ‘erasure’ and ‘abstention from further dissemination.’ This is significant because what the regulation provides for is not an overarching framework to enable or allow ‘forgetting’ but a limited right which may be used to delete certain data or search results. Providing a true right to be forgotten would pose issues of interpretation as to what ‘forgetting’ might mean in different contexts and the extent of measures that data controllers would have to employ to ensure it. The regulation attempts to provide a specific remedy which can be exercised in defined circumstances without having to engage with the question of ‘forgetting’.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The primary arguments made against the ‘right to be forgotten’ stem from its conflict with the right to freedom of speech. Jonathan Zittrain has argued against the rationale that the right to be forgotten merely alters results on search engines without deleting the actual source, and thus does not curtail the freedom of expression.&lt;a href="#_ftn7" name="_ftnref7"&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/a&gt; He has compared this altering of search results to letting a book remain in the library but making the catalogue unavailable. According to Zittrain, a better approach would be to allow data subjects to provide their side of the story and more context to the information about them, rather than allowing any kind of erasure. Unlike in the US, the European approach is to balance free speech against other concerns. So while one of the exceptions in sub-clause (3) of Article 17 provides that information may not be deleted where it is necessary to exercise the right to free speech, free speech does not completely trump privacy as the value that must be protected. On the other hand, US constitutional law would tend to give more credence to First Amendment rights and allow them to be compromised only in very limited circumstances. As per the position of the US Supreme Court in &lt;i&gt;Florida Star&lt;/i&gt; v. &lt;i&gt;B.J.F.&lt;/i&gt;, lawfully obtained information may be restricted from publication only in cases involving a ‘state interest of the highest order’. This position would allow any potential right to be forgotten to be exercised only in the most limited of circumstances, and privacy and reputational harm would not satisfy the standard. For these reasons, the right to be forgotten as it exists in Article 17 may be unworkable in the US.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Issues in application&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;Significant technical challenges remain in the effective and consistent application of Article 17 of the GDPR. One key issue is concerned with how ‘personal data’ is defined and understood, and how its interpretation will impact this right in different contexts. Under the GDPR, the term ‘personal data’ includes any information relating to an identified or identifiable individual. Some ambiguity remains about whether information which may not uniquely identify a person, but identifies them as part of a small group, could be considered within the scope of personal data. This becomes relevant, for instance, where one seeks the erasure of information which, without referring to an individual, points towards a family. At the same time, the piece of information sought to be erased by a person may often contain personal information about more than one individual. There is no clarity over whether the consensus of all the individuals concerned should be required, and, if not, on what parameters the wishes of one individual should prevail over the others’. Another important question, which is as yet unanswered, is whether the same standards for removal of content should apply to private individuals and to those in public life.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The issue of what constitutes personal data, and can therefore be erased, gets further complicated in cases of derived data about individuals used in statistics and other forms of aggregated content. While it would be difficult to argue that the right to be forgotten needs to be extended to such forms of information, not erasing such derived content poses the risk of the primary information being inferred from it. In addition, Article 17(1)(a) provides for deletion in cases where the data is no longer necessary for the purposes for which it was collected or used. The standards for circumstances which satisfy these criteria are, as yet, unclear and may only be fully understood through a consistent application of this law.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Finally, once there are reasonable grounds to seek erasure of information, it is not clear how this erasure will be enforced practically. It may not be prudent to require that, to the extent technologically possible, all copies of the impugned data be deleted irrecoverably. A more reasonable solution might be to permit the data to continue to remain available in encrypted form, much like certain records are sealed and subject to the strictest confidentiality obligations. In most cases, it may be sufficient to ensure that records of the impugned data are removed from search results and database reports without actually tampering with the information as it exists. These are some of the challenges which the practical application of this right will face, and it is necessary to take them into account in enforcing the regulation.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;The two Indian judgments&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;In the first case (before the Gujarat High Court), the petitioner entered a plea for a “permanent restraint [on] free public exhibition of the judgment and order.” The judgment in question concerned proceedings against the petitioner for a number of offences, including culpable homicide amounting to murder. The petitioner was acquitted, both by the Sessions court and by the High Court before which he was pleading. The petitioner’s primary contention was that despite the judgment being classified as ‘unreportable’, it was published by an online repository of judgments and was also indexed by Google search. The decision of the High Court to dismiss the petition rested on the following factors: a) failure on the part of the petitioner to show any provisions in law which are attracted, or any threat to the constitutional right to life and liberty; b) publication on a website does not amount to ‘reporting’, as ‘reporting’ refers only to publication in law reports.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the second point of the court’s reasoning is problematic, both in terms of the function of precedent served by reported judgments and the basis for reducing the scope of ‘reporting’ to law reports alone, the first point is of direct relevance to our current discussion. The lack of available legal provisions points to the absence of data protection legislation in India. Had there been privacy legislation which addressed the issue of how personal information may be dealt with, it is possible that it would have had instructive provisions to address situations like these. In the absence of such a law, the only recourse that an individual has is to seek constitutional protection under one of the fundamental rights, most notably Article 21, which, over the years, has emerged as the infinite repository of unenumerated rights. However, rights under Article 21 are typically of a vertical nature, i.e., available only against the state. Their application in cases where a private party is involved remains questionable, at best.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In contrast, in the second case, the Karnataka High Court ruled in favor of the petitioner. Here, the petitioner’s daughter had instituted both criminal and civil proceedings against a person. However, the parties later arrived at a compromise, one of the conditions of which was the quashing of all proceedings that had been initiated. The petitioner had raised concerns that his daughter’s name appeared in the cause title and was easily searchable. The court, while making vague references to the “trend in the Western countries where they follow this as a matter of rule ‘Right to be forgotten’ in sensitive cases involving women in general and highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned,” held in the petitioner’s favor and ordered that the name be redacted from the cause title and the body of the order before release to any service provider. The second judgment is all the more problematic because, while it makes a reference to jurisprudence in other countries, it bases the relief not on the fundamental right to privacy but on the idea of the modesty and reputation of women, which has no clear legal basis in either Indian or comparative jurisprudence.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The two cases above demonstrate the problem of the judiciary interpreting the right to be forgotten without a clear legal basis. Not only did the courts rely on no clear legal provisions in Indian law while ruling on the existence of this right, they also did not engage in any analysis of comparative jurisprudence such as the GDPR or the Costeja judgment. Such ad-hoc jurisprudence underlines the need for data protection legislation, as in its absence it is likely that divergent views will be taken on this issue, without clear legal direction. It is likely that most matters concerning the right to erasure will involve private parties as data controllers. In such cases, the existing jurisprudence on the right to privacy as interpreted under Article 21 may also be of limited value. Further, as has been pointed out above, the right to be forgotten needs to be a right very clearly qualified by conditions, particularly given its conflict with the right to freedom of expression under Article 19. Therefore, it is imperative that a comprehensive data protection law addresses these issues.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref1" name="_ftn1"&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/a&gt; Sri Vasunathan vs The Registrar, available at &lt;a href="http://www.iltb.net/2017/02/karnataka-hc-on-the-right-to-be-forgotten/"&gt;http://www.iltb.net/2017/02/karnataka-hc-on-the-right-to-be-forgotten/&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref2" name="_ftn2"&gt;&lt;sup&gt;[2]&lt;/sup&gt;&lt;/a&gt; Dharmraj Bhanushankar Dave v. State of Gujarat, available at &lt;a href="https://drive.google.com/file/d/0BzXilfcxe7yueXFJWG5mZ1pKaTQ/view"&gt;https://drive.google.com/file/d/0BzXilfcxe7yueXFJWG5mZ1pKaTQ/view&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref3" name="_ftn3"&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/a&gt; Google Spain et al v. Mario Costeja González, available at &lt;a href="http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&amp;amp;docid=152065"&gt;http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&amp;amp;docid=152065&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref4" name="_ftn4"&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/a&gt; &lt;a href="http://www.europarl.europa.eu/RegData/etudes/STUD/2015/536459/IPOL_STU(2015)536459_EN.pdf"&gt;http://www.europarl.europa.eu/RegData/etudes/STUD/2015/536459/IPOL_STU(2015)536459_EN.pdf&lt;/a&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref5" name="_ftn5"&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/a&gt; Mayer-Schoenberger, Viktor, Useful Void: The Art of Forgetting in the Age of Ubiquitous Computing (April 2007). KSG Working Paper No. RWP07-022. Available at SSRN: https://ssrn.com/abstract=976541 or &lt;a href="http://dx.doi.org/10.2139/ssrn.976541"&gt;http://dx.doi.org/10.2139/ssrn.976541&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref6" name="_ftn6"&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/a&gt; Article 17 (1) states: &lt;i&gt;The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies: &lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(b) the data subject withdraws consent on which the processing is based according to point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal ground for the processing;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(c) the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to Article 21(2);&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(d) the personal data have been unlawfully processed;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(e) the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;i&gt;(f) the personal data have been collected in relation to the offer of information society services referred to in Article 8(1).&lt;/i&gt;&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="#_ftnref7" name="_ftn7"&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/a&gt; Zittrain, Jonathan, “Don’t Force Google to ‘Forget’”, The New York Times, May 14, 2014. Available at &lt;a href="https://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html"&gt;https://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments'&gt;http://editors.cis-india.org/internet-governance/blog/right-to-be-forgotten-a-tale-of-two-judgments&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Right to be Forgotten</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-04-07T02:27:03Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/rethinking-national-privacy-principles">
    <title>Rethinking National Privacy Principles: Evaluating Principles for India's Proposed Data Protection Law</title>
    <link>http://editors.cis-india.org/internet-governance/blog/rethinking-national-privacy-principles</link>
    <description>
        &lt;b&gt;This report is intended to be the first part in a series of white papers that CIS will publish which seeks to contribute to the discussions around the enactment of a privacy legislation in India. In subsequent pieces we will focus on subjects such as regulatory framework to implement, supervise and enforce privacy principles, and principles to regulate surveillance in India under a privacy law.&lt;/b&gt;
        &lt;p&gt;Edited by Elonnai Hickok and Vipul Kharbanda&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;This analysis intends to build on the substantial work done in the formulation of the National Privacy Principles by the Committee of Experts led by Justice AP Shah. This brief hopes to evaluate the National Privacy Principles and the assertion by the Committee that the right to privacy be considered a fundamental right under the Indian Constitution. The national privacy principles have been revisited in light of technological developments such as big data, the Internet of Things, algorithmic decision making, and artificial intelligence, which are playing an increasingly greater role in the collection and processing of the personal data of individuals, its analysis, and decisions taken on the basis of such analysis. The solutions and principles articulated in this report are intended to provide starting points for a meaningful and nuanced discussion on how we need to rethink the privacy principles that should inform the data protection law in India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;a class="external-link" href="http://cis-india.org/internet-governance/files/rethinking-privacy-principles"&gt;Click to read the full blog post&lt;/a&gt;&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/rethinking-national-privacy-principles'&gt;http://editors.cis-india.org/internet-governance/blog/rethinking-national-privacy-principles&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-09-11T02:22:01Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf">
    <title>Regulating Sexist Online Harassment: A Model of Online Harassment as a Form of Censorship</title>
    <link>http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf</link>
    <description>
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf'&gt;http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Censorship</dc:subject>
    

   <dc:date>2021-05-31T09:39:14Z</dc:date>
   <dc:type>File</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship">
    <title>Regulating Sexist Online Harassment: A Model of Online Harassment as a Form of Censorship</title>
    <link>http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship</link>
    <description>
        &lt;b&gt;Amber Sinha wrote a paper on regulating sexist online harassment, and how online harassment serves as a form of censorship, for the “Recognize, Resist, Remedy: Addressing Gender-Based Hate Speech in the Online Public Sphere” project, a collaboration between IT for Change, India and InternetLab, Brazil.&lt;/b&gt;
        
&lt;p&gt;Read the full paper &lt;a class="external-link" href="https://itforchange.net/sites/default/files/1883/Amber-Sinha-Rethinking-Legal-Institutional-Approaches-to-Sexist-Hate-Speech-ITfC-IT-for-Change_0.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship'&gt;http://editors.cis-india.org/internet-governance/blog/regulating-sexist-online-harassment-a-model-of-online-harassment-as-a-form-of-censorship&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>


   <dc:date>2021-03-11T04:14:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment">
    <title>Regulating Sexist Online Harassment as a Form of Censorship</title>
    <link>http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment</link>
    <description>
        &lt;b&gt;This paper is part of a series under IT for Change’s project, Recognize, Resist, Remedy: Combating Sexist Hate Speech Online. The series, titled Rethinking Legal-Institutional Approaches to Sexist Hate Speech in India, aims to create a space for civil society actors to proactively engage in the remaking of online governance, bringing together inputs from legal scholars, practitioners, and activists. The papers reflect upon the issue of online sexism and misogyny, proposing recommendations for appropriate legal-institutional responses. The series is funded by EdelGive Foundation, India and International Development Research Centre, Canada.&lt;/b&gt;
        &lt;h3&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify; "&gt;The proliferation of internet use was expected to facilitate greater online participation of women and &lt;a class="external-link" href="https://ssrn.com/abstract=2039116"&gt;other marginalised groups&lt;/a&gt;. However, over the past few years, as more and more people have come online, it has become evident that social power in online spaces mirrors offline hierarchies. While identity and security thefts may be universal experiences, women and the LGBTQ+ community continue to face barriers to safety that men often do not, aside from structural barriers to access. Sexist harassment pervades the online experience of women, be it on dating sites, &lt;a class="external-link" href="https://academic.oup.com/bjc/article/57/6/1462/2623986"&gt;online forums, or social media&lt;/a&gt;.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In her book, &lt;i&gt;&lt;a class="external-link" href="https://yalebooks.yale.edu/book/9780300215120/twitter-and-tear-gas"&gt;Twitter and Tear Gas: The Power and Fragility of Networked Protest&lt;/a&gt;&lt;/i&gt;, Zeynep Tufekci argues that the nature and impact of censorship on social media are very different. Earlier, censorship was enacted by restricting speech. But now, it also works in the form of organised harassment campaigns, which use the qualities of viral outrage to impose a disproportionate cost on the very act of speaking out. Therefore, censorship plays out not merely in the form of the removal of speech but through disinformation and hate speech campaigns.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In most cases, this censorship of content does not necessarily meet the threshold of hate speech, and free speech advocates have traditionally argued for counter speech as the most effective response to such speech acts. However, the structural and organised nature of harassment and extreme speech often renders counter speech ineffective. This paper will explore the nature of online sexist hate and extreme speech as a mode of censorship. Online sexualised harassment takes various forms, including doxxing, cyberbullying, stalking, identity theft, and incitement to violence. While there are some regulatory mechanisms that address them – either in law or in the form of community guidelines – this paper argues for the need to evolve a composite framework that looks at the impact of such censorious acts on online speech and regulatory strategies to address them.&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a href="http://editors.cis-india.org/internet-governance/files/it-for-change-february-2021-amber-sinha-regulating-sexist-online-harassment.pdf/at_download/file" class="external-link"&gt;Click to read the full text&lt;/a&gt; [PDF; 495 Kb]&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment'&gt;http://editors.cis-india.org/internet-governance/blog/it-for-change-amber-sinha-regulating-sexist-online-harassment&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Censorship</dc:subject>
    

   <dc:date>2021-05-31T09:56:31Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017">
    <title>Rankathon on Digital Rights (Delhi, January 08)</title>
    <link>http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017</link>
    <description>
        &lt;b&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning and participants can focus on one or more of three kinds of tasks: 1) visualising the CIS and Ranking Digital Rights data, 2) evaluating additional companies using the RDR methodology, and 3) evaluating the RDR methodology and its suitability for independent use.&lt;/b&gt;
        
&lt;h4&gt;Download: &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS_RDRIndia-Rankathon_08012017_Invitation.pdf"&gt;Invitation&lt;/a&gt; (PDF)&lt;/h4&gt;
&lt;hr /&gt;
&lt;p&gt;The &lt;a href="https://rankingdigitalrights.org/"&gt;Ranking Digital Rights Corporate Responsibility Index&lt;/a&gt; is a project hosted by the Open Technology Institute at the New America Foundation that aims to rank Information and Communications Technology (ICT) companies with respect to their Governance, Freedom of Expression, and Privacy practices. The inaugural Corporate Accountability Index, released in November 2015, evaluated 16 companies based on the project’s methodology, which included 31 indicators in total.&lt;/p&gt;
&lt;p&gt;Towards developing an understanding of how Indian ICT companies are recognising and upholding the digital rights of their users, and to raise public awareness about the same, the Centre for Internet and Society (CIS), with the support of &lt;a href="https://privacyinternational.org/"&gt;Privacy International&lt;/a&gt;, has studied 8 Indian ICT companies, using the same methodology as the 2015 Corporate Accountability Index, to gain greater insight into company practices and initiate public dialogues.&lt;/p&gt;
&lt;p&gt;Please join us on Sunday, January 08, at the CIS office in Hauz Khas, Delhi, for a rankathon to visualise and contribute to the findings of the Ranking Digital Rights study, and to critique the underlying methodology. The event will begin at 10:00 in the morning and participants can focus on one or more of three kinds of tasks:&lt;/p&gt;
&lt;ul&gt;&lt;li&gt;
&lt;p&gt;visualising the CIS and Ranking Digital Rights data,&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating additional companies using the RDR methodology, and&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;evaluating the RDR methodology and its suitability for independent use.&lt;/p&gt;
&lt;/li&gt;&lt;/ul&gt;
&lt;p&gt;The event is open to all but the venue has limited space. The participants are requested to RSVP by sending an email to &lt;a href="mailto:nisha@cis-india.org?subject=RSVP: Rankathon on Digital Rights"&gt;nisha@cis-india.org&lt;/a&gt;. The final date for registering for the event is &lt;strong&gt;January 04&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;All visualisations and other outputs produced at the event will be published under open licenses. All participants are expected to bring their own laptop or any other items needed for their work. CIS will offer data, help with understanding how the Ranking Digital Rights methodology works, refreshments, and any other support as needed.&lt;/p&gt;
&lt;p&gt;We are also organising a discussion event on Saturday, January 07, at the India Islamic Cultural Centre, Delhi, to present our findings on digital rights practices of 8 Indian ICT companies, followed by an open structured discussion on the methodology of the Ranking Digital Rights study. Please find more details about this &lt;a href="http://cis-india.org/internet-governance/events/discussion-on-ranking-digital-rights-in-india-delhi-jan-07-2017"&gt;here&lt;/a&gt;.&lt;/p&gt;
&lt;p&gt;We look forward to your participation and contribution to the discussion. Please support us by sharing this invitation with your colleagues and networks.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017'&gt;http://editors.cis-india.org/internet-governance/events/rankathon-on-digital-rights-delhi-jan-08-2017&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Freedom of Speech and Expression</dc:subject>
        <dc:subject>Privacy</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Digital Rights</dc:subject>
    

   <dc:date>2016-12-29T07:10:09Z</dc:date>
   <dc:type>Event</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/jobs/programme-officer-privacy-2019">
    <title>Programme Officer - Privacy</title>
    <link>http://editors.cis-india.org/jobs/programme-officer-privacy-2019</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) is seeking applications for the position of Programme Officer, to undertake public policy research on privacy and related themes. For this position, we will hire one full time researcher, to be based in the Delhi office of CIS, for the duration of one year.&lt;/b&gt;
        
&lt;h4&gt;To apply for this position please write to amber@cis-india.org along with a CV, two writing samples and contact details of two references. Interested candidates are invited to send their applications at the earliest, and no later than April 30th.&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;Organisation Profile&lt;/h3&gt;
&lt;p&gt;The Centre for Internet and Society (CIS) is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with disabilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, open access, open educational resources, and open video), internet governance, telecommunication reform, digital privacy, and cyber-security. The academic research at CIS seeks to understand the reconfiguration of social processes and structures through the internet and digital media technologies, and vice versa. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and practices around internet, technology and society in India, and elsewhere.&lt;/p&gt;
&lt;h3&gt;Privacy Research at CIS&lt;/h3&gt;
&lt;p&gt;While privacy has been a key subject of study for digital rights and development organisations in India for the last decade, recent and ongoing legal and policy developments have placed this issue at the forefront of human rights and regulatory research. CIS has conducted extensive research into the areas of privacy, data protection, data security, and was also a member of the Committee of Experts constituted under Justice A P Shah. CIS has also been cited multiple times in the Report of the Committee of Experts led by Justice Srikrishna. CIS values the fundamental principles of justice, equality, freedom and economic development and strongly advocates the right to privacy.&lt;/p&gt;
&lt;p&gt;Over the next year, CIS intends to look at several research questions on data protection, which may include the global experience with privacy enforcement, the need for effective redressal mechanisms, documenting the design of business models and data flows, the regulation of social media big data, and how the data of disadvantaged groups, including children, may be protected. Additionally, while we now have the Supreme Court’s unanimous and emphatic recognition of the fundamental right to privacy, there is a need for research enquiry into several issues, such as clarification of the scope of the Puttaswamy judgment, unpacking the different dimensions of privacy, and how state actions interact with privacy.&lt;/p&gt;
&lt;h3&gt;The Role&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Research and analysis: Literature review, policy design, detailed analysis of research topics&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Knowledge management: Staying up-to-date on developments of interest to the project, and sharing/debating these with the team. Contributing to documentary and knowledge management processes&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Policy outreach and stakeholder engagement: Supporting the project manager in the dissemination of research findings in innovative formats. Attending, planning and executing events&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Writing op-eds, short notes, policy briefs and longer form academic writing for a range of audiences&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Presentations and formal discussions: Preparing and delivering presentations to various audiences&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Helping manage communications with stakeholders including international experts, regulators and policy makers&lt;br /&gt;&lt;br /&gt;&lt;/li&gt;
&lt;li&gt;Managing interns and team: Managing work outputs with our interns; coordinating research with team members and the project manager&lt;/li&gt;&lt;/ul&gt;
&lt;h3&gt;Qualifications and Skills&lt;/h3&gt;
&lt;p&gt;We are looking for professionals from law, regulatory theory and public policy backgrounds.&lt;/p&gt;
&lt;p&gt;We are looking for candidates who are interested in studying the regulatory challenges of notice and consent, state capacity, how business models thwart privacy and the future of privacy post Puttaswamy.&lt;/p&gt;
&lt;p&gt;This is a full-time position based out of Delhi. The position is for a duration of one year. Salary will be commensurate with qualifications and experience.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/jobs/programme-officer-privacy-2019'&gt;http://editors.cis-india.org/jobs/programme-officer-privacy-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Jobs</dc:subject>
        <dc:subject>Internet Governance</dc:subject>
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2019-04-15T06:53:44Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019">
    <title>Programme Officer - Digital Identity</title>
    <link>http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) is seeking applications for the position of Programme Officer, to be associated with a two year long research project on digital identity. We may hire up to three Programme Officers as part of this project. The position is full time and will be based in the Delhi office of CIS. &lt;/b&gt;
        
&lt;h4&gt;To apply for this position please write to amber@cis-india.org along with a CV, two writing samples and contact details of two references. Interested candidates are invited to send their applications at the earliest, and no later than April 15th.&lt;/h4&gt;
&lt;hr /&gt;
&lt;h3&gt;Organisation Profile&lt;/h3&gt;
&lt;p&gt;The Centre for Internet and Society (CIS) is a non-profit organisation that undertakes interdisciplinary research on internet and digital technologies from policy and academic perspectives. The areas of focus include digital accessibility for persons with disabilities, access to knowledge, intellectual property rights, openness (including open data, free and open source software, open standards, open access, open educational resources, and open video), internet governance, telecommunication reform, digital privacy, and cyber-security. The academic research at CIS seeks to understand the reconfiguration of social processes and structures through the internet and digital media technologies, and vice versa. Through its diverse initiatives, CIS explores, intervenes in, and advances contemporary discourse and practices around internet, technology and society in India, and elsewhere.&lt;/p&gt;
&lt;h3&gt;About Digital Identity Project&lt;/h3&gt;
&lt;p&gt;We are embarking on a two year research project on digital identity. As governments across the globe are implementing new, digital foundational identification systems or modernizing existing ID programs, there is a dire need for greater research and discussion about appropriate design choices for a digital identity framework. There is significant momentum on digital ID, especially after the adoption of UN Sustainable Development Goal 16.9, which calls for legal identity for all by 2030. Instances of emerging new digital identity schemes include national projects in Algeria, Belgium (mobile ID), Cameroon, Ecuador, Jordan, Kyrgyzstan, Italy, Iran, Japan, Senegal, Thailand, and Turkey; major announcements in Afghanistan, Denmark, the Netherlands, Bulgaria, the Maldives, Norway, Liberia, Poland, Jamaica, Sri Lanka, and Zambia; and a pilot scheme in Myanmar.&lt;/p&gt;
&lt;p&gt;The nature of choices made towards the creation of a digital identity system have significant consequences for privacy, security, inclusivity, scalability, fraud-detection capabilities and implementation costs of the framework. These choices exist in the context of a complex set of political, legal, technological, economic, and societal factors. In this project we will be looking at technical policy options and appropriate uses of a digital identity ecosystem.&lt;/p&gt;
&lt;h3&gt;The Role&lt;/h3&gt;
&lt;p&gt;Your role will require you to work closely with our team on research and policy analysis, and to engage with external researchers from whom we will commission research. Doing so will involve the following activities.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;Interdisciplinary research and analysis: Literature review, policy design, detailed analysis on topics including technology design options and appropriate uses of digital identity systems;&lt;/li&gt;
&lt;li&gt;Policy dissemination and stakeholder engagement: Supporting the Project Manager in the dissemination of research findings in innovative formats, as well as attending, planning, and executing events;&lt;/li&gt;
&lt;li&gt;Writing op-eds, short notes, policy briefs and longer form academic writing for a range of audiences;&lt;/li&gt;
&lt;li&gt;Presentations and formal discussions: Preparing and delivering presentations to various audiences;&lt;/li&gt;
&lt;li&gt;Helping manage communications with stakeholders including international experts, regulators and policy makers;&lt;/li&gt;
&lt;li&gt;Knowledge management: Staying up-to-date on developments of interest to the Initiative, and sharing and debating these with the team;&lt;/li&gt; 
&lt;li&gt;Contributing to documentary and knowledge management processes; and&lt;/li&gt;
&lt;li&gt;Managing interns and team: Managing work outputs with our interns, and coordinating research with team members and the Project Manager.&lt;/li&gt;&lt;/ul&gt;
&lt;h3&gt;Qualifications and Skills&lt;/h3&gt;
&lt;p&gt;We are looking for up to three professionals who may come from the following backgrounds: law, regulatory theory, public policy, economics, ethics, technology and development studies.&lt;/p&gt;
&lt;p&gt;We are looking for candidates who can exhibit constructive problem-solving skills, sound analytical and critical thinking skills, with the ability to analyse issues from first principles and develop solutions.&lt;/p&gt;
&lt;p&gt;This is a full-time position based out of Delhi. The position is for a duration of two years. Salary will be commensurate with qualifications and experience.&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019'&gt;http://editors.cis-india.org/jobs/programme-officer-digital-identity-2019&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Jobs</dc:subject>
        <dc:subject>Digital ID</dc:subject>
    

   <dc:date>2019-03-29T11:02:42Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept">
    <title>Privacy is not a unidimensional concept</title>
    <link>http://editors.cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept</link>
    <description>
        &lt;b&gt;The right to privacy is important not only for our negotiations with the information age but also to counter the transgressions of a welfare state. A robust right to privacy is essential for all citizens in India to defend their individual autonomy in the face of invasive state actions purportedly for the public good. The ruling of this nine-judge bench will have far-reaching impact on the extent and scope of rights available to us all.&lt;/b&gt;
        
&lt;div&gt;This article, written by Amber Sinha, was published in the &lt;a class="external-link" href="http://economictimes.indiatimes.com/news/politics-and-nation/aadhar-privacy-is-not-a-unidimensional-concept/articleshow/59716562.cms"&gt;Economic Times&lt;/a&gt; on July 23, 2017.&lt;/div&gt;
&lt;div&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;In a disappointing case of judicial evasion by the apex court,
      it has taken over 600 days since a reference order was passed on
      August 11, 2015, for this bench to be constituted. Over two days
      of arguments, the counsels for the petitioners have presented
      before the court why the right to privacy, despite not finding a
      mention in the Constitution of India, is a fundamental right
      essential to a person’s dignity and liberty, and must be read into
      not one but multiple articles of the Constitution. The government
      will make its arguments in the coming week.&lt;/div&gt;
&lt;div&gt;One must wonder why we are debating the contours of the right
      to privacy, which 40 years of jurisprudence had lulled us into
      believing we already had. The answer to that can be found in a
      series of hearings in the Aadhaar case that began in 2012. Justice
      KS Puttaswamy, a former Karnataka High Court judge, filed a
      petition before the Supreme Court, questioning the validity of the
      Aadhaar project due to its lack of legislative basis (since then the
      Aadhaar Act was passed in 2016) and its transgressions on our
      fundamental rights. Over time, a number of other petitions also
      made their way to the apex court, challenging different aspects of
      the Aadhaar project. Since then, five different interim orders by
      the Supreme Court have stated that no person should suffer because
      they do not have an Aadhaar number. Aadhaar, according to the
      court, could not be made mandatory to avail benefits and services
      from government schemes. Further, the court has limited the use of
      Aadhaar to specific schemes: LPG, PDS, MGNREGA, National Social
      Assistance Programme, the Pradhan Mantri Jan Dhan Yojna and EPFO.&lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;div&gt;The real spanner in the works in this case was
      the stand taken by Mukul Rohatgi, then attorney general of India,
      who, in a hearing before the court in July 2015, stated that there
      is no constitutionally guaranteed right to privacy. His reliance
      was on two Supreme Court judgments in MP Sharma v Satish Chandra
      (1954) and Kharak Singh v State of Uttar Pradesh (1962): both
      cases, decided by eight- and six-judge benches respectively,
      denied the existence of a constitutional right to privacy. As the
      subsequent judgments which upheld the right to privacy were by
      smaller benches, Rohatgi claimed that MP Sharma and Kharak Singh
      still prevailed over them, until they were overruled by a larger
      bench.&lt;/div&gt;
&lt;div&gt;The reference to a larger bench has since delayed the entire
      matter, even as a number of government schemes have made Aadhaar
      mandatory. This reading of privacy as a unidimensional concept by
      the courts is, with due respect, erroneous. Privacy, as a concept,
      includes within its scope spatial, familial, informational, and
      decisional aspects. We all have a legitimate expectation of
      privacy in our private spaces, such as our homes, and in our
      personal relationships. Similarly, we must be able to exercise
      some control over how personal data, like our financial
      information, are disseminated. Most importantly, privacy gives us
      the space to make autonomous choices and decisions without
      external interference. All these dimensions of privacy must stand
      as distinct rights. In MP Sharma, the court rejected a certain
      aspect of the right of privacy by refusing to acknowledge a right
      against search and seizure. This in no way prevented the court,
      even in the form of a smaller bench, from ruling on any other
      aspects of privacy, including those that are relevant to the
      Aadhaar case.&lt;/div&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;
&lt;div&gt;The limited referral to this bench means that the court will
      have to rule on the status of privacy and its possible limitations
      in isolation, without even going into the details of the Aadhaar
      case (based on the nature of protection that this bench accords to
      privacy, the petitioners and defendants in the Aadhaar case will
      have to argue afresh on whether the project infringes this
      most fundamental right). There are no facts of the case to ground
      the legal principles in, and defining the contours of a right can
      be a difficult exercise. The court must be wary of how any limits
      they put on the right may be used in future. Equally, it is
      important to articulate that any limitations on the right to
      privacy due to competing interests such as national security and
      public interest must be imposed only when necessary and always be
      proportionate. &lt;br /&gt;
      &lt;br /&gt;&lt;/div&gt;
&lt;p&gt;
    It will not be enough for the court to merely state that we have a
    constitutional right to privacy. They would be well advised to cut
    through the muddle of existing privacy jurisprudence, and
    unequivocally establish the various facets of the right. Without
    that, we may not be able to withstand the modern dangers of
    surveillance, denial of bodily integrity and self-determination
    through forcible collection of information. The nine judges, in
    their collective wisdom, must not only ensure that we have a right
    to privacy, but also clearly articulate a robust reading of this
    right capable of withstanding the growing interferences with our
    autonomy.&lt;/p&gt;
&lt;div&gt;&amp;nbsp;&lt;/div&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept'&gt;http://editors.cis-india.org/internet-governance/privacy-is-not-a-unidimensional-concept&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Data Protection</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-08-07T08:02:20Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data">
    <title>Privacy in the Age of Big Data</title>
    <link>http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data</link>
    <description>
        &lt;b&gt;Personal data is freely accessible, shared and even sold, and those to whom this information belongs have little control over its flow.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="http://www.asianage.com/india/all-india/100417/privacy-in-the-age-of-big-data.html"&gt;Asian Age&lt;/a&gt; on April 10, 2017.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;In 2011 it was estimated that the quantity of data produced globally surpassed 1.8 zettabyte. By 2013, it had increased to 4 zettabytes. This is a result of digital services which involve constant data trails left behind by human activity. This expansion in the volume, velocity, and variety of data available, together with the development of innovative forms of statistical analytics on the data collected, is generally referred to as “Big Data”. Despite significant (though largely unrealised) promises about Big Data, which range from improved decision-making, increased efficiency and productivity to greater personalisation of services, concerns remain about the impact of such datafication of all human activity on an individual’s privacy. Privacy has evolved into a sweeping concept, including within its scope matters pertaining to control over one’s body, physical space in one’s home, protection from surveillance, and from search and seizure, protection of one’s reputation as well as one’s thoughts. This generalised and vague conception of privacy not only comes with great judicial discretion, it also thwarts a fair understanding of the subject. Robert Post called privacy a concept so complex and “entangled in competing and contradictory dimensions, so engorged with various and distinct meanings”, that he sometimes “despairs whether it can be usefully addressed at all”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This also leaves the idea of privacy vulnerable to considerable suspicion and ridicule. However, while there is a lack of clarity over the exact contours of what constitutes privacy, there is general agreement over its fundamental importance to our ability to lead whole lives. In order to understand the impact of datafied societies on privacy, it is important to first delve into the manner in which we exercise our privacy. The prevalent ideas of privacy and data management can be traced to the Fair Information Practice Principles (FIPP). These principles are the forerunners of most privacy regimes internationally, such as the OECD Privacy Guidelines, the APEC Framework, and the nine National Privacy Principles articulated by the Justice A.P. Shah Committee Report. All of these frameworks have, as their fundamental principles, rights to notice, consent and correction, and limits on how the data may be used. They make the data subject the decision-making agent, deciding where and when her/his personal data may be used, by whom, and in what way. The individual needs to be notified and his consent obtained before his personal data is used. If the scope of usage extends beyond what he has agreed to, fresh consent will be required for the expanded scope.&lt;/p&gt;
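&lt;p style="text-align: justify; "&gt;As a rough illustration of these notice-and-consent mechanics, consider the following minimal sketch (all names are hypothetical; no actual framework prescribes an implementation): data may be used only for purposes the subject has agreed to, and an expanded scope requires renewed consent.&lt;/p&gt;
&lt;pre&gt;
# Minimal notice-and-consent sketch: hypothetical names throughout.

class ConsentRequired(Exception):
    pass

class ConsentRecord:
    def __init__(self, subject, agreed_purposes):
        self.subject = subject
        self.agreed_purposes = set(agreed_purposes)

    def use_data(self, purpose):
        # Any use beyond the agreed scope needs renewed consent.
        if purpose not in self.agreed_purposes:
            raise ConsentRequired(self.subject + " has not consented to " + purpose)
        return "processing data of " + self.subject + " for " + purpose

    def renew_consent(self, purpose):
        # The subject agrees to the expanded scope of usage.
        self.agreed_purposes.add(purpose)

record = ConsentRecord("subject-42", ["billing"])
print(record.use_data("billing"))    # within the agreed scope
record.renew_consent("analytics")    # scope expanded with fresh consent
print(record.use_data("analytics"))
&lt;/pre&gt;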
&lt;p style="text-align: justify; "&gt;In theory, this system sounds fair. Privacy is a value tied to the personal liberty and dignity of an individual. It is only appropriate that the individual should be the one holding the reins and taking the large decisions about the use of his personal data. This makes the individual empowered and allows him to weigh his own interests in exercising his consent. The allure of this paradigm is that in one elegant stroke, it seeks to ensure that consent is informed and free, and to implement an acceptable trade-off between privacy and competing concerns. This approach worked well when the number of data collectors was small and the uses of data were narrower and more clearly defined. Today’s infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users. Despite a growing willingness to share information online, most people have no understanding of what happens to their data.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The quantity of data being generated is expanding at an exponential rate. From smartphones and televisions, trains and airplanes, sensor-equipped buildings and even the infrastructures of our cities, data now streams constantly from almost every sector and function of daily life, “creating countless new digital puddles, lakes, tributaries and oceans of information”. The inadequacy of regulatory approaches and the absence of comprehensive data protection regulation are exacerbated by the emergence of data-driven business models in the private sector and the adoption of a data-driven governance approach by the government. The Aadhaar project, with over a billion registrants, is intended to act as a platform for a number of digital services, all of which produce enormous troves of data. The original press release by the Central Government reporting the approval by the Cabinet of Ministers of the Digital India programme speaks of a “cradle to grave” digital identity as one of its vision areas.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While the very idea of the government wanting to track its citizens’ lives from cradle to grave is creepy enough in itself, let us examine for a minute what this form of datafied surveillance will entail. A host of schemes under Digital India shall collect and store information through the life cycle of an individual. The result is a set of databases on individuals which, when combined, will provide a 360-degree view into their lives. Alongside this, the emergence of India Stack, a set of APIs built on top of Aadhaar, conceptualised by iSPIRT (a consortium of select IT companies from India) and to be deployed and managed by several agencies, including the National Payments Corporation of India, promises to provide a platform over which different private players can build their applications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The sum of these interconnected parts will lead to a complete loss of anonymity, greater surveillance, and an impact on free speech and individual choice. The move towards a cashless economy — with sharp nudges from the government — could lead to a loss of financial agency in case of technological failures, as has been the case in experiments with digital payments in Africa. A lack of regulation in emerging data-driven sectors such as Fintech can enable predatory practices, in which private sector companies are effectively granted the right to remotely deny financial services. An architecture such as IndiaStack datafies financial transactions as linked and structured data, allowing continued use of the transaction data collected. It is important to recognise that at the stage of giving consent, there are too many unknowns for us to make informed decisions about the future uses of our personal data. Despite blanket approvals allowing any kind of use granted contractually through terms of use and privacy policies, there should be legal obligations overriding this consent for certain kinds of uses that may require renewed consent.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Biometrics-based identification in UK: &lt;/b&gt;In 2005, researchers from the London School of Economics and Political Science came out with a detailed report on the UK Identity Cards Bill (‘UK Bill’) — the proposed legislation for a national identification system based on biometrics. The project also envisaged a centralised database (like India’s) that would store personal information along with the entire transaction history of every individual. The report argued strongly against the centralised storage of information and suggested alternatives such as a system based on smartcards (where biometrics are stored on the card itself) or offline biometric-reader terminals.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;As per the report, the alternatives would also have been cheaper, as neither required real-time online connectivity. In India, online authentication is a far greater challenge: according to the Network Readiness Index 2016, India ranks 91st, whereas the UK is placed eighth. Poor Internet connectivity can raise many problems in the future, including the paralysis of transactions. The UK identification project was subsequently discarded as a result of the privacy and cost considerations raised in this report.&lt;/p&gt;
&lt;h3 style="text-align: justify; "&gt;Aadhaar: Privacy concerns&lt;/h3&gt;
&lt;ol style="text-align: justify; "&gt;
&lt;li&gt;Once the data is collected through National Information Utilities, it will be privatised and controlled by private utilities.&lt;/li&gt;
&lt;li&gt;Once an individual’s data is entered in the system, it cannot be deleted. That individual will have no control over it.&lt;/li&gt;
&lt;li&gt;Aadhaar data (demographic details along with photographs) is shared with private entities, including telecom companies, as per the Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, with the consent of the Aadhaar number holder, to fulfil their e-KYC requirements. The data is shared in encrypted form through a secured channel.&lt;/li&gt;
&lt;li&gt;119 banks are live on the Aadhaar Enabled Payment System (AEPS).&lt;/li&gt;
&lt;li&gt;More than 33.87 crore transactions have taken place through AEPS, up from only 46 lakh in May 2014.&lt;/li&gt;
&lt;li&gt;As on 30-9-2016, 78 government schemes were linked to Aadhaar.&lt;/li&gt;
&lt;li&gt;The Aadhaar (Targeted Delivery of Financial and Other Subsidies, Benefits and Services) Act, 2016, provides that no core-biometric information (fingerprints, iris scan) shall be shared with anyone for any reason whatsoever (Sec 29) and that the biometric information shall not be used for any purpose other than generation of Aadhaar and authentication.&lt;/li&gt;
&lt;li&gt;Access to the data repository of UIDAI, called the Central Identities Data Repository (CIDR), is provided to third parties or private companies.&lt;/li&gt;
&lt;/ol&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Central Monitoring System&lt;/b&gt; (CMS) is already live in Delhi, New Delhi and Mumbai. Union minister Ravi Shankar Prasad revealed this in one of his replies in the Lok Sabha last year. CMS has been set up to automate the process of Lawful Interception &amp;amp; Monitoring of telecommunications.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;&lt;b&gt;Lawful Intercept &lt;/b&gt;and Monitoring (LIM) systems are used by the Indian Government to intercept records of voice, SMSes, GPRS data, details of a subscriber’s application and recharge history, and call detail records (CDR), and to monitor Internet traffic, emails, web browsing, Skype and any other Internet activity of Indian users.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data'&gt;http://editors.cis-india.org/internet-governance/blog/asian-age-amber-sinha-april-10-2017-privacy-in-the-age-of-big-data&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-04-11T14:43:59Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report">
    <title>Privacy after Big Data - Workshop Report</title>
    <link>http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report</link>
    <description>
        &lt;b&gt;The Centre for Internet and Society (CIS) and the Sarai programme, CSDS, organised a workshop on 'Privacy after Big Data: What Changes? What should Change?' on Saturday, November 12, 2016 at Centre for the Study of Developing Societies in New Delhi. &lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;This workshop aimed to build a dialogue around some of the key government-led big data initiatives in India and elsewhere that are contributing significant new challenges and concerns to the ongoing debates on the right to privacy. It was an open event.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In this age of big data, discussions about privacy are intertwined with the use of technology and the data deluge. Though big data possesses enormous value for driving innovation and contributing to productivity and efficiency, privacy concerns have gained significance in the dialogue around the regulated use of data and the means by which individual privacy might be compromised, such as through surveillance, or protected. Big data creates tremendous opportunities in varied sectors: financial technology, governance, education, health, welfare schemes and smart cities, to name a few. With the UID project re-animating the right to privacy debate in India, and the financial technology ecosystem growing rapidly, striking a balance between the benefits of big data and privacy concerns is a critical policy question that demands public dialogue and research to inform evidence-based decisions. Also, with the advent of potential big data initiatives like the ambitious Smart Cities Mission under the Digital India Scheme, which would rely on harvesting large data sets and the use of analytics in city subsystems to make public utilities and services efficient, the tasks of ensuring data security on the one hand and protecting individual privacy on the other become harder.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This workshop sought to discuss some of the emerging problems due to the advent of big data and possible ways to address them. The workshop began with Amber Sinha of CIS and Sandeep Mertia of Sarai introducing the topic of big data and its implications for privacy. Both speakers tried to define big data, offered a brief history of the evolution of the term, and raised questions about how we understand it. Dr. Usha Ramanathan spoke on the right to privacy in the context of the ongoing Aadhaar case, and Vipul Kharbanda introduced the concept of Habeas Data as a possible solution to the privacy problems posed by big data. Amelia Andersdotter discussed national centralised digital ID systems and their evolution in Europe, often operating at a cross-functional scale, and highlighted their implications for discussions on data protection, welfare governance, and exclusion from public and private services. Srikanth Lakshmanan spoke of the issues with technology and privacy, and possible technological solutions. Dr. Anupam Saraph discussed the rise of digital banking and Aadhaar-based payments and their potential use for corrupt practices. Astha Kapoor of Microsave spoke about her experience of the implementation of digital money solutions in rural India.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Post lunch, Dr. Anja Kovacs and Mathew Rice spoke on the rise of mass communication surveillance across the world, and the evolving challenges of regulating surveillance by government agencies. Mathew also spoke of privacy movements by citizens and civil society in different regions. In the final speaking session, Apar Gupta and Kritika Bhardwaj traced the history of jurisprudence on the right to privacy and the existing regulations and procedures. In the concluding session, the participants discussed various possible solutions to privacy threats from big data and identity projects, including better regulation, new approaches such as harms-based regulation and privacy risk assessments, and conceiving privacy as a horizontal right. The workshop ended with a vote of thanks from the organisers.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The agenda for the event can be accessed &lt;a href="https://github.com/cis-india/website/raw/master/docs/CIS-Sarai_PrivacyAfterBigData_ConceptAgenda.pdf"&gt;here&lt;/a&gt;, and the transcript is available &lt;a class="external-link" href="http://cis-india.org/internet-governance/files/privacy-after-big-data/"&gt;here&lt;/a&gt;.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report'&gt;http://editors.cis-india.org/internet-governance/blog/privacy-after-big-data-workshop-report&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2017-01-27T01:09:17Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-">
    <title>New Recommendations to Regulate Online Hate Speech Could Pose More Problems Than Solutions</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-</link>
    <description>
        &lt;b&gt;The T.K. Viswanathan committee’s recommendations could prove to be dangerous for free speech if acted upon without resolving its flaws.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published by &lt;a class="external-link" href="https://thewire.in/187381/new-recommendations-regulate-online-hate-speech-problems/"&gt;Wire&lt;/a&gt; on October 14, 2017&lt;/p&gt;
&lt;hr /&gt;
&lt;p style="text-align: justify; "&gt;&lt;a title="It was reported last week" href="https://thewire.in/184920/post-section-66a-central-panel-tells-government-to-amend-ipc-crpc-it-act-to-punish-online-hate-speech/" rel="noopener
        noreferrer" target="_blank"&gt;&lt;span&gt;It was reported last week&lt;/span&gt;&lt;/a&gt; that an expert       committee headed by T.K. Viswanathan, former secretary general of       Lok Sabha, recommended that the Indian Penal Code (IPC), the Code       of Criminal Procedure and the Information Technology Act be       amended to include stringent penal provisions regarding online       hate speech. While this report has not been made public, &lt;a title="the Indian
        Express reported" href="http://indianexpress.com/article/india/hate-speech-online-punishment-supreme-court-section-66a-information-technology-act-narendra-modi-4876648/" rel="external nofollow" target="_blank"&gt;&lt;span&gt;the&lt;em&gt; Indian Express&lt;/em&gt; reported&lt;/span&gt;&lt;/a&gt; that       the committee’s recommendations include, among other things,       insertion and expansion of penal provisions in the IPC on       ‘incitement to hatred’ (Section 153C) and ‘causing fear, alarm or       provocation of violence’ (Section 505A) to include online speech,       and creation of the offices of state cyber crime coordinator and       district cyber crime cell.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Online hate speech has been among the more complex issues with regard to the regulation of technology. The complexity of restricting hate speech has to do with a number of factors, including the ubiquity of strong opinions in online speech, often offensive to certain groups, the interplay between individual and group rights, and the tensions between the values of dignity, liberty and equality. Siddharth Narrain has &lt;a title="pointed out" href="http://jmi.ac.in/upload/menuupload/16_ccmg_epwsedition.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;pointed out&lt;/span&gt;&lt;/a&gt; in his thesis on hate speech law that the law has been used to curb offensive or hurtful speech by religious groups, caste-based groups, occupation-based groups with strong caste associations, language groups and gender-based groups. The range of actions arising from such uses of the law includes the banning of books, criminal proceedings for political satire, and even prosecution for ‘liking’ political posts on social media.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The relationship between speech acts and acts of violence is a complicated issue with little consensus on appropriate ways to regulate it. Scholars such as Jonathan Maynard have advocated greater reliance on non-legal responses such as counter-speech, as the use of criminal law to tackle speech often has the effect of chilling forms of dissent. The formulation and application of legal tests in criminal law with respect to hate speech is also hard, as hate speech has as much to do with the context of speech, including factors such as power structures, as with its content. Speech by a figure in a position of power also has a greater likelihood of resulting in a call for violence.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Before looking at the specific recommendations made by the T.K.       Viswanathan committee, it would be worthwhile to also look at the       background of this committee. The committee notes with approval       the &lt;a title="Law Commission of
        India’s 267th report on the issue of hate speech" href="http://lawcommissionofindia.nic.in/reports/Report267.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;Law Commission         of India’s 267th report on the issue of hate speech&lt;/span&gt;&lt;/a&gt;. The Law       Commission, in turn, was acting at the behest of observations made       by the Supreme Court in &lt;a title="Pravasi Bhalai
        Sangathan v. Union of India" href="https://indiankanoon.org/docfragment/61854231/?formInput=ramesh%20union%20india%20" rel="external nofollow" target="_blank"&gt;&lt;span&gt;&lt;i&gt;Pravasi Bhalai Sangathan&lt;/i&gt; v.         &lt;i&gt;Union of India&lt;/i&gt;&lt;/span&gt;&lt;/a&gt; in 2014. In this case, the Supreme       Court exhibited judicial restraint and refused to frame guidelines       prohibiting political hate speech, and had instead requested the       Law Commission to look into it. However, the court noted with       approval international case law on the issues, particularly the       observations in the Canadian case &lt;a title="Saskatchewan v. Whatcott" href="https://scc-csc.lexum.com/scc-csc/scc-csc/en/item/12876/index.do" rel="external nofollow" target="_blank"&gt;&lt;span&gt;&lt;i&gt;Saskatchewan&lt;/i&gt; v. &lt;i&gt;Whatcott&lt;/i&gt;&lt;/span&gt;&lt;/a&gt;.       Relying on &lt;i&gt;Whatcott&lt;/i&gt;, the Supreme Court provides a       definition of hate speech that includes the following statements:&lt;/p&gt;
&lt;blockquote style="text-align: justify; "&gt;
&lt;p&gt;“Hate speech is an effort to marginalise individuals based on their membership in a group. Using expression that exposes the group to hatred, hate speech seeks to delegitimise group members in the eyes of the majority, reducing their social standing and acceptance within society. Hate speech, therefore, rises beyond causing distress to individual group members... [and] lays the groundwork for later, broad attacks on vulnerable [groups] that can range from discrimination, to ostracism, segregation, deportation, violence and, in the most extreme cases, to genocide. Hate speech also impacts a protected group’s ability to respond to the substantive ideas under debate, thereby placing a serious barrier to their full participation in our democracy.”&lt;/p&gt;
&lt;/blockquote&gt;
&lt;p style="text-align: justify; "&gt;Thus, it is evident that the Supreme Court itself clearly states       that hate speech must be viewed through the lens of the right to       equality, and relates to speech not merely offensive or hurtful to       specific individuals, but also inciting discrimination or violence       on the basis of inclusion of individuals within certain groups. It       is important to note that it is the consequence of speech that is       the determinative factor in interpreting hate speech, more so than       even perhaps the content of the speech. This is also broadly       reflected in the Law Commission’s report that identifies the       status of the author of the speech, the status of victims of the       speech, the potential impact of the speech and whether it amounts       to incitement as key identifying criteria of hate speech.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, in the commission’s recommendations, these principles are not fairly represented in the suggested new Sections 153C and 505A, as per a &lt;a title="draft released" href="https://internetfreedom.in/government-committee-wants-to-bring-back-section-66a/" rel="external nofollow" target="_blank"&gt;&lt;span&gt;draft released&lt;/span&gt;&lt;/a&gt; by the Internet Freedom Foundation. Section 505A, for instance, refers to “highly disparaging, indecent, abusive, inflammatory, false or grossly offensive information” and “derogatory information”. These are extremely broad terms, with no guiding jurisprudence within Indian or international law that might help in interpreting them restrictively. It is important to note the similarities between this provision and the repealed Section 66A of the Information Technology Act, which sought to criminalise speech that was “grossly offensive,” having “menacing character,” or “causing annoyance... danger... insult... enmity, hatred or ill will.”&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;These terms in the recommended Section 505A also run foul of the       observations of Justice Nariman in &lt;em&gt;&lt;a title="Shreya
          Singhal v. Union of India" href="https://cis-india.org/internet-governance/blog/shreya-singhal-judgment.pdf" rel="external nofollow" target="_blank"&gt;&lt;span&gt;Shreya Singhal v. Union of India&lt;/span&gt;&lt;/a&gt;,&lt;/em&gt; where       he took exception to the nature of the terms in Section 66A by       stating that, “Information that may be grossly offensive or which       causes annoyance or inconvenience are undefined terms which take       into the net a very large amount of protected and innocent       speech.” While these terms are somewhat tempered in this provision       with a requirement to show intent to “cause fear of injury or       alarm,” they remain exceedingly broad and contrary to the       requirement that restrictions on speech must be couched in the       narrowest possible terms.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The T.K. Viswanathan committee, in addition, seeks to bring electronic speech within the scope of the prospective Sections 153C and 505A. As per its recommendations, ‘means of communication’ would include “any words either spoken or written, signs, visible representations, information, audio, video or combination of both transmitted, retransmitted or sent through any telecommunication service, communication device or computer resource.” This could have the effect of bringing in a provision with much the same reach as the now-defunct Section 66A of the Information Technology Act. The lack of regard for the Supreme Court’s observations on hate speech, the failure to view it through the lens of equality, and the over-broadness of the restrictions on speech are likely to make the committee’s recommendations dangerous for free speech if they are acted upon.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-'&gt;http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Hate Speech</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    

   <dc:date>2018-01-02T03:06:18Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms">
    <title>New Media, personalisation and the role of algorithms</title>
    <link>http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms</link>
    <description>
&lt;b&gt;In his much acclaimed book, The Filter Bubble, Eli Pariser explains how personalisation of services on the web works and laments that they are creating individual bubbles for each user, which run counter to the idea of the Internet as an inherently open place. While Pariser’s book looks at the practices of various large companies providing online services, he briefly touches upon the role of new media such as search engines and social media portals in news curation. Building upon Pariser’s unexplored argument, this article looks at the impact of algorithmic decision-making and Big Data in the context of news reporting and curation.&lt;/b&gt;
        &lt;em&gt;&lt;br /&gt;&lt;/em&gt;
&lt;blockquote&gt;
&lt;div&gt;
&lt;div&gt;&lt;em&gt;Everything which bars freedom and fullness of communication sets up barriers that divide human beings into sets and cliques, into antagonistic sects and factions, and thereby undermines the democratic way of life. &lt;/em&gt;—John Dewey&lt;/div&gt;
&lt;/div&gt;
&lt;/blockquote&gt;
&lt;p&gt;&amp;nbsp;Eli Pariser, in his book, The Filter Bubble,[1] refers to the scholarship of Walter Lippmann and John Dewey as integral to the evolution of the understanding of the democratic and ethical duties of the Fourth Estate. Lippmann was disillusioned by the role of newspapers in propaganda for the First World War. He responded with three books in quick succession — Liberty and the News,[2] Public Opinion[3] and The Phantom Public.[4] Lippmann brought attention to the fact that the process of news-reporting was conducted through privately determined and unexamined standards. The failure of the Fourth Estate to perform its democratic functions was, in the opinion of Lippmann, one of the prime factors responsible for the public not being an informed and rational entity. John Dewey, while rejecting Lippmann’s argument that matters of public policy can only be determined by inside experts with training and education, did acknowledge his critique of the media.&lt;/p&gt;
&lt;p&gt;Pariser points to the creation of a wall between editorial decision-making and advertiser interests as the eventual result of the Lippmann and Dewey debate. While accepting that this division between the financial and reporting sides of media houses has not always been observed, Pariser emphasises that the fact that the standard exists is important.[5] Unlike traditional media, the new media, which relies on algorithmic decision-making for personalisation, is not subject to the same standards that try to mitigate the influence of commercial interests on editorial decisions, while performing many of the same functions as the traditional media.[6]&lt;/p&gt;
&lt;h3&gt;How personalisation algorithms work&lt;/h3&gt;
&lt;p dir="ltr"&gt;Kevin Slavin, at his famous talk in the TEDGLobal Conference, characterised algorithms as “maths that computers use to decide stuff” and that it was infiltrating every aspect of our lives.[7] According to Slavin’s view, algorithms can be seen as control technologies and shape our world constantly through media and information systems, dynamically modifying content and function through these programmed routines. Search engines and social media platforms perpetually rank user-generated content through algorithms.[8]&lt;/p&gt;
&lt;p&gt;Personalisation technologies have various advantages. They translate into more relevant content, which for service providers means more clicks and revenue, and for consumers, less time spent on finding content.[9] However, they also lead to compromised privacy, lack of control and reduced individual capability.[10] Search engines like Google use the famous PageRank algorithm which, combined with geographical location and previous searches, yields the most relevant search results.[11] Google’s ranking uses various real-time variables dependent on both voluntary and involuntary user inputs, including the number of clicks, the number of occurrences of the key terms, and the number of references by other credible pages. This data in turn determines the order of pages in search results and influences the way we perceive, understand and analyse information.[12] Maps showing real-time traffic information retrieve data from laser and infrared sensors alongside the road and from the devices of users. Once this real-time data is combined with historical trends, these maps recommend routes to every user, hence influencing traffic patterns.[13]&lt;/p&gt;
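&lt;p&gt;To make the core idea concrete, here is a minimal, illustrative sketch in Python of the link-based portion of PageRank. The graph and function names are hypothetical, and this is a simplification of the algorithm described in [11], not Google’s production system, which combines many more signals.&lt;/p&gt;
&lt;pre&gt;
# Minimal PageRank sketch: a page's rank is the chance a "random
# surfer" lands on it, so links from highly ranked pages count more.

def pagerank(links, damping=0.85, iterations=20):
    # links: dict mapping each page to the list of pages it links to
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base rank, plus shares passed
        # along by the pages that link to it.
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / max(len(outlinks), 1)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Toy graph: page "c" is linked to by both others, so it ranks highest.
print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"]}))
&lt;/pre&gt;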
&lt;p&gt;Even though this phenomenon of personalisation may appear to be new, it has been prevalent in society for ages.[14] The history of mass media culture clearly shows that personalisation has always been a method to increase market share, market reach and customer satisfaction.[15] Newspapers have sections dedicated to special topics; radio and TV have channels dedicated to different interest groups, age groups and consumers.[16] These personalised sections in a newspaper and personalised channels on radio and television don’t just provide greater satisfaction to readers, listeners and consumers; they also provide targeted advertisement space for advertisers and content developers. However, digital footprints and the mass collection of data have made this phenomenon much more granular and detailed. The geographical location of an individual can tell a lot about their community, their culture and other important traits local to a community.[17] This data further assists in personalisation. Current developments in technology not only help in better collection of data about personal preferences but also help in better personalisation.&lt;/p&gt;
&lt;p&gt;Pariser mentions three ways in which the personalisation technologies of this day are different from those of the past. First, for the very first time, individuals are alone in the filter bubble. While in traditional forms of personalisation there were various individuals who shared the same frame of reference, now there is a separate set of filters governing the dissemination of content to each individual.[18] Second, the personalisation technologies are entirely invisible now, and there is little that consumers can do to control or modify them.[19] Third, often the decision to be subject to these personalisation technologies is not an informed choice. A good example of this would be an individual’s geographical location.[20]&lt;/p&gt;
&lt;h3&gt;The neutrality of New Media?&lt;/h3&gt;
&lt;p dir="ltr"&gt;More and more, we have noticed personalisation technologies having an impact on how we consume news on the Internet. Google News, Facebook’s News Feed which tries to put together a dynamic feed for both personal and global stories, and Twitter’s trending hashtag feature, have brought forward these services are key drivers of an emerging news ecosystem. Initially, this new media was hailed as a natural consequence of the Internet which would enable greater public participation, allow journalists to find more stories and engage with the readers directly. &amp;nbsp;An illustration of the same could be seen in the way Internet based news media and social networking websites behaved in the aftermath of Israel’s attacks on a United Nations run school in Gaza strip. While much of the international Internet media covered the story, Israel’s home media did not cover the story. The only exception to this was the liberal Israeli news website Ha’aretz.[21] Network graph details of Twitter, for a few days immediately after the incident clearly show the social media manifestation of the event in the personalised cyberspace. It is clearly visible that when most of the word was re-tweeting news of this heinous act of Israel, Israeli’s hardly re-tweeted this news. In fact they were busty re-tweeting the news of rocket attacks on Israel.[22]&lt;/p&gt;
&lt;p&gt;The use of social media in newsmaking was hailed by many scholars as symptomatic of the decentralisation characteristic of the Internet. It has been seen as a movement towards greater grassroots participation, negating the ‘gatekeeping’ role traditionally played by editors. Thomas Poell and José van Dijck punch holes in the theory of social media and other online technologies as mere facilitators of user participation and translators of user preferences through Big Data analytics.[23] They quote T. Gillespie’s work, which speaks of the narrative of these online services as platforms offering “open, neutral, egalitarian and progressive support for activity.”[24]&lt;/p&gt;
&lt;p&gt;Pedro Domingos calls the overwhelming number of choices the defining problem of the information age, and machine learning and data analytics the largest part of its solution.[25] The primary function of algorithmic decision-making in the context of the consumption of content is to narrow down the choices. Domingos is more optimistic about the impact of these technologies, saying that the “last step of the decision is usually still for humans to make, but learners intelligently reduce the choices to something a human can manage.”[26] On the other hand, Pariser is more circumspect about the coercive result of machine learning algorithms. Whichever way we lean, we have to accept that a large part of what personalisation algorithms do is to select and prioritise content by categorising it on the basis of relevance and popularity.&lt;/p&gt;
&lt;p&gt;Poell and van Dijck call this a new knowledge logic, which in effect replaces human judgement (as earlier exercised by editors) with a kind of proxy decision-making based on data. Their main thesis is that there is little evidence to suggest that the latter is more democratic than the former, and that it creates new problems of its own. They go on to compare the practices of various services, including Facebook’s news feed and Twitter’s trending topics, and conclude that they prioritise breaking news stories over other kinds of content.[27] For instance, the algorithm for trending topics depends not on the volume but on the velocity of the tweets with the hashtag or term. It could be argued that, given this predilection, the algorithms will rarely prefer complex content. If we go by Lippmann and Dewey’s idea that the role of the Fourth Estate is to inform public debate and the accountability of those in positions of power, this aspect of Big Data algorithms does not correspond with that role.&lt;/p&gt;
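&lt;p&gt;The volume-versus-velocity distinction can be illustrated with a short, hypothetical sketch (the scoring function and the numbers are invented for illustration; Twitter’s actual trending algorithm is proprietary): a term trends when its recent rate of mentions jumps relative to its own baseline, so a sudden burst outranks a perennially popular tag.&lt;/p&gt;
&lt;pre&gt;
# Hypothetical velocity-based trending score: the ratio of a term's
# rate of mentions in the current window to its historical rate.

def trending_score(recent_count, baseline_count, window_minutes=15):
    recent_rate = recent_count / window_minutes
    # The +1 smooths the baseline so brand-new terms don't divide by zero.
    baseline_rate = (baseline_count + 1) / window_minutes
    return recent_rate / baseline_rate

counts = {
    "#breakingstory": (900, 30),    # sudden burst: high velocity
    "#dailyweather": (1200, 1150),  # higher volume, but flat velocity
}
ranked = sorted(counts, key=lambda t: trending_score(*counts[t]), reverse=True)
print(ranked)  # the bursty tag outranks the higher-volume one
&lt;/pre&gt;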
&lt;h3&gt;Quantified Audience&lt;/h3&gt;
&lt;p dir="ltr"&gt;Another aspect of use of Big Data and algorithms in New Media that requires attention is that the networked infrastructure enables a quantified audience. C W Anderson who has studied newsroom practices in the US looked at role played by audience quantification and rationalization in shifting newswork practices. He concluded that more and more, journalists are less autonomous in their news decisions and increasingly reliant on audience metrics as a supplement to news &amp;nbsp;judgment.[28] Poell and van Dijck review the the practices by some leading publications such a New York Times, L.A. Times and Huffington Post, and degree to which audience metrics &amp;nbsp;dictates editorial decisions. While New York Times seems to prioritise content on their social media portals based on expectation of spike in user traffic, L.A. Times goes one step further by developing content specifically aimed towards promoting greater social participation. Neither of these practices though compare to the reliance on SEO and SMO strategies of web-born news providers like Huffington Post. They have traffic editors who trawl the Internet for trending topics and popular search terms, the feedback from them dictates the content creation.[29]&lt;/p&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p dir="ltr"&gt;The above factors demonstrate that the idea of New Media leading to the Fourth Estate performing its democratic functions does not take into account the actual practices. This idea is based on the erroneous assumption that technology, in general and algorithms, in particular are neutral. While the emergence of New Media might have reduced the gatekeeping role played by the editors, its strong prioritisation of content that will be popular reduce the validity of arguments that it leads to more informed public discussion. As Pariser said, the traditional media scores over the New Media inasmuch as there is an existence of a standard of division between editorial decisionmaking and advertiser interest. While this standard is flouted by media houses all the time, it exists as a metric to aspire to and measure service providers against. The New Media performs many of the same functions and maybe it is time to evolve some principles and ethical standards that take into account the need for it to perform these democratic functions.&lt;/p&gt;
&lt;h3&gt;Endnotes&amp;nbsp;&lt;/h3&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[1]&lt;/sup&gt;&lt;/sup&gt; Eli Pariser, The Filter Bubble: What the Internet is
hiding from you (The Penguin Press, New York, 2011)&amp;nbsp;&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span class="MsoFootnoteReference"&gt;&lt;span class="MsoFootnoteReference"&gt;[2]&lt;/span&gt;&lt;/span&gt;&amp;nbsp;Walter Lippmann, Liberty and News (Harcourt, Brace
and Howe, New York 1920) available at&lt;a href="https://archive.org/details/libertyandnews01lippgoog"&gt;https://archive.org/details/libertyandnews01lippgoog&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[3]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, Public Opinion (Harcourt, Brace and
Howe, New York 1920) available at &lt;a href="http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html"&gt;http://xroads.virginia.edu/~Hyper2/CDFinal/Lippman/cover.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[4]&lt;/sup&gt;&lt;/sup&gt; Walter Lippmann, The Phantom Public (Transaction
Publishers, New York, 1925)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[5]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 35.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[6]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 36.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[7]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en"&gt;https://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world/transcript?language=en&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[8]&lt;/sup&gt;&lt;/sup&gt; Fenwick McKelvey, “Algorithmic Media Need Democratic
Methods: Why Publics Matter”, available at &lt;a href="http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf"&gt;http://www.fenwickmckelvey.com/wp-content/uploads/2014/11/2746-9231-1-PB.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[9]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1"&gt;http://mashable.com/2011/06/03/filters-eli-pariser/#9tIHrpa_9Eq1&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[10]&lt;/sup&gt;&lt;/sup&gt; Helen Ashman, Tim Brailsford, Alexandra Cristea, Quan
Z Sheng, Craig Stewart, Elaine Torns and Vincent Wade, “The ethical and social
implications of personalization technologies for e-learning” available at &lt;a href="http://www.sciencedirect.com/science/article/pii/S0378720614000524"&gt;http://www.sciencedirect.com/science/article/pii/S0378720614000524&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[11]&lt;/sup&gt;&lt;/sup&gt; Sergey Brin and Lawrence Page, “The Anatomy of a
Large-Scale Hypertextual Web Search Engine” available at &lt;a href="http://infolab.stanford.edu/pub/papers/google.pdf"&gt;http://infolab.stanford.edu/pub/papers/google.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[12]&lt;/sup&gt;&lt;/sup&gt; Ian Rogers, “The Google Pagerank Algorithm and How It
Works” available at &lt;a href="http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm"&gt;http://www.cs.princeton.edu/~chazelle/courses/BIB/pagerank.htm&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[13]&lt;/sup&gt;&lt;/sup&gt; Trygve Olson and Terry Nelson, “The Internet’s Impact
on Political Parties and Campaigns”, available at &lt;a href="http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942"&gt;http://www.kas.de/wf/doc/kas_19706-544-2-30.pdf?100526130942&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[14]&lt;/sup&gt;&lt;/sup&gt; Ian Witten, “Bias, privacy and and personalisation on
the web”, available at &lt;a href="http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf"&gt;http://www.cs.waikato.ac.nz/~ihw/papers/07-IHW-Bias,privacyonweb.pdf&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[15]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[16]&lt;/sup&gt;&lt;/sup&gt; &lt;a href="https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/"&gt;https://www.americanpressinstitute.org/publications/reports/survey-research/social-demographic-differences-news-habits-attitudes/&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[17]&lt;/sup&gt;&lt;/sup&gt; Charles Heatwole, “Culture: A Geographical Perspective”
available at &lt;a href="http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html"&gt;http://www.p12.nysed.gov/ciai/socst/grade3/geograph.html&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[18]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 10.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[19]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Id&lt;/em&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[20]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
1 at 11.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[21]&lt;/sup&gt;&lt;/sup&gt; Paul Mason, “Why Israel is losing the social media
war over Gaza?” available at &lt;a href="http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182"&gt;http://blogs.channel4.com/paul-mason-blog/impact-social-media-israelgaza-conflict/1182&lt;/a&gt;.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[22]&lt;/sup&gt;&lt;/sup&gt; Gilad Lotan, Israel, Gaza, War &amp;amp; Data: Social
Networks and the Art of Personalizing Propaganda available at &lt;a href="http://www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html"&gt;www.huffingtonpost.com/entry/israel-gaza-war-social-networks-data_b_5658557.html&lt;/a&gt;&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[23]&lt;/sup&gt;&lt;/sup&gt; Thomas Poell and José van Dijck, “Social Media and
Journalistic Independence” in Media Independence: Working with Freedom or
Working for Free?, edited by James Bennett &amp;amp; Niki Strange. (Routledge,
London, 2015)&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[24]&lt;/sup&gt;&lt;/sup&gt; T Gillespie, “The politics of ‘platforms,” in New
Media &amp;amp; Society (Volume 12, Issue 3).&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[25]&lt;/sup&gt;&lt;/sup&gt; Pedro Domingos, The Master Algorithm: How the quest
for the ultimate learning machine will re-make the world (Basic Books, New
York, 2015) at 38.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[26]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Ibid&lt;/em&gt; at 40.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[27]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra&lt;/em&gt; Note
23.&lt;/p&gt;
&lt;p class="normal"&gt;&lt;sup&gt;&lt;sup&gt;[28]&lt;/sup&gt;&lt;/sup&gt; C W Anderson, Between creative and quantified
audiences: Web metrics and changing patterns of newswork in local US newsrooms,
available at &lt;a href="https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms"&gt;https://www.academia.edu/10937194/Between_Creative_And_Quantified_Audiences_Web_Metrics_and_Changing_Patterns_of_Newswork_in_Local_U.S._Newsrooms&lt;/a&gt;&lt;/p&gt;
&lt;p dir="ltr"&gt;
&lt;sup&gt;&lt;sup&gt;[29]&lt;/sup&gt;&lt;/sup&gt; &lt;em&gt;Supra &lt;/em&gt;Note 23.&lt;/p&gt;
&lt;p dir="ltr"&gt;&lt;span id="docs-internal-guid-24b4db2a-a606-d425-16ff-1d76b980367d"&gt;&lt;br /&gt;&lt;/span&gt;&lt;/p&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms'&gt;http://editors.cis-india.org/internet-governance/new-media-personalisation-and-the-role-of-algorithms&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Human Rights</dc:subject>
    
    
        <dc:subject>Big Data</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Machine Learning</dc:subject>
    
    
        <dc:subject>Algorithms</dc:subject>
    
    
        <dc:subject>New Media</dc:subject>
    

   <dc:date>2017-01-16T07:20:52Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle">
    <title>New Approaches to Information Privacy – Revisiting the Purpose Limitation Principle</title>
    <link>http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle</link>
    <description>
        &lt;b&gt;Article on Aadhaar throwing light on privacy and data protection.&lt;/b&gt;
        
&lt;p&gt;&amp;nbsp;&lt;/p&gt;
&lt;p&gt;This was &lt;a class="external-link" href="http://www.digitalpolicy.org/revisiting-the-principles-of-purpose-limitation-under-existing-data-protection-norms/"&gt;published in Digital Policy Portal&lt;/a&gt; on July 13, 2016.&lt;/p&gt;
&lt;hr /&gt;
&lt;h3&gt;Introduction&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Last year, Mukul Rohatgi, the Attorney General of India, called into question existing jurisprudence of the last 50 years on the constitutional validity of the right to privacy.&lt;sup&gt;1&lt;/sup&gt; Mohatgi was rebutting the arguments on privacy made against Aadhaar, the unique identity project initiated and implemented in the country without any legislative mandate.&lt;sup&gt;2&lt;/sup&gt; The question of the right to privacy becomes all the more relevant in the context of events over the last few years—among them, the significant rise in data collection by the state through various e-governance schemes,&lt;sup&gt;3&lt;/sup&gt; systematic access to personal data by various wings of the state through a host of surveillance and law enforcement initiatives launched in the last decade,&lt;sup&gt;4&lt;/sup&gt; the multifold increase in the number of Indians online, and the ubiquitous collection of personal data by private parties.&lt;sup&gt;5&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;These developments have led to a call for a comprehensive privacy legislation in India and the adoption of the National Privacy Principles as laid down by the Expert Committee led by Justice AP Shah.&lt;sup&gt;6&lt;/sup&gt; There are privacy-protection legislation currently in place such as the Information Technology Act, 2000 (IT Act), which was enacted to govern digital content and communication and provide legal recognition to electronic transactions. This legislation has provisions that can safeguard—and dilute—online privacy. At the heart of the data protection provisions in the IT Act lies section 43A and the rules framed under it, i.e., Reasonable security practices and procedures and sensitive personal data information.&lt;sup&gt;7&lt;/sup&gt;Section 43A mandates that body corporates who receive, possess, store, deal, or handle any personal data to implement and maintain ‘reasonable security practices’, failing which, they are held liable to compensate those affected. Rules drafted under this provision also mandated a number of data protection obligations on corporations such the need to seek consent before collection, specifying the purposes of data collection, and restricting the use of data to such purposes only. There have been questions raised about the validity of the Section 43A Rules as they seek to do much more than mandate in the parent provisions, Section 43A— requiring entities to maintain reasonable security practices.&lt;/p&gt;
&lt;h3&gt;Privacy as control?&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Even setting aside the issue of legal validity, the kind of data protection framework envisioned by Section 43A rules is proving to be outdated in the context of how data is now being collected and processed. The focus of Section 43 A Rules—as well as that of draft privacy legislations in India&lt;sup&gt;8&lt;/sup&gt;—is based on the idea of individual control. Most apt is Alan Westin’s definition of privacy: “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to other.”&lt;sup&gt;9&lt;/sup&gt; Westin and his followers rely on the normative idea of “informational self- determination”, the notion of a pure, disembodied, and atomistic self, capable of making rational and isolated choices in order to assert complete control over personal information. More and more this has proved to be a fiction especially in a networked society.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Much before the need for governance of information technologies had reached a critical mass in India, Western countries were already dealing with the implications of the use of these technologies on personal data. In 1973, the US Department of Health, Education and Welfare appointed a committee to address this issue, leading to a report called ‘Records, Computers and Rights of Citizens.’&lt;sup&gt;10&lt;/sup&gt; The Committee’s mandate was to “explore the impact of computers on record keeping about individuals and, in addition, to inquire into, and make recommendations regarding, the use of the Social Security number.” The Report articulated five principles which were to be the basis of fair information practices: transparency; use limitation; access and correction; data quality; and security. Building upon these principles, the Committee of Ministers of the Organization for Economic Cooperation and Development (OECD) arrived at the Guidelines on the Protection of Privacy and Transborder Flows of Personal Data in 1980.&lt;sup&gt;11&lt;/sup&gt; These principles— Collection Limitation, Data Quality, Purpose Specification, Use Limitation, Security Safeguards, Openness, Individual Participation and Accountability—are what inform most data protection regulations today including the APEC Framework, the EU Data Protection Directive, and the Section 43A Rules and Justice AP Shah Principles in India.&lt;/p&gt;
&lt;p&gt;Fred Cate describes the import of these privacy regimes as follows:&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;“All of these data protection instruments reflect the same approach: tell individuals what data you wish to collect or use, give them a choice, grant them access, secure those data with appropriate technologies and procedures, and be subject to third-party enforcement if you fail to comply with these requirements or individuals’ expressed preferences”&lt;sup&gt;12&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;This is in line with Alan Westin’s idea of privacy exercised through individual control. Therefore the focus of these principles is on empowering the individuals to exercise choice, but not on protecting individuals from harmful or unnecessary practices of data collection and processing. The author of this article has earlier written&lt;sup&gt;13&lt;/sup&gt; about the sheer inefficacy of this framework which places the responsibility on individuals. Other scholars like Daniel Solove,&lt;sup&gt;14&lt;/sup&gt; Jonathan Obar&lt;sup&gt;15&lt;/sup&gt; and Fred Cate&lt;sup&gt;16&lt;/sup&gt; have also written about the failure of traditional data protection practices of notice and consent. While these essays dealt with the privacy principles of choice and informed consent, this paper will focus on the principles of purpose limitation.&lt;/p&gt;
&lt;h3&gt;Purpose Limitation and Impact of Big Data&lt;/h3&gt;
&lt;p&gt;The principles of purpose limitation, or purpose specification, seek to ensure the following four objectives (a minimal illustrative sketch follows the list):&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Personal information collected and processed should be adequate and relevant to the purposes for which they are processed.&lt;/li&gt;
&lt;li&gt;Entities collect, process, disclose, make available, or otherwise use personal information only for the stated purposes.&lt;/li&gt;
&lt;li&gt;In case of a change in purpose, the data subject needs to be informed and their consent has to be obtained.&lt;/li&gt;
&lt;li&gt;After personal information has been used in accordance with the identified purpose, it has to be destroyed as per the identified procedures.&lt;/li&gt;&lt;/ol&gt;
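&lt;p style="text-align: justify;"&gt;To make these objectives concrete, here is a minimal sketch in Python of purpose limitation enforced as runtime checks. All names (PersonalRecord, PurposeViolation, the example purposes) are invented for illustration and are not drawn from any statute or framework discussed here.&lt;/p&gt;
&lt;pre&gt;
# Hypothetical sketch: the four purpose-limitation objectives as code checks.
class PurposeViolation(Exception):
    pass

class PersonalRecord:
    def __init__(self, data, stated_purposes):
        self.data = data
        self.stated_purposes = set(stated_purposes)  # fixed at collection
        self.destroyed = False

    def use(self, purpose):
        # (a), (b): data may be used only for the stated purposes.
        if self.destroyed:
            raise PurposeViolation("record has already been destroyed")
        if purpose not in self.stated_purposes:
            raise PurposeViolation("no consent for purpose: " + purpose)
        return self.data

    def add_purpose(self, purpose, fresh_consent=False):
        # (c): a change of purpose requires informing the subject
        # and obtaining fresh consent.
        if not fresh_consent:
            raise PurposeViolation("new purpose requires renewed consent")
        self.stated_purposes.add(purpose)

    def destroy(self):
        # (d): destroy once the identified purpose has been served.
        self.data = None
        self.destroyed = True

record = PersonalRecord({"phone": "000"}, ["billing"])
record.use("billing")                           # permitted
record.add_purpose("analytics", fresh_consent=True)
record.use("analytics")                         # permitted after fresh consent
record.destroy()
&lt;/pre&gt;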
&lt;p style="text-align: justify;"&gt;The purpose limitation along with the data minimisation principle—which requires that no more data may be processed than is necessary for the stated purpose—aim to limit the use of data to what is agreed to by the data subject. These principles are in direct conflict with new technology which relies on ubiquitous collection and indiscriminate uses of data. The main import of Big Data technologies on the inherent value in data which can be harvested not by the primary purposes of data collection but through various secondary purposes which involve processing of the data repeatedly.&lt;sup&gt;17&lt;/sup&gt;Further, instead to destroying the data when its purpose has been achieved, the intent is to retain as much data as possible for secondary uses. Importantly, as these secondary uses are of an inherently unanticipated nature, it becomes impossible to account for it at the stage of collection and providing the choice to the data subject.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Followers of the discourse on Big Data would be well aware of its potential impacts on privacy. De-identification techniques to protect the identities of individuals in dataset face a threat from an increase in the amount of data available either publicly or otherwise to a party seeking to reverse-engineer an anonymised dataset to re-identify individuals. &lt;sup&gt;18&lt;/sup&gt; Further, Big Data analytics promise to find patterns and connections that can contribute to the knowledge available to the public to make decisions. What is also likely is that it will lead to revealing insights about people that they would have preferred to keep private.&lt;sup&gt;19&lt;/sup&gt;In turn, as people become more aware of being constantly profiled by their actions, they will self-regulate and ‘discipline’ their behaviour. This can lead to a chilling effect.&lt;sup&gt;20&lt;/sup&gt; Meanwhile, Big Data is also fuelling an industry that incentivises businesses to collect more data, as it has a high and growing monetary value. However, Big Data also promises a completely new kind of knowledge that can prove to be revolutionary in fields as diverse as medicine, disaster-management, governance, agriculture, transport, service delivery, and decision-making.&lt;sup&gt;21&lt;/sup&gt; As long as there is a sufficiently large and diverse amount of data, there could be invaluable insights locked in it, accessing which can provide solutions to a number of problems. In light of this, it is important to consider what kind of regulatory framework is most suitable which could facilitate some of the promised benefits of Big Data and at the same time mitigate its potential harm. This, coupled with the fact that the existing data protection principles have, by most accounts, run their course, makes the examination of alternative frameworks even more important. This article will examine some alternate proposals made to the existing framework of purpose limitation below.&lt;/p&gt;
&lt;h3&gt;Harms-based approach&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Some scholars like Fred Cate&lt;sup&gt;22&lt;/sup&gt; and Daniel Solove&lt;sup&gt;23&lt;/sup&gt; have argued that there is a need for the primary focus of data protection law to move from control at the stage of data collection to actual use cases. In his article on the failure of Fair Information Practice Principles,&lt;sup&gt;24&lt;/sup&gt;Cate puts forth a proposal for ‘Consumer Privacy Protection Principles.’ Cate envisions a more interventionist role of the data protection authorities by regulating information flows when required, in order to protect individuals from risky or harmful uses of information. Cate’s attempt is to extend the principles of consumer protection law of prevention and remedy of harms.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;In a re-examination of the OECD Privacy Principles, Cate and Viktor Mayer Schöemberger attempt to discard the use of personal data to only purposes specified. They felt that restricting the use of personal to only specified purposes could significantly threaten various research and beneficial uses of Big Data. Instead of articulating a positive obligations of what personal data collected could be used for, they attempt to arrive at a negative obligation of use-cases prevented by law. Their working definition of the Use specification principle broaden the scope of use cases by only preventing use of data “if the use is fraudulent, unlawful, deceptive or discriminatory; society has deemed the use inappropriate through a standard of unfairness; the use is likely to cause unjustified harm to the individual; or the use is over the well-founded objection of the individual, unless necessary to serve an over-riding public interest, or unless required by law.”&lt;sup&gt;25&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;While most standards in the above definition have established understanding in jurisprudence, the concept of unjustifiable harm is what we are interested in. Any theory of harms-based approach goes back to John Stuart Mill’s dictum that the only justifiable purpose to exert power over the will of an individual is to prevent harm to others. Therefore, any regulation that seeks to control or prevent autonomy of individuals (in this case, the ability of individuals to allow data collectors to use their personal data, and the ability of data collectors to do so, without any limitation) must clearly demonstrate the harm to the individuals in question.&lt;/p&gt;
&lt;p&gt;Fred Cate articulates the following steps to identify tangible harm and respond to its presence (a decision-flow sketch follows the list):&lt;sup&gt;26&lt;/sup&gt;&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li&gt;Focus on Use — Actual use of the data should be considered, not mere possession. The assumption is that the collection, possession, or transfer of information does not in itself significantly harm people; rather, it is the use of information following such collection, possession, or transfer that does.&lt;/li&gt;
&lt;li&gt;Proportionality — Any regulatory measure must be proportional to the likelihood and severity of the harm identified.&lt;/li&gt;
&lt;li&gt;Per se Harmful Uses — Uses which are always harmful must be prohibited by law.&lt;/li&gt;
&lt;li&gt;Per se not Harmful Uses — If uses can be considered inherently not harmful, they should not be regulated.&lt;/li&gt;
&lt;li&gt;Sensitive Uses — Where uses are neither per se harmful nor per se not harmful, individual consent must be sought for using the data for those purposes.&lt;/li&gt;&lt;/ol&gt;
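&lt;p style="text-align: justify;"&gt;Read as a decision procedure, Cate’s steps can be sketched in a few lines of Python. This is only an illustration of the flow, not Cate’s own enumeration: the category sets, risk thresholds and tier names below are invented placeholders.&lt;/p&gt;
&lt;pre&gt;
import bisect

# Invented placeholder categories and thresholds, for illustration only.
PER_SE_HARMFUL = {"identity_theft", "unlawful_discrimination"}
PER_SE_NOT_HARMFUL = {"aggregate_statistics"}

def regulate_use(use, likelihood, severity, has_consent):
    # (a) Focus on use: a proposed use is examined, never mere possession.
    if use in PER_SE_HARMFUL:          # (c) always harmful: prohibited
        return "prohibited by law"
    if use in PER_SE_NOT_HARMFUL:      # (d) inherently not harmful
        return "not regulated"
    if not has_consent:                # (e) sensitive middle ground
        return "blocked until consent is obtained"
    # (b) Proportionality: the response scales with likelihood x severity
    # (both on a 0..1 scale); bisect picks the tier the risk falls into.
    risk = likelihood * severity
    tiers = ["light-touch audit", "mandatory safeguards", "prior approval"]
    return tiers[bisect.bisect([0.2, 0.6], risk)]

print(regulate_use("credit_scoring", 0.5, 0.9, has_consent=True))
# mandatory safeguards
&lt;/pre&gt;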
&lt;p style="text-align: justify;"&gt;The proposal by Cate argues for what is called a ‘use based system’, which is extremely popular with American scholars. Under this system, data collection itself is not subject to restrictions; rather, only the use of data is regulated. This argument has great appeal for both businesses who can reduce their overheads significantly if consent obligations are done away with as long as they use the data in ways which are not harmful, as well as critics of the current data protection framework which relies on informed consent. Lokke Moerel explains the philosophy of ‘harms based approach’ or ‘use based system’ in United States by juxtaposing it against the ‘rights based approach’ in Europe.&lt;sup&gt;27&lt;/sup&gt; In Europe, rights of individuals with regard to processing of their personal data is a fundamental human right and therefore, a precautionary principle is followed with much greater top-down control upon data collection. However, in the United States, there is a far greater reliance on market mechanisms and self-regulating organisations to check inappropriate processing activities, and government intervention is limited to cases where a clear harm is demonstrable.&lt;sup&gt;28&lt;/sup&gt;&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Continuing research by the Centre for Information Policy Leadership under its Privacy Risk Framework Project looks at a system of articulating what harms and risks arising from use of collected data. They have arrived a matrix of threats and harms. Threats are categorised as —a) inappropriate use of personal information and b) personal information in the wrong hands. More importantly for our purposes, harms are divided into: a) tangible harms which are physical or economic in nature (bodily harm, loss of liberty, damage to earning power and economic interests); b) intangible harms which can be demonstrated (chilling effects, reputational harm, detriment from surveillance, discrimination and intrusion into private life); and c) societal harm (damage to democratic institutions and loss of social trust).&lt;sup&gt;29&lt;/sup&gt;For any harms-based system, a matrix like above needs to emerge clearly so that regulation can focus on mitigating practices leading to the harms.&lt;/p&gt;
&lt;h3&gt;Legitimate interests&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Lokke Moerel and Corien Prins, in their article “Privacy for Homo Digitalis – Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”&lt;sup&gt;30&lt;/sup&gt; use the ideal of responsive regulation which considers empirically observable practices and institutions while determining the regulation and enforcement required. They state that current data protection frameworks—which rely on mandating some principles of how data has to be processed—is exercised through merely procedural notification and consent requirements. Further, Moerel and Prins feel that data protection law cannot only involve a consideration of individual interest but also needs to take into account collective interest. Therefore, the test must be a broader assessment than merely the purpose limitation articulating the interests of the parties directly involved, but whether a legitimate interest is achieved.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Legitimate interest has been put forth as an alternative to the purpose limitation. Legitimate is not a new concept and has been a part of the EU Data Protection Directive and also finds a place in the new General Data Protection Regulation. Article 7 (f) of the EU Directive&lt;sup&gt;31&lt;/sup&gt; provided for legitimate interest balanced against the interests or fundamental rights and freedoms of the data subject as the last justifiable reason for use of data. Due to confusion in its interpretation, the Article 29 Working Party, in 2014,&lt;sup&gt;32&lt;/sup&gt;looked into the role of legitimate interest and arrived at the following factors to determine the presence of a legitimate interest— a) the status of the individual (employee, consumer, patient) and the controller (employer, company in a dominant position, healthcare service); b) the circumstances surrounding the data processing (contract relationship of data subject and processor); c) the legitimate expectations of the individual.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;Federico Ferretti has criticised the legitimate interest principle as vague and ambiguous. The balancing of legitimate interest in using the data against fundamental rights and freedoms of the data subject gives the data controllers some degree of flexibility in determining whether data may be processed; however, this also reduces the legal certainty that data subject have of their data not being used for purposes they have not agreed to.&lt;sup&gt;33&lt;/sup&gt;However, it is this paper’s contention that it is not the intent of the legitimate interest criteria but the lack of consensus on its application which creates an ambiguity. Moerel and Prins articulate a test for using legitimate interest which is cognizant of the need to use data for the purpose of Big Data processing, as well as ensuring that the rights of data subjects are not harmed.&lt;/p&gt;
&lt;p style="text-align: justify;"&gt;As demonstrated earlier, the processing of data and its underlying purposes have become exceedingly complex and the conventional tool to describe these processes ‘privacy notices’ are too lengthy, too complex and too profuse in numbers to have any meaningful impact.&lt;sup&gt;34&lt;/sup&gt;The idea of information self-determination, as contemplated by Westin in American jurisprudence, is not achieved under the current framework. Moerel and Prins recommend five factors&lt;sup&gt;35&lt;/sup&gt; as relevant in determining the legitimate interest. Of the five, the following three are relevant to the present discussion:&lt;/p&gt;
&lt;ol style="list-style-type: lower-alpha;"&gt;
&lt;li style="text-align: justify;"&gt;Collective Interest — A cost-benefit analysis should be conducted, which examines the implications for privacy for the data subjects as well as the society, as a whole.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;The nature of the data — Rather than having specific categories of data, the nature of data needs to be assessed contextually to determine legitimate interest.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Contractual relationship and consent not independent grounds — This test has two parts. First, in case of contractual relationship between data subject and data controller: the more specific the contractual relationship, the more restrictions apply to the use of the data. Second, consent does not function as a separate principle which, once satisfied, need not be revisited. The nature of the consent (opportunities made available to data subject, opt in/opt out, and others) will continue to play a role in determining legitimate interest.&lt;/li&gt;&lt;/ol&gt;
&lt;h3&gt;Conclusion&lt;/h3&gt;
&lt;p style="text-align: justify;"&gt;Replacing the purpose limitation principles with a use-based system as articulated above poses the danger of allowing governments and the private sector to carry out indiscriminate data collection under the blanket guise that any and all data may be of some use in the future. The harms-based approach has many merits and there is a stark need for more use of risk assessments techniques and privacy impact assessments in data governance. However, it is important that it merely adds to the existing controls imposed at data collection, and not replace them in their entirety. On the other hand, the legitimate interests principle, especially as put forth by Moerel and Prins, is more cognizant of the different factors at play — the inefficacy of existing purpose limitation principles, the need for businesses to use data for purposes unidentified at the stage of collection, and the need to ensure that it is not misused for indiscriminate collection and purposes. However, it also poses a much heavier burden on data controllers to take into account various factors before determining legitimate interest. If legitimate interest has to emerge as a realistic alternative to purpose limitation, there needs to be greater clarity on how data controllers must apply this principle.&lt;/p&gt;
&lt;h3&gt;Endnotes&lt;/h3&gt;
&lt;ol&gt;
&lt;li style="text-align: justify;"&gt;Prachi Shrivastava, “Privacy not a fundamental right, argues Mukul Rohatgi for Govt as Govt affidavit says otherwise,” Legally India, Jyly 23, 2015, http://www.legallyindia.com/Constitutional-law/privacy-not-a-fundamental-right-argues-mukul-rohatgi-for-govt-as-govt-affidavit-says-otherwise.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt; Rebecca Bowe, “Growing Mistrust of India’s Biometric ID Scheme,” Electronic Frontier Foundation, May 4, 2012, https://www.eff.org/deeplinks/2012/05/growing-mistrust-india-biometric-id-scheme.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Lisa Hayes, “Digital India’s Impact on Privacy: Aadhaar numbers, biometrics, and more,” Centre for Democracy and Technology, January 20, 2015, https://cdt.org/blog/digital-indias-impact-on-privacy-aadhaar-numbers-biometrics-and-more/.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;“India’s Surveillance State,” Software Freedom Law Centre, http://sflc.in/indias-surveillance-state-our-report-on-communications-surveillance-in-india/.&lt;/li&gt;
&lt;li&gt;“Internet Privacy in India,” Centre for Internet and Society, http://cis-india.org/telecom/knowledge-repository-on-internet-access/internet-privacy-in-india.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Vivek Pai, “Indian Government says it is still drafting privacy law, but doesn’t give timelines,” Medianama, May 4, 2016, http://www.medianama.com/2016/05/223-government-privacy-draft-policy/.&lt;/li&gt;
&lt;li&gt;Information Technology (Intermediaries Guidelines) Rules, 2011,&lt;br /&gt; http://deity.gov.in/sites/upload_files/dit/files/GSR314E_10511%281%29.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Discussion Points for the Meeting to be taken by Home Secretary at 2:30 pm on 7-10-11 to discuss the drat Privacy Bill, http://cis-india.org/internet-governance/draft-bill-on-right-to-privacy.&lt;/li&gt;
&lt;li&gt;Alan Westin, Privacy and Freedom (New York: Atheneum, 1967).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;US Secretary’s Advisory Committee on Automated Personal Data Systems, Records, Computers and the Rights of Citizens, http://www.justice.gov/opcl/docs/rec-com-rights.pdf.&lt;/li&gt;
&lt;li&gt;OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data, http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Fred Cate, “The Failure of Information Practice Principles,” in Consumer Protection in the Age of the Information Economy, ed. Jane K. Winn (Burlington: Aldershot, Hants, England, 2006) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1156972.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Amber Sinha and Scott Mason, “A Critique of Consent in Informational Privacy,” Centre for Internet and Society, January 11, 2016, http://cis-india.org/internet-governance/blog/a-critique-of-consent-in-information-privacy.&lt;/li&gt;
&lt;li&gt;Daniel Solove, “Privacy Self-Management and the Consent Dilemma,” Harvard Law Review 126 (2013): 1880.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Jonathan Obar, “Big Data and the Phantom Public: Walter Lippmann and the fallacy of data privacy self management,” Big Data and Society 2(2), (2015), doi: 10.1177/2053951715608876.&lt;/li&gt;
&lt;li&gt;Supra Note 12.&lt;/li&gt;
&lt;li&gt;Supra Note 14.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Paul Ohm, “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization” available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1450006; Arvind Narayanan and Vitaly Shmatikov, “Robust De-anonymization of Large Sparse Datasets” available at https://www.cs.utexas.edu/~shmat/shmat_oak08netflix.pdf.&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;D. Hirsch, “That’s Unfair! Or is it? Big Data, Discrimination and the FTC’s Unfairness Authority,” Kentucky Law Journal, Vol. 103, available at: http://www.kentuckylawjournal.org/wp-content/uploads/2015/02/103KyLJ345.pdf&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;A Marthews and C Tucker, “Government Surveillance and Internet Search Behavior”, available at http://ssrn.com/abstract=2412564; Danah Boyd and Kate Crawford, “Critical Questions for Big Data: Provocations for a cultural, technological, and scholarly phenomenon”, Information, Communication &amp;amp; Society, Vol. 15, Issue 5, (2012).&lt;/li&gt;
&lt;li style="text-align: justify;"&gt;Scott Mason, “Benefits and Harms of Big Data”, Centre for Internet and Society, available at http://cis-india.org/internet-governance/blog/benefits-and-harms-of-big-data#_ftn37.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Fair Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1882.&lt;/li&gt;
&lt;li&gt;Cate, “The Failure of Fair Information Practice Principles.”&lt;/li&gt;
&lt;li&gt;Fred Cate and Viktor Mayer-Schönberger, “Notice and Consent in a World of Big Data,” International Data Privacy Law 3(2) (2013): 69.&lt;/li&gt;
&lt;li&gt;Solove, “Privacy Self-Management and the Consent Dilemma,” 1883.&lt;/li&gt;
&lt;li&gt;Lokke Moerel, “Netherlands: Big Data Protection: How To Make The Draft EU Regulation On Data Protection Future Proof”, Mondaq, March 11, 2014, http://www.mondaq.com/x/298416/data+protection/Big+Data+Protection+How+To+Make+The+Dra%20ft+EU+Regulation+On+Data+Protection+Future+Proof%20al%20Lecture.&lt;/li&gt;
&lt;li&gt;Moerel, “Netherlands: Big Data Protection.”&lt;/li&gt;
&lt;li&gt;Centre for Information Policy Leadership, “A Risk-based Approach to Privacy: Improving Effectiveness in Practice,” Hunton and Williams LLP, June 19, 2014, https://www.informationpolicycentre.com/uploads/5/7/1/0/57104281/white_paper_1-a_risk_based_approach_to_privacy_improving_effectiveness_in_practice.pdf.&lt;/li&gt;
&lt;li&gt;Lokke Moerel and Corien Prins, “Privacy for Homo Digitalis: Proposal for a new regulatory framework for data protection in the light of Big Data and Internet of Things”, Social Science Research Network, May 25, 2016, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2784123.&lt;/li&gt;
&lt;li&gt;EU Directive 95/46/EC – The Data Protection Directive, https://www.dataprotection.ie/docs/EU-Directive-95-46-EC-Chapter-2/93.htm.&lt;/li&gt;
&lt;li&gt;Article 29 Data Protection Working Party, “Opinion 06/2014 on the notion of legitimate interests of the data controller under Article 7 of Directive 95/46/EC,” http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2014/wp217_en.pdf.&lt;/li&gt;
&lt;li&gt;Federico Ferretti, “Data protection and the legitimate interest of data controllers: Much ado about nothing or the winter of rights?,” Common Market Law Review 51 (2014): 1-26, http://bura.brunel.ac.uk/bitstream/2438/9724/1/Fulltext.pdf.&lt;/li&gt;
&lt;li&gt;Sinha and Mason, “A Critique of Consent in Informational Privacy.”&lt;/li&gt;
&lt;li&gt;Moerel and Prins, “Privacy for Homo Digitalis.”&lt;/li&gt;&lt;/ol&gt;

        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle'&gt;http://editors.cis-india.org/internet-governance/blog/digital-policy-portal-july-13-2016-new-approaches-to-information-privacy-revisiting-the-purpose-limitation-principle&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2016-11-09T13:54:28Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>


    <item rdf:about="http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime">
    <title>India’s Data Protection Regime Must Be Built Through an Inclusive and Truly Co-Regulatory Approach</title>
    <link>http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime</link>
    <description>
&lt;b&gt;We must move India past its existing consultative processes for rule-making, which often prompt stakeholders to take adversarial and extremely one-sided positions.&lt;/b&gt;
        &lt;p style="text-align: justify; "&gt;The article was published in the &lt;a class="external-link" href="https://thewire.in/201123/inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime/"&gt;Wire&lt;/a&gt; on December 1, 2017.&lt;/p&gt;
&lt;hr style="text-align: justify; " /&gt;
&lt;p style="text-align: justify; "&gt;Earlier this week, the Ministry of Electronics and Information Technology released &lt;a title="a white paper" href="http://meity.gov.in/white-paper-data-protection-framework-india-public-comments-invited" target="_blank"&gt;&lt;span style="text-decoration: underline;"&gt;a white paper&lt;/span&gt;&lt;/a&gt; by a “committee of experts” appointed a few months back led by former Supreme Court judge, Justice B.N. Srikrishna, on a data protection framework for India. The other members of the committee are Aruna Sundararajan, Ajay Bhushan Pandey, Ajay Kumar, Rajat Moona, Gulshan Rai, Rishikesha Krishnan, Arghya Sengupta and Rama Vedashree.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;With the exception of Justice Srikrishna and Krishnan, the rest of the committee members are either part of the government or part of organisations that have worked closely with the government on separate issues relating to technology, with some of them also having taken positions against the fundamental right to privacy.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Refreshingly, the committee and the ministry has opted for a consultative process outlining the issues they felt relevant to a data protection law, and espousing provisional views on each of the issues and seeking public responses on them. The paper states that on the basis of the response received, the committee will conduct public consultations with citizens and stakeholders. Legitimate concerns &lt;a title="were raised earlier" href="http://indianexpress.com/article/india/citizens-group-questions-data-privacy-panel-composition-aadhaar-4924220/" target="_blank"&gt;&lt;span style="text-decoration: underline;"&gt;were raised earlier&lt;/span&gt;&lt;/a&gt; about the constitution of the committee and the lack of inclusion of different voices on it. However, if the committee follows an inclusive, transparent and consultative process in the drafting of the data protection legislation, it would go a long way in addressing these concerns.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The paper seeks response to as many as 231 questions covering a broad spectrum of issues relating to data protection – including definitions of terms such as personal data, sensitive personal data, processing, data controller and processor – the purposes for which exemptions should be available, cross border flow of data, data localisation and the right to be forgotten.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;While a thorough analysis of all the issues up for discussion would require a more detailed evaluation, at this point, the process of rule-making and the kind of governance model envisaged in this paper are extremely important issues to consider.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;In part IV of the paper on ‘Regulation and Enforcement’, there is a discussion on a co-regulatory approach for the governance of data protection in India. The paper goes so far as to provisionally take a view that it may be appropriate to pursue a co-regulatory approach which involves “a spectrum of frameworks involving varying levels of government involvement and industry participation”.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, the discussion on co-regulation in the white paper is limited to the section on regulation and enforcement. A truly inclusive and co-regulatory approach ought to involve active participation from non-governmental stakeholders in the rule-making process itself. In India, unfortunately, we lack a strong tradition of lawmakers engaging in public consultations and participation of other stakeholders in the process of drafting laws and regulation. One notable exception has been the Telecom Regulatory Authority of India (TRAI), which periodically seeks public responses on consultation papers it releases and also holds open houses occasionally. It is heartening to see the committee of experts and the ministry follow a similar process in this case.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;However, these are essentially examples of ‘notice and comment’ rulemaking where the government actors stand as neutral arbiters who must decide on written briefs submitted to it in response to consultation papers or draft regulations that it notifies to the public.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This process is, by its very nature, adversarial, and often means that different stakeholders do not reveal their true priorities but must take extreme one-sided positions, as parties tend to at the beginning of a negotiation.This also prevents the stakeholders from sharing an honest assessment of the actual regulatory challenge they may face, lest it undermine their position.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;This often pits industry and public interest proponents against each other, sometimes also leading to different kinds of industry actors in adversarial positions. An excellent example of this kind of posturing, also relevant to this paper, is visible in the responses submitted to the TRAI on the its recent consultation paper on ‘Privacy, Security and Ownership of data in Telecom Sector’. One of the more contentious issue raised by the TRAI was about the adequacy of the existing data protection framework under the license agreement with telecom companies, and if there was a need to bring about greater parity in regulation between telecom companies and over-the-top (OTT) service providers. Rather than facilitating an actual discussion on what is a complex regulatory issues, and the real practical challenges it poses for the stakeholders, this form of consultation simply led to the telecom companies and OTT services providers submitting contrasting extreme positions without much scope for engagement between two polar arguments.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;A truly co-regulatory approach which also extends to rulemaking would involve collaborative processes which are far less adversarial in their design and facilitate joint problem solving through multiple face to face meetings. Such processes are also more likely to lead to better rule making by using the more specialised knowledge of the different stakeholders about technology, domain-specific issues, industry realities and low cost solutions. Further, by bringing the regulated parties into the rulemaking process, the ownership of the policy is shared, often leading to better compliance.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Within the domain of data protection law itself, we have a few existing models of robust co-regulation which entail the involvement of stakeholders not just at the level of enforcement but also at the level of drafting. The oldest and most developed form of this kind of privacy governance can be seen in the study of the Dutch privacy statute. It involved a central privacy legislations with broad principles, sectoral industry-drafted “codes of conduct”, government evaluations and certifications of these codes; and a legal safe harbour for those companies that follow the approved code for their sector. Over a period of 20 years, the Dutch experience saw the approval of 20 sectoral codes across a variety of sectors such as banking, insurance, pharmaceuticals, recruitment and medical research.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;Other examples of policies espousing this approach include two documents from the US – first, a draft bill titled ‘Commercial Privacy Bill of Rights Act of 2011’ introduced before the Congress by John McCain and John Kerry, and second, a White House Paper titled ‘Consumer Data Privacy In A Networked World: A Framework For Protecting Privacy And Promoting Innovation In The Global Digital Economy’ released by the Obama administration. Neither of these documents have so far led to a concrete policy. Both of these policies envisioned broadly worded privacy requirements to be passed by the Congress, followed by the detailed rules to be&lt;span&gt; drafted&lt;/span&gt;. The Obama administration white paper is more inclusive in mandating that ‘multi-stakeholder groups’ draft the codes that include not only industry representatives but also privacy advocates, consumer groups, crime victims, academics, international partners, federal and state civil and criminal law enforcement representatives and other relevant groups.&lt;/p&gt;
&lt;p style="text-align: justify; "&gt;The principles that emerge out this consultative process are likely to guide the data protection law in India for a long time to come. Among democratic regimes with a significant data-driven market, India is extremely late in arriving at a data protection law. The least that it can do at this point is to learn from the international experience and scholarship which has shown that merits of a co-regulatory approach which entails active participation of the government, industry, civil society and academia in the drafting and enforcement of a robust data protection law.&lt;/p&gt;
        &lt;p&gt;
        For more details visit &lt;a href='http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime'&gt;http://editors.cis-india.org/internet-governance/blog/the-wire-amber-sinha-december-1-2017-inclusive-co-regulatory-approach-possible-building-indias-data-protection-regime&lt;/a&gt;
        &lt;/p&gt;
    </description>
    <dc:publisher>No publisher</dc:publisher>
    <dc:creator>amber</dc:creator>
    <dc:rights></dc:rights>

    
        <dc:subject>Aadhaar</dc:subject>
    
    
        <dc:subject>Internet Governance</dc:subject>
    
    
        <dc:subject>Privacy</dc:subject>
    

   <dc:date>2018-01-01T16:18:54Z</dc:date>
   <dc:type>Blog Entry</dc:type>
   </item>




</rdf:RDF>
