The Centre for Internet and Society
http://editors.cis-india.org
Right to Exclusion, Government Spaces, and Speech
http://editors.cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech
<b>The conclusion of the litigation surrounding Trump blocking his critics on Twitter brings to the forefront two less-discussed aspects of intermediary liability: a) whether social media platforms can be compelled to ‘carry’ speech under any established legal principles, thereby limiting their right to exclude users or speech, and b) whether users have a constitutional right to access the social media spaces of elected officials. This essay analyzes these issues under American law, and draws parallels for India in light of the ongoing litigation around the suspension of advocate Sanjay Hegde’s Twitter account.</b>
<p> </p>
<p>This article first appeared on the Indian Journal of Law and Technology (IJLT) blog, and can be accessed <a class="external-link" href="https://www.ijlt.in/post/right-to-exclusion-government-controlled-spaces-and-speech">here</a>. Cross-posted with permission. </p>
<p>---</p>
<h2><span class="s1">Introduction</span></h2>
<p class="p2"><span class="s1">On April 8, the Supreme Court of the United States (SCOTUS) vacated the judgment of the US Court of Appeals for the Second Circuit in <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2"><em>Knight First Amendment Institute v Trump</em></span></a>. In that case, the Court of Appeals had precluded Donald Trump, then-POTUS, from blocking his critics from his Twitter account, on the ground that such action eroded the constitutional rights of those critics. The Court of Appeals had held that his use of @realDonaldTrump in his official capacity had transformed the nature of the account from private to public, and that blocking users he disagreed with therefore amounted to viewpoint discrimination, which is incompatible with the First Amendment.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">The SCOTUS <a href="https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf"><span class="s2">ordered</span></a> the case to be dismissed as moot, on account of Trump no longer being in office. Justice Clarence Thomas issued a ten-page concurrence that went into additional depth regarding the nature of social media platforms and user rights. It must be noted that the concurrence carries no direct precedential weight, since Justice Thomas was not joined by any of his colleagues on the bench. However, given that similar questions of public import are currently being deliberated in the ongoing <em>Sanjay Hegde</em> <a href="https://www.barandbench.com/news/litigation/delhi-high-court-sanjay-hegde-challenge-suspension-twitter-account-hearing-july-8"><span class="s2">litigation</span></a> in the Delhi High Court, Justice Thomas’ concurrence might hold some persuasive weight in India. While the facts of the two litigations are starkly different, both are characterized by important questions about applying constitutional doctrines to private parties like Twitter and about the supposedly ‘public’ nature of social media platforms.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">In this essay, we consider the legal questions raised in the opinion as possible learnings for India. In the first part, we analyze the key points raised by Justice Thomas, vis-a-vis the American legal position on intermediary liability and freedom of speech. In the second part, we apply these deliberations to the <em>Sanjay Hegde </em>litigation, as a case-study and a roadmap for future legal jurisprudence to be developed.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">A flawed analogy</span></h2>
<p class="p2"><span class="s1">At the outset, let us briefly refresh the timeline of Trump’s tryst with Twitter, and the history of this litigation: the Court of Appeals decision was <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2">issued</span></a> in 2019, when Trump was still in office. After the November 2020 presidential election, in which he was voted out, his supporters <a href="https://indianexpress.com/article/explained/us-capitol-hill-siege-explained-7136632/"><span class="s2">broke</span></a> into Capitol Hill. Much of the blame for the attack was pinned on Trump’s use of social media channels (including Twitter) to instigate the violence, and following this, Twitter <a href="https://blog.twitter.com/en_us/topics/company/2020/suspension"><span class="s2">suspended</span></a> his account permanently.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is this final fact that seized Justice Thomas’ reasoning. He noted that the power of a private party like Twitter to do away with Trump’s account altogether was at odds with the Court of Appeals’ earlier finding about the public nature of the account. He deployed a hotel analogy to justify this: government officials renting a hotel room for a public hearing on regulation could not kick out a dissenter, but if the same officials gathered informally in the hotel lounge, they would be within their rights to ask the hotel to remove a heckler. The difference between the two situations is that <em>“the government controls the space in the first scenario, the hotel, in the latter.” </em>He noted that Twitter’s conduct was similar to the second situation, where it “<em>control(s) the avenues for speech</em>”. Accordingly, he dismissed the idea that the original respondents (the users whose accounts were blocked) had any First Amendment claims against Trump’s initial blocking action, since the ultimate control of the ‘avenue’ lay with Twitter, and not Trump.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">On the facts of the case, however, this analogy was not justified. The Court of Appeals had not concerned itself with the question of private ‘control’ of entire social media spaces, and given the timeline of the litigation, it was impossible for it to pre-empt such considerations within the judgment. In fact, the only takeaway from the original decision had been that an elected representative’s use of his social media account for official purposes transformed </span><span class="s3">only that particular space</span><span class="s1"><em> </em>into a public forum where constitutional rights would find applicability. In delving into questions of ‘control’ and ‘avenues of speech’, issues that had been previously unexplored, Justice Thomas expands a rather specific point into a much bigger, general conundrum. The further deliberations in the concurrence accordingly rest on this flawed premise.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Right to exclusion (and must carry claims)</span></h2>
<p class="p2"><span class="s1">From here, Justice Thomas identified the problem to be “<em>private, concentrated control over online content and platforms available to the public</em>”, and brought forth two alternative regulatory frameworks — common carriage and public accommodation — to argue for ‘equal access’ to social media spaces. He posited that successful application of either of the two analogies would effectively restrict a social media platform’s right to exclude its users, and “<em>an answer may arise for dissatisfied platform users who would appreciate not being blocked</em>”. Essentially, this would mean that platforms would be obligated to carry <em>all </em>forms of (presumably) legal speech, and users would be entitled to sue platforms when they feel their content has been unfairly taken down, a phenomenon Daphne Keller <a href="http://cyberlaw.stanford.edu/blog/2018/09/why-dc-pundits-must-carry-claims-are-relevant-global-censorship"><span class="s2">describes</span></a> as ‘must carry claims’.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">Again, this is a strange direction for the argument to take, since the original facts of the case concerned not ‘<em>dissatisfied platform users</em>’ but an elected representative’s account being used to disseminate official information. Beyond the initial deliberation on ‘private’ control, Justice Thomas did not seem interested in exploring this original legal position, and instead emphasized analogizing social media platforms in order to enforce ‘equal access’, finally arriving at a position that would be legally untenable in the USA.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">The American law on intermediary liability, as embodied in Section 230 of the Communications Decency Act (CDA), has two key components: first, intermediaries are <a href="https://www.eff.org/issues/cda230"><span class="s2">protected</span></a> against liability for content posted by their users, under a legal model <a href="https://www.article19.org/wp-content/uploads/2018/02/Intermediaries_ENGLISH.pdf"><span class="s2">termed</span></a> ‘broad immunity’, and second, an intermediary does not stand to lose its immunity if it chooses to moderate and remove speech it finds objectionable, popularly <a href="https://intpolicydigest.org/section-230-how-it-actually-works-what-might-change-and-how-that-could-affect-you/"><span class="s2">known</span></a> as the Good Samaritan protection. It is the combined effect of these two components that allows platforms to take calls on what to remove and what to keep, translating into a ‘right to exclusion’. Legally compelling them to carry speech, under the garb of ‘access’, would therefore strike at the heart of the protection granted by the CDA.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Learnings for India</span></h2>
<p class="p2"><span class="s1">In his petition to the Delhi High Court, Senior Advocate Sanjay Hegde contended that the suspension of his Twitter account, on the grounds of his sharing anti-authoritarian imagery, was arbitrary and that:<span class="Apple-converted-space"> </span></span></p>
<ol style="list-style-type: lower-alpha;" class="ol1"><li class="li2"><span class="s1">Twitter was carrying out a public function and would be therefore amenable to writ jurisdiction under Article 226 of the Indian Constitution; and</span></li><li class="li2"><span class="s1">The suspension of his account had amounted to a violation of his right to freedom of speech and expression under Article 19(1)(a) and his rights to assembly and association under Article 19(1)(b) and 19(1)(c); and</span></li><li class="li2"><span class="s1">The government has a positive obligation to ensure that any censorship on social media platforms is done in accordance with Article 19(2).<span class="Apple-converted-space"> </span></span></li></ol>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">The first two prongs of the original petition are perhaps easily disputed: as previous <a href="https://indconlawphil.wordpress.com/2020/01/28/guest-post-social-media-public-forums-and-the-freedom-of-speech-ii/"><span class="s2">commentary</span></a> has pointed out, existing Indian constitutional jurisprudence on ‘public function’ does not implicate Twitter, and accordingly, it would be difficult to make out a case that account suspensions, no matter how arbitrary, amount to a violation of the user’s fundamental rights. It is the third contention that requires some additional insight in the context of our previous discussion.<span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Does the Indian legal system support a right to exclusion?<span class="Apple-converted-space"> </span></span></h3>
<p class="p2"><span class="s1">Suing Twitter to reinstate a suspended account, on the ground that such suspension was arbitrary and illegal, is in essence a request to limit Twitter’s right to exclude its users. The petition serves as an example of a must-carry claim in the Indian context and vindicates Justice Thomas’ (misplaced) defence of ‘<em>dissatisfied platform users</em>’. Legally, such claims perhaps have a better chance of succeeding here, since the expansive protection granted to intermediaries via Section 230 of the CDA is noticeably absent in India. Instead, intermediaries are bound by conditional immunity, where availment of a ‘safe harbour’, i.e., exemption from liability, is contingent on fulfilment of statutory conditions under <a href="https://indiankanoon.org/doc/844026/"><span class="s2">section 79</span></a> of the Information Technology (IT) Act and the rules made thereunder. Interestingly, in his opinion, Justice Thomas briefly considered a situation where the immunity under Section 230 would be made conditional: to gain Good Samaritan protection, platforms might be required to meet specific conditions, including ‘nondiscrimination’. This is controversial (and as commentators have noted, <a href="https://www.lawfareblog.com/justice-thomas-gives-congress-advice-social-media-regulation"><span class="s2">wrong</span></a>), since it has the potential to whittle down the US' ‘broad immunity’ model of intermediary liability to a system that would resemble the Indian one.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is worth noting that in the newly issued Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the proviso to Rule 3(1)(d) allows for “<em>the removal or disabling of access to any information, data or communication link [...] under clause (b) on a voluntary basis, or on the basis of grievances received under sub-rule (2) [...]</em>” without dilution of statutory immunity. This does provide intermediaries a right to exclude, albeit a limited one, since its scope is restricted to content removed under the operation of specific sub-clauses within the rules, as opposed to Section 230, which is couched in more general terms. Of course, none of this precludes the government from further prescribing obligations similar to those prayed for in the petition.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">On the other hand, it is a difficult proposition to support that Twitter’s right to exclusion should be circumscribed by the Constitution, as prayed. In the petition, this argument is built on the judgment in <a href="https://indiankanoon.org/doc/110813550/"><span class="s2"><em>Shreya Singhal v Union of India</em></span></a>, where it was held that takedowns under section 79 are to be done only on receipt of a court order or a government notification, and that the scope of the order would be restricted to Article 19(2). This, in the petitioner’s view, meant that “<em>any suo-motu takedown of material by intermediaries must conform to Article 19(2)</em>”.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">To understand why this argument does not work, it is important to consider the context in which the <em>Shreya Singhal </em>judgment was issued. Previously, intermediary liability was governed by the Information Technology (Intermediaries Guidelines) Rules, 2011, issued under section 79 of the IT Act. Rule 3(4) made provisions for sending takedown orders to the intermediary, and the prerogative to send such orders lay with ‘<em>an affected person</em>’. On receipt of these orders, the intermediary was bound to remove content, and neither the intermediary nor the user whose content was being censored had the opportunity to dispute the takedown.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">As a result, the potential for misuse was wide open. Rishabh Dara’s <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf"><span class="s2">research</span></a> provided empirical evidence for this: intermediaries were found to act on flawed takedown orders, out of apprehension of being sanctioned under the law, essentially chilling free expression online. The <em>Shreya Singhal</em> judgment, in essence, reined in this misuse by stating that an intermediary is legally obliged to act <em>only when </em>a takedown order is sent by the government or a court. The intent behind this was, in the court’s words: “<em>it would be very difficult for intermediaries [...] to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not.</em>”<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">In light of this, if Hegde’s petition succeeds, intermediaries would be obligated to subsume the entirety of Article 19(2) jurisprudence into their decision-making, interpret and apply it perfectly, and remain open to petitions from users when they fail to do so. This would be a startling undoing of the court’s original intent in <em>Shreya Singhal</em>. Such a reading would also limit an intermediary’s prerogative to remove speech that may not fall within the scope of Article 19(2) but is still systemically problematic, including unsolicited commercial communications. Further, most platforms today are dealing with an unprecedented spread and consumption of harmful, misleading information. By limiting their right to exclude speech in this manner, we might be <a href="https://www.hoover.org/sites/default/files/research/docs/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech_0.pdf"><span class="s2">exacerbating</span></a> this problem. <span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Government-controlled spaces on social media platforms</span></h3>
<p class="p2"><span class="s1">On the other hand, the original finding of the Court of Appeals, regarding the public nature of an elected representative’s social media account and the First Amendment rights of the people to access such an account, might yet prove instructive for India. While the primary SCOTUS order erases the precedential weight of the original case, similar judgments have been issued by other courts in the USA, including by the <a href="https://globalfreedomofexpression.columbia.edu/cases/davison-v-randall/"><span class="s2">Fourth Circuit</span></a> court and as a result of a <a href="https://knightcolumbia.org/content/texas-attorney-general-unblocks-twitter-critics-in-knight-institute-v-paxton"><span class="s2">lawsuit</span></a> against the Texas Attorney General.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">A similar situation can be envisaged in India as well. The Supreme Court has <a href="https://indiankanoon.org/doc/591481/"><span class="s2">repeatedly</span></a> <a href="https://indiankanoon.org/doc/27775458/"><span class="s2">held</span></a> that Article 19(1)(a) encompasses not just the right to disseminate information, but also the right to <em>receive </em>information, including <a href="https://indiankanoon.org/doc/438670/"><span class="s2">receiving</span></a> information on matters of public concern. Additionally, in <a href="https://indiankanoon.org/doc/539407/"><span class="s2"><em>Secretary, Ministry of Information and Broadcasting v Cricket Association of Bengal</em></span></a>, the Court held that the right of dissemination includes the right of communication through any medium: print, electronic or audio-visual. If we then assume that government-controlled spaces on social media platforms, used in the discharge of official functions, are ‘public spaces’, the government’s denial of public access to such spaces can be construed as a violation of Article 19(1)(a).<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Conclusion</span></h2>
<p class="p2"><span class="s1">As indicated earlier, despite the facts of the two litigations being different, the legal questions embodied within them converge strikingly, inasmuch as both are examples of the growing discontent around the power wielded by social media platforms, and of the flawed attempts at fixing it.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">While the above discussion might throw some light on the relationship between an individual, the state and social media platforms, many questions remain unanswered. For instance, once we establish that users have a fundamental right to access certain spaces within a social media platform, does the platform have a right to remove that space altogether? If it does so, can a constitutional remedy be sought against the platform? Initial <a href="https://indconlawphil.wordpress.com/2018/07/01/guest-post-social-media-public-forums-and-the-freedom-of-speech/"><span class="s2">commentary</span></a> on the Court of Appeals’ decision argued that the takeaway from that judgment was that constitutional norms have primacy over the platform’s own norms of governance. In such light, would the platform be constitutionally obligated to <em>not </em>suspend a government account, even if the content on such an account continues to be harmful, in violation of its own moderation standards?<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">This is an incredibly tricky dimension of the law, made trickier still by the dynamic nature of the platforms, the intense political interests permeating the need for governance, and the impact on users should a solution prove flawed. Continuous engagement, scholarship and an emphasis on a human rights-respecting framework underpinning the regulatory system are the only ways forward.<span class="Apple-converted-space"> </span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space">---</span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"></span></span></p>
<p>The author would like to thank Gurshabad Grover and Arindrajit Basu for reviewing this piece. </p>
Torsha Sarkar · Freedom of Speech and Expression · Intermediary Liability · Information Technology · 2021-07-02T12:05:13Z · Blog Entry

Rethinking the intermediary liability regime in India
http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india
<b>The article consolidates some of our broad thematic concerns with the draft amendments to the intermediary liability rules, published by MeitY last December.</b>
<p>The blog post by Torsha Sarkar was <a class="external-link" href="https://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">published by CyberBRICS</a> on August 12, 2019.</p>
<hr />
<h3 style="text-align: justify; ">Introduction</h3>
<p style="text-align: justify; ">In December 2018, the Ministry of Electronics and Information Technology (“MeitY”) released the Intermediary Liability Guidelines (Amendment) Rules (“the Guidelines”), which would significantly alter the intermediary liability regime in the country. While the Guidelines have drawn a considerable amount of attention and criticism, from the government’s perspective the change has been overdue.</p>
<p style="text-align: justify; ">The Indian government has been determined to overhaul the pre-existing safe harbour regime since last year. The draft <a href="https://www.medianama.com/wp-content/uploads/Draft-National-E-commerce-Policy.pdf">version</a> of the e-commerce policy, which was leaked last year, also hinted at similar plans. As the effects of mass dissemination of disinformation, propaganda and hate speech around the world spill over into offline harms, governments have been increasingly looking to enact interventionist laws that place more responsibility on intermediaries. India has not been an exception.</p>
<p style="text-align: justify; ">A major source of such harmful and illegal content in India is the popular communications app WhatsApp, despite the company’s enactment of several anti-spam measures over the past few years. Last year, rumours circulating on WhatsApp prompted a series of lynchings. In May, Reuters <a href="https://in.reuters.com/article/india-election-socialmedia-whatsapp/in-india-election-a-14-software-tool-helps-overcome-whatsapp-controls-idINKCN1SL0PZ" rel="noreferrer noopener" target="_blank">reported</a> that clones and software tools were available in the market at minimal cost, allowing politicians and other interested parties to bypass these measures and continue the trend of bulk messaging.</p>
<p style="text-align: justify; ">This series of incidents has made it clear that disinformation is a very real problem, and that the current regulatory framework is not enough to address it. The government’s response, accordingly, has been to introduce the Guidelines. This rationale also finds a place in its preliminary <a href="https://www.meity.gov.in/comments-invited-draft-intermediary-rules" rel="noreferrer noopener" target="_blank">statement of reasons</a>.</p>
<p style="text-align: justify; ">While enactment of such interventionist laws has triggered fresh rounds of debate on free speech and censorship, it would be wrong to say that such laws were completely one-sided, or uncalled for.</p>
<p style="text-align: justify; ">On one hand, automated amplification and online mass circulation of purposeful disinformation, propaganda, of terrorist attack videos, or of plain graphic content, are all problems that the government would concern itself with. On the other hand, several online companies (including <a href="https://www.blog.google/outreach-initiatives/public-policy/oversight-frameworks-content-sharing-platforms/" rel="noreferrer noopener" target="_blank">Google</a>) also seem to be in an uneasy agreement that simple self-regulation of content would not cut it. For better oversight, more engagement with both government and civil society members is needed.</p>
<p style="text-align: justify; ">In March this year, Mark Zuckerberg wrote an <a href="https://www.washingtonpost.com/opinions/mark-zuckerberg-the-internet-needs-new-rules-lets-start-in-these-four-areas/2019/03/29/9e6f0504-521a-11e9-a3f7-78b7525a8d5f_story.html?utm_term=.4d177c66782f" rel="noreferrer noopener" target="_blank">op-ed</a> for the Washington Post, calling for more government involvement in the process of content regulation on its platform. While it would be interesting to consider how Zuckerberg’s view aligns with those of others similarly placed, it would nevertheless be correct to say that online intermediaries are under more pressure than ever to keep their platforms clean of content that is ‘illegal, harmful, obscene’. And this list only grows.</p>
<p style="text-align: justify; ">That being said, criticism from several stakeholders has been sharp and clear wherever such laws have been enacted – be it the ambitious <a href="https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf" rel="noreferrer noopener" target="_blank">NetzDG</a>, aimed at combating Nazi propaganda, hate speech and fake news, or the controversial new European Copyright Directive, which has been welcomed by journalists but severely critiqued by online content creators and platforms as detrimental to user-generated content.</p>
<p style="text-align: justify; ">Against the backdrop of such conflicting interests in online content moderation, it is useful to examine the Guidelines released by MeitY. In the first part, we look at certain specific concerns with the rules; in the second, we push the narrative further to see what an alternative regulatory framework might look like.</p>
<p style="text-align: justify; ">Before we jump to the crux of this discussion, one important disclosure must be made about the underlying ideology of this piece. It would be unrealistic to claim that the internet should be absolutely free from regulation. Swathes of content on child sexual abuse, terrorist propaganda, and the hordes of death and rape threats faced by women online are, and should be, concerns of a civil society. While that is certainly a strong driving force for regulation, this concern should not override basic considerations for human rights (including freedom of expression). These ideas are expanded in the sections that follow.</p>
<h3 style="text-align: justify; ">Broad, thematic concerns with the Rules</h3>
<h3 style="text-align: justify; ">A uniform mechanism of compliance</h3>
<h3 style="text-align: justify; ">Timelines</h3>
<p style="text-align: justify; ">Rule 3(8) of the Guidelines mandates intermediaries, prompted by <em>a</em> <em>court order or a government notification</em>, to take down content relating to unlawful acts within 24 hours of such notification. In case they fail to do so, the safe harbour applicable to them under section 79 of the Information Technology Act (“the Act”) would cease to apply, and they would be liable. Prior to the amendment, this timeframe was 36 hours.</p>
<p style="text-align: justify; ">There is a visible lack of research rationalizing a 24-hour compliance timeline as the optimal framework for <em>all</em> intermediaries, irrespective of the kind of services they provide or the sizes and resources available to them. As the Mozilla Foundation has <a href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/" rel="noreferrer noopener" target="_blank">commented</a>, regulation of illegal content online simply cannot be done in a one-size-fits-all approach, nor can <a href="https://blog.mozilla.org/netpolicy/2019/04/10/uk_online-harms/" rel="noreferrer noopener" target="_blank">regulation be made</a> with only the tech incumbents in mind. While platforms like YouTube can comfortably <a href="https://www.bmjv.de/SharedDocs/Pressemitteilungen/DE/2017/03142017_Monitoring_SozialeNetzwerke.html" rel="noreferrer noopener" target="_blank">remove</a> criminally prohibited content within a span of 24 hours, this can still place a large burden on smaller companies, who may not have the necessary resources to comply within this timeframe. A few unintended consequences would arise out of this situation.</p>
<p style="text-align: justify; ">One, sanctions under the Act, which include both organisational ramifications like website blocking (under section 69A of the Act) and individual liability, would affect smaller intermediaries more than bigger ones. A bigger intermediary like Facebook may be able to withstand a large fine for its failure to control, say, hate speech on its platform. That may not be true for a smaller online marketplace, or even a smaller social media site targeted at a very specific community. This compliance mechanism, accordingly, may simply strengthen the larger companies while eliminating competition from the smaller ones.</p>
<p style="text-align: justify; ">Two, intermediaries, in fear of heavy criminal sanctions, would err on the side of caution. Decisions on whether a piece of content is illegal would become hastier and less nuanced. Legitimate speech would thus also be at risk of censorship, and intermediaries would pay <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf" rel="noreferrer noopener" target="_blank">less heed</a> to the technical requirements or the correct legal procedures required for content takedown.</p>
<h3 style="text-align: justify; ">Utilization of ‘automated technology’</h3>
<p style="text-align: justify; ">Another place where the Guidelines assume that all intermediaries operating in India are on the same footing is Rule 3(9), which mandates these entities to proactively monitor for ‘unlawful content’ on their platforms. Aside from the unconstitutionality of this provision, it also assumes that all intermediaries have the requisite resources to actually set up such a tool and operate it successfully. YouTube’s ContentID, which began in 2007, had already seen a <a href="https://www.blog.google/outreach-initiatives/public-policy/protecting-what-we-love-about-internet-our-efforts-stop-online-piracy/" rel="noreferrer noopener" target="_blank">100 million dollar investment by 2018</a>.</p>
<p style="text-align: justify; ">Funnily enough, ContentID is a tool dedicated exclusively to finding copyright violations for rights-holders, and even then it has proven to be far from <a href="https://www.plagiarismtoday.com/2019/01/10/youtubes-copyright-insanity/" rel="noreferrer noopener" target="_blank">infallible</a>. The Guidelines’ sweeping net of ‘unlawful’ content includes far more categories than mere violations of IP rights, and the framework assumes that intermediaries would be able to set up and run an automated tool that filters through <em>all</em> these categories of ‘unlawful content’ at one go.</p>
<h3 style="text-align: justify; ">The problems of AI</h3>
<p style="text-align: justify; ">Aside from the implementation-related concerns, there are also technical challenges associated with Rule 3(9). Supervised learning systems (like the ones envisaged under the Guidelines) use labelled training data sets for proactive filtering. This means that if the system is taught that, for ten instances where A is the input, the output is B, then on the eleventh time it sees A, it will output B. In the lingo of content filtering, the system would be taught, for example, that nudity is bad. The next time the system encounters nudity in a picture, it would automatically flag it as ‘bad’ and in violation of community standards.</p>
<p style="text-align: justify; "><a href="https://www.theguardian.com/technology/2016/sep/08/facebook-mark-zuckerberg-napalm-girl-photo-vietnam-war" rel="noreferrer noopener" target="_blank">Except, that is not how it should work</a>. For every post under the scrutiny of the platform operators, numerous nuances and contextual cues act as mitigating factors, none of which, at this point, would be <a href="https://scholarship.law.nd.edu/cgi/viewcontent.cgi?referer=https://www.google.co.in/&httpsredir=1&article=1704&context=ndlr" rel="noreferrer noopener" target="_blank">understandable</a> by a machine.</p>
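<p style="text-align: justify; ">The A-to-B pattern described above, and its blindness to context, can be sketched in a few lines of code. This is a deliberately naive, hypothetical toy (the class name and labels are invented for illustration), not any platform's actual moderation system, which would use far more complex models; but the structural point is the same: a filter that only sees a surface feature treats a historic war photograph and abusive content identically.</p>

```python
# Toy sketch of a supervised filter: per feature, it learns the
# majority label seen in training and reproduces it at prediction time.
# Hypothetical illustration only -- not a real moderation model.

from collections import Counter, defaultdict

class MajorityLabelFilter:
    """Learns, for each feature, the label it was most often shown."""
    def __init__(self):
        self.votes = defaultdict(Counter)

    def train(self, examples):
        # examples: iterable of (feature, label) pairs
        for feature, label in examples:
            self.votes[feature][label] += 1

    def predict(self, feature):
        if feature not in self.votes:
            return "unknown"
        return self.votes[feature].most_common(1)[0][0]

# Ten instances of the feature "nudity" labelled as a violation.
f = MajorityLabelFilter()
f.train([("nudity", "flag")] * 10)

# The eleventh time the system sees "nudity", it flags it, regardless
# of context: the surface feature is all the filter can see.
print(f.predict("nudity"))  # -> flag
```

The design choice worth noticing is what the model does <em>not</em> receive: any representation of context. Everything reducing the picture to one feature happens before the filter runs, which is exactly where the nuance is lost.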
<p style="text-align: justify; ">Additionally, the training data used to feed the system <a href="https://www.cmu.edu/dietrich/philosophy/docs/london/IJCAI17-AlgorithmicBias-Distrib.pdf" rel="noreferrer noopener" target="_blank">can be biased</a>. A self-driving car that is fed training data from only one region of the country would learn the customs and driving norms of that particular region, and not the patterns needed to drive throughout the country.</p>
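<p style="text-align: justify; ">The training-data problem can be made concrete with a minimal sketch. The scenario and the function name below are invented for illustration: a learner fed observations from only one region faithfully learns that region's norm, and the error when it is deployed elsewhere comes from the skew in the data, not from the learning procedure itself.</p>

```python
# Hypothetical sketch of training-data bias: the learner is a simple
# majority vote, so whatever dominates the training sample becomes
# the learned "rule".

from collections import Counter

def learn_norm(observations):
    """Return the most frequently observed value in the training data."""
    return Counter(observations).most_common(1)[0][0]

# Training data gathered only in a region where traffic keeps left.
region_a_observations = ["left"] * 100
norm = learn_norm(region_a_observations)
print(norm)  # -> left

# Deployed in a region where traffic keeps right, the learned norm is
# simply wrong. The algorithm worked exactly as designed; the training
# sample was unrepresentative of the deployment environment.
region_b_ground_truth = "right"
print(norm == region_b_ground_truth)  # -> False
```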
<p style="text-align: justify; ">Lastly, no one claims that bias would be completely eliminated if content moderation were undertaken by a human. The difference between a human moderator and an automated one, however, is that there is a measure of accountability with the former. The decision of a human moderator can be disputed, and the moderator has a chance to explain their reasons for the removal. Artificial intelligence (“AI”) is characterised by the algorithmic ‘<a href="http://raley.english.ucsb.edu/wp-content/Engl800/Pasquale-blackbox.pdf" rel="noreferrer noopener" target="_blank">black box</a>’ that processes inputs and generates usable outputs. Implementing workable accountability standards for such a system, including figuring out appeal and grievance redressal mechanisms in cases of dispute, are all problems that the regulator must concern itself with.</p>
<p style="text-align: justify; ">In the absence of any clarity or revision, it seems unlikely that the provision would ever see full implementation. Intermediaries would neither know what kind of ‘automated technology’ they are supposed to use for filtering ‘unlawful content’, nor have any incentive to actually deploy such a system effectively on their platforms.</p>
<h3 style="text-align: justify; ">What can be done?</h3>
<p style="text-align: justify; ">First, more research is needed to understand the effect of compliance timeframes on the accuracy of content takedowns. Several jurisdictions now operate on different compliance timeframes, and regulation would be far more holistic if the government considered the dialogue around each of them and what it means for India.</p>
<p style="text-align: justify; ">Second, it might be useful to consider an independent regulator as an alternative and a compromise between pure governmental regulation (which is more or less what the current system is) and self-regulation (which the Guidelines, albeit problematically, also espouse through Rule 3(9)).</p>
<p style="text-align: justify; ">The <a href="https://www.gov.uk/government/consultations/online-harms-white-paper" rel="noreferrer noopener" target="_blank">UK Online Harms White Paper</a>, an important document in the liability overhaul debate, proposes an arm’s-length regulator responsible for drafting codes of conduct for online companies and for enforcing them. While the exact merits of the system are still up for debate, the concept of having a separate body to oversee, formulate and possibly <a href="https://medium.com/adventures-in-consumer-technology/regulating-social-media-a-policy-proposal-a2a25627c210" rel="noreferrer noopener" target="_blank">arbitrate</a> disputes regarding content removal is finding traction in several parallel developments.</p>
<p style="text-align: justify; ">One of the Transatlantic Working Group sessions seems to discuss this idea in terms of having an ‘<a href="https://medium.com/whither-news/proposals-for-reasonable-technology-regulation-and-an-internet-court-58ac99bec420" rel="noreferrer noopener" target="_blank">internet court</a>’ for illegal content regulation. This would have the noted advantages of a) formulating norms of online content in a transparent, public fashion, something previously done behind the closed doors of either the government or the tech incumbents, and b) having specially trained professionals able to dispose of matters expeditiously.</p>
<p style="text-align: justify; ">India is not unfamiliar with the idea of specialized tribunals or quasi-judicial bodies for dealing with specific challenges. In 2015, for example, the Government of India passed the Commercial Courts Act, by which specific courts were tasked with dealing with high-value commercial matters. This is neither an isolated instance of the government creating new bodies to deal with a specific problem, nor is it likely to be the last.</p>
<p style="text-align: justify; ">There is no <a href="https://www.thehindubusinessline.com/opinion/resurrecting-the-marketplace-of-ideas/article26313605.ece" rel="noreferrer noopener" target="_blank">silver bullet</a> when it comes to moderation of content on the web. However, in light of this parallel convergence of ideas, the appeal of an independent regulatory system, as a sane compromise between complete government control and <em>laissez-faire</em> autonomy, is worth considering.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india'>http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india</a>
</p>
No publisher · torsha · Internet Governance · Intermediary Liability · Artificial Intelligence · 2019-08-16T01:49:47Z · Blog Entry
Response to the Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018
http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018
<b>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</b>
<p>This document presents the Centre for Internet & Society (CIS) response to the Ministry of Electronics and Information Technology’s invitation to comment and suggest changes to the draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018 (hereinafter referred to as the “draft rules”) published on December 24, 2018. CIS is grateful for the opportunity to put forth its views and comments. This response was sent on January 31, 2019.</p>
<p>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</p>
<p><span style="text-align: start; float: none;">The response can be accessed <a href="https://cis-india.org/internet-governance/resources/Intermediary%20Liability%20Rules%202018.pdf">here</a>.<br /></span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018'>http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018</a>
</p>
No publisher · Gurshabad Grover, Elonnai Hickok, Arindrajit Basu, Akriti · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-02-07T08:06:41Z · Blog Entry
Report on CIS' Workshop at the IGF: 'An Evidence Based Framework for Intermediary Liability'
http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf
<b>The workshop 'An evidence based framework for intermediary liability' was organised to present evidence and discuss ongoing research on the changing definition, function and responsibilities of intermediaries across jurisdictions.</b>
<p style="text-align: justify; ">The discussion from the workshop will contribute to a comprehensible framework for liability, consistent with the capacity of the intermediary and with international human-rights standards.</p>
<p style="text-align: justify; ">The Electronic Frontier Foundation (USA), Article 19 (UK) and the Centre for Internet and Society (India) have come together to develop best practices and principles for the regulation of online content through intermediaries. The nine principles are: Transparency, Consistency, Clarity, Mindful Community Policy Making, Necessity and Proportionality in Content Restrictions, Privacy, Access to Remedy, Accountability, and Due Process in both Legal and Private Enforcement. The session was hosted by the Centre for Internet and Society (India) and the Centre for Internet and Society, Stanford (USA), and attended by 7 speakers and 40 participants.</p>
<p style="text-align: justify; ">Jeremy Malcolm, Senior Global Policy Analyst at EFF, kicked off the workshop by highlighting the need to develop a liability framework for intermediaries derived from an understanding of their different functions, their role within the economy and their impact on human rights. He structured the discussion that followed around ongoing projects and examples that highlight central issues in gathering and presenting evidence to inform the policy space.</p>
<p style="text-align: justify; ">Martin Husovec, of the International Max Planck Research School for Competition and Innovation, began his presentation by tracking the development of safe harbour frameworks within social contract theory. Opining that safe harbour was created as a balancing mechanism between rights-holders’ return on investment and the public interest in the Internet as a public space, he introduced emerging claims that technological advancement has altered this equilibrium. Citing injunctions and private lawsuits as instruments often used against law-abiding intermediaries, he pointed to the problem within existing liability frameworks, where even intermediaries who diligently deal with illegitimate content on their services can still be subject to forced cooperation for the benefit of right holders. He added that for liability frameworks to be effective, they must keep pace with advances in technology and be fair to both right holders and the public interest.</p>
<p style="text-align: justify; ">He also pointed out that in any liability framework, the ‘law’ that prescribes an interference must always be sufficiently clear and foreseeable, as to both the meaning and nature of the applicable measures, so that it sufficiently outlines the scope and manner of exercise of the power of interference with the guaranteed rights. He illustrated this with the example of the German Federal Supreme Court’s attempts at Wi-Fi policy-making in 2010. He also raised the issue of the costs of uncertainty when courts are the only means of balancing rights, as courts often do not have the necessary information. Similarly, society does not benefit from open-ended accountability of intermediaries, and he called for a balanced approach to regulation.</p>
<p style="text-align: justify; ">The need for consistency in liability regimes across jurisdictions was raised by Giancarlo Frosio, Intermediary Liability Fellow at Stanford's Centre for Internet and Society. He introduced the World Intermediary Liability Map, a project mapping legislation and case law across 70 countries to create a repository of information that informs policymaking and helps create accountability. Highlighting key takeaways from his research, he stressed the necessity of clear definitions in the field of intermediary liability and the need to develop a taxonomy of issues, to deepen our understanding of what is at stake and of the type of liability appropriate for a particular jurisdiction.</p>
<p style="text-align: justify; ">Nicolo Zingales, Assistant Professor of Law at Tilburg University, highlighted the need for due process and safeguards for human rights, and called for more user involvement in the systems in place in different countries for responding to takedown requests. Presenting his research findings, he pointed to the imbalance in the way notice-and-takedown regimes are structured, where content is taken down presumptively, but the possibility of restoring user content is provided only at a subsequent stage, or in many cases not at all. He cited several ways of enhancing user participation in liability mechanisms, including notice-and-notice, litigation sanctions where knowledge that the content might have been legal can be inferred, shifting the presumption in favour of users, and the reverse notice-and-takedown procedure. He also raised the important question of whether multistakeholder cooperation is sufficient or adequate to enable users to have a say and enter as part of the social construct in this space. Reminding the participants of the failure of the multistakeholder agreement process in the UK regarding the cost of filters that would be imposed according to judicial procedure, he called for strengthening efforts to enable users to get more involved in protecting their rights online.</p>
<p style="text-align: justify; ">Gabrielle Guillemin from Article 19 presented her research on the types of intermediaries and the models of liability in place across jurisdictions. Pointing to the problems associated with intermediaries having to monitor content and determine its legality, she called for procedural safeguards and stressed the need to place disputes back in the hands of users, content owners and the person who wrote the content, rather than the intermediary. She went on to offer some useful and practically-grounded solutions to strengthen existing takedown mechanisms, including adding details to the notices, introducing fees to check the number of claims that are made, and defining procedures for criminal content.</p>
<p style="text-align: justify; ">Elonnai Hickok introduced CIS' research contribution to the UNESCO report Fostering Freedom Online: The Role of Internet Intermediaries, comparing a range of liability models at different stages of development and provisions across jurisdictions. She argued for a liability framework that tackles procedural and regulatory uncertainty, lack of due process, lack of remedy and varying content criteria.</p>
<p style="text-align: justify; ">Francisco Vera, Advocacy Director at Derechos Digitales (Chile), raised issues related to mindful community policy-making, expounding on Chile's implementation of intermediary liability obligations agreed with the USA, and the introduction of judicial oversight under Chilean legislation, which led to a US objection that Chile was not fulfilling US standards of Internet property protection. He highlighted the tensions that arise in balancing the needs of the multiple communities and interests engaged over common resources, and stressed the need for evidence in policy-making to balance the needs of rights holders and the public interest and to ensure regulation keeps pace with technological developments, citing the example of the ongoing Trans-Pacific Partnership Agreement negotiations, which seek to export DMCA provisions to 11 countries even though there is no evidence of the success of that system for the public interest. He concluded by cautioning against the development of frameworks that are, or have the potential to be, used as anti-competitive mechanisms that curtail innovation and thereby do not serve the public interest.</p>
<p style="text-align: justify; ">Malcolm Hutty, associated with the European Internet Service Providers Association, Chair of the Intermediary Reliability Committee and the London Internet Exchange, brought the intermediaries' perspective into the discussion. He argued for challenging the link between liability and forced cooperation, and highlighted the problems arising from distinctions without a difference and the incentives built into existing regimes. He raised issues arising from the expectation, on the part of those engaged in pre-emptive regulation of unwanted or undesirable content, that intermediaries should automate content moderation. Pointing to the increasing impact of intermediaries on our lives, he underscored how exposing vast areas of people's lives to regulatory enforcement, which enhances the power of the state to implement public policy in the public interest and expect it to be executed, can have both positive and negative implications for issues such as privacy and freedom of expression.</p>
<p style="text-align: justify; ">He called out practices in regulatory regimes that focus on one-size-fits-all solutions, such as seeking to automate filters on a massive scale, and instead called for context- and content-specific solutions that factor in the commercial imperatives of intermediaries. He also addressed the economic consequences of liability frameworks for the industry, including the cost-effectiveness of balancing rights, the barriers to investment that arise when heavily regulated or new types of online services are likely to be targeted for specific enforcement measures, and the long-term costs of keeping old enforcement mechanisms in place while networks need to be updated to extend services to users.</p>
<p style="text-align: justify; ">The workshop presented evidence of a variety of approaches, and of the issues that arise in applying those approaches to impose liability on intermediaries. Two choices emerged for developing frameworks for enforcing responsibility on intermediaries. We could rely on a traditional approach: essentially court-based, offline mechanisms for regulating behaviour and disputes. The downside is that this would be slow and costly to the public purse. In particular, we would lose a great deal of the opportunity to extend regulation much more deeply into people's lives so as to implement the public interest.<br /><br />Alternatively, we could rely on intermediaries to develop and automate systems to control our online behaviour. While this approach does not suffer from the efficiency problems of the first, it has its own drawbacks, both in hindering the development of the Information Society and in potentially yielding up many of the protections traditionally expected in a free and liberal society. The right approach lies somewhere in the middle, and the development of International Principles for Intermediary Liability, announced at the end of the workshop, is a step closer to a balanced framework for liability.</p>
<hr />
<p>See the <a class="external-link" href="http://www.intgovforum.org/cms/174-igf-2014/transcripts/1968-2014-09-03-ws206-an-evidence-based-liability-policy-framework-room-5">transcript on IGF website</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf'>http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf</a>
</p>
No publisher · jyoti · Privacy · Freedom of Speech and Expression · Internet Governance Forum · Internet Governance · Intermediary Liability · 2014-09-24T10:47:30Z · Blog Entry
Rebuttal of DIT's Misleading Statements on New Internet Rules
http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries
<b>The press statement issued on May 11 by the Department of Information Technology (DIT) on the furore over the newly-issued rules on 'intermediary due diligence' is misleading and is, in places, plainly false. We are presenting a point-by-point rebuttal of the DIT's claims.</b>
<p>In its <a class="external-link" href="http://pib.nic.in/newsite/erelease.aspx?relid=72066">press release on Wednesday, May 11, 2011</a>, the DIT stated:
<blockquote>The
attention of Government has been drawn to news items in a section of
media on certain aspects of the Rules notified under Section 79
pertaining to liability of intermediaries under the Information
Technology Act, 2000. These items have raised two broad issues. One is
that words used in Rules for objectionable content are broad and could
be interpreted subjectively. Secondly, there is an apprehension that the
Rules enable the Government to regulate content in a highly subjective
and possibly arbitrary manner. <br /></blockquote>
<p>There are actually more issues than merely "subjective interpretation" and "arbitrary governmental regulation".</p>
<ul><li style="list-style-type: disc;">The
Indian Constitution limits how much the government can regulate
citizens’ fundamental right to freedom of speech and expression. Any
measure afoul of the constitution is invalid. </li><li style="list-style-type: disc;">Several
portions of the rules are beyond the limited powers that Parliament had
granted the Department of IT to create interpretive rules under the
Information Technology Act. Parliament directed the Government to merely
define what “due diligence” requirements an intermediary would have to
follow in order to claim the qualified protection against liability that
Section 79 of the Information Technology Act provides; these current
rules have gone dangerously far beyond that, by framing rules that
insist that intermediaries, without investigation, have to remove content within 36 hours of receipt of a
complaint, keep records of users' details and provide them to
law enforcement officials.</li></ul>
<p>The Department of Information Technology (DIT), Ministry of
Communications & IT has clarified that the Intermediaries Guidelines
Rules, 2011 prescribe that due diligence need to be observed by the
Intermediaries to enjoy exemption from liability for hosting any third
party information under Section 79 of the Information Technology Act,
2000. These due diligence practices are the best practices followed
internationally by well-known mega corporations operating on the
Internet. The terms specified in the Rules are in accordance with the
terms used by most of the Intermediaries as part of their existing
practices, policies and terms of service which they have published on
their website.</p>
<ol><li>We are not aware of any country that actually goes to the extent of
deciding what Internet-wide ‘best practices’ are and converting
those ‘best practices’ into law by prescribing a universal terms of
service that all Internet services, websites, and products should enforce.</li><li>The Rules require all intermediaries to include the
government-prescribed terms in an agreement, no matter what services
they provide. It is one thing for a company to choose the terms of its
terms of service agreement, and completely another for the government to
dictate those terms of service. As long as the terms of service of an
intermediary are not unlawful and do not bring up issues of users’ rights (such
as the right to privacy), there is no reason for the government to jump
in and dictate what the terms of service should or should not be.</li><li>The DIT has not offered any proof to back up its assertion that 'most'
intermediaries already have such terms. Google, a ‘mega corporation’
which is an intermediary, <a class="external-link" href="http://www.google.com/accounts/TOS?hl=en">does not have such an overarching policy</a>. Indiatimes, another ‘mega
corporation’ intermediary, <a class="external-link" href="http://www.indiatimes.com/policyterms/1555176.cms">does not either</a>. Just because <a class="external-link" href="http://www.rediff.com/termsofuse.html">a
company like Rediff</a> and <a class="external-link" href="http://us.blizzard.com/en-us/company/legal/wow_tou.html">
Blizzard's World of Warcraft</a> have some of those terms does not mean a) that they should have all of those terms, nor that b) everyone else should as well.<br /><br />In
attempting to take different terms of service from different Internet
services and products—the very fact of which indicate the differing
needs felt across varying online communities—the Department has put in
place a one-size-fits-all approach. How can this be possible on the Internet, when we wouldn't regulate the post office and a book publisher under the same rules of liability for, say, defamatory speech?</li><li>There is also a significant difference between the effect of those
terms of service and that of these Rules. An intermediary-framed terms of service suggests that the intermediary <em>may</em> investigate and boot someone off a service for a violation, while the Rules insist that
the intermediary simply has to mandatorily remove content, keep records of users' details and provide them to law enforcement officials,
else be subject to crippling legal liability.</li></ol>
<p>So to equate the effect of these Rules to merely following ‘existing practices’ is plainly wrong. An intermediary—like the CIS website—should have the freedom to choose not to have terms of service agreements. We now don’t.</p>
<blockquote>“In case any issue arises concerning the interpretation of the terms used by the Intermediary, which is not agreed to by the user or affected person, the same can only be adjudicated by a Court of Law. The Government or any of its agencies have no power to intervene or even interpret. DIT has reiterated that there is no intention of the Government to acquire regulatory jurisdiction over content under these Rules. It has categorically said that these rules do not provide for any regulation or control of content by the Government.”</blockquote>
<p>The
Rules are based on the presumption that all complaints (and resultant
mandatory taking down of the content) are correct, and that the
incorrectness of the take-downs can be disputed in court. Why not just
invert that, and presume that all complaints need to be proven first, and the correctness of the complaints (instead of the take-downs) be disputed in court? </p>
<p>Indeed,
the courts have insisted that presumption of validity is the only
constitutional way of dealing with speech. (See, for instance, <em>Karthikeyan R. v. Union
of India</em>, a 2010 Madras High Court judgment.)</p>
<p>Further,
only constitutional courts (namely High Courts and the Supreme Court)
can go into the question of the validity of a law. Other courts have to
apply the law, even if the judge believes it is constitutionally
invalid. So, most courts will be forced to apply this law of highly
questionable constitutionality until a High Court or the Supreme Court
strikes it down.</p>
<p>What
the Department has in fact done is to explicitly open up the floodgates
for increased liability claims and litigation - which runs exactly
counter to the purpose behind the amendment of Section 79 by Parliament
in 2008.</p>
<blockquote>“The
Government adopted a very transparent process for formulation of the
Rules under the Information Technology Act. The draft Rules were
published on the Department of Information Technology website for
comments and were widely covered by the media. None of the Industry
Associations and other stakeholders objected to the formulation which is
now being cited in some section of media.”<br /></blockquote>
<p>This is a blatant lie.</p>
<p>Civil
society voices, including <a href="http://editors.cis-india.org/internet-governance/blog/2011/02/25/intermediary-due-diligence" class="external-link">CIS</a>, <a class="external-link" href="http://www.softwarefreedom.in/index.php?option=com_idoblog&task=viewpost&id=86&Itemid=70">Software Freedom Law Centre</a>, and
individual experts (such as the lawyer and published author <a class="external-link" href="http://www.iltb.net/2011/02/draft-rules-on-intermediary-liability-released-by-the-ministry-of-it/">Apar Gupta</a>)
sent in comments. Companies <a class="external-link" href="http://online.wsj.com/article/SB10001424052748704681904576314652996232860.html?mod=WSJINDIA_hps_LEFTTopWhatNews">such as Google</a>, <a class="external-link" href="http://e2enetworks.com/2011/05/13/e2e-networks-response-to-draft-rules-for-intermediary-guidelines/">E2E Networks</a>, and others had apparently
raised concerns as well. The press published many a cautionary note, including editorials, op-eds, and articles in <a class="external-link" href="http://www.thehindu.com/opinion/lead/article1487299.ece">the</a> <a class="external-link" href="http://www.thehindu.com/opinion/editorial/article1515144.ece">Hindu</a>, <a class="external-link" href="http://www.thehoot.org/web/home/story.php?sectionId=6&mod=1&pg=1&valid=true&storyid=5163">the Hoot</a>, Medianama.com, and Kafila.com, well before the new rules were notified. We at CIS even received a 'read notification'
from the email account of the Group Coordinator of the DIT’s Cyber Laws
Division—Dr. Gulshan Rai—on Thursday, March 3, 2011 at 12:04 PM (we had
sent the mail to Dr. Rai on Monday, February 28, 2011). We never
received any acknowledgement, though, not even after we made an express
request for acknowledgement (and an offer to meet them in person to
explain our concerns) on Tuesday, April 5, 2011 in an e-mail sent to Mr.
Prafulla Kumar and Dr. Gulshan Rai of DIT.</p>
<p>The
process can hardly be called 'transparent' when the replies received
from 'industry associations and other stakeholders' have not been made
public by the DIT. Those comments which are public all indicate that
serious concerns were raised as to the constitutionality of the Rules.</p>
<blockquote>“The Government has been forward looking to create a conducive
environment for the Internet medium to catapult itself onto a different
plane with the evolution of the Internet. The Government remains fully
committed to freedom of speech and expression and the citizen’s rights
in this regard.”<br /></blockquote>
<p><span id="internal-source-marker_0.8528041979429147">The DIT has limited this statement to the rules on intermediary due
diligence, and has not spoken about the controversial new rules that
stifle cybercafes, and restrict users' privacy and freedom to receive
information.<br /></span></p>
<p><span id="internal-source-marker_0.8528041979429147"></span>If the government were serious about creating a conducive environment for innovation, privacy, and free expression on the Internet, it would not be passing Rules that curb them, and it certainly would not be doing so in such a non-transparent fashion.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries'>http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries</a>
</p>
Reading the Fine Script: Service Providers, Terms and Conditions and Consumer Rights
http://editors.cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights
<b>This year, an increasing number of incidents, related to consumer rights and service providers, have come to light. This blog illustrates the facts of the cases, and discusses the main issues at stake, namely, the role and responsibilities of providers of platforms for user-created content with regard to consumer rights.</b>
<p style="text-align: justify; "><span>On 1 July 2014, the Federal Trade Commission (FTC) filed a complaint against T-Mobile USA,</span><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn1">[1]</a><span> accusing the service provider of 'cramming' customers' bills with millions of dollars of unauthorized charges. Recently, another service provider, Facebook, received flak from regulators and users worldwide after it published a paper, 'Experimental evidence of massive-scale emotional contagion through social networks'.</span><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn2">[2]</a><span> The paper described Facebook's experiment on more than 600,000 users, conducted to determine whether manipulating user-generated content would affect users' emotions.</span></p>
<p style="text-align: justify; ">In both incidents, terms that should ensure the protection of users' legal rights were instead used to obtain consent for actions by the service providers that consumers could not have anticipated when agreeing to the terms and conditions (T&Cs). More precisely, both cases point to the underlying issue of how users are bound by T&Cs and, in a mediated online landscape, highlight the need to pay attention to the regulations that govern users' online engagement.</p>
<p style="text-align: justify; "><b>I have read and agree to the terms</b></p>
<p style="text-align: justify; ">In his statement, Chief Executive Officer John Legere might have referred to T-Mobile as "the most pro-consumer company in the industry",<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn3">[3]</a> but the FTC's finding that many customers never authorized the charges suggests otherwise. The FTC investigation also found that T-Mobile received 35-40 per cent of the amount charged for subscriptions to seemingly innocuous services that customers had been signed up for without their knowledge or consent. Last month, news broke that just under 700,000 users 'unknowingly' participated in the Facebook study, and while the legality and ethics of the experiment are still being debated, what is clear is that Facebook violated consumer rights by not giving its users the choice to opt in or out, or even the knowledge that such social or psychological experiments were taking place.</p>
<p style="text-align: justify; ">Both incidents boil down to the sensitive question of consent. While binding agreements around the world work on the condition of consent, how do we define it and what are the implications of agreeing to the terms?</p>
<p style="text-align: justify; "><b>Terms of Service: Conditions are subject to change </b></p>
<p style="text-align: justify; ">A legal necessity, the existing terms of service (TOS)—as they are also known—as an acceptance mechanism are deeply broken. The policies of online service providers are often, too long, and with no shorter or multilingual versions, require substantial effort on part of the user to go through in detail. A 2008 Carnegie Mellon study estimated it would take an average user 244 hours every year to go through the policies they agree to online.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn4">[4]</a> Based on the study, Atlantic's Alexis C. Madrigal derived that reading all of the privacy policies an average Internet user encounters in a year, would take 76 working days.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn5">[5]</a></p>
<p style="text-align: justify; ">The time costs are multiplied by the fact that terms of service change with technology, making it very hard for a user to keep track of all of the changes over time. Moreover, many service providers do not even commit to notifying users of changes in the TOS. Microsoft, Skype, Amazon, and YouTube are examples of service providers that have made no such commitment, and there are often no mechanisms in place to ensure that service providers keep users updated.</p>
<p style="text-align: justify; ">Facebook has said that the recent social experiment is perfectly legal under its TOS,<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn6">[6]</a> but the fairness of the conditions under which users consent remains debatable. Facebook takes a broad copyright license that goes beyond its operating requirements, including the right to 'sublicense'. The license also does not end when users stop using the service, unless their content has been deleted by everyone else.</p>
<p style="text-align: justify; ">More importantly, since 2007 Facebook has made major changes to its lengthy TOS roughly every year.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn7">[7]</a> And while many point out that Facebook is transparent because it solicits feedback before changing its terms, its accountability remains questionable, as the results are not binding unless 30% of all users vote. Facebook can, and does, track users and share their data across websites, and it has no obligation or mechanism to inform users of takedown requests.</p>
<p style="text-align: justify; ">Courts in different jurisdictions, under different laws, may come to different conclusions about these practices, especially about whether changing terms without notifying users is acceptable. Living in a society more protective of consumer rights is, however, no safeguard, as TOS often include a choice-of-law clause that allows companies to select the jurisdiction whose laws govern the terms.</p>
<p style="text-align: justify; ">The recent experiment bypassed the need for informed user consent because of Facebook's Data Use Policy<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn8">[8]</a>, which states that once an account has been created, user data can be used for 'internal operations, including troubleshooting, data analysis, testing, research and service improvement.' While users worldwide may be outraged, legally Facebook acted within its rights, as the decision fell within the scope of the T&Cs that users consented to. The incident's most positive impact may be to bring the question of Facebook's responsibility towards protecting users, including informing them of how their data is used and of changes in data privacy terms, to a worldwide audience.</p>
<p style="text-align: justify; "><b>My right is bigger than yours</b></p>
<p style="text-align: justify; ">Most TOS agreements, written by lawyers to protect the interests of the companies, add to the complexities of privacy in an increasingly user-generated digital world. Often intentionally complicated, these agreements conflict with existing data and user rights across jurisdictions and chip away at rights such as ownership, privacy, and even the ability to sue. With conditions that allow terms to be changed at any time, existing users have neither ownership of nor control over their data.</p>
<p style="text-align: justify; ">In April, the New York Times reported on updates to the legal policy of General Mills (GM), the multibillion-dollar food company.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn9">[9]</a> The update broadly asserted that consumers interacting with the company in a variety of ways and venues could no longer sue GM, but must instead submit any complaint to “informal negotiation” or arbitration. Since then, GM has backtracked and clarified that the “online communities” mentioned in the policy referred only to those hosted by the company on its own websites.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn10">[10]</a> Clarification aside, as Julia Duncan, Director of Federal Programs at the American Association for Justice, points out, the updated terms were so broad that they were open to wide interpretation, and anything consumers purchased from the company could have been held to this clause.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn11">[11]</a></p>
<p style="text-align: justify; "><b>Data and whose rights?</b></p>
<p style="text-align: justify; ">Following the Snowden revelations, data privacy has become a contentious issue in the EU, and TOS that allow service providers to unilaterally alter the terms of the contract will face many challenges in the future. In March, Edward Snowden sent his testimony to the European Parliament calling for greater accountability, highlighting that we live in "a global, interconnected world where, when national laws fail like this, our international laws provide for another level of accountability."<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn12">[12]</a> Following the testimony came the European Parliament's vote in favor of new safeguards on the personal data of EU citizens when it is transferred to non-EU countries.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn13">[13]</a> The new regulations seek to give users more control over their personal data, including the right to obtain their data from the companies that control it, and seek to place the burden of proof on the service providers.</p>
<p style="text-align: justify; ">The regulation places responsibility on companies, including third parties involved in data collection, transfer, and storage, and requires greater transparency concerning requests for information. The amendment reinforces the data subject's right to seek erasure of data and obliges the concerned parties to communicate any rectification of data. Earlier this year, the European Court of Justice (ECJ) also ruled in favor of the 'right to be forgotten'<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn14">[14]</a>. The ECJ ruling recognised that the data subject's rights override the interest of internet users, with exceptions depending on the nature of the information, its sensitivity for the data subject's private life, and the data subject's role in public life.</p>
<p style="text-align: justify; ">In May, the Norwegian Consumer Council filed a complaint with the Norwegian Consumer Ombudsman, “… based on the discrepancies between Norwegian Law and the standard terms and conditions applicable to the Apple iCloud service...”, and, “...in breach of the law regarding control of marketing and standard agreements.”<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn15">[15]</a> The council based its complaint on the results of a study, published earlier this year, which found that terms were hazy and varied across services including iCloud, Dropbox, Google Drive, Jotta Cloud, and Microsoft OneDrive. The study found that Google's TOS allow users' content to be used for purposes other than storage, including by partners, and that Google retains usage rights even after the service is cancelled. None of the providers guarantees that data is safe from loss, while many can terminate an account without notice. All of the service providers can change their terms of service, but only Google and Microsoft give advance notice.</p>
<p style="text-align: justify; ">The study also found the service providers lacking with respect to European privacy standards, with many allowing themselves to browse user content. Tellingly, Google had been fined in January by the French Data Protection Authority, which observed that under its TOS Google "permits itself to combine all the data it collects about its users across all of its services without any legal basis."</p>
<p style="text-align: justify; "><b>To blame or not to blame</b></p>
<p style="text-align: justify; ">Facebook is facing a probe by the UK Information Commissioner's Office to assess whether the experiment, conducted in 2012, violated data privacy laws.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn16">[16]</a> The FTC has asked the court to order T-Mobile USA to stop mobile cramming, provide refunds, and give up any revenues from the practice. The existing mechanisms of online consent do not simplify the task of agreeing to multiple documents and services at once, a complexity that is multiplied by the involvement of third parties.</p>
<p style="text-align: justify; ">Unsurprisingly, T-Mobile's Legere termed the FTC lawsuit misdirected and blamed the companies providing the text services for the cramming.<a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftn17">[17]</a> He felt those providers should be held accountable, despite allegations that T-Mobile's billing practices made it difficult for consumers to detect that they were being charged for unauthorized services, and despite T-Mobile having shared revenues with the third-party providers. Interestingly, this is the first cramming action against a wireless carrier; the FTC has a precedent of going after the smaller companies that provide the services.</p>
<p style="text-align: justify; ">The FTC charged T-Mobile USA with deceptive billing practices for lumping the crammed charges into totals for 'use charges' and 'premium services' and failing to make clear that a portion of the charge went to third parties. Further, the company urged customers to take complaints to the vendors and was not forthcoming with refunds. While T-Mobile may be able to share the blame for now, the incident calls its accountability into question, especially as it has since entered a pact with other US carriers, including Verizon and AT&T, agreeing to stop billing customers for third-party services. Even when practices such as cramming are deemed illegal, harm is not necessarily prevented: users often bear the burden of claiming refunds, litigation comes at a cost, and even after being fined, companies may well have profited from their actions.</p>
<p style="text-align: justify; "><b>Conclusion </b></p>
<p style="text-align: justify; ">Unfair terms and conditions may arise when service providers include terms that are difficult to understand or vague in scope. TOS that prevent users from taking legal action, or that negate the provider's liability even for actions with a direct bearing on users, are also considered unfair. More importantly, any term that is hidden until after the contract is signed, or that gives the provider the right to change the contract to its own benefit, granting the provider far wider rights than users, such as a term that makes it very difficult for users to end a contract, creates an imbalance. These issues get further complicated when the companies controlling and profiting from the data are doing so with user-generated data provided free to the platform.</p>
<p style="text-align: justify; ">In the knowledge economy, web companies play a decisive role: even though they work for profit, the profit is derived from the knowledge held by individuals and groups. In their function of aggregating human knowledge, they collect the outcomes of individual choices and provide opportunities for feedback on them. Consent becomes a critical part of the equation when harnessing individual information. In France, consent is one of the four conditions necessary to form a valid contract (Article 1108 of the Code Civil).</p>
<p style="text-align: justify; ">These cases highlight the complexities inherent in the existing mechanisms of online consent. The question of consent has many underlying layers, such as reasonable notice and the contractual obligations related to consent explored in the case in Canada that looked at whether clauses of a TOS were communicated reasonably to the user, a topic for another blog. For now, we must remember that by creating and organising social knowledge that furthers human activity, service providers serve a powerful function. And as the saying goes, with great power comes great responsibility.</p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref1">[1]</a> 'FTC Alleges T-Mobile Crammed Bogus Charges onto Customers’ Phone Bills', published 1 July, 2014. See: http://www.ftc.gov/news-events/press-releases/2014/07/ftc-alleges-t-mobile-crammed-bogus-charges-customers-phone-bills</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref2">[2]</a> 'Experimental evidence of massive-scale emotional contagion through social networks', Adam D. I. Kramer, Jamie E. Guillory, and Jeffrey T. Hancock, published March 25, 2014. See: http://www.pnas.org/content/111/24/8788.full.pdf+html?sid=2610b655-db67-453d-bcb6-da4efeebf534</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref3">[3]</a> 'U.S. sues T-Mobile USA, alleges bogus charges on phone bills', Reuters, published 1 July, 2014. See: http://www.reuters.com/article/2014/07/01/us-tmobile-ftc-idUSKBN0F656E20140701</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref4">[4]</a> 'The Cost of Reading Privacy Policies', Aleecia M. McDonald and Lorrie Faith Cranor, published I/S: A Journal of Law and Policy for the Information Society 2008 Privacy Year in Review issue. See: http://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref5">[5]</a> 'Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days', Alexis C. Madrigal, published The Atlantic, March 2012 See: http://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref6">[6]</a> Facebook Legal Terms. See: https://www.facebook.com/legal/terms</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref7">[7]</a> 'Facebook's Eroding Privacy Policy: A Timeline', Kurt Opsahl, published by the Electronic Frontier Foundation, April 28, 2010. See: https://www.eff.org/deeplinks/2010/04/facebook-timeline</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref8">[8]</a> Facebook Data Use Policy. See: https://www.facebook.com/about/privacy/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref9">[9]</a> 'When ‘Liking’ a Brand Online Voids the Right to Sue', Stephanie Strom, published in New York Times on April 16, 2014 See: http://www.nytimes.com/2014/04/17/business/when-liking-a-brand-online-voids-the-right-to-sue.html?ref=business</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref10">[10]</a> Explaining our website privacy policy and legal terms, published April 17, 2014. See: http://www.blog.generalmills.com/2014/04/explaining-our-website-privacy-policy-and-legal-terms/</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref11">[11]</a> General Mills Amends New Legal Policies, Stephanie Strom, published in the New York Times. See: http://www.nytimes.com/2014/04/18/business/general-mills-amends-new-legal-policies.html?_r=0</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref12">[12]</a> Edward Snowden Statement to European Parliament published March 7, 2014. See: http://www.europarl.europa.eu/document/activities/cont/201403/20140307ATT80674/20140307ATT80674EN.pdf</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref13">[13]</a> Progress on EU data protection reform now irreversible following European Parliament vote, published 12 March 2014. See: http://europa.eu/rapid/press-release_MEMO-14-186_en.htm</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref14">[14]</a> European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties, Jyoti Panday, published on CIS blog on May 14, 2014. See: http://cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref15">[15]</a> Complaint regarding Apple iCloud's terms and conditions, published on 13 May 2014. See: http://www.forbrukerradet.no/_attachment/1175090/binary/29927</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref16">[16]</a> 'Facebook faces UK probe over emotion study' See: http://www.bbc.co.uk/news/technology-28102550</p>
<p style="text-align: justify; "><a href="file:///C:/Users/jyoti/Desktop/Reading%20the%20fine%20script%20When%20terms%20and%20conditions%20apply.docx#_ftnref17">[17]</a> Our Reaction to the FTC Lawsuit See: http://newsroom.t-mobile.com/news/our-reaction-to-the-ftc-lawsuit.htm</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights'>http://editors.cis-india.org/internet-governance/blog/reading-between-the-lines-service-providers-terms-and-conditions-and-consumer-rights</a>
</p>
Problems Remain with Standing Committee's Report on Copyright Amendments
http://editors.cis-india.org/a2k/blogs/sc-report-on-amendments
<b>The Rajya Sabha Standing Committee on Human Resource Development (under which ministry copyright falls) recently tabled their report on the Copyright (Amendment) Bill, 2010 before Parliament. There is much to be applauded in the report, including the progressive stand that the Committee has taken on the issue of providing access by persons with disabilities. This post, however, will concern itself with highlighting some of the problems with that report, along with some very important considerations that got missed out of the entire amendment debate.</b>
<h2 id="internal-source-marker_0.7517305351026772">Fair Dealings and Intermediary Liability</h2>
<p>The
amendments make a number of changes to s.52(1) of the Act, including to
the fair dealing provisions under s.52(1)(a), and introduction of two
new sub-sections (s.52(1)(b) and (c)) with s.52(1)(c) introducing a
modicum of protection for intermediaries involved in "transient and
incidental storage for the purpose of providing electronic links, access
or integration" (but only if the copyright holder has not expressed any
objections, and if the intermediary believes it to be non-infringing).
The provision allows the intermediary to ask the person complaining
against it to provide a court order within 14 days, since the
intermediary is in no position to determine the judicial question of
whether the copyright holder holds copyright and if the third party has
violated that copyright. However, this provision was opposed tooth and
nail by the copyright holders' associations that dominated the
representations, while intermediaries and consumers remained woefully
under-represented before the Standing Committee.</p>
<p>Predictably,
the Standing Committee dealt a blow against intermediaries and
consumers by asking the government to review the "viability of the
duration of 14 days... by way of balancing the views of the stakeholders
as well as the legal requirement in the matter". They recommended only a
relatively minor change: replacing the phrase "transient and
incidental" with "transient or incidental". By doing this, they failed to
address the concerns raised by Yahoo India and Google India, and also
failed to acknowledge the submissions made by 22 civil society
organizations (available here:
http://cis-india.org/advocacy/ipr/upload/copyright-bill-submission).</p>
<p> </p>
<h2>Technological Protection Measures and Rights Management Information Provision</h2>
<p>The
amendments aim to bring about two new criminal provisions, and seek to
make circumvention of technological protection measures (digital locks)
and alteration of rights management information (which are embedded into
digital files and signals) illegal.</p>
<p>The Standing Committee heard a number of organizations on technological protection measures, which <a href="http://editors.cis-india.org/a2k/blogs/tpm-copyright-amendment">we had argued</a>
are harmful as they a) cannot distinguish between fair dealing and
infringement, and b) are harmful even if a legal right to circumvent for
fair dealings is provided because the technological means to circumvent
doesn't necessarily exist. (Imagine a law that says that breaking a
lock using lock-breaking implements isn't a crime if it is done to enter
into your own house. Such a law doesn't help you if you can't get your
hands on the lock-breaking implements in the first place.) The Indian
Broadcasting Federation, the Business Software Alliance, and the Motion
Picture Association (which represents six studios, all American), the
Indian Music Industry, and the Indian Performing Right Society Limited
all felt that this provision did not go far enough. The Motion Picture
Association, for instance, wants not just controls over that which
copyright covers</p>
<p>Yahoo
India and Google India on the other hand thought that provision went
too far. Google made it clear that they thought having criminal
repercussions for circumvention was clearly disproportionate. A clear
split thus emerges between old media companies, clutching at straws that
they feel will save them from having to adapt their business practices
to the digital environment, and online companies, which understand the
digital environment better and take a markedly different view.</p>
<p>Currently, section 65B (read with the definition of "Rights Management Information" in section 2(xa)) of the proposed amendments ensures that Rights Management Information cannot be used to spy on users. The Indian Reprographic Rights Organization, however, believes this is wrong: it thinks copyright owners should be able to track users without their consent. Yahoo India, on the other hand, considers the provision harmful, stating that "the imposition of criminal and monetary liability could adversely affect consumers", and cites the difficulties that would be faced by "entities engaged in creating copies of any copyright material into a format specially designed for persons suffering from disability", because the language of the provision requires knowledge rather than intention. The Committee responds by summing up with a tautology, stating:</p>
<blockquote>
<p>The
Committee is of the view that the parties responsible for distribution
or broadcasting or communication to the public through authorized
licence from the author or rights holder and who do not remove any
rights management information deliberately for making unauthorized
copies need not worry about this provision as long as their act is as
per the framework of this provision.</p>
</blockquote>
<h2>Implications of Standing Committee's Report Unclear</h2>
<p>Many of the comments made by the Standing Committee are unclear. On compulsory licensing, the committee states:</p>
<blockquote>The
Committee also takes note of the proposed amendments in section 31 A
relating to compulsory licence in unpublished Indian works. The
provision of compulsory licence for orphaned works available under this
section is proposed to be extended to published works as well. Like in
the case of section 31, extension of applicability to all foreign works
(including film, DVDs, etc.) could be violative of Berne Convention and
TRIPS Agreement and seem to fall short of the minimum obligations
imposed by such instruments. The Committee is of the view that future
implication of proposed amendment in Section 31A vis-à-vis India's
commitment to international agreement needs to be free from any
ambiguity so as to prevent any negative fallout.<br /></blockquote>
<p>However, the phrase "could be violative" leaves it unclear whether the Standing Committee actually believes the proposed amendments violate the TRIPS Agreement. All the Standing Committee says is that the provision needs to be unambiguous and that TRIPS compliance must be ensured. That word of caution does not directly rebut the government's contention that the proposed amendment is TRIPS-compliant.</p>
<p>Similarly, the Committee's view on the increase of the copyright term for cinematograph films is unclear. While commenting on the clause that introduces the term increase (as part of the proposal to include the principal director as an author of the film along with the producer), the Committee states:</p>
<blockquote>It,
therefore, recommends that the proposal to include principal director
as author of the film along with producer may be dropped altogether.<br /></blockquote>
<p>While this presumably means that the proposal to increase the term is also rejected, the Committee's comments do not make that clear.</p>
<h2>Increased Copyright Duration, Expansive Moral Rights and Other Negative Changes</h2>
<p>In the submission of CIS and twenty-one other civil society organizations to the Standing Committee, we highlighted all of the concerns below. However, our submission was not tabled before the Standing Committee, for reasons unknown to us.</p>
<ul><li><strong>WCT
and WPPT compliance</strong>: India has not signed either of these two treaties,
which impose TRIPS-plus copyright protection, but without any
corresponding increase in fair dealing / fair use rights. Given that
the Standing Committee has recommended against some aspects of WCT
compliance (such as the move to change "hire" to "commercial rental")
and that without such changes India cannot be a signatory to the WCT, it
is unclear why other forms of WCT compliance (such as TPMs) should be
implemented.</li><li><strong>Increase
in duration of copyright</strong>: The duration of copyright of photographs and
video recordings is sought to be increased. The term of copyright for photographs is being increased from sixty years from creation to sixty years from death of the photographer. This will
significantly reduce the public domain, which India has been arguing for
internationally, especially through its push for the Development Agenda at the World Intellectual Property Organization.<br /></li><li><strong>Moral
rights</strong>: Changes have been made to author’s moral rights (and
performer’s moral rights have been introduced) but these have been made
without requisite safeguards.</li><li><strong>Version
recordings</strong>: The amendments make cover versions much more difficult to
produce, and while the Standing Committee has addressed the concerns of
some in the music industry, it hasn't addressed the concerns of artists
and consumers.</li></ul>
<h2>Criminal Provisions, Government Works, and Other Missed Opportunities</h2>
<p>The following important changes should have been made by the government, but weren't. While on some issues the Standing Committee has gone beyond the proposed amendments, it hasn't touched upon any of the following, which we believe are very important changes that need to be made.</p>
<ul><li><strong>Criminal
provisions</strong>: Our law still criminalises individual, non-commercial
copyright infringement. This criminalisation has now been extended to the proposed provisions on circumvention of Technological Protection Measures and removal of Rights Management Information as well.</li><li><strong>Government
works:</strong> Taxpayers are still not free to use works that were paid for by
them. This goes against the direction that India has elected to march
towards with the Right to Information Act. A simple amendment of
s.52(1)(q) would suffice. The amended subsection would exempt "the reproduction, communication to the public, or publication of any government work" as a non-infringing use.</li><li><strong>Copyright
terms</strong>: The durations of all copyrights exceed the minimums required by
our international obligations, thus decreasing the public domain which
is crucial for all scientific and cultural progress.</li><li><strong>Educational exceptions</strong>: The exceptions for education still do not fully embrace distance and digital education.</li><li><strong>Communication
to the public</strong>: No clear definition is given of what constitutes a
‘public’, and no distinction is drawn between commercial and
non-commercial ‘public’ communication.</li><li><strong>Internet
intermediaries</strong>: More protections are required to be granted to Internet
intermediaries to ensure that non-market based peer-production projects
such as Wikipedia, and other forms of social media and grassroots
innovation are not stifled.</li><li><strong>Fair
dealing and fair use</strong>: We would benefit greatly if, apart from the
specific exceptions provided for in the Act, more general guidelines
were also provided as to what does not constitute infringement. This would
not take away from the existing exceptions.</li></ul>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/sc-report-on-amendments'>http://editors.cis-india.org/a2k/blogs/sc-report-on-amendments</a>
</p>
No publisher | pranesh | Access to Knowledge, Copyright, Intellectual Property Rights, Intermediary Liability, Technological Protection Measures | 2011-09-06T07:50:12Z | Blog Entry

Primer on the New IT Act
http://editors.cis-india.org/internet-governance/blog/primer-it-act
<b>With this draft information bulletin, we briefly discuss some of the problems with the Information Technology Act, and invite your comments.</b>
<p align="justify">The latest amendments to
the Information Technology Act 2000, passed in December 2008 by the
Lok Sabha, and the draft rules framed under it contain several provisions
that can be abused and misused to infringe seriously on citizens'
fundamental rights and basic civil liberties. We have already <a href="http://editors.cis-india.org/internet-governance/it-act/short-note-on-amendment-act-2008" class="internal-link" title="Short note on IT Amendment Act, 2008">written about some of the problems</a> with this Act earlier. With this information bulletin, drafted by Chennai-based advocate Ananth Padmanabhan, we wish to extend that analysis into the form of a citizens' dialogue highlighting ways in which the Act and the rules under it fail. Thus, we invite your comments, suggestions, and queries, as this is very much a work in progress. We will eventually consolidate this dialogue and follow up with the government on the concerns of its citizens.</p>
<h3 align="justify">Intermediaries
beware</h3>
<p align="justify">Internet service
providers, webhosting service providers, search engines, online
payment sites, online auction sites, online market places, and cyber
cafes are all examples of “intermediaries” under this Act. The
Government can force any of these intermediaries to cooperate with
any interception, monitoring or decryption of data by stating broad
and ambiguous reasons such as the “interest of the sovereignty or
integrity of India”, “defence of India”, “security of the
State”, “friendly relations with foreign States”, “public
order” or for “preventing incitement to” or “investigating”
the commission of offences related to those. This power can be abused
to infringe on the privacy of intermediaries as well as to hamper
their constitutional right to conduct their business without interference.</p>
<p align="justify">If a Google search on
“Osama Bin Laden” throws up an article that claims to have
discovered his place of hiding, the Government of India can issue a
direction authorizing the police to monitor Google’s servers to
find the source of this information. Google could, of course, establish that this information cannot be attributed directly to the organization, making the search unwarranted, but that would not help it much. While section 69 grants the government these wide-ranging
powers, it does not provide for adequate safeguards in the form of having to show due cause or having an in-built right of appeal against a decision by the government. If Google refused
to cooperate under such circumstances, its directors would be liable
to imprisonment of up to seven years.</p>
<h3 align="justify">Pre-censorship<br /></h3>
<p align="justify">The State has been given
unbridled power to block access to websites as long as such blocking
is deemed to be in the interest of sovereignty and integrity of
India, defence of India, security of the State, friendly relations
with foreign States, and other such matters.</p>
<p align="justify">Thus, if a web portal or
blog carries or expresses views critical of the Indo-US nuclear deal,
the government can block access to the website and thus muzzle criticism
of its policies. While some may find that suggestion outlandish, it is very much possible under the Act. Since there is no right to be heard before your website is taken down, nor any in-built mechanism for the website owner to appeal, the government's decisions cannot be questioned unless you are prepared to undertake a costly legal battle.</p>
<p align="justify">Again, if an intermediary (like Blogspot or an ISP like Airtel) refuses to cooperate, its directors may be personally liable to imprisonment for up to a period of seven years. Thus, being personally liable, the intermediaries are rid of any incentive to stand up for the freedom of speech and expression.</p>
<h3 align="justify">We need to monitor your computer: you have a virus<br /></h3>
<p align="justify">The government has been
vested with the power to authorize the monitoring and collection of
traffic data and information generated, transmitted, received or
stored in any computer resource. This provision is far too broadly worded.</p>
<p align="justify">For instance, if the
government feels that there is a virus on your computer that can
spread to another computer, it can demand access to monitor your
e-mails on the ground that such monitoring enhances “cyber
security” and prevents “the spread of computer contaminants”.</p>
<h3 align="justify">Think before you click "Send"<br /></h3>
<p align="justify">If out of anger you send
an e-mail for the purpose of causing “annoyance” or
“inconvenience”, you may be liable for imprisonment up to three
years along with a fine. While that provision (section 66A(c)) was
meant to combat spam and phishing attacks, it criminalizes much more
than it should.</p>
<h3 align="justify">A new brand of "cyber terrorists" <br /></h3>
<p align="justify">The new offence of “cyber
terrorism” has been introduced, which is so badly worded that it
borders on the ludicrous. If a journalist gains
unauthorized access to a computer where information regarding
corruption by certain members of the judiciary is stored, she becomes
a “cyber terrorist” as the information may be used to cause
contempt of court. There is no precedent for any such definition of cyberterrorism. It is unclear what definition of terrorism the government is going by when even unauthorized access to defamatory material is considered cyberterrorism.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/primer-it-act'>http://editors.cis-india.org/internet-governance/blog/primer-it-act</a>
</p>
No publisher | pranesh | IT Act, Digital Governance, Public Accountability, Intermediary Liability, Censorship | 2011-08-02T07:41:54Z | Blog Entry

Press Coverage of Online Censorship Row
http://editors.cis-india.org/internet-governance/blog/press-coverage-online-censorship
<b>We are maintaining a rolling blog with press references to the row created by the proposal by the Union Minister for Communications and Information Technology to pre-screen user-generated Internet content.</b>
<h2>Monday, December 5, 2011</h2>
<p><a href="http://india.blogs.nytimes.com/2011/12/05/india-asks-google-facebook-others-to-screen-user-content/?pagemode=print">India Asks Google, Facebook to Screen Content</a> | Heather Timmons (New York Times, India Ink)</p>
<h2>Tuesday, December 6, 2011</h2>
<p><a href="http://www.thehindu.com/news/national/article2690084.ece">Sibal warns social websites over objectionable content</a> | Sandeep Joshi (The Hindu)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2691781.ece">Hate speech must be blocked, says Sibal</a> | Praveen Swami & Sujay Mehdudia (The Hindu)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2692821.ece">Won't remove material just because it's controversial: Google</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://india.blogs.nytimes.com/2011/12/06/any-normal-human-being-would-be-offended/">Any Normal Human Being Would Be Offended </a>| Heather Timmons (New York Times, India Ink)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2692047.ece">After Sibal, Omar too feels some online content inflammatory </a>| (Press Trust of India)</p>
<p><a class="external-link" href="http://www.reuters.com/article/2011/12/06/us-india-internet-idUSTRE7B50CV20111206">Online uproar as India seeks social media screening</a> | Devidutta Tripathy and Anurag Kotoky (Reuters)</p>
<p><a class="external-link" href="http://articles.economictimes.indiatimes.com/2011-12-06/news/30481824_1_kapil-sibal-objectionable-content-twitter">Kapil Sibal for content screening: Facebook, Twitter full of posts against censorship</a> | (IANS)</p>
<p><a class="external-link" href="http://www.pcworld.com/businesscenter/article/245548/india_may_overstep_its_own_laws_in_demanding_content_filtering.html">India May Overstep Its Own Laws in Demanding Content Filtering</a> | John Ribeiro (IDG)</p>
<p><a class="external-link" href="http://articles.timesofindia.indiatimes.com/2011-12-06/internet/30481147_1_shashi-tharoor-objectionable-content-bjp-mp">Kapil Sibal warns websites: Mixed response from MPs</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://www.youtube.com/watch?v=WJp8HOPzc7k">Websites must clean up content, says Sibal </a>| (NewsX)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/tech/news/internet/Kapil-Sibal-warns-websites-Google-says-wont-remove-material-just-because-its-controversial/articleshow/11008985.cms">Kapil Sibal warns websites; Google says won't remove material just because it's controversial </a>| Press Trust of India</p>
<p><a class="external-link" href="http://www.livemint.com/2011/12/06155955/Views--Censorship-by-any-othe.html?h=A1">Censorship By Any Other Name...</a> | Yamini Lohia (Mint)</p>
<p><a class="external-link" href="http://articles.timesofindia.indiatimes.com/2011-12-06/internet/30481193_1_facebook-and-google-facebook-users-facebook-page">Kapil Sibal: We have to take care of sensibility of our people</a> | Associated Press</p>
<p><a class="external-link" href="http://articles.timesofindia.indiatimes.com/2011-12-06/india/30481473_1_digvijaya-singh-websites-content">Kapil Sibal gets backing of Digvijaya Singh over social media screening</a> | Press Trust of India</p>
<p><a class="external-link" href="http://www.hindustantimes.com/News-Feed/newdelhi/Sibal-gets-what-he-set-out-to-censor/Article1-778388.aspx">Sibal Gets What He Set Out To Censor </a>| (Hindustan Times, Agencies)</p>
<p><a class="external-link" href="http://newstonight.net/content/objectionable-matter-will-be-removed-censorship-not-picture-yet-kapil-sibal">Objectionable Matter Will Be Removed, Censorship Not in Picture Yet: Kapil Sibal</a> | Amar Kapadia (News Tonight)</p>
<h2>Wednesday, December 7, 2011</h2>
<p><a class="external-link" href="http://indiatoday.intoday.in/story/kapil-sibal-for-monitoring-offensive-content-on-internet/1/163107.html">Kapil Sibal Doesn't Understand the Internet</a> | Shivam Vij (India Today)</p>
<p><a class="external-link" href="http://india.blogs.nytimes.com/2011/12/07/chilling-impact-of-indias-april-internet-rules/">'Chilling' Impact of India's April Internet Rules</a> | Heather Timmons (New York Times, India Ink)</p>
<p><a class="external-link" href="http://www.business-standard.com/india/news/screening-not-censorship-says-sibal/457797/">Screening, not censorship, says Sibal</a> | (Business Standard)</p>
<p><a class="external-link" href="http://www.livemint.com/2011/12/07202955/Chandni-Chowk-to-China.html">Chandni Chowk to China</a> | Salil Tripathi (Mint)</p>
<p><a class="external-link" href="http://www.livemint.com/2011/12/07131308/Views--Kapil-Sibal-vs-the-int.html">Kapil Sibal vs the internet</a> | Sandipan Deb (Mint)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/tech/news/internet/No-need-for-censorship-of-internet-Cyber-law-experts/articleshow/11014990.cms">No Need for Censorship of the Internet: Cyber Law Experts</a> | (Times News Network)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2695832.ece">Protest with flowers for Sibal</a> | (The Hindu)</p>
<p><a class="external-link" href="http://www.dnaindia.com/india/report_kapil-sibal-cannot-screen-this-report_1622435">Kapil Sibal cannot screen this report</a> | Team DNA, Blessy Chettiar & Renuka Rao (Daily News and Analysis)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/india/Kapil-Sibal-warns-websites-but-experts-say-prescreening-of-user-content-not-practical/articleshow/11019481.cms">Kapil Sibal warns websites, but experts say prescreening of user content not practical </a>| (Reuters)</p>
<p><a class="external-link" href="http://newstonight.net/content/sibal-s-remarks-brought-disgust">Sibal's Remarks Brought Disgust</a> | Hitesh Mehta (News Tonight)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2695884.ece">BJP backs mechanism to curb objectionable content on websites</a> | (The Hindu)</p>
<p><a class="external-link" href="http://economictimes.indiatimes.com/news/politics/nation/move-to-regulate-networking-sites-should-be-discussed-in-parliament-bjp/articleshow/11023284.cms">Move to regulate networking sites should be discussed in Parliament: BJP</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://www.dailypioneer.com/pioneer-news/top-story/26016-sibal-under-attack-in-cyberspace.html">Sibal under attack in cyberspace</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/tech/news/internet/Google-Govt-wanted-358-items-removed/articleshow/11021470.cms">Kapil Sibal's web censorship: Indian govt wanted 358 items removed, says Google</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/india/Kapil-Sibal-gets-BJP-support-but-with-rider/articleshow/11020128.cms">Kapil Sibal gets BJP support but with rider</a> | (Indo-Asian News Service)</p>
<p><a class="external-link" href="http://www.hindustantimes.com/India-news/NewDelhi/Sibal-s-way-of-regulating-web-not-okay-says-BJP/Article1-779221.aspx">Sibal's way of regulating web not okay, says BJP</a> | (Indo-Asian News Service)</p>
<p><a class="external-link" href="http://blogs.hindustantimes.com/just-faith/?p=1034">Censorship in Blasphemy's Clothings</a> | Gautam Chikermane (Hindustan Times, Just Faith)</p>
<p><a class="external-link" href="http://www.computerworld.com/s/article/9222500/India_wants_Google_Facebook_to_screen_content">India wants Google, Facebook to screen content</a> | Sharon Gaudin (Computer World)</p>
<p><a class="external-link" href="http://www.zdnetasia.com/blogs/should-we-be-taming-social-media-62303153.htm">Should we be taming social media?</a> | Swati Prasad (ZDNet, Inside India)</p>
<p><a class="external-link" href="http://www.dnaindia.com/bangalore/report_kapil-sibal-gets-lampooned-for-views-on-web-control_1622491">Kapil Sibal gets lampooned for views on Web control</a> | (Daily News and Analysis)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/life-style/people/We-dont-need-no-limitation/articleshow/11020244.cms">'We don't need no limitation'</a> | Asha Prakash (Times of India)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/tech/news/internet/Five-reasons-why-India-cant-censor-the-internet/articleshow/11018172.cms">Five reasons why India can't censor the internet</a> | Prasanto K. Roy (Indo-Asian News Service)</p>
<p><a class="external-link" href="http://www.indianexpress.com/news/we-are-the-web/884753/">We Are the Web</a> | (Indian Express)</p>
<h2>Thursday, December 8, 2011</h2>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/india/Kapil-Sibal-under-attack-in-cyberspace/articleshow/11029319.cms">Kapil Sibal under attack in cyberspace</a>, (Press Trust of India)</p>
<p><a class="external-link" href="http://www.indianexpress.com/news/speak-up-for-freedom/885132/">Speak Up for Freedom </a>| Pranesh Prakash (Indian Express)</p>
<p><a class="external-link" href="http://india.blogs.nytimes.com/2011/12/08/newswallah-censorship/">Newswallah: Censorship</a> | Neha Thirani (New York Times, India Ink)</p>
<p><a class="external-link" href="http://www.ndtv.com/article/india/no-question-of-censoring-internet-says-sachin-pilot-156281">No Question of Censoring the Internet, Says Sachin Pilot </a>| (NDTV)</p>
<p><a class="external-link" href="http://www.economist.com/blogs/babbage/2011/12/web-censorship-india">Mind Your Netiquette, or We'll Mind it for You</a> | A.A.K. (The Economist)</p>
<p><a class="external-link" href="http://timesofindia.indiatimes.com/india/Take-Parliaments-view-to-regulate-social-networking-sites-BJP-tells-govt/articleshow/11025858.cms">Take Parliament's view to regulate social networking sites, BJP tells govt</a> | (Times News Network)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2696027.ece">India wanted 358 items removed</a> | Priscilla Jebaraj (The Hindu)</p>
<p><a class="external-link" href="http://www.barandbench.com/brief/2/1891/indian-government-v-social-networking-sites-expert-views">Indian Government v Social Networking sites: Expert Views</a> | (Bar & Bench News Network)</p>
<p><a class="external-link" href="http://business-standard.com/india/news/can-government-muzzle-websites/457909/">Can Government Muzzle Websites?</a> | Priyanka Joshi & Piyali Mandal (Business Standard)</p>
<p><a class="external-link" href="http://economictimes.indiatimes.com/news/international-business/us-concerned-over-internet-curbs-sidesteps-india-move/articleshow/11029532.cms">US concerned over internet curbs, sidesteps India move</a> | (Indo-Asian News Service)</p>
<p><a class="external-link" href="http://www.rediff.com/business/slide-show/slide-show-1-why-internet-companies-are-upset-with-kapil-sibal/20111208.htm">Why Internet Companies Are Upset with Kapil Sibal</a> | (Rediff)</p>
<p><a class="external-link" href="http://www.siliconindia.com/shownews/Why_Censor_Facebook_When_You_Dont_Censor_Sunny_Leone-nid-99931-cid-1.html">Why Censor Facebook When You Don't Censor Sunny Leone?</a> | (Indo-Asian News Service)</p>
<p><a class="external-link" href="http://www.thehindu.com/news/national/article2697432.ece">Online content issue: Talks with India on, says U.S.</a> | (Press Trust of India)</p>
<p><a class="external-link" href="http://www.google.com/hostednews/afp/article/ALeqM5h0BfQkpJMZISTc3fjs3VgH7orciw?docId=CNG.8dc3992299cb598cecde0fffb1db8bcd.1c1">US calls for Internet freedom amid India plan</a> | Agence France-Presse</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/press-coverage-online-censorship'>http://editors.cis-india.org/internet-governance/blog/press-coverage-online-censorship</a>
</p>
No publisher | pranesh | IT Act, Links, Freedom of Speech and Expression, Internet Governance, Facebook, Intermediary Liability, Censorship | 2011-12-08T11:31:30Z | Blog Entry

Panel Discussion on Internet Intermediaries, Law and Innovation
http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation
<b>CII, Google and Centre For Communications Governance, NLU Delhi hosted a panel discussion on June 2 in New Delhi. Jyoti Panday attended.</b>
<p style="text-align: justify; ">The Centre for Internet & Society (CIS) participated in the panel discussion on 'Internet Intermediaries, Law and Innovation' hosted by CII, Google and Centre For Communications Governance, NLU Delhi. The panel discussed the impact of the existing provisions on intermediary liability and innovation and sought suggestions and the way forward.<br /><br />The panel was moderated by Dr Subho Ray, President, IAMAI<br /><br />Other panelists included:</p>
<ul style="text-align: justify; ">
<li> Mr Anupam Chander, Eminent Global Lawyer & Academician</li>
<li> Mr Apar Gupta, Advocate</li>
<li> Ms Mishi Choudhary, Founding Director , Software Freedom Law Centre</li>
<li> Mr J Sai Deepak, Associate Partner, Litigation Team, Saikrishna & Associates</li>
<li> Mr Indranil Choudhury, Founder and CEO, Lexplosion</li>
</ul>
<p><a href="http://editors.cis-india.org/internet-governance/blog/internet-intermediaries-law-and-innovation-panel.odp" class="internal-link">Click to download the presentation.</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation'>http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation</a>
</p>
No publisher | jyoti | Internet Governance, Intermediary Liability | 2015-06-14T16:37:56Z | News Item

Overview of the Constitutional Challenges to the IT Act
http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact
<b>There are currently ten cases before the Supreme Court challenging various provisions of the Information Technology Act, the rules made under it, and other laws; they are being heard jointly. Advocate Gopal Sankaranarayanan, who is arguing Anoop M.K. v. Union of India, has put together this chart to help you track what is being challenged in each case.</b>
<br />
<br />
<br />
<table class="tg" style="undefined;table-layout: fixed; border=">
<tr>
<th class="tg-s6z2">PENDING MATTERS</th>
<th class="tg-s6z2">CASE NUMBER</th>
<th class="tg-0ord">PROVISIONS CHALLENGED</th>
</tr>
<tr>
<td class="tg-4eph">Shreya Singhal v. Union of India</td>
<td class="tg-spn1">W.P.(CRL.) NO. 167/2012</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Common Cause & Anr. v. Union of India</td>
<td class="tg-s6z2">W.P.(C) NO. 21/2013</td>
<td class="tg-0ord">66A, 69A & 80</td>
</tr>
<tr>
<td class="tg-4eph">Rajeev Chandrasekhar v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 23/2013</td>
<td class="tg-zapm">66A & Rules 3(2), 3(3), 3(4) & 3(7) of the Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Dilip Kumar Tulsidas Shah v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(C) NO. 97/2013</td>
<td class="tg-0ord">66A</td>
</tr>
<tr>
<td class="tg-4eph">Peoples Union for Civil Liberties v. Union of India & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 199/2013</td>
<td class="tg-zapm">66A, 69A, Intermediaries Rules 2011 (s.79(2) Rules) & Blocking of Access of Information by Public Rules 2009 (s.69A Rules)</td>
</tr>
<tr>
<td class="tg-031e">Mouthshut.Com (India) Pvt. Ltd. & Anr. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(C) NO. 217/2013</td>
<td class="tg-0ord">66A & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-4eph">Taslima Nasrin v. State of U.P & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 222/2013</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Manoj Oswal v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 225/2013</td>
<td class="tg-0ord">66A & 499/500 Indian Penal Code</td>
</tr>
<tr>
<td class="tg-4eph">Internet and Mobile Ass'n of India & Anr. v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 758/2014</td>
<td class="tg-zapm">79(3) & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Anoop M.K. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 196/2014</td>
<td class="tg-0ord">66A, 69A, 80 & S.118(d) of the Kerala Police Act, 2011</td>
</tr>
</table>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact'>http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact</a>
</p>
No publisher | pranesh | IT Act, Court Case, Freedom of Speech and Expression, Intermediary Liability, Constitutional Law, Censorship, Section 66A, Article 19(1)(a), Blocking | 2014-12-19T09:01:50Z | Blog Entry

Online Pre-Censorship is Harmful and Impractical
http://editors.cis-india.org/internet-governance/online-pre-censorship-harmful-impractical
<b>The Union Minister for Communications and Information Technology, Mr. Kapil Sibal wants Internet intermediaries to pre-censor content uploaded by their users. Pranesh Prakash takes issue with this and explains why this is a problem, even if the government's heart is in the right place. Further, he points out that now is the time to take action on the draconian IT Rules which are before the Parliament.</b>
<p>Mr. Sibal is a knowledgeable lawyer and, according to a senior lawyer friend of his with whom I spoke yesterday, greatly committed to the ideals of freedom of speech. He would not lightly propose regulations that contravene Article 19(1)(a) [freedom of speech and expression] of our Constitution. Yet his recent proposals for controlling online speech seem unreasonable. My conclusion is that the minister has not properly grasped how the Web works and is frustrated by the arrogance of companies like Facebook, Google, Yahoo and Microsoft. While his heart is in the right place, his lack of knowledge of the Internet is leading him astray. The more important concern is the <a class="external-link" href="http://www.mit.gov.in/sites/upload_files/dit/files/RNUS_CyberLaw_15411.pdf">IT Rules</a> that have been in force since April 2011.</p>
<h3>Background <br /></h3>
<p>The New York Times scooped a story on Monday revealing that Mr. Sibal and the <a class="external-link" href="http://www.mit.gov.in/">MCIT</a> had been <a class="external-link" href="http://india.blogs.nytimes.com/2011/12/05/india-asks-google-facebook-others-to-screen-user-content/?scp=2&sq=kapil%20sibal&st=cse">in touch with Facebook, Google, Yahoo, and Microsoft</a>, asking them to set up a system whereby they would manually filter user-generated content before it is published, to ensure that objectionable speech does not get published. Specifically, he mentioned content that hurt people's religious sentiments and content that Member of Parliament Shashi Tharoor described as <a class="external-link" href="http://zeenews.india.com/news/nation/i-am-against-web-censorship-shashi-tharoor_745587.html">'vile' and capable of inciting riots as being problems</a>. Lastly, Mr. Sibal defended this as not being "censorship" by the government, but "supervision" of user-generated content by the companies themselves.</p>
<h3>Concerns <br /></h3>
<p>One need not give lectures on the benefits of free speech, and Mr. Sibal is clear that he does not wish to impinge upon it. So one need not point out that freedom of speech means nothing if not the freedom to offend (as long as no harm is caused). There can, of course, be reasonable limitations on freedom of speech as provided in Article 19 of the <a class="external-link" href="http://www2.ohchr.org/english/law/ccpr.htm">ICCPR</a> and in Article 19(2) of our Constitution. My problem lies elsewhere.</p>
<h3>Secrecy <br /></h3>
<p>It is unfortunate that the New York Times has to be given credit for Mr. Sibal addressing a press conference on this issue (and he admitted as much). What he is proposing is not enforcement of existing rules and regulations, but of a new restriction on online speech. This should have, in a democracy, been put out for wide-ranging public consultations first.</p>
<h3>Making intermediaries responsible <br /></h3>
<p>The more fundamental disagreement is over how the question of what should not be published should be decided, how that decision should be carried out, and who can be held liable for unlawful speech. Mr. Sibal himself said in May this year, <a class="external-link" href="http://online.wsj.com/article/SB10001424052702304563104576355223687825048.html">in an interview with the Wall Street Journal</a>, that "to make the intermediary liable for the user violating that code would, I think, not serve the larger interests of the market", and I agree. Intermediaries (that is, all persons and companies who transmit or host content on behalf of a third party) are mere messengers, like a post office; they do not exercise editorial control the way a newspaper does. (By all means prosecute Facebook, Google, Yahoo, and Microsoft whenever they have created unlawful content, have exercised editorial control over unlawful content, have incited and encouraged unlawful activities, or know, after a court order or the like, that they are hosting illegal content and still do not remove it.)
Newspapers have editors who can take responsibility for content published in the newspaper. They can afford to, because the number of articles in a newspaper is limited. YouTube, which has 48 hours of video uploaded every minute, cannot. One wag suggested that Mr. Sibal was not proposing a means of censorship, but of employment generation and social welfare for censors and editors. Extending editorial duties to these 'intermediaries' by executive order, or through 'forceful suggestions' to these companies, cannot happen without amending s.79 of the Information Technology Act, which ensures that they are not to be held liable for their users' content: the users are.
Internet speech has, to my knowledge and to date, never caused a riot in India. It is when it is translated into inflammatory speeches on the ground, with megaphones, that offensive speech, whether in books or on the Internet, actually becomes harmful, and those speeches should be targeted instead. The same laws that apply to offline speech already apply online: if such speech incites violence, the police can be contacted and a magistrate can take action. Indeed, Internet companies like Facebook, Google, etc., already exercise self-regulation (excessively and wrongly, I sometimes feel). Any person can flag any content on YouTube or Facebook as violating the site's terms of use; even images of breast-feeding mothers have been removed from Facebook on the basis of such complaints. So it is mistaken to think that there is no self-regulation. In two recent cases, the High Courts of Bombay (<a href="http://editors.cis-india.org/internet-governance/janhit-manch-v-union-of-india" class="internal-link" title="Janhit Manch & Ors. v. The Union of India"><em>Janhit Manch v. Union of India</em></a>) and Madras (<em>R. Karthikeyan v. Union of India</em>) refused to direct the government and intermediaries to police online content, holding that doing so places an excessive burden on freedom of speech.</p>
<h3>IT Rules, 2011 <br /></h3>
<p>In this regard, the IT Rules published in April 2011 are great offenders. Speech that is 'disparaging' (while not being defamatory) is not prohibited by any statute, yet intermediaries are required not to carry 'disparaging' speech, or speech to which the user has no right (how is this to be judged? do you have rights to the last joke you forwarded?), or speech that promotes gambling (as the government of Sikkim does through the PlayWin lottery), and myriad other kinds of speech that are not prohibited in print or on TV. Who is to judge whether something is 'disparaging'? The intermediary itself, on pain of being liable for prosecution if it is found to have made the wrong decision. And any person may send a notice to an intermediary to 'disable' content, which must be done within 36 hours if the intermediary does not want to be held liable. Worst of all, there is no requirement to inform the user whose content it is, nor to inform the public that the content is being removed. It just disappears, into a memory hole. It does not require a paranoid conspiracy theorist to see this as a grave threat to freedom of speech.
Many human rights activists and lawyers have made a very strong case that the IT Rules on Intermediary Due Diligence are unconstitutional. Parliament has an opportunity to reject these rules until the end of the 2012 budget session. Parliamentarians must act now to uphold their oaths to the Constitution.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/online-pre-censorship-harmful-impractical'>http://editors.cis-india.org/internet-governance/online-pre-censorship-harmful-impractical</a>
</p>
No publisher | pranesh | IT Act, Obscenity, Freedom of Speech and Expression, Public Accountability, YouTube, Social media, Internet Governance, Featured, Intermediary Liability, Censorship, Social Networking | 2011-12-12T17:00:50Z | Blog Entry
On the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
http://editors.cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021
<b>This note examines the legality and constitutionality of the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The analysis is consistent with previous work carried out by CIS on issues of intermediary liability and freedom of expression. </b>
<p><span id="docs-internal-guid-6127737f-7fff-b2eb-1b4a-ff9009a1050f"></span></p>
<p dir="ltr">On 25 February 2021, the Ministry of Electronics and Information Technology (Meity) notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (hereinafter, ‘the rules’). In this note, we examine whether the rules meet the tests of constitutionality under Indian jurisprudence, whether they are consistent with the parent Act, and discuss potential benefits and harms that may arise from the rules as they are currently framed. Further, we make some recommendations to amend the rules so that they stay in constitutional bounds, and are consistent with a human rights based approach to content regulation. Please note that we cover some of the issues that CIS has already highlighted in comments on previous versions of the rules.</p>
<p dir="ltr"> </p>
<p dir="ltr">The note can be downloaded <a class="external-link" href="https://cis-india.org/internet-governance/legality-constitutionality-il-rules-digital-media-2021">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021'>http://editors.cis-india.org/internet-governance/blog/on-the-legality-and-constitutionality-of-the-information-technology-intermediary-guidelines-and-digital-media-ethics-code-rules-2021</a>
</p>
No publisher | Torsha Sarkar, Gurshabad Grover, Raghav Ahooja, Pallavi Bedi and Divyank Katira | Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Internet Freedom, Information Technology | 2021-06-21T11:52:39Z | Blog Entry
No more 66A!
http://editors.cis-india.org/internet-governance/blog/no-more-66a
<b>In a landmark decision, the Supreme Court has struck down Section 66A. Today was a great day for freedom of speech on the Internet! When Section 66A was in operation, if you made a statement that led to offence, you could be prosecuted. We are an offence-friendly nation, judging by media reports in the last year. It was a year of book-bans, website blocking and takedown requests. Facebook’s Transparency Report showed that next to the US, India made the most requests for information about user accounts. A complaint under Section 66A would be a ground for such requests.</b>
<p style="text-align: justify; ">Section 66A hung like a sword in the middle: Shaheen Dhada was arrested in Maharashtra for observing that Bal Thackeray’s funeral shut down the city, Devu Chodankar in Goa and Syed Waqar in Karnataka were arrested for making posts about Narendra Modi, and a Puducherry man was arrested for criticizing P. Chidambaram’s son. The law was vague and so widely worded that it was prone to misuse, and was in fact being misused.</p>
<p style="text-align: justify; ">Today, the Supreme Court struck down Section 66A in its judgment on a <a class="external-link" href="http://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact">set of petitions</a> heard together last year and earlier this year. Stating that the law is vague, the bench comprising Chelameshwar and Nariman, JJ. held that while restrictions on free speech are constitutional insofar as they are in line with Article 19(2) of the Constitution. Section 66A, they held, does not meet this test: The central protection of free speech is the freedom to make statements that “offend, shock or disturb”, and Section 66A is an unconstitutional curtailment of these freedoms. To cross the threshold of constitutional limitation, the impugned speech must be of such a nature that it incites violence or is an exhortation to violence. Section 66A, by being extremely vague and broad, does not meet this threshold. These are, of course, drawn from news reports of the judgment; the judgment is not available yet.</p>
<p style="text-align: justify; ">Reports also say that Section 79(3)(b) has been read down. Previously, any private individual or entity, and the government and its departments could request intermediaries to take down a website, without a court order. If the intermediaries did not comply, they would lose immunity under Section 79. The Supreme Court judgment states that both in Rule 3(4) of the Intermediaries Guidelines and in Section 79(3)(b), the "actual knowledge of the court order or government notification" is necessary before website takedowns can be effected. In effect, this mean that intermediaries <i>need not</i> act upon private notices under Section 79, while they can act upon them if they choose. This stops intermediaries from standing judge over what constitutes an unlawful act. If they choose not to take down content after receiving a private notice, they will not lose immunity under Section 79.</p>
<p style="text-align: justify; ">Section 69A, the website blocking procedure, has been left intact by the Court, despite infirmities such as a lack of judicial review and non-transparent operation. More updates when the judgment is made available.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/no-more-66a'>http://editors.cis-india.org/internet-governance/blog/no-more-66a</a>
</p>
No publisher | geetha | Censorship, Freedom of Speech and Expression, Homepage, Intermediary Liability, Featured, Chilling Effect, Section 66A, Article 19(1)(a), Blocking | 2015-03-26T02:01:31Z | Blog Entry
New Release of IPR Chapter of India-EU Free Trade Agreement
http://editors.cis-india.org/a2k/blogs/july-2010-ipr-india-eu-fta
<b>A draft of the IPR chapter of the EU-India FTA, made publicly available now for the first time, provides insight into India's response in July 2010 to several EU proposals on intellectual property protection and enforcement.</b>
<p>A draft of the IPR chapter of the EU-India FTA, made <a href="http://editors.cis-india.org/a2k/upload/india-eu-fta-ipr-july-2010/at_download/file" class="external-link">publicly available for the first time</a> (PDF, 296Kb), provides insight into India's response in July 2010 to several EU proposals on intellectual property protection and enforcement.
The consolidated draft which was prepared to serve as the basis of talks that took place from July 12-14, 2010, in New Delhi, reveals parties' negotiating stances in response to preliminary positions put forth earlier (see <a class="external-link" href="http://www.bilaterals.org/spip.php?article17290">IPR Chapter May draft</a>).</p>
<p>In particular, this draft reflects India's rejection of many EU proposals that would require India to:</p>
<ul><li>exceed its obligations under the WTO's Agreement on Trade-Related Aspects of Intellectual Property Rights (TRIPS), e.g., by providing data exclusivity for pharmaceutical products; <br /></li><li>impose radical enforcement provisions, such as liability of intermediary service providers, border measures for goods in transit, and raised norms for damages and injunctions; or <br /></li><li>require legislative change, e.g., on data protection, and to accommodate the full EU demands on geographical indicators. <br /></li></ul>
<p>
A chart compiled by CIS comparing proposed language by India and the EU in several provisions with TRIPS can be found <a href="http://editors.cis-india.org/a2k/india-eu-fta-chart.pdf" class="internal-link" title="New Release of IPR Chapter">here</a> (PDF, 402 Kb).</p>
<p>Sources close to the negotiations have also confirmed that during the July talks India reiterated its refusal to go beyond TRIPS, and its refusal to discuss issues that require changes to Indian law. India appears to have also reiterated that it could not finalise FTA copyright provisions before passage of the Copyright Amendment Bill in the Indian Parliament.</p>
<p>
It is hard to assess the current state of the negotiations on IP or to measure the outcomes of subsequently held talks without access to recent drafts, a public record of deliberations, or the schedule of full and intersessional rounds taking place. However, from press and other statements attributed to the European Commission and Indian officials after the December 2010 EU-India Summit in Brussels, it appears that:</p>
<ul><li>
both parties plan to conclude the FTA, the biggest ever for the EU, by Spring 2011; <br /></li><li>the EU has not relaxed its pursuit of at least some "TRIPS plus" provisions, such as data protection for pharmaceuticals; and <br /></li><li>a mutually agreed solution to India's WTO case against the EU over the seizure of generic medicines may be round the corner. Its impact on the FTA is open to speculation. <br /></li></ul>
<p>Because the India-EU FTA is likely to set a new precedent for future trade agreements between developed and developing countries, and with enormous stakes for patients across the globe, India and the EU need to get it right and ensure no provision runs counter to the interests of millions of citizens.</p>
<p>For further information about the text, contact Malini Aisola (malini.aisola@gmail.com) or Pranesh Prakash (pranesh@cis-india.org)</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/july-2010-ipr-india-eu-fta'>http://editors.cis-india.org/a2k/blogs/july-2010-ipr-india-eu-fta</a>
</p>
No publisher | pranesh | Access to Medicine, Intellectual Property Rights, Intermediary Liability, Access to Knowledge | 2011-09-22T12:34:05Z | Blog Entry