The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 21 to 35.
Roundtable on Intermediary Liability and Gender Based Violence at the Digital Citizen Summit, 2018
http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018
<b>Akriti Bopanna and Ambika Tandon conducted a panel on 'Gender and Intermediary Liability' at the Digital Citizen Summit, hosted by the Digital Empowerment Foundation, on November 1, 2018 at India International Centre, New Delhi.</b>
<p class="moz-quote-pre">Ambika was the moderator for the panel, with Apar Gupta, Jyoti Pandey, Amrita Vasudevan, Anja Kovacs, and Japleen Pasricha as speakers. Click to read the <a class="external-link" href="http://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit">concept note</a> and the <a class="external-link" href="http://cis-india.org/internet-governance/files/dcs-2018-agenda">agenda</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018'>http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018</a>
</p>
No publisher | Admin | Internet Governance, Intermediary Liability | 2018-11-07T02:55:40Z | News Item

A trust deficit between advertisers and publishers is leading to fake news
http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news
<b>Transparency regulations are the need of the hour, urgently for election and political advertising. What do the ads look like? Who paid for them? Who was the target? How many people saw these advertisements? How many times? Transparency around viral content is also required.</b>
<p style="text-align: justify; ">The article was published in <a class="external-link" href="https://www.hindustantimes.com/analysis/a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news/story-SVNH9ot3KD50XRltbwOyEO.html">Hindustan Times</a> on September 24, 2018.</p>
<hr />
<p style="text-align: justify; ">Traditionally, we have depended on the private censorship that intermediaries conduct on their platforms. They enforce, with some degree of success, their own community guidelines and terms of service (TOS). These guidelines and TOS have traditionally been drafted keeping in mind US laws, since historically most intermediaries, including non-profits like the Wikimedia Foundation, were founded in the US.</p>
<p style="text-align: justify; ">Across the world, this private censorship regime was accepted by governments when they enacted intermediary liability laws (in India we have Section 79 of the IT Act). These laws gave intermediaries immunity from liability arising from third-party content of which they have no “actual knowledge”, unless they were informed through takedown notices. Intermediaries set up offices in countries like India, complied with some lawful interception requests, and also conducted geo-blocking to comply with local speech regulation.</p>
<p style="text-align: justify; ">For years, the Indian government has been frustrated since policy reforms that it has pursued with the US have yielded little fruit. American policy makers keep citing shortcomings in the Indian justice system to avoid expediting the MLAT (Mutual Legal Assistance Treaty) process and the signing of an executive agreement under the US CLOUD Act. This agreement would compel intermediaries to comply with lawful interception and data requests from Indian law enforcement agencies no matter where the data was located.</p>
<p style="text-align: justify; ">The data localisation requirement in the draft national data protection law is a result of that frustration. As with the US, a quickly enacted data localisation policy is absolutely non-negotiable when it comes to Indian military, intelligence, law enforcement and e-governance data. For India, it also makes sense in the cases of health and financial data, with exceptions under certain circumstances. However, it does not make sense for social media platforms, since they, by definition, host international networks of people. Recently an inter-ministerial committee recommended that “criminal proceedings against Indian heads of social media giants” also be considered. However, raiding Google’s local servers when a lawful interception request is turned down, or arresting Facebook executives, will result in retaliatory trade actions from the US.</p>
<p style="text-align: justify; ">While the consequences of online recruitment, disinformation in elections and fake news to undermine public order are indeed serious, are there alternatives to such extreme measures for Indian policy makers? Updating intermediary liability law is one place to begin. These social media companies increasingly exercise editorial control, albeit indirectly, via algorithms, even as they claim to have no “actual knowledge”.</p>
<p style="text-align: justify; ">They are no longer mere conduits or dumb pipes; they are now publishers who collect payments to promote content. Germany passed a law called NetzDG in 2017 which requires expedited compliance with government takedown orders. Unfortunately, this law does not have sufficient safeguards to prevent overzealous private censorship. India should not repeat this mistake, especially given what the Supreme Court said in the Shreya Singhal judgment.</p>
<p style="text-align: justify; ">Transparency regulations are imperative, and they are needed urgently for election and political advertising. What do the ads look like? Who paid for them? Who was the target? How many people saw these advertisements? How many times? Transparency around viral content is also required. Anyone should be able to see all public content that has been shared with more than a certain percentage of the population over a historical timeline for any geographic area. This will prevent algorithmic filter bubbles and echo chambers, and also help the public and civil society monitor unconstitutional speech and hate speech that violates the terms of service of these platforms. So far the intermediaries have benefitted from surveillance — watching from above. It is time to subject them to sousveillance — being watched by citizens from below.</p>
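The viral-content transparency proposal above can be restated as a simple query: return every public post whose reach exceeds a given share of the population for a chosen region. The sketch below is purely illustrative; the data, field names and thresholds are all hypothetical, and no real platform API is assumed.

```python
# Illustrative sketch of a viral-content transparency query.
# All records, field names and thresholds here are hypothetical.

def viral_posts(posts, population, threshold=0.01, region=None):
    """Return public posts seen by more than `threshold` of `population`."""
    cutoff = threshold * population
    return [
        p for p in posts
        if p["public"]                                   # only public content
        and p["unique_viewers"] > cutoff                 # reach above threshold
        and (region is None or p["region"] == region)    # optional region filter
    ]

posts = [
    {"id": 1, "public": True,  "unique_viewers": 2_000_000, "region": "IN"},
    {"id": 2, "public": True,  "unique_viewers": 5_000,     "region": "IN"},
    {"id": 3, "public": False, "unique_viewers": 9_000_000, "region": "IN"},
]

# A 0.1% reach threshold against an India-scale population of 1.38 billion
hits = viral_posts(posts, population=1_380_000_000, threshold=0.001)
```

Only post 1 clears the 1,380,000-viewer cutoff while also being public; private content (post 3) stays out of scope, matching the article's focus on publicly shared material.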
<p style="text-align: justify; ">Data portability and interoperability mandates will allow competition to enter these monopoly markets. Artificial intelligence regulations for algorithms that significantly impact the global networked public sphere could require, first, a right to an explanation and, second, a right to influence automated decision-making that shapes the consumer’s experience on the platform.</p>
<p style="text-align: justify; ">The real solution lies elsewhere. Google and Facebook are primarily advertising networks. They have successfully managed to destroy the business model for real news and replace it with a business model for fake news by taking away most of the advertising revenues from traditional and new news media companies. They were able to do this because there was a trust deficit between advertisers and publishers. Perhaps this trust deficit could be solved by a commons-based solution built on free software, open standards and collective action by all Indian news media companies.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news'>http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news</a>
</p>
No publisher | sunil | Internet Governance, Intermediary Liability, Censorship | 2018-10-02T06:44:55Z | Blog Entry

Inter Movements Open Forum: Trafficking Bill
http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill
<b>On 18 May 2018 Gurshabad Grover on behalf of CIS presented comments on the Trafficking (Prevention, Protection and Rehabilitation) Bill 2018 at a meeting of the Inter Movements Open Forum jointly organised by Sangram, Naz Foundation, NNSW, Tarshi and VAMP. The meeting was held at India International Centre in New Delhi.</b>
<p style="text-align: justify;">Gurshabad's presentation was based on Swaraj's <a href="https://cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill">blogpost</a> and subsequent research by Kumarjeet that highlights certain problematic sections (36, 39, 41, 59) in the Bill which may have an adverse impact on freedom of expression, and may additionally change the landscape of intermediary liability rules in India.</p>
<p style="text-align: justify;">Read the <a class="external-link" href="http://cis-india.org/internet-governance/files/the-trafficking-bill">agenda here</a></p>
<p style="text-align: justify;">Clarification (18th August, 2018): A letter sent to the Ministry of Women and Child Development mentioned the Centre for Internet & Society as institutionally endorsing a critique of The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018. We seek to clarify that the Centre for Internet & Society did not endorse the letter to the Ministry.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill'>http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill</a>
</p>
No publisher | Admin | Internet Governance, Intermediary Liability | 2018-08-18T09:21:02Z | News Item

Indian Intermediary Liability Regime: Compliance with the Manila Principles on Intermediary Liability
http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime
<b>This report assesses the compliance of the Indian intermediary liability framework with the Manila Principles on Intermediary Liability, and recommends substantive legislative changes to bring the legal framework in line with the Manila Principles. </b>
<p><span style="text-align: justify; ">The report was edited by Elonnai Hickok and Swaraj Barooah.</span></p>
<hr />
<p style="text-align: justify; ">The report is an examination of Indian law against the Manila Principles, using the background paper to the Principles as the explanatory text on which its recommendations are based; it is not an assessment of the principles themselves. To do this, the report considers the Indian regime in the context of each of the Manila Principles. The explanatory text to the Manila Principles recognizes that diverse national and political scenarios may require different intermediary liability regimes; this report, however, relies only on the best practices prescribed under the Manila Principles.</p>
<p style="text-align: justify; ">The report is divided into the following sections:</p>
<ul>
<li>Principle I: Intermediaries should be shielded by law from liability for third-party content</li>
<li>Principle II: Content must not be required to be restricted without an order by a judicial authority</li>
<li>Principle III: Requests for restrictions of content must be clear, be unambiguous, and follow due process</li>
<li>Principle IV: Laws and content restriction orders and practices must comply with the tests of necessity and proportionality</li>
<li>Principle V: Laws and content restriction policies and practices must respect due process</li>
<li>Principle VI: Transparency and accountability must be built into laws and content restriction policies and practices</li>
<li>Conclusion</li>
</ul>
<p style="text-align: justify; "><a class="external-link" href="http://cis-india.org/internet-governance/files/indian-intermediary-liability-regime">Download the Full report here</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime'>http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime</a>
</p>
No publisher | divij | Internet Governance, Intermediary Liability, Privacy | 2018-05-20T15:14:21Z | Blog Entry

A look at two problematic provisions of the draft Anti-trafficking bill
http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill
<b>This post examines two badly drafted provisions of the new Anti-Trafficking bill that have the potential to severely impinge upon the Freedom of Expression, including through a misunderstanding of intermediary liability. </b>
<p style="text-align: justify;" class="normal">On 28 Feb 2018, the Union Cabinet approved ‘The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018’ (‘the bill’) for introduction in Parliament. This comes after a series of consultations on an earlier 2016 draft bill, which had faced its fair share of <a href="https://scroll.in/article/813268/six-counts-on-which-the-draft-anti-trafficking-bill-fails-short" target="_blank">criticism</a>. As per the Press Information Bureau <a href="http://pib.nic.in/newsite/PrintRelease.aspx?relid=176878" target="_blank">announcement</a>, the Ministry of Women and Child Development met with various stakeholders, including 60 NGOs, and has incorporated many of the suggestions put forth. It has also stated that ‘the new law will make India a leader among South Asian countries to combat trafficking.’</p>
<p style="text-align: justify;" class="normal">However, at first glance, there appear to be several issues with overbroad or vague language in the drafting of the bill that stretch it into potentially problematic areas. This post will focus on two such provisions that could have a deleterious effect on the Freedom of Expression. As the bill is currently not publicly available, a stakeholder’s copy of the draft is being used to source these provisions. The relevant sections have been reproduced below for convenience. (Emphasis in bold is as provided by the author.)</p>
<p style="text-align: justify;" class="normal"><em>Section 39: Buying or Selling of any person</em></p>
<p style="text-align: justify;" class="normal"><em>39. (1) Whoever buys or sells any person for a consideration, shall be punished with rigorous imprisonment for a term which shall not be less than seven years but may extend to ten years, and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal"><em>(2) Whoever solicits or publicises electronically, taking or distributing obscene photographs or videos or providing materials or soliciting or guiding tourists or using agents or any other form <strong>which may lead to the trafficking of a person shall be punished</strong> with rigorous imprisonment for a term which shall not be less than five years but may extend to ten years, and shall also be liable to fine which shall not be less than fifty thousand rupees but which may extend to one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal">The grammatical acrobatics of section 39(2) aside, this anti-solicitation provision is severely problematic in that it mandates punishment even for a vaguely defined action or actions that may not actually be connected to the trafficking of a person. In other words, the provision doesn’t require any of the actions to be connected to trafficking in their intent or even outcome, but only in <em>potential</em> <em>connection</em> to the outcome. At the same time, it says these ‘shall’ be punished!</p>
<p style="text-align: justify;" class="normal">This vagueness, which ignores actual or even probabilistic causation, flies in the face of standard criminal law, which requires <em>mens rea</em> along with <em>actus reus</em>. The excessively wide scope of this badly drafted provision leaves it prone to abuse. For example, the provision currently allows the following interpretation: ‘Whoever publicizes electronically, by providing materials in any form, which may lead to trafficking of a person shall be punished…’. Even the electronic publicizing of an academic study on trafficking could fall under the provision as it currently reads, if it is argued that publishing studies that show the prevalence of trafficking ‘may lead to the trafficking of a person’! It is not hard to imagine that an academic study that shows trafficking numbers at embarrassingly high rates could be threatened with this provision. Similarly, any of our vast number of self-appointed moral guardians could also pull within this provision any artistic work that they may personally find offensive or ‘obscene’. Simply put, without any burden of showing a causal connection, it could be argued that <em>anything</em> ‘may lead’ to the trafficking of a person. Needless to say, this paves the way for a severe chilling effect on free speech, especially on critical speech around trafficking issues.</p>
<p style="text-align: justify;" class="normal"><em>Section 41: Offences related to media</em></p>
<p style="text-align: justify;" class="normal"><em>41. (1) Whoever commits trafficking of a person with the aid of media, including, but not limited to print, internet, digital or electronic media, shall be punished with rigorous imprisonment for a term which shall not be less than seven years but may extend to ten years and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal"><em>(2) Whoever <strong>distributes, or sells or stores</strong>, in any form in any electronic or printed form showing incidence of sexual exploitation, sexual assault, or rape for the purpose of exploitation or for coercion of the victim or his family members, or for unlawful gain <strong>shall be punished</strong> with rigorous imprisonment for a term which shall not be less than three years but may extend to seven years and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal">The drafters of this bill have perhaps overlooked the fact that, unlike the physical world, the infrastructure of the electronic/digital world requires third-party intermediaries to handle information during most forms of electronic activity, whether transmission, storage or display. As it is not feasible, desirable or even practically possible for intermediaries to verify the legality of every bit of data that gets transferred or stored by them, ‘safe harbours’ are provided in law for intermediaries, protecting them from liability for the information being transmitted through them. These ensure that the entities that provide this architecture and act as intermediary platforms are able to operate smoothly and without fear. If intermediaries are not granted this protection, they are put in the unenviable position of having to monitor un-monitorable amounts of data, and of facing legal action for the slip-ups that are bound to happen regularly. Furthermore, there are several levels of free speech and privacy issues associated with having multiple gatekeepers on the expression of speech online. A charitable reading of a provision which does not recognise safe harbours for third-party intermediaries would be that the drafters of the bill have simply not realised that the users who upload and initiate transfer of information online are not the same parties who do the actual transmission of the information.</p>
<p style="text-align: justify;" class="normal">Distribution, selling or storing of information online would require the transmission of information over intermediaries, as well as the temporary storage of such information on intermediary platforms. In India, intermediaries engaging with transmission or temporary storage of information are provided safe harbour [1] by Section 79 of the Information Technology Act, 2000 (‘IT Act’), so long as they:</p>
<p style="text-align: justify;" class="normal">(i) act as a mere ‘conduit’ and do not initiate the transmission, select the receiver of the transmission, or select or modify the information contained in the transmission.</p>
<p style="text-align: justify;" class="normal">(ii) exercise due diligence while discharging duties under this Act, and observe other guidelines that the Central Government may prescribe.</p>
<p style="text-align: justify;" class="normal">The Information Technology (Intermediary Guidelines) Rules, 2011, list out the nature of the due diligence to be followed by intermediaries to claim exemption under Section 79 of the IT Act.</p>
<p style="text-align: justify;" class="normal">Intermediaries will not be granted safe harbour if they have conspired, abetted, aided or induced commission of the unlawful act, or if they do not remove or disable access to information upon receiving actual knowledge, or notice from the Government, of the information that is transmitted or stored by the intermediary being used for unlawful purposes.</p>
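The safe-harbour conditions and disqualifications summarised above amount to a checklist. The sketch below encodes that checklist purely as an illustration of this post's summary, not as a statement of the law itself; all function and parameter names are hypothetical.

```python
# Hypothetical checklist mirroring this post's summary of the Section 79
# safe-harbour conditions; illustrative only, not legal logic.

def safe_harbour_available(
    mere_conduit: bool,          # did not initiate, select receiver, or modify content
    due_diligence: bool,         # observed the Intermediary Guidelines Rules, 2011
    conspired_or_abetted: bool,  # conspired, abetted, aided or induced the unlawful act
    actual_knowledge: bool,      # received court/government notice of unlawful content
    removed_on_notice: bool,     # removed or disabled access upon such notice
) -> bool:
    # Either disqualification defeats the safe harbour outright.
    if conspired_or_abetted:
        return False
    if actual_knowledge and not removed_on_notice:
        return False
    # Otherwise, both positive conditions must hold.
    return mere_conduit and due_diligence

# A diligent conduit that acts on takedown notices keeps the protection:
assert safe_harbour_available(True, True, False, True, True)
```

Restated this way, the conflict with the Anti-trafficking bill is easy to see: Section 41(2) would impose liability without ever consulting such a checklist.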
<p style="text-align: justify;" class="normal">Thus it can be seen that the IT Act already provides an in-depth regime for intermediary liability. Section 79 carries a <em>non-obstante</em> clause stating that it applies “Notwithstanding anything contained in any law for the time being in force”, and Section 81 reiterates the Act’s overriding effect, stating that its provisions ‘shall have effect notwithstanding anything inconsistent therewith contained in any other law for the time being in force’ (barring the exercise of copyright or patent rights). The IT Act is therefore generally considered the appropriate legal framework for this issue. However, it appears that the drafters of the 2018 Anti-trafficking bill have not considered this aspect at all: they have not referenced the IT Act in this context in the bill, and have additionally added their own <em>non-obstante</em> clause in Section 59 of the bill:</p>
<p style="text-align: justify;" class="normal">59.<em> The provisions of this Act, shall be in addition to and not in derogation of the provisions of any other law for the time being in force and, in case of any inconsistency, the provisions of this Act shall have overriding effect on the provisions of any such law to the extent of the inconsistency.</em></p>
<p style="text-align: justify;" class="normal">So the regime prescribed by the IT Act allows for safe harbours, whereas the regime prescribed by the Anti-Trafficking bill does not, and both say that they would have an overriding effect over any conflicting law. This legislative bumble could potentially be solved by using the settled principle that a special Act prevails over a general legislation. This is still a little tricky, as they are technically both special Acts. It could be argued that, given the Anti-trafficking bill’s focus on trafficking and the IT Act’s focus on the interface of law and technology, the IT Act is the special legislation for the purposes of Section 41(2) of the Anti-trafficking bill. Section 79 of the IT Act should thus make redundant the relevant portion of Section 41(2) of the Anti-trafficking bill. This reading would require the bill to be modified so as to remove the redundant and conflicting portion of Section 41(2).</p>
<hr />
<p style="text-align: justify;">[1] In 2016, a division bench of the Delhi High Court held in the case of Myspace Inc vs Super Cassettes Industries Ltd that safe harbour immunity for intermediaries was necessary, as it was not technically feasible to pre-screen content from third parties, and tasking intermediaries with this responsibility could have a chilling effect on free speech. It held that their responsibility was limited to acting upon receiving ‘actual knowledge’. Earlier, in determining what ‘actual knowledge’ refers to, the Supreme Court of India in the landmark 2015 case of Shreya Singhal vs Union of India required this to be in the form of a notice via a court or government order. Thus, under our current law, intermediaries are granted a safe harbour from liability so long as they act upon court or government orders notifying them of content that is required to be taken down.</p>
<p style="text-align: justify;">Clarification (18th August, 2018): A letter sent to the Ministry of Women and Child Development mentioned the Centre for Internet & Society as institutionally endorsing a critique of The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018. We seek to clarify that the Centre for Internet & Society did not endorse the letter to the Ministry.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill'>http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill</a>
</p>
No publisher | swaraj | Freedom of Speech and Expression, Internet Governance, Intermediary Liability | 2018-08-18T09:21:55Z | Blog Entry

Super Cassettes v. MySpace (Redux)
http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace
<b>The latest judgment in the matter of Super Cassettes v. MySpace is a landmark and progressive ruling, which strengthens the safe harbor immunity enjoyed by Internet intermediaries in India. It interprets the provisions of the IT Act, 2000 and the Copyright Act, 1957 to restore safe harbor immunity to intermediaries even in the case of copyright claims. It also relieves MySpace from pre-screening user-uploaded content, endeavouring to strike a balance between free speech and censorship. CIS was one of the intervenors in the case, and has been duly acknowledged in the judgment.</b>
<p>On 23rd December 2016, Justice Ravindra Bhat and Justice Deepa Sharma of the Delhi High Court delivered a decision overturning the 2012 order in the matter of Super Cassettes Industries Limited v. MySpace. The 2012 order was heavily criticized, for it was agnostic to the technological complexities of regulating speech on the Internet and cast unfathomable burdens on MySpace. In the following post I summarise the decision of the Division Bench. Click <a class="external-link" href="http://lobis.nic.in/ddir/dhc/SRB/judgement/24-12-2016/SRB23122016FAOOS5402011.pdf">here</a> to read the judgment.</p>
<h3><strong>Brief Facts</strong></h3>
<p>In 2007, Super Cassettes Industries Limited (SCIL) discovered that users of MySpace, a social networking platform that allowed users, <em>inter alia</em>, to upload and share media files, were sharing SCIL’s copyrighted works sans authorisation. SCIL promptly filed a civil suit against MySpace for primary infringement under section 51(a)(i) of the Copyright Act as well as secondary infringement under section 51(a)(ii).</p>
<p>The 2012 order was extremely worrisome, as it had turned the clock several decades back on concepts of internet intermediary liability. The court had held MySpace liable for copyright infringement despite MySpace having shown that it had no knowledge of specific instances of infringement, that it removed infringing content upon complaint, and that Super Cassettes had failed to submit songs to MySpace's song ID database. The most impractical duty that the court imposed was that MySpace pre-screen content, rather than rely on post-infringement measures to remove infringing content. This was a result of interpreting due diligence to include pre-screening.</p>
<p>The court injuncted MySpace from permitting any uploads of SCIL's copyrighted content, and directed it to execute content removal requests expeditiously. To read CIS' analysis of the Single Judge's interim order, click <a class="external-link" href="http://cis-india.org/a2k/blogs/super-cassettes-v-my-space">here</a>.</p>
<p>In the instant judgment, the bench limited their examination to MySpace’s liability for secondary infringement, and left the direct infringement determination to the Single Judge at the subsequent trial stage. In doing so, the court answered the following three questions:</p>
<h4>1) Whether MySpace could be said to have knowledge of infringement so as to attract liability for
secondary infringement under Section 51(a)(ii)?</h4>
<p>No. According to the Court, in the case of internet intermediaries, section 51(a)(ii) contemplates actual knowledge and not general awareness.</p>
<p>Elaborating on the circumstances of the case, the Court held that to attract liability for secondary infringement, MySpace should have had actual knowledge and not mere awareness of the infringement. Appreciating the difference between the virtual and physical worlds, the judgment stated “<em>the nature of internet media is such that the interpretation of knowledge cannot be the same as that is used for a physical premise.”</em></p>
<p>As per the court, the following facts only amounted to a general awareness, which was not sufficient to establish secondary liability:</p>
<ol><li>Existence of user agreement terms which prohibited users from unauthorised uploading of content;<br />
</li><li>Operation of post-infringement mechanisms instituted by MySpace to identify and remove content;<br />
</li><li>SCIL sharing a voluminous catalogue of 100,000 copyrighted songs with MySpace, expecting the latter to monitor and quell any infringement;<br />
</li><li>Modifying videos to insert ads in them: SCIL contended that MySpace invited users to share and upload content which it would use to insert ads and make revenues – and this amounted to knowledge. The Court found that video modification for ad insertion only changed the format of the video and not the content; further, it was a pure automated process and there was no human intervention.</li></ol>
<p>Additionally, no constructive knowledge could be attributed to MySpace to demonstrate reasonable ground for believing that infringement had occurred. A reasonable belief could emerge only after MySpace had perused all the content uploaded and shared on its platform – a task that was impossible to perform due to the voluminous catalogue
handed to it and existing technological limitations.</p>
<p>The Court imposed a duty on SCIL to specify the works in which it owned copyright <em>and</em> which were being shared without authorisation on MySpace. It held that merely giving the names of all content it owned, without expressly pointing out the infringing works, was contrary to the established principles of copyright law. Further, MySpace contended, and the judge agreed, that in many instances the works were legally shared by distributors and performers, and that users often created remixed works which bore only a resemblance to the title of the copyrighted work.</p>
<p class="callout"><strong><em>In such cases it becomes even more important for a plaintiff such as
MySpace to provide specific titles, because while an intermediary may
remove the content fearing liability and damages, an authorized
individual’s license and right to fair use will suffer or stand negated.
(Para 38 in decision)</em></strong></p>
<p>Thus, whereas MySpace undoubtedly provided a place of profit for the communication of infringing works uploaded by users, it had neither specific knowledge nor a reasonable belief of the infringement.</p>
<h4>2) Does proviso to Section 81 override the "safe harbor" granted to intermediaries under Section 79 of the IT Act, 2000?</h4>
<p>and</p>
<h4>3) Whether it was possible to harmoniously read and interpret Sections 79 and 81 of the IT Act, and Section 51 of the Copyright Act?</h4>
<p>No, the proviso does not override the safe harbor; that is, the safe harbor
defence cannot be denied to the intermediary in the case of copyright
actions. The three sections do indeed have to be read harmoniously.</p>
<p>
The judgment referred to the Parliamentary Standing Committee report as a relevant tool in interpreting the two provisions, declaring that the rights conferred under the IT Act, 2000 are supplementary and not in derogation of the Patents Act or the Copyright Act. The proviso was inserted only to permit copyright owners to demand action
against intermediaries who may themselves post infringing content – the safe harbor only existed for circumstances when content was third party/user generated.</p>
<p class="callout"><strong><em>Given the supplementary nature of the provisions- one where infringement
is defined and traditional copyrights are guaranteed and the other
where digital economy and newer technologies have been kept in mind, the
only logical and harmonious manner to interpret the law would be to read
them together. Not doing so would lead to an undesirable situation
where intermediaries would be held liable irrespective of their due
diligence. (Para 49 in decision)</em></strong></p>
<p>Regarding section 79, the court reiterated that the section only granted a limited immunity to intermediaries by granting a <em>measured privilege to an intermediary</em>, which was in the nature of an affirmative defence and not a blanket immunity to avoid liability. The very purpose of section 79 was to regulate and limit this liability, whereas the Copyright Act granted and controlled the rights of a copyright owner.</p>
<p>The Court found Judge Whyte’s decision in Religious Technology Centre v. Netcom Online Communication Services (1995) to be particularly relevant to the instant case, and agreed with its observations. To recall, <em>Netcom</em> was the landmark US ruling which established that when a subscriber was responsible for direct infringement, and the service provider did nothing more than set up and operate technical systems necessary for the functioning of the Internet, it was illogical to impute liability to the service provider.</p>
<h3><strong>On MySpace Complying with Safe Harbor Requirements under Section 79 of the IT Act, 2000 (and Intermediary Rules, 2011)</strong></h3>
<p>The court held that MySpace's operations were in compliance with section 79(2)(b): the content transmission was initiated at the behest of the users, the recipients were not chosen by MySpace, nor did MySpace modify the content. On the issue of modification, the court reasoned that since modification was an automated process (the insertion of ads by MySpace) which changed only the format and not the content, without MySpace's tacit or express control or knowledge, it was in compliance with the legislative requirement.</p>
<p class="callout"><strong><em>Despite several safeguard tools and notice and take down regimes,
infringed videos find their way. The remedy here is not to target
intermediaries but to ensure that infringing material is removed in an
orderly and reasonable manner. A further balancing act is required which
is that of freedom of speech and privatized censorship. If an
intermediary is tasked with the responsibility of identifying infringing
content from non-infringing one, it could have a chilling effect on
free speech; an unspecified or incomplete list may do that.
(Para 62 in decision)</em></strong></p>
<p>On the second aspect of due diligence, the court held that MySpace complied with the due diligence procedure specified in the Rules – it published rules, regulations, a privacy policy and a user agreement governing access and usage. Reading Rule 3(4) with section 79(2)(c), the court held that due diligence required MySpace to remove content within 36 hours of gaining actual knowledge, or receiving knowledge from another person, of the infringing content. <strong>Only if MySpace failed to take infringing content down accordingly would safe harbour be denied to it.</strong></p>
<p>This liberal interpretation of due diligence is a big win for internet intermediaries in India.</p>
<h3><strong>Additional Issues Considered by the Court</strong></h3>
<p>MySpace also tried to defend its activities by claiming the shield of the fair dealing provisions of the Indian Copyright Act. However, the Court refused, stating that the fair dealing defence was inapplicable to the case, as those provisions protected transient and incidental storage, whereas in the instant circumstances the content in question was stored and hosted permanently.</p>
<p>MySpace also contended that the Single Judge's injunction order was vague and general and had foisted unimplementable duties on MySpace, disregarding the way the Internet functioned. If MySpace had to strictly comply with the order, it would have to shut its business in India. <strong>The Court said that the Single Judge's order, if enforced, would create a system of unwarranted private censorship, running contrary to the principles of a free speech regime, devoid of considerations of peculiarities of the internet intermediary industry. </strong>Private censorship would also invite upon the ISP the legal risk of wrongfully terminating a user account.</p>
<p>Finally, the Court urged MySpace to explore and innovate techniques to protect the interests of traditional copyright holders in a more efficient manner.</p>
<h3><strong>Relief Granted</strong></h3>
<p>Setting aside the Single Judge's order, the Court directed SCIL to provide a specific catalogue of infringing works, including the URLs of the files. Upon receiving such specific knowledge, MySpace has been directed to remove the content within 36 hours of the issued notice. MySpace will also keep an account of the removals, and of the revenues earned from the ads placed, for calculating damages at the trial stage.</p>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace'>http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace</a>
</p>
No publishersinhaIntermediary LiabilityCopyrightCensorshipAccess to Knowledge2017-01-18T14:31:25ZBlog EntryUN Special Rapporteur Report on Freedom of Expression and the Private Sector: A Significant Step Forward
http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward
<b>On 6 June 2016, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, released a report on the Information and Communications Technology (“ICT”) sector and freedom of expression in the digital age. Vidushi Marda and Pranesh Prakash highlight the most important aspects of the report.</b>
<h2 dir="ltr">Background</h2>
<p dir="ltr">Today, the private sector is more closely linked to the freedom of expression than it has ever been before. The ability to speak to a mass audience was at one time a privilege restricted to those who had access to mass media. However, with digital technologies, that privilege is available to far more people than was ever possible in the pre-digital era. As private content created on these digital networks is becoming increasingly subject to state regulation, it is crucial to examine the role of the private sector in respect of the freedom of speech and expression.</p>
<p dir="ltr">The first foray by the Special Rapporteur into this broad area has resulted in a sweeping report that covers almost every aspect of freedom of expression within the ICT sector, except competition, which we will elaborate on later in this post.</p>
<h2 dir="ltr">Introduction</h2>
<p dir="ltr">The report aims to “provide guidance on how private actors should protect and promote freedom of expression in a digital age”. It identifies the relevant international legal framework as Article 19 of the <a href="https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf">International Covenant on Civil and Political Rights</a>, and Article 19 of the <a href="http://www.un.org/en/udhrbook/pdf/udhr_booklet_en_web.pdf">Universal Declaration of Human Rights</a>. The UN “Protect, Respect and Remedy” Framework and Guiding Principles, also known as the <a href="http://business-humanrights.org/sites/default/files/reports-and-materials/Ruggie-report-7-Apr-2008.pdf">Ruggie Principles</a> provide the framework for private sector responsibilities on business and human rights.</p>
<p dir="ltr">The report categorises the different roles of the private sector in organising, accessing, regulating and populating the internet. This is important because the manner in which the ICT sector affects the freedom of expression is far more complicated than in traditional communication industries. The report identifies the distinct impact on the freedom of expression of internet service providers, hardware and software companies, domain name registries and registrars, search engines, platforms, web hosting services, data brokers and e-commerce facilities.</p>
<h2>Legal and Policy Issues</h2>
<div>The Special Rapporteur discusses four distinct legal and policy issues that find relevance in respect of this problem statement: Content Regulation, Surveillance and Digital Security, Transparency and Remedies.</div>
<div> </div>
<h3>Content Regulation</h3>
<p dir="ltr">The report identifies two main channels through which content regulation takes place: the state, and internal processes.</p>
<p>Noting that digital content created on private networks is increasingly subject to State regulation, the report highlights the competing interests of intermediaries who manage platforms and States which demand regulation of this content on grounds of defamation, blasphemy, protection of national security, etc. This tension is demonstrated through vague laws that compel individuals and private corporations to over-comply and err on the side of caution “in order to avoid onerous penalties, filtering content of uncertain legal status and engaging in other modes of censorship and self-censorship.” Excessive intermediary liability forces intermediaries to over-comply with requests in order to ensure that local access to their platforms is not blocked. States also attempt to regulate content outside the law through extra-legal restrictions, pushing private actors to take down content on their own initiative. Filtering is another method, wherein States block and filter content through the private sector; government blacklists, designations of illegal content and account suspensions are among the methods employed, and these have sometimes raised concerns of necessity and proportionality. <a href="http://scroll.in/article/807277/whatsapp-in-kashmir-when-big-brother-wants-to-go-beyond-watching-you">Network or service shutdowns</a> are classified as a “particularly pernicious” method of content regulation. Non-neutral networks are also a method of content regulation, with internet service providers potentially throttling traffic. Zero rating is a potential issue, although the report acknowledges that “it remains a subject of debate whether they may be permissible in areas genuinely lacking Internet access”.</p>
<p>The other node of content regulation has been identified as internal policies and practices of the private sector. <a href="https://consentofthenetworked.com/author/rebeccamackinnon/">Terms of service</a> restrictions are often tailored to the jurisdiction’s laws and policies and don’t always address the needs and interests of vulnerable groups. Further, the report notes, <a href="http://www.catchnews.com/tech-news/facebook-free-basics-gatekeeping-powers-extend-to-manipulating-public-discourse-1452077063.html">design and engineering choices</a> of how private players choose to curate content are algorithmically determined and increasingly control the information that we consume. </p>
<h3>Transparency</h3>
<div> The report notes that transparency enables those entities subject to internet regulation to take informed decisions about their responsibilities and liabilities in the digital sphere, and points out that there is a severe lack of transparency about government requests to restrict or remove content. Some states even prohibit the publication of such information, India being one example. In the private sector, content hosting platforms sometimes reveal the circumstances under which content is removed due to a government request, although this practice is rather erratic. The report recognises the need to balance transparency with competing concerns like security and trade secrecy, a matter of continued debate.</div>
<div> </div>
<h3 dir="ltr">Surveillance and Digital Security</h3>
<p>Freedom of expression concerns arise as data transmitted on private networks is increasingly subjected to surveillance and interference by the State and private actors. The report finds that several internet companies have reported an increase in government requests for customer data and user information. According to the Special Rapporteur, effective resistance strategies include the inclusion of human rights guarantees and the restrictive interpretation of government requests during negotiations. Private players also make surveillance and censorship equipment that enables States to intercept communications. Covert surveillance has been previously reported, with States tapping into communications as and when necessary; when private entities become aware of such interception and covert surveillance, their human rights responsibilities arise. As private entities work towards enhancing encryption, anonymity and user security, states respond by <a href="http://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html">compelling companies</a> to create loopholes for them to circumvent such privacy- and security-enhancing technology.</p>
<h3 dir="ltr">Remedies</h3>
<p>Unlawful content removal, opaque suspensions, and data security breaches are commonplace occurrences in the digital sphere. The ICCPR guarantees that all people whose rights have been violated must have an effective remedy, and similarly the Ruggie Principles require that remedial and grievance mechanisms be provided by corporations. There is some ambiguity about how these complaint or appeal mechanisms should be designed and implemented, and the nature and structure of these mechanisms is also unclear. The report states that it is necessary to investigate the role of the state in supplementing and regulating corporate mechanisms, its role in ensuring that a mechanism for remedies exists, and its responsibility to make sure that more easily and financially accessible alternatives for remedial measures are available.</p>
<h2> Special Rapporteur’s priorities for future work and thematic developments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Investigating laws, policies and extralegal measures that equip governments to impose restrictions on the provision of telecommunications and internet services. Examining the responsibility of companies to respond in a way that respects human rights, mitigates harm, and provides avenues for redress.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Evaluating content restrictions under terms of service and community standards. Private actors face substantial pressure from governments and individuals to restrict expression, and a priority is to evaluate the interplay of private and state actions on freedom of expression in light of human rights obligations and responsibilities.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Focusing on the legitimacy of rationales for intermediary liability for content hosting, restrictions, conditions for removing third party content.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Exploring censorship and surveillance within the human rights framework, and encouraging greater scrutiny before using these technologies for purposes that undermine the freedom of expression.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Identifying ways to balance an increasing scope of freedom of expression with the need to address governmental interests in national security and public order.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet access - Future work will explore issues around access and private sector engagement and investment in ensuring affordability and accessibility, particularly considering marginalized groups.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet governance - Ensuring that internet governance frameworks and reform efforts are sensitive to the needs of women, sexual minorities and other vulnerable communities. Throughout this future work, the Special Rapporteur will pay particular attention to legal developments (legislative, regulatory, and judicial) at national and regional levels.</p>
</li></ol>
<div> </div>
<h2>Conclusions and Recommendations</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">States: The report recommends that states should not pressurise the private sector to interfere with the freedom of speech and expression in a manner that fails to meet the conditions of necessity and proportionality. Any request to take down content or access customer information must be based on validly enacted law, be subject to oversight, and demonstrate necessary and proportionate means of achieving the aims laid down in Article 19(3) of the ICCPR.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Private Actors: The Special Rapporteur recommends that private actors develop and implement transparent human rights assessment procedures, and develop policies keeping in mind their human rights impact. Apart from this, private entities should integrate commitments to the freedom of expression into internal processes and ensure the “greatest possible transparency”.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">International Organisations: The report recommends that organisations make resources and educational material on internet governance publicly accessible. The Special Rapporteur also recommends encouraging meaningful civil society participation in multi-stakeholder policy making and standard setting processes, with an increased focus on sensitivity to human rights.</p>
</li></ol>
<div> </div>
<h2>CIS Comments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS strongly agrees with the expansion of the Special Rapporteur’s scope that this report represents. He is no longer looking solely at states but at the private sector too.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS also notes that competition is an important aspect of the freedom of expression, but has not been discussed in this report. Viable alternatives to platforms, networks, internet service providers etc., will ensure a healthy, competitive marketplace, and will have a positive impact in resolving the issues identified above.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Our <a href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf/view">work</a> has called for maintaining a balanced approach to liability of intermediaries for their users’ actions, since excessive liability or strict liability would lead to over-caution and removal of legitimate speech, while having no liability at all would make it difficult to act effectively against harmful speech, e.g., revenge porn.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr"><a href="http://cis-india.org/internet-governance/blog/cis-position-on-net-neutrality">CIS’ work</a> on network neutrality has highlighted the importance of neutrality for freedom of speech, and has advocated for an evidence-based approach that ensures there is neither under-regulation nor over-regulation. The Special Rapporteur suggests that ‘Zero-Rating’ practices always violate Net Neutrality, but most definitions of Net Neutrality proposed by academics and followed by regulators across the world do not include Zero-Rating. Similarly, he suggests that the main exception for Zero-Rating is for areas genuinely lacking access to the Internet, whereas the potential for some forms of Zero-Rating to further freedom of expression, especially of minorities, even in areas with access to the Internet, provides sufficient reason for the issue to merit greater debate.</p>
</li></ol>
<div> </div>
<div> </div>
<div>(Pranesh Prakash was invited by the Special Rapporteur to provide his views and took part in a meeting that contributed to this report)</div>
<div> </div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward'>http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward</a>
</p>
No publishervidushiFreedom of Speech and ExpressionInternet GovernanceUNHRCDigital MediaIntermediary LiabilityICT2016-06-08T17:27:22ZBlog EntryThe Case of Whatsapp Group Admins
http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins
<b></b>
<p style="text-align: justify; ">Censorship laws in India have now roped in group administrators of chat groups on instant messaging platforms such as Whatsapp (<i>group admin(s)</i>) for allegedly objectionable content that was posted by other users of these chat groups. Several incidents<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn1">[1]</a> were reported this year where group admins were arrested in different parts of the country for allowing content that was allegedly objectionable under law. A few reports mentioned that these arrests were made under Section 153A<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn2">[2]</a> read with Section 34<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn3">[3]</a> of the Indian Penal Code (<i>IPC</i>) and Section 67<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn4">[4]</a> of the Information Technology Act (<i>IT Act</i>).</p>
<p style="text-align: justify; "><span>The targeting of a group admin for content posted by other members of a chat group has raised concerns about how this liability is imputed. Should a group admin be considered an intermediary under Section 2(w) of the IT Act? If yes, would a group admin be protected from such liability?</span></p>
<h3><strong>Group admin as an intermediary</strong></h3>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; ">Whatsapp is an instant messaging platform which can be used for mass communication by opting to create a chat group. A chat group is a feature on Whatsapp that allows joint participation of Whatsapp users. The number of Whatsapp users on a single chat group can be up to 100. Every chat group has one or more group admins who control participation in the group by deleting or adding people.<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn5">[5]</a> It is imperative that we understand whether, by choosing to create a chat group on Whatsapp, a group admin can become liable for content posted by other members of the chat group.</p>
<p style="text-align: justify; "><span>Section 34 of the IPC provides that when a number of persons engage in a criminal act with a common intention, each person is made liable as if he alone did the act. Common intention implies a pre-arranged plan and acting in concert pursuant to the plan. It is interesting to note that group admins have been arrested under Section 153A on the ground that a group admin and a member posting content on a chat group that is actionable under this provision have common intention to post such content on the group. But would this hold true when for instance, a group admin creates a chat group for posting lawful content (say, for matchmaking purposes) and a member of the chat group posts content which is actionable under law (say, posting a video abusing Dalit women)? Common intention can be established by direct evidence or inferred from conduct or surrounding circumstances or from any incriminating facts.</span><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn6">[6]</a></p>
<p style="text-align: justify; "><span>We need to understand whether common intention can be established in case of a user merely acting as a group admin. For this purpose it is necessary to see how a group admin contributes to a chat group and whether he acts as an intermediary.</span></p>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; "><span>We know that the parameters for determining an intermediary differ across jurisdictions, and most global organisations have categorised intermediaries based on their role or technical functions.</span><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn7">[7]</a><span> Section 2(w) of the Information Technology Act, 2000 (</span><i>IT Act</i><span>) defines an intermediary as </span><i>any person, who on behalf of another person, receives, stores or transmits messages or provides any service with respect to that message</i><span> </span><i>and includes the telecom services providers, network providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online marketplaces and cyber cafés</i><span>. Does a group admin receive, store or transmit messages on behalf of group participants, provide any service with respect to messages of group participants, or fall into any category mentioned in the definition? Whatsapp does not allow a group admin to receive or store messages on behalf of another participant in a chat group; every group member independently controls his posts on the group. However, a group admin helps in transmitting the messages of another participant to the group by allowing that participant to be part of the group, thus effectively providing a service in respect of messages. A group admin should therefore be considered an intermediary, although his contribution to the chat group is limited to allowing participation; this is discussed in further detail in the section below.</span></p>
<p style="text-align: justify; "><span>According to a 2010 report of the Organisation for Economic Co-operation and Development (OECD)</span><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn8">[8]</a><span>, an internet intermediary brings together or facilitates transactions between third parties on the Internet. It gives access to, hosts, transmits and indexes content, products and services originated by third parties on the Internet, or provides Internet-based services to third parties. A Whatsapp chat group allows people who are not on your contact list to interact with you if they are on the group admin’s contact list. In facilitating this interaction, according to the OECD definition, a group admin may be considered an intermediary.</span></p>
<h3><strong>Liability as an intermediary</strong></h3>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; ">Section 79(1) of the IT Act protects an intermediary from liability under any law in force (for instance, liability under Section 153A pursuant to the rule laid down in Section 34 of the IPC) if the intermediary fulfils certain conditions laid down therein. An intermediary is required to carry out certain due diligence obligations laid down in Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011 (<i>Rules</i>). These obligations include monitoring content that infringes intellectual property, threatens national security or public order, or is obscene or defamatory or violates any law in force (Rule 3(2)).<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn9">[9]</a> An intermediary is liable for publishing or hosting such user generated content; however, as mentioned earlier, this liability is conditional. Section 79 of the IT Act states that an intermediary would be liable only if it initiates the transmission, selects the receiver of the transmission, or selects or modifies the information contained in the transmission that falls under any category mentioned in Rule 3(2) of the Rules. While a group admin has the ability to facilitate the sharing of information and to select the receivers of such information, he has no direct editorial control over the information shared: group admins can only remove members, but cannot remove or modify the content posted by members of the chat group. An intermediary is also liable if it fails to comply with the due diligence obligations laid down under Rules 3(2) and 3(3); however, since a group admin lacks the authority to initiate transmission himself or to control content, he cannot comply with these obligations. Therefore, a group admin would be protected from any liability arising out of third party/user generated content on his group pursuant to Section 79 of the IT Act.</p>
<p style="text-align: justify; "><span>It is however relevant to note whether the ability of a group admin to remove participants amounts to an indirect form of editorial control.</span></p>
<h3><strong>Other pertinent observations</strong></h3>
<p style="text-align: justify; "><strong><span> </span></strong></p>
<p style="text-align: justify; ">In several reports<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn10">[10]</a> there have been discussions about how holding a group admin liable makes the process convenient, as it is difficult to locate all the users of a particular group. This reasoning may not be correct: the Whatsapp policy<a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn11">[11]</a> makes it mandatory for a prospective user to provide his mobile number in order to use the platform, and no additional information is collected from group admins that would justify targeting them specifically. Investigation agencies can access the mobile numbers of Whatsapp users and obtain further information from telecom companies.</p>
<p style="text-align: justify; "><span>It is also interesting to note that the group admins were arrested after a user, or someone familiar to a user, filed a complaint with the police about content being objectionable or hurtful. Earlier this year, the apex court ruled in the case of </span><i>Shreya Singhal v. Union of India</i><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftn12">[12]</a><span> that an intermediary needed a court order or a government notification for taking down information. With actions taken against group admins on mere complaints filed by anyone, it is clear that law enforcement officials have been overriding the mandate of the court.</span></p>
<h3><strong>Conclusion</strong></h3>
<p> </p>
<p><span style="text-align: justify; ">According to a study conducted by the global research consultancy TNS Global, around 38% of internet users in India use instant messaging applications such as Snapchat and Whatsapp on a daily basis, with Whatsapp being the most widely used application. These figures indicate the scale of impact that arrests of group admins may have on our daily communication.</span></p>
<p style="text-align: justify; "><span>Notably, categorising a group admin as an intermediary would make the Rules applicable to every WhatsApp user who creates a group. This would be difficult to enforce and would blur the distinction between users and intermediaries.</span></p>
<p style="text-align: justify; "><span>The critical question, however, is whether a chat group is part of the bundle of services that WhatsApp offers its users, or an independent platform that makes the group admin a separate entity. Would it be apt to compare a WhatsApp group chat with a conference call on Skype, or with sharing a Google document with edit rights, to understand how far censorship laws are penetrating today?</span></p>
<p style="text-align: justify; "><i>Valuable contribution by Pranesh Prakash and Geetha Hariharan</i></p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref1">[1]</a> <a href="http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951">http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951</a> ; <a href="http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html">http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html</a> ; <a href="http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/">http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref2">[2]</a> Section 153A. “Promoting enmity between different groups on grounds of religion, race, place of birth, residence, language, etc., and doing acts prejudicial to maintenance of harmony.— (1) Whoever— (a) by words, either spoken or written, or by signs or by visible representations or otherwise, promotes or attempts to promote, on grounds of religion, race, place of birth, residence, language, caste or community or any other ground whatsoever, disharmony or feelings of enmity, hatred or ill-will between different religious, racial, language or regional groups or castes or communities…” or 2) Whoever commits an offence specified in sub-section (1) in any place of worship or in any assembly engaged in the performance of religious worship or religious ceremonies, shall be punished with imprisonment which may extend to five years and shall also be liable to fine.</p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref3">[3]</a> Section 34. Acts done by several persons in furtherance of common intention – When a criminal act is done by several persons in furtherance of common intention of all, each of such persons is liable for that act in the same manner as if it were done by him alone.</p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref4">[4]</a> Section 67. Publishing of information which is obscene in electronic form. — Whoever publishes or transmits or causes to be published in the electronic form, any material which is lascivious or appeals to the prurient interest or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it, shall be punished on first conviction with imprisonment of either description for a term which may extend to five years and with fine which may extend to one lakh rupees and in the event of a second or subsequent conviction with imprisonment of either description for a term which may extend to ten years and also with fine which may extend to two lakh rupees.</p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref5">[5]</a> <a href="https://www.whatsapp.com/faq/en/general/21073373">https://www.whatsapp.com/faq/en/general/21073373</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref6">[6]</a> Pandurang v. State of Hyderabad AIR 1955 SC 216</p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref7">[7]</a> <a href="https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf">https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf</a>; <a href="http://unesdoc.unesco.org/images/0023/002311/231162e.pdf">http://unesdoc.unesco.org/images/0023/002311/231162e.pdf</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref8">[8]</a> <a href="http://www.oecd.org/internet/ieconomy/44949023.pdf">http://www.oecd.org/internet/ieconomy/44949023.pdf</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref9">[9]</a> Rule 3(2) (b) of the Rules</p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref10">[10]</a> <a href="http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece">http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece</a>; <a href="http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece">http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece</a>; <a href="http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/">http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/</a>; <a href="http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031">http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref11">[11]</a> <a href="https://www.whatsapp.com/legal/">https://www.whatsapp.com/legal/</a></p>
<p style="text-align: justify; "><a href="file:///C:/Users/HP/Desktop/Whatsapp%20group%20admins.docx#_ftnref12">[12]</a> <a href="http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf">http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf</a></p>
<div>
<div id="ftn12"></div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins'>http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins</a>
</p>
No publisherJapreet GrewalIT ActIntermediary LiabilityCensorship2015-12-08T10:25:42ZBlog EntrySummary Report Internet Governance Forum 2015
http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015
<b>Centre for Internet and Society (CIS), India participated in the Internet Governance Forum (IGF) held at Poeta Ronaldo Cunha Lima Conference Center, Joao Pessoa in Brazil from 10 November 2015 to 13 November 2015. The theme of IGF 2015 was ‘Evolution of Internet Governance: Empowering Sustainable Development’. Sunil Abraham, Pranesh Prakash & Jyoti Panday from CIS actively engaged and made substantive contributions to several key issues affecting internet governance at the IGF 2015. The issue-wise detail of their engagement is set out below. </b>
<p align="center" style="text-align: left;"><strong>INTERNET
GOVERNANCE</strong></p>
<p align="justify">
I. The
Multi-stakeholder Advisory Group to the IGF organised a discussion on
<em><strong>Sustainable
Development Goals (SDGs) and Internet Economy</strong></em><em>
</em>at
the Main Meeting Hall from 9:00 am to 12:30 pm on 11 November, 2015.
The
discussions at this session focused on the importance of policies
and an ecosystem enabling the internet economy for the fulfilment of
different SDGs. Several concerns relating to internet
entrepreneurship, effective ICT capacity building, protection of
intellectual property within and across borders, and availability of
local applications and content were addressed. The panel also
discussed the need to identify SDGs where internet based technologies
could make the most effective contribution. Sunil
Abraham contributed to the panel discussions by addressing the issue
of development and promotion of local content and applications. List
of speakers included:</p>
<ol>
<li>
<p align="justify">
Lenni
Montiel, Assistant-Secretary-General for Development, United Nations</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO LIRNEasia</p>
</li><li>
<p align="justify">
Sergio
Quiroga da Cunha, Head of Latin America, Ericsson</p>
</li><li>
<p align="justify">
Raúl
L. Katz, Adjunct Professor, Division of Finance and Economics,
Columbia Institute of Tele-information</p>
</li><li>
<p align="justify">
Jimson
Olufuye, Chairman, Africa ICT Alliance (AfICTA)</p>
</li><li>
<p align="justify">
Lydia
Brito, Director of the Office in Montevideo, UNESCO</p>
</li><li>
<p align="justify">
H.E.
Rudiantara, Minister of Communication & Information Technology,
Indonesia</p>
</li><li>
<p align="justify">
Daniel
Sepulveda, Deputy Assistant Secretary, U.S. Coordinator for
International and Communications Policy at the U.S. Department of
State </p>
</li><li>
<p align="justify">
Deputy
Minister, Department of Telecommunications and Postal Services,
Republic of South Africa</p>
</li><li>
<p align="justify">
Sunil
Abraham, Executive Director, Centre for Internet and Society, India</p>
</li><li>
<p align="justify">
H.E.
Junaid Ahmed Palak, Information and Communication Technology
Minister of Bangladesh</p>
</li><li>
<p align="justify">
Jari
Arkko, Chairman, IETF</p>
</li><li>
<p align="justify">
Silvia
Rabello, President, Rio Film Trade Association</p>
</li><li>
<p align="justify">
Gary
Fowlie, Head of Member State Relations & Intergovernmental
Organizations, ITU</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room</a></u></p>
<p align="justify">
Video
link Internet
economy and Sustainable Development here
<a href="https://www.youtube.com/watch?v=D6obkLehVE8">https://www.youtube.com/watch?v=D6obkLehVE8</a></p>
<p align="justify"> II.
Public
Knowledge organised a workshop on <em><strong>The
Benefits and Challenges of the Free Flow of Data </strong></em>at
Workshop Room
5 from 11:00 am to 12:00 pm on 12 November, 2015. The discussions in
the workshop focused on the benefits and challenges of the free flow
of data and also the concerns relating to data flow restrictions
including ways to address
them. Sunil
Abraham contributed to the panel discussions by addressing the issue
of jurisdiction of data on the internet. The
panel for the workshop included the following.</p>
<ol>
<li>
<p align="justify">
Vint
Cerf, Google</p>
</li><li>
<p align="justify">
Lawrence
Strickling, U.S. Department of Commerce, NTIA</p>
</li><li>
<p align="justify">
Richard
Leaning, European Cyber Crime Centre (EC3), Europol</p>
</li><li>
<p align="justify">
Marietje
Schaake, European Parliament</p>
</li><li>
<p align="justify">
Nasser
Kettani, Microsoft</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS
India</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5</a></p>
<p align="justify">
Video link <a href="https://www.youtube.com/watch?v=KtjnHkOn7EQ">https://www.youtube.com/watch?v=KtjnHkOn7EQ</a></p>
<p align="justify"> III.
Article
19 and
Privacy International organised a workshop on <em><strong>Encryption
and Anonymity: Rights and Risks</strong></em>
at Workshop Room 1 from 11:00 am to 12:30 pm on 12 November, 2015.
The
workshop fostered a discussion about the latest challenges to
protection of anonymity and encryption and ways in which law
enforcement demands could be met while ensuring that individuals
still enjoyed strong encryption and unfettered access to anonymity
tools. Pranesh
Prakash contributed to the panel discussions by addressing concerns
about existing South Asian regulatory frameworks on encryption and
anonymity, and by emphasizing the need for pervasive encryption. The
panel for this workshop included the following.</p>
<ol>
<li>
<p align="justify">
David
Kaye, UN Special Rapporteur on Freedom of Expression</p>
</li><li>
<p align="justify">
Juan
Diego Castañeda, Fundación Karisma, Colombia</p>
</li><li>
<p align="justify">
Edison
Lanza, Organisation of American States Special Rapporteur</p>
</li><li>
<p align="justify">
Pranesh
Prakash, CIS India</p>
</li><li>
<p align="justify">
Ted
Hardie, Google</p>
</li><li>
<p align="justify">
Elvana
Thaci, Council of Europe</p>
</li><li>
<p align="justify">
Professor
Chris Marsden, Oxford Internet Institute</p>
</li><li>
<p align="justify">
Alexandrine
Pirlot de Corbion, Privacy International</p>
</li></ol>
<p align="justify"><a name="_Hlt435412531"></a>
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1</a></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=hUrBP4PsfJo">https://www.youtube.com/watch?v=hUrBP4PsfJo</a></p>
<p align="justify"> IV.
Chalmers
& Associates organised a session on <em><strong>A
Dialogue on Zero Rating and Network Neutrality</strong></em>
at the Main Meeting Hall from 2:00 pm to 4:00 pm on 12 November,
2015. The Dialogue provided access to expert insight on zero-rating
and a full spectrum of diverse
views on this issue. The Dialogue also explored alternative
approaches to zero rating such as use of community networks. Pranesh
Prakash provided
a
detailed explanation of harms and benefits related to different
approaches to zero-rating. The
panellists for this session were the following.</p>
<ol>
<li>
<p align="justify">
Jochai
Ben-Avie, Senior Global Policy Manager, Mozilla, USA</p>
</li><li>
<p align="justify">
Igor
Vilas Boas de Freitas, Commissioner, ANATEL, Brazil</p>
</li><li>
<p align="justify">
Dušan
Caf, Chairman, Electronic Communications Council, Republic of
Slovenia</p>
</li><li>
<p align="justify">
Silvia
Elaluf-Calderwood, Research Fellow, London School of Economics,
UK/Peru</p>
</li><li>
<p align="justify">
Belinda
Exelby, Director, Institutional Relations, GSMA, UK</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO, LIRNEasia, Sri Lanka</p>
</li><li>
<p align="justify">
Anja
Kovacs, Director, Internet Democracy Project, India</p>
</li><li>
<p align="justify">
Kevin
Martin, VP, Mobile and Global Access Policy, Facebook, USA</p>
</li><li>
<p align="justify">
Pranesh
Prakash, Policy Director, CIS India</p>
</li><li>
<p align="justify">
Steve
Song, Founder, Village Telco, South Africa/Canada</p>
</li><li>
<p align="justify">
Dhanaraj
Thakur, Research Manager, Alliance for Affordable Internet, USA/West
Indies</p>
</li><li>
<p align="justify">
Christopher
Yoo, Professor of Law, Communication, and Computer & Information
Science, University of Pennsylvania, USA</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2</a></p>
<p align="justify"> V.
The
Internet & Jurisdiction Project organised a workshop on
<em><strong>Transnational
Due Process: A Case Study in MS Cooperation</strong></em>
at Workshop Room
4 from 11:00 am to 12:00 pm on 13 November, 2015. The
workshop discussion focused on the challenges in developing an
enforcement framework for the internet that guarantees transnational
due process and legal interoperability. The discussion also focused
on innovative approaches to multi-stakeholder cooperation such as
issue-based networks, inter-sessional work methods and transnational
policy standards. The panellists for this discussion were the
following.</p>
<ol>
<li>
<p align="justify">
Anne
Carblanc Head of Division, Directorate for Science, Technology and
Industry, OECD</p>
</li><li>
<p align="justify">
Eileen
Donahoe Director Global Affairs, Human Rights Watch</p>
</li><li>
<p align="justify">
Byron
Holland President and CEO, CIRA (Canadian ccTLD)</p>
</li><li>
<p align="justify">
Christopher
Painter Coordinator for Cyber Issues, US Department of State</p>
</li><li>
<p align="justify">
Sunil
Abraham Executive Director, CIS India</p>
</li><li>
<p align="justify">
Alice
Munyua Lead dotAfrica Initiative and GAC representative, African
Union Commission</p>
</li><li>
<p align="justify">
Will
Hudson, Senior Advisor for International Policy, Google</p>
</li><li>
<p align="justify">
Dunja
Mijatovic Representative on Freedom of the Media, OSCE</p>
</li><li>
<p align="justify">
Thomas
Fitschen Director for the United Nations, for International
Cooperation against Terrorism and for Cyber Foreign Policy, German
Federal Foreign Office</p>
</li><li>
<p align="justify">
Hartmut
Glaser Executive Secretary, Brazilian Internet Steering Committee</p>
</li><li>
<p align="justify">
Matt
Perault, Head of Policy Development Facebook</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4</a></p>
<p align="justify">
Video
link Transnational
Due Process: A Case Study in MS Cooperation available here <a href="https://www.youtube.com/watch?v=M9jVovhQhd0">https://www.youtube.com/watch?v=M9jVovhQhd0</a></p>
<p align="justify"> VI.
The Internet Governance Project organised a meeting of the
<em><strong>Dynamic
Coalition on Accountability of Internet Governance Venues</strong></em>
at Workshop Room 2 from 14:00
– 15:30 on
12 November, 2015. The coalition
brought together panelists to highlight the
challenges in developing an accountability
framework for internet governance
venues, including setting up standards and developing a set of
concrete criteria. Jyoti Panday provided the perspective of civil
society on why accountability is necessary in internet governance
processes and organizations. The panelists for this workshop included
the following.</p>
<ol>
<li>
<p>
Robin
Gross, IP Justice</p>
</li><li>
<p>
Jeanette
Hofmann, Director
<a href="http://www.internetundgesellschaft.de/">Alexander
von Humboldt Institute for Internet and Society</a></p>
</li><li>
<p>
Farzaneh
Badiei,
Internet Governance Project</p>
</li><li>
<p>
Erika
Mann, Managing Director, Public Policy, Facebook, and Board of
Directors, ICANN</p>
</li><li>
<p>
Paul
Wilson, APNIC</p>
</li><li>
<p>
Izumi
Okutani, Japan
Network Information Center (JPNIC)</p>
</li><li>
<p>
Keith
Drazek, Verisign</p>
</li><li>
<p>
Jyoti
Panday,
CIS</p>
</li><li>
<p>
Jorge
Cancio,
GAC representative</p>
</li></ol>
<p>
Detailed
description of the workshop is available here
<a href="http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video
link <a href="https://www.youtube.com/watch?v=UIxyGhnch7w">https://www.youtube.com/watch?v=UIxyGhnch7w</a></p>
<p> VII.
Digital
Infrastructure
Netherlands Foundation organized an open forum at
Workshop Room 3
from 11:00
– 12:00
on
10
November, 2015. The open
forum discussed the increasing
engagement of governments with “the internet” to protect their
citizens against crime and abuse and to protect economic interests
and critical infrastructures. It
brought together panelists to present
ideas about an agenda for the international protection of ‘the
public core of the internet’ and to collect and discuss ideas for
the formulation of norms and principles and for the identification of
practical steps towards that goal.
Pranesh Prakash participated in the open forum. Other speakers
included</p>
<ol>
<li>
<p>
Bastiaan
Goslings AMS-IX, NL</p>
</li><li>
<p>
Pranesh
Prakash CIS, India</p>
</li><li>
<p>
Marilia
Maciel (FGV, Brazil)</p>
</li><li>
<p>
Dennis
Broeders (NL Scientific Council for Government Policy)</p>
</li></ol>
<p>
Detailed
description of the open
forum is available here
<a href="http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf">http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf</a></p>
<p>
Video
link available here <a href="https://www.youtube.com/watch?v=joPQaMQasDQ">https://www.youtube.com/watch?v=joPQaMQasDQ</a></p>
<p>
VIII.
UNESCO, Council of Europe, Oxford University, Office of the High
Commissioner on Human Rights, Google, Internet Society organised a
workshop on hate speech and youth radicalisation at Room 9 on
Thursday, November 12. UNESCO shared the initial outcome from its
commissioned research on online hate speech including practical
recommendations on combating online hate speech through
understanding the challenges, mobilizing civil society, lobbying
the private sector and intermediaries, and educating individuals in
media and information literacy. The workshop also discussed how to
help empower youth to address online radicalization and extremism,
and realize their aspirations to contribute to a more peaceful and
sustainable world. Sunil Abraham provided his inputs. Other speakers
included</p>
<p>
1.
Chaired by Ms Lidia Brito, Director for UNESCO Office in Montevideo</p>
<p>
2.Frank
La Rue, Former Special Rapporteur on Freedom of Expression</p>
<p>
3.
Lillian Nalwoga, President ISOC Uganda and rep CIPESA, Technical
community</p>
<p>
4.
Bridget O’Loughlin, CoE, IGO</p>
<p>
5.
Gabrielle Guillemin, Article 19</p>
<p>
6.
Iyad Kallas, Radio Souriali</p>
<p>
7.
Sunil Abraham, Executive Director, Centre for Internet and Society,
Bangalore, India</p>
<p>
8.
Eve Salomon, global Chairman of the Regulatory Board of RICS</p>
<p>
9.
Javier Lesaca Esquiroz, University of Navarra</p>
<p>
10.
Representative GNI</p>
<p>
11.
Remote Moderator: Xianhong Hu, UNESCO</p>
<p>
12.
Rapporteur: Guilherme Canela De Souza Godoi, UNESCO</p>
<p>
Detailed
description of the workshop
is available here
<a href="http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video
link to the panel is available here
<a href="https://www.youtube.com/watch?v=eIO1z4EjRG0">https://www.youtube.com/watch?v=eIO1z4EjRG0</a></p>
<p> <strong>INTERMEDIARY
LIABILITY</strong></p>
<p align="justify">
IX.
Electronic
Frontier Foundation, Centre for Internet and Society India, Open Net
Korea and Article 19 collaborated to organize
a workshop on the <em><strong>Manila
Principles on Intermediary Liability</strong></em>
at Workshop Room 9 from 11:00 am to 12:00 pm on 13 November 2015. The
workshop elaborated on the Manila
Principles, a high-level framework of best practices and
safeguards for content restriction and for addressing the liability
of intermediaries for third-party content. The
workshop brought together participants engaged in overlapping
projects on restriction practices to give feedback and highlight
recent developments across liability regimes. Jyoti
Panday laid down the key details of the Manila Principles framework
in this session. The panelists for this workshop included the
following.</p>
<ol>
<li>
<p align="justify">
Kelly
Kim Open Net Korea,</p>
</li><li>
<p align="justify">
Jyoti
Panday, CIS India,</p>
</li><li>
<p align="justify">
Gabrielle
Guillemin, Article 19,</p>
</li><li>
<p align="justify">
Rebecca
MacKinnon, on behalf of UNESCO</p>
</li><li>
<p align="justify">
Giancarlo
Frosio, Center for Internet and Society, Stanford Law School</p>
</li><li>
<p align="justify">
Nicolo
Zingales, Tilburg University</p>
</li><li>
<p align="justify">
Will
Hudson, Google</p>
</li></ol>
<p align="justify">
Detailed description of the workshop is available here <a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript of the workshop is available here <a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9</a></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=kFLmzxXodjs">https://www.youtube.com/watch?v=kFLmzxXodjs</a></p>
<p align="justify"> <strong>ACCESSIBILITY</strong></p>
<p align="justify">
X. The Dynamic Coalition on Accessibility and Disability and the Global Initiative for Inclusive ICTs organised a workshop on <em><strong>Empowering the Next Billion by Improving Accessibility</strong></em> at Workshop Room 6 from 9:00 am to 10:30 am on 13 November 2015. The discussion focused on the need for, and ways of, removing accessibility barriers that prevent over one billion potential users from benefiting from the Internet, including for essential services. Sunil Abraham spoke about the lack of compliance of existing ICT infrastructure with well-established accessibility standards, specifically in relation to accessibility barriers in the disaster management process. He discussed the barriers faced by persons with physical or psychosocial disabilities. The panelists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">Francesca Cesa Bianchi, G3ICT</p>
</li><li>
<p align="justify">Cid Torquato, Government of Brazil</p>
</li><li>
<p align="justify">Carlos Lauria, Microsoft Brazil</p>
</li><li>
<p align="justify">Sunil Abraham, CIS India</p>
</li><li>
<p align="justify">Derrick L. Cogburn, Institute on Disability and Public Policy (IDPP) for the ASEAN (Association of Southeast Asian Nations) Region</p>
</li><li>
<p align="justify">Fernando H. F. Botelho, F123 Consulting</p>
</li><li>
<p align="justify">Gunela Astbrink, GSA InfoComm</p>
</li></ol>
<p align="justify">
Detailed description of the workshop is available here <u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript of the workshop is available here <u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3</a></u></p>
<p align="justify">
Video link to <em>Empowering the Next Billion by Improving Accessibility</em> is available here <a href="https://www.youtube.com/watch?v=7RZlWvJAXxs">https://www.youtube.com/watch?v=7RZlWvJAXxs</a></p>
<p align="justify"> <strong>OPENNESS</strong></p>
<p align="justify">
XI. A workshop on <em><strong>FOSS & a Free, Open Internet: Synergies for Development</strong></em> was organized at Workshop Room 7 from 2:00 pm to 3:30 pm on 13 November 2015. The discussion focused on the increasing risks to the openness of the Internet and to the ability of present and future generations to use technology to improve their lives. The panel shared different perspectives on the future co-development of FOSS and a free, open Internet; the threats that are emerging; and ways for communities to surmount these. Sunil Abraham emphasised the importance of free software, open standards, open access and access to knowledge, noted the absence of this mandate in the draft outcome document for the upcoming WSIS+10 review, and called for its inclusion. Pranesh Prakash further contributed to the discussion by emphasizing the need for free and open source software with end‑to‑end encryption and traffic-level encryption, based on open standards that are decentralized and work through federated networks. The panellists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">Satish Babu, Technical Community, Chair, ISOC-TRV, Kerala, India</p>
</li><li>
<p align="justify">Judy Okite, Civil Society, FOSS Foundation for Africa</p>
</li><li>
<p align="justify">Mishi Choudhary, Private Sector, Software Freedom Law Centre, New York</p>
</li><li>
<p align="justify">Fernando Botelho, Private Sector, heads F123 Systems, Brazil</p>
</li><li>
<p align="justify">Sunil Abraham, CIS India</p>
</li><li>
<p align="justify">Pranesh Prakash, CIS India</p>
</li><li>
<p align="justify">Nnenna Nwakanma, World Wide Web Foundation</p>
</li><li>
<p align="justify">Yves Miezan Ezo, Open Source strategy consultant</p>
</li><li>
<p align="justify">Corinto Meffe, Advisor to the President and Directors, SERPRO, Brazil</p>
</li><li>
<p align="justify">Frank Coelho de Alcantara, Professor, Universidade Positivo, Brazil</p>
</li><li>
<p align="justify">Caroline Burle, Institutional and International Relations, W3C Brazil Office and Center of Studies on Web Technologies</p>
</li></ol>
<p align="justify">
Detailed description of the workshop is available here <u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript of the workshop is available here <u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7" target="_top">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7</a></u></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=lwUq0LTLnDs">https://www.youtube.com/watch?v=lwUq0LTLnDs</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015'>http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015</a>
</p>
No publisherjyotiAccess to KnowledgeBig DataFreedom of Speech and ExpressionEncryptionInternet Governance ForumIntermediary LiabilityAccountabilityInternet GovernanceCensorshipCyber SecurityDigital GovernanceAnonymityCivil SocietyBlocking2015-11-30T10:47:13ZBlog EntryRole of Intermediaries in Countering Online Abuse
http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse
<b>The Internet can be a hostile space and protecting users from abuse without curtailing freedom of expression requires a balancing act on the part of online intermediaries.</b>
<p style="text-align: justify; ">This was published as two blog entries on the NALSAR Law Tech Blog. Part 1 can be accessed <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-i/">here</a> and Part 2 <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-ii/">here</a>.</p>
<hr />
<p style="text-align: justify; ">As platforms and services coalesce around user-generated content (UGC) and entrench themselves in the digital publishing universe, they are increasingly taking on the duties and responsibilities of protecting rights including taking reasonable measures to restrict unlawful speech. Arguments around the role of intermediaries tackling unlawful content usually center around the issue of regulation—when is it feasible to regulate speech and how best should this regulation be enforced?</p>
<p class="Standard" style="text-align: justify; ">Recently, Twitter found itself at the periphery of such questions when an anonymous user of the platform, @LutyensInsider, began posting slanderous and sexually explicit comments about Swati Chaturvedi, a Delhi-based journalist. The online spat, which began in February last year, culminated in<a href="http://www.dailyo.in/politics/twitter-trolls-swati-chaturvedi-lutyensinsider-presstitutes-bazaru-media-delhi-police/story/1/4300.html"> Swati filing an FIR</a> against the anonymous user last week. Within hours of the FIR, the anonymous user deleted the tweets and went silent. Predictably, Twitter users <a href="https://twitter.com/bainjal/status/609343547796426752">hailed this</a> as a much-needed deterrent to online harassment. Swati’s personal victory is worth celebrating; it is an encouragement for the many women bullied daily on the Internet, where harassment is rampant. However, while Swati might be well within her legal rights to counter slander, the rights and liabilities of private companies in such circumstances are often not as clear-cut.</p>
<p class="Standard" style="text-align: justify; ">Should platforms like Twitter take on the mantle of deciding what speech is permissible? When and how should the limits on speech be drawn? Does this amount to private censorship? The answers are not easy and, as the recent Grand Chamber of the European Court of Human Rights (ECtHR) <a href="http://hudoc.echr.coe.int/sites/eng/pages/search.aspx?i=001-126635">judgment in the case of</a> Delfi AS v. Estonia confirms, the role of UGC platforms in balancing user rights is an issue far from settled. In its ruling, the ECtHR reasoned that, because of their role in facilitating expression, requiring online platforms <i>“to take effective measures to limit the dissemination of hate speech and speech inciting violence”</i> was not “private censorship”.</p>
<p class="Standard" style="text-align: justify; ">This is problematic because the decision moves the regime away from a framework that grants immunity from liability as long as platforms meet certain criteria and procedures. In <a href="http://www.jipitec.eu/issues/jipitec-5-3-2014/4091">other words</a>, the ruling establishes strict liability for intermediaries in relation to manifestly illegal content, even where they have no knowledge of it. The 'obligation' placed on the intermediary does not grant them safe harbour and is not proportionate to the monitoring and blocking capacity it necessitates. Consequently, platforms might be incentivized to err on the side of caution and restrict comments or confine speech, resulting in censorship. The ruling is especially worrying as the standard of care placed on the intermediary does not recognize the different roles played by intermediaries in the detection and removal of unlawful content. Further, intermediary liability is its own legal regime and, at the same time, a subset of various legal issues that require an understanding of variations in scenarios, mediums and technology, both globally and in India.</p>
<h3 class="Standard">Law and Short of IT</h3>
<p class="Standard" style="text-align: justify; ">Earlier this year, in a<a href="http://www.theverge.com/2015/2/4/7982099/twitter-ceo-sent-memo-taking-personal-responsibility-for-the"> leaked memo</a>, Twitter CEO Dick Costolo took personal responsibility for his platform's chronic failure to deal with harassment and abuse. In Swati's case, Twitter did not intervene or take steps to address the harassment. If it had to, Twitter (India), like all online intermediaries, would be bound by the provisions established under Section 79 and the accompanying Rules of the Information Technology Act. These provisions outline the obligations and conditions that intermediaries must fulfill to claim immunity from liability for third-party content. Under the regime, upon receiving actual knowledge of unlawful information on their platform, the intermediary must comply with the notice and takedown (NTD) procedure for blocking and removal of content.</p>
<p class="Standard" style="text-align: justify; ">Private complainants could invoke the NTD procedure, forcing intermediaries to act as adjudicators of an unlawful act—a role they are clearly ill-equipped to perform, especially when the content relates to political speech or alleged defamation or obscenity. The SC judgment in Shreya Singhal, addressing this issue, read down the provision (Section 79) by holding that a takedown notice can only be effected if the complainant secures a court order to support her allegation. Further, it was held that the scope of restrictions under the mechanism is limited to the specific categories identified under Article 19(2). Effectively, this means Twitter need not take down content in the absence of a court order.</p>
<h3 class="Standard">Content Policy as Due Diligence</h3>
<p class="Standard" style="text-align: justify; ">Another provision, Rule 3(2), prescribes a content policy which, prior to the Shreya Singhal judgment, was a criterion for administering takedowns. This content policy includes an exhaustive list of types of restricted expression, though worryingly, the terms included in it are not clearly defined and go beyond the reasonable restrictions envisioned under Article 19(2). Terms such as “grossly harmful”, “objectionable”, “harassing”, “disparaging” and “hateful” are not defined anywhere in the Rules, and are subjective and contestable, as alternative interpretations and standards could be offered for the same term. Further, this content policy is not applicable to content created by the intermediary.</p>
<p class="Standard" style="text-align: justify; ">Prior to the SC verdict in Shreya Singhal, <a href="http://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability">actual knowledge could have been interpreted</a> to mean that the intermediary is called upon to exercise its own judgement under sub-rule (4) to restrict impugned content in order to seek exemption from liability. While the liability accruing from non-compliance with takedown requests under the content policy was clear, this is no longer the case. By reading down S. 79(3)(b), the court has placed limits on both the private censorship of intermediaries and the invisible censorship of opaque government takedown requests, which must now adhere to the boundaries set by Article 19(2). Following the SC judgment, intermediaries do not have to administer takedowns without a court order, thereby rendering this content policy redundant. As it stands, the content policy is an obligation that intermediaries must fulfill in order to be exempted from liability for UGC, and this due diligence is limited to publishing rules and regulations, terms and conditions or a user agreement informing users of the restrictions on content. The penalties for not publishing this content policy should be clarified.</p>
<p class="Standard" style="text-align: justify; ">Further, having been informed of what is permissible, users agree to comply with the policy outlined by signing up to and using these platforms and services. The requirement of publishing a content policy as due diligence is unnecessary, given that mandating such ‘standard’ terms of use negates the difference between types of intermediaries, which accrue different kinds of liability. This also places an extraordinary power of censorship in the hands of the intermediary, which could easily stifle freedom of speech online. Such heavy-handed regulation could make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p class="Standard">Twitter may have complied with its duties by publishing the content policy, though the obligation does not seem to be an effective deterrent. Strong safe harbour provisions for intermediaries are a crucial element in the promotion and protection of the right to freedom of expression online. Absolving platforms of responsibility for UGC as long as they publish a content policy that is vague and subjective is the very reason why India’s IT Rules are, in fact, in urgent need of improvement.</p>
<h3 class="Standard">Size Matters</h3>
<p class="Standard" style="text-align: justify; ">The standards for blocking, reporting and responding to abuse vary across different categories of platforms. For example, it may be easier to counter trolls and abuse on blogs or forums where the owner or an administrator is monitoring comments and UGC. Usually, platforms outline monitoring and reporting policies and procedures, including the recourse available to victims and the action to be taken against violators. However, these measures are not always effective in curbing abuse, as it is possible for users to create new accounts under different usernames. For example, in Swati’s case the anonymous user behind the @LutyensInsider account changed <a href="http://www.hindustantimes.com/newdelhi/twitter-troll-lutyensinsider-changes-handle-after-delhi-journo-files-fir/article1-1357281.aspx">their handle</a> to @gregoryzackim and @gzackim before deleting all tweets. In this case, perhaps the fear of impending criminal charges was enough to silence the anonymous user, which may not always be the case.</p>
<h3 class="Standard">Tackling the Trolls</h3>
<p class="Standard" style="text-align: justify; ">Most large intermediaries have privacy settings which restrict the audience for user posts and prevent strangers from contacting them, as a general measure against online harassment. Platforms also publish a <a href="http://www.slate.com/articles/technology/bitwise/2015/04/twitter_s_new_abuse_policy_if_it_can_t_stop_it_hide_it.html">monitoring policy</a> outlining the procedures and mechanisms for users to <a href="http://www.slate.com/articles/technology/users/2015/04/twitter_s_new_harassment_policy_not_transparent_not_engaged_with_users.html">register their complaint</a> or <a href="https://blog.twitter.com/2015/update-on-user-safety-features">report abuse</a>. Often, reporting and blocking mechanisms <a href="https://blog.twitter.com/2015/update-on-user-safety-features">rely on community standards</a> and on users reporting unlawful content. Last week Twitter <a href="https://twittercommunity.com/t/removing-the-140-character-limit-from-direct-messages/41348">announced a new feature</a> allowing lists of blocked users to be shared between users. An improvement on the existing blocking mechanism, the feature is aimed at making the service safer for people facing similar issues; still, such efforts may have their limitations.</p>
<p class="Standard" style="text-align: justify; ">These mechanisms follow a one-size-fits-all policy. First, such community-driven efforts do not address concerns about differences in opinion and subjectivity. Swati, in defending her actions, stressed the <i>“coarse discourse”</i> prevalent on social media, though as <a href="http://www.opindia.com/2015/06/foul-mouthed-twitter-user-files-fir-against-loud-mouthed-slanderer/">this article points out</a>, she might herself be assumed guilty of using offensive and abusive language. Subjectivity and the many interpretations of the same opinion can pave the way for many taking offense online. Earlier this month, Nikhil Wagle’s tweets criticising Prime Minister Narendra Modi as a “pervert” were interpreted as “abusive”, “offensive” and “spreading religious disharmony”. While platforms are within their rights to establish policies for dealing with issues faced by users, there is a real danger of them doing so for <a href="http://www.slate.com/articles/technology/users/2015/05/chuck_c_johnson_suspended_from_twitter_why.2.html">“political reasons” and based on “popularity” measures</a>, which may chill free speech. When many get behind a particular interpretation of an opinion, lawful speech may also be stifled, as Sreemoyee Kundu <a href="http://www.dailyo.in/user/124/sreemoyeekundu">found out</a>. A victim of online abuse, she had her account blocked by Facebook owing to multiple reports from a <i>“faceless fanatical mob”</i>. Allowing users to set standards of permissible speech is an improvement, though it runs the risk of mob justice, and platforms need to be vigilant in applying such standards.</p>
<p class="Standard" style="text-align: justify; ">While it may be in the interest of platforms to keep a hands-off approach to community policies, certain kinds of content may necessitate intervention by the intermediary. There has been an increase in private companies modifying their content policies to place reasonable restrictions on certain hateful behaviour in order to protect vulnerable or marginalised voices. <a href="http://www.theguardian.com/technology/2015/mar/12/twitter-bans-revenge-porn-in-user-policy-sharpening">Twitter</a> and <a href="http://www.redditblog.com/2015/05/promote-ideas-protect-people.html">Reddit's</a> policy changes in addressing revenge porn are reflective of a growing understanding amongst stakeholders that, in order to promote the free expression of ideas, recognition and protection of certain rights on the Internet may be necessary. However, any approach to regulating user content must assess the effect of policy decisions on user rights. Google's <a href="http://www.theguardian.com/technology/2015/jun/22/revenge-porn-women-free-speech-abuse">stand on tackling revenge porn</a> may be laudable, though the <a href="https://www.techdirt.com/articles/20141109/06211929087/googles-efforts-to-push-down-piracy-sites-may-lead-more-people-to-malware.shtml">decision to push down</a> 'piracy' sites in its search results could be seen to adversely impact the choice that users have. Terms of service implemented with subjectivity and a lack of transparency can and do lead to private censorship.</p>
<h3 class="Standard">The Way Forward</h3>
<p class="Standard" style="text-align: justify; ">Harassment is damaging because of the feeling of powerlessness it invokes in victims, and online intermediaries represent new forms of power through which users negotiate and manage their online identity. Content restriction policies and practices must address this power imbalance by adopting baseline safeguards and best practices. It is only fair that, based on principles of equality and justice, intermediaries be held responsible for the damage caused to users by the wrongdoings of other users, or when they fail to carry out their operations and services as prescribed by law. However, in its present state, the intermediary liability regime in India is not sufficient to deal with online harassment and needs to evolve into a more nuanced form of governance.</p>
<p class="Standard" style="text-align: justify; ">Any liability framework must evolve bearing in mind the slippery slope of overbroad regulation and differing standards of community responsibility. Therefore, a balanced framework would need to include elements of both targeted regulation and soft forms of governance as liability regimes need to balance fundamental human rights and the interests of private companies. Often, achieving this balance is problematic given that these companies are expected to be adjudicators and may also be the target of the breach of rights, as is the case in Delfi v Estonia. Global frameworks such as the Manila Principles can be a way forward in developing effective mechanisms. The determination of content restriction practices should always adopt the least restrictive means of doing so, distinguishing between the classes of intermediary. They must evolve considering the proportionality of the harm, the nature of the content and the impact on affected users including the proximity of affected party to content uploader.</p>
<p class="Standard" style="text-align: justify; ">Further, intermediaries and governments should communicate a clear mechanism for the review and appeal of restriction decisions. Content restriction policies should incorporate an effective right to be heard. In exceptional circumstances when this is not possible, a post facto review of the restriction order and its implementation must take place as soon as practicable. Further, unlawful content restricted for a limited duration or within a specific geography must not be restricted beyond these limits, and a periodic review should take place to ensure the validity of the restriction. Regular, systematic review of the rules and guidelines governing intermediary liability will go a long way in ensuring that such frameworks are not overly burdensome and remain effective.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse'>http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse</a>
</p>
No publisherjyotiOnline HarassmentInternet GovernanceIntermediary LiabilityChilling EffectOnline Abuse2015-08-02T16:38:36ZBlog EntryPanel Discussion on Internet Intermediaries, Law and Innovation
http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation
<b>CII, Google and Centre For Communications Governance, NLU Delhi hosted a panel discussion on June 2 in New Delhi. Jyoti Panday attended.</b>
<p style="text-align: justify; ">The Centre for Internet & Society (CIS) participated in the panel discussion on 'Internet Intermediaries, Law and Innovation' hosted by CII, Google and the Centre For Communications Governance, NLU Delhi. The panel discussed the impact of the existing provisions on intermediary liability and innovation, and sought suggestions on the way forward.<br /><br />The panel was moderated by Dr Subho Ray, President, IAMAI.<br /><br />Other panelists included:</p>
<ul style="text-align: justify; ">
<li> Mr Anupam Chander, Eminent Global Lawyer & Academician</li>
<li> Mr Apar Gupta, Advocate</li>
<li> Ms Mishi Choudhary, Founding Director, Software Freedom Law Centre</li>
<li> Mr J Sai Deepak, Associate Partner, Litigation Team, Saikrishna & Associates</li>
<li> Mr Indranil Choudhury, Founder and CEO, Lexplosion</li>
</ul>
<p><a href="http://editors.cis-india.org/internet-governance/blog/internet-intermediaries-law-and-innovation-panel.odp" class="internal-link">Click to download the presentation.</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation'>http://editors.cis-india.org/internet-governance/news/panel-discussion-on-internet-intermediaries-law-and-innovation</a>
</p>
No publisherjyotiInternet GovernanceIntermediary Liability2015-06-14T16:37:56ZNews ItemDeitY says 143 URLs have been Blocked in 2015; Procedure for Blocking Content Remains Opaque and in Urgent Need of Transparency Measures
http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015
<b>Across India on 30 December 2014, following an order issued by the Department of Telecom (DOT), Internet Service Providers (ISPs) blocked 32 websites including Vimeo, Dailymotion, GitHub and Pastebin.</b>
<p style="text-align: justify;">In February 2015, the Centre for Internet and Society (CIS) requested the Department of Electronics and Information Technology (DeitY) under the Right to Information Act, 2005 (RTI Act) to provide information clarifying the procedures for blocking in India. We have received a response from DeitY which may be <a href="http://editors.cis-india.org/internet-governance/blog/response-deity.clarifying-procedures-for-blocking.pdf" class="external-link">seen here</a>.</p>
<p style="text-align: justify;">In this post, I shall elaborate on this response from DeitY and highlight some of the accountability and transparency measures that the procedure needs. To stress the urgency of reform, I shall also touch upon two recent developments—the response from Ministry of Communication to questions raised in Parliament on the blocking procedures and the Supreme Court (SC) judgment in Shreya Singhal v. Union of India.</p>
<h2 style="text-align: justify;">Section 69A and the Blocking Rules</h2>
<p align="JUSTIFY" class="western">Section 69A of the Information Technology Act, 2008 (S69A hereinafter) grants powers to the central government to issue directions for the blocking of access to any information through any computer resource. In other words, it allows the government to block websites on certain grounds. The Government has notified rules laying down the procedure for blocking access online under the Procedure and Safeguards for Blocking for Access of Information by Public Rules, 2009 (Rules, 2009 hereinafter). CIS has produced a poster explaining the blocking procedure (<a href="http://cis-india.org/internet-governance/blog/blocking-websites.pdf/at_download/file">download PDF</a>, 2.037MB).</p>
<p align="JUSTIFY" class="western">There are <em>three key aspects</em> of the blocking rules that need to be kept under consideration:</p>
<h3 align="JUSTIFY" class="western">Officers and committees handling requests</h3>
<p style="text-align: justify;"><strong>Designated Officer (DO)</strong> – Appointed by the Central government; an officer not below the rank of Joint Secretary.<br /><strong>Nodal Officer (NO)</strong> – Appointed by organizations including Ministries or Departments of the State governments and Union Territories and any agency of the Central Government. <br /><strong>Intermediary contact</strong> – Appointed by every intermediary to receive and handle blocking directions from the DO.<br /><strong>Committee for Examination of Request (CER)</strong> – The request, along with a printed sample of the alleged offending information, is examined by the CER, a committee with the DO serving as Chairperson and representatives from the Ministry of Law and Justice, the Ministry of Home Affairs, the Ministry of Information and Broadcasting, and the Indian Computer Emergency Response Team (CERT-In). The CER is responsible for examining each blocking request and makes recommendations, including revoking blocking orders, to the DO, which are taken into consideration for final approval of the request for blocking by the Secretary, DeitY. <br /><strong>Review Committee (RC)</strong> – Constituted under rule 419A of the Indian Telegraph Rules, 1951, the RC includes the Cabinet Secretary, the Secretary to the Government of India (Legal Affairs) and the Secretary (Department of Telecom). The RC is mandated to meet at least once in two months, record its findings, and validate that the directions issued are in compliance with S69A(1).</p>
<h3 style="text-align: justify;">Provisions outlining the procedure for blocking</h3>
<p>Rules 6, 9 and 10 create three distinct blocking procedures, which must commence within 7 days of the DO receiving the request.</p>
<p style="text-align: justify;">a) Rule 6 lays out the first procedure, under which any person may approach the NO to request blocking; alternatively, the NO may raise a blocking request itself. Once the NO of the approached Ministry or Department of a State Government or Union Territory, or of an agency of the Central Government, is satisfied of the validity of the request, it is forwarded to the DO. Requests not sent through the NO of an organization must be approved by the Chief Secretary of the State or Union Territory, or by the Advisor to the Administrator of the Union Territory, before being sent to the DO.</p>
<p style="text-align: justify;">Upon receiving the request, the DO must acknowledge receipt within 24 hours and place the request, along with a printed copy of the alleged information, before the CER for validation. The DO must also make reasonable efforts to identify the person or intermediary hosting the information and, having identified them, issue a notice asking them to appear before the committee and submit their reply and clarifications at a specified date and time, within forty-eight hours of the receipt of the notice.</p>
<p style="text-align: justify;">Foreign entities hosting the information are also informed. The CER gives its recommendations after the intermediary or the person has clarified their position before it (or even in the absence of any representation), and after examining whether the request falls within the scope outlined under S69A(1). The DO then forwards the request and the CER's recommendations to the Secretary, DeitY, who decides whether to issue blocking directions. If approval is granted, the DO directs the relevant intermediary or person to block the alleged information.</p>
<p style="text-align: justify;" class="western">b) Rule 9 outlines a procedure for emergency circumstances: the DO, having established the necessity and expediency of blocking the alleged information, submits recommendations in writing to the Secretary, DeitY. The Secretary, upon being satisfied of the justification, necessity and expediency of blocking the information, may issue blocking directions as an interim measure, and must record the reasons for doing so in writing.</p>
<p style="text-align: justify;" class="western">Under such circumstances, the intermediary and the person hosting the information are not given the opportunity of a hearing. Nevertheless, the DO is required to place the request before the CER within forty-eight hours of the issuing of directions for interim blocking. Only upon receiving the final recommendations of the committee can the Secretary pass a final order approving the request. If the request for blocking is not approved, the interim order passed earlier is revoked, and the intermediary or identified person is directed to unblock the information for public access.</p>
<p style="text-align: justify;" class="western">c) Rule 10 outlines the process when an order is issued by a court in India. Upon receipt of a court order for blocking of information, the DO submits it to the Secretary, DeitY, and initiates action as directed by the court.</p>
<h3 style="text-align: justify;" class="western">Confidentiality clause</h3>
<p style="text-align: justify;">Rule 16 mandates confidentiality regarding all requests and actions taken thereon, which places requests received by the NO and the DO, recommendations made by the DO or the CER, and any written reasons for blocking or revoking blocking requests outside the purview of public scrutiny. More detail on the officers and committees that enforce the blocking rules and procedure can be found <a href="http://cis-india.org/internet-governance/blog/is-india2019s-website-blocking-law-constitutional-2013-i-law-procedure">here</a>.</p>
<h2>Response on blocking from the Ministry of Communication and Information Technology</h2>
<p style="text-align: justify;">The response to our RTI from the E-Security and Cyber Law Group is timely, given the recent clarification from the Ministry of Communication and Information Technology on a number of questions raised by parliamentarian Shri Avinash Pande in the Rajya Sabha. The questions had been raised in reference to emergency blocking orders under the IT Act, the current status of the Central Monitoring System, data privacy law and net neutrality. The Centre for Communication Governance (CCG), National Law University, Delhi has extracted a set of 6 questions; you can read the full article <a href="https://ccgnludelhi.wordpress.com/2015/04/24/governments-response-to-fundamental-questions-regarding-the-internet-in-india/">here</a>.</p>
<p align="JUSTIFY" class="western">The government's response, as quoted by CCG, clarifies that under rule 9 the Government has issued directions for emergency blocking of <em>a total of 216 URLs from 1st January 2014 till date</em>, and that <em>a total of 255 URLs were blocked in 2014 and no URLs have been blocked in 2015 (till 31 March 2015)</em> under S69A through the Committee constituted under the rules. Further, a total of 2091 URLs and 143 URLs were blocked in order to comply with the directions of the competent courts of India in 2014 and 2015 (till 31 March 2015) respectively. The government also clarified that the CER had recommended not to block 19 URLs in the meetings held from 1st January 2014 till date, and that so far two orders have been issued revoking blocks on 251 URLs over the same period. Besides this, CERT-In received requests for blocking of objectionable content from individuals and organisations, which were forwarded to the concerned websites for appropriate action; however, the response did not specify the number of such requests.</p>
<p align="JUSTIFY" class="western">We have prepared a table explaining the information released by the government and to highlight the inconsistency in their response.</p>
<table class="grid listing">
<colgroup> <col width="331"> <col width="90"> <col width="91"> <col width="119"> </colgroup>
<tbody>
<tr>
<td rowspan="2">
<p align="LEFT"><strong>Applicable rule and procedure outlined under the Blocking Rules</strong></p>
</td>
<td colspan="3">
<p align="CENTER"><strong>Number of websites</strong></p>
</td>
</tr>
<tr>
<td>
<p align="CENTER"><em>2014</em></p>
</td>
<td>
<p align="CENTER"><em>2015</em></p>
</td>
<td>
<p align="CENTER"><em>Total</em></p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 6 - Blocking requests from NO and others</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
<td>
<p align="CENTER">None</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 9 - Blocking under emergency circumstances</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">216</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 10 - Blocking orders from Court</p>
</td>
<td>
<p align="CENTER">2091</p>
</td>
<td>
<p align="CENTER">143</p>
</td>
<td>
<p align="CENTER">2234</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Requests from individuals and orgs forwarded to CERT-In</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Recommendations to not block by CER</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">19</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Number of blocking requests revoked</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">251</p>
</td>
</tr>
</tbody>
</table>
<p>In a <a href="http://sflc.in/deity-says-2341-urls-were-blocked-in-2014-refuses-to-reveal-more/">response </a>to an RTI filed by the Software Freedom Law Centre, DeitY said that 708 URLs were blocked in 2012, 1,349 URLs in 2013, and 2,341 URLs in 2014.</p>
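<p style="text-align: justify;">These figures do not reconcile cleanly. A minimal arithmetic sketch makes the gap explicit (the variable names and the year-wise comparison are our own framing; the rule 9 emergency total of 216 URLs spans January 2014 to March 2015 and cannot be apportioned by year):</p>

```python
# Figures from the government's reply to Parliament (see table above)
reported = {
    "rule6_2014": 255,    # committee-route blocks in 2014
    "rule6_2015": 0,      # none till 31 March 2015
    "courts_2014": 2091,  # court-ordered blocks in 2014
    "courts_2015": 143,   # court-ordered blocks till 31 March 2015
    "rule9_total": 216,   # emergency blocks, 1 Jan 2014 onwards (no year split given)
}
deity_to_sflc_2014 = 2341  # DeitY's 2014 figure in its RTI response to SFLC

# Even before counting any rule 9 emergency blocks, the 2014 figures
# given to Parliament already exceed the 2014 total given to SFLC.
known_2014 = reported["rule6_2014"] + reported["courts_2014"]
print(known_2014)                       # 2346
print(known_2014 - deity_to_sflc_2014)  # 5
```

On the Parliament numbers alone, at least 2346 URLs were blocked in 2014, five more than the 2341 reported to SFLC, and the 216 emergency blocks would widen the discrepancy further depending on how many fell in 2014.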
<h2>Shreya Singhal v. Union of India</h2>
<p style="text-align: justify;">In its recent judgment, the SC of India upheld the constitutionality of Section 69A, stating that it is a narrowly drawn provision with adequate safeguards. The constitutional challenge, brought on behalf of the People’s Union for Civil Liberties (PUCL), considered the manner in which blocking is done, and the arguments focused on the secrecy in the blocking process.</p>
<p style="text-align: justify;">The rules may indicate that there is a requirement to identify and contact the originator of information, though as an expert <a href="http://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">has pointed out</a>, there is no evidence of this in practice. The court has stressed the importance of a written order so that writ petitions may be filed under Article 226 of the Constitution. In doing so, the court seems to have assumed that the originator or intermediary is informed, and therefore held the view that any procedural inconsistencies may be challenged through writ petitions. However, this recourse is rendered ineffective not only by procedural constraints but also by the confidentiality clause. The opaqueness created by rule 16 severely reins in the recourse available to the originator and the intermediary. While the court notes that rule 16's confidentiality requirement was argued to be unconstitutional, it does not state its opinion on this question in the judgment. One expert holds the <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">view</a> that the judgment, by implication, requires that requests cannot be confidential. However, such a reading down of rule 16 is yet to be tested.</p>
<p style="text-align: justify;">Further, Sunil Abraham has <a href="http://cis-india.org/internet-governance/blog/economic-and-political-weekly-sunil-abraham-april-11-2015-shreya-singhal-and-66a">pointed</a> out, “block orders are unevenly implemented by ISPs making it impossible for anyone to independently monitor and reach a conclusion whether an internet resource is inaccessible as a result of a S69A block order or due to a network anomaly.” As no comprehensive list of blocked websites, or of the legal orders through which they are blocked, exists, the public has to rely on media reports and RTI requests to understand the censorship regime in India. CIS has previously <a href="http://cis-india.org/internet-governance/blog/analysing-blocked-sites-riots-communalism">analysed</a> leaked block lists and lists received in response to RTI requests, which reveal that block orders are riddled with errors and that entire platforms, not just specific links, have been blocked.</p>
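<p style="text-align: justify;">Independent monitoring of the kind Abraham describes typically involves fetching the same URL through different ISPs and checking the response body against known block-page fingerprints, since a block notice and a network anomaly look different on the wire. A minimal sketch in Python (the marker strings below are illustrative assumptions, not an authoritative list of Indian ISP block-page wordings):</p>

```python
# Heuristic check for ISP block pages. The marker strings are illustrative
# assumptions; real measurement tools maintain per-ISP fingerprint lists and
# also compare DNS answers across resolvers to distinguish blocks from outages.
BLOCK_MARKERS = [
    "blocked under instructions",        # hypothetical block-notice wording
    "department of telecommunications",  # hypothetical attribution line
]

def looks_blocked(body: str) -> bool:
    """Classify a fetched page body as a probable ISP block page."""
    text = body.lower()
    return any(marker in text for marker in BLOCK_MARKERS)

# An ordinary page should not trip the heuristic; a block notice should.
print(looks_blocked("<html><body>Welcome to our site</body></html>"))             # False
print(looks_blocked("This URL has been blocked under instructions of the DoT"))   # True
```

Because ISPs implement orders unevenly, the same probe run from two networks can disagree, which is exactly why a published list of blocked URLs would be needed to attribute inaccessibility to a S69A order with confidence.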
<p style="text-align: justify;">While the state has the power to block content, doing so in secrecy and without judicial scrutiny marks the deficiencies that remain in the procedure outlined under the blocking rules. The Court could have read down rule 16 to permit only a narrow set of exceptions; in not doing so, it has perhaps overlooked an opportunity for reform of the existing system. The blocking of 32 websites is an example of the opaqueness of the system of blocking orders, in which the safeguards assumed by the SC are often not observed: there is no access to the recommendations made by the CER, or to the subsequent revocation of the blocking orders. CIS filed the RTI to try to understand the grounds for blocking and the related procedures, and the response has thrown up some issues that need urgent attention.</p>
<h2>Response to RTI filed by CIS</h2>
<p align="JUSTIFY" class="western">Our first question sought clarification on the websites blocked on 30<sup>th</sup> December 2014. The response received from the E-Security and Cyber Law Group, DeitY reveals that the websites had been blocked because “they were being used to post information related to ISIS using the resources provided by these websites”. The response also clarifies that the directions to block were issued on <em>18-12-2014</em> and that, <em>as of 09-01-2015</em>, after obtaining an undertaking from the website owners stating their compliance with the Government and Indian laws, the sites were unblocked.</p>
<p align="JUSTIFY" class="western">It is not clear whether the ATS, Mumbai had been intercepting communication or whether someone reported these websites. If the ATS was indeed intercepting communication, then as per the rules, the RC should have been informed and its recommendations sought. It is unclear if this was the case, and the response invokes the confidentiality clause under rule 16 to avoid divulging further details. Based on our reading of the rules, court orders should be accessible to the public; without copies of the requests and complaints received, and knowledge of which organization raised them, there can be no appeal or recourse available to the intermediary or even the general public.</p>
<p align="JUSTIFY" class="western">We also asked for a list of all requests for blocking of information received by the DO between January 2013 and January 2015, including copies of all files that had been accepted or rejected. We also specifically asked for a list of requests under rule 9. The response from DeitY stated that from January 1, 2015 to March 31, 2015, directions to block 143 URLs had been issued based on court orders. The response completely overlooks our request for information covering the two-year period. It also does not cover all types of blocking orders under rule 6 and rule 9, nor the requests forwarded to CERT-In, as we have gauged from the ministry's response to Parliament. Contrary to the SC's assumption that the originator of information is contacted, it is also clear from DeitY's response that only the websites had been contacted, and the letter states that the “websites replied only after blocking of objectionable content”.</p>
<p align="JUSTIFY" class="western">Further, seeking clarification on the functioning of the CER, we asked for the current composition of its members and the dates and copies of the minutes of all meetings, including copies of the recommendations made by the committee. The response merely quotes rule 7 as the reference for the composition and does not provide any names or other details. As per the DeitY website, Shri B.J. Srinath, Scientist-G/GC, is the appointed Designated Officer, though this needs confirmation. While we are already aware of the structure of the CER, which representatives and appointed public officers are guiding the examination of requests remains unclear. Presently, there are 3 Joint Secretaries appointed under the Ministry of Law and Justice, 19 under the Home Ministry, and 3 under the Ministry of Information and Broadcasting. Further, it is not clear which grade of scientist would be appointed to the committee from CERT-In, as the rules do not specify this. While the government has clarified in its answer to Parliament that the committee recommended not to block 19 URLs in the meetings held from 1st January 2014 till date, it remains unclear who is taking the decisions to block and to revoke blocked URLs. The response from DeitY specifies that the CER met six times between 2014 and March 2015, but stops short of sharing any further information or copies of files on complaints and on the recommendations of the CER, citing rule 16.</p>
<p align="JUSTIFY" class="western">Finally, answering our question on the composition of the RC, the letter merely cites the provision providing for its composition under rule 419A of the Indian Telegraph Rules, 1951. The response clarifies that so far the RC has met once, on 7th December 2013, under the chairmanship of the Cabinet Secretary, with the Secretary, Department of Legal Affairs and the Secretary, DOT. Our request for minutes of meetings and copies of orders and findings of the RC was denied with the bare statement that “minutes are not available”. Under rule 419A, any direction for the interception of any message or class of messages under sub-section (2) of Section 5 of the Indian Telegraph Act, 1885 issued by the competent authority shall contain reasons for the direction, and a copy of the order shall be forwarded to the concerned RC within seven working days. Given that the RC has met just once since 2013, it is unclear whether the RC is not functioning or whether the interception of messages is being guided through other procedures. Further, we do not yet have details or records of revocation orders or of notices sent to intermediary contacts. This restricts citizens’ right to receive information, and DeitY should work to make these records available to the public.</p>
<p align="JUSTIFY" class="western">Given the response to our RTI, the Ministry's response to Parliament and the SC judgment, we recommend that the following steps be taken by DeitY to ensure a procedure that is just, accountable and follows the rule of law.</p>
<p align="JUSTIFY" class="western">The revocation of rule 16 demands urgent consideration for two reasons:</p>
<ol>
<li>Under Section 22 of the RTI Act, its provisions override all conflicting provisions in any other legislation.</li>
<li style="text-align: justify;">In upholding the constitutionality of S69A, the SC cites the requirement that the reasons behind blocking orders be recorded in writing, so that they may be challenged by means of writ petitions filed under <a href="http://indiankanoon.org/doc/1712542/">Article 226</a> of the Constitution of India.</li></ol>
<p style="text-align: justify;">If the blocking orders, or the meetings of the CER and RC that consider the reasons in those orders, are to remain shrouded in secrecy and unavailable through RTI requests, filing writ petitions challenging these decisions will not be possible, rendering this very important safeguard for the protection of online free speech and expression infructuous. In sum, the need for comprehensive legislative reform of the blocking procedures remains, and the government should act to address the pressing need for transparency and accountability. Not only does opacity curtail the strengths of democracy, it also impedes good governance. We have filed an RTI seeking a comprehensive account of the blocking procedure and the functioning of the committees from 2009 to 2015, and we shall publish any information we receive.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015'>http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015</a>
</p>
No publisherjyotiCensorshipFreedom of Speech and ExpressionRTIIntermediary LiabilityAccountabilityFeatured69AInternet GovernanceChilling EffectTransparencyHomepageBlocking2015-04-30T07:37:40ZBlog EntryThe Supreme Court Judgment in Shreya Singhal and What It Does for Intermediary Liability in India?
http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability
<b>Even as free speech advocates and users celebrate the Supreme Court of India's landmark judgment striking down Section 66A of the Information Technology Act of 2000, news that the Central government has begun work on drafting a new provision to replace the said section of the Act has been trickling in.</b>
<p style="text-align: justify; ">The SC judgment, in upholding the constitutionality of Section 69A (procedure for blocking websites) and in reading down Section 79 (exemption from liability of intermediaries) of the IT Act, raises crucial questions regarding transparency, accountability, and the circumstances under which reasonable restrictions may be placed on free speech on the Internet. While discussions and analysis of S. 66A continue, in this post I will focus on the aspects of the judgment related to intermediary liability that could benefit from further clarification from the apex court, and in doing so will briefly touch upon S. 69A and secret blocking.</p>
<h3 style="text-align: justify; ">Conditions qualifying intermediary for exemption and obligations not related to exemption</h3>
<p align="JUSTIFY">The intermediary liability regime in India is defined under S. 79 and the associated rules, which were introduced to protect intermediaries from liability for user-generated content and to ensure the Internet continues to evolve as a <i>“marketplace of ideas”</i>. But as intermediaries may not have sufficient legal competence or resources to deliberate on the legality of an expression, they may end up erring on the side of caution and taking down lawful expression. As a study by the Centre for Internet and Society (CIS) in 2012 revealed, the criteria, procedure and safeguards for the administration of takedowns as prescribed by the rules lead to a chilling effect on online free expression.</p>
<p align="JUSTIFY">S. 69A grants powers to the Central Government to <i>“issue directions for blocking of public access to any information through any computer resource”</i>. The 2009 rules allow the blocking of websites by a court order, set in place a review committee to review decisions to block websites, and establish penalties for an intermediary that fails to extend cooperation in this respect.</p>
<p align="JUSTIFY">There are two key aspects of both these provisions that must be noted:</p>
<p align="JUSTIFY">a) S. 79 is an exemption provision that qualifies the intermediary for conditional immunity, as long as they fulfil the conditions of the section. The judgement notes this distinction, adding that “<i>being an exemption provision, it is closely related to provisions which provide for offences including S. 69A.”</i></p>
<p align="JUSTIFY">b) S. 69A does not contribute to immunity for the intermediary; rather, it places additional obligations on the intermediary, and as the judgment notes, it is the <i>“intermediary who finally fails to comply with the directions issued who is punishable under sub-section (3) of 69A.”</i> The provision, though outside the conditional-immunity regime enacted through S. 79, contributes to the restriction of access to, or removal of, content online by placing liability on intermediaries to block unlawful third-party content or information being generated, transmitted, received, stored or hosted by them. Therefore restriction requests must fall within the contours outlined in Article 19(2) and include principles of natural justice and elements of due process.</p>
<h3 align="JUSTIFY">Subjective Determination of Knowledge</h3>
<p align="JUSTIFY">The provisions for exemption laid down in S. 79 do not apply when the intermediary receives <i>“actual knowledge” </i>of illegal content under section 79(3)(b). Prior to the court's verdict, actual knowledge could have been interpreted to mean that the intermediary is called upon, under sub-rule (4), to exercise its own judgement in restricting impugned content in order to seek exemption from liability. Removing the need for intermediaries to take on an adjudicatory role and decide which content to restrict or take down, the SC has read down <i>“actual knowledge”</i> to mean that there has to be a court order directing the intermediary to expeditiously remove or disable access to content online. The court also read down <i>“upon obtaining knowledge by itself”</i> and <i>“brought to actual knowledge”</i> under Rule 3(4) in the same manner as 79(3)(b).</p>
<p align="JUSTIFY">Under S. 79(3)(b) the intermediary must comply with orders from the executive in order to qualify for immunity. Further, S. 79(3)(b) goes beyond the specific categories of restriction identified in Article 19(2) by including the term <i>“unlawful acts”</i>, and places the executive in an adjudicatory role of determining the illegality of content. The government cannot emulate private regulation, as it is bound by the Constitution, and the court addresses this issue by applying the limitations of 19(2) to unlawful acts: <i>“the court order and/or the notification by the appropriate government or its agency must strictly conform to the subject matters laid down in Article 19(2).”</i></p>
<p align="JUSTIFY">By reading down S. 79(3)(b), the court has addressed the issue of intermediaries complying with takedown requests from non-government entities, and has required government notifications and court orders to be consistent with the reasonable restrictions in Article 19(2). This is an important clarification from the court, because it places limits on the private censorship of intermediaries and on the invisible censorship of opaque government takedown requests, as both must adhere to the boundaries set by Article 19(2).</p>
<h3>Procedural Safeguards</h3>
<p style="text-align: justify; ">The SC does not touch upon other parts of the rules and, in not doing so, has left significant procedural issues open for debate. It is relevant to bear in mind, as established above, that S. 69A blocking and restriction requirements are part of the intermediary's additional obligations and do not qualify it for immunity. The court ruled in favour of upholding S. 69A as constitutional on the basis that blocking orders are issued only when the executive has sufficiently established that it is absolutely necessary to do so, and that the necessity is relatable only to some of the subjects set out in Article 19(2). Further, the court notes that the reasons for blocking orders must be recorded in writing so that they may be challenged through writ petitions. The court also specifies that under S. 69A the intermediary and the 'originator', if identified, have the right to be heard before the committee decides to issue the blocking order.</p>
<p style="text-align: justify; ">Under S. 79 the intermediary must also comply with government restriction orders, yet the procedure for notice and takedown is not sufficiently transparent and lacks the procedural safeguards included in the blocking procedure under S. 69A. For example, there is no requirement for a committee to evaluate the necessity of issuing the restriction order, though the ruling does clarify that these restriction notices must stay within the confines of Article 19(2). The judgment could have gone further by directing the government to state its entire cause of action and provide a reasonable (prima facie) level of proof. It should also have addressed issues such as the government using extra-judicial measures to restrict content, including collateral pressure to force changes in terms of service or to promote or enforce so-called "voluntary" practices.</p>
<h3>Accountability</h3>
<p style="text-align: justify; ">The judgment could also have delved deeper into issues of accountability, such as the need to observe <i>audi alteram partem</i> by providing the owner of the information or the intermediary a hearing prior to issuing the restriction or blocking order; nor is a post-facto review or appeal mechanism made available, except for the recourse of a writ petition. Procedural uncertainty around wrongly restricted content remains, including what limitations should be placed on the length, duration and geographical scope of a restriction. The court also does not address recourse for the third-party provider of information to have removed information restored or put back. Relatedly, the court does not address the concern of frivolous requests by establishing penalties, nor is there presently a codified recourse under the rules for the intermediary to claim damages even if it can be established that the takedown process is being abused.</p>
<h3>Transparency</h3>
<p style="text-align: justify; ">The bench in para 113, in addressing S. 79, notes that the intermediary, in addition to publishing the rules and regulations, privacy policy and user agreement for access to or usage of its service, has also to inform users of the due diligence requirements, including the content restriction policy under rule 3(2). However, the court ought to have noted the differentiation between categories of intermediaries, which may require different terms of use. Rather than stressing standard terms of use as a procedural safeguard, the court should have insisted on terms of use and content restriction obligations that are proportional to the role of the intermediary and based on the liability accrued in providing the service, including the impact of the intermediary's restrictions on both access and free speech. By placing a requirement of disclosure or transparency on the intermediary, including disclosure of what has been restricted under the intermediary's own terms of service, the judgment could have gone a step further than merely informing users of their rights in using the service, to ensuring that users can review and have knowledge of what information has been restricted and why. The judgment also does not touch upon broader issues of intermediary liability, such as the proactive filtering sought by government and private parties, an important consideration given the recent developments around the right to be forgotten in Europe and around issues of defamation and pornography in India.</p>
<p style="text-align: justify; ">The judgment, while a welcome one in the direction of ensuring the Internet remains a democratic space where free speech thrives, could benefit from the application of the recently launched Manila Principles, developed by CIS and others. The Manila Principles are a framework of baseline safeguards and best practices to be considered by policymakers and intermediaries when developing, adopting and reviewing legislation, policies and practices that govern the liability of intermediaries for third-party content.</p>
<p style="text-align: justify; ">The court's ruling is truly worth celebrating, in terms of the tone it sets for how we think of free speech and the contours of censorship in the digital space. But the real impact of this judgment lies in the debates and discussions it will throw open about content removal practices that involve intermediaries making determinations on requests received, or that respond only to the interests of the party requesting removal. As the Manila Principles highlight, a balance between public and private interests can be obtained through a mechanism where power is distributed between the parties involved, and where an impartial, independent, and accountable oversight mechanism exists.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability'>http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability</a>
</p>
No publisher | jyoti | IT Act, Censorship, Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Chilling Effect | 2015-04-17T23:59:34Z | Blog Entry
No more 66A!
http://editors.cis-india.org/internet-governance/blog/no-more-66a
<b>In a landmark decision, the Supreme Court has struck down Section 66A. Today was a great day for freedom of speech on the Internet! When Section 66A was in operation, if you made a statement that led to offence, you could be prosecuted. We are an offence-friendly nation, judging by media reports in the last year. It was a year of book-bans, website blocking and takedown requests. Facebook’s Transparency Report showed that next to the US, India made the most requests for information about user accounts. A complaint under Section 66A would be a ground for such requests.</b>
<p style="text-align: justify; ">Section 66A hung like a sword over online speech: Shaheen Dhada was arrested in Maharashtra for observing that Bal Thackeray’s funeral shut down the city, Devu Chodankar in Goa and Syed Waqar in Karnataka were arrested for making posts about Narendra Modi, and a Puducherry man was arrested for criticizing P. Chidambaram’s son. The law was so vague and widely worded that it was prone to misuse, and was in fact being misused.</p>
<p style="text-align: justify; ">Today, the Supreme Court struck down Section 66A in its judgment on a <a class="external-link" href="http://cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact">set of petitions</a> heard together last year and earlier this year. The bench comprising Chelameshwar and Nariman, JJ. held that restrictions on free speech are constitutional only insofar as they are in line with Article 19(2) of the Constitution, and that Section 66A does not meet this test: the central protection of free speech is the freedom to make statements that “offend, shock or disturb”, and Section 66A is an unconstitutional curtailment of these freedoms. To cross the threshold of constitutional limitation, the impugned speech must be of such a nature that it incites violence or is an exhortation to violence. Section 66A, being extremely vague and broad, does not meet this threshold. These points are, of course, drawn from news reports of the judgment; the judgment itself is not yet available.</p>
<p style="text-align: justify; ">Reports also say that Section 79(3)(b) has been read down. Previously, any private individual or entity, or the government and its departments, could request intermediaries to take down a website without a court order. If the intermediaries did not comply, they would lose immunity under Section 79. The Supreme Court judgment states that, both under Rule 3(4) of the Intermediaries Guidelines and under Section 79(3)(b), "actual knowledge of the court order or government notification" is necessary before website takedowns can be effected. In effect, this means that intermediaries <i>need not</i> act upon private notices under Section 79, though they may act upon them if they choose. This stops intermediaries from standing judge over what constitutes an unlawful act: if they choose not to take down content after receiving a private notice, they will not lose immunity under Section 79.</p>
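<p style="text-align: justify; ">The read-down rule can be modelled as a toy decision function. This is a purely illustrative sketch, not anything drawn from the judgment or the IT Act: the function name, its parameters, and the notice-source labels are all hypothetical, and it deliberately ignores the many statutory conditions (due diligence, non-involvement in the content, and so on) that also bear on Section 79 immunity.</p>

```python
def loses_immunity(acted: bool, notice_source: str) -> bool:
    """Toy model of Section 79(3)(b) as read down in Shreya Singhal.

    After the judgment, an intermediary risks losing safe-harbour
    immunity only if it fails to act despite "actual knowledge" in the
    form of a court order or government notification. A private notice
    alone never strips immunity, whether or not the intermediary acts.
    """
    actual_knowledge = notice_source in ("court_order", "government_notification")
    return actual_knowledge and not acted


# Ignoring a private notice: immunity is retained.
print(loses_immunity(acted=False, notice_source="private_notice"))   # False
# Ignoring a court order: immunity is at risk.
print(loses_immunity(acted=False, notice_source="court_order"))      # True
# Complying with a court order: immunity is retained.
print(loses_immunity(acted=True, notice_source="court_order"))       # False
```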
<p style="text-align: justify; ">Section 69A, the website blocking procedure, has been left intact by the Court, despite infirmities such as a lack of judicial review and non-transparent operation. More updates when the judgment is made available.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/no-more-66a'>http://editors.cis-india.org/internet-governance/blog/no-more-66a</a>
</p>
No publisher | geetha | Censorship, Freedom of Speech and Expression, Homepage, Intermediary Liability, Featured, Chilling Effect, Section 66A, Article 19(1)(a), Blocking | 2015-03-26T02:01:31Z | Blog Entry
Overview of the Constitutional Challenges to the IT Act
http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact
<b>There are currently ten cases before the Supreme Court challenging various provisions of the Information Technology Act, the rules made under that, and other laws, that are being heard jointly. Advocate Gopal Sankaranarayanan who's arguing Anoop M.K. v. Union of India has put together this chart that helps you track what's being challenged in each case.</b>
<br />
<br />
<br />
<table class="tg" style="table-layout: fixed;">
<tr>
<th class="tg-s6z2">PENDING MATTERS</th>
<th class="tg-s6z2">CASE NUMBER</th>
<th class="tg-0ord">PROVISIONS CHALLENGED</th>
</tr>
<tr>
<td class="tg-4eph">Shreya Singhal v. Union of India</td>
<td class="tg-spn1">W.P.(CRL.) NO. 167/2012</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Common Cause & Anr. v. Union of India</td>
<td class="tg-s6z2">W.P.(C) NO. 21/2013</td>
<td class="tg-0ord">66A, 69A & 80</td>
</tr>
<tr>
<td class="tg-4eph">Rajeev Chandrasekhar v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 23/2013</td>
<td class="tg-zapm">66A & Rules 3(2), 3(3), 3(4) & 3(7) of the Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Dilip Kumar Tulsidas Shah v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(C) NO. 97/2013</td>
<td class="tg-0ord">66A</td>
</tr>
<tr>
<td class="tg-4eph">Peoples Union for Civil Liberties v. Union of India & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 199/2013</td>
<td class="tg-zapm">66A, 69A, Intermediaries Rules 2011 (s.79(2) Rules) & Blocking of Access of Information by Public Rules 2009 (s.69A Rules)</td>
</tr>
<tr>
<td class="tg-031e">Mouthshut.Com (India) Pvt. Ltd. & Anr. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(C) NO. 217/2013</td>
<td class="tg-0ord">66A & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-4eph">Taslima Nasrin v. State of U.P & Ors.</td>
<td class="tg-spn1">W.P.(CRL.) NO. 222/2013</td>
<td class="tg-zapm">66A</td>
</tr>
<tr>
<td class="tg-031e">Manoj Oswal v. Union of India & Anr.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 225/2013</td>
<td class="tg-0ord">66A & 499/500 Indian Penal Code</td>
</tr>
<tr>
<td class="tg-4eph">Internet and Mobile Ass'n of India & Anr. v. Union of India & Anr.</td>
<td class="tg-spn1">W.P.(C) NO. 758/2014</td>
<td class="tg-zapm">79(3) & Intermediaries Rules 2011</td>
</tr>
<tr>
<td class="tg-031e">Anoop M.K. v. Union of India & Ors.</td>
<td class="tg-s6z2">W.P.(CRL.) NO. 196/2014</td>
<td class="tg-0ord">66A, 69A, 80 & S.118(d) of the Kerala Police Act, 2011</td>
</tr>
</table>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact'>http://editors.cis-india.org/internet-governance/blog/overview-constitutional-challenges-on-itact</a>
</p>
No publisher | pranesh | IT Act, Court Case, Freedom of Speech and Expression, Intermediary Liability, Constitutional Law, Censorship, Section 66A, Article 19(1)(a), Blocking | 2014-12-19T09:01:50Z | Blog Entry