The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 11 to 25.
The Ministry And The Trace: Subverting End-To-End Encryption
http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption
<b>A legal and technical analysis of the 'traceability' rule and its impact on messaging privacy.</b>
<p>The paper was published in the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">NUJS Law Review Volume 14 Issue 2 (2021)</a>.</p>
<hr />
<h2>Abstract</h2>
<div class="justify">
<div class="pbs-main-wrapper">
<p>End-to-end
encrypted messaging allows individuals to hold confidential
conversations free from the interference of states and private
corporations. To aid surveillance and prosecution of crimes, the Indian
Government has mandated online messaging providers to enable
identification of originators of messages that traverse their platforms.
This paper establishes how the different ways in which this
‘traceability’ mandate can be implemented (dropping end-to-end
encryption, hashing messages, and attaching originator information to
messages) come with serious costs to usability, security and privacy.
Through a legal and constitutional analysis, we contend that
traceability exceeds the scope of delegated legislation under the
Information Technology Act, and is at odds with the fundamental right to
privacy.</p>
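<p>As a concrete illustration of why the 'hashing messages' route described in the abstract is brittle, the following sketch (hypothetical names, not any provider's actual implementation) records the first sender of each byte-identical message by its SHA-256 hash. Any trivial edit to a forwarded message yields a different hash and defeats the lookup entirely:</p>

```python
import hashlib

# Hypothetical sketch of the 'hash a message to trace its originator'
# proposal discussed in the paper. All names are illustrative.
def message_hash(plaintext: bytes) -> str:
    return hashlib.sha256(plaintext).hexdigest()

class TraceRegistry:
    """Maps a message hash to the first user who sent that exact message."""
    def __init__(self):
        self._first_sender = {}

    def record(self, sender: str, plaintext: bytes) -> None:
        # Only the first sender of a byte-identical message is retained.
        self._first_sender.setdefault(message_hash(plaintext), sender)

    def originator(self, plaintext: bytes):
        return self._first_sender.get(message_hash(plaintext))

reg = TraceRegistry()
reg.record("alice", b"forwarded rumour")
reg.record("bob", b"forwarded rumour")       # bob forwards the same text
print(reg.originator(b"forwarded rumour"))   # -> alice
print(reg.originator(b"forwarded rumour!"))  # a one-character edit -> None
```

<p>The sketch also makes the privacy cost visible: the registry must retain a record tied to every message ever sent, which is exactly the kind of trace that end-to-end encryption is designed to avoid.</p>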
<p>Click here to read the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">full paper</a>.</p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption'>http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption</a>
</p>
No publisher | Gurshabad Grover, Tanaya Rajwade and Divyank Katira | Cryptography, Intermediary Liability, Constitutional Law, Internet Governance, Messaging, Encryption Policy | 2021-07-12T08:18:18Z | Blog Entry

The Case of Whatsapp Group Admins
http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins
<p style="text-align: justify; ">Censorship laws in India have now roped in group administrators of chat groups on instant messaging platforms such as Whatsapp (<i>group admin(s)</i>) for allegedly objectionable content that was posted by other users of these chat groups. Several incidents<a href="#_ftn1">[1]</a> were reported this year where group admins were arrested in different parts of the country for allowing content that was allegedly objectionable under law. A few reports mentioned that these arrests were made under Section 153A<a href="#_ftn2">[2]</a> read with Section 34<a href="#_ftn3">[3]</a> of the Indian Penal Code (<i>IPC</i>) and Section 67<a href="#_ftn4">[4]</a> of the Information Technology Act (<i>IT Act</i>).</p>
<p style="text-align: justify; "><span>Targeting a group admin for content posted by other members of a chat group has raised concerns about how this liability is imputed. Should a group admin be considered an intermediary under Section 2(w) of the IT Act? If so, would a group admin be protected from such liability?</span></p>
<h3><strong>Group admin as an intermediary</strong></h3>
<p style="text-align: justify; ">Whatsapp is an instant messaging platform which can be used for mass communication by creating a chat group. A chat group is a feature on Whatsapp that allows joint participation of Whatsapp users; a single chat group can have up to 100 users. Every chat group has one or more group admins who control participation in the group by adding or removing people.<a href="#_ftn5">[5]</a> We must therefore ask whether, by choosing to create a chat group on Whatsapp, a group admin can become liable for content posted by other members of the chat group.</p>
<p style="text-align: justify; "><span>Section 34 of the IPC provides that when a number of persons engage in a criminal act with a common intention, each person is made liable as if he alone did the act. Common intention implies a pre-arranged plan and acting in concert pursuant to the plan. It is interesting to note that group admins have been arrested under Section 153A on the ground that a group admin and a member posting content on a chat group that is actionable under this provision have a common intention to post such content on the group. But would this hold true when, for instance, a group admin creates a chat group for posting lawful content (say, for matchmaking purposes) and a member of the chat group posts content which is actionable under law (say, a video abusing Dalit women)? Common intention can be established by direct evidence or inferred from conduct, surrounding circumstances or any incriminating facts.</span><a href="#_ftn6">[6]</a></p>
<p style="text-align: justify; "><span>We need to understand whether common intention can be established in case of a user merely acting as a group admin. For this purpose it is necessary to see how a group admin contributes to a chat group and whether he acts as an intermediary.</span></p>
<p style="text-align: justify; "><span>We know that parameters for determining an intermediary differ across jurisdictions, and most global organisations have categorised intermediaries based on their role or technical functions.</span><a href="#_ftn7">[7]</a><span> Section 2(w) of the Information Technology Act, 2000 (</span><i>IT Act</i><span>) defines an intermediary as </span><i>any person, who on behalf of another person, receives, stores or transmits messages or provides any service with respect to that message</i><span> </span><i>and includes the telecom services providers, network providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online marketplaces and cyber cafés</i><span>. Does a group admin receive, store or transmit messages on behalf of group participants, provide any service with respect to messages of group participants, or fall within any category mentioned in the definition? Whatsapp does not allow a group admin to receive or store messages on behalf of another participant on a chat group; every group member independently controls his posts on the group. However, a group admin helps in transmitting messages of another participant to the group by allowing the participant to be a part of the group, thus effectively providing a service in respect of messages. A group admin should therefore be considered an intermediary. His contribution to the chat group, however, is limited to allowing participation; this is discussed in further detail in the section below.</span></p>
<p style="text-align: justify; "><span>According to a 2010 report of the Organisation for Economic Co-operation and Development (OECD)</span><a href="#_ftn8">[8]</a><span>, an internet intermediary brings together or facilitates transactions between third parties on the Internet. It gives access to, hosts, transmits and indexes content, products and services originated by third parties on the Internet, or provides Internet-based services to third parties. A Whatsapp chat group allows people who are not on your contact list to interact with you if they are on the group admin’s contact list. In facilitating this interaction, according to the OECD definition, a group admin may be considered an intermediary.</span></p>
<h3><strong>Liability as an intermediary</strong></h3>
<p style="text-align: justify; ">Section 79(1) of the IT Act protects an intermediary from liability under any law in force (for instance, liability under Section 153A pursuant to the rule laid down in Section 34 of the IPC) if the intermediary fulfils certain conditions laid down therein. An intermediary is required to carry out certain due diligence obligations laid down in Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011 (<i>Rules</i>). These obligations relate to content that infringes intellectual property, threatens national security or public order, or is obscene, defamatory or otherwise in violation of any law in force (Rule 3(2)).<a href="#_ftn9">[9]</a> An intermediary is liable for publishing or hosting such user-generated content, but, as mentioned earlier, this liability is conditional. Under Section 79 of the IT Act, an intermediary loses this protection if it initiates the transmission, selects the receiver of the transmission, or selects or modifies the information contained in the transmission. While a group admin has the ability to facilitate the sharing of information and select the receivers of such information, he has no direct editorial control over the information shared: group admins can only remove members, and cannot remove or modify the content posted by members of the chat group. An intermediary is also liable if it fails to comply with the due diligence obligations laid down under Rules 3(2) and 3(3); however, since a group admin can neither initiate transmission himself nor control content, these obligations cannot meaningfully apply to him. A group admin would therefore be protected from liability arising out of third-party or user-generated content on his group pursuant to Section 79 of the IT Act.</p>
<p style="text-align: justify; "><span>It is however relevant to note whether the ability of a group admin to remove participants amounts to an indirect form of editorial control.</span></p>
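<p>The conditional structure of the Section 79 safe harbour described above can be sketched as a simple predicate. This is an illustrative simplification only; the field names are hypothetical, and the statute's text, not this sketch, is authoritative:</p>

```python
from dataclasses import dataclass

# Hedged sketch of the Section 79 safe-harbour test for third-party
# content. Field names are illustrative, not statutory language.
@dataclass
class IntermediaryConduct:
    initiated_transmission: bool
    selected_receiver: bool
    modified_information: bool
    observed_due_diligence: bool       # Rule 3 obligations
    removed_on_actual_knowledge: bool  # expeditious takedown on notice

def safe_harbour_available(c: IntermediaryConduct) -> bool:
    # Section 79(2)(b): the intermediary's role must be purely passive.
    passive = not (c.initiated_transmission or c.selected_receiver
                   or c.modified_information)
    # Section 79(2)(c) and 79(3): due diligence and takedown on knowledge.
    return passive and c.observed_due_diligence and c.removed_on_actual_knowledge

# A group admin, on the article's characterisation of his role:
group_admin = IntermediaryConduct(
    initiated_transmission=False,   # members post on their own
    selected_receiver=False,        # messages go to the whole group
    modified_information=False,     # admins cannot edit members' posts
    observed_due_diligence=True,
    removed_on_actual_knowledge=True,
)
print(safe_harbour_available(group_admin))  # -> True
```

<p>The predicate makes the article's point mechanical: a party that cannot select receivers or modify content can only lose the safe harbour through the due diligence and takedown limbs, which presuppose a degree of control a group admin does not have.</p>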
<h3><strong>Other pertinent observations</strong></h3>
<p style="text-align: justify; ">Several reports<a href="#_ftn10">[10]</a> have discussed how holding a group admin liable makes the process convenient, since it is difficult to locate all the users of a particular group. This reasoning may not be correct: the Whatsapp policy<a href="#_ftn11">[11]</a> makes it mandatory for a prospective user to provide his mobile number in order to use the platform, and no additional information is collected from group admins that would justify targeting them. Investigation agencies can access the mobile numbers of Whatsapp users and obtain further information from telecom companies.</p>
<p style="text-align: justify; "><span>It is also interesting to note that the group admins were arrested after a user, or someone known to a user, filed a complaint with the police about content being objectionable or hurtful. Earlier this year, the apex court ruled in </span><i>Shreya Singhal v. Union of India</i><a href="#_ftn12">[12]</a><span> that an intermediary needs a court order or a government notification before taking down information. With actions taken against group admins on mere complaints filed by anyone, it is clear that law enforcement officials have been overriding the mandate of the court.</span></p>
<h3><strong>Conclusion</strong></h3>
<p><span style="text-align: justify; ">According to a study conducted by a global research consultancy, TNS Global, around 38% of internet users in India use instant messaging applications such as Snapchat and Whatsapp on a daily basis, with Whatsapp being the most widely used application. These figures indicate the scale of impact that arrests of group admins may have on our daily communication.</span></p>
<p style="text-align: justify; "><span>It is noteworthy that categorising a group admin as an intermediary would effectively make the Rules applicable to every Whatsapp user who creates a group, would make the Rules difficult to enforce, and would blur the distinction between users and intermediaries.</span></p>
<p style="text-align: justify; "><span>The critical question, however, is whether a chat group should be seen as part of the bundle of services that Whatsapp offers to its users, rather than as an independent platform that makes a group admin a separate entity. Would it also be apt to compare a Whatsapp group chat with a conference call on Skype, or with sharing a Google document with edit rights, to understand how far censorship laws now reach?</span></p>
<p style="text-align: justify; "><i>Valuable contribution by Pranesh Prakash and Geetha Hariharan</i></p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a id="_ftn1">[1]</a> <a href="http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951">http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951</a>; <a href="http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html">http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html</a>; <a href="http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/">http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/</a></p>
<p style="text-align: justify; "><a id="_ftn2">[2]</a> Section 153A. “Promoting enmity between different groups on grounds of religion, race, place of birth, residence, language, etc., and doing acts prejudicial to maintenance of harmony.— (1) Whoever— (a) by words, either spoken or written, or by signs or by visible representations or otherwise, promotes or attempts to promote, on grounds of religion, race, place of birth, residence, language, caste or community or any other ground whatsoever, disharmony or feelings of enmity, hatred or ill-will between different religious, racial, language or regional groups or castes or communities… (2) Whoever commits an offence specified in sub-section (1) in any place of worship or in any assembly engaged in the performance of religious worship or religious ceremonies, shall be punished with imprisonment which may extend to five years and shall also be liable to fine.”</p>
<p style="text-align: justify; "><a id="_ftn3">[3]</a> Section 34. Acts done by several persons in furtherance of common intention.— When a criminal act is done by several persons in furtherance of the common intention of all, each of such persons is liable for that act in the same manner as if it were done by him alone.</p>
<p style="text-align: justify; "><a id="_ftn4">[4]</a> Section 67. Publishing of information which is obscene in electronic form.— Whoever publishes or transmits or causes to be published in the electronic form, any material which is lascivious or appeals to the prurient interest or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it, shall be punished on first conviction with imprisonment of either description for a term which may extend to five years and with fine which may extend to one lakh rupees and in the event of a second or subsequent conviction with imprisonment of either description for a term which may extend to ten years and also with fine which may extend to two lakh rupees.</p>
<p style="text-align: justify; "><a id="_ftn5">[5]</a> <a href="https://www.whatsapp.com/faq/en/general/21073373">https://www.whatsapp.com/faq/en/general/21073373</a></p>
<p style="text-align: justify; "><a id="_ftn6">[6]</a> Pandurang v. State of Hyderabad, AIR 1955 SC 216</p>
<p style="text-align: justify; "><a id="_ftn7">[7]</a> <a href="https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf">https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf</a>; <a href="http://unesdoc.unesco.org/images/0023/002311/231162e.pdf">http://unesdoc.unesco.org/images/0023/002311/231162e.pdf</a></p>
<p style="text-align: justify; "><a id="_ftn8">[8]</a> <a href="http://www.oecd.org/internet/ieconomy/44949023.pdf">http://www.oecd.org/internet/ieconomy/44949023.pdf</a></p>
<p style="text-align: justify; "><a id="_ftn9">[9]</a> Rule 3(2)(b) of the Rules</p>
<p style="text-align: justify; "><a id="_ftn10">[10]</a> <a href="http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece">http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece</a>; <a href="http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece">http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece</a>; <a href="http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/">http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/</a>; <a href="http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031">http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031</a></p>
<p style="text-align: justify; "><a id="_ftn11">[11]</a> <a href="https://www.whatsapp.com/legal/">https://www.whatsapp.com/legal/</a></p>
<p style="text-align: justify; "><a id="_ftn12">[12]</a> <a href="http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf">http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins'>http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins</a>
</p>
No publisher | Japreet Grewal | IT Act, Intermediary Liability, Censorship | 2015-12-08T10:25:42Z | Blog Entry

Super Cassettes v. MySpace (Redux)
http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace
<b>The latest judgment in the matter of Super Cassettes v. MySpace is a landmark and progressive ruling, which strengthens the safe harbor immunity enjoyed by Internet intermediaries in India. It interprets the provisions of the IT Act, 2000 and the Copyright Act, 1957 to restore safe harbor immunity to intermediaries even in the case of copyright claims. It also relieves MySpace from pre-screening user-uploaded content, endeavouring to strike a balance between free speech and censorship. CIS was one of the intervenors in the case, and has been duly acknowledged in the judgment.</b>
<p>On 23rd December 2016, Justice Ravindra Bhat and Justice Deepa Sharma of the Delhi High Court delivered a decision overturning the 2012 order in the matter of Super Cassettes Industries Limited v. MySpace. The 2012 order was heavily criticized, for it was agnostic to the technological complexities of regulating speech on the Internet and cast unfathomable burdens on MySpace. In the following post I summarise the decision of the Division Bench. Click <a class="external-link" href="http://lobis.nic.in/ddir/dhc/SRB/judgement/24-12-2016/SRB23122016FAOOS5402011.pdf">here</a> to read the judgment.</p>
<h3><strong>Brief Facts</strong></h3>
<p>In 2007, Super Cassettes Industries Limited (SCIL) filed a suit against MySpace, a social networking platform, alleging copyright infringement against MySpace. The platform allowed users to upload and share media files,
<em>inter alia</em>, and it was discovered that users were sharing SCIL’s copyrighted works sans authorisation. SCIL promptly proceeded to file a civil suit against MySpace for primary infringement under section 51(a)(i)
of the Copyright Act as well as secondary infringement under section 51(a)(ii).</p>
<p>The 2012 order was extremely worrisome, as it turned the clock several decades back on concepts of internet intermediary liability. The court had held MySpace liable for copyright infringement despite MySpace having shown that it had no knowledge of specific instances of infringement, that it removed infringing content upon complaint, and that Super Cassettes had failed to submit its songs to MySpace's song ID database. The most impractical duty the court pronounced was that MySpace pre-screen content, rather than rely on post-infringement measures to remove infringing content. This was a result of interpreting due diligence to include pre-screening.</p>
<p>The court injuncted MySpace from permitting any uploads of SCIL's copyrighted content, and directed it to execute content removal requests expeditiously. To read CIS' analysis of the Single Judge's interim order, click <a class="external-link" href="http://cis-india.org/a2k/blogs/super-cassettes-v-my-space">here</a>.</p>
<p>In the instant judgment, the bench limited their examination to MySpace’s liability for secondary infringement, and left the direct infringement determination to the Single Judge at the subsequent trial stage. In doing so, the court answered the following three questions:</p>
<h4>1) Whether MySpace could be said to have knowledge of infringement so as to attract liability for
secondary infringement under Section 51(a)(ii)?</h4>
<p>No. According to the Court, in the case of internet intermediaries, section 51(a)(ii) contemplates actual knowledge and not general awareness.</p>
<p>Elaborating on the circumstances of the case, the Court held that to attract liability for secondary infringement, MySpace should have had actual knowledge and not mere awareness of the infringement. Appreciating the difference between the virtual and physical worlds, the judgment stated: “<em>the nature of internet media is such that the interpretation of knowledge cannot be the same as that is used for a physical premise.</em>”</p>
<p>As per the court, the following facts only amounted to a general awareness, which was not sufficient to establish secondary liability:</p>
<ol><li>Existence of user agreement terms which prohibited users from unauthorised uploading of content;<br />
</li><li>Operation of post-infringement mechanisms instituted by MySpace to identify and remove content;<br />
</li><li>SCIL sharing a voluminous catalogue of 100,000 copyrighted songs with MySpace, expecting the latter to monitor and quell any infringement;<br />
</li><li>Modifying videos to insert ads in them: SCIL contended that MySpace invited users to share and upload content which it would use to insert ads and earn revenue, and that this amounted to knowledge. The Court found that video modification for ad insertion only changed the format of the video and not the content; further, it was a purely automated process with no human intervention.</li></ol>
<p>Additionally, no constructive knowledge could be attributed to MySpace to demonstrate reasonable ground for believing that infringement had occurred. A reasonable belief could emerge only after MySpace had perused all the content uploaded and shared on its platform – a task that was impossible to perform due to the voluminous catalogue
handed to it and existing technological limitations.</p>
<p>The Court imposed a duty on SCIL to specify the works in which it owned copyright <em>and</em> which were being shared without authorisation on MySpace. It held that merely giving the names of all content it owned, without expressly pointing out the infringing works, was contrary to established principles of copyright law. Further, MySpace contended, and the court agreed, that in many instances the works were legally shared by distributors and performers, and that users often created remixed works which only bore semblance to the title of the copyrighted work.</p>
<p class="callout"><strong><em>In such cases it becomes even more important for a plaintiff such as
MySpace to provide specific titles, because while an intermediary may
remove the content fearing liability and damages, an authorized
individual’s license and right to fair use will suffer or stand negated.
(Para 38 in decision)</em></strong></p>
<p>Thus, while MySpace undoubtedly permitted for profit a place for communication of infringing works uploaded by users, it had neither specific knowledge nor reasonable belief of the infringement.</p>
<h4>2) Does the proviso to Section 81 override the "safe harbor" granted to intermediaries under Section 79 of the IT Act, 2000?</h4>
<p>and</p>
<h4>3) Whether it was possible to harmoniously read and interpret Sections 79 and 81 of the IT Act, and Section 51 of the Copyright Act?</h4>
<p>No, the proviso does not override the safe harbor; that is, the safe harbor defence cannot be denied to the intermediary in the case of copyright actions. Indeed, the three sections have to be read harmoniously.</p>
<p>
The judgment referred to the Parliamentary Standing Committee report as a relevant tool in interpreting the two provisions, declaring that the rights conferred under the IT Act, 2000 are supplementary and not in derogation of the Patents Act or the Copyright Act. The proviso was inserted only to permit copyright owners to demand action
against intermediaries who may themselves post infringing content – the safe harbor only existed for circumstances when content was third party/user generated.</p>
<p class="callout"><strong><em>Given the supplementary nature of the provisions- one where infringement
is defined and traditional copyrights are guaranteed and the other
where digital economy and newer technologies have been kept in mind, the
only logical and harmonious manner to interpret the law would be to read
them together. Not doing so would lead to an undesirable situation
where intermediaries would be held liable irrespective of their due
diligence. (Para 49 in decision)</em></strong></p>
<p>Regarding section 79, the court reiterated that the section only granted a limited immunity to intermediaries, a <em>measured privilege to an intermediary</em>, which was in the nature of an affirmative defence and not a blanket immunity to avoid liability. The very purpose of section 79 was to regulate and limit this liability, whereas the Copyright Act granted and controlled the rights of a copyright owner.</p>
<p>The Court found Judge Whyte’s decision in Religious Technology Centre v. Netcom Online Communication Services (1995), to be particularly relevant to the instant case, and agreed with its observations. To recall, <em>Netcom</em> was the landmark US ruling which established that when a subscriber was responsible for direct infringement, and the service providers did nothing more than setting up and operating tech systems which were
necessary for the functioning of the Internet, it was illogical to impute liability on the service provider.</p>
<h3><strong>On MySpace Complying with Safe Harbor Requirements under Section 79 of the IT Act, 2000 (and Intermediary Rules, 2011)</strong></h3>
<p>The court held that MySpace's operations were in compliance with section 79(2)(b). The content transmission was initiated at the behest of the users, the recipients were not chosen by MySpace, neither was there modification of content. On the issue of modification, the court reasoned that since modification was an automated process (MySpace was inserting ads) which changed the format only, without MySpace's tacit or expressed control or knowledge, it was in compliance of the legislative requirement.</p>
<p class="callout"><strong><em>Despite several safeguard tools and notice and take down regimes,
infringed videos find their way. The remedy here is not to target
intermediaries but to ensure that infringing material is removed in an
orderly and reasonable manner. A further balancing act is required which
is that of freedom of speech and privatized censorship. If an
intermediary is tasked with the responsibility of identifying infringing
content from non-infringing one, it could have a chilling effect on
free speech; an unspecified or incomplete list may do that.
(Para 62 in decision)</em></strong></p>
<p>On the second aspect, due diligence, the court held that MySpace complied with the due diligence procedure specified in the Rules: it published rules, regulations, a privacy policy and a user agreement for access and usage. Reading Rule 3(4) with section 79(2)(c), the court held that due diligence required MySpace to remove infringing content within 36 hours of gaining actual knowledge of it, whether on its own or through notice from another person. <strong>Only if MySpace failed to take infringing content down accordingly would safe harbour be denied to it.</strong></p>
<p>This liberal interpretation of due diligence is a big win for internet intermediaries in India.</p>
<h3><strong>Additional Issues Considered by the Court</strong></h3>
<p>MySpace also tried to defend its activities by claiming the shield of the fair dealing provisions of the Indian Copyright Act. However, the Court refused, stating that the fair dealing defence was inapplicable to the case because those provisions protected transient and incidental storage, whereas in the instant circumstances the content in question was stored and hosted permanently.</p>
<p>MySpace also contended that the Single Judge's injunction order was vague and general and had foisted unimplementable duties on MySpace, disregarding the way the Internet functioned. If MySpace had to strictly comply with the order, it would have to shut its business in India. <strong>The Court said that the Single Judge's order, if enforced, would create a system of unwarranted private censorship, running contrary to the principles of a free speech regime, devoid of considerations of peculiarities of the internet intermediary industry. </strong>Private censorship would also invite upon the ISP the legal risk of wrongfully terminating a user account.</p>
<p>Finally, the Court urged MySpace to explore and innovate techniques to protect the interests of traditional copyright holders in a more efficient manner.</p>
<h3><strong>Relief Granted</strong></h3>
<p>Setting aside the Single Judge's order, the Court directed SCIL to provide a specific catalogue of infringing works which also points to the URLs of the files. Upon receiving such specific knowledge, MySpace has been directed to remove the content within 36 hours of the notice. MySpace will also keep an account of the removals, and of the revenues earned from ads placed, for calculating damages at the trial stage.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace'>http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace</a>
</p>
Super Cassettes v. MySpace
http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space
<b>The Delhi High Court’s judgment in Super Cassettes v. MySpace last July is worrying for a number of reasons. The court failed to appreciate the working of intermediaries online and disregarded the pragmatic considerations involved. The consequences for free expression, and particularly for file sharing by users of online services, are especially unfavourable.</b>
<p style="text-align: justify; ">The judgment<a href="#fn*" name="fr*">[*]</a>is extremely worrying since it holds MySpace liable for copyright infringement, <b>despite</b> it having shown that it did not know, and could not have known, about each instance of infringement; that it removed each instance of alleged infringement upon mere complaint; that it asked Super Cassettes to submit their songs to their song identification database and Super Cassettes didn't.</p>
<p style="text-align: justify; ">This, in essence, means, that all 'social media services' in which there is even a <b>potential</b> for copyright infringement (such as YouTube, Facebook, Twitter, etc.) are now faced with a choice of either braving lawsuits for activities of their users that they have no control over — they can at best respond to takedown requests after the infringing material has already been put up — or to wind down their operations in India.</p>
<h2 style="text-align: justify; ">The Facts</h2>
<p style="text-align: justify; ">Aside from social networking, MySpace facilitates the sharing of content between its users. This case concerns content (whose copyright vested in T-Series) was uploaded by users to MySpace’s website. It appears that tensions between MySpace and T-Series arose in 2007, when T-Series entered into talks with MySpace to grant it licenses in its copyrighted content, while MySpace asked instead that T-Series register with its rights management programme. Neither the license nor the registration came about, and the infringing material continued to be available on the MySpace website.</p>
<p style="text-align: justify; ">Specifically, T-Series alleged that cases for primary infringement under section 51(a)(i) of the Copyright Act as well as secondary infringement under section 51 (a) (ii) could be made out. Alleging that MySpace had infringed its copyrights and so affected its earnings in royalties, T-Series approached the Delhi High Court and filed a suit seeking injunctive relief and damages. In proceedings for interim relief while the suit was pending, the court granted an injunction, but, in an appeal by MySpace, added the qualification that the content would have to be taken down only on receipt of a specific catalogue of infringing works available on MySpace, rather than a general list of works in which T-Series held a copyright.</p>
<h2 style="text-align: justify; ">The Defence</h2>
<p>While other arguments such as one around the jurisdiction of the court were also raised, the central issues are listed below:</p>
<ol>
<li style="text-align: justify; ">Non-Specificity of Prayer<br />T-Series’ claim in the suit is for a blanket injunction on copyrighted content on the MySpace website. This imposes a clearly untenable, even impossible, burden for intermediaries to comply with.</li>
<li style="text-align: justify; ">Knowledge<br />MySpace argued that no liability could accrue to it on two counts. The first was that it had no actual or direct knowledge or role in the selection of the content, while the second was that no control was exercised, or was exercisable over the uploading of the content. Additionally, there was no possible means by which it could have identified the offending content and segregated it from lawful content, or monitored all of the content that it serves as a platform for.</li>
<li style="text-align: justify; ">Intermediary status and Safe Harbour Protection<br />In relation to its status as an intermediary, MySpace raised several arguments. First, it argued that it had immunity under section 79 of the IT Act and under the US Digital Millennium Copyright Act (US DMCA). Another argument restated what is arguably the most basic tenet of intermediary liability that merely providing the platform by which infringement could occur cannot amount to infringement. In other words, the mere act of facilitating expression over internet does not amount to infringement. It then made reference to its terms of use and its institution of safeguards (in the form of a hash filter, a rights management tool and a system of take-down–stay-down), which it argued clearly reflect an intention to discourage or else address cases of infringement as they arise. MySpace also emphasized that a US DMCA compliant procedure was in place, although T-Series countered that the notice and take down system would not mitigate the infringement.</li>
<li style="text-align: justify; ">Relationship between MySpace and its Users<br />Taking from previous arguments about a lack of control and its status as an intermediary, MySpace argued that it was simply a licensee of users who uploaded content. The license is limited, in that MySpace is only allowed to alter user-generated content so as to make it viewable.</li>
</ol>
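<p style="text-align: justify; ">The hash filter MySpace cites among its safeguards (point 3 above) can be illustrated with a minimal sketch: a rights holder registers fingerprints of its works, and uploads whose fingerprints match are flagged. The function names here are hypothetical, and the use of an exact SHA-256 digest is a simplifying assumption; production filters use perceptual fingerprints, since an exact hash misses re-encoded copies.</p>

```python
import hashlib

# Hypothetical registry of fingerprints supplied by rights holders.
registered_hashes: set[str] = set()

def register_work(data: bytes) -> str:
    """Rights holder submits a work; store its SHA-256 fingerprint."""
    digest = hashlib.sha256(data).hexdigest()
    registered_hashes.add(digest)
    return digest

def is_flagged(upload: bytes) -> bool:
    """Check a user upload against the registered fingerprints."""
    return hashlib.sha256(upload).hexdigest() in registered_hashes

register_work(b"bytes of a copyrighted song")
print(is_flagged(b"bytes of a copyrighted song"))  # exact copy: True
print(is_flagged(b"a user's original content"))    # no match: False
```

<p style="text-align: justify; ">This sketch also shows why MySpace asked T-Series to submit its songs to the database: with no registered works, the filter has nothing to match against.</p>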
<h2 style="text-align: justify; ">Outcomes</h2>
<ol>
<li style="text-align: justify; ">Infringement by Facilitation<br />The court concluded that infringement in terms of section 51 (a) (ii) had occurred in this case, since web space is a “place” in the terms required by the section and there were monetary gains in the form of ad revenue. The argument as to a lack of knowledge of infringement was also rejected on the ground that MySpace’s provision for safeguards against infringement clearly established a reason to believe that infringement will occur. Also referenced as evidence of knowledge, or at least a reason to believe infringement would occur, is the fact that MySpace modifies the format of the content before making it available on its website. It also tested for infringement by authorization in terms of section 14 read with section 51 (a) (i), but concluded that this did not arise here.</li>
<li style="text-align: justify; ">Reading away section 79?<br />The court accepted the argument made by T-Series to the effect that sections 79 and 81 of the IT Act must be read together. Since section 79 would be overridden by section 81’s non-obstante, the effect would be that rights holders’ interests under the Copyright Act will erode intermediaries’ immunity under section 79. </li>
<li style="text-align: justify; ">Due Diligence<br />The court rejected the argument that the provision of due diligence or curative measures post-infringement would be sufficient. Specifically, the contention that the quantum of content being uploaded precludes close scrutiny, given the amount of labour that would be involved, was rejected. Content should not immediately be made available but must be subject to enquiries as to its title or to authentication of its proprietor before it is made available. In fact, it holds that, “there is no reason to axiomatically make each and every work available to the public solely because user has supplied them unless the defendants are so sure that it is not infringement.” (Paragraph 88).</li>
</ol>
<p style="text-align: justify; ">There is also an attempt to distinguish the Indian framework from the DMCA. While that law calls for post-infringement measures, it is argued that in India, on reading section 51 with section 55, the focus is on preventing infringement at the threshold. In response to the case that it would be impossible to do so, the court held that since the process here requires MySpace to modify the format of content uploaded to it to make it viewable, it will have a reasonable opportunity to test for infringement.</p>
<h2 style="text-align: justify; ">Analysis</h2>
<h3>Accounting for the Medium of Communication</h3>
<p style="text-align: justify; ">The court’s analysis of the issues begins with a predictable emphasis on how the law of copyright would operate in the context of what is termed “internet computing”, peppered with trite statements about “the virtual world of internet” creating “complexit[ies]” for copyright law. The court appears to have entered into this discussion to establish that the notion of place in section 51 (a) (ii) should extend to “web space” but the statements made here only serve to contrast starkly against its subsequent failure to account for the peculiarities of form and function of intermediaries online. Had this line of argument been taken to its logical conclusion, after the character of the medium had been appreciated, the court’s final conclusion, that MySpace is liable for copyright infringement, would have been an impossible one to arrive at.</p>
<h3 style="text-align: justify; ">And What of Free Speech?</h3>
<p style="text-align: justify; ">As it had argued before the court, intermediaries such as MySpace have no means by which to determine whether content is illegal (whether by reason of amounting to a violation of copyright, or otherwise) until content is uploaded. In other words, there is no existing mechanism by which this determination can be made at the threshold, before posting.</p>
<p style="text-align: justify; ">The court does not engage with the larger consequences for such a scheme of penalizing intermediaries. Censoring patent illegalities at the threshold, even if that were possible is one thing. The precedent that the court creates here is quite another. Given the general difficulty in conclusively establishing whether there is an infringement at all due to the complexities in applying the exceptions contained under section 52, it should not be for ordinary private or commercial interests such as intermediaries to sit in judgment over whether content is or is not published at all. In order to minimize its own liability, the likelihood of legitimate content being censored by the intermediary prior to posting is high.</p>
<p style="text-align: justify; ">The consequences for civil liberties, and free speech and expression online in particular, appear to have been completely ignored in favour of rights holders’ commercial interests.</p>
<h3 style="text-align: justify; ">Consequences for Intermediary Liability and Safe Harbour Protection</h3>
<blockquote class="pullquote" style="text-align: justify; ">Even if every instance in question did amount to an infringement of copyright and a mechanism did exist allowing for removal of content, the effect of this judgment is to create a strict liability regime for intermediaries.</blockquote>
<p style="text-align: justify; ">In other words, the court’s ruling will have the effect that courts’ determination of intermediaries’ liability will become detached from whether or not any fault can be attributed to them. MySpace did make this argument, even going as far as to suggest that doing so would impose strict liability on intermediaries. This would lead to an unprecedented and entirely unjustifiable result. In spite the fact that a given intermediary did apply all available means to prevent the publication of potentially infringing content, it would remain potentially liable for any illegality in the content, even though the illegality could not have been detected or addressed.</p>
<p style="text-align: justify; ">What is perhaps even more worrying is that MySpace’s attempt at proactively and in good faith preventing copyright infringement through its terms of use and in addressing them through its post-infringement measures was explicitly cited as evidence of knowledge of and control over the uploading of copyrighted material, at the threshold rather than ex post. This creates perverse incentives for the intermediary to ignore infringement, to the detriment of rights holders, rather than act proactively to minimize its incidence.</p>
<p style="text-align: justify; ">A final observation is that the court’s use, while pronouncing on relief, of the fact that MySpace makes a “copy” of the uploaded content by converting it into a format that could subsequently be hosted on the site and made accessible to show evidence of infringement and impose liability upon MySpace in itself is a glaring instance of the disingenuous reasoning the court employs throughout the case. There is another problem with the amended section 79, which waives immunity where the intermediary “modifies” material. That term is vague and overreaches, as it does here: altering formats to make content compatible with a given platform is not comparable to choices as to the content of speech or expression, but the reading is tenable under section 79 as it stands.</p>
<p style="text-align: justify; ">The result of all of this is to dislodge the section 79 immunity that accrues to intermediaries and replace that with a presumption that they are liable, rather than not, for any illegality in the content that they passively host.</p>
<h3 style="text-align: justify; ">Effect of the Copyright (Amendment) Act, 2012</h3>
<p style="text-align: justify; ">Since the judgment in the MySpace case, the Copyright Act has been amended to include some provisions that would bear on online service providers and on intermediaries’ liability for hosting infringing content, in particular. Section 52 (1) (b) of the amended Act provides that “transient or incidental storage of a work or performance purely in the technical process of electronic transmission or communication to the public” would not infringe copyright. The other material provision is section 52 (1) (c) which provides that “transient or incidental storage of a work or performance for the purpose of providing electronic links, access or integration, where such links, access or integration has not been expressly prohibited by the right holder, unless the person responsible is aware or has reasonable grounds for believing that such storage is of an infringing copy” will not constitute an infringement of copyright. The latter provision appears to institute a rather rudimentary, and very arguably incomplete, system of notice and takedown by way of a proviso. This requires intermediaries to takedown content on written complaint from copyright owners for a period of 21 days or until a competent rules on the matter whichever is sooner, and restore access to the content once that time period lapses, if there is no court order to sustain it beyond that period.</p>
<p style="text-align: justify; ">This post does not account for the effect that these provisions could have had on the case, but it is already clear, from the sloppy drafting of section 52 (1) (c) and its proviso that they are not entirely salutary even at the outset. At any rate, there appears to be nothing that *<i>determinatively*</i> affects intermediaries’ secondary liability, <i>i.e.</i>, their liability for users’ infringing acts.</p>
<hr />
<p style="text-align: justify; "><i>Disclosure: CIS is now a party to these proceedings at the Delhi High Court. This is a purely academic critique, and should not be seen to have any prejudice to the arguments we will make there.</i></p>
<hr />
<p>[<a href="#fr*" name="fn*">*</a>]. Super Cassettes Industries Ltd. v. MySpace Inc. and Another, on 29 July, 2011, Indian Kanoon - Search engine for Indian Law. See<a class="external-link" href="http://bit.ly/quj6JW"> http://bit.ly/quj6JW</a>, last accessed on October 31, 2012.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space'>http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space</a>
</p>
Summary Report Internet Governance Forum 2015
http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015
<b>Centre for Internet and Society (CIS), India participated in the Internet Governance Forum (IGF) held at Poeta Ronaldo Cunha Lima Conference Center, Joao Pessoa in Brazil from 10 November 2015 to 13 November 2015. The theme of IGF 2015 was ‘Evolution of Internet Governance: Empowering Sustainable Development’. Sunil Abraham, Pranesh Prakash & Jyoti Panday from CIS actively engaged and made substantive contributions to several key issues affecting internet governance at the IGF 2015. The issue-wise detail of their engagement is set out below. </b>
<p align="center" style="text-align: left;"><strong>INTERNET
GOVERNANCE</strong></p>
<p align="justify">
I. The
Multi-stakeholder Advisory Group to the IGF organised a discussion on
<em><strong>Sustainable
Development Goals (SDGs) and Internet Economy</strong></em><em>
</em>at
the Main Meeting Hall from 9:00 am to 12:30 pm on 11 November, 2015.
The
discussions at this session focused on the importance of enabling
policies and an ecosystem for the internet economy in the fulfilment of
different SDGs. Several concerns, relating to internet
entrepreneurship, effective ICT capacity building, protection of
intellectual property within and across borders, and the availability of
local applications and content, were addressed. The panel also
discussed the need to identify SDGs where internet based technologies
could make the most effective contribution. Sunil
Abraham contributed to the panel discussions by addressing the issue
of development and promotion of local content and applications. List
of speakers included:</p>
<ol>
<li>
<p align="justify">
Lenni
Montiel, Assistant-Secretary-General for Development, United Nations</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO LIRNEasia</p>
</li><li>
<p align="justify">
Sergio
Quiroga da Cunha, Head of Latin America, Ericsson</p>
</li><li>
<p align="justify">
Raúl
L. Katz, Adjunct Professor, Division of Finance and Economics,
Columbia Institute of Tele-information</p>
</li><li>
<p align="justify">
Jimson
Olufuye, Chairman, Africa ICT Alliance (AfICTA)</p>
</li><li>
<p align="justify">
Lydia
Brito, Director of the Office in Montevideo, UNESCO</p>
</li><li>
<p align="justify">
H.E.
Rudiantara, Minister of Communication & Information Technology,
Indonesia</p>
</li><li>
<p align="justify">
Daniel
Sepulveda, Deputy Assistant Secretary, U.S. Coordinator for
International and Communications Policy at the U.S. Department of
State </p>
</li><li>
<p align="justify">
Deputy
Minister, Department of Telecommunications and Postal Services,
Republic of South Africa</p>
</li><li>
<p align="justify">
Sunil
Abraham, Executive Director, Centre for Internet and Society, India</p>
</li><li>
<p align="justify">
H.E.
Junaid Ahmed Palak, Information and Communication Technology
Minister of Bangladesh</p>
</li><li>
<p align="justify">
Jari
Arkko, Chairman, IETF</p>
</li><li>
<p align="justify">
Silvia
Rabello, President, Rio Film Trade Association</p>
</li><li>
<p align="justify">
Gary
Fowlie, Head of Member State Relations & Intergovernmental
Organizations, ITU</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here:
<a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room</a></u></p>
<p align="justify">
Video of the session on Internet Economy and Sustainable Development is available here:
<a href="https://www.youtube.com/watch?v=D6obkLehVE8">https://www.youtube.com/watch?v=D6obkLehVE8</a></p>
<p align="justify"> II.
Public
Knowledge organised a workshop on <em><strong>The
Benefits and Challenges of the Free Flow of Data </strong></em>at
Workshop Room
5 from 11:00 am to 12:00 pm on 12 November, 2015. The discussions in
the workshop focused on the benefits and challenges of the free flow
of data and also the concerns relating to data flow restrictions
including ways to address
them. Sunil
Abraham contributed to the panel discussions by addressing the issue
of jurisdiction of data on the internet. The
panel for the workshop included the following.</p>
<ol>
<li>
<p align="justify">
Vint
Cerf, Google</p>
</li><li>
<p align="justify">
Lawrence
Strickling, U.S. Department of Commerce, NTIA</p>
</li><li>
<p align="justify">
Richard
Leaning, European Cyber Crime Centre (EC3), Europol</p>
</li><li>
<p align="justify">
Marietje
Schaake, European Parliament</p>
</li><li>
<p align="justify">
Nasser
Kettani, Microsoft</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS
India</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here:
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5</a></p>
<p align="justify">
Video link available here:
<a href="https://www.youtube.com/watch?v=KtjnHkOn7EQ">https://www.youtube.com/watch?v=KtjnHkOn7EQ</a></p>
<p align="justify"> III.
Article
19 and
Privacy International organised a workshop on <em><strong>Encryption
and Anonymity: Rights and Risks</strong></em>
at Workshop Room 1 from 11:00 am to 12:30 pm on 12 November, 2015.
The
workshop fostered a discussion about the latest challenges to
protection of anonymity and encryption and ways in which law
enforcement demands could be met while ensuring that individuals
still enjoyed strong encryption and unfettered access to anonymity
tools. Pranesh
Prakash contributed to the panel discussions by addressing concerns
about existing South Asian regulatory frameworks on encryption and
anonymity and emphasizing the need for pervasive encryption. The
panel for this workshop included the following.</p>
<ol>
<li>
<p align="justify">
David
Kaye, UN Special Rapporteur on Freedom of Expression</p>
</li><li>
<p align="justify">
Juan
Diego Castañeda, Fundación Karisma, Colombia</p>
</li><li>
<p align="justify">
Edison
Lanza, Organisation of American States Special Rapporteur</p>
</li><li>
<p align="justify">
Pranesh
Prakash, CIS India</p>
</li><li>
<p align="justify">
Ted
Hardie, Google</p>
</li><li>
<p align="justify">
Elvana
Thaci, Council of Europe</p>
</li><li>
<p align="justify">
Professor
Chris Marsden, Oxford Internet Institute</p>
</li><li>
<p align="justify">
Alexandrine
Pirlot de Corbion, Privacy International</p>
</li></ol>
<p align="justify"><a name="_Hlt435412531"></a>
Detailed
description of the workshop is available here:
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1</a></p>
<p align="justify">
Video link available here:
<a href="https://www.youtube.com/watch?v=hUrBP4PsfJo">https://www.youtube.com/watch?v=hUrBP4PsfJo</a></p>
<p align="justify"> IV.
Chalmers
& Associates organised a session on <em><strong>A
Dialogue on Zero Rating and Network Neutrality</strong></em>
at the Main Meeting Hall from 2:00 pm to 4:00 pm on 12 November,
2015. The Dialogue provided access to expert insight on zero-rating
and a full spectrum of diverse
views on this issue. The Dialogue also explored alternative
approaches to zero rating such as use of community networks. Pranesh
Prakash provided
a
detailed explanation of harms and benefits related to different
approaches to zero-rating. The
panellists for this session were the following.</p>
<ol>
<li>
<p align="justify">
Jochai
Ben-Avie, Senior Global Policy Manager, Mozilla, USA</p>
</li><li>
<p align="justify">
Igor
Vilas Boas de Freitas, Commissioner, ANATEL, Brazil</p>
</li><li>
<p align="justify">
Dušan
Caf, Chairman, Electronic Communications Council, Republic of
Slovenia</p>
</li><li>
<p align="justify">
Silvia
Elaluf-Calderwood, Research Fellow, London School of Economics,
UK/Peru</p>
</li><li>
<p align="justify">
Belinda
Exelby, Director, Institutional Relations, GSMA, UK</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO, LIRNEasia, Sri Lanka</p>
</li><li>
<p align="justify">
Anja
Kovacs, Director, Internet Democracy Project, India</p>
</li><li>
<p align="justify">
Kevin
Martin, VP, Mobile and Global Access Policy, Facebook, USA</p>
</li><li>
<p align="justify">
Pranesh
Prakash, Policy Director, CIS India</p>
</li><li>
<p align="justify">
Steve
Song, Founder, Village Telco, South Africa/Canada</p>
</li><li>
<p align="justify">
Dhanaraj
Thakur, Research Manager, Alliance for Affordable Internet, USA/West
Indies</p>
</li><li>
<p align="justify">
Christopher
Yoo, Professor of Law, Communication, and Computer & Information
Science, University of Pennsylvania, USA</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2</a></p>
<p align="justify"> V.
The
Internet & Jurisdiction Project organised a workshop on
<em><strong>Transnational
Due Process: A Case Study in MS Cooperation</strong></em>
at Workshop Room
4 from 11:00 am to 12:00 pm on 13 November, 2015. The
workshop discussion focused on the challenges in developing an
enforcement framework for the internet that guarantees transnational
due process and legal interoperability. The discussion also focused
on innovative approaches to multi-stakeholder cooperation such as
issue-based networks, inter-sessional work methods and transnational
policy standards. The panellists for this discussion were the
following.</p>
<ol>
<li>
<p align="justify">
Anne Carblanc, Head of Division, Directorate for Science, Technology and Industry, OECD</p>
</li><li>
<p align="justify">
Eileen Donahoe, Director, Global Affairs, Human Rights Watch</p>
</li><li>
<p align="justify">
Byron Holland, President and CEO, CIRA (Canadian ccTLD)</p>
</li><li>
<p align="justify">
Christopher Painter, Coordinator for Cyber Issues, US Department of State</p>
</li><li>
<p align="justify">
Sunil Abraham, Executive Director, CIS India</p>
</li><li>
<p align="justify">
Alice Munyua, Lead, dotAfrica Initiative and GAC representative, African Union Commission</p>
</li><li>
<p align="justify">
Will Hudson, Senior Advisor for International Policy, Google</p>
</li><li>
<p align="justify">
Dunja Mijatovic, Representative on Freedom of the Media, OSCE</p>
</li><li>
<p align="justify">
Thomas Fitschen, Director for the United Nations, for International Cooperation against Terrorism and for Cyber Foreign Policy, German Federal Foreign Office</p>
</li><li>
<p align="justify">
Hartmut Glaser, Executive Secretary, Brazilian Internet Steering Committee</p>
</li><li>
<p align="justify">
Matt Perault, Head of Policy Development, Facebook</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4</a></p>
<p align="justify">
Video link for Transnational Due Process: A Case Study in MS Cooperation is available here <a href="https://www.youtube.com/watch?v=M9jVovhQhd0">https://www.youtube.com/watch?v=M9jVovhQhd0</a></p>
<p align="justify"> VI.
The Internet Governance Project organised a meeting of the
<em><strong>Dynamic
Coalition on Accountability of Internet Governance Venues</strong></em>
at Workshop Room 2 from 14:00
– 15:30 on
12 November, 2015. The coalition
brought together panelists to highlight the challenges in developing an accountability framework for internet governance venues, which includes setting up standards and developing a set of concrete criteria. Jyoti Panday provided the civil society perspective on why accountability is necessary in internet governance processes and organizations. The panelists for this workshop included
the following.</p>
<ol>
<li>
<p>
Robin
Gross, IP Justice</p>
</li><li>
<p>
Jeanette
Hofmann, Director
<a href="http://www.internetundgesellschaft.de/">Alexander
von Humboldt Institute for Internet and Society</a></p>
</li><li>
<p>
Farzaneh
Badiei,
Internet Governance Project</p>
</li><li>
<p>
Erika Mann, Managing Director, Public Policy, Facebook, and Board of Directors, ICANN</p>
</li><li>
<p>
Paul
Wilson, APNIC</p>
</li><li>
<p>
Izumi
Okutani, Japan
Network Information Center (JPNIC)</p>
</li><li>
<p>
Keith Drazek, Verisign</p>
</li><li>
<p>
Jyoti
Panday,
CIS</p>
</li><li>
<p>
Jorge
Cancio,
GAC representative</p>
</li></ol>
<p>
Detailed
description of the workshop is available here
<a href="http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video link available here <a href="https://www.youtube.com/watch?v=UIxyGhnch7w">https://www.youtube.com/watch?v=UIxyGhnch7w</a></p>
<p> VII.
Digital
Infrastructure
Netherlands Foundation organized an open forum at
Workshop Room 3
from 11:00
– 12:00
on
10
November, 2015. The open
forum discussed the increase
in government engagement with “the internet” to protect their
citizens against crime and abuse and to protect economic interests
and critical infrastructures. It brought together panelists to present ideas about an agenda for the international protection of ‘the public core of the internet’ and to collect and discuss ideas for the formulation of norms and principles and for the identification of practical steps towards that goal. Pranesh Prakash participated in the open forum. The speakers included</p>
<ol>
<li>
<p>
Bastiaan Goslings, AMS-IX, NL</p>
</li><li>
<p>
Pranesh Prakash, CIS, India</p>
</li><li>
<p>
Marilia Maciel (FGV, Brazil)</p>
</li><li>
<p>
Dennis Broeders (NL Scientific Council for Government Policy)</p>
</li></ol>
<p>
Detailed
description of the open
forum is available here
<a href="http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf">http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf</a></p>
<p>
Video
link available here <a href="https://www.youtube.com/watch?v=joPQaMQasDQ">https://www.youtube.com/watch?v=joPQaMQasDQ</a></p>
<p>
VIII.
UNESCO, Council of Europe, Oxford University, Office of the High
Commissioner on Human Rights, Google, Internet Society organised a
workshop on hate speech and youth radicalisation at Room 9 on
Thursday, November 12. UNESCO shared the initial outcomes of its commissioned research on online hate speech, including practical recommendations on combating online hate speech by understanding the challenges, mobilizing civil society, lobbying the private sector and intermediaries, and educating individuals in media and information literacy. The workshop also discussed how to help empower youth to address online radicalization and extremism, and realize their aspirations to contribute to a more peaceful and sustainable world. Sunil Abraham provided his inputs. The participants included the following.</p>
<ol>
<li>
<p>
Chaired by Ms Lidia Brito, Director, UNESCO Office in Montevideo</p>
</li><li>
<p>
Frank La Rue, Former Special Rapporteur on Freedom of Expression</p>
</li><li>
<p>
Lillian Nalwoga, President, ISOC Uganda and representative of CIPESA, Technical community</p>
</li><li>
<p>
Bridget O’Loughlin, CoE, IGO</p>
</li><li>
<p>
Gabrielle Guillemin, Article 19</p>
</li><li>
<p>
Iyad Kallas, Radio Souriali</p>
</li><li>
<p>
Sunil Abraham, Executive Director, Centre for Internet and Society, Bangalore, India</p>
</li><li>
<p>
Eve Salomon, Global Chairman of the Regulatory Board, RICS</p>
</li><li>
<p>
Representative, GNI</p>
</li><li>
<p>
Remote Moderator: Xianhong Hu, UNESCO</p>
</li><li>
<p>
Rapporteur: Guilherme Canela De Souza Godoi, UNESCO</p>
</li></ol>
<p>
Detailed
description of the workshop
is available here
<a href="http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video
link to the panel is available here
<a href="https://www.youtube.com/watch?v=eIO1z4EjRG0">https://www.youtube.com/watch?v=eIO1z4EjRG0</a></p>
<p> <strong>INTERMEDIARY
LIABILITY</strong></p>
<p align="justify">
IX.
Electronic
Frontier Foundation, Centre for Internet and Society India, Open Net Korea and Article 19 collaborated to organize
a workshop on the <em><strong>Manila
Principles on Intermediary Liability</strong></em>
at Workshop Room 9 from 11:00 am to 12:00 pm on 13 November 2015. The
workshop elaborated on the Manila Principles, a high-level framework of best practices and safeguards for content restriction and for addressing intermediary liability for third-party content. The workshop saw participants engaged in overlapping projects on content restriction practices come together to give feedback and highlight recent developments across liability regimes. Jyoti Panday laid down the key details of the Manila Principles framework in this session. The panelists for this workshop included the following.</p>
<ol>
<li>
<p align="justify">
Kelly Kim, Open Net Korea</p>
</li><li>
<p align="justify">
Jyoti Panday, CIS India</p>
</li><li>
<p align="justify">
Gabrielle Guillemin, Article 19</p>
</li><li>
<p align="justify">
Rebecca MacKinnon, on behalf of UNESCO</p>
</li><li>
<p align="justify">
Giancarlo
Frosio, Center for Internet and Society, Stanford Law School</p>
</li><li>
<p align="justify">
Nicolo
Zingales, Tilburg University</p>
</li><li>
<p align="justify">
Will
Hudson, Google</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9</a></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=kFLmzxXodjs">https://www.youtube.com/watch?v=kFLmzxXodjs</a></p>
<p align="justify"> <strong>ACCESSIBILITY</strong></p>
<p align="justify">
X.
Dynamic
Coalition
on Accessibility and Disability and Global Initiative for Inclusive
ICTs organised a workshop on <em><strong>Empowering
the Next Billion by Improving Accessibility</strong></em><em>
</em>at
Workshop Room 6 from 9:00 am to 10:30 am on 13 November, 2015. The
discussion focused on the need for, and ways of, removing accessibility barriers that prevent over one billion potential users from benefiting from the Internet, including for essential services. Sunil Abraham spoke about the lack of compliance of existing ICT infrastructure with well-established accessibility standards, particularly in relation to accessibility barriers in the disaster management process. He discussed the barriers faced by persons with
physical or psychosocial disabilities. The
panelists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">
Francesca
Cesa Bianchi, G3ICT</p>
</li><li>
<p align="justify">
Cid
Torquato, Government of Brazil</p>
</li><li>
<p align="justify">
Carlos
Lauria, Microsoft Brazil</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS India</p>
</li><li>
<p align="justify">
Derrick
L. Cogburn, Institute on Disability and Public Policy (IDPP) for the ASEAN (Association of Southeast Asian Nations) Region</p>
</li><li>
<p align="justify">
Fernando
H. F. Botelho, F123 Consulting</p>
</li><li>
<p align="justify">
Gunela
Astbrink, GSA InfoComm</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3</a></u></p>
<p align="justify">
Video link for Empowering the Next Billion by Improving Accessibility is available here <a href="https://www.youtube.com/watch?v=7RZlWvJAXxs">https://www.youtube.com/watch?v=7RZlWvJAXxs</a></p>
<p align="justify"> <strong>OPENNESS</strong></p>
<p align="justify">
XI.
A
workshop on <em><strong>FOSS
& a Free, Open Internet: Synergies for Development</strong></em>
was organized at Workshop Room 7 from 2:00 pm to 3:30 pm on 13
November, 2015. The discussion focused on the increasing risk to the openness of the internet and to the ability of present & future generations to use technology to improve their lives. The panel shared different perspectives on the future co-development of FOSS and a free, open Internet; the threats that are emerging; and ways for communities to surmount these. Sunil Abraham emphasised the importance of free software, open standards, open access and access to knowledge, noted the absence of this mandate in the draft outcome document for the upcoming WSIS+10 review, and called for its inclusion. Pranesh Prakash further contributed to the discussion by emphasizing the need for free and open source software with end‑to‑end encryption and traffic-level encryption, based on open standards, that is decentralized and works through federated networks. The panellists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">
Satish
Babu, Technical Community, Chair, ISOC-TRV, Kerala, India</p>
</li><li>
<p align="justify">
Judy
Okite, Civil Society, FOSS Foundation for Africa</p>
</li><li>
<p align="justify">
Mishi
Choudhary, Private Sector, Software Freedom Law Centre, New York</p>
</li><li>
<p align="justify">
Fernando Botelho, Private Sector, Head of F123 Systems, Brazil</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS
India</p>
</li><li>
<p align="justify">
Pranesh
Prakash, CIS
India</p>
</li><li>
<p align="justify">
Nnenna Nwakanma, World Wide Web Foundation</p>
</li><li>
<p align="justify">
Yves Miezan Ezo, Open Source strategy consultant</p>
</li><li>
<p align="justify">
Corinto
Meffe, Advisor to the President and Directors, SERPRO, Brazil</p>
</li><li>
<p align="justify">
Frank
Coelho de Alcantara, Professor, Universidade Positivo, Brazil</p>
</li><li>
<p align="justify">
Caroline
Burle, Institutional and International Relations, W3C Brazil Office
and Center of Studies on Web Technologies</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7" target="_top">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7</a></u></p>
<p align="justify">
Video
link available here <a href="https://www.youtube.com/watch?v=lwUq0LTLnDs">https://www.youtube.com/watch?v=lwUq0LTLnDs</a></p>
<p align="justify">
<br /><br /></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015'>http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015</a>
</p>
Blog entry by jyoti, published 2015-11-30.

Submission to the Facebook Oversight Board in Case 2021-008-FB-FBR: Brazil, Health Misinformation and Lockdowns
http://editors.cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns
<b>In this note, we answer questions set out by the Board, pursuant to case 2021-008-FB-FBR, which concerned a post made by a Brazilian sub-national health official, and raised questions on health misinformation and enforcement of Facebook's community standards. </b>
<h1 style="text-align: justify;" dir="ltr">Background </h1>
<p dir="ltr">The <a href="https://about.fb.com/news/tag/oversight-board/">Oversight Board</a> is an expert body created to exercise oversight over Facebook’s content moderation decisions and enforcement of community guidelines. It is entirely independent from Facebook in its funding and administration and provides decisions on questions of policy as well as individual cases. It can also make recommendations on Facebook’s content policies. Its decisions are binding on Facebook, unless implementing them could violate the law. Accordingly, Facebook <a href="https://transparency.fb.com/oversight/oversight-board-cases/">implements</a> these decisions across identical content with parallel context, when it is technically and operationally possible to do so. </p>
<p dir="ltr">In June 2021, the Board made an <a href="https://oversightboard.com/news/170403765029629-announcement-of-case-2021-008-fb-fbr/">announcement</a> soliciting public comments on case 2021-008-FB-FBR, concerning a Brazilian state level medical council’s post questioning the effectiveness of lockdowns during the COVID-19 pandemic. Specifically, the post noted that lockdowns (i) are ineffective; (ii) lead to an increase in mental disorders, alcohol abuse, drug abuse, economic damage etc.; (iii) are against fundamental rights under the Brazilian Constitution; and (iv) are condemned by the World Health Organisation (“WHO”). These assertions were backed up by two statements (i) an alleged quote by Dr. Nabarro (WHO) stating that “the lockdown does not save lives and makes poor people much poorer”; and (ii) an example of how the Brazilian state of Amazonas had an increase in deaths and hospital admissions after lockdown. Ultimately, the post concluded that effective COVID-19 preventive measures include education campaigns about hygiene measures, use of masks, social distancing, vaccination and extensive monitoring by the government — but never the decision to adopt lockdowns. The post was viewed around 32,000 times and shared over 270 times. It was not reported by anyone. </p>
<p dir="ltr">Facebook did not take any action against the post, since it was of the view that the post did not violate its community standards. Moreover, the WHO has not advised Facebook to remove claims against lockdowns. In this scenario, Facebook referred the case to the Oversight Board, citing its public importance. </p>
<p dir="ltr">In its announcement, the Board sought answers on the following points: </p>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook’s decision to take no action against the content was consistent with its Community Standards and other policies, including the Misinformation and Harm policy (which sits within the rules on <a href="https://www.facebook.com/communitystandards/credible_violence">Violence and Incitement</a>). </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook’s decision to take no action is consistent with the company’s stated values and human rights commitments. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether, in this case, Facebook should have considered alternative enforcement measures to removing the content (e.g., the <a href="https://www.facebook.com/communitystandards/false_news">False News</a> Community Standard places an emphasis on “reduce” and “inform,” including: labelling, downranking, providing additional context etc.), and what principles should inform the application of these measures. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">How Facebook should treat content posted by the official accounts of national or sub-national level public health authorities, including where it may diverge from official guidance from international health organizations. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Insights on the post’s claims and their potential impact in the context of Brazil, including on national efforts to prevent the spread of COVID-19. </p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Whether Facebook should create a new Community Standard on health misinformation, as recommended by the Oversight Board in case decision <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a>.</p>
</li></ol>
<h1 style="text-align: justify;" dir="ltr">Submission to the Board</h1>
<p dir="ltr">Facebook’s decision to take no action against the post is consistent with its (i) <a href="https://www.facebook.com/communitystandards/credible_violence">Violence and Incitement</a> community standard read with the <a href="https://www.facebook.com/help/230764881494641">COVID-19 Policy Updates and Protections</a>; and (ii) <a href="https://www.facebook.com/communitystandards/false_news">False News</a> community standard. Facebook’s<a href="https://about.fb.com/news/2018/08/hard-questions-free-expression/"> website</a> as well as all of the Board’s <a href="https://oversightboard.com/decision/FB-6YHRXHZR/">past</a> <a href="https://oversightboard.com/decision/FB-QBJDASCV/">decisions</a> refer to the International Covenant on Civil and Political Rights’ (ICCPR) jurisprudence based <a href="https://www2.ohchr.org/english/bodies/hrc/docs/gc34.pdf">three-pronged test</a> of legality, legitimate aim, and necessity and proportionality in determining violations of Facebook’s community standards. Facebook must apply the same principles to guide the use of its enforcement actions too, keeping in mind the context, intent, tone and impact of the speech. </p>
<p dir="ltr">First, none of Facebook’s aforementioned rules contain explicit prohibitions on content questioning lockdown effectiveness. There is nothing to indicate that “misinformation”, which is undefined, includes within its scope information about the effectiveness of lockdowns. The World Health Organisation has also not advised against such posts. Applying the principle of legality, no person can reasonably foresee that such content is prohibited. Accordingly, Facebook’s community standards have not been violated. </p>
<p dir="ltr">Second, the post does not meet the threshold of causing “imminent” harm stipulated in the community standards. Case decision <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a>, notes that an assessment of “imminence” is made with reference to factors like context, speaker credibility, language etc. Presently, the post’s language and tone, including its quoting of experts and case studies, indicate that its intent is to encourage informed, scientific debate on lockdown effectiveness. </p>
<p dir="ltr">Third, Facebook’s false news community standard does not contain any explicit prohibitions. Hence there is no question of its violation. Any decision to the contrary may go against the standard’s stated policy logic of not stifling public discourse, and create a chilling effect on posts questioning lockdown efficacy. This would set a problematic precedent that Facebook would be mandated to implement.</p>
<p dir="ltr">Presently, Facebook cannot remove the post since no community standards have been violated. Facebook must not reduce the post’s circulation since this may stifle public discussion around lockdown effectiveness. Further, its removal would have resulted in violation of the user’s right to freedom of opinion and expression, as guaranteed by the Universal Declaration of Human Rights (UDHR) and the ICCPR, which are in turn part of Facebook’s Corporate Human Rights Policy. </p>
<p dir="ltr">Instead, Facebook can provide additional context along with the post through its “<a href="https://about.fb.com/news/2018/04/inside-feed-article-context/">related articles</a>” feature, by showing fact checked articles talking about the benefits of lockdown. This approach is the most beneficial since (i) it is less restrictive than reducing circulation of the post; (ii) it balances interests better than not taking any actions by allowing people to be informed about both sides of the debate on lockdowns so that they can make an informed assessment. </p>
<p dir="ltr">Further, Facebook’s treatment of content posted by official accounts of national or sub-national health authorities should be circumscribed by its updated <a href="https://transparency.fb.com/features/approach-to-newsworthy-content/">Newsworthy Content Policy</a>, and the Board’s decision in the <a href="https://oversightboard.com/decision/FB-691QAMHJ/">2021-001-FB-FBR</a>, which had adopted the <a href="https://www.ohchr.org/en/issues/freedomopinion/articles19-20/pages/index.aspx">Rabat Plan of Action</a> to determine whether a restriction on freedom of expression is required to prevent incitement. The Rabat Plan of Action proposes a six-prong test, that considers: a) the social and political context, b) status of the speaker, c) intent to incite the audience against a target group, d) content and form of the speech, e) extent of its dissemination and f) likelihood of harm, including imminence. Apart from taking these factors into consideration, Facebook must <a href="https://transparency.fb.com/features/approach-to-newsworthy-content/">perform</a> a balancing test to determine whether the public interest of the information in the post outweighs the risks of harm. </p>
<p dir="ltr">In case decision <a href="https://oversightboard.com/decision/FB-XWJQBU9A/">2020-006-FB-FBR</a>, the Board recommended that Facebook: a) set out a clear and accessible Community Standard on health misinformation, b) consolidate and clarify existing rules in one place (including defining key terms such as misinformation), and c) provide "detailed hypotheticals that illustrate the nuances of interpretation and application of [these] rules" for further clarity to users. Following this, Facebook has <a href="https://assets.documentcloud.org/documents/20491921/covid-19-response-full.pdf">notified</a> its implementation measures, fully implementing these recommendations and thereby coming into compliance.</p>
<p dir="ltr">Finally, Brazil is one of the <a href="https://www.bbc.com/news/world-51235105">worst affected</a> countries in the pandemic. It has also been <a href="https://www.ft.com/content/ea62950e-89c0-4b8b-b458-05c90a55b81f">struggling </a>to combat the spread of fake news during the pandemic. President Bolsonaro has been <a href="https://www.hrw.org/news/2021/01/28/brazil-crackdown-critics-covid-19-response">criticised</a> for <a href="https://www.theguardian.com/commentisfree/2020/feb/07/democracy-and-freedom-of-expression-are-under-threat-in-brazil">curbing free speech</a> by using a dictatorship-era <a href="http://www.iconnectblog.com/2021/02/undemocratic-legislation-to-undermine-freedom-of-speech-in-brazil/">national security law</a>, and questioned on his handling of the pandemic, including his own controversial <a href="https://www.bbc.com/news/world-latin-america-56479614">statements </a>questioning lockdown effectiveness. In such a scenario, the post may be perceived in a political colour rather than as an attempt at scientific discussion. However, it is unlikely that the post will lead to any knee-jerk reactions, since people are already familiar with the lockdown debate, on which much has already been said and done. A post like this, which merely reiterates one side of an ongoing debate, is not likely to cause people to take any action to violate lockdowns.</p>
<p dir="ltr">For detailed explanation on these questions, please see <a class="external-link" href="https://cis-india.org/internet-governance/facebook-oversight-board-submission-brazil">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns'>http://editors.cis-india.org/internet-governance/blog/submission-to-the-facebook-oversight-board-in-case-2021-008-fb-fbr-brazil-health-misinformation-and-lockdowns</a>
</p>
Blog entry by Tanvi Apte and Torsha Sarkar, published 2021-07-01.

Statutory Motion Against Intermediary Guidelines Rules
http://editors.cis-india.org/internet-governance/blog/statutory-motion-against-intermediary-guidelines-rules
<b>Rajya Sabha MP, Shri P. Rajeeve has moved a motion that the much-criticised Intermediary Guidelines Rules be annulled. </b>
<h2>Motion to Annul Intermediary Guidelines Rules</h2>
<p>A <a href="http://164.100.47.5/newsite/bulletin2/Bull_No.aspx?number=49472">motion to annul</a> the <a href="http://cis-india.org/internet-governance/resources/intermediary-guidelines-rules">Intermediary Guidelines Rules</a> was moved on March 23, 2012, by <a href="http://india.gov.in/govt/rajyasabhampbiodata.php?mpcode=2106">Shri P. Rajeeve</a>, CPI(M) MP in the Rajya Sabha from Thrissur, Kerala.</p>
<p>The motion reads:</p>
<p>"That this House resolves that the Information Technology (Intermediaries Guidelines) Rules, 2011 issued under clause (zg) of sub-section (2) of Section 87 read with sub-section (2) of Section 79 of the Information Technology Act, 2000 published in the Gazette of India dated the 13th April, 2011 vide Notification No. G.S.R 314(E) and laid on the Table of the House on the 12th August, 2011, be annuled; and</p>
<p>That this House recommends to Lok Sabha that Lok Sabha do concur on this Motion."</p>
<p>This isn't the first time that Mr. Rajeeve has raised his voice against the Intermediary Guidelines Rules. Indeed, even when the Rules were still in draft stage, he, along with MPs Kumar Deepak Das, Rajeev Chandrashekar, and Mahendra Mohan, drew Parliamentarians' <a href="http://rajeev.in/pages/..%5CNews%5Ccensorship_Blogs%5CBloggers_Internet.html">attention to the rules</a>. Yet, the government did not heed the MPs' concern, nor the concern of all the civil society organizations that wrote to them about the human rights implications of the new laws. On September 6, 2011, Lok Sabha MP <a href="http://editors.cis-india.org/internet-governance/blog/164.100.47.132/debatestext/15/VIII/0609.pdf">Jayant Choudhary gave notice</a> (under Rule 377 of the Lok Sabha Rules) that the Intermediary Guidelines Rules as well as the Reasonable Security Practices Rules need to be reviewed. Yet, the government has not even addressed those concerns, and indeed has cracked down even harder on online freedom of speech since then.</p>
<h2>Fundamental Problems with Intermediary Guidelines Rules</h2>
<p>The fundamental problems with the Rules, which deal with objectionable material online, are as follows:</p>
<h3>Shifting blame.</h3>
<p>It makes the 'intermediary', including ISPs like BSNL and Airtel, responsible for objectionable content that their users have put up.</p>
<h3>No chance to defend.</h3>
<p>There is no need to inform users before this content is removed. So, even material put up by a political party can be removed based on <em>anyone's</em> complaint, without telling that party. This was done against a site called "CartoonsAgainstCorruption.com". This goes against Article 19(1)(a).</p>
<h3>Lack of transparency</h3>
<p>No information is required to be provided that content has been removed. It's a black-box system, with no one, not even the government, knowing that content has been removed following a request. So even the government does not know how many sites have been removed after these Rules have come into effect.</p>
<h3>No differentiation between intermediaries.</h3>
<p>A one-size-fits-all system is followed where an e-mail provider is equated with an online newspaper, which is equated with a video upload site, which is equated with a search engine. This is like equating the post-office and a book publisher as being equivalent for, say, defamatory speech. This is violative of Article 14 of the Constitution, which requires that unequals be treated unequally by the law.</p>
<h3>No proportionality.</h3>
<p>A DNS provider (i.e., the person who gives you your web address) is an intermediary who can be asked to 'disable access' to a website on the basis of a single page, even though the rest of the site has nothing objectionable.</p>
<h3>Vague and unconstitutional requirements.</h3>
<p>Disparaging speech that is not defamatory is not criminalised in India, and cannot be, because the Constitution does not allow for it. Content about gambling in print is not unlawful, yet all Internet intermediaries are now required to remove any content that promotes gambling.</p>
<h3>Allows private censorship.</h3>
<p>The Rules do not draw a distinction between arbitrary actions of an intermediary and take-downs subsequent to a request.</p>
<h3>Presumption of illegality.</h3>
<p>The Rules are based on the presumption that all complaints (and the resultant mandatory taking down of content) are correct, and that the incorrectness of a take-down can be disputed in court (if the speaker ever discovers that the content has been removed). This is contrary to the presumption of validity of speech used by Indian courts, and is akin to prior restraint on speech. Courts have held that for content such as defamation, prior restraints cannot be placed on speech, and that civil and criminal action can only be taken post-speech.</p>
<h3>Government censorship, not 'self-regulation'.</h3>
<p>The government says these are industry best-practices in existing terms of service agreements. But the Rules require all intermediaries to include the government-prescribed terms in an agreement, no matter what services they provide. It is one thing for a company to choose the terms of its terms of service agreement, and completely another for the government to dictate those terms of service.</p>
<h2>Problems Noted Early</h2>
<p>We have noted in the past the problems with the Rules, including when the Rules were still in draft form:</p>
<ul>
<li>
<p><a href="http://cis-india.org/internet-governance/blog/intermediary-due-diligence">CIS Para-wise Comments on Intermediary Due Diligence Rules, 2011</a> </p>
</li>
<li>
<p><a href="http://www.outlookindia.com/article.aspx?279712">E-Books Are Easier To Ban Than Books</a></p>
</li>
<li>
<p><a href="http://kafila.org/2012/01/11/invisible-censorship-how-the-government-censors-without-being-seen-pranesh-prakash/">Invisible Censorship: How the Government Censors Without Being Seen</a></p>
</li>
<li>
<p><a href="http://india.blogs.nytimes.com/2011/12/07/chilling-impact-of-indias-april-internet-rules/">'Chilling' Impact of India's April Internet Rules</a></p>
</li>
<li>
<p><a href="http://www.tehelka.com/story_main51.asp?filename=Op280112proscons.asp">The Quixotic Fight To Clean Up The Web</a></p>
</li>
<li>
<p><a href="http://cis-india.org/internet-governance/online-pre-censorship-harmful-impractical">Online Pre-censorship is Harmful and Impractical</a></p>
</li>
<li>
<p><a href="http://www.indianexpress.com/story-print/787789/">Killing the Internet Softly With Its Rules</a></p>
</li>
</ul>
<p>Other organizations like the Software Freedom Law Centre also sent in <a href="http://softwarefreedom.in/index.php?option=com_content&view=article&id=78&Itemid=79">scathing comments on the law</a>, noting that the Rules are unconstitutional.</p>
<p>We are very glad that Shri Rajeeve has moved this motion, and we hope that it gets adopted in the Lok Sabha as well, and that the Rules get defeated.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/statutory-motion-against-intermediary-guidelines-rules'>http://editors.cis-india.org/internet-governance/blog/statutory-motion-against-intermediary-guidelines-rules</a>
</p>
No publisher · pranesh · IT Act · Parliament · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · Censorship · 2012-04-03T09:35:41Z · Blog Entry
Roundtable on Intermediary Liability and Gender Based Violence at the Digital Citizen Summit, 2018
http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018
<b>Akriti Bopanna and Ambika Tandon conducted a panel on 'Gender and Intermediary Liability' at the Digital Citizen Summit, hosted by the Digital Empowerment Foundation, on November 1, 2018 at India International Centre, New Delhi.</b>
<p class="moz-quote-pre">Ambika was the moderator for the panel, with Apar Gupta, Jyoti Pandey, Amrita Vasudevan, Anja Kovacs, and Japleen Pasricha as speakers. Click to read the <a class="external-link" href="http://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit">concept note</a> and the <a class="external-link" href="http://cis-india.org/internet-governance/files/dcs-2018-agenda">agenda</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018'>http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018</a>
</p>
No publisher · Admin · Internet Governance · Intermediary Liability · 2018-11-07T02:55:40Z · News Item
Roundtable Discussion on Intermediary Liability
http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability
<b>Tanaya Rajwade participated in a roundtable discussion on intermediary liability organised by SFLC and the Dialogue in New Delhi on October 17, 2019.</b>
<p>Click to view the <a class="external-link" href="http://cis-india.org/internet-governance/files/internet-liability">agenda</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability'>http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability</a>
</p>
No publisher · Admin · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-10-20T07:08:11Z · News Item
Role of Intermediaries in Countering Online Abuse
http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse
<b>The Internet can be a hostile space and protecting users from abuse without curtailing freedom of expression requires a balancing act on the part of online intermediaries.</b>
<p style="text-align: justify; ">This was published as two blog entries on the NALSAR Law Tech Blog. Part 1 can be accessed <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-i/">here</a> and Part 2 <a class="external-link" href="https://techlawforum.wordpress.com/2015/06/30/role-of-intermediaries-in-countering-online-abuse-still-a-work-in-progress-part-ii/">here</a>.</p>
<hr />
<p style="text-align: justify; ">As platforms and services coalesce around user-generated content (UGC) and entrench themselves in the digital publishing universe, they are increasingly taking on the duties and responsibilities of protecting rights including taking reasonable measures to restrict unlawful speech. Arguments around the role of intermediaries tackling unlawful content usually center around the issue of regulation—when is it feasible to regulate speech and how best should this regulation be enforced?</p>
<p class="Standard" style="text-align: justify; ">Recently, Twitter found itself at the periphery of such questions when an anonymous user of the platform, @LutyensInsider, began posting slanderous and sexually explicit comments about Swati Chaturvedi, a Delhi-based journalist. The online spat, which began in February last year, culminated in <a href="http://www.dailyo.in/politics/twitter-trolls-swati-chaturvedi-lutyensinsider-presstitutes-bazaru-media-delhi-police/story/1/4300.html">Swati filing an FIR</a> against the anonymous user last week. Within hours of the FIR, the anonymous user deleted the tweets and went silent. Predictably, Twitter users <a href="https://twitter.com/bainjal/status/609343547796426752">hailed this</a> as a much-needed deterrent to online harassment. Swati’s personal victory is worth celebrating; it is an encouragement to the many women bullied daily on the Internet, where harassment is rampant. However, while Swati might be well within her legal rights to counter slander, the rights and liabilities of private companies in such circumstances are often not as clear cut.</p>
<p class="Standard" style="text-align: justify; ">Should platforms like Twitter take on the mantle of deciding what speech is permissible? When and how should the limits on speech be drawn? Does this amount to private censorship? The answers are not easy, and as the recent Grand Chamber of the European Court of Human Rights (ECtHR) <a href="http://hudoc.echr.coe.int/sites/eng/pages/search.aspx?i=001-126635">judgment in the case of</a> Delfi AS v. Estonia confirms, the role of UGC platforms in balancing user rights is an issue far from settled. In its ruling, the ECtHR reasoned that, because of their role in facilitating expression, requiring online platforms <i>“to take effective measures to limit the dissemination of hate speech and speech inciting violence”</i> was not ‘private censorship’.</p>
<p class="Standard" style="text-align: justify; ">This is problematic because the decision moves the regime away from a framework that grants immunity from liability as long as platforms meet certain criteria and procedures. In <a href="http://www.jipitec.eu/issues/jipitec-5-3-2014/4091">other words</a>, the ruling establishes strict liability for intermediaries in relation to manifestly illegal content, even where they have no knowledge of it. The 'obligation' placed on the intermediary does not grant them safe harbour, and is disproportionate to the monitoring and blocking capacity it necessitates. Consequently, platforms might be incentivized to err on the side of caution and restrict comments or confine speech, resulting in censorship. The ruling is especially worrying because the standard of care placed on the intermediary does not recognize the different roles played by intermediaries in the detection and removal of unlawful content. Further, intermediary liability is both a legal regime in its own right and a subset of various legal issues that require an understanding of variations in scenarios, mediums and technology, both globally and in India.</p>
<h3 class="Standard">Law and Short of IT</h3>
<p class="Standard" style="text-align: justify; ">Earlier this year, in a <a href="http://www.theverge.com/2015/2/4/7982099/twitter-ceo-sent-memo-taking-personal-responsibility-for-the">leaked memo</a>, the Twitter CEO Dick Costolo took personal responsibility for his platform's chronic failure to deal with harassment and abuse. In Swati's case, Twitter did not intervene or take steps to address the harassment. If it had to, Twitter (India), like all online intermediaries, would be bound by the provisions established under Section 79 and the accompanying Rules of the Information Technology Act. These provisions outline the obligations and conditions that intermediaries must fulfil to claim immunity from liability for third-party content. Under the regime, upon receiving actual knowledge of unlawful information on its platform, the intermediary must comply with the notice and takedown (NTD) procedure for blocking and removal of content.</p>
<p class="Standard" style="text-align: justify; ">Private complainants could invoke the NTD procedure, forcing intermediaries to act as adjudicators of an unlawful act, a role they are clearly ill-equipped to perform, especially when the content relates to political speech or alleged defamation or obscenity. The SC judgment in Shreya Singhal addressed this issue by reading down the provision (Section 79), holding that a takedown notice can only be given effect if the complainant secures a court order to support her allegation. Further, it was held that the scope of restrictions under the mechanism is limited to the specific categories identified under Article 19(2). Effectively, this means Twitter need not take down content in the absence of a court order.</p>
<h3 class="Standard">Content Policy as Due Diligence</h3>
<p class="Standard" style="text-align: justify; ">Another provision, Rule 3(2), prescribes a content policy which, prior to the Shreya Singhal judgment, was a criterion for administering takedowns. This content policy includes an exhaustive list of types of restricted expression, though worryingly, the terms it contains are not clearly defined and go beyond the reasonable restrictions envisioned under Article 19(2). Terms such as “grossly harmful”, “objectionable”, “harassing”, “disparaging” and “hateful” are not defined anywhere in the Rules, and are subjective and contestable, as alternative interpretations and standards could be offered for the same term. Further, this content policy does not apply to content created by the intermediary itself.</p>
<p class="Standard" style="text-align: justify; ">Prior to the SC verdict in Shreya Singhal, <a href="http://cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability">actual knowledge could have been interpreted</a> to mean that the intermediary is called upon to exercise its own judgement under sub-rule (4) and restrict impugned content in order to seek exemption from liability. While the liability that accrued from not complying with takedown requests under the content policy was clear, this is no longer the case. By reading down S. 79(3)(b), the court has placed limits on both the private censorship of intermediaries and the invisible censorship of opaque government takedown requests, as both must now adhere to the boundaries set by Article 19(2). Following the SC judgment, intermediaries do not have to administer takedowns without a court order, thereby rendering this content policy redundant. As it stands, the content policy is an obligation that intermediaries must fulfil in order to be exempted from liability for UGC, and this due diligence is limited to publishing rules and regulations, terms and conditions, or a user agreement informing users of the restrictions on content. The penalties for not publishing this content policy should be clarified.</p>
<p class="Standard" style="text-align: justify; ">Further, having been informed of what is permissible, users agree to comply with the policy outlined by signing up to and using these platforms and services. The requirement of publishing a content policy as due diligence is unnecessary, given that mandating such ‘standard’ terms of use negates the difference between types of intermediaries, which accrue different kinds of liability. It also places an extraordinary power of censorship in the hands of the intermediary, which could easily stifle freedom of speech online. Such heavy-handed regulation could make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p class="Standard">Twitter may have complied with its duties by publishing the content policy, though the obligation does not seem to be an effective deterrent. Strong safe harbour provisions for intermediaries are a crucial element in the promotion and protection of the right to freedom of expression online. But absolving platforms of responsibility for UGC so long as they publish a content policy that is vague and subjective is the very reason why India’s IT Rules are, in fact, in urgent need of improvement.</p>
<h3 class="Standard">Size Matters</h3>
<p class="Standard" style="text-align: justify; ">The standards for blocking, reporting and responding to abuse vary across different categories of platforms. For example, it may be easier to counter trolls and abuse on blogs or forums where the owner or an administrator is monitoring comments and UGC. Usually, platforms outline monitoring and reporting policies and procedures, including the recourse available to victims and the action to be taken against violators. However, these measures are not always effective in curbing abuse, as it is possible for users to create new accounts under different usernames. For example, in Swati’s case the anonymous user behind the @LutyensInsider account changed <a href="http://www.hindustantimes.com/newdelhi/twitter-troll-lutyensinsider-changes-handle-after-delhi-journo-files-fir/article1-1357281.aspx">their handle</a> to @gregoryzackim and @gzackim before deleting all tweets. In this case, perhaps the fear of criminal charges ahead was enough to silence the anonymous user, which may not always be the case.</p>
<h3 class="Standard">Tackling the Trolls</h3>
<p class="Standard" style="text-align: justify; ">Most large intermediaries have privacy settings which restrict the audience for user posts and prevent strangers from contacting them, as a general measure against online harassment. Platforms also publish a <a href="http://www.slate.com/articles/technology/bitwise/2015/04/twitter_s_new_abuse_policy_if_it_can_t_stop_it_hide_it.html">monitoring policy</a> outlining the procedures and mechanisms for users to <a href="http://www.slate.com/articles/technology/users/2015/04/twitter_s_new_harassment_policy_not_transparent_not_engaged_with_users.html">register their complaints</a> or <a href="https://blog.twitter.com/2015/update-on-user-safety-features">report abuse</a>. Often, reporting and blocking mechanisms <a href="https://blog.twitter.com/2015/update-on-user-safety-features">rely on community standards</a> and on users reporting unlawful content. Last week Twitter <a href="https://twittercommunity.com/t/removing-the-140-character-limit-from-direct-messages/41348">announced a new feature</a> allowing lists of blocked users to be shared between users. An improvement on the existing blocking mechanism, the feature is aimed at making the service safer for people facing similar issues; still, such efforts may have their limitations.</p>
<p class="Standard" style="text-align: justify; ">These mechanisms follow a one-size-fits-all policy, and such community-driven efforts do not address concerns of differences in opinion and subjectivity. Swati, in defending her actions, stressed the <i>“coarse discourse”</i> prevalent on social media, though as <a href="http://www.opindia.com/2015/06/foul-mouthed-twitter-user-files-fir-against-loud-mouthed-slanderer/">this article points out</a>, she might herself be accused of using offensive and abusive language. Subjectivity and the many possible interpretations of the same opinion can pave the way for many taking offense online. Earlier this month, Nikhil Wagle’s tweets criticising Prime Minister Narendra Modi as a “pervert” were interpreted as “abusive”, “offensive” and “spreading religious disharmony”. While platforms are within their rights to establish policies for dealing with issues faced by users, there is a real danger of them doing so for <a href="http://www.slate.com/articles/technology/users/2015/05/chuck_c_johnson_suspended_from_twitter_why.2.html">“political reasons” and based on “popularity” measures</a>, which may chill free speech. When many get behind a particular interpretation of an opinion, lawful speech may also be stifled, as Sreemoyee Kundu <a href="http://www.dailyo.in/user/124/sreemoyeekundu">found out</a>. A victim of online abuse, she had her account blocked by Facebook owing to multiple reports from a <i>“faceless fanatical mob”</i>. Allowing users to set standards of permissible speech is an improvement, though it runs the risk of mob justice, and platforms need to be vigilant in applying such standards.</p>
<p class="Standard" style="text-align: justify; ">While it may be in the interest of platforms to keep a hands-off approach to community policies, certain kinds of content may necessitate intervention by the intermediary. There has been an increase in private companies modifying their content policies to place reasonable restrictions on certain hateful behaviour in order to protect vulnerable or marginalised voices. <a href="http://www.theguardian.com/technology/2015/mar/12/twitter-bans-revenge-porn-in-user-policy-sharpening">Twitter</a> and <a href="http://www.redditblog.com/2015/05/promote-ideas-protect-people.html">Reddit's</a> policy changes in addressing revenge porn reflect a growing understanding amongst stakeholders that, in order to promote the free expression of ideas, recognition and protection of certain rights on the Internet may be necessary. However, any approach to regulating user content must assess the effect of policy decisions on user rights. Google's <a href="http://www.theguardian.com/technology/2015/jun/22/revenge-porn-women-free-speech-abuse">stand on tackling revenge porn</a> may be laudable, though its <a href="https://www.techdirt.com/articles/20141109/06211929087/googles-efforts-to-push-down-piracy-sites-may-lead-more-people-to-malware.shtml">decision to push down</a> 'piracy' sites in its search results could be seen to adversely impact the choices that users have. Terms of service implemented with subjectivity and a lack of transparency can and do lead to private censorship.</p>
<h3 class="Standard">The Way Forward</h3>
<p class="Standard" style="text-align: justify; ">Harassment is damaging because of the feeling of powerlessness it invokes in victims, and online intermediaries represent new forms of power through which users negotiate and manage their online identities. Content restriction policies and practices must address this power imbalance by adopting baseline safeguards and best practices. It is only fair, based on principles of equality and justice, that intermediaries be held responsible for the damage caused to users by the wrongdoings of other users, or when they fail to carry out their operations and services as prescribed by law. However, in its present state, the intermediary liability regime in India is not sufficient to deal with online harassment and needs to evolve into a more nuanced form of governance.</p>
<p class="Standard" style="text-align: justify; ">Any liability framework must evolve bearing in mind the slippery slope of overbroad regulation and differing standards of community responsibility. Therefore, a balanced framework would need to include elements of both targeted regulation and soft forms of governance as liability regimes need to balance fundamental human rights and the interests of private companies. Often, achieving this balance is problematic given that these companies are expected to be adjudicators and may also be the target of the breach of rights, as is the case in Delfi v Estonia. Global frameworks such as the Manila Principles can be a way forward in developing effective mechanisms. The determination of content restriction practices should always adopt the least restrictive means of doing so, distinguishing between the classes of intermediary. They must evolve considering the proportionality of the harm, the nature of the content and the impact on affected users including the proximity of affected party to content uploader.</p>
<p class="Standard" style="text-align: justify; ">Further, intermediaries and governments should communicate a clear mechanism for the review and appeal of restriction decisions. Content restriction policies should incorporate an effective right to be heard. In exceptional circumstances when this is not possible, a post facto review of the restriction order and its implementation must take place as soon as practicable. Further, unlawful content restricted for a limited duration or within a specific geography must not be restricted beyond those limits, and a periodic review should take place to ensure the continued validity of the restriction. Regular, systematic review of the rules and guidelines governing intermediary liability will go a long way in ensuring that such frameworks are not overly burdensome and remain effective.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse'>http://editors.cis-india.org/internet-governance/blog/role-of-intermediaries-in-counting-online-abuse</a>
</p>
No publisher · jyoti · Online Harassment · Internet Governance · Intermediary Liability · Chilling Effect · Online Abuse · 2015-08-02T16:38:36Z · Blog Entry
Right to Exclusion, Government Spaces, and Speech
http://editors.cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech
<b>The conclusion of the litigation surrounding Trump blocking his critics on Twitter brings to the forefront two less-discussed aspects of intermediary liability: a) whether social media platforms could be compelled to ‘carry’ speech under any established legal principles, thereby limiting their right to exclude users or speech, and b) whether users have a constitutional right to access the social media spaces of elected officials. This essay analyzes these issues under American law, and draws parallels for India in light of the ongoing litigation around the suspension of advocate Sanjay Hegde’s Twitter account.</b>
<p> </p>
<p>This article first appeared on the Indian Journal of Law and Technology (IJLT) blog, and can be accessed <a class="external-link" href="https://www.ijlt.in/post/right-to-exclusion-government-controlled-spaces-and-speech">here</a>. Cross-posted with permission. </p>
<p>---</p>
<h2><span class="s1">Introduction</span></h2>
<p class="p2"><span class="s1">On April 8, the Supreme Court of the United States (SCOTUS) vacated the judgment of the US Court of Appeals for the Second Circuit in <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2"><em>Knight First Amendment Institute v Trump</em></span></a>. In that case, the Court of Appeals had precluded Donald Trump, then POTUS, from blocking his critics from his Twitter account, on the ground that such action amounted to the erosion of his critics' constitutional rights. The Court of Appeals had held that his use of @realDonaldTrump in his official capacity had transformed the nature of the account from private to public, and that, therefore, blocking users he disagreed with amounted to viewpoint discrimination, something incompatible with the First Amendment.<span class="Apple-converted-space"> </span></span></p>
<p class="p2"><span class="s1">The SCOTUS <a href="https://www.supremecourt.gov/opinions/20pdf/20-197_5ie6.pdf"><span class="s2">ordered</span></a> the case to be dismissed as moot, on account of Trump no longer being in office. Justice Clarence Thomas issued a ten-page concurrence that went into additional depth regarding the nature of social media platforms and user rights. It must be noted that the concurrence does not carry any direct precedential weight, since Justice Thomas was not joined in the opinion by any of his colleagues on the bench. However, given that similar questions of public import are currently being deliberated in the ongoing <em>Sanjay Hegde</em> <a href="https://www.barandbench.com/news/litigation/delhi-high-court-sanjay-hegde-challenge-suspension-twitter-account-hearing-july-8"><span class="s2">litigation</span></a> in the Delhi High Court, Justice Thomas’ concurrence might hold some persuasive weight in India. While the facts of these litigations might be starkly different, both are nevertheless characterized by important questions about applying constitutional doctrines to private parties like Twitter and about the supposedly ‘public’ nature of social media platforms.<span class="Apple-converted-space"> </span></span></p>
<p class="p4"><span class="s1">In this essay, we consider the legal questions raised in the opinion as possible learnings for India. In the first part, we analyze the key points raised by Justice Thomas, vis-a-vis the American legal position on intermediary liability and freedom of speech. In the second part, we apply these deliberations to the <em>Sanjay Hegde </em>litigation, as a case-study and a roadmap for future legal jurisprudence to be developed.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">A flawed analogy</span></h2>
<p class="p2"><span class="s1">At the outset, let us briefly refresh the timeline of Trump’s tryst with Twitter, and the history of this litigation: the Court of Appeals decision was <a href="https://int.nyt.com/data/documenthelper/1365-trump-twitter-second-circuit-r/c0f4e0701b087dab9b43/optimized/full.pdf%23page=1"><span class="s2">issued</span></a> in 2019, when Trump was still in office. After the November 2020 Presidential Election, in which he was voted out, his supporters <a href="https://indianexpress.com/article/explained/us-capitol-hill-siege-explained-7136632/"><span class="s2">broke</span></a> into Capitol Hill. Much of the blame for the attack was pinned on Trump’s use of social media channels (including Twitter) to instigate the violence, and following this, Twitter <a href="https://blog.twitter.com/en_us/topics/company/2020/suspension"><span class="s2">suspended</span></a> his account permanently.<span class="Apple-converted-space"> </span></span></p>
<p class="p2"><span class="s1">It is this final fact that drove Justice Thomas’ reasoning. He noted that a private party like Twitter having the power to do away with Trump’s account altogether was at odds with the Court of Appeals’ earlier finding about the public nature of the account. He deployed a hotel analogy to justify this: government officials renting a hotel room for a public hearing on regulation could not kick out a dissenter, but if the same officials gathered informally in the hotel lounge, they would be within their rights to ask the hotel to kick out a heckler. The difference between the two situations would be that <em>“the government controls the space in the first scenario, the hotel, in the latter.”</em> He noted that Twitter’s conduct was similar to the second situation, where it “<em>control(s) the avenues for speech</em>”. Accordingly, he dismissed the idea that the original respondents (the users whose accounts were blocked) had any First Amendment claims against Trump’s initial blocking action, since the ultimate control of the ‘avenue’ lay with Twitter, not Trump.<span class="Apple-converted-space"> </span></span></p>
<p class="p4"><span class="s1">On the facts of the case, however, this analogy was not justified. The Court of Appeals had not concerned itself with the question of private ‘control’ of entire social media spaces, and given the timeline of the litigation, it was impossible for it to pre-empt such considerations within the judgment. In fact, the only takeaway from the original decision had been that an elected representative’s utilization of his social media account for official purposes transformed </span><span class="s3">only that particular space</span><span class="s1"> into a public forum where constitutional rights would find applicability. In delving into questions of ‘control’ and ‘avenues of speech’, issues that had previously been unexplored, Justice Thomas conflates a rather specific point into a much bigger, general conundrum. Further deliberations in the concurrence are accordingly built upon this flawed premise.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Right to exclusion (and must carry claims)</span></h2>
<p class="p2"><span class="s1">From here, Justice Thomas identified the problem to be “<em>private, concentrated control over online content and platforms available to the public</em>”, and brought forth two alternative regulatory models — common carrier and public accommodation — to argue for ‘equal access’ to social media spaces. He posited that successful application of either of the two analogies would effectively restrict a social media platform’s right to exclude its users, and “<em>an answer may arise for dissatisfied platform users who would appreciate not being blocked</em>”. Essentially, this would mean that platforms would be obligated to carry <em>all </em>forms of (presumably) legal speech, and users would be entitled to sue platforms if they feel their content has been unfairly taken down, a phenomenon Daphne Keller <a href="http://cyberlaw.stanford.edu/blog/2018/09/why-dc-pundits-must-carry-claims-are-relevant-global-censorship"><span class="s2">describes</span></a> as ‘must carry claims’.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">Again, this is a strange direction for the argument to proceed in, since the original facts of the case were not about ‘<em>dissatisfied platform users’,</em> but about an elected representative’s account being used in the dissemination of official information. Beyond the initial ‘private’ control deliberation, Justice Thomas did not seem interested in exploring this original legal position, and instead focused on analogizing social media platforms in order to enforce ‘equal access’, finally arriving at a position that would be legally untenable in the USA.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">The American law on intermediary liability, as embodied in Section 230 of the Communications Decency Act (CDA), has two key components: first, intermediaries are <a href="https://www.eff.org/issues/cda230"><span class="s2">protected</span></a> against liability for content posted by their users, under a legal model <a href="https://www.article19.org/wp-content/uploads/2018/02/Intermediaries_ENGLISH.pdf"><span class="s2">termed</span></a> ‘broad immunity’; and second, an intermediary does not stand to lose its immunity if it chooses to moderate and remove speech it finds objectionable, popularly <a href="https://intpolicydigest.org/section-230-how-it-actually-works-what-might-change-and-how-that-could-affect-you/"><span class="s2">known</span></a> as the Good Samaritan protection. It is the combined effect of these two components that allows platforms to take calls on what to remove and what to keep, translating into a ‘right to exclusion’. Legally compelling them to carry speech, under the garb of ‘access’, would therefore strike at the heart of the protection granted by the CDA.<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Learnings for India</span></h2>
<p class="p2"><span class="s1">In his petition to the Delhi High Court, Senior Supreme Court Advocate Sanjay Hegde contended that the suspension of his Twitter account, on the grounds of his sharing anti-authoritarian imagery, was arbitrary, and that:<span class="Apple-converted-space"> </span></span></p>
<ol style="list-style-type: lower-alpha;" class="ol1"><li class="li2"><span class="s1">Twitter was carrying out a public function and would be therefore amenable to writ jurisdiction under Article 226 of the Indian Constitution; and</span></li><li class="li2"><span class="s1">The suspension of his account had amounted to a violation of his right to freedom of speech and expression under Article 19(1)(a) and his rights to assembly and association under Article 19(1)(b) and 19(1)(c); and</span></li><li class="li2"><span class="s1">The government has a positive obligation to ensure that any censorship on social media platforms is done in accordance with Article 19(2).<span class="Apple-converted-space"> </span></span></li></ol>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">The first two prongs of the original petition are perhaps easily disputed: as previous <a href="https://indconlawphil.wordpress.com/2020/01/28/guest-post-social-media-public-forums-and-the-freedom-of-speech-ii/"><span class="s2">commentary</span></a> has pointed out, existing Indian constitutional jurisprudence on ‘public function’ does not implicate Twitter, and accordingly, it would be difficult to make out a case that account suspensions, no matter how arbitrary, amount to a violation of the user’s fundamental rights. It is the third contention that requires some additional insight in the context of our previous discussion.<span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Does the Indian legal system support a right to exclusion?<span class="Apple-converted-space"> </span></span></h3>
<p class="p2"><span class="s1">Suing Twitter to reinstate a suspended account, on the ground that such suspension was arbitrary and illegal, is in essence a request to limit Twitter’s right to exclude its users. The petition serves as an example of a must-carry claim in the Indian context and vindicates Justice Thomas’ (misplaced) defence of ‘<em>dissatisfied platform users</em>’. Legally, such claims perhaps have a better chance of succeeding here, since the expansive protection granted to intermediaries via Section 230 of the CDA is noticeably absent in India. Instead, intermediaries are bound by conditional immunity, where availment of a ‘safe harbour’, i.e., exemption from liability, is contingent on fulfilment of statutory conditions prescribed under <a href="https://indiankanoon.org/doc/844026/"><span class="s2">section 79</span></a> of the Information Technology (IT) Act and the rules made thereunder. Interestingly, in his opinion, Justice Thomas briefly considered a situation where the immunity under Section 230 was made conditional: to gain Good Samaritan protection, platforms might be required to fulfil specific conditions, including ‘nondiscrimination’. This is controversial (and as commentators have noted, <a href="https://www.lawfareblog.com/justice-thomas-gives-congress-advice-social-media-regulation"><span class="s2">wrong</span></a>), since it has the potential to whittle down the US’ ‘broad immunity’ model of intermediary liability to a system that would resemble the Indian one.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">It is worth noting that in the newly issued Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, the proviso to Rule 3(1)(d) allows for “<em>the removal or disabling of access to any information, data or communication link [...] under clause (b) on a voluntary basis, or on the basis of grievances received under sub-rule (2) [...]</em>” without dilution of statutory immunity. This does provide intermediaries a right to exclude, albeit a limited one, since its scope is restricted to content removed under the operation of specific sub-clauses within the rules, as opposed to Section 230, which is couched in more general terms. Of course, none of this precludes the government from further prescribing obligations similar to those prayed for in the petition.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">On the other hand, it is a difficult proposition to support that Twitter’s right to exclusion should be circumscribed by the Constitution, as prayed for. In the petition, this argument is built on the judgment in <a href="https://indiankanoon.org/doc/110813550/"><span class="s2"><em>Shreya Singhal v Union of India</em></span></a>, where it was held that takedowns under section 79 are to be carried out only on receipt of a court order or a government notification, and that the scope of such an order would be restricted to Article 19(2). This, the petitioner contended, meant that “<em>any suo-motu takedown of material by intermediaries must conform to Article 19(2)</em>”.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">To understand why this argument does not work, it is important to consider the context in which the <em>Shreya Singhal </em>judgment was issued. Previously, intermediary liability was governed by the Information Technology (Intermediaries Guidelines) Rules, 2011, issued under section 79 of the IT Act. Rule 3(4) made provisions for sending takedown orders to the intermediary, and the prerogative to send such orders lay with ‘<em>an affected person</em>’. On receipt of these orders, the intermediary was bound to remove content, and neither the intermediary nor the user whose content was being censored had the opportunity to dispute the takedown.</span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">As a result, the potential for misuse was wide open. Rishabh Dara’s <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf"><span class="s2">research</span></a> provided empirical evidence for this: intermediaries were found to act on flawed takedown orders out of apprehension of being sanctioned under the law, essentially chilling free expression online. The <em>Shreya Singhal</em> judgment, in essence, reined in this misuse by stating that an intermediary is legally obliged to act <em>only when </em>a takedown order is sent by the government or a court. The intent of this was, in the court’s words: “<em>it would be very difficult for intermediaries [...] to act when millions of requests are made and the intermediary is then to judge as to which of such requests are legitimate and which are not.</em>”<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p5"><span class="s1">In light of this, if Hegde’s petition succeeds, intermediaries would be obligated to subsume the entirety of Article 19(2) jurisprudence in their decision-making, interpret and apply it perfectly, and be open to petitions from users when they fail to do so. This would be a startling undoing of the court’s original intent in <em>Shreya Singhal</em>. Such a reading also means limiting an intermediary’s prerogative to remove speech that may not necessarily fall within the scope of Article 19(2) but is still systemically problematic, including unsolicited commercial communications. Further, most platforms today are dealing with an unprecedented spread and consumption of harmful, misleading information. By limiting their right to exclude speech in this manner, we might be <a href="https://www.hoover.org/sites/default/files/research/docs/who-do-you-sue-state-and-platform-hybrid-power-over-online-speech_0.pdf"><span class="s2">exacerbating</span></a> this problem.<span class="Apple-converted-space"> </span></span></p>
<h3><span class="s1">Government-controlled spaces on social media platforms</span></h3>
<p class="p2"><span class="s1">On the other hand, the original finding of the Court of Appeals, regarding the public nature of an elected representative’s social media account and the First Amendment rights of the people to access such an account, might still prove instructive for India. While the primary SCOTUS order erases the precedential weight of the original case, similar judgments have been issued by other courts in the USA, including by the <a href="https://globalfreedomofexpression.columbia.edu/cases/davison-v-randall/"><span class="s2">Fourth Circuit</span></a> court and as a result of a <a href="https://knightcolumbia.org/content/texas-attorney-general-unblocks-twitter-critics-in-knight-institute-v-paxton"><span class="s2">lawsuit</span></a> against the Texas Attorney General.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p4"><span class="s1">A similar situation can be envisaged in India as well. The Supreme Court has <a href="https://indiankanoon.org/doc/591481/"><span class="s2">repeatedly</span></a> <a href="https://indiankanoon.org/doc/27775458/"><span class="s2">held</span></a> that Article 19(1)(a) encompasses not just the right to disseminate information, but also the right to <em>receive </em>information, including <a href="https://indiankanoon.org/doc/438670/"><span class="s2">receiving</span></a> information on matters of public concern. Additionally, in <a href="https://indiankanoon.org/doc/539407/"><span class="s2"><em>Secretary, Ministry of Information and Broadcasting v Cricket Association of Bengal</em></span></a>, the Court held that the right of dissemination includes the right of communication through any media: print, electronic or audio-visual. If we assume, then, that government-controlled spaces on social media platforms, used in the dissemination of official functions, are ‘public spaces’, the government’s denial of public access to such spaces can be construed as a violation of Article 19(1)(a).<span class="Apple-converted-space"> </span></span></p>
<h2><span class="s1">Conclusion</span></h2>
<p class="p2"><span class="s1">As indicated earlier, despite the facts of the two litigations being different, the legal questions embodied within them converge startlingly, inasmuch as both are examples of the growing discontent around the power wielded by social media platforms, and of the flawed attempts at fixing it.<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">While the above discussion might throw some light on the relationship between an individual, the state and social media platforms, many questions remain unanswered. For instance, once we establish that users have a fundamental right to access certain spaces within a social media platform, does the platform have a right to remove that space altogether? If it does so, would a constitutional remedy lie against the platform? Initial <a href="https://indconlawphil.wordpress.com/2018/07/01/guest-post-social-media-public-forums-and-the-freedom-of-speech/"><span class="s2">commentary</span></a> on the Court of Appeals’ decision had contested that the takeaway from that judgment was that constitutional norms had primacy over the platform’s own norms of governance. In that light, would the platform be constitutionally obligated to <em>not </em>suspend a government account, even if the content on such an account continues to be harmful, in violation of its own moderation standards?<span class="Apple-converted-space"> </span></span></p>
<p class="p3"><span class="s1"></span></p>
<p class="p2"><span class="s1">This is an incredibly tricky dimension of the law, made trickier still by the dynamic nature of the platforms, the intense political interests permeating the need for governance, and the impacts on users in the instance of a flawed solution. Continuous engagement, scholarship and emphasis on having a human rights-respecting framework underpinning the regulatory system, are the only ways forward.<span class="Apple-converted-space"> </span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space">---</span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"><br /></span></span></p>
<p class="p2"><span class="s1"><span class="Apple-converted-space"></span></span></p>
<p>The author would like to thank Gurshabad Grover and Arindrajit Basu for reviewing this piece. </p>
<div> </div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech'>http://editors.cis-india.org/internet-governance/blog/right-to-exclusion-government-spaces-and-speech</a>
</p>
No publisher · TorShark · Freedom of Speech and Expression · Intermediary Liability · Information Technology · 2021-07-02T12:05:13Z · Blog Entry
Rethinking the intermediary liability regime in India
http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india
<b>The article consolidates some of our broad thematic concerns with the draft amendments to the intermediary liability rules, published by MeitY last December.</b>
<p>The blog post by Torsha Sarkar was <a class="external-link" href="https://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">published by CyberBRICS</a> on August 12, 2019.</p>
<hr />
<h3 style="text-align: justify; ">Introduction</h3>
<p style="text-align: justify; ">In December 2018, the Ministry of Electronics and Information Technology (“MeitY”) released the Intermediary Liability Guidelines (Amendment) Rules (“the Guidelines”), which would significantly alter the intermediary liability regime in the country. While the Guidelines have drawn a considerable amount of attention and criticism, from the perspective of the government, the change has been overdue.</p>
<p style="text-align: justify; ">The Indian government has been determined to overhaul the pre-existing safe harbour regime since last year. The draft <a href="https://www.medianama.com/wp-content/uploads/Draft-National-E-commerce-Policy.pdf">version</a> of the e-commerce policy, which was leaked last year, also hinted at similar plans. As the effects of mass dissemination of disinformation, propaganda and hate speech around the world spill over into offline harms, governments have been increasingly looking to enact interventionist laws that place more responsibility on intermediaries. India has not been an exception.</p>
<p style="text-align: justify; ">A major source of such harmful and illegal content in India is the popular communications app WhatsApp, despite the company’s enactment of several anti-spam measures over the past few years. Last year, rumours circulating on WhatsApp prompted a series of lynchings. In May, Reuters <a href="https://in.reuters.com/article/india-election-socialmedia-whatsapp/in-india-election-a-14-software-tool-helps-overcome-whatsapp-controls-idINKCN1SL0PZ" rel="noreferrer noopener" target="_blank">reported</a> that clones and software tools were available in the market at minimal cost, allowing politicians and other interested parties to bypass these measures and continue the trend of bulk messaging.</p>
<p style="text-align: justify; ">This series of incidents has made it clear that disinformation is a very real problem, and that the current regulatory framework is not enough to address it. The government’s response, accordingly, has been to introduce the Guidelines. This rationale also finds a place in its preliminary <a href="https://www.meity.gov.in/comments-invited-draft-intermediary-rules" rel="noreferrer noopener" target="_blank">statement of reasons</a>.</p>
<p style="text-align: justify; ">While enactment of such interventionist laws has triggered fresh rounds of debate on free speech and censorship, it would be wrong to say that such laws were completely one-sided, or uncalled for.</p>
<p style="text-align: justify; ">On one hand, automated amplification and online mass circulation of purposeful disinformation, propaganda, of terrorist attack videos, or of plain graphic content, are all problems that the government would concern itself with. On the other hand, several online companies (including <a href="https://www.blog.google/outreach-initiatives/public-policy/oversight-frameworks-content-sharing-platforms/" rel="noreferrer noopener" target="_blank">Google</a>) also seem to be in an uneasy agreement that simple self-regulation of content would not cut it. For better oversight, more engagement with both government and civil society members is needed.</p>
<p style="text-align: justify; ">In March this year, Mark Zuckerberg wrote an <a href="https://www.washingtonpost.com/opinions/mark-zuckerberg-the-internet-needs-new-rules-lets-start-in-these-four-areas/2019/03/29/9e6f0504-521a-11e9-a3f7-78b7525a8d5f_story.html?utm_term=.4d177c66782f" rel="noreferrer noopener" target="_blank">op-ed</a> for the Washington Post, calling for more government involvement in the process of content regulation on his platform. While it would be interesting to consider how Zuckerberg’s view aligns with those of the similarly placed, it would nevertheless be correct to say that online intermediaries are under more pressure than ever to keep their platforms clean of content that is ‘illegal, harmful, obscene’. And this list only grows.</p>
<p style="text-align: justify; ">That being said, the criticism from several stakeholders is sharp and clear in instances of such laws being enacted – be it the ambitious <a href="https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf" rel="noreferrer noopener" target="_blank">NetzDG</a>, aimed at combating Nazi propaganda, hate speech and fake news, or the controversial new European Copyright Directive, which has been welcomed by journalists but severely critiqued by online content creators and platforms as detrimental to user-generated content.</p>
<p style="text-align: justify; ">In the backdrop of such conflicting interests on online content moderation, it would be useful to examine the Guidelines released by MeitY. In the first portion we would be looking at certain specific concerns existing within the rules, while in the second portion, we would be pushing the narrative further to see what an alternative regulatory framework may look like.</p>
<p style="text-align: justify; ">Before we jump to the crux of this discussion, one important disclosure must be made about the underlying ideology of this piece. It would be unrealistic to claim that the internet should be absolutely free from regulation. Swathes of content on child sexual abuse, or terrorist propaganda, or even the hordes of death and rape threats faced by women online are and should be concerns of a civil society. While that is certainly a strong driving force for regulation, this concern should not override the basic considerations for human rights (including freedom of expression). These ideas would be expanded a bit more in the upcoming sections.</p>
<h3 style="text-align: justify; ">Broad, thematic concerns with the Rules</h3>
<h3 style="text-align: justify; ">A uniform mechanism of compliance</h3>
<h3 style="text-align: justify; ">Timelines</h3>
<p style="text-align: justify; ">Rule 3(8) of the Guidelines mandates intermediaries, prompted by <em>a</em> <em>court order or a government notification</em>, to take down content relating to unlawful acts within 24 hours of such notification. In case they fail to do so, the safe harbour applicable to them under section 79 of the Information Technology Act (“the Act”) would cease to apply, and they would be liable. Prior to the amendment, this timeframe was 36 hours.</p>
<p style="text-align: justify; ">There is a visible lack of research which could rationalize that a 24-hour timeline for compliance is the optimal framework for <em>all</em> intermediaries, irrespective of the kind of services they provide, or the sizes or resources available to them. As the Mozilla Foundation has <a href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/" rel="noreferrer noopener" target="_blank">commented</a>, regulation of illegal content online simply cannot be done in a one-size-fits-all approach, nor can <a href="https://blog.mozilla.org/netpolicy/2019/04/10/uk_online-harms/" rel="noreferrer noopener" target="_blank">regulation be made</a> with only the tech incumbents in mind. While platforms like YouTube can comfortably <a href="https://www.bmjv.de/SharedDocs/Pressemitteilungen/DE/2017/03142017_Monitoring_SozialeNetzwerke.html" rel="noreferrer noopener" target="_blank">remove</a> criminally prohibited content within a span of 24 hours, this can still place a large burden on smaller companies, which may not have the necessary resources to comply within this timeframe. A few unintended consequences would arise out of this situation.</p>
<p style="text-align: justify; ">One, sanctions under the Act, which would include both organisational ramifications like website blocking (under section 69A of the Act) and individual liability, would affect the smaller intermediaries more than the bigger ones. A bigger intermediary like Facebook may be able to withstand a large fine for its failure to control, say, hate speech on its platform. That may not be true for a smaller online marketplace, or even a smaller social media site targeted towards a very specific community. This compliance mechanism, accordingly, may just go on to strengthen the larger companies and eliminate competition from the smaller ones.</p>
<p style="text-align: justify; ">Two, intermediaries, in fear of heavy criminal sanctions, would err on the side of caution. This would mean that the process of determining whether a piece of content is illegal would become hastier and less nuanced. Legitimate speech would thus be at risk of censorship, and intermediaries would pay <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf" rel="noreferrer noopener" target="_blank">less heed</a> to the technical requirements or the correct legal procedures required for content takedown.</p>
<h3 style="text-align: justify; ">Utilization of ‘automated technology’</h3>
<p style="text-align: justify; ">Another place where the Guidelines assume that all intermediaries operating in India are on the same footing is Rule 3(9). This mandates these entities to proactively monitor for ‘unlawful content’ on their platforms. Aside from the unconstitutionality of this provision, it also assumes that all intermediaries would have the requisite resources to set up and successfully operate such a tool. YouTube’s ContentID, which began in 2007, had already seen a whopping <a href="https://www.blog.google/outreach-initiatives/public-policy/protecting-what-we-love-about-internet-our-efforts-stop-online-piracy/" rel="noreferrer noopener" target="_blank">100 million dollars in investment by 2018</a>.</p>
<p style="text-align: justify; ">Funnily enough, ContentID is a tool exclusively dedicated to finding copyright violations of rights-holders’ content, and even then, it has proven to be far from <a href="https://www.plagiarismtoday.com/2019/01/10/youtubes-copyright-insanity/" rel="noreferrer noopener" target="_blank">infallible</a>. The Guidelines’ sweeping net of ‘unlawful’ content includes far more categories than mere violations of IP rights, and the framework assumes that intermediaries would be able to set up and run an automated tool that would filter through <em>all</em> these categories of ‘unlawful content’ in one go.</p>
<h3 style="text-align: justify; ">The problems of AI</h3>
<p style="text-align: justify; ">Aside from the implementation-related concerns, there are also technical challenges associated with Rule 3(9). Supervised learning systems (like the one envisaged under the Guidelines) use training data sets for proactive filtering. This means that if the system is taught that ten instances of input A produce output B, then the eleventh time it sees A, it will give the output B. In the lingo of content filtering, the system would be taught, for example, that nudity is bad. The next time the system encounters nudity in a picture, it would automatically flag it as ‘bad’ and violating the community standards.</p>
<p style="text-align: justify; "><a href="https://www.theguardian.com/technology/2016/sep/08/facebook-mark-zuckerberg-napalm-girl-photo-vietnam-war" rel="noreferrer noopener" target="_blank">Except, that is not how it should work</a>. For every post that is under the scrutiny of the platform operators, numerous nuances and contextual cues act as mitigating factors, none of which, at this point, would be <a href="https://scholarship.law.nd.edu/cgi/viewcontent.cgi?referer=https://www.google.co.in/&httpsredir=1&article=1704&context=ndlr" rel="noreferrer noopener" target="_blank">understandable</a> by a machine.</p>
<p style="text-align: justify; ">Additionally, the training data used to feed the system <a href="https://www.cmu.edu/dietrich/philosophy/docs/london/IJCAI17-AlgorithmicBias-Distrib.pdf" rel="noreferrer noopener" target="_blank">can be biased</a>. A self-driving car that is fed training data from only one region of the country would learn the customs and driving norms of that particular region, not the patterns needed to drive throughout the country.</p>
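The supervised-learning behaviour described above can be sketched with a toy keyword filter. This is a purely illustrative sketch (no real platform's moderation system works this simply, and all names here are made up): it learns word–label associations from training examples and then applies them blindly, with no sense of context.

```python
# Toy illustration of context-blind supervised filtering.
# All names here are hypothetical; no real moderation system is this simple.
from collections import Counter

def train(examples):
    """Count how often each word appears in posts labelled as violating."""
    counts = Counter()
    for text, violating in examples:
        if violating:
            counts.update(text.lower().split())
    return counts

def flag(model, text, threshold=1):
    """Flag a post if it contains enough words seen in violating examples."""
    score = sum(model[word] for word in text.lower().split())
    return score >= threshold

# Ten instances where input A ("nudity") was labelled bad...
examples = [("post contains nudity", True)] * 10 + [("holiday photos", False)]
model = train(examples)

# ...so the eleventh time the system sees A, it flags it:
print(flag(model, "post contains nudity"))
# The same blind rule fires even when context changes everything,
# e.g. a historic war photograph that happens to depict nudity:
print(flag(model, "Pulitzer-winning war photograph depicting nudity"))
```

Both calls print True: the filter cannot distinguish the two cases, which is precisely the contextual blindness the preceding paragraphs describe.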
<p style="text-align: justify; ">Lastly, it is not suggested that bias would be completely eliminated if content moderation were undertaken by a human. However, the difference between a human moderator and an automated one is that there would be a measure of accountability with the former. The decision of a human moderator can be disputed, and the moderator would have a chance to explain their reasons for the removal. Artificial intelligence (“AI”), by contrast, is characterized by the algorithmic ‘<a href="http://raley.english.ucsb.edu/wp-content/Engl800/Pasquale-blackbox.pdf" rel="noreferrer noopener" target="_blank">black box</a>’ that processes inputs and generates usable outputs. Implementing workable accountability standards for such a system, including figuring out appeal and grievance redressal mechanisms in cases of dispute, are all problems that the regulator must concern itself with.</p>
<p style="text-align: justify; ">In the absence of any clarity or revision, it seems unlikely that the provision would actually ever see full implementation. Neither would the intermediaries know what kind of ‘automated technology’ they are supposed to use for filtering ‘unlawful content’, nor would there be any incentives for them to actually deploy this system effectively for their platforms.</p>
<h3 style="text-align: justify; ">What can be done?</h3>
<p style="text-align: justify; ">First, more research is needed to understand the effect of compliance timeframes on the accuracy of content takedown. Several jurisdictions now operate on different timeframes of compliance, and regulation would be far more holistic if the government considered the dialogue around each of them and what it means for India.</p>
<p style="text-align: justify; ">Second, it might be useful to consider the concept of an independent regulator as an alternative and as a compromise between pure governmental regulation (which is more or less what the system is) or self-regulation (which the Guidelines, albeit problematically, also espouse through Rule 3(9)).</p>
<p style="text-align: justify; ">The <a href="https://www.gov.uk/government/consultations/online-harms-white-paper" rel="noreferrer noopener" target="_blank">UK Online Harms White Paper</a>, an important document in the ongoing liability overhaul, proposes an arm’s-length regulator who would be responsible for drafting codes of conduct for online companies and for their enforcement. While the exact merits of the system are still up for debate, the concept of having a separate body to oversee, formulate and possibly <a href="https://medium.com/adventures-in-consumer-technology/regulating-social-media-a-policy-proposal-a2a25627c210" rel="noreferrer noopener" target="_blank">arbitrate</a> disputes regarding content removal is finding traction in several parallel developments.</p>
<p style="text-align: justify; ">One of the Transatlantic Working Group sessions discussed this idea in terms of having an ‘<a href="https://medium.com/whither-news/proposals-for-reasonable-technology-regulation-and-an-internet-court-58ac99bec420" rel="noreferrer noopener" target="_blank">internet court</a>’ for illegal content regulation. This would have the noted advantages of a) formulating norms of online content in a transparent, public fashion, something previously done behind the closed doors of either the government or the tech incumbents, and b) having specially trained professionals who would be able to dispose of matters in an expeditious manner.</p>
<p style="text-align: justify; ">India is not unfamiliar with the idea of specialized tribunals or quasi-judicial bodies for dealing with specific challenges. In 2015, for example, the Government of India passed the Commercial Courts Act, under which specific courts were tasked with dealing with commercial disputes of very large value. This is neither an isolated instance of the government choosing to create new bodies to deal with a specific problem, nor would it be inimitable in the future.</p>
<p style="text-align: justify; ">There is no <a href="https://www.thehindubusinessline.com/opinion/resurrecting-the-marketplace-of-ideas/article26313605.ece" rel="noreferrer noopener" target="_blank">silver bullet</a> when it comes to moderation of content on the web. However, in light of this parallel convergence of ideas, the appeal of an independent regulatory system, as a sane compromise between complete government control and <em>laissez-faire</em> autonomy, is worth considering.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india'>http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india</a>
</p>
No publisher · torsha · Internet Governance · Intermediary Liability · Artificial Intelligence · 2019-08-16T01:49:47Z · Blog Entry
Response to the Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018
http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018
<b>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</b>
<p>This document presents the Centre for Internet & Society (CIS) response to the Ministry of Electronics and Information Technology’s invitation to comment and suggest changes to the draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018 (hereinafter referred to as the “draft rules”) published on December 24, 2018. CIS is grateful for the opportunity to put forth its views and comments. This response was sent on January 31, 2019.</p>
<p>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</p>
<p><span style="text-align: start; float: none;">The response can be accessed <a href="https://cis-india.org/internet-governance/resources/Intermediary%20Liability%20Rules%202018.pdf">here</a>.<br /></span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018'>http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018</a>
</p>
No publisher · Gurshabad Grover, Elonnai Hickok, Arindrajit Basu, Akriti · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-02-07T08:06:41Z · Blog Entry
Report on CIS' Workshop at the IGF: 'An Evidence Based Framework for Intermediary Liability'
http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf
<b>'An evidence based framework for intermediary liability' was organised to present evidence and discuss ongoing research on the changing definition, function and responsibilities of intermediaries across jurisdictions.</b>
<p style="text-align: justify; ">The discussion from the workshop will contribute to a comprehensible framework for liability, consistent with the capacity of the intermediary and with international human-rights standards.</p>
<p style="text-align: justify; ">Electronic Frontier Foundation (USA), Article 19 (UK) and the Centre for Internet and Society (India) have come together to develop best practices and principles for the regulation of online content through intermediaries. The nine principles are: Transparency, Consistency, Clarity, Mindful Community Policy Making, Necessity and Proportionality in Content Restrictions, Privacy, Access to Remedy, Accountability, and Due Process in both Legal and Private Enforcement. The session was hosted by the Centre for Internet and Society (India) and the Centre for Internet and Society, Stanford (USA), and was attended by 7 speakers and 40 participants.</p>
<p style="text-align: justify; ">Jeremy Malcolm, Senior Global Policy Analyst at EFF, kicked off the workshop by highlighting the need to develop a liability framework for intermediaries that derives from an understanding of their different functions, their role within the economy and their impact on human rights. He then structured the discussion that followed to focus on ongoing projects and examples highlighting central issues related to gathering and presenting evidence to inform the policy space.</p>
<p style="text-align: justify; ">Martin Husovec from the International Max Planck Research School for Competition and Innovation began his presentation by tracking the development of safe harbour frameworks within social contract theory. Opining that safe harbour was created as a balancing mechanism between a return on the investments of right holders and the public interest in the Internet as a public space, he introduced emerging claims that technological advancement has altered this equilibrium. Citing injunctions and private lawsuits as instruments often used against law-abiding intermediaries, he pointed to a problem within existing liability frameworks: even intermediaries who diligently deal with illegitimate content on their services can still be subject to forced cooperation for the benefit of right holders. He added that for liability frameworks to be effective, they must keep pace with advances in technology and be fair to right holders and the public interest.</p>
<p style="text-align: justify; ">He also pointed out that in any liability framework, the ‘law’ that prescribes an interference must always be sufficiently clear and foreseeable, as to both the meaning and nature of the applicable measures, so that it sufficiently outlines the scope and manner of exercise of the power to interfere with the guaranteed rights. He illustrated this with the example of the German Federal Supreme Court’s attempts at Wi-Fi policy-making in 2010. He also raised the issue of the costs of uncertainty in relying on courts as the only means of balancing rights, as they often do not have the necessary information. Similarly, society does not benefit from open-ended accountability of intermediaries, and he called for a balanced approach to regulation.</p>
<p style="text-align: justify; ">The need for consistency in liability regimes across jurisdictions was raised by Giancarlo Frosio, Intermediary Liability Fellow at Stanford's Centre for Internet and Society. He introduced the World Intermediary Liability Map, a project mapping legislation and case law across 70 countries to create a repository of information that informs policymaking and helps create accountability. Highlighting key takeaways from his research, he stressed the necessity of clear definitions in the field of intermediary liability and the need to develop a taxonomy of issues, to deepen our understanding of what is at stake and of the type of liability appropriate for a particular jurisdiction.</p>
<p style="text-align: justify; ">Nicolo Zingales, Assistant Professor of Law at Tilburg University, highlighted the need for due process and safeguards for human rights, and called for more user involvement in the systems in place in different countries to respond to takedown requests. Presenting his research findings, he pointed to the imbalance in the way notice-and-takedown regimes are structured, where content is taken down presumptively, but the possibility of restoring user content is provided only at a subsequent stage, or in many cases not at all. He cited several ways of enhancing user participation in liability mechanisms, including notice-and-notice, strict litigation sanctions where knowledge that the content might have been legal can be inferred, shifting the presumption in favour of users, and the reverse notice-and-takedown procedure. He also raised the important question of whether multistakeholder cooperation is sufficient or adequate to enable users to have a say and enter as part of the social construct in this space. Reminding participants of the failure of the multistakeholder agreement process in the UK regarding the cost of filters that would be imposed according to judicial procedure, he called for strengthening our efforts to enable users to get more involved in protecting their rights online.</p>
<p style="text-align: justify; ">Gabrielle Guillemin from Article 19 presented her research on the types of intermediaries and the models of liability in place across jurisdictions. Pointing to the problems associated with intermediaries having to monitor content and determine its legality, she called for procedural safeguards and stressed the need to place disputes back in the hands of users, content owners and the person who has written the content, rather than the intermediary. She went on to provide some useful and practically-grounded solutions for strengthening existing takedown mechanisms, including adding details to the notices, introducing fees tied to the number of claims made, and defining procedures regarding criminal content.</p>
<p style="text-align: justify; ">Elonnai Hickok introduced CIS' research for the UNESCO report <em>Fostering Freedom Online: The Role of Internet Intermediaries</em>, comparing a range of liability models at different stages of development and provisions across jurisdictions. She argued for a liability framework that tackles procedural and regulatory uncertainty, lack of due process, lack of remedy and varying content criteria.</p>
<p style="text-align: justify; ">Francisco Vera, Advocacy Director at Derechos Digitales (Chile), raised issues related to mindful community policy-making, expounding on Chile's implementation of its intermediary liability obligations towards the USA, and on the introduction of judicial oversight under Chilean legislation, which led to a US objection that Chile was not fulfilling its standards of Internet property protection. He highlighted the tensions that arise in balancing the needs of the multiple communities and interests engaged over common resources, and stressed the need for evidence in policy-making, both to balance the needs of right holders and the public interest and to ensure policy keeps pace with technological developments, citing the example of the ongoing Trans-Pacific Partnership Agreement negotiations that call for exporting DMCA provisions to 11 countries even though there is no evidence that the system serves the public interest. He concluded by cautioning against the development of frameworks that are, or have the potential to be, used as anti-competitive mechanisms that curtail innovation and thereby do not serve the public interest.</p>
<p style="text-align: justify; ">Malcolm Hutty, associated with the European Internet Service Providers Association and the London Internet Exchange, and Chair of the Intermediary Reliability Committee, brought the intermediaries' perspective into the discussion. He argued for challenging the link between liability and forced cooperation, and highlighted the problems arising from distinctions without a difference and from the incentives built into existing regimes. He raised the issues arising from the expectation, on the part of those engaged in pre-emptive regulation of unwanted or undesirable content, that intermediaries should automate content moderation. Pointing to the increasing impact of intermediaries in our lives, he underscored how exposing vast areas of people's lives to regulatory enforcement, which enhances the power of the state to implement public policy in the public interest and expect it to be executed, can have both positive and negative implications for issues such as privacy and freedom of expression.</p>
<p style="text-align: justify; ">He called out practices in regulatory regimes that focus on one-size-fits-all solutions, such as seeking automated filters on a massive scale, and instead called for context- and content-specific solutions that factor in the commercial imperatives of intermediaries. He also addressed the economic consequences of liability frameworks for the industry, including the cost-effectiveness of balancing rights, the barriers to investment that arise for heavily regulated or new types of online services likely to be targeted for specific enforcement measures, and the long-term costs of adapting old enforcement mechanisms while networks need to be updated to extend services to users.</p>
<p style="text-align: justify; ">The workshop presented evidence of a variety of approaches and of the issues that arise in applying those approaches to impose liability on intermediaries. Two choices emerged for developing frameworks for enforcing responsibility on intermediaries. We could rely on a traditional approach: essentially court-based and offline mechanisms for regulating behaviour and disputes. The downside is that this will be slow and costly to the public purse. In particular, we will lose much of the opportunity to extend regulation much more deeply into people's lives so as to implement the public interest.<br /><br />Alternatively, we could rely on intermediaries to develop and automate systems to control our online behaviour. While this approach does not suffer from the efficiency problems of the former, it has its own shortcomings, both in hindering the development of the Information Society and in potentially yielding up many of the protections traditionally expected in a free and liberal society. The right approach lies somewhere in the middle, and the development of International Principles for Intermediary Liability, announced at the end of the workshop, is a step closer to a balanced framework for liability.</p>
<hr />
<p>See the <a class="external-link" href="http://www.intgovforum.org/cms/174-igf-2014/transcripts/1968-2014-09-03-ws206-an-evidence-based-liability-policy-framework-room-5">transcript on IGF website</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf'>http://editors.cis-india.org/internet-governance/report-on-cis-workshop-at-igf</a>
</p>
No publisher · jyoti · Privacy · Freedom of Speech and Expression · Internet Governance Forum · Internet Governance · Intermediary Liability · 2014-09-24T10:47:30Z · Blog Entry
Rebuttal of DIT's Misleading Statements on New Internet Rules
http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries
<b>The press statement issued on May 11 by the Department of Information Technology (DIT) on the furore over the newly-issued rules on 'intermediary due diligence' is misleading and is, in places, plainly false. We are presenting a point-by-point rebuttal of the DIT's claims.</b>
<p>In its <a class="external-link" href="http://pib.nic.in/newsite/erelease.aspx?relid=72066">press release on Wednesday, May 11, 2011</a>, the DIT stated:</p>
<blockquote>The
<blockquote>The
attention of Government has been drawn to news items in a section of
media on certain aspects of the Rules notified under Section 79
pertaining to liability of intermediaries under the Information
Technology Act, 2000. These items have raised two broad issues. One is
that words used in Rules for objectionable content are broad and could
be interpreted subjectively. Secondly, there is an apprehension that the
Rules enable the Government to regulate content in a highly subjective
and possibly arbitrary manner. <br /></blockquote>
<p>There are actually more issues than merely "subjective interpretation" and "arbitrary governmental regulation".</p>
<ul><li style="list-style-type: disc;">The Indian Constitution limits how much the government can regulate citizens’ fundamental right to freedom of speech and expression. Any measure that runs afoul of the Constitution is invalid.</li><li style="list-style-type: disc;">Several portions of the rules are beyond the limited powers that Parliament granted the Department of IT to create interpretive rules under the Information Technology Act. Parliament directed the Government merely to define what “due diligence” requirements an intermediary would have to follow in order to claim the qualified protection against liability that Section 79 of the Information Technology Act provides; these current rules have gone dangerously far beyond that, by insisting that intermediaries, without investigation, remove content within 36 hours of receipt of a complaint, keep records of users' details, and provide them to law enforcement officials.</li></ul>
<blockquote>The Department of Information Technology (DIT), Ministry of Communications & IT has clarified that the Intermediaries Guidelines Rules, 2011 prescribe that due diligence need to be observed by the Intermediaries to enjoy exemption from liability for hosting any third party information under Section 79 of the Information Technology Act, 2000. These due diligence practices are the best practices followed internationally by well-known mega corporations operating on the Internet. The terms specified in the Rules are in accordance with the terms used by most of the Intermediaries as part of their existing practices, policies and terms of service which they have published on their website.</blockquote>
<ol><li>We are not aware of any country that actually goes to the extent of
deciding what Internet-wide ‘best practices’ are and actually converting
those ‘best practices’ into law by prescribing a universal terms of
service that all Internet services, websites, and products should enforce.</li><li>The Rules require all intermediaries to include the
government-prescribed terms in an agreement, no matter what services
they provide. It is one thing for a company to choose the terms of its
terms of service agreement, and completely another for the government to
dictate those terms of service. As long as the terms of service of an
intermediary are not unlawful or bring up issues of users’ rights (such
as the right to privacy), there is no reason for the government to jump
in and dictate what the terms of service should or should not be.</li><li>The DIT has not offered any proof to back up its assertion that 'most'
intermediaries already have such terms. Google, a ‘mega corporation’
which is an intermediary, <a class="external-link" href="http://www.google.com/accounts/TOS?hl=en">does not have such an overarching policy</a>. Indiatimes, another ‘mega
corporation’ intermediary, <a class="external-link" href="http://www.indiatimes.com/policyterms/1555176.cms">does not either</a>. Just because <a class="external-link" href="http://www.rediff.com/termsofuse.html">a
company like Rediff</a> and <a class="external-link" href="http://us.blizzard.com/en-us/company/legal/wow_tou.html">
Blizzard's World of Warcraft</a> have some of those terms does not mean a) that they should have all of those terms, nor that b) everyone else should as well.<br /><br />In
attempting to take different terms of service from different Internet
services and products—the very fact of which indicate the differing
needs felt across varying online communities—the Department has put in
place a one-size-fits-all approach. How can this be possible on the Internet, when we wouldn't regulate the post office and a book publisher under the same rules of liability for, say, defamatory speech?</li><li>There is also a significant difference between the effect of those terms of service and that of these Rules. An intermediary-framed terms of service suggests that the intermediary <em>may</em> investigate and boot someone off a service for a violation, while the Rules insist that the intermediary simply must remove content, keep records of users' details and provide them to law enforcement officials, or else be subject to crippling legal liability.</li></ol>
<p>So to equate the effect of these Rules to merely following ‘existing practices’ is plainly wrong. An intermediary—like the CIS website—should have the freedom to choose not to have terms of service agreements. We now don’t.</p>
<blockquote>“In case any issue arises concerning the interpretation of the terms used by the Intermediary, which is not agreed to by the user or affected person, the same can only be adjudicated by a Court of Law. The Government or any of its agencies have no power to intervene or even interpret. DIT has reiterated that there is no intention of the Government to acquire regulatory jurisdiction over content under these Rules. It has categorically said that these rules do not provide for any regulation or control of content by the Government.”</blockquote>
<p>The
Rules are based on the presumption that all complaints (and resultant
mandatory taking down of the content) are correct, and that the
incorrectness of the take-downs can be disputed in court. Why not just
invert that, and presume that all complaints need to be proven first, and the correctness of the complaints (instead of the take-downs) be disputed in court? </p>
<p>Indeed,
the courts have insisted that presumption of validity is the only
constitutional way of dealing with speech. (See, for instance, <em>Karthikeyan R. v. Union
of India</em>, a 2010 Madras High Court judgment.)</p>
<p>Further,
only constitutional courts (namely High Courts and the Supreme Court)
can go into the question of the validity of a law. Other courts have to
apply the law, even if the judge believes it is constitutionally
invalid. So, most courts will be forced to apply this law of highly
questionable constitutionality until a High Court or the Supreme Court
strikes it down.</p>
<p>What
the Department has in fact done is to explicitly open up the floodgates
for increased liability claims and litigation, which runs exactly
counter to the purpose behind the amendment of Section 79 by Parliament
in 2008.</p>
<blockquote>“The
Government adopted a very transparent process for formulation of the
Rules under the Information Technology Act. The draft Rules were
published on the Department of Information Technology website for
comments and were widely covered by the media. None of the Industry
Associations and other stakeholders objected to the formulation which is
now being cited in some section of media.”<br /></blockquote>
<p>This is a blatant lie.</p>
<p>Civil
society voices, including <a href="http://editors.cis-india.org/internet-governance/blog/2011/02/25/intermediary-due-diligence" class="external-link">CIS</a>, <a class="external-link" href="http://www.softwarefreedom.in/index.php?option=com_idoblog&task=viewpost&id=86&Itemid=70">Software Freedom Law Centre</a>, and
individual experts (such as the lawyer and published author <a class="external-link" href="http://www.iltb.net/2011/02/draft-rules-on-intermediary-liability-released-by-the-ministry-of-it/">Apar Gupta</a>)
sent in comments. Companies <a class="external-link" href="http://online.wsj.com/article/SB10001424052748704681904576314652996232860.html?mod=WSJINDIA_hps_LEFTTopWhatNews">such as Google</a>, <a class="external-link" href="http://e2enetworks.com/2011/05/13/e2e-networks-response-to-draft-rules-for-intermediary-guidelines/">E2E Networks</a>, and others had apparently
raised concerns as well. The press has published many a cautionary note, including editorials, op-eds and articles in <a class="external-link" href="http://www.thehindu.com/opinion/lead/article1487299.ece">the</a> <a class="external-link" href="http://www.thehindu.com/opinion/editorial/article1515144.ece">Hindu</a>, <a class="external-link" href="http://www.thehoot.org/web/home/story.php?sectionId=6&mod=1&pg=1&valid=true&storyid=5163">the Hoot</a>, Medianama.com, and Kafila.com, well before the new rules were notified. We at CIS even received a 'read notification'
from the email account of the Group Coordinator of the DIT’s Cyber Laws
Division—Dr. Gulshan Rai—on Thursday, March 3, 2011 at 12:04 PM (we had
sent the mail to Dr. Rai on Monday, February 28, 2011). We never
received any acknowledgement, though, not even after we made an express
request for acknowledgement (and an offer to meet them in person to
explain our concerns) on Tuesday, April 5, 2011 in an e-mail sent to Mr.
Prafulla Kumar and Dr. Gulshan Rai of DIT.</p>
<p>The
process can hardly be called 'transparent' when the replies received
from 'industry associations and other stakeholders' have not been made
public by the DIT. Those comments which are public all indicate that
serious concerns were raised as to the constitutionality of the Rules.</p>
<blockquote>The Government has been forward looking to create a conducive environment for the Internet medium to catapult itself onto a different plane with the evolution of the Internet. The Government remains fully committed to freedom of speech and expression and the citizen’s rights in this regard.</blockquote>
<p>The DIT has limited this statement to the rules on intermediary due diligence, and has not spoken about the controversial new rules that stifle cybercafes, and restrict users' privacy and freedom to receive information.</p>
<p>If the government is serious about creating a conducive environment for innovation, privacy and free expression on the Internet, then it wouldn’t be passing Rules that curb them, and it definitely would not be doing so in such a non-transparent fashion.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries'>http://editors.cis-india.org/internet-governance/blog/rebuttal-dit-press-release-intermediaries</a>
</p>
No publisher · pranesh · Freedom of Speech and Expression · IT Act · Featured · Intermediary Liability · 2012-07-11T13:18:04Z · Blog Entry