The Centre for Internet and Society
http://editors.cis-india.org
Zero Draft of Content Removal Best Practices White Paper
http://editors.cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper
<b>The EFF and CIS Intermediary Liability Project aims to create a set of principles for intermediary liability, in consultation with groups of Internet-focused NGOs and the academic community.</b>
<p style="text-align: justify; ">The draft paper has been created to frame the discussion and will be made available for public comments and feedback. The draft document and the views represented here are not representative of the positions of the organisations involved in the drafting.</p>
<p style="text-align: justify; "><a class="external-link" href="http://tinyurl.com/k2u83ya">http://tinyurl.com/k2u83ya</a></p>
<p style="text-align: justify; ">3 September 2014</p>
<h2 style="text-align: justify; ">Introduction</h2>
<p style="text-align: justify; ">The purpose of this white paper is to frame the discussion at several meetings between groups of Internet-focused NGOs that will lead to the creation of a set of principles for intermediary liability.</p>
<p style="text-align: justify; ">The principles that develop from this white paper are intended as a civil society contribution to help guide companies, regulators and courts, as they continue to build out the legal landscape in which online intermediaries operate. One aim of these principles is to move towards greater consistency with regards to the laws that apply to intermediaries and their application in practice.</p>
<p style="text-align: justify; ">There are three general approaches to intermediary liability that have been discussed in much of the recent work in this area, including CDT’s 2012 report, “Shielding the Messengers: Protecting Platforms for Expression and Innovation,” which divides approaches to intermediary liability into three models: 1. Expansive Protections Against Liability for Intermediaries, 2. Conditional Safe Harbor from Liability, 3. Blanket or Strict Liability for Intermediaries.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt1"><sup>[1]</sup></a></p>
<p style="text-align: justify; ">This white paper argues in the alternative that (a) the “expansive protections against liability” model is preferable, but likely not attainable given the current state of play in the legal and policy space; (b) the white paper therefore supports “conditional safe harbor from liability” operating via a ‘notice-to-notice’ regime if possible, and a ‘notice and action’ regime if ‘notice-to-notice’ is deemed impossible; and (c) all of the other principles discussed in this white paper should apply to whatever model for intermediary liability is adopted, unless those principles are facially incompatible with the model that is finally adopted.</p>
<p style="text-align: justify; ">As further general background, this white paper works from the position that there are three general types of online intermediaries: Internet Service Providers (ISPs), search engines, and social networks. As outlined in the recent draft UNESCO Report (from which this white paper draws extensively):</p>
<p style="text-align: justify; ">“With many kinds of companies operating many kinds of products and services, it is important to clarify what constitutes an intermediary. In a 2010 report, the Organization for Economic Co-operation and Development (OECD) explains that Internet intermediaries “bring together or facilitate transactions between third parties on the Internet. They give access to, host, transmit and index content, products and services originated by third parties on the Internet or provide Internet-based services to third parties.”</p>
<p style="text-align: justify; ">Most definitions of intermediaries explicitly exclude content producers. The freedom of expression advocacy group Article 19 distinguishes intermediaries from “those individuals or organizations who are responsible for producing information in the first place and posting it online.” Similarly, the Center for Democracy and Technology explains that “these entities facilitate access to content created by others.” The OECD emphasizes “their role as ‘pure’ intermediaries between third parties,” excluding “activities where service providers give access to, host, transmit or index content or services that they themselves originate.” These views are endorsed in some laws and court rulings. In other words, publishers and other media that create and disseminate original content are not intermediaries. Examples of such media entities include a news website that publishes articles written and edited by its staff, or a digital video subscription service that hires people to produce videos and disseminates them to subscribers.</p>
<p style="text-align: justify; ">For the purpose of this case study we will maintain that intermediaries offer services that host, index, or facilitate the transmission and sharing of content created by others. For example, Internet Service Providers (ISPs) connect a user’s device, whether it is a laptop, a mobile phone or something else, to the network of networks known as the Internet. Once a user is connected to the Internet, search engines make a portion of the World Wide Web accessible by allowing individuals to search their database. Search engines are often an essential go-between between websites and Internet users. Social networks connect individual Internet users by allowing them to exchange messages, photos, videos, as well as by allowing them to post content to their network of contacts, or the public at large. Web hosting providers, in turn, make it possible for websites to be published and to be accessed online.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt2"><sup>[2]</sup></a></p>
<h2 style="text-align: justify; ">General Principles for ISP Governance - Content Removals</h2>
<p style="text-align: justify; ">The discussion that follows below outlines nine principles to guide companies, government, and civil society in the development of best practices related to the regulation of online content through intermediaries, as norms, policies, and laws develop in the coming years. The nine principles are: Transparency, Consistency, Clarity, Mindful Community Policy Making, Necessity and Proportionality in Content Restrictions, Privacy, Access to Remedy, Accountability, and Due Process in both Legal and Private Enforcement. Each principle contains subsections that expand upon the theme of the principle to cover more specific issues related to the rights and responsibilities of online intermediaries, government, civil society, and users.</p>
<h3 style="text-align: justify; ">Principle I: Transparency</h3>
<p style="text-align: justify; ">“Transparency enables users’ right to privacy and right to freedom of expression. Transparency of laws, policies, practices, decisions, rationale, and outcomes related to privacy and restrictions allow users to make informed choices with respect to their actions and speech online. As such - both governments and companies have a responsibility in ensuring that the public is informed through transparency initiatives.” <a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt3"><sup>[3]</sup></a></p>
<p style="text-align: justify; "><b>Government Transparency</b></p>
<ul style="text-align: justify; ">
<li>In general, governments should publish transparency reports:</li>
</ul>
<p style="text-align: justify; ">As part of the democratic process, the citizens of each country have a right to know how their government is applying its laws, and a right to provide feedback about the government’s legal interpretations of its laws. Thus, all governments should be required to publish online transparency reports that provide information about all requests issued by any branch or agency of government for the removal or restriction of online content. Further, governments should allow for the submission of comments and suggestions via a webform hosted on the same webpage where that government’s transparency report is hosted. There should also be some legal mechanism that requires the government to review the feedback provided by its citizens, ensure that relevant feedback is passed along to legislative bodies, and provide for action to be taken on citizen-provided feedback where appropriate. Finally, and where possible, the raw data that constitutes each government’s transparency report should be made available online, for free, in a common file format such as .csv, so that civil society may have easy access to it for research purposes.</p>
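<p style="text-align: justify; ">The raw-data recommendation above can be illustrated with a minimal sketch. The column names and example records here are hypothetical, offered only to show what a machine-readable .csv export might look like; they are not a prescribed schema.</p>

```python
import csv
import io

# Hypothetical column names -- one possible machine-readable layout for a
# transparency-report export, not a prescribed schema.
FIELDS = ["request_id", "requesting_body", "legal_basis",
          "content_type", "action_taken", "date_received"]

# Illustrative records, not real requests.
rows = [
    {"request_id": "2014-0001", "requesting_body": "District Court",
     "legal_basis": "defamation", "content_type": "blog post",
     "action_taken": "removed", "date_received": "2014-03-12"},
    {"request_id": "2014-0002", "requesting_body": "Police department",
     "legal_basis": "copyright", "content_type": "video",
     "action_taken": "declined", "date_received": "2014-04-02"},
]

def export_report(records):
    """Serialise request records to CSV so researchers can reuse the data."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FIELDS)
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

print(export_report(rows))
```

<p style="text-align: justify; ">Publishing in a plain, columnar format like this (rather than only as a PDF or webpage) is what makes the report directly usable for the research purposes the paper describes.</p>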
<ul style="text-align: justify; ">
<li style="text-align: justify; ">Governments should be more transparent about content orders that they impose on ISPs<br />The legislative process proceeds most effectively when the government knows how the laws that it creates are applied in practice and is able to receive feedback from the public about how those laws should change or remain the same. Relatedly, regulation of the Internet is most effective when the legislative and judicial branches are aware of what each other is doing. For all of these reasons, governments should publish information about all of the court orders and executive requests for content removals that they send to online intermediaries. Publishing all of this information in one place necessarily requires that a single entity within the government collect the information, which will have the benefits of giving the government a holistic view of how it is regulating the Internet, encouraging dialogue between different branches of government about how best to create and enforce Internet content regulation, and encouraging dialogue between the government and its citizens about the laws that govern Internet content and their application.</li>
<li style="text-align: justify; ">Governments should make the compliance requirements they impose on ISPs public<br />Each government should maintain a public website that publishes as complete a picture as possible of the content removal requests made by any branch of that government, including the judicial branch. The availability of a public website of this type will further many of the goals and objectives discussed elsewhere in this section. The website should be biased towards high levels of detail about each request and towards disclosure that requests were made, subject only to limited exceptions for compelling public policy reasons, where the disclosure bias conflicts directly with another law, or where disclosure would reveal a user’s PII. The information should be published periodically, ideally more than once a year. The general principle should be: the more information made available, the better. On the same website where a government publishes its ‘Transparency Report,’ that government should attempt to provide a plain-language description of its various laws related to online content, to give users notice about what content is lawful vs. unlawful, as well as to show how the laws that it enacts in the Internet space fit together. Further, and as discussed in section “b,” infra, government should provide citizens with an online feedback mechanism so that they may participate in the legislative process as it applies to online content.</li>
<li style="text-align: justify; ">Governments should give their citizens a way to provide input on these policies<br />Private citizens should have the right to provide feedback on the balancing between their civil liberties and other public policies, such as security, that their government engages in on their behalf. If and when these policies and the compliance requirements they impose on online intermediaries are made publicly available online, there should also be a feedback mechanism built into the site where this information is published. This public feedback mechanism could take a number of different forms: for example, a webform that allows users to indicate their level of satisfaction with prevailing policy choices by choosing amongst several radio buttons, while also providing open text fields for clarifying comments and specific suggestions. In order to be effective, this online feedback mechanism would have to be accompanied by some sort of legal and budgetary apparatus ensuring that the feedback was monitored and given some minimum level of deference in the discussions and meetings that lead to new policies being created.</li>
</ul>
<p style="text-align: justify; ">Government should meet users concerned about its content policies in the online domain. Internet users, as citizens of both the Internet and their country of origin, have a natural interest in defining and defending their civil liberties online; government should meet them there to extend the democratic process to the Internet. Denying Internet users a voice in the policymaking processes that determine their rights undermines government credibility and negatively influences users’ ability to freely share information online. As such, content policies should be posted in general terms online, and users should have the ability to provide input on those policies online.</p>
<p style="text-align: justify; "><b>ISP Transparency</b><br />“The transparency practices of a company impact users’ freedom of expression by providing insight into the scope of restriction that is taking place in a specific jurisdiction. Key areas of transparency for companies include: specific restrictions, aggregate numbers related to restrictions, company-imposed regulations on content, and transparency of applicable law and regulation that the service provider must abide by.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt4"><sup>[4]</sup></a></p>
<p style="text-align: justify; ">“Disclosure by service providers of notices received and actions taken can provide an important check against abuse. In addition to providing valuable data for assessing the value and effectiveness of a N&A system, creating the expectation that notices will be disclosed may help deter fraudulent or otherwise unjustified notices. In contrast, without transparency, Internet users may remain unaware that content they have posted or searched for has been removed pursuant to a notice of alleged illegality. Requiring notices to be submitted to a central publication site would provide the most benefit, enabling patterns of poor quality or abusive notices to be readily exposed.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt5"><sup>[5]</sup></a> Therefore, ISPs at all levels should publish transparency reports that include:</p>
<ul style="text-align: justify; ">
<li>Government Requests</li>
</ul>
<p style="text-align: justify; ">All requests from government agencies and courts should be published in a periodic transparency report, accessible on the intermediary’s website, that publishes information about the requests the intermediary received and what the intermediary did with them in the highest level of detail that is legally possible. The more information that is provided about each request, the better the understanding that the public will have about how laws that affect their rights online are being applied. That said, steps should be taken to prevent the disclosure of personal information in relation to the publication of transparency reports. Beyond redaction of personal information, however, the maximum amount of information about each request should be published, subject as well to the (ideally minimal) restrictions imposed by applicable law. A thorough Transparency Report published by an ISP or online intermediary should include information about the following categories of requests:</p>
<ul style="text-align: justify; ">
<li style="text-align: justify; ">Police and/or Executive Requests<br />This category includes all requests to the intermediary from an agency that is wholly a part of the national government: from police departments, to intelligence agencies, to school boards in small towns. Surfacing information about all requests from any part of the government helps to avoid corruption and inappropriate exercises of governmental power by reminding all government officials, regardless of their rank or seniority, that information about the requests they submit to online intermediaries is subject to public scrutiny.</li>
<li style="text-align: justify; ">Court Orders<br />This category includes all orders issued by courts and signed by a judicial officer. It can include ex parte orders, default judgments, court orders directed at an online intermediary, or court orders directed at a third party and presented to the intermediary as evidence in support of a removal request. To the extent legally possible, detailed information should be published about these court orders, including the type of order each request was, its constituent elements, and the action(s) that the intermediary took in response to it. All personally identifying information should be redacted from any court orders before they are published by the intermediary as part of a transparency report.</li>
<li style="text-align: justify; ">First Party<br />Information about court orders should be further broken down into two groups: first party and third party. First party court orders are orders directed at the online intermediary in an adversarial proceeding to which the online intermediary was a party.</li>
<li style="text-align: justify; ">Third Party<br />As mentioned above, ‘third party’ refers to court orders that are not directed at the online intermediary, but rather at a third party, such as an individual user who posted an allegedly defamatory remark on the intermediary’s platform. If a user who obtains a court order directed at, say, the poster of the defamatory content approaches an online intermediary seeking removal of that content, and the intermediary decides to remove the content in response to the request, the intermediary should publish a record of that removal. To be accepted by an intermediary, third party court orders should be issued by a court of appropriate jurisdiction after an adversarial legal proceeding, contain a certified and specific statement that certain content is unlawful, and specifically identify the content that the court has found to be unlawful, by specific, permalinked URL where possible.</li>
</ul>
<p style="text-align: justify; ">This type of court order should be broken out separately from court orders directed at the applicable online intermediary in companies’ transparency reports, because merely providing aggregate numbers that do not distinguish between the two types gives users the inaccurate impression that a government is attempting to censor more content than it actually is. The rationale for including first party court orders to remove content as a subcategory of ‘government requests’ is that a government’s judiciary speaks on behalf of the government, making determinations about what is permitted under the laws of that country. This analogy does not hold for court orders directed at third parties: when the court made its determination about the legality of the content in question, it did not contemplate that the intermediary would remove the content. As such, the court likely did not weigh the relevant public interest and policy factors, including the importance of freedom of expression or the precedential value of its decision. Therefore, the determination does not fairly reflect an attempt by the government to censor content and should not be counted as such.</p>
<p style="text-align: justify; ">Instead, and especially considering that a single third party court order may be the basis for a number of content removals, third party court orders should be counted separately and presented in the company’s transparency report with a published explanation of what they are and why the company has decided to remove content pursuant to receiving one.</p>
<p style="text-align: justify; "><b>Private-Party Requests</b><br />Private-party requests are requests to remove content that are not issued by a government agency or accompanied by a court order. Some examples of private-party requests include copyright complaints submitted pursuant to the Digital Millennium Copyright Act, or complaints based on the laws of specific countries, such as laws banning Holocaust denial in Germany.</p>
<p style="text-align: justify; "><b>Policy/TOS Enforcement</b><br />To give users a complete picture of the content that is being removed from the platforms that they use, corporate transparency reports should also provide information about the content that the intermediary removes pursuant to its own policies or terms of service, though there may not be a legal requirement to do so.</p>
<p style="text-align: justify; "><b>User Data Requests</b><br />While this white paper is squarely focused on liability for content posted online and best practices for deciding when and how content should be removed from online services, corporate transparency reports should also provide information about requests for user data from executive agencies, courts, and others.</p>
<h3 style="text-align: justify; ">Principle II: Consistency</h3>
<ul style="text-align: justify; ">
<li style="text-align: justify; ">Legal requirements for ISPs should be consistent, based on a global legal framework that establishes baseline limitations on intermediary liability<br />Broad variation amongst the legal regimes of the countries in which online intermediaries operate increases compliance costs for companies and may discourage them from offering their services in some countries due to the high costs of localized compliance. Reducing the number of speech platforms that citizens have access to limits their ability to express themselves. Therefore, to ensure that citizens of a particular country have access to a robust range of speech platforms, each country should work to harmonize the requirements that it imposes upon online intermediaries with the requirements of other countries. While a certain degree of variation between what is permitted in one country and another is inevitable, all countries should agree on certain limitations to intermediary liability, such as the following:</li>
<li style="text-align: justify; ">Conduits should be immune from claims about content that they neither created nor modified<br />As noted in the 2011 Joint Declaration on Freedom of Expression and the Internet, “[n]o one who simply provides technical Internet services such as providing access, or searching for, or transmission or caching of information, should be liable for content generated by others, which is disseminated using those services, as long as they do not specifically intervene in that content or refuse to obey a court order to remove that content, where they have the capacity to do so (‘mere conduit principle’).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt6"><sup>[6]</sup></a></li>
<li style="text-align: justify; ">Court orders should be required for the removal of content that is related to speech, such as defamation removal requests<br />In the Center for Democracy and Technology’s Additional Responses Regarding Notice and Action, CDT outlines the case against allowing notice and action procedures to apply to defamation removal requests. They write:</li>
</ul>
<p style="text-align: justify; ">“Uniform notice-and-action procedures should not apply horizontally to all types of illegal content. In particular, CDT believes notice-and-takedown is inappropriate for defamation and other areas of law requiring complex legal and factual questions that make private notices especially subject to abuse. Blocking or removing content on the basis of mere allegations of illegality raises serious concerns for free expression and access to information. Hosts are likely to err on the side of caution and comply with most if not all notices they receive, because evaluating notices is burdensome and declining to comply may jeopardize their protection from liability. The risk of legal content being taken down is especially high in cases where assessing the illegality of the content would require detailed factual analysis and careful legal judgments that balance competing fundamental rights and interests. Intermediaries will be extremely reluctant to exercise their own judgment when the legal issues are unclear, and it will be easy for any party submitting a notice to claim a good faith belief that the content in question is unlawful. In short, the murkier the legal analysis, the greater the potential for abuse.</p>
<p style="text-align: justify; ">To reduce this risk, removal of or disablement of access to content based on unadjudicated allegations of illegality (i.e., notices from private parties) should be limited to cases where the content at issue is manifestly illegal – and then only with necessary safeguards against abuse as described above.</p>
<p style="text-align: justify; ">CDT believes that online free expression is best served by narrowing what is considered manifestly illegal and subject to takedown upon private notice. With proper safeguards against abuse, for example, notice-and-action can be an appropriate policy for addressing online copyright infringement. Copyright is an area of law where there is reasonable international consensus regarding what is illegal and where much infringement is straightforward. There can be difficult questions at the margins – for example concerning the applicability of limitations and exceptions such as “fair use” – but much online infringement is not disputable.</p>
<p style="text-align: justify; ">Quite different considerations apply to the extension of notice-and-action procedures to allegations of defamation or other illegal content. Other areas of law, including defamation, routinely require far more difficult factual and legal determinations. There is greater potential for abuse of notice-and-action where illegality is less manifest and more disputable. If private notices are sufficient to have allegedly defamatory content removed, for example, any person unhappy about something that has been written about him or her would have the ability and incentive to make an allegation of defamation, creating a significant potential for unjustified notices that harm free expression. This and other areas where illegality is more disputable require different approaches to notice and action. In the case of defamation, CDT believes “notice” for purposes of removing or disabling access to content should come only from a competent court after full adjudication.</p>
<p style="text-align: justify; ">In cases where it would be inappropriate to remove or disable access to content based on untested allegations of illegality, service providers receiving allegations of illegal content may be able to take alternative actions in response to notices. Forwarding notices to the content provider or preserving data necessary to facilitate the initiation of legal proceedings, for example, can pose less risk to content providers’ free expression rights, provided there is sufficient process to allow the content provider to challenge the allegations and assert his or her rights, including the right to speak anonymously.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt7"><sup>[7]</sup></a></p>
<h3 style="text-align: justify; ">Principle III: Clarity</h3>
<ul style="text-align: justify; ">
<li style="text-align: justify; ">All notices that request the removal of content should be clear and meet certain minimum requirements<br />The Center for Democracy and Technology outlined requirements for clear notices in a notice and action system in response to a European Commission public comment period on a revised notice and action regime.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt8"><sup>[8]</sup></a> They write:</li>
</ul>
<p style="text-align: justify; ">“Notices should include the following features:</p>
<ol style="text-align: justify; ">
<li>Specificity. Notices should be required to specify the exact location of the material – such as a specific URL – in order to be valid. This is perhaps the most important requirement, in that it allows hosts to take targeted action against identified illegal material without having to engage in burdensome search or monitoring. Notices that demand the removal of particular content wherever it appears on a site without specifying any location(s) are not sufficiently precise to enable targeted action. </li>
<li>Description of alleged illegal content. Notices should be required to include a detailed description of the specific content alleged to be illegal and to make specific reference to the law allegedly being violated. In the case of copyright, the notice should identify the specific work or works claimed to be infringed. </li>
<li>Contact details. Notices should be required to contain contact information for the sender. This facilitates assessment of notices’ validity, feedback to senders regarding invalid notices, sanctions for abusive notices, and communication or legal action between the sending party and the poster of the material in question. </li>
<li>Standing: Notices should be issued only by or on behalf of the party harmed by the content. For copyright, this would be the rightsholder or an agent acting on the rightsholderʼs behalf. For child sexual abuse images, a suitable issuer of notice would be a law enforcement agency or a child abuse hotline with expertise in assessing such content. For terrorism content, only government agencies would have standing to submit notice. </li>
<li>Certification: A sender of a notice should be required to attest under legal penalty to a good-faith belief that the content being complained of is in fact illegal; that the information contained in the notice is accurate; and, if applicable, that the sender either is the harmed party or is authorized to act on behalf of the harmed party. This kind of formal certification requirement signals to notice-senders that they should view misrepresentation or inaccuracies on notices as akin to making false or inaccurate statements to a court or administrative body. </li>
<li>Consideration of limitations, exceptions, and defenses: Senders should be required to certify that they have considered in good faith whether any limitations, exceptions, or defenses apply to the material in question. This is particularly relevant for copyright and other areas of law in which exceptions are specifically described in law. </li>
<li>An effective appeal and counter-notice mechanism. A notice-and-action regime should include counter-notice procedures so that content providers can contest mistaken and abusive notices and have their content reinstated if its removal was wrongful. </li>
<li>Penalties for unjustified notices. Senders of erroneous or abusive notices should face possible sanctions. In the US, senders may face penalties for knowingly misrepresenting that content is infringing, but the standard for “knowingly misrepresenting” is quite high and the provision has rarely been invoked. A better approach might be to use a negligence standard, whereby a sender could be held liable for damages or attorneys’ fees for making negligent misrepresentations (or for repeatedly making negligent misrepresentations). In addition, the notice-and-action system should allow content hosts to ignore notices from senders with an established record of sending erroneous or abusive notices or allow them to demand more information or assurances in notices from those who have in the past submitted erroneous notices. (For example, hosts might be deemed within the safe harbor if they require repeat abusers to specifically certify that they have actually examined the alleged infringing content before sending a notice).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt9"><sup>[9]</sup></a> </li>
</ol>
<li style="text-align: justify; ">All ISPs should publish their content removal policies online and keep them current as they evolve<br />The UNESCO report states, by way of background, that “[c]ontent restriction practices based on Terms of Service are opaque. How companies remove content based on Terms of Service violations is more opaque than their handling of content removals based on requests from authorized authorities. When content is removed from a platform based on company policy, [our] research found that all companies provide a generic notice of this restriction to the user, but do not provide the reason for the restriction. Furthermore, most companies do not provide notice to the public that the content has been removed. In addition, companies are inconsistently open about removal of accounts and their reasons for doing so.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt10"><sup>[10]</sup></a></li>
<p style="text-align: justify; ">There are legitimate reasons why an ISP may want to have policies that permit less content, or a narrower range of content, than the law technically allows, such as maintaining a product that appeals to families. However, if a company is going to go beyond the minimal legal requirements in terms of content that it must restrict, the company should have clear policies that are published online and kept up-to-date to provide its users notice of what content is and is not permitted on the company’s platform. Notice to the user about the types of content that are permitted encourages her to speak freely and helps her to understand why content that she posted was taken down when it violates a company policy.</p>
<li style="text-align: justify; ">When content is removed, a clear notice should be provided in the product that explains in simple terms that content has been removed and why<br />This subsection works in conjunction with “ii,” above. If content is removed for any reason, either pursuant to a legal request or because of a violation of company policy, a user should be able to learn that content was removed if she tries to access it. Requiring an on-screen message that explains that content has been removed and why is the post-takedown accompaniment to the pre-takedown published online policy of the online intermediary: both work together to show the user what types of content are and are not permitted on each online platform. Explaining in sufficient detail why content has been removed may also spark users’ curiosity about the laws or policies that caused the removal, encouraging greater civic engagement in the internet law and policy space and fostering a community of citizens that demands that the companies and governments it interacts with be more responsive to its views on how content regulation should work online.</li>
<p style="text-align: justify; ">The UNESCO report provides the following example of how Google provides notice to its users when a search result is removed, which includes a link to a page hosted by Chilling Effects:<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt11"><sup>[11]</sup></a></p>
<p style="text-align: justify; ">“When search results are removed in response to government or copyright holder demands, a notice describing the number of results removed and the reasons for their removal is displayed to users (see screenshot below) and a copy of the request is sent to the independent non-profit organization ChillingEffects.org, which archives and publishes the request. When possible the company also contacts the website’s owners.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt12"><sup>[12]</sup></a></p>
<p style="text-align: justify; ">This is an example of the message that is displayed when Google removes a search result pursuant to a copyright complaint.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt13"><sup>[13]</sup></a></p>
<li style="text-align: justify; ">Requirements that governments impose on intermediaries should be as clear and unambiguous as possible<br />Imposing liability on internet intermediaries without providing clear guidance as to the precise type of content that is not lawful and the precise requirements of a legally sufficient notice encourages intermediaries to over-remove content. As Article 19 noted in its 2013 report on intermediary liability:</li>
<p style="text-align: justify; ">“International bodies have also criticized ‘notice and takedown’ procedures as they lack a clear legal basis. For example, the 2011 OSCE report on Freedom of Expression on the internet highlighted that: Liability provisions for service providers are not always clear and complex notice and takedown provisions exist for content removal from the Internet within a number of participating States. Approximately 30 participating States have laws based on the EU E-Commerce Directive. However, the EU Directive provisions rather than aligning state level policies, created differences in interpretation during the national implementation process. These differences emerged once the national courts applied the provisions.</p>
<p style="text-align: justify; ">These procedures have also been criticized for being unfair. Rather than obtaining a court order requiring the host to remove unlawful material (which, in principle at least, would involve an independent judicial determination that the material is indeed unlawful), hosts are required to act merely on the say-so of a private party or public body. This is problematic because hosts tend to err on the side of caution and therefore take down material that may be perfectly legitimate and lawful. For example, in his report, the UN Special Rapporteur on freedom of expression noted:</p>
<p style="text-align: justify; ">[W]hile a notice-and-takedown system is one way to prevent intermediaries from actively engaging in or encouraging unlawful behavior on their services, it is subject to abuse by both State and private actors. Users who are notified by the service provider that their content has been flagged as unlawful often have little recourse or few resources to challenge the takedown. Moreover, given that intermediaries may still be held financially or in some cases criminally liable if they do not remove content upon receipt of notification by users regarding unlawful content, they are inclined to err on the side of safety by overcensoring potentially illegal content. Lack of transparency in the intermediaries’ decision-making process also often obscures discriminatory practices or political pressure affecting the companies’ decisions. Furthermore, intermediaries, as private entities, are not best placed to make the determination of whether a particular content is illegal, which requires careful balancing of competing interests and consideration of defenses.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt14"><sup>[14]</sup></a></p>
<p style="text-align: justify; ">Considering the above, if liability is to be imposed on intermediaries for certain types of unlawful content, the legal requirements that outline what is unlawful content and how to report it must be clear. Lack of clarity in this area will result in over-removal of content by rational intermediaries that want to minimize their legal exposure and compliance costs. Over-removal of content is at odds with the goals of freedom of expression.</p>
<p style="text-align: justify; ">The UNESCO Report made a similar recommendation, stating: “Governments need to ensure that legal frameworks and company policies are in place to address issues arising out of intermediary liability. These legal frameworks and policies should be contextually adapted and be consistent with a human rights framework and a commitment to due process and fair dealing. Legal and regulatory frameworks should also be precise and grounded in a clear understanding of the technology they are meant to address, removing legal uncertainty that would provide opportunity for abuse.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt15"><sup>[15]</sup></a></p>
<p style="text-align: justify; ">Similarly, the 2011 Joint Declaration on Freedom of Expression and the Internet states:</p>
<p style="text-align: justify; ">“Consideration should be given to insulating fully other intermediaries, including those mentioned in the preamble, from liability for content generated by others under the same conditions as in paragraph 2(a). At a minimum, intermediaries should not be required to monitor user-generated content and should not be subject to extrajudicial content takedown rules which fail to provide sufficient protection for freedom of expression (which is the case with many of the ‘notice and takedown’ rules currently being applied).”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt16"><sup>[16]</sup></a></p>
<h3 style="text-align: justify; ">Principle IV: Mindful Community Policy Making</h3>
<p style="text-align: justify; ">“Laws and regulations as well as corporate policies are more likely to be compatible with freedom of expression if they are developed in consultation with all affected stakeholders – particularly those whose free expression rights are known to be at risk.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt17"><sup>[17]</sup></a> To be effective, policies should be created through a multi-stakeholder consultation process that gives voice to the communities most at risk of being targeted for the information they share online. Further, both companies and governments should embed an ‘outreach to at-risk communities’ step into both legislative and policymaking processes to be especially sure that those voices are heard. Finally, civil society should work to ensure that all relevant stakeholders have a voice in both the creation and revision of policies that affect online intermediaries. In the context of corporate policymaking, civil society can use strategies from activist investing to encourage investors to make the human rights and freedom of expression policies of Internet companies part of the calculus that investors use to decide where to place their money. Considering the above:</p>
<ol style="text-align: justify; ">
<li style="text-align: justify; ">Human rights impact assessments, considering the impact of the proposed law or policy on various communities from the perspectives of gender, sexuality, sexual preference, ethnicity, religion, and freedom of expression, should be required before:</li>
<li>New laws are written that govern content issues affecting ISPs or conduct that occurs primarily online</li>
<li style="text-align: justify; ">“Protection of online freedom of expression will be strengthened if governments carry out human rights impact assessments to determine how proposed laws or regulations will affect Internet users’ freedom of expression domestically and globally.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt18"><sup>[18]</sup></a></li>
</ol>
<li style="text-align: justify; ">Intermediaries enact new policies<br />“Protection of online freedom of expression will be strengthened if companies carry out human rights impact assessments to determine how their policies, practices, and business operations affect Internet users’ freedom of expression. This assessment process should be anchored in robust engagement with stakeholders whose freedom of expression rights are at greatest risk online, as well as stakeholders who harbor concerns about other human rights affected by online speech.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt19"><sup>[19]</sup></a></li>
<li style="text-align: justify; ">Multi-stakeholder consultation processes should precede any new legislation that will apply to content issues affecting online intermediaries or online conduct<br />“Laws and regulations as well as corporate policies are more likely to be compatible with freedom of expression if they are developed in consultation with all affected stakeholders – particularly those whose free expression rights are known to be at risk.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt20"><sup>[20]</sup></a></li>
<li style="text-align: justify; ">Civil society and public interest groups should encourage responsible investment in companies who implement policies that reflect best practices for internet intermediaries<br />“Over the past thirty years, responsible investors have played a powerful role in incentivizing companies to improve environmental sustainability, supply chain labor practices, and respect for human rights of communities where companies physically operate. Responsible investors can also play a powerful role in incentivizing companies to improve their policies and practices affecting freedom of expression and privacy by developing metrics and criteria for evaluating companies on these issues in the same way that they evaluate companies on other “environmental, social, and governance” criteria.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt21"><sup>[21]</sup></a></li>
<h3 style="text-align: justify; ">Principle V: Necessity and Proportionality in Content Restriction</h3>
<li style="text-align: justify; ">Content should only be restricted when there is a legal basis for doing so, or the removal is performed in accordance with a clear, published policy of the ISP<br />As CDT outlined in its 2012 intermediary liability report, “[a]ctions required of intermediaries must be narrowly tailored and proportionate, to protect the fundamental rights of Internet users. Any actions that a safe-harbor regime requires intermediaries to take must be evaluated in terms of the principle of proportionality and their impact on Internet users’ fundamental rights, including rights to freedom of expression, access to information, and protection of personal data. Laws that encourage intermediaries to take down or block certain content have the potential to impair online expression or access to information. Such laws must therefore ensure that the actions they call for are proportional to a legitimate aim, no more restrictive than is required for achievement of the aim, and effective for achieving the aim. In particular, intermediary action requirements should be narrowly drawn, targeting specific unlawful content rather than entire websites or other Internet resources that may support both lawful and unlawful uses.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt22"><sup>[22]</sup></a></li>
<li style="text-align: justify; ">When content must be restricted, it should be restricted in the most minimal way possible (e.g., prefer domain removals to IP-blocking)<br />There are a number of different ways that access to content can be restricted. Examples include hard deletion of the content from all of a company’s servers; blocking the download of an app or other software program in a particular country; blocking the content on all IP addresses affiliated with a particular country (“IP-blocking”); removing the content from a particular domain of a product (e.g., removing a link from the .fr version of a search engine while it remains accessible on the .com version); blocking content from a ‘version’ of an online product that is accessible through a ‘country’ or ‘language’ setting on that product; or some combination of the last three options (e.g., an online product that directs the user to a version of the product based on the country her IP address is coming from, but where the user can alter a URL or manipulate a drop-down menu to show her a different ‘country version’ of the product, providing access to content that may otherwise be inaccessible). </li>
<p style="text-align: justify; ">While almost all of the content restriction methods described above can be circumvented by technical means such as the use of proxies, IP-cloaking, or Tor, the average internet user does not know that these techniques exist, much less how to use them. Of these methods, a domain removal is easier for an individual user to circumvent than an IP-block: she only has to change the domain in the URL of the product she is using (e.g., from “.fr” to “.com”) to see content that has been locally restricted. To get around an IP-block, she would have to be sufficiently savvy to employ a proxy or cloak her true IP address.</p>
<p style="text-align: justify; ">Therefore, the technical means used to restrict access to controversial content has a direct impact on the magnitude of the actual restriction on speech. The more restrictive the technical removal method, the fewer people will have access to that content. To preserve access to lawful content, online intermediaries should choose the least restrictive means of complying with removal requests, especially when the removal request is based on the law of a particular country that makes certain content unlawful that is not unlawful in other countries. Further, when building new products and services, intermediaries should build in removal capabilities that minimally restrict access to controversial content.</p>
<li style="text-align: justify; ">If content is restricted due to its illegality in a particular country, the geographical scope of the content restriction should be as minimal as possible<br />Building on the discussion in “ii,” supra, a user should be able to access content that is lawful in her country even if it is not lawful in another country. Different countries have different laws, and it is often difficult for intermediaries to determine how to effectively respond to requests and reconcile the inherent conflicts that result. For example, content that denies the Holocaust is illegal in certain countries, but not in others. If an intermediary receives a request to remove content based on the laws of a particular country and determines that it will comply because the content is not lawful in that country, it should not restrict access to the content in a way that prevents users in other countries, where the content is lawful, from accessing it. To respond to a request based on the law of a particular country by blocking access to that content for users around the world, or even users of more than one country, essentially allows for extraterritorial application of the laws of the country that the request came from. While it is preferable to standardize and limit the legal requirements imposed on online intermediaries throughout the world, to the extent that this is not possible, the next-best option is to limit the application of a country’s laws declaring certain content unlawful to the users who live in that country. Therefore, intermediaries should choose the technical means of content restriction that is most narrowly tailored to limit the geographical scope and impact of the removal.</li>
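<p style="text-align: justify; ">The least-restrictive, geographically scoped approach described above can be sketched in code. The following Python sketch is purely illustrative: the method ordering, function, and request fields are hypothetical assumptions for exposition, not any real intermediary’s implementation. It shows how a removal request grounded in one country’s law would map to the narrowest available technical restriction rather than a global deletion.</p>

```python
# Illustrative sketch only: mapping a removal request to the narrowest
# technical restriction. Method names and parameters are hypothetical.

# Restriction methods ordered from least to most restrictive, per the
# continuum discussed above.
METHODS = [
    "country_version_block",  # hide only on the country/language version
    "domain_removal",         # remove only from the ccTLD domain (e.g. .fr)
    "ip_block",               # block for IP addresses in the country
    "global_deletion",        # hard-delete from all servers worldwide
]

def choose_restriction(legal_basis_countries, unlawful_everywhere=False):
    """Pick the narrowest restriction consistent with the legal basis.

    A request grounded in one country's law is scoped to that country;
    only content unlawful everywhere (or barred by the host's own
    published policy) warrants the most restrictive option.
    """
    if unlawful_everywhere:
        return "global_deletion", None
    # Scope the restriction to the requesting jurisdiction(s) only, using
    # the method that least impairs access for users elsewhere.
    return "country_version_block", sorted(legal_basis_countries)

# A French-law request restricts only the French version of the product,
# leaving the content accessible to users in other countries.
method, scope = choose_restriction({"FR"})
```

<p style="text-align: justify; ">In practice the choice of method would also depend on what removal capabilities a given product was built with, which is precisely why the preceding subsection recommends building in minimally restrictive removal capabilities from the start.</p>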
<li style="text-align: justify; ">The ability of conduits (telecommunications/internet service providers) to filter content should be minimized to the extent technically and legally possible</li>
<p style="text-align: justify; ">The 2011 Joint Declaration on Freedom of Expression and the Internet made the following points about the dangers of allowing filtering technology:</p>
<p style="text-align: justify; ">“Mandatory blocking of entire websites, IP addresses, ports, network protocols or types of uses (such as social networking) is an extreme measure – analogous to banning a newspaper or broadcaster – which can only be justified in accordance with international standards, for example where necessary to protect children against sexual abuse.</p>
<p style="text-align: justify; ">Content filtering systems which are imposed by a government or commercial service provider and which are not end-user controlled are a form of prior censorship and are not justifiable as a restriction on freedom of expression.</p>
<p style="text-align: justify; ">Products designed to facilitate end-user filtering should be required to be accompanied by clear information to end-users about how they work and their potential pitfalls in terms of over-inclusive filtering.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt23"><sup>[23]</sup></a></p>
<p style="text-align: justify; ">In short, filtering at the conduit level is a blunt instrument that should be avoided whenever possible. Just as conduits should not be legally responsible for content that they neither host nor modify (the ‘mere conduit’ rule discussed supra), conduits should technically restrict their ability to filter content such that it would be inefficient for government agencies to contact them to have content filtered. Mere conduits are not able to assess the context surrounding the controversial content that they are asked to remove and are therefore not the appropriate party to receive takedown requests. Further, when mere conduits have the technical ability to filter content, they open themselves to pressure from governments to exercise that capability. Therefore, mere conduits should limit, or not build in at all, the capability to filter content.</p>
<li style="text-align: justify; ">Notice and notice, or notice and judicial takedown, should be preferred to notice and takedown, which should be preferred to unilateral removal<br />Mechanisms for content removal that involve intermediaries acting without any oversight or accountability, or that respond only to the interests of the party requesting removal, are unlikely to do a good job of balancing public and private interests. A much better balance is likely to be struck through a mechanism where power is distributed between the parties, and/or where an independent and accountable oversight mechanism exists.</li>
<p style="text-align: justify; ">Considered in this way, there is a continuum of content removal mechanisms, ranging from those that are the least balanced and accountable to those that are more so. The least accountable is the unilateral removal of content by the intermediary, without legal compulsion, in response to a request received, without affording the uploader of the content the right to be heard or access to remedy.</p>
<p style="text-align: justify; ">Notice and takedown mechanisms fit next along the continuum, provided that they incorporate, as the DMCA attempts to do, an effective appeal and counter-notice mechanism. Where notice and takedown falls short, however, is that its cost and incentive structure is weighted towards removal of content in the case of doubt or dispute, resulting in more content being taken down and staying down than would be socially optimal.</p>
<p style="text-align: justify; ">A better balance is likely to be struck by a “notice and notice” regime, which provides strong social incentives for those whose content is reported to be unlawful to remove the content, but does not legally compel them to do so. If legal compulsion is required, a court order must be separately obtained.</p>
<p style="text-align: justify; ">Canada is an example of a jurisdiction with a notice and notice regime, though limited to copyright content disputes. Although this regime is now established in legislation, it formalizes a previous voluntary regime, whereby major ISPs would forward copyright infringement notifications received from rightsholders to subscribers, but without removing any content and without releasing subscriber data to the rightsholders absent a court order. Under the new legislation additional record-keeping requirements are imposed on ISPs, but otherwise the essential features of the regime remain unchanged.</p>
<p style="text-align: justify; ">Analysis of data collected during this voluntary regime indicates that it has been effective in changing the behavior of allegedly infringing subscribers. A 2010 study by the Entertainment Software Association of Canada (ESAC) found that 71% of notice recipients did not infringe again, whereas a similar 2011 study by Canadian ISP Rogers found 68% only received one notice, and 89% received no more than two notices, with only 1 subscriber in 800,000 receiving numerous notices.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt24"><sup>[24]</sup></a> However, in cases where a subscriber has a strong good faith belief that the notice they received was wrong, there is no risk to them in disregarding the erroneous notice – a feature that does not apply to notice and takedown.</p>
<p style="text-align: justify; ">Another similar way in which public and private interests can be balanced is through a notice and judicial takedown regime, whereby the rightsholder who issues a notice about offending content must have it assessed by an independent judicial (or perhaps administrative) authority before the intermediary will respond by taking the content down.</p>
<p style="text-align: justify; ">An example of this is found in Chile, again limited to the case of copyright.<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt25"><sup>[25]</sup></a> Introduced in 2010 in response to Chile’s Free Trade Agreement with the United States, the system is broadly similar to the DMCA, with the critical difference that intermediaries are not required to take material down in order to benefit from a liability safe harbor until such time as a court orders removal of the material. Responsibility for evaluating the copyright claims made is therefore shifted from intermediaries onto the courts.</p>
<p style="text-align: justify; ">Although this requirement does impose a burden on the rightsholder, it serves a purpose by disincentivizing the issuance of automated or otherwise unjustified notices that are more likely to restrict or chill freedom of expression. In cases where there is no serious dispute about the legality of the content, it is unlikely that the lawsuit would be defended. In any case, the legislation authorizes the court to issue a preliminary injunction on an ex parte basis, on condition of payment of a bond.</p>
<li style="text-align: justify; ">Intermediaries should be allowed to charge for the time and expense associated with processing legal requests<br />For an intermediary, it is time-consuming and relatively expensive to understand the obligations that each country’s legal regime imposes, and to accurately determine how each legal request should be handled. Especially for intermediaries without many resources, such as forum operators or owners of home Wi-Fi networks, the costs associated with being an intermediary can be prohibitive. Therefore, it should be within their rights to charge for their compliance costs if they are either below a certain user threshold or can show financial necessity in some way.</li>
<li style="text-align: justify; ">Legal requirements imposed on intermediaries should be a floor, not a ceiling: ISPs can adopt more restrictive policies to more effectively serve their users as long as they have published policies that explain what they are doing<br />The Internet has space for a wide range of platforms and applications directed to different communities, with different needs and desires. A social networking site directed at children, for example, may reasonably want to have policies that are much more restrictive than a political discussion board. Therefore, legal requirements that compel intermediaries to take down content should be seen as a ‘floor,’ but not a ‘ceiling,’ on the range and quantity of content that those intermediaries may remove. Intermediaries should retain control over their own policies as long as they are transparent about what those policies are, what type of content the intermediary removes, and why they removed certain pieces of content.</li>
<h3 style="text-align: justify; ">Principle VI: Privacy</h3>
<li style="text-align: justify; ">It is important to protect the ability of Internet users to speak by narrowing, and making less ambiguous, the range of content that intermediaries can be held liable for, but it is also very important to make users feel comfortable sharing their views by ensuring that their privacy is protected. Protecting the user’s ability to share her views, especially when those views are controversial or have a direct bearing on important political issues, requires that the user can trust the intermediaries that she uses. This concept can be further broken down into three sub-principles:</li>
<li style="text-align: justify; ">The user’s personal information should be protected to the greatest extent possible given the state of the art in encryption, security, and policy<br />Users will be less willing to speak on important topics if they have legitimate concerns that their data may be taken from them. As stated in the UNESCO Report, “[b]ecause of the amount of personal information held by companies and ability to access the same, a company’s practices around collection, access, disclosure, and retention are key. To a large extent a service provider’s privacy practices are influenced by applicable law and operating licenses required by the host government. These can include requirements for service providers to verify subscribers, collect and retain subscriber location data, and cooperate with law enforcement when requested. Outcome: The implications of companies trying to balance a user’s expectation for privacy with a government’s expectation for cooperation can be serious and are inadequately managed in all jurisdictions studied.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt26"><sup>[26]</sup></a></li>
<li style="text-align: justify; ">Where possible, ISPs should help to preserve the user’s right to speak anonymously<br />An important aspect of an Internet user’s ability to exercise her right to free expression online is the ability to speak anonymously. Anonymous speech is one of the great advances of the Internet as a communications medium and should be preserved to the extent possible. As noted by Special Rapporteur Frank La Rue, “[i]n order for individuals to exercise their right to privacy in communications, they must be able to ensure that these remain private, secure and, if they choose, anonymous. Privacy of communications infers that individuals are able to exchange information and ideas in a space that is beyond the reach of other members of society, the private sector, and ultimately the State itself. Security of communications means that individuals should be able to verify that only their intended recipients, without interference or alteration, receive their communications and that the communications they receive are equally free from intrusion. Anonymity of communications is one of the most important advances enabled by the Internet, and allows individuals to express themselves freely without fear of retribution or condemnation.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt27"><sup>[27]</sup></a></li>
<li style="text-align: justify; ">The user’s PII should never be sold or used without her consent, and she should always know what is being done with it via an easily comprehensible dashboard<br />The user’s trust in the online platform that she uses and relies upon is influenced not only by the relationships the intermediary maintains with the government, but also by those it maintains with other commercial entities. A user who feels that her data will be constantly shared with third parties, perhaps without her consent and/or for marketing purposes, will never feel able to freely express her opinion. Therefore, it is the intermediary’s responsibility to ensure that its users know exactly what information it retains about them, who it shares that information with and under what circumstances, and how to change the way that data is shared. All of this information should be available on a dashboard that is comprehensible to the average user, and which gives her the ability to easily modify or withdraw her consent to the way her data is being shared, or to the amount of data, or specific data, that the intermediary retains about her.</li>
<h3 style="text-align: justify; ">Principle VII: Access to Remedy</h3>
<li style="text-align: justify; ">As noted in the UNESCO Report, “Remedy is the third central pillar of the UN Guiding Principles on Business and Human Rights, placing an obligation both on governments and on companies to provide individuals access to effective remedy. This area is where both governments and companies are almost consistently lacking. Across intermediary types, across jurisdictions and across the types of restriction, individuals whose content is restricted and individuals who wish to access such content are offered little or no effective recourse to appeal restriction decisions, whether in response to government orders, third party requests or in accordance with company policy. There are no private grievance or due process mechanisms that are clearly communicated and readily available to all users, or consistently applied.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt28"><sup>[28]</sup></a></li>
<p style="text-align: justify; ">Any notice and takedown system is subject to abuse, and any company policy that results in the removal of content is subject to mistaken or inaccurate takedowns. Both are substantial problems that can only be remedied if users are able to tell the intermediary when it has improperly removed a specific piece of content, and if the intermediary has the technical and procedural ability to put that content back. However, the technical ability to reinstate content that was improperly removed may conflict with data retention laws; this conflict should be explored in more detail. In general, however, every time content is removed, there should be:</p>
<li style="text-align: justify; ">A clear mechanism through which users can request reinstatement of content<br />When an intermediary decides to remove content, it should be immediately clear to the user that content has been removed and why it was removed (see discussion of in-product notice, supra). If the user disagrees with the content removal decision, there should be an obvious, online method for her to request reinstatement of the content.</li>
<li style="text-align: justify; ">Reinstatement of content should be technically possible<br />When intermediaries (who are subject to intermediary liability) are building new products, they should build the capability to remove content into the product with a high degree of specificity so as to allow for narrowly tailored content removals when a removal is legally required. Relatedly, all online intermediaries should build the capability to reinstate content into their products while maintaining compliance with data retention laws.</li>
<li style="text-align: justify; ">Intermediaries should have policies and procedures in place to handle reinstatement requests<br />Between the front end (online mechanism to request reinstatement of content) and the backend (technical ability to reinstate content) is the necessary middle layer, which consists of the intermediary’s internal policies and processes that allow for valid reinstatement requests to be assessed and acted upon. In line with the corporate ‘responsibility to respect’ human rights, and considered along with the human rights principle of ‘access to remedy,’ intermediaries should have a system in place from the time that an online product launches to ensure that reinstatement requests can be made and will be processed quickly and appropriately.</li>
<h3 style="text-align: justify; ">Principle VIII: Accountability</h3>
<li style="text-align: justify; ">Governments must ensure that independent, transparent, and impartial accountability mechanisms exist to verify the practices of government and companies with regards to managing content created online<br />“While it is important that companies make commitments to core principles on freedom of expression and privacy, make efforts to implement those principles through transparency, policy advocacy, and human rights impact assessments, it is also important that companies take these steps in a manner that is accountable to stakeholders. One way of doing this is by committing to external third party assurance to verify that their policies and practices are being implemented to a meaningful standard, with acceptable consistency wherever their service is offered. Such assurance gains further public credibility when carried out with the supervision and affirmation of multiple stakeholders including civil society groups, academics, and responsible investors. The Global Network Initiative provides one such mechanism for public accountability. Companies not currently participating in GNI, or a process of similar rigor and multi-stakeholder involvement, should be urged by users, investors, and regulators to do so.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt29"><sup>[29]</sup></a></li>
<li style="text-align: justify; ">Civil society should encourage comparative studies between countries and between ISPs with regards to their content removal practices to identify best practices<br />Civil society has the unique ability to look longitudinally across this issue to determine and compare how different intermediaries and governments are responding to content removal requests. Without information about how other governments and intermediaries are handling these issues, it will be difficult for each government or intermediary to learn how to improve its laws or policies. Therefore, civil society has an important role to play in the process of creating increasingly better human rights outcomes for online platforms by performing and sharing ongoing, comparative research.</li>
<li style="text-align: justify; ">Civil society should establish best practices and benchmarks against which ISPs and government can be measured, and should track governments and ISPs over time in public reports<br />“A number of projects that seek, define and implement indicators and benchmarks for governments or companies are either in development (examples include: UNESCO’s Indicators of Internet Development project examining country performance, Ranking Digital Rights focusing on companies) or already in operation (examples include the Web Foundation’s Web Index, Freedom House’s Internet Freedom Index, etc.). The emergence of credible, widely-used benchmarks and indicators that enable measurement of country and company performance on freedom of expression will help to inform policy, practice, stakeholder engagement processes, and advocacy.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt30"><sup>[30]</sup></a></li>
<h3 style="text-align: justify; ">Principle IX: Due Process - In Both Legal and Private Enforcement</h3>
<li style="text-align: justify; ">ISPs should always consider context before removing content and Governments and courts should always consider context before ordering that certain content be removed<br />“Governments need to ensure that legal frameworks and company policies are in place to address issues arising out of intermediary liability. These legal frameworks and policies should be contextually adapted and be consistent with a human rights framework and a commitment to due process and fair dealing. Legal and regulatory frameworks should also be precise and grounded in a clear understanding of the technology they are meant to address, removing legal uncertainty that would provide opportunity for abuse.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt31"><sup>[31]</sup></a></li>
<li style="text-align: justify; ">Principles for Courts</li>
<p style="text-align: justify; ">An independent and impartial judiciary exists, at least in part, to preserve the citizen’s due process rights. Many have called for an increased reliance on courts to make determinations about the legality of content posted online, both to shift the censorship function away from unaccountable private actors and to ensure that courts only order the removal of content that is actually unlawful. However, when courts do not have an adequate technical understanding of how content is created and shared on the internet, of the rights of the intermediaries that facilitate the posting of that content, and of who should be ordered to remove unlawful content, they do not add value to the online ecosystem. Therefore, courts should keep certain principles in mind to preserve the due process rights of the users that post content and the intermediaries that host it.</p>
<li style="text-align: justify; ">Preserve due process for intermediaries: do not order them to do something before giving them notice and the opportunity to appear before the court</li>
<p style="text-align: justify; ">In a dispute between two private parties over a specific piece of content posted online, it may appear to the court that the easy solution is to order the intermediary that hosts the content to remove it. However, this approach does not extend any due process protections to the intermediary and does not adequately reflect the intermediary's status as something other than the creator of the content. If a court feels that it is necessary for an intermediary to intervene in a legal proceeding between two private parties, the court should provide the intermediary with proper notice and give it the opportunity to appear before the court before issuing any orders.</p>
<li style="text-align: justify; ">Necessity and proportionality of judicial determinations: judicial orders determining the illegality of specific content should be narrowly tailored to avoid over-removal of content</li>
<p style="text-align: justify; ">With regards to government removal requests, the UNESCO Report notes that “[o]ver-broad law and heavy liability regimes cause intermediaries to over-comply with government requests in ways that compromise users’ right to freedom of expression, or broadly restrict content in anticipation of government demands even if demands are never received and if the content could potentially be found legitimate even in a domestic court of law.”<a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt32"><sup>[32]</sup></a> Courts should follow the same principle: only order the removal of the bare minimum of content that is necessary to remedy the harm identified and nothing more.</p>
<li style="text-align: justify; ">Courts should clarify whether ISPs have to remove content in response to court orders directed to third parties, or only have to remove content when directly ordered to do so (first party court orders) after an adversarial proceeding to which the ISP was a party</li>
<p style="text-align: justify; ">See discussion of the difference between first party and third party court orders (supra, section a., “Transparency”). Ideally, any decision that courts reach on this issue would be consistent across different countries.</p>
<li style="text-align: justify; ">Questions: related unresolved issues that should be kicked to the larger group</li>
<li style="text-align: justify; ">How should the conflict between access to remedy and data retention laws that say content must be hard deleted after a certain period of time be resolved? I think access to remedy has to be subordinated to the data retention laws. Let's make that our draft position, but continue to flag it for discussion.</li>
<li style="text-align: justify; ">Should ISPs have to remove content in response to court orders directed to third parties, or only have to remove content when directly ordered to do so (first party court orders) after an adversarial proceeding to which the ISP was a party? I think first party orders. Let's make that our draft position, but continue to flag it for discussion.</li>
<hr style="text-align: justify; " />
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref1">[1]</a> Center for Democracy and Technology, Shielding the Messengers: Protecting Platforms for Expression and Innovation at 4-15 (Version 2, 2012), available at <a href="https://www.google.com/url?q=https%3A%2F%2Fwww.cdt.org%2Ffiles%2Fpdfs%2FCDT-Intermediary-Liability-2012.pdf&sa=D&sntz=1&usg=AFQjCNHNG5ji0HEiYXyelfwwK8qTCgOHiw">https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf</a> (see pp.4-15 for an explanation of these different models and the pros and cons of each).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref2">[2]</a> UNESCO, “Fostering Freedom Online: The Roles, Challenges, and Obstacles of Internet Intermediaries” at 6-7 (Draft Version, June 16th, 2014) (Hereinafter “UNESCO Report”).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref3">[3]</a> UNESCO Report at 56.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref4">[4]</a> UNESCO Report at 37.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref5">[5]</a> Center for Democracy and Technology, Additional Responses Regarding Notice and Action, available at https://www.cdt.org/files/file/CDT%20N&A%20supplement.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref6">[6]</a> The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information, Article 19, Global Campaign for Free Expression, and the Centre for Law and Democracy, JOINT DECLARATION ON FREEDOM OF EXPRESSION AND THE INTERNET at 2 (2011), available at <a href="http://www.google.com/url?q=http%3A%2F%2Fwww.osce.org%2Ffom%2F78309&sa=D&sntz=1&usg=AFQjCNF8QmlhRMreM_BT0Eyfrw_J7ZdTGg">http://www.osce.org/fom/78309</a> (Hereinafter “Joint Declaration on Freedom of Expression).</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref7">[7]</a> Center for Democracy and Technology, Additional Responses Regarding Notice and Action, available at https://www.cdt.org/files/file/CDT%20N&A%20supplement.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref8">[8]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref9">[9]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref10">[10]</a> UNESCO Report at 113-14.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref11">[11]</a> ‘Chilling Effects’ is a website that allows recipients of ‘cease and desist’ notices to submit the notice to the site and receive information about their legal rights. For more information about ‘Chilling Effects’ see: http://www.chillingeffects.org.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref12">[12]</a> Id. at 73. You can see an example of a complaint published on Chilling Effects at the following location. “DtecNet DMCA (Copyright) Complaint to Google,” Chilling Effects Clearinghouse, March 12, 2013, www.chillingeffects.org/notice.cgi?sID=841442.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref13">[13]</a> UNESCO Report at 73.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref14">[14]</a> Article 19, Internet Intermediaries: Dilemma of Liability (2013), available at http://www.article19.org/data/files/Intermediaries_ENGLISH.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref15">[15]</a> UNESCO Report at 120.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref16">[16]</a> Joint Declaration on Freedom of Expression and the Internet at 2.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref17">[17]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref18">[18]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref19">[19]</a> Id. at 121.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref20">[20]</a> Id. at 104.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref21">[21]</a> Id. at 122.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref22">[22]</a> Center for Democracy and Technology, Shielding the Messengers: Protecting Platforms for Expression and Innovation at 12 (Version 2, 2012), available at https://www.cdt.org/files/pdfs/CDT-Intermediary-Liability-2012.pdf.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref23">[23]</a> Joint Declaration on Freedom of Expression at 2-3.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref24">[24]</a> Geist, Michael, Rogers Provides New Evidence on Effectiveness of Notice-and-Notice System (2011), available at http://www.michaelgeist.ca/2011/03/effectiveness-of-notice-and-notice/</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref25">[25]</a> Center for Democracy and Technology, Chile’s Notice-and-Takedown System for Copyright Protection: An Alternative Approach (2012), available at https://www.cdt.org/files/pdfs/Chile-notice-takedown.pdf</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref26">[26]</a> UNESCO Report at 54.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref27">[27]</a> “Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, Frank La Rue (A/HRC/23/40),” United Nations Human Rights, 17 April 2013, http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A.HRC.23.40_EN.pdf, § 24, p. 7.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref28">[28]</a> UNESCO Report at 118.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref29">[29]</a> UNESCO Report at 122.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref30">[30]</a> Id.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref31">[31]</a> UNESCO Report at 120.</p>
<p style="text-align: justify; "><a href="https://docs.google.com/document/d/1S3pSuo49pqI7gIxP0-ogmVstk7EEnPRs2MPX7ncxrmc/pub#ftnt_ref32">[32]</a> Id. at 119.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper'>http://editors.cis-india.org/internet-governance/blog/zero-draft-of-content-removal-best-practices-white-paper</a>
</p>
No publisher | jyoti | Internet Governance | Intermediary Liability | 2014-09-10T07:11:09Z | Blog Entry
You Have the Right to Remain Silent
http://editors.cis-india.org/internet-governance/blog/down-to-earth-july-17-2013-nishant-shah-you-have-the-right-to-remain-silent
<b>Reflecting upon the state of freedom of speech and expression in India, in the wake of the shut-down of the political satire website narendramodiplans.com.</b>
<hr />
<p style="text-align: justify; ">Nishant Shah's <a class="external-link" href="http://www.downtoearth.org.in/content/you-have-right-remain-silent">column was published in Down to Earth</a> on July 17, 2013.</p>
<hr />
<p style="text-align: justify; ">It took less than a day for narendramodiplans.com, a political satire website that drew more than 60,000 hits in the 20 hours of its existence, to be taken down. The site was a simple webpage showing a smiling picture of Narendra Modi, the touted candidate for India’s next Prime Ministerial campaign, flashing his now trademark ‘V’ for <span><s>Vengeance</s> </span> Victory sign. At first glance it looked like another smart media campaign by the net-savvy minister, who has already made use of the social web quite effectively to connect with his constituencies and influence the younger voting population in the country. Below the image of Mr. Modi was a text that said, "For a detailed explanation of how Mr. Narendra Modi plans to run the nation if elected to the house as a Prime Minister and also for his view/perspective on 2002 riots please click the link below." The button, reminiscent of 'sale' signs on shops that offer permanent discounts, promised to reveal, once and for all, the puppy plight of Mr. Modi's politics and his plans for the country that he seeks to lead.</p>
<p style="text-align: justify; ">However, when one tried to click on the button, hoping at least for a manifesto that combined the powers of Machiavelli with the sinister beauty of Kafka, it proved to be an impossible task. The button wiggled, and jiggled, and slithered all over the page, running away from the mouse chasing it. Referencing the layers of evasive answers and the engineered public relations campaigns that try to obfuscate responses to some of the most pointed questions posed to the Modi government through judicial and public forums, the button never stayed still long enough to actually reveal the promised answers. People familiar with the history of such political satire and protest online would immediately recognise that this wasn’t the most original of ideas. In fact, it was borrowed from another website - <a href="http://www.thepmlnvision.com/" title="http://www.thepmlnvision.com/">http://www.thepmlnvision.com/</a> - that levelled similar accusations of lack of transparency and accountability at Nawaz Sharif of Pakistan. Another instance, which is now also shut down, had a similar deployment: the webpage claimed to give a comprehensive view of Rahul Gandhi’s achievements, to question his proclaimed intention of being the next prime minister. In short, this is an internet meme, where a simple web page and a bit of JavaScript allow for a critical commentary on the coming elections and the strengthening battle between #feku and #pappu that has already taken epic proportions on Twitter.</p>
<p style="text-align: justify; ">The early demise of these two websites (please do note, when you click on the links, that the Nawaz Sharif website is still working) warns us of the tightening noose around freedom of speech and expression for which politicos in India are responsible. It has been a dreary last couple of years already, with the passing of the <a href="http://www.downtoearth.org.in/content/cis-india.org/internet-governance/intermediary-liability-in-india" target="_blank">Intermediaries Liabilities Rules</a> as an amendment to the IT Act of India, <a href="http://www.indianexpress.com/news/spy-in-the-web/888509/1" target="_blank">Dr. Sibal proposing to pre-censor the social web</a> in a quest to save the face of erring political figures,<a href="http://www.indianexpress.com/news/two-girls-arrested-for-facebook-post-questioning-bal-thackeray-shutdown-of-mumbai-get-bail/1033177/" target="_blank"> teenagers being arrested for voicing political dissent</a>, and <a href="http://en.wikipedia.org/wiki/Aseem_Trivedi" target="_blank">artists being prosecuted</a> for exercising their rights to question the state of governance in our country. Despite battles to keep the web an open space that embodies the democratic potentials and the constitutional rights of freedom of speech and expression in the country, it has been a losing fight to keep up with the ad hoc and dictatorial mandates that seem to govern the web.</p>
<table class="invisible">
<tbody>
<tr>
<th><img src="http://editors.cis-india.org/home-images/Namo.png" alt="Narendra Modi Plans" class="image-inline" title="Narendra Modi Plans" /></th>
</tr>
<tr>
<td>Above is a screen shot from narendramodiplans.com website</td>
</tr>
</tbody>
</table>
<p style="text-align: justify; ">We have no indication of why this latest piece of satirical expression, which should be granted immunity as a work of art, if not as an individual’s right to free speech, was suddenly taken down. The website now has a message that says, “I quit. In a country with freedom of speech, I assumed that I was allowed to make decent satire on any politician more particularly if it is constructive. Clearly, I was wrong.” The web is already abuzz with conspiracy theories, each sounding scarier than the last because they seem so plausible and possible in a country that has easily sacrificed our right to free speech and expression at the altar of political egos. And whether you subscribe to any of the theories or not, whether your sympathies lie with the BJP or with the UPA, whether or not you approve of the political directions that the country seems to be headed in, there is no doubt that you should be as agitated as I am about the fact that we are in a fast car to blanket censorship, and we are going there in style.</p>
<p style="text-align: justify; ">What happens online is not just about this one website or the one person or the one political party – it is a reflection of the rising surveillance and bully state that presumes that making voices (and sometimes people) invisible is enough to resolve the problems they create. And what happens on the web is soon going to affect the ways in which we live our everyday lives. So the next time you call some friends over for dinner and then sit arguing about the state of politics in the country, make sure your windows are all shut and you are wearing tin-foil hats, and if possible, direct all conversations to the task of finally <a href="http://bollywoodjournalist.com/2013/07/08/desperately-seeking-mamta-kulkarni/" target="_blank">finding Mamta Kulkarni</a>. Because anything else that you say might either be censored or land you in a soup, and the only recourse you might have would be a website that shows the glorious political figures of the country, with a sign that says “To defend your right to free speech and expression, please click here”. And you know that you are never going to be able to click on that sign. Ever.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/down-to-earth-july-17-2013-nishant-shah-you-have-the-right-to-remain-silent'>http://editors.cis-india.org/internet-governance/blog/down-to-earth-july-17-2013-nishant-shah-you-have-the-right-to-remain-silent</a>
</p>
No publisher | nishant | Freedom of Speech and Expression | Social Media | Internet Governance | Intermediary Liability | 2013-07-22T06:59:53Z | Blog Entry
Why should we care about takedown timeframes?
http://editors.cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes
<b>The issue of content takedown timeframes - the time period an intermediary is allotted to respond to a legal takedown order - has received considerably less attention in conversations about intermediary liability. This article examines the importance of framing an appropriate timeframe to ensure that speech online is not over-censored, and offers recommendations to that end.</b>
<p><em>This article first <a class="external-link" href="https://cyberbrics.info/why-should-we-care-about-takedown-timeframes/">appeared</a> on the CyberBRICS website. It has since been <a class="external-link" href="https://www.medianama.com/2020/04/223-content-takedown-timeframes-cyberbrics/">cross-posted</a> to Medianama.</em></p>
<p><em>The findings and opinions expressed in this article are derived from the larger research report 'A deep dive into content takedown timeframes', which can be accessed <a class="external-link" href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">here</a>.</em></p>
<p><strong>Introduction</strong></p>
<p>Since the Ministry of Electronics and Information Technology (MeitY) proposed the draft amendments to the intermediary liability guidelines in December 2018, speculation regarding their potential effects has been plentiful. This has included <a class="external-link" href="http://www.medianama.com/2020/01/223-traceability-accountability-necessary-intermediary-liability/">mapping</a> the requirement of traceability of originators against its potential chilling effect on free speech online, and <a class="external-link" href="http://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">critiquing</a> the proactive filtering requirement as potentially leading to censorship.</p>
<p>One aspect, however, that has received less attention is encoded within Rule 3(8) of the draft amendments. By virtue of that rule, the time limit given to intermediaries to respond to a legal content takedown request (“turnaround time”) has been reduced from 36 hours (as it was in the older version of the rules) to 24 hours. In essence, an intermediary faced with a takedown order from the government or a court would now have to remove the content in question within 24 hours of receiving the notice.</p>
<p>Why is this important? Consider this: the <a class="external-link" href="http://indiacode.nic.in/bitstream/123456789/1999/3/A2000-21.pdf">definition</a> of an ‘intermediary’ within Indian law encompasses a vast range of entities – cyber cafes, online marketplaces, internet service providers and more. Governance of any intermediary liability norms would accordingly require varying levels of regulation, each recognizing the different composition of these entities. In that light, the content takedown requirement, and specifically the turnaround time, becomes problematic. Not only would many of the entities covered by the definition find it practically impossible to implement this obligation because of their technical architecture; the obligation also erases the nuances that exist among the entities that would actually fall within its scope.</p>
<p>Each category of online content, and more importantly, each category of intermediary is different, and any content takedown requirement must appreciate these differences. A smaller intermediary may find it more difficult to adhere to a stricter, shorter timeframe than an incumbent. A piece of ‘terrorist’ content may need to be treated with more urgency than something that is defamatory. These contextual cues are critical, and must accordingly be incorporated in any law on content takedown.</p>
<p>While making our submissions on the draft amendments, we found that there was no research from the government justifying the shortened turnaround time, nor was there any literature that treated turnaround timeframes as a critical point of regulation of intermediary liability. Accordingly, I share some findings from our research in the subsequent sections, which throw light on nuances that must be considered before proposing any content takedown timeframe. It is important to note that our research has not yet established what an appropriate turnaround time would be in a given situation. However, the following findings will hopefully start a preliminary conversation that may ultimately lead us to the right answer.</p>
<p><strong>What to consider when regulating takedown time-frames?</strong></p>
<p>I classify the findings from our research into a chronological sequence: a) broad legal reforms, b) correct identification of scope and extent of the law, c) institution of proper procedural safeguards, and d) post-facto review of the time-frame for evidence based policy-making.</p>
<p><em>1. Broad legal reforms: Harmonize the law on content takedown.</em></p>
<p>The Indian law on content takedown is administered through two different provisions of the Information Technology (IT) Act, each with its own legal procedure and scope. While the 24-hour turnaround time would apply to the procedure under one of them, there would continue to <a class="external-link" href="http://cis-india.org/internet-governance/resources/information-technology-procedure-and-safeguards-for-blocking-for-access-of-information-by-public-rules-2009">exist</a> a completely different legal procedure under which the government could still effectuate content takedown. Under the latter, intermediaries are given a 48-hour timeframe to respond to a government request with clarifications (if any).</p>
<p>Such differing procedures contribute to a confusing legal ecosystem surrounding content takedown, leading to arbitrariness in how Indian users experience internet censorship. Accordingly, it is important to harmonize the existing law so that its procedures and safeguards are seamless, and the regulatory process of content takedown is streamlined.</p>
<p><em>2. Correct identification of scope and extent of the law: Design a liability framework on the basis of the differences in the intermediaries, and the content in question.</em></p>
<p>As I have highlighted before, regulation of illegal content online cannot be <a class="external-link" href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/">one-size-fits-all</a>. Accordingly, a good law on content takedown must account for the nuances existing in the way intermediaries operate and the diversity of speech online. More specifically, there are two levels of classification that are critical.</p>
<p><em>One</em>, the law must make a fundamental classification between the intermediaries within its scope. An obligation to remove illegal content can be implemented only by those entities whose technical architecture allows them to do so. While a search engine would be able to delink websites that are declared ‘illegal’, it would be absurd to expect a cyber cafe to follow a similar route of responding to a legal takedown order within a specified timeframe.</p>
<p>Therefore, one basis of classification must incorporate this difference in the technical architecture of these intermediaries. Apart from this, the law must also design liability for intermediaries on the basis of their user-base, annual revenue generated, and the reach, scope and potential impact of the intermediary’s actions.</p>
<p><em>Two, </em>it is important that the law recognizes that certain types of content require more urgent treatment than others. Several regulations across jurisdictions, including the NetzDG and the EU Regulation on Preventing the Dissemination of Terrorist Content Online, while problematic in their own right, attempt to either limit their scope of application or frame liability based on the nature of the content targeted.</p>
<p>The Indian law, on the other hand, encompasses within its scope a vast and varied array of ‘illegal’ content, which includes, on one hand, critical items like threats to ‘the sovereignty and integrity of India’ and, on the other, more subjective speech elements like ‘decency or morality’. While an expedited timeframe may be permissible for the former category of speech, it is difficult to justify for the latter. More contextual judgment may be needed to assess the legality of content alleged to be defamatory or obscene, making a shorter timeframe problematic for such content.</p>
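<p>The tiered approach argued for above can be illustrated with a minimal sketch. Note that this is purely hypothetical: the draft rules prescribe a flat 24-hour window, and the categories and hour values below are invented for illustration, not drawn from any law.</p>

```python
from datetime import datetime, timedelta

# Hypothetical turnaround windows, for illustration only: more
# subjective categories get longer review windows than critical ones.
TURNAROUND_HOURS = {
    "security_threat": 24,   # e.g. 'sovereignty and integrity of India'
    "defamation": 96,        # contextual judgment needed
    "obscenity": 96,
}

def takedown_deadline(category: str, received_at: datetime) -> datetime:
    """Return the compliance deadline for a takedown notice."""
    hours = TURNAROUND_HOURS.get(category, 72)  # assumed default window
    return received_at + timedelta(hours=hours)

notice = datetime(2020, 1, 10, 9, 0)
print(takedown_deadline("security_threat", notice))  # 2020-01-11 09:00:00
print(takedown_deadline("defamation", notice))       # 2020-01-14 09:00:00
```

<p>The point of the sketch is that the deadline becomes a function of the content category (and could equally be made a function of intermediary size), rather than a single constant.</p>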
<p><em>3. Institution of proper procedural safeguards: Make notices mandatory and make sanctions gradated</em>.</p>
<p>Apart from the correct identification of scope and extent, it is important that there are sufficient procedural safeguards to ensure that the interests of the intermediaries and the users are not curtailed. While these may seem ancillary to the main point, how the law chooses to legislate on these issues (or does not), nevertheless has a direct bearing on the issue of content takedown and time-frames.</p>
<p>Firstly, while the Indian law mandates content takedown, it does not mandate a process through which a user is notified of such an action. The mere fact that an incumbent intermediary is able to respond to removal notifications within a specified timeframe does not imply that its actions have no ramifications for free speech. The ability to take down content does not translate into accuracy of the action taken, and the Indian law fails to take this into account.</p>
<p>Therefore, an additional obligation to inform users when their content has been taken down institutes due process in the procedure. In the context of legal takedown, such notice mechanisms also <a class="external-link" href="http://www.eff.org/wp/who-has-your-back-2019">empower</a> users to draw attention to government censorship and targeting.</p>
<p>Secondly, a uniform timeframe of compliance, coupled with severe sanctions, distorts competition to the detriment of smaller intermediaries. While the current law does not clearly elaborate on the nature of the sanctions that would be imposed, general principles of the doctrine of safe harbour dictate that upon failure to remove the content, the intermediary would be subject to the same level of liability as the person uploading it. This threat of sanctions may have adverse effects on free speech online, resulting in potential <a class="external-link" href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf">over-censorship</a> of legitimate speech.</p>
<p>Accordingly, sanctions should be restricted to instances of systematic violations. For critical content, the contours of what constitutes systematic violation may differ. The regulator must accordingly take into account the nature of content which the intermediary failed to remove, while assessing their liability.</p>
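<p>One way to picture "gradated" sanctions limited to systematic violations is the following sketch. The thresholds and category names are entirely illustrative assumptions, not anything the draft rules specify.</p>

```python
# Hypothetical sketch: liability attaches only once missed deadlines
# become systematic, with stricter thresholds for critical content.
SYSTEMATIC_THRESHOLDS = {
    "security_threat": 2,   # little tolerance for critical content
    "defamation": 10,       # more leeway for contextual categories
}

def liability_triggered(category: str, missed_deadlines: int) -> bool:
    """Return True once failures for a category become systematic."""
    return missed_deadlines >= SYSTEMATIC_THRESHOLDS.get(category, 5)

print(liability_triggered("security_threat", 1))  # False
print(liability_triggered("defamation", 10))      # True
```

<p>The design choice captured here is that a single missed deadline on subjective content does not strip safe harbour, while repeated failures on critical content quickly do.</p>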
<p><em>4. Post-facto review of the time-frame for evidence based policy-making: Mandate transparency reporting.</em></p>
<p>Transparency reporting, apart from ensuring accountability of intermediary action, is also a useful tool for understanding the impact of the law, specifically in relation to response timeframes. The NetzDG, for all its criticism, has received <a class="external-link" href="https://www.article19.org/wp-content/uploads/2017/09/170901-Legal-Analysis-German-NetzDG-Act.pdf">support</a> for requiring intermediaries to produce bi-annual transparency reports. These reports provide important insight into the efficacy of any proposed turnaround time, which in turn helps us propose more nuanced reforms to the law.</p>
<p>However, to extract the most useful information from these reports, it is important that reporting practices be standardized. There exists some international body of work proposing methodologies for standardizing transparency reports, including the Santa Clara Principles and the Electronic Frontier Foundation’s (EFF) ‘Who has your back?’ reports. We have also previously proposed a methodology that builds on some of these pointers.</p>
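<p>A standardized transparency report is, at bottom, a fixed record schema that every intermediary fills in the same way. The field names below are an illustrative assumption, loosely inspired by the Santa Clara Principles' emphasis on numbers, notice and appeal, not an actual published schema.</p>

```python
from dataclasses import dataclass, asdict

# Hypothetical minimal schema for one transparency-report entry.
@dataclass
class TakedownReportEntry:
    period: str               # reporting period, e.g. "2019-H2"
    legal_basis: str          # provision invoked in the request
    requests_received: int
    items_removed: int
    median_response_hours: float
    users_notified: int
    appeals_received: int

entry = TakedownReportEntry(
    period="2019-H2",
    legal_basis="IT Act takedown order",
    requests_received=120,
    items_removed=95,
    median_response_hours=30.5,
    users_notified=95,
    appeals_received=12,
)
print(asdict(entry)["items_removed"])  # 95
```

<p>With every intermediary reporting the same fields, a regulator could compare median response times across platforms and test whether a proposed turnaround time is actually achievable.</p>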
<p>Additionally, given the experimental nature of the provision, including a review clause in the law would ensure that the efficacy of the exercise can be periodically assessed. If the discussion in the preceding section is any indication, the issue of an appropriate turnaround time is currently in regulatory flux, with no correct answer. In such a scenario, periodic assessments compel policymakers and stakeholders to discuss the effectiveness of solutions and the nature of the problems faced, leading to <a class="external-link" href="http://www.livemint.com/Opinion/svjUfdqWwbbeeVzRjFNkUK/Making-laws-with-sunset-clauses.html">evidence-based</a> policymaking.</p>
<p><strong>Why should we care?</strong></p>
<p>There is a lot at stake in regulating any aspect of intermediary liability, and a lack of smart policymaking may harm the interests of any of the stakeholder groups involved. As the submissions on the draft amendments by various civil society and industry groups show, the updated turnaround time suffers from issues which, if not addressed, may lead to over-removal and a lack of due process in the content removal procedure.</p>
<p>Among others, these submissions pointed out that the shortened timeframe did not allow intermediaries sufficient time to scrutinize a takedown request to ensure that all technical and legal requirements are adhered to. This, in turn, may also prompt third-party action against users. Additionally, the significantly short timeframe raised several implementation challenges. For smaller companies with fewer employees, such a timeframe can be burdensome from both a financial and a capability point of view. This, in turn, may result in over-censorship of speech online.</p>
<p>Failing to recognize and incorporate contextual nuances into any law on intermediary liability therefore, may critically alter the way we interact with online intermediaries, and in a larger scheme, with the internet.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes'>http://editors.cis-india.org/internet-governance/blog/why-should-we-care-about-takedown-timeframes</a>
</p>
No publisher | TorShark | Content takedown | Intermediary Liability | Chilling Effect | 2020-04-10T04:58:56Z | Blog Entry
Webinar on the draft Intermediary Guidelines Amendment Rules
http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules
<b>CCAOI and the ISOC Delhi Chapter organised a webinar on January 10 to discuss the draft "The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018". Gurshabad Grover was a discussant in the panel.</b>
<p>The agenda of the discussion was:</p>
<ul>
<li>A brief introduction to the draft highlighting the key issues [Shashank Mishra]</li>
<li>Invited experts sharing their view on the paper and questions asked [Nehaa Chaudhari, Paul Brooks, Arjun Sinha, Gurshabad Grover]</li>
<li>Open Discussion Q&A</li>
<li>Summarizing the session</li>
</ul>
<div>A recording of the session can be <a class="external-link" href="https://livestream.com/internetsociety/intermediaryrules">accessed here</a>.</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules'>http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules</a>
</p>
No publisher | Admin | Freedom of Speech and Expression | Internet Governance | Intermediary Liability | 2019-01-18T02:13:23Z | News Item
Webinar on counter-comments to the draft Intermediary Guidelines
http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines
<b>CCAOI and the ISOC Delhi Chapter organised a webinar on February 11 to discuss the comments submitted to the Information Technology [Intermediary Guidelines (Amendment) Rules] 2018, and counter-comments that were due by February 14. </b>
<p>The agenda of the discussion was:</p>
<ul>
<li>A brief introduction to the counter comment process [Shashank Mishra]</li>
<li>Invited stakeholders comment on key issues and perspectives on the submissions and the points to be countered.</li>
</ul>
<p>The following people participated:</p>
<ul>
<li>Amba Kak, Mozilla</li>
<li>Rajesh Chharia, ISPAI</li>
<li>Gurshabad Grover, CIS</li>
<li>Priyanka Chaudhari, SFLC</li>
<li>Divij Joshi, Vidhi Centre for Legal Policy</li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines'>http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines</a>
</p>
No publisher | Admin | Internet Governance | Intermediary Liability | Information Technology | 2019-02-22T01:51:19Z | News Item
UN Special Rapporteur Report on Freedom of Expression and the Private Sector: A Significant Step Forward
http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward
<b>On 6 June 2016, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye, released a report on the Information and Communications Technology (“ICT”) sector and freedom of expression in the digital age. Vidushi Marda and Pranesh Prakash highlight the most important aspects of the report.</b>
<h2 dir="ltr">Background</h2>
<p dir="ltr">Today, the private sector is more closely linked to the freedom of expression than it has ever been before. The ability to speak to a mass audience was at one time a privilege restricted to those who had access to mass media. However, with digital technologies, that privilege is available to far more people than was ever possible in the pre-digital era. As private content created on these digital networks is becoming increasingly subject to state regulation, it is crucial to examine the role of the private sector in respect of the freedom of speech and expression.</p>
<p dir="ltr">The first foray by the Special Rapporteur into this broad area has resulted in a sweeping report that covers almost every aspect of freedom of expression within the ICT sector, except competition, which we elaborate on later in this post.</p>
<h2 dir="ltr">Introduction</h2>
<p dir="ltr">The report aims to “provide guidance on how private actors should protect and promote freedom of expression in a digital age”. It identifies the relevant international legal framework as Article 19 of the <a href="https://treaties.un.org/doc/Publication/UNTS/Volume%20999/volume-999-I-14668-English.pdf">International Covenant on Civil and Political Rights</a>, and Article 19 of the <a href="http://www.un.org/en/udhrbook/pdf/udhr_booklet_en_web.pdf">Universal Declaration of Human Rights</a>. The UN “Protect, Respect and Remedy” Framework and Guiding Principles, also known as the <a href="http://business-humanrights.org/sites/default/files/reports-and-materials/Ruggie-report-7-Apr-2008.pdf">Ruggie Principles</a> provide the framework for private sector responsibilities on business and human rights.</p>
<p dir="ltr">The report categorises the different roles of the private sector in organising, accessing, regulating and populating the internet. This is important because the manner in which the ICT sector affects the freedom of expression is far more complicated than in traditional communication industries. The report identifies the distinct impact on the freedom of expression of internet service providers, hardware and software companies, domain name registries and registrars, search engines, platforms, web hosting services, data brokers and e-commerce facilities.</p>
<h2>Legal and Policy Issues</h2>
<div>The Special Rapporteur discusses four distinct legal and policy issues that find relevance in respect of this problem statement: Content Regulation, Surveillance and Digital Security, Transparency and Remedies.</div>
<div> </div>
<h3>Content Regulation</h3>
<p dir="ltr">The report identifies two main channels through which content regulation takes place: the state, and internal processes.</p>
<p>Noting that digital content created on private networks is increasingly subject to State regulation, the report highlights the competing interests of intermediaries who manage platforms and States which demand regulation of this content on grounds of defamation, blasphemy, protection of national security, etc. This tension is demonstrated through vague laws that compel individuals and private corporations to over-comply and err on the side of caution “in order to avoid onerous penalties, filtering content of uncertain legal status and engaging in other modes of censorship and self-censorship.” Excessive intermediary liability forces intermediaries to over-comply with requests to ensure that local access to their platforms is not blocked. States also attempt to regulate content outside the law through extra-legal restrictions, pushing private actors to take down content on their own initiative. Filtering is another method, wherein States block and filter content through the private sector. Government blacklists, illegal-content designations and account suspensions are among the methods employed, and these have sometimes raised concerns of necessity and proportionality. <a href="http://scroll.in/article/807277/whatsapp-in-kashmir-when-big-brother-wants-to-go-beyond-watching-you">Network or service shutdowns</a> are classified as a “particularly pernicious” method of content regulation. Non-neutral networks are also a method of content regulation, with the possibility of internet service providers throttling traffic. Zero rating is a potential issue, although the report acknowledges that “it remains a subject of debate whether they may be permissible in areas genuinely lacking Internet access”.</p>
<p>The other node of content regulation has been identified as internal policies and practices of the private sector. <a href="https://consentofthenetworked.com/author/rebeccamackinnon/">Terms of service</a> restrictions are often tailored to the jurisdiction’s laws and policies and don’t always address the needs and interests of vulnerable groups. Further, the report notes, <a href="http://www.catchnews.com/tech-news/facebook-free-basics-gatekeeping-powers-extend-to-manipulating-public-discourse-1452077063.html">design and engineering choices</a> of how private players choose to curate content are algorithmically determined and increasingly control the information that we consume. </p>
<h3>Transparency</h3>
<div>The report notes that transparency enables entities subject to internet regulation to take informed decisions about their responsibilities and liabilities in the digital sphere, and points out that there is a severe lack of transparency about government requests to restrict or remove content. Some states even prohibit the publication of such information, India being one example. As for the private sector, content hosting platforms sometimes reveal the circumstances under which content is removed due to a government request, although this is rather erratic. The report recognises the need to balance transparency with competing concerns like security and trade secrecy, which remains a matter of continued debate.</div>
<div> </div>
<h3 dir="ltr">Surveillance and Digital Security</h3>
<p>Freedom of expression concerns arise as data transmitted on private networks is increasingly subjected to surveillance and interference by the State and private actors. The report finds that several internet companies have reported an increase in government requests for customer data and user information. According to the Special Rapporteur, effective resistance strategies include the inclusion of human rights guarantees, restrictive interpretation of government requests, and negotiation. Private players also make surveillance and censorship equipment that enables States to intercept communications. Covert surveillance has been previously reported, with States tapping into communications as and when necessary. When private entities become aware of interception and covert surveillance, their human rights responsibilities are engaged. As private entities work towards enhancing encryption, anonymity and user security, states respond by <a href="http://www.cnbc.com/2016/03/29/apple-vs-fbi-all-you-need-to-know.html">compelling companies</a> to create loopholes for them to circumvent such privacy- and security-enhancing technology.</p>
<h3 dir="ltr">Remedies</h3>
<p>Unlawful content removal, opaque suspensions and data security breaches are commonplace occurrences in the digital sphere. The ICCPR guarantees that all people whose rights have been violated must have an effective remedy, and similarly, the Ruggie Principles require corporations to provide remedial and grievance mechanisms. There is some ambiguity about how these complaint or appeal mechanisms should be designed and implemented, and their nature and structure is also unclear. The report states that it is necessary to investigate the role of the state in supplementing and regulating corporate mechanisms, its role in ensuring that a mechanism for remedies exists, and its responsibility to make sure that more easily and financially accessible alternatives for remedial measures exist.</p>
<h2> Special Rapporteur’s priorities for future work and thematic developments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Investigating laws, policies and extralegal measures that equip governments to impose restrictions on the provision of telecommunications and internet services. Examining the responsibility of companies to respond in a way that respects human rights, mitigates harm, and provides avenues for redress.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Evaluating content restrictions under terms of service and community standards. Private actors face substantial pressure from governments and individuals to restrict expression, and a priority is to evaluate the interplay of private and state actions on freedom of expression in light of human rights obligations and responsibilities.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Focusing on the legitimacy of rationales for intermediary liability for content hosting, restrictions, conditions for removing third party content.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Exploring censorship and surveillance within the human rights framework, and encouraging greater scrutiny before using these technologies for purposes that undermine the freedom of expression.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Identifying ways to balance an increasing scope of freedom of expression with the need to address governmental interests in national security and public order.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet access - Future work will explore issues around access and private sector engagement and investment in ensuring affordability and accessibility, particularly considering marginalized groups.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Internet governance - Ensuring that internet governance frameworks and reform efforts are sensitive to the needs of women, sexual minorities and other vulnerable communities. Throughout this future work, the Special Rapporteur will pay particular attention to legal developments (legislative, regulatory and judicial) at national and regional levels.</p>
</li></ol>
<div> </div>
<h2>Conclusions and Recommendations</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">States: The report recommends that states should not pressurise the private sector into interfering with the freedom of speech and expression in a manner that does not meet the conditions of the necessary and proportionate principles. Any request to take down content or access customer information must be based on validly enacted law, subject to oversight, and demonstrate necessary and proportionate means of achieving the aims laid down in Article 19(3) of the ICCPR.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Private Actors: The Special Rapporteur recommends that private actors develop and implement transparent human rights assessment procedures, and develop policies keeping in mind their human rights impact. Apart from this, private entities should integrate commitments to the freedom of expression into internal processes and ensure the “greatest possible transparency”.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">International Organisations: The report recommends that organisations make resources and educational material on internet governance publicly accessible. The Special Rapporteur also recommends encouraging meaningful civil society participation in multi-stakeholder policy making and standard setting processes, with an increased focus on sensitivity to human rights.</p>
</li></ol>
<div> </div>
<h2>CIS Comments</h2>
<ol><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS strongly agrees with the expansion of the Special Rapporteur’s scope that this report represents. He is no longer looking solely at states but at the private sector too.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">CIS also notes that competition is an important aspect of the freedom of expression, but has not been discussed in this report. Viable alternatives to platforms, networks, internet service providers etc., will ensure a healthy, competitive marketplace, and will have a positive impact in resolving the issues identified above.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr">Our <a href="http://cis-india.org/internet-governance/intermediary-liability-in-india.pdf/view">work</a> has called for maintaining a balanced approach to liability of intermediaries for their users’ actions, since excessive liability or strict liability would lead to over-caution and removal of legitimate speech, while having no liability at all would make it difficult to act effectively against harmful speech, e.g., revenge porn.</p>
</li><li style="list-style-type: decimal;" dir="ltr">
<p dir="ltr"><a href="http://cis-india.org/internet-governance/blog/cis-position-on-net-neutrality">CIS’ work</a> on network neutrality has highlighted the importance of neutrality for freedom of speech, and has advocated for an evidence-based approach that ensures there is neither under-regulation, nor over-regulation. The Special Rapporteur suggests that ‘Zero-Rating’ practices always violate Net Neutrality, but the majority of the definitions of Net Neutrality proposed by academics and followed by regulators across the world often do not include Zero-Rating. Similarly, he suggests that the main exception for Zero-Rating is for areas genuinely lacking access to the Internet, whereas the potential for some forms of Zero-Rating to further freedom of expression, especially of minorities, even in areas with access to the Internet, provides sufficient reason for the issue to merit greater debate.</p>
</li></ol>
<div> </div>
<div> </div>
<div>(Pranesh Prakash was invited by the Special Rapporteur to provide his views and took part in a meeting that contributed to this report)</div>
<div> </div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward'>http://editors.cis-india.org/internet-governance/un-special-rapporteur-report-on-freedom-of-expression-and-the-private-sector-a-significant-step-forward</a>
</p>
No publisher | vidushi | Freedom of Speech and Expression | Internet Governance | UNHRC | Digital Media | Intermediary Liability | ICT | 2016-06-08T17:27:22Z | Blog Entry
Twitter's India troubles show tough path ahead for digital platforms
http://editors.cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms
<b>Twitter is in a standoff with Indian authorities over the government's new digital rules. Critics see the rules as an attempt to curb free speech, while others say more action is needed to hold tech giants accountable.
</b>
<p style="text-align: justify; ">The blog by Aditya Sharma <a class="external-link" href="https://www.dw.com/en/twitters-india-troubles-show-tough-path-ahead-for-digital-platforms/a-57980916">was published by DW</a> on 21 June 2021. Torsha Sarkar was quoted.</p>
<hr style="text-align: justify; " />
<p style="text-align: justify; "><img src="http://editors.cis-india.org/home-images/Intermediary.jpg/@@images/08eb8de3-4fd6-408f-94d2-3f202da0e730.jpeg" alt="Intermediary" class="image-right" title="Intermediary" /></p>
<p style="text-align: justify; ">Twitter holds a relatively low share of India's social media market. But, since 2017, the huge nation has emerged as Twitter's fastest-growing market, becoming critical to its global expansion plans.</p>
<p style="text-align: justify; ">In February, the Indian government <a href="https://www.dw.com/en/india-targets-twitter-whatsapp-with-new-regulatory-rules/a-56708566">introduced new guidelines</a> to regulate digital content on rapidly growing social media platforms.</p>
<p style="text-align: justify; ">The so-called Intermediary Guidelines are aimed at regulating content on internet platforms such as Twitter and Facebook, making them more accountable to legal requests for the removal of posts and sharing information about the originators of messages.</p>
<p style="text-align: justify; ">Employees at these companies can be held criminally liable for not complying with the government's requests.</p>
<p style="text-align: justify; ">Large social media firms must also set up mechanisms to address grievances and appoint executives to liaise with law enforcement under the new rules, as well as appoint an India-based compliance officer who would be held criminally liable for the content on their platforms.</p>
<p style="text-align: justify; ">The Indian government says the rules empower "users who become victims of defamation, morphed images, sexual abuse," among other online crimes. It also said that the rules seek to tackle the problem of disinformation.</p>
<p style="text-align: justify; ">But critics fear that the rules could be used to target government opponents and make sure dissidents don't use the platforms.</p>
<p style="text-align: justify; ">Social media companies were expected to comply with the new rules by May 25.</p>
<p style="text-align: justify; ">Some Indian media reports have recently said that Twitter lost its status as an "intermediary" and the legal protection that came with it, due to its failure to comply with the new rules.</p>
<h3 style="text-align: justify; ">Failure to comply and serious implications</h3>
<p style="text-align: justify; ">Apar Gupta, the executive director of the Internet Freedom Foundation, a New Delhi-based digital rights advocacy group, says failure to comply with the rules could threaten Twitter's India operations.</p>
<p style="text-align: justify; ">"Not complying with the rules would pose a real risk to Twitter's operational environment," he told DW.</p>
<p style="text-align: justify; ">"It will need to go to court to defend itself each time criminal prosecutions are launched against it," he added.</p>
<p style="text-align: justify; ">The first case against Twitter was filed last week, where it was charged with failing to stop the spread of a video on its platform that allegedly incited "hate and enmity" between two religious groups.</p>
<p style="text-align: justify; "><span>'Heavy censorship'</span></p>
<p style="text-align: justify; ">Gupta says adhering to all the government's demands would substantially change Twitter.</p>
<p style="text-align: justify; ">"Absolute compliance would mean heavy censorship of individual tweets, removal of the manipulated media tags, and blocking/suspension of accounts at the government's behest," he said.</p>
<p style="text-align: justify; ">Torsha Sarkar, policy officer at the Bengaluru-based Centre for Internet and Society, fears that Twitter might at times be compelled to overcomply with government demands, threatening user rights.</p>
<p style="text-align: justify; ">"This can be either by over-complying with flawed information requests, thereby selling out its users, or taking down content that offends the majoritarian sensibilities," she told DW.</p>
<p style="text-align: justify; ">Last week, three special rapporteurs appointed by a top UN human rights body expressed "serious concerns" that certain parts of the guidelines "may result in the limiting or infringement of a wide range of human rights."</p>
<p style="text-align: justify; ">They urged New Delhi to review the rules, adding that they did not conform to India's international human rights obligations and could threaten the digital rights of Indians.</p>
<h3 style="text-align: justify; ">Twitter's balancing act</h3>
<p style="text-align: justify; ">It is not the first time that Twitter has been accused of giving in to government pressure to censor content on its platform.</p>
<p style="text-align: justify; ">At the height of the long-running farmer protests, <a href="https://www.dw.com/en/farmer-protests-india-blocks-prominent-twitter-accounts-detains-journalists/a-56411354">Twitter blocked hundreds of tweets</a> and accounts, including the handle of a prominent news magazine. It subsequently unblocked them following public outrage.</p>
<p style="text-align: justify; ">The US company stopped short of complying with demands to block the accounts of activists, politicians and journalists, arguing that such a move would "violate their fundamental right to free expression under Indian law."</p>
<p style="text-align: justify; ">According to local media reports, Twitter's Indian executives were reportedly threatened with fines and imprisonment if the accounts were not taken down.</p>
<h3 style="text-align: justify; ">Special police notify Twitter offices</h3>
<p style="text-align: justify; ">Last month, the labeling of a tweet by a politician from the ruling BJP as "manipulated media" prompted a special unit of the <a href="https://www.dw.com/en/india-police-visit-twitter-offices-over-manipulated-tweet/a-57650193">Delhi police to visit Twitter's offices</a> in the capital and neighboring Gurgaon. Police notified the offices about an investigation into the labeling of the post.</p>
<p style="text-align: justify; ">Twitter India's managing director, Manish Maheswari, was said to have been asked to appear before the police for questioning, according to media reports.</p>
<p style="text-align: justify; ">Some Twitter employees have refused to talk about the ongoing tensions for fear of government reprisals.</p>
<p style="text-align: justify; ">"Such kind of intimidation does not happen every day. (But) Everyone at Twitter India is terrified," people familiar with the matter told DW on the condition of anonymity.</p>
<h3 style="text-align: justify; ">Big Tech VS sovereign power?</h3>
<p style="text-align: justify; ">Those calling for better regulation of tech giants say transnational <a href="https://www.dw.com/en/india-social-media-conflict/a-57702394">social media companies like Twitter lack accountability</a>, blaming them for the alleged inaction against online abuse and disinformation campaigns.</p>
<p style="text-align: justify; ">"The problem with these rules is that they centralize greater power toward the government without providing for the objective benefit of rights toward users," Gupta said.</p>
<p style="text-align: justify; ">"If Twitter were to comply with these rules, it would make a bad situation worse," he said.</p>
<p style="text-align: justify; ">Twitter is unlikely to ditch a major market such as India.</p>
<p style="text-align: justify; ">Sarkar from the Centre for Internet and Society said "It might be difficult to say how the powers of big tech are going to collide with sovereign nations, especially in light of flawed legal interventions around the world."</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms'>http://editors.cis-india.org/internet-governance/news/dw-june-21-2021-aditya-sharma-twitter-india-troubles-show-tough-path-ahead-for-digital-platforms</a>
</p>
Aditya Sharma | Social Media | Internet Governance | Intermediary Liability | Information Technology | 2021-06-26T02:54:19Z | News Item
To regulate Net intermediaries or not is the question
http://editors.cis-india.org/internet-governance/www-deccan-herald-aug-26-2012-to-regulate-net-intermediaries-or-not-is-the-question
<b>Given the disruption to public order caused by the mass exodus of North-Eastern Indians from several cities, the government has had, for the first time in many years, a legitimate case to crack down on Internet intermediaries and their users.</b>
<hr />
<p style="text-align: justify; ">Sunil's column was <a class="external-link" href="http://www.deccanherald.com/content/274218/to-regulate-net-intermediaries-not.html">published</a> in the Deccan Herald on August 26, 2012.</p>
<hr />
<p style="text-align: justify; ">There was, of course, much room for improvement in the manner in which the government conducted the censorship. But the policy question that becomes most pertinent now is: do we need to regulate Internet intermediaries further? The answer is yes and no. <br /> <br /> There are areas where these intermediaries need to be regulated in order to protect citizen and consumer interest. But to deal with rumour-mongering and hate speech, there is sufficient provisions in Indian law to deal with the current disruption in public order and any similar disruptions in the future. <br /> <br /> It is a common misunderstanding to assume that all civil society organisations that advocate civil liberties on networked technologies are regulatory doves that wish to dismantle regulation of the private sector and allow them complete free hand for innovation and, perhaps, causing harm to public interest.<br /> <br /> The opposite is also not necessarily true. We are not hawks, those that believe in maximal regulation of the private sector. The state should regulate the private sector in areas where the citizens are unable to protect their own interest and self-regulation is inadequate. But there are many other areas where regulation needs to be dismantled in the interests of citizen and public interest. <br /> <br /> Dr Rohan Samarajiva, founder of a Colombo-based regional policy think tank LIRNEasia, explains this best using the ‘law of soft toys’. When his daughter was young he told her that in Sri Lanka there was a law which mandated that every time she got a new soft toy, she would have to necessarily give away another one.<br /> <br /> The regulatory lesson here is: the mandate for regulation cannot keep endlessly expanding. As the government moves into new areas of regulation, it should also exit other older areas where regulatory rupee is providing limited returns. 
These decisions should be based on evidence of harm caused to citizens and consumers. The following are areas where regulation of Internet intermediaries is required:<br /> <br /> Privacy: India needs to establish the office of a privacy commissioner and articulate national privacy principles through the enactment of the long-awaited Privacy Act. This privacy commissioner should be able to investigate complaints against intermediaries, proactively investigate companies, order remedial action and fine companies that violate the principles and other policies in force. Remedial action could require changes to policies, features, data retention practices, services and so on. <br /> <br /> Competition: Many of these intermediaries have been taken to court on anti-trust complaints, fined and subjected to remedial action by regulators in America and Europe. <br /> <br /> Earlier this year, BharatMatrimony.com filed a complaint against Google at the Competition Commission of India (CCI) alleging anti-competitive practices in its AdWords programme. In addition, based on a report submitted by the Consumer Unity & Trust Society (CUTS), a civil society organisation, the CCI has initiated an investigation into Google's search engine for anti-competitive practices. If found guilty of breaking competition law, Google could be fined up to 10 per cent of its turnover.<br /> <br /> Speech: Article 19(2) of the Constitution permits Parliament to enact laws that place eight categories of reasonable restrictions on speech. Unfortunately, the Information Technology Act and its associated rules attempt to expand these restrictions and, in addition, do not comply with the principles of natural justice. 
Ideally, all those impacted by censorship should be informed and should be able to seek redress and reinstatement of the censored speech.<br /> <br /> The policy sting operation conducted by the Centre for Internet and Society (CIS) last year demonstrated that intermediaries are risk-averse and tend to over-comply with takedown notices. There is a clear chilling effect on speech online, and it is important that the Act and rules be amended at the earliest.<br /> <br /> Intellectual Property: Policies that fall under this inappropriate umbrella term for many differently configured laws rest on the as-yet-unproven fundamental assumption that granting limited monopolies to rights holders, usually corporations, will result in greater innovation. However, citizen and consumer interest is protected through provisions for exceptions and limitations in laws such as copyright, patents and trademarks. Some examples of these safeguards that guarantee access to knowledge in Indian law include compulsory licences, patent opposition and fair dealing. <br /> <br /> There are many other areas where special treatment may be required for intermediaries. For example, tax law needs to handle evasion techniques like the Double Irish and the Dutch Sandwich. Given my lengthy wish-list of regulation for Internet intermediaries, why then has CIS become an NGO member of the Global Network Initiative?<br /> <br /> This is because I believe that technological developments happen too quickly for us to depend purely on government regulation. Self-regulation has an important role to play in keeping up with these rapid changes. As self-regulatory norms mature, they can be formalised into policy by the government.<br /> <br /> Therefore, I consider it a privilege that CIS has been accepted as a member of this self-regulatory initiative, and that we can influence GNI norms with our Indian perspective. 
However, when self-regulation fails to protect public interest, then the government must step in to regulate Internet intermediaries.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/www-deccan-herald-aug-26-2012-to-regulate-net-intermediaries-or-not-is-the-question'>http://editors.cis-india.org/internet-governance/www-deccan-herald-aug-26-2012-to-regulate-net-intermediaries-or-not-is-the-question</a>
</p>
sunil | Freedom of Speech and Expression | Public Accountability | Internet Governance | Intermediary Liability | Censorship | 2012-08-26T06:12:48Z | Blog Entry
The Take Down of Free Speech Online
http://editors.cis-india.org/news/newslaundry-april-1-2014-somi-das-the-take-down-of-free-speech-online
<b>As part of a 2011 study to assess rates of compliance, the Centre for Internet and Society, Bangalore, sent frivolous “take down” requests to seven prominent intermediaries. The study showed just how easy it is to take down online content.</b>
<p style="text-align: justify; ">This was published in <a class="external-link" href="http://www.newslaundry.com/2014/04/01/the-take-down-of-free-speech-online/">Newslaundry</a> on April 1, 2014. CIS research on Intermediary Liabilities is quoted.</p>
<hr />
<p style="text-align: justify; ">CIS found that six out of the seven intermediaries “<a href="http://editors.cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet" target="_blank">over complied</a>” with the notices. Facts such as these about intermediary liability were discussed in a panel discussion “Intermediary Liability & Freedom of Expression in India” in Delhi on March 27, 2014 organised by Centre for Communication Governance at National Law University in collaboration with the Global Network Initiative.</p>
<p style="text-align: justify; ">The panel also included Professor Ranbir Singh, Vice Chancellor of NLU, Jermyn Brooks<b><i> (</i></b>Independent Chair – Global Network Initiative, Washington DC), Shyam Divan (Senior Advocate, Supreme Court of India) and SiddharthVaradarajan (Journalist). They discussed proxy censorship by government through private players and how e-business’ lose out on opportunities because of the current legal framework in the country within which intermediaries have to function.</p>
<p style="text-align: justify; ">According to<a href="http://www.indiankanoon.org/doc/1752240/" target="_blank"> Section 2(1)(w) of The Information Technology Act, 2000,</a> “intermediary”- with respect to any particular electronic message -signifies any person who on behalf of another person receives, stores or transmits that message or provides any service with respect to that message.According to Rishab Dara, recipient of the Google policy Fellowship 2011, in an article titled, <a href="http://editors.cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet" target="_blank">Intermediary Liability in India: Chilling Effects on Free Expression on the Internet</a>, “intermediaries are widely recognised as essential cogs in the wheel of exercising the right to freedom of expression on the Internet. Most major jurisdictions around the world have introduced legislations for limiting intermediary liability in order to ensure that this wheel does not stop spinning”.</p>
<p style="text-align: justify; ">The “safe harbor”or what is also known asIntermediary Liability Laws according to Section 79 of the Information Technology Act are given below:</p>
<h3 style="text-align: justify; ">Intermediaries not to be Liable in Certain Cases</h3>
<p style="text-align: justify; "><i>(1) Notwithstanding anything contained in any law for the time being in force but subject to the provisions of sub-sections (2) and (3), an intermediary shall not be liable for any third party information, data, or communication link made available or hosted by him. </i></p>
<p style="text-align: justify; "><i> (2) The provisions of sub-section (1) shall apply if—</i></p>
<p style="text-align: justify; "><i>(a) the function of the intermediary is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored or hosted; or </i></p>
<p style="text-align: justify; "><i>(b) the intermediary does not—</i></p>
<p style="text-align: justify; "><i> (i) initiate the transmission,</i></p>
<p style="text-align: justify; "><i>(ii) select the receiver of the transmission, and</i></p>
<p style="text-align: justify; "><i>(iii) select or modify the information contained in the transmission;</i></p>
<p style="text-align: justify; "><i>(c) the intermediary observes due diligence while discharging his duties under this Act and also observes such other guidelines as the Central Government may prescribe in this behalf.</i></p>
<p style="text-align: justify; "><i>(3) The provisions of sub-section (1) shall not apply if—</i></p>
<p style="text-align: justify; "><i>(a) the intermediary has conspired or abetted or aided or induced, whether by threats or promise or othorise in the commission of the unlawful act;</i></p>
<p style="text-align: justify; "><i>(b) upon receiving actual knowledge, or on being notified by the appropriate Government or its agency that any information, data or communication link residing in or connected to a computer resource controlled by the intermediary is being used to commit the unlawful act, the intermediary fails to expeditiously remove or disable access to that material on that resource without vitiating the evidence in any manner.</i></p>
<p style="text-align: justify; ">Under the Act, the intermediary needs to act on a complaint within 36 hours of a take down notice -failing which they will be liable to legal action if the case is taken to the court.</p>
<p style="text-align: justify; ">Shyam Divan spoke about the absurdity of the 36-hour turnaround time that an intermediary has between receiving a complaint and taking down the content. According to him, without any kind of legal option to fall back on, intermediaries decide to comply with such take down notices fearing “serious penalties and possibility of prosecution” which results in “indirect censorship”. He also said, “Domestic constitution in itself is not going to be sufficient”. “Meta-constitutions” which are transnational and have uniform laws across countries could be a possible solution to the current confusion as the internet is a global phenomenon and it would ensure that “the extent of our online rights would not be limited to the constitution of the country”.</p>
<p style="text-align: justify; ">Giving the example of hate speech, Siddharth Varadarajan, mentioned the Indian executive’s different approaches towards different mediums. Referring to hate speeches made during the 1993 Bombay riots by Shiv Sena leaders and those made during the 2002 Gujarat riots, he said, “Hate speech never gets prosecuted when made amid a physical crowd in a volatile situation.I can understand why politicians won’t be prosecuted but why so much sensitivity on online content. This paradox is worth reflecting on.Despite its limited reach, the executive reacts in such a hyper-sensitive manner”.He adds that as the editor of a news website one faces daily problems in taking decisions on online content especially on comment moderation and whether the website would be responsible for a certain comment made by a reader. Echoing Shyam Divan’s views,he said that in India more than the punishment, when a case is filed, the legal process itself becomes a punishment, which forces Internet Service Providers to comply with requests of blocking online content.</p>
<p style="text-align: justify; ">The Global Network Initiative is a Washington-based organisation that provides a framework for companies to deal with governments requesting censorship or surveillance of online content, “rooted in international standards legal framework also interesting people”. According to a report released by it, “provided that the existing safe harbour regime is improved, intermediaries can become a significant part of the economy and their GDP contribution may increase to more than 1.3 per cent by 2015. The potential corresponds to $41 billion by 2015”.Jermyn Brooks<b><i>,</i></b>Independent Chair of GNI,argued that instead of focusing all efforts on ensuring that the Information Technology (Intermediaries Guidelines) Rules, 2011 gets struck down by Courts for its unconstitutionality, there should also be a movement to effect policy changes through the amendment of the law. According to him, such a proposition would be more lucrative for a government looking for “re-invigoration of economic growth in India”.</p>
<p style="text-align: justify; ">The discussion was significant in the light that a number of cases related to the IT Act and freedom of online speech will be heard in the Supreme Court in the coming months. A petition by <i>Mouthshut.com </i>challenges the Information Technology (Intermediaries Guidelines) Rules 2011 “which effectively creates a notice and takedown regime for content hosted by intermediaries”. Another important case up for hearing is a petition by Member of Parliament Rajeev Chandrashekhar,“which also challenges these rules on grounds that they are ambiguous, require private parties to subjectively assess objectionable content, and that they undermine the safe harbour exemptions from liability granted to intermediaries by section 79 of the IT Act”. The People’s Union for Civil Liberties (PUCL<i>) </i>has challenged the Intermediaries Guidelines rules as well as the Procedure and Safeguards for Blocking for Access of Information by the Public Rules 2009. “This petition has pointed to the lack of transparency in the blocking procedure, which does not currently offer the public any notice or reasons for the blocking.”</p>
<p style="text-align: justify; ">“The cases pending before the Supreme Court will have a significant impact on the freedom of expression. We should never take our rights for granted – the interpretation of these rights needs to be consistent with their spirit”, said Professor Ranbir Singh.</p>
<p style="text-align: justify; ">Citing the recent example of the <a href="http://timesofindia.indiatimes.com/india/After-Penguin-another-publisher-recalls-Wendy-Donigers-book/articleshow/31426314.cms" target="_blank">Wendy Doniger</a> episode, Varadarajan says, “If Penguin chooses to pack up at the District court level, you know how Internet Service Providers would react to take down notices…Specific targeting of online speech would ultimately have a negative impact on the traditional media”. And that is the crux of the matter. In the absence of intermediate liability not being limited, online censorship and the curtailment of the freedom of speech will become far easier and will only worsen.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/news/newslaundry-april-1-2014-somi-das-the-take-down-of-free-speech-online'>http://editors.cis-india.org/news/newslaundry-april-1-2014-somi-das-the-take-down-of-free-speech-online</a>
</p>
praskrishna | Internet Governance | Intermediary Liability | 2014-04-06T05:19:50Z | News Item
The Supreme Court Judgment in Shreya Singhal and What It Does for Intermediary Liability in India
http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability
<b>Even as free speech advocates and users celebrate the Supreme Court of India's landmark judgment striking down Section 66A of the Information Technology Act of 2000, news that the Central government has begun work on drafting a new provision to replace the said section of the Act has been trickling in.</b>
<p style="text-align: justify; ">The SC judgement in upholding the constitutionality of Section 69A (procedure for blocking websites) and in reading down Section 79 (exemption from liability of intermediaries) of the IT Act, raises crucial questions regarding transparency, accountability and under what circumstances may reasonable restrictions be placed on free speech on the Internet. While discussions and analysis of S. 66A continue, in this post I will focus on the aspect of the judgment related to intermediary liability that could benefit from further clarification from the apex court and in doing so, will briefly touch upon S. 69A and secret blocking.</p>
<h3 style="text-align: justify; ">Conditions qualifying intermediary for exemption and obligations not related to exemption</h3>
<p align="JUSTIFY">The intermediary liability regime in India is defined under S. 79 and assosciated rules that were introduced to protect intermediaries for liability from user generated content and ensure the Internet continues to evolve as a <i>“marketplace of ideas”</i>. But as intermediaries may not have sufficient legal competence or resources to deliberate on the legality of an expression, they may end up erring on the side of caution and takedown lawful expression. As a study by Centre for Internet and Society (CIS) in 2012 revealed, the criteria, procedure and safeguards for administration of the takedowns as prescribed by the rules lead to a chilling effect on online free expression.</p>
<p align="JUSTIFY"><span><span><span>S. 69A grants powers to the Central Government to </span></span></span><span><i><span>“issue directions for blocking of public access to any information through any computer resource”.</span></i></span><span><span><span> The 2009 </span></span></span><span><span><span>rules </span></span></span><span><span><span>allow the blocking of websites by a court order, </span></span></span><span><span><span>and </span></span></span><span><span><span>sets in place a review committee to review the decision to block websites </span></span></span><span><span><span>a</span></span></span><span><span><span>s also establishes </span></span></span><span><span><span>penalt</span></span></span><span><span><span>ies </span></span></span><span><span><span>for the intermediary </span></span></span><span><span><span>that fails to extend cooperation in this respect. </span></span></span></p>
<p align="JUSTIFY"><span><span><span>There are two key aspects of both these provisions that must be noted:</span></span></span></p>
<p align="JUSTIFY">a) S. 79 is an exemption provision that qualifies the intermediary for conditional immunity, as long as they fulfil the conditions of the section. The judgement notes this distinction, adding that “<i>being an exemption provision, it is closely related to provisions which provide for offences including S. 69A.”</i></p>
<p align="JUSTIFY"><span><span><span>b) S. 69A does not contribute to immunity for the intermediary rather places additional obligations on the intermediary and as the judgement notes </span></span></span><span><i><span>“intermediary who finally fails to comply with the directions issued who is punishable under sub-section (3) of 69A.”</span></i></span><span><span><span> The provision though outside of the conditional immunity liability regime enacted through S. 79 contributes to the restriction of access to, or removing content online by placing liability on intermediaries to block unlawful third party content or information that is being generated, transmitted, received, stored or hosted by them. Therefore restriction requests must fall within the contours outlined in Article 19(2) and include principles of natural justice and elements of due process.</span></span></span></p>
<h3 align="JUSTIFY">Subjective Determination of Knowledge</h3>
<p align="JUSTIFY">The provisions for exemption laid down in S. 79 do not apply when they receive <i>“actual knowledge” </i>of illegal content under section 79(3)(b). Prior to the court's verdict actual knowledge could have been interpreted to mean the intermediary is called upon its own judgement under sub-rule (4) to restrict impugned content in order to seek exemption from liability. Removing the need for intermediaries to take on an adjudicatory role and deciding on which content to restrict or takedown, the SC has read down <i>“actual knowledge”</i> to mean that there has to be a court order directing the intermediary to expeditiously remove or disable access to content online. The court also read down <i>“upon obtaining knowledge by itself”</i> and <i>“brought to actual knowledge”</i> under Rule 3(4) in the same manner as 79(3)(b).</p>
<p align="JUSTIFY"><span><span><span>Under S.79(3)(b) the intermediary must comply with the orders from the executive in order to qualify for immunity. Further, S. 79 (3)(b) goes beyond the specific categories of restriction identified in Article 19(2) by including the term </span></span></span><span><i><span>“unlawful acts”</span></i></span><span><span><span> and places the executive in an adjudicatory role of determining the illegality of content. The government cannot emulate private regulation as it is bound by the Constitution and the court addresses this issue by applying the limitation of 19(2) on unlawful acts, </span></span></span><span><i><span>“the court order and/or the notification by the appropriate government or its agency must strictly conform to the subject matters aid down in Article 19(2).”</span></i></span><span><span><span> </span></span></span></p>
<p align="JUSTIFY"><span><span><span>By reading down of S. 79 (3) (b) the court has addressed the issue of intermediaries </span></span></span><span><span><span>complying with tak</span></span></span><span><span><span>edown requests from non-government entities and </span></span></span><span><span><span>has </span></span></span><span><span><span>made government notifications and court orders to be consistent with reasonable restrictions in Article 19(2). This is an important clarification from the court, because this places limits on the private censorship of intermediaries and the invisible censorship of opaque government takedown requests as they must </span></span></span><span><span><span>and should </span></span></span><span><span><span>adhere, to </span></span></span><span><span><span>the </span></span></span><span><span><span>boundaries set by Article 19(2).</span></span></span></p>
<h3>Procedural Safeguards</h3>
<p style="text-align: justify; "><span><span><span>The SC does not touch upon other parts of the rules and in not doing so, has left significant procedural issues open for debate. It is relevant to bear in mind and as established above, S. 69A blocking and restriction requirements for the intermediary are part of their additional obligations and do not qualify them for immunity. The court ruled in favour of upholding S. 69A as constitutional on the basis that blocking orders are issued when the executive has sufficiently established that it is absolutely necessary to do so, and that the necessity is relatable to only some subjects set out in Article 19(2). Further the court notes that reasons for the blocking orders must be recorded in writing so that they may be challenged through writ petitions. The court also goes on to specify that under S. 69A the intermediary and the 'originator' if identified, have the right to be heard before the committee decides to issue the blocking order. </span></span></span></p>
<p style="text-align: justify; "><span><span><span>Under S. 79 the intermediary must also comply with government restriction orders and the procedure for notice and takedown is not sufficiently transparent and lacks procedural safeguards that have been included in the notice and takedown procedures under S. 69. For example, there is no requirement for committee to evaluate the necessity of issuing the restriction order, though the ruling does clarify that these restriction notices must be within the confines of Article 19(2). The judgement could have gone further to directing the government to state their entire cause of action and provide reasonable level of proof (prima facie). It should have also addressed issues such as the government using extra-judicial measures to restrict content including collateral pressures to force changes in terms of service, to promote or enforce so-called "voluntary" practices. </span></span></span></p>
<h3>Accountability</h3>
<p style="text-align: justify; "><span><span><span>The judgement could also have delved deeper into issues of accountability such as the need to consider 'udi alteram partem' by providing the owner of the information or the intermediary a hearing prior to issuing the restriction or blocking order nor is an post-facto review or appeal mechanism made available except for the recourse of writ petition. Procedural uncertainty around wrongly restricted content remains, including what limitations should be placed on the length, duration and geographical scope of the restriction. The court also does not address the issue of providing a recourse for the third party provider of information to have the removed information restored or put-back remains unclear. Relatedly, the court also does not clarify the concerns related to frivolous requests by establishing penalties nor is there a codified recourse under the rules presently, for the intermediary to claim damages even if it can be established that the takedown process is being abused.</span></span></span></p>
<h3><span><span><span>Transparency</span></span></span></h3>
<p style="text-align: justify; "><span><span><span>The bench, in para 113, addressing S. 79, notes that the intermediary, in addition to publishing rules and regulations, a privacy policy and a user agreement for access to or usage of its service, must also inform users of the due diligence requirements, including the content restriction policy under rule 3(2). However, the court ought to have noted the differences between categories of intermediaries, which may require different terms of use. Rather than stressing standard terms of use as a procedural safeguard, the court should have insisted on terms of use and content restriction obligations that are proportional to the role of the intermediary and the liability accrued in providing the service, including the impact of the restriction both on access and on free speech. By placing a disclosure or transparency requirement on the intermediary, including over what has been restricted under the intermediary's own terms of service, the judgment could have gone a step further than merely informing users of their rights in using the service, to ensuring that users can review and know what information has been restricted and why. The judgment also does not touch upon broader issues of intermediary liability such as proactive filtering sought by governments and private parties, an important consideration given the recent developments around the right to be forgotten in Europe and around defamation and pornography in India.</span></span></span></p>
<p style="text-align: justify; "><span><span><span>The judgment, while a welcome step towards ensuring the Internet remains a democratic space where free speech thrives, could benefit from the application of the recently launched Manila Principles, developed by CIS and others. The Manila Principles are a framework of baseline safeguards and best practices to be considered by policymakers and intermediaries when developing, adopting, and reviewing legislation, policies and practices that govern the liability of intermediaries for third-party content.</span></span></span></p>
<p style="text-align: justify; "><span><span><span>The court's ruling is truly worth celebrating for the tone it sets on how we think of free speech and the contours of censorship in the digital space. But the real impact of this judgment lies in the debates and discussions it will throw open about content removal practices that involve intermediaries making determinations on requests received, or that respond only to the interests of the party requesting removal. As the Manila Principles highlight, a balance between public and private interests can be achieved through a mechanism where power is distributed among the parties involved, and where an impartial, independent, and accountable oversight mechanism exists. <br /></span></span></span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability'>http://editors.cis-india.org/internet-governance/blog/sc-judgment-in-shreya-singhal-what-it-means-for-intermediary-liability</a>
</p>
No publisherjyotiIT ActCensorshipFreedom of Speech and ExpressionInternet GovernanceIntermediary LiabilityChilling Effect2015-04-17T23:59:34ZBlog EntryThe Ministry And The Trace: Subverting End-To-End Encryption
http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption
<b>A legal and technical analysis of the 'traceability' rule and its impact on messaging privacy.</b>
<p> </p>
<p>The paper was published in the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">NUJS Law Review Volume 14 Issue 2 (2021)</a>.</p>
<hr />
<h2>Abstract</h2>
<div class="justify">
<div class="pbs-main-wrapper">
<p>End-to-end
encrypted messaging allows individuals to hold confidential
conversations free from the interference of states and private
corporations. To aid surveillance and prosecution of crimes, the Indian
Government has mandated online messaging providers to enable
identification of originators of messages that traverse their platforms.
This paper establishes how the different ways in which this
‘traceability’ mandate can be implemented (dropping end-to-end
encryption, hashing messages, and attaching originator information to
messages) come with serious costs to usability, security and privacy.
Through a legal and constitutional analysis, we contend that
traceability exceeds the scope of delegated legislation under the
Information Technology Act, and is at odds with the fundamental right to
privacy.</p>
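<p>The message-hashing approach named in the abstract can be illustrated with a minimal sketch (a hypothetical illustration, not code from the paper; the function name <code>message_hash</code> is invented). Matching messages by a digest of their content lets a provider link a reported message to every prior sender of identical text, while any edit to the text defeats the match entirely:</p>

```python
import hashlib

def message_hash(plaintext: str) -> str:
    """Digest of the message content alone (hypothetical sketch).

    Under a hashing-based traceability mandate, the provider would store
    such digests so that a reported message can later be matched against
    earlier senders of the same text.
    """
    return hashlib.sha256(plaintext.encode("utf-8")).hexdigest()

# Identical text hashes identically, so unmodified forwards are traceable...
assert message_hash("forwarded text") == message_hash("forwarded text")
# ...but any edit, however small, yields an unrelated digest and breaks the trail.
assert message_hash("forwarded text") != message_hash("forwarded text!")
```

<p>The sketch also makes one of the paper's costs concrete: matching by content digest requires the provider to compute or collect hashes of message content, which sits uneasily with the guarantees of end-to-end encryption.</p>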
<p> </p>
<p>Click here to read the <a class="external-link" href="http://nujslawreview.org/2021/07/09/the-ministry-and-the-trace-subverting-end-to-end-encryption/">full paper</a>.</p>
</div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption'>http://editors.cis-india.org/internet-governance/blog/the-ministry-and-the-trace-subverting-end-to-end-encryption</a>
</p>
No publisherGurshabad Grover, Tanaya Rajwade and Divyank KatiraCryptographyIntermediary LiabilityConstitutional LawInternet GovernanceMessagingEncryption Policy2021-07-12T08:18:18ZBlog EntryThe Case of Whatsapp Group Admins
http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins
<b></b>
<p style="text-align: justify; ">Censorship laws in India have now roped in group administrators of chat groups on instant messaging platforms such as Whatsapp (<i>group admin(s)</i>) for allegedly objectionable content that was posted by other users of these chat groups. Several incidents<a href="#_ftn1">[1]</a> were reported this year where group admins were arrested in different parts of the country for allowing content that was allegedly objectionable under law. A few reports mentioned that these arrests were made under Section 153A<a href="#_ftn2">[2]</a> read with Section 34<a href="#_ftn3">[3]</a> of the Indian Penal Code (<i>IPC</i>) and Section 67<a href="#_ftn4">[4]</a> of the Information Technology Act (<i>IT Act</i>).</p>
<p style="text-align: justify; "><span>The targeting of a group admin for content posted by other members of a chat group has raised concerns about how this liability is imputed. Should a group admin be considered an intermediary under Section 2(w) of the IT Act? If so, would a group admin be protected from such liability?</span></p>
<h3><strong>Group admin as an intermediary</strong></h3>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; ">Whatsapp is an instant messaging platform which can be used for mass communication by opting to create a chat group. A chat group is a feature on Whatsapp that allows joint participation of Whatsapp users; a single chat group can have up to 100 users. Every chat group has one or more group admins, who control participation in the group by adding or removing people.<a href="#_ftn5">[5]</a> The question, then, is whether by choosing to create a chat group on Whatsapp a group admin can become liable for content posted by other members of the chat group.</p>
<p style="text-align: justify; "><span>Section 34 of the IPC provides that when a number of persons engage in a criminal act with a common intention, each person is made liable as if he alone did the act. Common intention implies a pre-arranged plan and acting in concert pursuant to that plan. It is interesting to note that group admins have been arrested under Section 153A on the ground that the group admin and the member posting content actionable under this provision shared a common intention to post such content on the group. But would this hold true when, for instance, a group admin creates a chat group for posting lawful content (say, for matchmaking purposes) and a member of the chat group posts content which is actionable under law (say, a video abusing Dalit women)? Common intention can be established by direct evidence or inferred from conduct, surrounding circumstances or any incriminating facts.</span><a href="#_ftn6">[6]</a></p>
<p style="text-align: justify; "><span>We need to understand whether common intention can be established in the case of a user merely acting as a group admin. For this purpose it is necessary to see how a group admin contributes to a chat group and whether he acts as an intermediary.</span></p>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; "><span>We know that the parameters for determining an intermediary differ across jurisdictions, and most global organisations have categorised intermediaries based on their role or technical functions.</span><a href="#_ftn7">[7]</a><span> Section 2(w) of the Information Technology Act, 2000 (</span><i>IT Act</i><span>) defines an intermediary as </span><i>any person, who on behalf of another person, receives, stores or transmits messages or provides any service with respect to that message</i><span> </span><i>and includes the telecom services providers, network providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online marketplaces and cyber cafés</i><span>. Does a group admin receive, store or transmit messages on behalf of group participants, provide any service with respect to those messages, or fall within any category mentioned in the definition? Whatsapp does not allow a group admin to receive or store messages on behalf of another participant in a chat group; every group member independently controls his posts on the group. However, a group admin helps in transmitting another participant's messages to the group by allowing that participant to be part of the group, thereby effectively providing a service in respect of those messages. A group admin should therefore be considered an intermediary, though his contribution to the chat group is limited to allowing participation; this is discussed in further detail in the section below.</span></p>
<p style="text-align: justify; "><span>According to a 2010 report</span><a href="#_ftn8">[8]</a><span> of the Organisation for Economic Co-operation and Development (OECD), an internet intermediary brings together or facilitates transactions between third parties on the Internet. It gives access to, hosts, transmits and indexes content, products and services originated by third parties on the Internet, or provides Internet-based services to third parties. A Whatsapp chat group allows people who are not on your contact list to interact with you if they are on the group admin's contact list. In facilitating this interaction, a group admin may, under the OECD definition, be considered an intermediary.</span></p>
<h3><strong>Liability as an intermediary</strong></h3>
<p style="text-align: justify; "><strong> </strong></p>
<p style="text-align: justify; ">Section 79(1) of the IT Act protects an intermediary from liability under any law in force (for instance, liability under Section 153A pursuant to the rule laid down in Section 34 of the IPC) if the intermediary fulfils certain conditions laid down therein. An intermediary is required to carry out the due diligence obligations laid down in Rule 3 of the Information Technology (Intermediaries Guidelines) Rules, 2011 (<i>Rules</i>). These obligations include monitoring content that infringes intellectual property, threatens national security or public order, or is obscene or defamatory or violates any law in force (Rule 3(2)).<a href="#_ftn9">[9]</a> An intermediary is liable for publishing or hosting such user-generated content; however, as mentioned earlier, this liability is conditional. Section 79 of the IT Act states that an intermediary would be liable only if it initiates the transmission, selects the receiver of the transmission, or selects or modifies the information contained in the transmission that falls under any category mentioned in Rule 3(2) of the Rules. While a group admin has the ability to facilitate sharing of information and select receivers of such information, he has no direct editorial control over the information shared: group admins can only remove members, not remove or modify the content posted by members of the chat group. An intermediary is also liable if it fails to comply with the due diligence obligations laid down under Rules 3(2) and 3(3); however, since a group admin neither initiates transmission himself nor controls content, he cannot comply with these obligations. Therefore, a group admin should be protected from any liability arising out of third-party/user-generated content on his group pursuant to Section 79 of the IT Act.</p>
<p style="text-align: justify; "><span>It is, however, relevant to consider whether the ability of a group admin to remove participants amounts to an indirect form of editorial control.</span></p>
<h3><strong>Other pertinent observations</strong></h3>
<p style="text-align: justify; "><strong><span> </span></strong></p>
<p style="text-align: justify; ">Several reports<a href="#_ftn10">[10]</a> have discussed how holding a group admin liable makes the process convenient, since it is difficult to locate all the users of a particular group. This reasoning may not be correct: the Whatsapp policy<a href="#_ftn11">[11]</a> makes it mandatory for a prospective user to provide his mobile number in order to use the platform, and no additional information is collected from group admins that might justify targeting them. Investigation agencies can access the mobile numbers of Whatsapp users and gain further information from telecom companies.</p>
<p style="text-align: justify; "><span>It is also worth noting that the group admins were arrested after a user, or someone familiar to a user, filed a complaint with the police about content being objectionable or hurtful. Earlier this year, the apex court ruled in </span><i>Shreya Singhal v. Union of India</i><a href="#_ftn12">[12]</a><span> that an intermediary needs a court order or a government notification before taking down information. With actions taken against group admins on mere complaints filed by anyone, it is clear that law enforcement officials have been overriding the mandate of the court.</span></p>
<h3><strong>Conclusion</strong></h3>
<p> </p>
<p><span style="text-align: justify; ">According to a study by the global research consultancy TNS Global, around 38% of internet users in India use instant messaging applications such as Snapchat and Whatsapp on a daily basis, Whatsapp being the most widely used application. These figures indicate the scale of impact that arrests of group admins may have on our daily communication.</span></p>
<p style="text-align: justify; "><span>It is noteworthy that categorising a group admin as an intermediary would effectively make the Rules applicable to every Whatsapp user intending to create a group, make them difficult to enforce, and perhaps blur the distinction between users and intermediaries.</span></p>
<p style="text-align: justify; "><span>The critical question, however, is whether a chat group should be considered part of the bundle of services that Whatsapp offers its users, rather than an independent platform that would make a group admin a separate entity. Also, would it be correct to compare a Whatsapp group chat with a conference call on Skype, or with sharing a Google document with edit rights, to understand the domain into which censorship laws are penetrating today?</span></p>
<p style="text-align: justify; "> </p>
<p style="text-align: justify; "><i>Valuable contribution by Pranesh Prakash and Geetha Hariharan</i></p>
<hr size="1" style="text-align: justify; " width="33%" />
<p style="text-align: justify; "><a href="#_ftnref1">[1]</a> <a href="http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951">http://www.nagpurtoday.in/whatsapp-admin-held-for-hurting-religious-sentiment/06250951</a>; <a href="http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html">http://www.catchnews.com/raipur-news/whatsapp-group-admin-arrested-for-spreading-obscene-video-of-mahatma-gandhi-1440835156.html</a>; <a href="http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/">http://www.financialexpress.com/article/india-news/whatsapp-group-admin-along-with-3-members-arrested-for-objectionable-content/147887/</a></p>
<p style="text-align: justify; "><a href="#_ftnref2">[2]</a> Section 153A. “Promoting enmity between different groups on grounds of religion, race, place of birth, residence, language, etc., and doing acts prejudicial to maintenance of harmony.— (1) Whoever— (a) by words, either spoken or written, or by signs or by visible representations or otherwise, promotes or attempts to promote, on grounds of religion, race, place of birth, residence, language, caste or community or any other ground whatsoever, disharmony or feelings of enmity, hatred or ill-will between different religious, racial, language or regional groups or castes or communities…” or (2) Whoever commits an offence specified in sub-section (1) in any place of worship or in any assembly engaged in the performance of religious worship or religious ceremonies, shall be punished with imprisonment which may extend to five years and shall also be liable to fine.</p>
<p style="text-align: justify; "><a href="#_ftnref3">[3]</a> Section 34. Acts done by several persons in furtherance of common intention – When a criminal act is done by several persons in furtherance of common intention of all, each of such persons is liable for that act in the same manner as if it were done by him alone.</p>
<p style="text-align: justify; "><a href="#_ftnref4">[4]</a> Section 67. Publishing of information which is obscene in electronic form. – Whoever publishes or transmits or causes to be published in the electronic form, any material which is lascivious or appeals to the prurient interest or if its effect is such as to tend to deprave and corrupt persons who are likely, having regard to all relevant circumstances, to read, see or hear the matter contained or embodied in it, shall be punished on first conviction with imprisonment of either description for a term which may extend to five years and with fine which may extend to one lakh rupees and in the event of a second or subsequent conviction with imprisonment of either description for a term which may extend to ten years and also with fine which may extend to two lakh rupees.</p>
<p style="text-align: justify; "><a href="#_ftnref5">[5]</a> <a href="https://www.whatsapp.com/faq/en/general/21073373">https://www.whatsapp.com/faq/en/general/21073373</a></p>
<p style="text-align: justify; "><a href="#_ftnref6">[6]</a> Pandurang v. State of Hyderabad AIR 1955 SC 216</p>
<p style="text-align: justify; "><a href="#_ftnref7">[7]</a> <a href="https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf">https://www.eff.org/files/2015/07/08/manila_principles_background_paper.pdf</a>; <a href="http://unesdoc.unesco.org/images/0023/002311/231162e.pdf">http://unesdoc.unesco.org/images/0023/002311/231162e.pdf</a></p>
<p style="text-align: justify; "><a href="#_ftnref8">[8]</a> <a href="http://www.oecd.org/internet/ieconomy/44949023.pdf">http://www.oecd.org/internet/ieconomy/44949023.pdf</a></p>
<p style="text-align: justify; "><a href="#_ftnref9">[9]</a> Rule 3(2)(b) of the Rules</p>
<p style="text-align: justify; "><a href="#_ftnref10">[10]</a> <a href="http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece">http://www.thehindu.com/news/national/other-states/if-you-are-a-whatsapp-group-admin-better-be-careful/article7531350.ece</a>; <a href="http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece">http://www.newindianexpress.com/states/tamil_nadu/Social-Media-Administrator-You-Could-Land-in-Trouble/2015/10/10/article3071815.ece</a>; <a href="http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/">http://www.medianama.com/2015/10/223-whatsapp-group-admin-arrest/</a>; <a href="http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031">http://www.thenewsminute.com/article/whatsapp-group-admin-you-are-intermediary-and-here%E2%80%99s-what-you-need-know-35031</a></p>
<p style="text-align: justify; "><a href="#_ftnref11">[11]</a> <a href="https://www.whatsapp.com/legal/">https://www.whatsapp.com/legal/</a></p>
<p style="text-align: justify; "><a href="#_ftnref12">[12]</a> <a href="http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf">http://supremecourtofindia.nic.in/FileServer/2015-03-24_1427183283.pdf</a></p>
<div>
<div id="ftn12"></div>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins'>http://editors.cis-india.org/internet-governance/blog/the-case-of-whatsapp-group-admins</a>
</p>
No publisherJapreet GrewalIT ActIntermediary LiabilityCensorship2015-12-08T10:25:42ZBlog EntrySuper Cassettes v. MySpace (Redux)
http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace
<b>The latest judgment in the matter of Super Cassettes v. MySpace is a landmark and progressive ruling, which strengthens the safe harbor immunity enjoyed by Internet intermediaries in India. It interprets the provisions of the IT Act, 2000 and the Copyright Act, 1957 to restore safe harbor immunity to intermediaries even in the case of copyright claims. It also relieves MySpace from pre-screening user-uploaded content, endeavouring to strike a balance between free speech and censorship. CIS was one of the intervenors in the case, and has been duly acknowledged in the judgment.</b>
<p> </p>
<p>On 23rd December 2016, Justice Ravindra Bhat and Justice Deepa Sharma of the Delhi High Court delivered a decision overturning the 2012 order in the matter of Super Cassettes Industries Limited v. MySpace. The 2012 order was heavily criticized, for it was agnostic to the technological complexities of regulating speech on the Internet and cast unfathomable burdens on MySpace. In the following post I summarise the decision of the Division Bench. Click <a class="external-link" href="http://lobis.nic.in/ddir/dhc/SRB/judgement/24-12-2016/SRB23122016FAOOS5402011.pdf">here</a> to read the judgment.</p>
<h3><strong>Brief Facts</strong></h3>
<p>In 2007, Super Cassettes Industries Limited (SCIL) filed a suit against MySpace, a social networking platform, alleging copyright infringement. The platform allowed users to, <em>inter alia</em>, upload and share media files, and it was discovered that users were sharing SCIL’s copyrighted works without authorisation. SCIL’s suit alleged primary infringement under section 51(a)(i) of the Copyright Act as well as secondary infringement under section 51(a)(ii).</p>
<p> The 2012 order was extremely worrisome, as it turned the clock back several decades on concepts of internet intermediary liability. The court had held MySpace liable for copyright infringement despite MySpace having no knowledge of specific instances of infringement, despite its removal of infringing content upon complaints, and despite Super Cassettes' failure to submit songs to MySpace's song ID database. The most impractical duty the court pronounced was that MySpace pre-screen content, rather than rely on post-infringement measures to remove infringing content. This was a result of interpreting due diligence to include pre-screening.</p>
<p>The court injuncted MySpace from permitting any uploads of SCIL's copyrighted content, and directed it to execute content removal requests expeditiously. To read CIS' analysis of the Single Judge's interim order, click <a class="external-link" href="http://cis-india.org/a2k/blogs/super-cassettes-v-my-space">here</a>.</p>
<p>In the instant judgment, the bench limited its examination to MySpace’s liability for secondary infringement, and left the determination of direct infringement to the Single Judge at the subsequent trial stage. In doing so, the court answered the following three questions:</p>
<h4>1) Whether MySpace could be said to have knowledge of infringement so as to attract liability for
secondary infringement under Section 51(a)(ii)?</h4>
<p>No. According to the Court, in the case of internet intermediaries, section 51(a)(ii) contemplates actual knowledge and not general awareness.</p>
<p>Elaborating on the circumstances of the case, the Court held that to attract liability for secondary infringement, MySpace should have had actual knowledge and not mere awareness of the infringement. Appreciating the difference between the virtual and physical worlds, the judgment stated “<em>the nature of internet media is such that the interpretation of knowledge cannot be the same as that is used for a physical premise.”</em></p>
<p>As per the court, the following facts only amounted to a general awareness, which was not sufficient to establish secondary liability:</p>
<ol><li>Existence of user agreement terms which prohibited users from unauthorised uploading of content;<br />
</li><li>Operation of post-infringement mechanisms instituted by MySpace to identify and remove content;<br />
</li><li>SCIL sharing a voluminous catalogue of 100,000 copyrighted songs with MySpace, expecting the latter to monitor and quell any infringement;<br />
</li><li>Modifying videos to insert ads in them: SCIL contended that MySpace invited users to share and upload content into which it would insert ads and make revenues – and this amounted to knowledge. The Court found that video modification for ad insertion changed only the format of the video, not the content; further, it was a purely automated process with no human intervention.</li></ol>
<p>Additionally, no constructive knowledge could be attributed to MySpace to demonstrate reasonable ground for believing that infringement had occurred. A reasonable belief could emerge only after MySpace had perused all the content uploaded and shared on its platform – a task that was impossible to perform due to the voluminous catalogue
handed to it and existing technological limitations.</p>
<p>The Court imposed a duty on SCIL to specify the works in which it owned copyright <em>and </em>which were being shared without authorisation on MySpace. It held that merely giving the names of all content it owned, without expressly pointing out the infringing works, was contrary to established principles of copyright law. Further, MySpace contended, and the court agreed, that in many instances the works were legally shared by distributors and performers – and often users created remixed works which bore only a semblance to the title of the copyrighted work.</p>
<p class="callout"><strong><em>In such cases it becomes even more important for a plaintiff such as
MySpace to provide specific titles, because while an intermediary may
remove the content fearing liability and damages, an authorized
individual’s license and right to fair use will suffer or stand negated.
(Para 38 in decision)</em></strong></p>
<p>Thus, while MySpace undoubtedly provided a place of profit for the communication of infringing works uploaded by users, it had neither specific knowledge nor a reasonable belief of the infringement.</p>
<h4>2) Does proviso to Section 81 override the "safe harbor" granted to intermediaries under Section 79 of the IT Act, 2000?</h4>
<p>and</p>
<h4>3) Whether it was possible to harmoniously read and interpret Sections 79 and 81 of the IT Act, and Section 51 of the Copyright Act?</h4>
<p>No, the proviso does not override the safe harbor; i.e., the safe harbor defence cannot be denied to the intermediary in the case of copyright actions. Indeed, the three sections have to be read harmoniously.</p>
<p>
The judgment referred to the Parliamentary Standing Committee report as a relevant tool in interpreting the two provisions, declaring that the rights conferred under the IT Act, 2000 are supplementary and not in derogation of the Patents Act or the Copyright Act. The proviso was inserted only to permit copyright owners to demand action
against intermediaries who may themselves post infringing content – the safe harbor only existed for circumstances when content was third party/user generated.</p>
<p class="callout"><strong><em>Given the supplementary nature of the provisions- one where infringement
is defined and traditional copyrights are guaranteed and the other
where digital economy and newer technologies have been kept in mind, the
only logical and harmonious manner to interpret the law would be to read
them together. Not doing so would lead to an undesirable situation
where intermediaries would be held liable irrespective of their due
diligence. (Para 49 in decision)</em></strong></p>
<p>Regarding section 79, the court reiterated that the section granted only a limited immunity to intermediaries – a <em>measured privilege to an intermediary</em>, in the nature of an affirmative defence and not a blanket immunity to avoid liability. The very purpose of section 79 was to regulate and limit this liability, whereas the Copyright Act granted and controlled the rights of a copyright owner.</p>
<p>The Court found Judge Whyte’s decision in Religious Technology Centre v. Netcom Online Communication Services (1995), to be particularly relevant to the instant case, and agreed with its observations. To recall, <em>Netcom</em> was the landmark US ruling which established that when a subscriber was responsible for direct infringement, and the service providers did nothing more than setting up and operating tech systems which were
necessary for the functioning of the Internet, it was illogical to impute liability on the service provider.</p>
<h3><strong>On MySpace Complying with Safe Harbor Requirements under Section 79 of the IT Act, 2000 (and Intermediary Rules, 2011)</strong></h3>
<p>The court held that MySpace's operations were in compliance with section 79(2)(b). The content transmission was initiated at the behest of the users, the recipients were not chosen by MySpace, and there was no modification of content. On the issue of modification, the court reasoned that since modification was an automated process (MySpace was inserting ads) which changed the format only, without MySpace's tacit or express control or knowledge, it was in compliance with the legislative requirement.</p>
<p class="callout"><strong><em>Despite several safeguard tools and notice and take down regimes,
infringed videos find their way. The remedy here is not to target
intermediaries but to ensure that infringing material is removed in an
orderly and reasonable manner. A further balancing act is required which
is that of freedom of speech and privatized censorship. If an
intermediary is tasked with the responsibility of identifying infringing
content from non-infringing one, it could have a chilling effect on
free speech; an unspecified or incomplete list may do that.
(Para 62 in decision)</em></strong></p>
<p>On the second aspect, due diligence, the court held that MySpace complied with the due diligence procedure specified in the Rules: it published rules, regulations, a privacy policy and a user agreement for access and usage. Reading Rule 3(4) with section 79(2)(c), the court held that due diligence required MySpace to remove infringing content within 36 hours of gaining actual knowledge of it, or of receiving knowledge of it from another person. <strong>Only if MySpace failed to take infringing content down accordingly would safe harbour be denied to it.</strong></p>
<p>This liberal interpretation of due diligence is a big win for internet intermediaries in India.</p>
<h3><strong>Additional Issues Considered by the Court</strong></h3>
<p>MySpace also tried to defend its activities by claiming the shield of the fair dealing provisions of the Indian Copyright Act. The Court refused, stating that the fair dealing defence was inapplicable to the case: the provisions protected transient and incidental storage, whereas in the instant circumstances the content in question was stored and hosted permanently.</p>
<p>MySpace also contended that the Single Judge's injunction order was vague and general and had foisted unimplementable duties on MySpace, disregarding the way the Internet functioned. If MySpace had to strictly comply with the order, it would have to shut its business in India. <strong>The Court said that the Single Judge's order, if enforced, would create a system of unwarranted private censorship, running contrary to the principles of a free speech regime, devoid of considerations of peculiarities of the internet intermediary industry. </strong>Private censorship would also invite upon the ISP the legal risk of wrongfully terminating a user account.</p>
<p>Finally, the Court urged MySpace to explore and innovate techniques to protect the interests of traditional copyright holders in a more efficient manner.</p>
<h3><strong>Relief Granted</strong></h3>
<p>Setting aside the Single Judge's order, the Court directed SCIL to provide a specific catalogue of infringing works, including the URLs of the infringing files. Upon receiving such specific knowledge, MySpace has been directed to remove the content within 36 hours of the issued notice. MySpace will also keep an account of the removals, and of the revenue earned from advertisements placed, for the calculation of damages at the trial stage.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace'>http://editors.cis-india.org/a2k/blogs/super-cassettes-v-myspace</a>
</p>
No publishersinhaIntermediary LiabilityCopyrightCensorshipAccess to Knowledge2017-01-18T14:31:25ZBlog EntrySuper Cassettes v. MySpace
http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space
<b>The Delhi High Court’s judgment in Super Cassettes v. MySpace last July is worrying for a number of reasons. The court failed to appreciate the working of online intermediaries and disregarded all the pragmatic considerations involved. The consequences for free expression, and particularly for file sharing by users of online services, are especially unfavourable.</b>
<p style="text-align: justify; ">The judgment<a href="#fn*" name="fr*">[*]</a> is extremely worrying, since it holds MySpace liable for copyright infringement <b>despite</b> its having shown that it did not know, and could not have known, about each instance of infringement; that it removed each instance of alleged infringement upon mere complaint; and that it had asked Super Cassettes to submit its songs to MySpace's song identification database, which Super Cassettes never did.</p>
<p style="text-align: justify; ">This, in essence, means that all 'social media services' in which there is even a <b>potential</b> for copyright infringement (such as YouTube, Facebook, Twitter, etc.) are now faced with a choice: either brave lawsuits over activities of their users that they have no control over — they can at best respond to takedown requests after the infringing material has already been put up — or wind down their operations in India.</p>
<h2 style="text-align: justify; ">The Facts</h2>
<p style="text-align: justify; ">Aside from social networking, MySpace facilitates the sharing of content between its users. This case concerns content, the copyright in which vested in T-Series, that was uploaded by users to MySpace’s website. Tensions between MySpace and T-Series appear to have arisen in 2007, when T-Series entered into talks with MySpace to grant it licenses in its copyrighted content, while MySpace asked instead that T-Series register with its rights management programme. Neither the license nor the registration came about, and the infringing material continued to be available on the MySpace website.</p>
<p style="text-align: justify; ">Specifically, T-Series alleged that cases for primary infringement under section 51(a)(i) of the Copyright Act as well as secondary infringement under section 51 (a) (ii) could be made out. Alleging that MySpace had infringed its copyrights and so affected its earnings in royalties, T-Series approached the Delhi High Court and filed a suit seeking injunctive relief and damages. In proceedings for interim relief while the suit was pending, the court granted an injunction, but, in an appeal by MySpace, added the qualification that the content would have to be taken down only on receipt of a specific catalogue of infringing works available on MySpace, rather than a general list of works in which T-Series held a copyright.</p>
<h2 style="text-align: justify; ">The Defence</h2>
<p>While other arguments such as one around the jurisdiction of the court were also raised, the central issues are listed below:</p>
<ol>
<li style="text-align: justify; ">Non-Specificity of Prayer<br />T-Series’ claim in the suit is for a blanket injunction on copyrighted content on the MySpace website. This imposes a clearly untenable, even impossible, burden for intermediaries to comply with.</li>
<li style="text-align: justify; ">Knowledge<br />MySpace argued that no liability could accrue to it on two counts. The first was that it had no actual or direct knowledge or role in the selection of the content, while the second was that no control was exercised, or was exercisable over the uploading of the content. Additionally, there was no possible means by which it could have identified the offending content and segregated it from lawful content, or monitored all of the content that it serves as a platform for.</li>
<li style="text-align: justify; ">Intermediary status and Safe Harbour Protection<br />In relation to its status as an intermediary, MySpace raised several arguments. First, it argued that it had immunity under section 79 of the IT Act and under the US Digital Millennium Copyright Act (US DMCA). Another argument restated what is arguably the most basic tenet of intermediary liability: that merely providing the platform by which infringement could occur cannot itself amount to infringement. In other words, the mere act of facilitating expression over the internet does not amount to infringement. It then made reference to its terms of use and its institution of safeguards (in the form of a hash filter, a rights management tool and a system of take-down–stay-down), which it argued clearly reflect an intention to discourage or else address cases of infringement as they arise. MySpace also emphasized that a US DMCA compliant procedure was in place, although T-Series countered that the notice and take down system would not mitigate the infringement.</li>
<li style="text-align: justify; ">Relationship between MySpace and its Users<br />Taking from previous arguments about a lack of control and its status as an intermediary, MySpace argued that it was simply a licensee of users who uploaded content. The license is limited, in that MySpace is only allowed to alter user-generated content so as to make it viewable.</li>
</ol>
<h2 style="text-align: justify; ">Outcomes</h2>
<ol>
<li style="text-align: justify; ">Infringement by Facilitation<br />The court concluded that infringement in terms of section 51 (a) (ii) had occurred in this case, since web space is a “place” in the terms required by the section and there were monetary gains in the form of ad revenue. The argument as to a lack of knowledge of infringement was also rejected on the ground that MySpace’s provision for safeguards against infringement clearly established a reason to believe that infringement will occur. Also referenced as evidence of knowledge, or at least a reason to believe infringement would occur, is the fact that MySpace modifies the format of the content before making it available on its website. It also tested for infringement by authorization in terms of section 14 read with section 51 (a) (i), but concluded that this did not arise here.</li>
<li style="text-align: justify; ">Reading away section 79?<br />The court accepted T-Series’ argument that sections 79 and 81 of the IT Act must be read together. Since section 79 would be overridden by section 81’s non-obstante clause, the effect would be that rights holders’ interests under the Copyright Act erode intermediaries’ immunity under section 79.</li>
<li style="text-align: justify; ">Due Diligence<br />The court rejected the argument that the provision of due diligence or curative measures post-infringement would be sufficient. Specifically, the contention that the quantum of content being uploaded precludes close scrutiny, given the amount of labour that would be involved, was rejected. Content should not immediately be made available but must be subject to enquiries as to its title or to authentication of its proprietor before it is made available. In fact, it holds that, “there is no reason to axiomatically make each and every work available to the public solely because user has supplied them unless the defendants are so sure that it is not infringement.” (Paragraph 88).</li>
</ol>
<p style="text-align: justify; ">There is also an attempt to distinguish the Indian framework from the DMCA. While that law calls for post-infringement measures, it is argued that in India, on reading section 51 with section 55, the focus is on preventing infringement at the threshold. In response to the case that it would be impossible to do so, the court held that since the process here requires MySpace to modify the format of content uploaded to it to make it viewable, it will have a reasonable opportunity to test for infringement.</p>
<h2 style="text-align: justify; ">Analysis</h2>
<h3>Accounting for the Medium of Communication</h3>
<p style="text-align: justify; ">The court’s analysis of the issues begins with a predictable emphasis on how the law of copyright would operate in the context of what is termed “internet computing”, peppered with trite statements about “the virtual world of internet” creating “complexit[ies]” for copyright law. The court appears to have entered into this discussion to establish that the notion of place in section 51 (a) (ii) should extend to “web space” but the statements made here only serve to contrast starkly against its subsequent failure to account for the peculiarities of form and function of intermediaries online. Had this line of argument been taken to its logical conclusion, after the character of the medium had been appreciated, the court’s final conclusion, that MySpace is liable for copyright infringement, would have been an impossible one to arrive at.</p>
<h3 style="text-align: justify; ">And What of Free Speech?</h3>
<p style="text-align: justify; ">As it had argued before the court, intermediaries such as MySpace have no means by which to determine whether content is illegal (whether by reason of amounting to a violation of copyright, or otherwise) until content is uploaded. In other words, there is no existing mechanism by which this determination can be made at the threshold, before posting.</p>
<p style="text-align: justify; ">The court does not engage with the larger consequences of such a scheme of penalizing intermediaries. Censoring patent illegalities at the threshold, even if that were possible, is one thing; the precedent that the court creates here is quite another. Given the general difficulty in conclusively establishing whether there is an infringement at all, owing to the complexities in applying the exceptions contained in section 52, it should not be for ordinary private or commercial interests such as intermediaries to sit in judgment over whether content is or is not published at all. Since the intermediary will seek to minimize its own liability, the likelihood of legitimate content being censored prior to posting is high.</p>
<p style="text-align: justify; ">The consequences for civil liberties, and free speech and expression online in particular, appear to have been completely ignored in favour of rights holders’ commercial interests.</p>
<h3 style="text-align: justify; ">Consequences for Intermediary Liability and Safe Harbour Protection</h3>
<blockquote class="pullquote" style="text-align: justify; ">Even if every instance in question did amount to an infringement of copyright and a mechanism did exist allowing for removal of content, the effect of this judgment is to create a strict liability regime for intermediaries.</blockquote>
<p style="text-align: justify; ">In other words, the effect of the court’s ruling is that courts’ determination of intermediaries’ liability will become detached from whether or not any fault can be attributed to them. MySpace made this very argument, going as far as to suggest that such a reading would impose strict liability on intermediaries. This would lead to an unprecedented and entirely unjustifiable result: even where a given intermediary applied all available means to prevent the publication of potentially infringing content, it would remain potentially liable for any illegality in that content, even though the illegality could not have been detected or addressed.</p>
<p style="text-align: justify; ">What is perhaps even more worrying is that MySpace’s attempt at proactively and in good faith preventing copyright infringement through its terms of use and in addressing them through its post-infringement measures was explicitly cited as evidence of knowledge of and control over the uploading of copyrighted material, at the threshold rather than ex post. This creates perverse incentives for the intermediary to ignore infringement, to the detriment of rights holders, rather than act proactively to minimize its incidence.</p>
<p style="text-align: justify; ">A final observation: while pronouncing on relief, the court relied on the fact that MySpace makes a “copy” of uploaded content, by converting it into a format that can be hosted on the site and made accessible, as evidence of infringement and as a basis for imposing liability on MySpace. This is a glaring instance of the disingenuous reasoning the court employs throughout the case. There is a related problem with the amended section 79, which waives immunity where the intermediary “modifies” material: the term is vague and overreaches, as it does here. Altering formats to make content compatible with a given platform is not comparable to making choices as to the content of speech or expression, but the reading is tenable under section 79 as it stands.</p>
<p style="text-align: justify; ">The result of all of this is to dislodge the section 79 immunity that accrues to intermediaries and replace that with a presumption that they are liable, rather than not, for any illegality in the content that they passively host.</p>
<h3 style="text-align: justify; ">Effect of the Copyright (Amendment) Act, 2012</h3>
<p style="text-align: justify; ">Since the judgment in the MySpace case, the Copyright Act has been amended to include some provisions that would bear on online service providers and, in particular, on intermediaries’ liability for hosting infringing content. Section 52 (1) (b) of the amended Act provides that “transient or incidental storage of a work or performance purely in the technical process of electronic transmission or communication to the public” would not infringe copyright. The other material provision is section 52 (1) (c), which provides that “transient or incidental storage of a work or performance for the purpose of providing electronic links, access or integration, where such links, access or integration has not been expressly prohibited by the right holder, unless the person responsible is aware or has reasonable grounds for believing that such storage is of an infringing copy” will not constitute an infringement of copyright. The latter provision appears to institute a rather rudimentary, and very arguably incomplete, system of notice and takedown by way of a proviso. This requires intermediaries to take down content on written complaint from copyright owners for a period of 21 days or until a competent court rules on the matter, whichever is sooner, and to restore access to the content once that period lapses, if there is no court order to sustain the takedown beyond it.</p>
<p style="text-align: justify; ">This post does not account for the effect that these provisions could have had on the case, but it is already clear, from the sloppy drafting of section 52 (1) (c) and its proviso, that they are not entirely salutary even at the outset. At any rate, there appears to be nothing that <i>determinatively</i> affects intermediaries’ secondary liability, <i>i.e.</i>, their liability for users’ infringing acts.</p>
<hr />
<p style="text-align: justify; "><i>Disclosure: CIS is now a party to these proceedings at the Delhi High Court. This is a purely academic critique, and should not be seen to have any prejudice to the arguments we will make there.</i></p>
<hr />
<p>[<a href="#fr*" name="fn*">*</a>]. Super Cassettes Industries Ltd. v. MySpace Inc. and Another, on 29 July, 2011, Indian Kanoon - Search engine for Indian Law. See<a class="external-link" href="http://bit.ly/quj6JW"> http://bit.ly/quj6JW</a>, last accessed on October 31, 2012.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space'>http://editors.cis-india.org/a2k/blogs/super-cassettes-v-my-space</a>
</p>
No publisherujwalaAccess to KnowledgeCopyrightIntellectual Property RightsIntermediary LiabilityFeatured2012-10-31T10:27:36ZBlog EntrySummary Report Internet Governance Forum 2015
http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015
<b>Centre for Internet and Society (CIS), India participated in the Internet Governance Forum (IGF) held at Poeta Ronaldo Cunha Lima Conference Center, Joao Pessoa in Brazil from 10 November 2015 to 13 November 2015. The theme of IGF 2015 was ‘Evolution of Internet Governance: Empowering Sustainable Development’. Sunil Abraham, Pranesh Prakash & Jyoti Panday from CIS actively engaged and made substantive contributions to several key issues affecting internet governance at the IGF 2015. The issue-wise detail of their engagement is set out below. </b>
<p align="center" style="text-align: left;"><strong>INTERNET GOVERNANCE</strong></p>
<p align="justify">I. The Multi-stakeholder Advisory Group to the IGF organised a discussion on <em><strong>Sustainable Development Goals (SDGs) and Internet Economy</strong></em> at the Main Meeting Hall from 9:00 am to 12:30 pm on 11 November 2015. The discussions at this session focused on the importance of Internet Economy enabling policies and an eco-system for the fulfilment of different SDGs. Several concerns relating to internet entrepreneurship, effective ICT capacity building, and the protection of intellectual property within and across borders, as well as the availability of local applications and content, were addressed. The panel also discussed the need to identify the SDGs where internet-based technologies could make the most effective contribution. Sunil Abraham contributed to the panel discussions by addressing the issue of development and promotion of local content and applications. The list of speakers included:</p>
<ol>
<li>
<p align="justify">
Lenni
Montiel, Assistant-Secretary-General for Development, United Nations</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO LIRNEasia</p>
</li><li>
<p align="justify">
Sergio
Quiroga da Cunha, Head of Latin America, Ericsson</p>
</li><li>
<p align="justify">
Raúl
L. Katz, Adjunct Professor, Division of Finance and Economics,
Columbia Institute of Tele-information</p>
</li><li>
<p align="justify">
Jimson
Olufuye, Chairman, Africa ICT Alliance (AfICTA)</p>
</li><li>
<p align="justify">
Lydia
Brito, Director of the Office in Montevideo, UNESCO</p>
</li><li>
<p align="justify">
H.E.
Rudiantara, Minister of Communication & Information Technology,
Indonesia</p>
</li><li>
<p align="justify">
Daniel
Sepulveda, Deputy Assistant Secretary, U.S. Coordinator for
International and Communications Policy at the U.S. Department of
State </p>
</li><li>
<p align="justify">
Deputy Minister, Department of Telecommunications and Postal Services, Republic of South Africa</p>
</li><li>
<p align="justify">
Sunil
Abraham, Executive Director, Centre for Internet and Society, India</p>
</li><li>
<p align="justify">
H.E.
Junaid Ahmed Palak, Information and Communication Technology
Minister of Bangladesh</p>
</li><li>
<p align="justify">
Jari
Arkko, Chairman, IETF</p>
</li><li>
<p align="justify">
Silvia
Rabello, President, Rio Film Trade Association</p>
</li><li>
<p align="justify">
Gary
Fowlie, Head of Member State Relations & Intergovernmental
Organizations, ITU</p>
</li></ol>
<p align="justify">Detailed description of the workshop is available here: <a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2327-2015-11-11-internet-economy-and-sustainable-development-main-meeting-room</a></u></p>
<p align="justify">Video of the <em>Internet Economy and Sustainable Development</em> session is available here: <a href="https://www.youtube.com/watch?v=D6obkLehVE8">https://www.youtube.com/watch?v=D6obkLehVE8</a></p>
<p align="justify"> II.
Public
Knowledge organised a workshop on <em><strong>The
Benefits and Challenges of the Free Flow of Data </strong></em>at
Workshop Room
5 from 11:00 am to 12:00 pm on 12 November, 2015. The discussions in
the workshop focused on the benefits and challenges of the free flow
of data and also the concerns relating to data flow restrictions
including ways to address
them. Sunil
Abraham contributed to the panel discussions by addressing the issue
of jurisdiction of data on the internet. The
panel for the workshop included the following.</p>
<ol>
<li>
<p align="justify">
Vint
Cerf, Google</p>
</li><li>
<p align="justify">
Lawrence
Strickling, U.S. Department of Commerce, NTIA</p>
</li><li>
<p align="justify">
Richard
Leaning, European Cyber Crime Centre (EC3), Europol</p>
</li><li>
<p align="justify">
Marietje
Schaake, European Parliament</p>
</li><li>
<p align="justify">
Nasser
Kettani, Microsoft</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS
India</p>
</li></ol>
<p align="justify">Detailed description of the workshop is available here: <a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2467-2015-11-12-ws65-the-benefits-and-challenges-of-the-free-flow-of-data-workshop-room-5</a></p>
<p align="justify">Video of the workshop is available here: <a href="https://www.youtube.com/watch?v=KtjnHkOn7EQ">https://www.youtube.com/watch?v=KtjnHkOn7EQ</a></p>
<p align="justify"> III.
Article
19 and
Privacy International organised a workshop on <em><strong>Encryption
and Anonymity: Rights and Risks</strong></em>
at Workshop Room 1 from 11:00 am to 12:30 pm on 12 November, 2015.
The
workshop fostered a discussion about the latest challenges to
protection of anonymity and encryption and ways in which law
enforcement demands could be met while ensuring that individuals
still enjoyed strong encryption and unfettered access to anonymity
tools. Pranesh Prakash contributed to the panel discussions by addressing concerns about the existing South Asian regulatory frameworks on encryption and anonymity, and emphasized the need for pervasive encryption. The panel for this workshop included the following.</p>
<ol>
<li>
<p align="justify">
David
Kaye, UN Special Rapporteur on Freedom of Expression</p>
</li><li>
<p align="justify">
Juan
Diego Castañeda, Fundación Karisma, Colombia</p>
</li><li>
<p align="justify">
Edison
Lanza, Organisation of American States Special Rapporteur</p>
</li><li>
<p align="justify">
Pranesh
Prakash, CIS India</p>
</li><li>
<p align="justify">
Ted
Hardie, Google</p>
</li><li>
<p align="justify">
Elvana
Thaci, Council of Europe</p>
</li><li>
<p align="justify">
Professor
Chris Marsden, Oxford Internet Institute</p>
</li><li>
<p align="justify">
Alexandrine
Pirlot de Corbion, Privacy International</p>
</li></ol>
<p align="justify"><a name="_Hlt435412531"></a>Detailed description of the workshop is available here: <a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2407-2015-11-12-ws-155-encryption-and-anonymity-rights-and-risks-workshop-room-1</a></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=hUrBP4PsfJo">https://www.youtube.com/watch?v=hUrBP4PsfJo</a></p>
<p align="justify"> IV.
Chalmers
& Associates organised a session on <em><strong>A
Dialogue on Zero Rating and Network Neutrality</strong></em>
at the Main Meeting Hall from 2:00 pm to 4:00 pm on 12 November,
2015. The Dialogue provided expert insight on zero-rating and a full
spectrum of views on the issue. It also explored alternative
approaches to zero-rating, such as the use of community networks. Pranesh
Prakash provided
a
detailed explanation of harms and benefits related to different
approaches to zero-rating. The
panellists for this session were the following.</p>
<ol>
<li>
<p align="justify">
Jochai
Ben-Avie, Senior Global Policy Manager, Mozilla, USA</p>
</li><li>
<p align="justify">
Igor
Vilas Boas de Freitas, Commissioner, ANATEL, Brazil</p>
</li><li>
<p align="justify">
Dušan
Caf, Chairman, Electronic Communications Council, Republic of
Slovenia</p>
</li><li>
<p align="justify">
Silvia
Elaluf-Calderwood, Research Fellow, London School of Economics,
UK/Peru</p>
</li><li>
<p align="justify">
Belinda
Exelby, Director, Institutional Relations, GSMA, UK</p>
</li><li>
<p align="justify">
Helani
Galpaya, CEO, LIRNEasia, Sri Lanka</p>
</li><li>
<p align="justify">
Anja
Kovacs, Director, Internet Democracy Project, India</p>
</li><li>
<p align="justify">
Kevin
Martin, VP, Mobile and Global Access Policy, Facebook, USA</p>
</li><li>
<p align="justify">
Pranesh
Prakash, Policy Director, CIS India</p>
</li><li>
<p align="justify">
Steve
Song, Founder, Village Telco, South Africa/Canada</p>
</li><li>
<p align="justify">
Dhanaraj
Thakur, Research Manager, Alliance for Affordable Internet, USA/West
Indies</p>
</li><li>
<p align="justify">
Christopher
Yoo, Professor of Law, Communication, and Computer & Information
Science, University of Pennsylvania, USA</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/igf2015-main-sessions" target="_top">http://www.intgovforum.org/cms/igf2015-main-sessions</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2457-2015-11-12-a-dialogue-on-zero-rating-and-network-neutrality-main-meeting-hall-2</a></p>
<p align="justify"> V.
The
Internet & Jurisdiction Project organised a workshop on
<em><strong>Transnational
Due Process: A Case Study in MS Cooperation</strong></em>
at Workshop Room
4 from 11:00 am to 12:00 pm on 13 November, 2015. The
workshop discussion focused on the challenges in developing an
enforcement framework for the internet that guarantees transnational
due process and legal interoperability. The discussion also focused
on innovative approaches to multi-stakeholder cooperation such as
issue-based networks, inter-sessional work methods and transnational
policy standards. The panellists for this discussion were the
following.</p>
<ol>
<li>
<p align="justify">
Anne Carblanc, Head of Division, Directorate for Science, Technology and Industry, OECD</p>
</li><li>
<p align="justify">
Eileen Donahoe, Director of Global Affairs, Human Rights Watch</p>
</li><li>
<p align="justify">
Byron Holland, President and CEO, CIRA (Canadian ccTLD)</p>
</li><li>
<p align="justify">
Christopher Painter, Coordinator for Cyber Issues, US Department of State</p>
</li><li>
<p align="justify">
Sunil Abraham, Executive Director, CIS India</p>
</li><li>
<p align="justify">
Alice Munyua, Lead, dotAfrica Initiative and GAC representative, African Union Commission</p>
</li><li>
<p align="justify">
Will Hudson, Senior Advisor for International Policy, Google</p>
</li><li>
<p align="justify">
Dunja Mijatovic, Representative on Freedom of the Media, OSCE</p>
</li><li>
<p align="justify">
Thomas Fitschen, Director for the United Nations, for International Cooperation against Terrorism and for Cyber Foreign Policy, German Federal Foreign Office</p>
</li><li>
<p align="justify">
Hartmut Glaser, Executive Secretary, Brazilian Internet Steering Committee</p>
</li><li>
<p align="justify">
Matt Perault, Head of Policy Development, Facebook</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2475-2015-11-13-ws-132-transnational-due-process-a-case-study-in-ms-cooperation-workshop-room-4</a></p>
<p align="justify">
Video link for Transnational Due Process: A Case Study in MS Cooperation is available here <a href="https://www.youtube.com/watch?v=M9jVovhQhd0">https://www.youtube.com/watch?v=M9jVovhQhd0</a></p>
<p align="justify"> VI.
The Internet Governance Project organised a meeting of the
<em><strong>Dynamic
Coalition on Accountability of Internet Governance Venues</strong></em>
at Workshop Room 2 from 14:00
– 15:30 on
12 November, 2015. The coalition
brought together panelists to highlight the
challenges in developing an accountability
framework
for internet governance
venues that include setting up standards and developing a set of
concrete criteria. Jyoti Panday provided the perspective of civil
society on why accountability is necessary in internet governance
processes and organizations. The panelists for this workshop included
the following.</p>
<ol>
<li>
<p>
Robin
Gross, IP Justice</p>
</li><li>
<p>
Jeanette
Hofmann, Director
<a href="http://www.internetundgesellschaft.de/">Alexander
von Humboldt Institute for Internet and Society</a></p>
</li><li>
<p>
Farzaneh
Badiei,
Internet Governance Project</p>
</li><li>
<p>
Erika
Mann, Managing Director, Public Policy, Facebook, and Member of the
Board of Directors, ICANN</p>
</li><li>
<p>
Paul
Wilson, APNIC</p>
</li><li>
<p>
Izumi
Okutani, Japan
Network Information Center (JPNIC)</p>
</li><li>
<p>
Keith
Drazek, Verisign</p>
</li><li>
<p>
Jyoti
Panday,
CIS</p>
</li><li>
<p>
Jorge
Cancio,
GAC representative</p>
</li></ol>
<p>
Detailed
description of the workshop is available here
<a href="http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c23/dynamic-coalition-on-accountability-of-internet-governance-venues?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video link available here <a href="https://www.youtube.com/watch?v=UIxyGhnch7w">https://www.youtube.com/watch?v=UIxyGhnch7w</a></p>
<p> VII.
Digital
Infrastructure
Netherlands Foundation organized an open forum at
Workshop Room 3
from 11:00
– 12:00
on
10
November, 2015. The open
forum discussed the increase
in government engagement with “the internet” to protect their
citizens against crime and abuse and to protect economic interests
and critical infrastructures. It
brought together panelists to present ideas about an agenda for the
international protection of ‘the public core of the internet’ and to
collect and discuss ideas for the formulation of norms and principles
and for the identification of practical steps towards that goal.
Pranesh Prakash participated in the open forum. Other speakers
included the following.</p>
<ol>
<li>
<p>
Bastiaan Goslings, AMS-IX, NL</p>
</li><li>
<p>
Pranesh Prakash, CIS, India</p>
</li><li>
<p>
Marilia Maciel, FGV, Brazil</p>
</li><li>
<p>
Dennis Broeders, NL Scientific Council for Government Policy</p>
</li></ol>
<p>
Detailed
description of the open
forum is available here
<a href="http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf">http://schd.ws/hosted_files/igf2015/3d/DINL_IGF_Open%20Forum_The_public_core_of_the_internet.pdf</a></p>
<p>
Video
link available here <a href="https://www.youtube.com/watch?v=joPQaMQasDQ">https://www.youtube.com/watch?v=joPQaMQasDQ</a></p>
<p>
VIII.
UNESCO, Council of Europe, Oxford University, Office of the High
Commissioner on Human Rights, Google, Internet Society organised a
workshop on hate speech and youth radicalisation at Room 9 on
Thursday, November 12. UNESCO shared the initial outcomes of its
commissioned research on online hate speech, including practical
recommendations on combating online hate speech by understanding the
challenges, mobilizing civil society, lobbying the private sector and
intermediaries, and educating individuals in media and information
literacy. The workshop also discussed how to help empower youth to
address online radicalization and extremism and realize their
aspirations to contribute to a more peaceful and sustainable world.
Sunil Abraham provided his inputs. Other speakers included the
following.</p>
<ol>
<li>
<p>
Lidia Brito, Director, UNESCO Office in Montevideo (Chair)</p>
</li><li>
<p>
Frank La Rue, Former Special Rapporteur on Freedom of Expression</p>
</li><li>
<p>
Lillian Nalwoga, President, ISOC Uganda, and representative of CIPESA, Technical community</p>
</li><li>
<p>
Bridget O’Loughlin, CoE, IGO</p>
</li><li>
<p>
Gabrielle Guillemin, Article 19</p>
</li><li>
<p>
Iyad Kallas, Radio Souriali</p>
</li><li>
<p>
Sunil Abraham, Executive Director, Centre for Internet and Society, Bangalore, India</p>
</li><li>
<p>
Eve Salomon, Global Chairman of the Regulatory Board of RICS</p>
</li><li>
<p>
Javier Lesaca Esquiroz, University of Navarra</p>
</li><li>
<p>
Representative, GNI</p>
</li><li>
<p>
Remote Moderator: Xianhong Hu, UNESCO</p>
</li><li>
<p>
Rapporteur: Guilherme Canela De Souza Godoi, UNESCO</p>
</li></ol>
<p>
Detailed
description of the workshop
is available here
<a href="http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no">http://igf2015.sched.org/event/4c1X/ws-128-mitigate-online-hate-speech-and-youth-radicalisation?iframe=no&w=&sidebar=yes&bg=no</a></p>
<p>
Video
link to the panel is available here
<a href="https://www.youtube.com/watch?v=eIO1z4EjRG0">https://www.youtube.com/watch?v=eIO1z4EjRG0</a></p>
<p> <strong>INTERMEDIARY
LIABILITY</strong></p>
<p align="justify">
IX.
Electronic
Frontier Foundation, Centre for Internet and Society India, Open Net
Korea and Article 19 collaborated to organize
a workshop on the <em><strong>Manila
Principles on Intermediary Liability</strong></em>
at Workshop Room 9 from 11:00 am to 12:00 pm on 13 November 2015. The
workshop elaborated on the Manila
Principles, a high level principle framework of best practices and
safeguards for content restriction practices and addressing liability
for intermediaries for third-party content. The workshop saw
participants engaged in overlapping projects considering restriction
practices come together to give feedback and highlight recent
developments across liability regimes. Jyoti Panday laid down the key
details of the Manila Principles framework in this session. The
panelists for this workshop included the following.</p>
<ol>
<li>
<p align="justify">
Kelly
Kim, Open Net Korea</p>
</li><li>
<p align="justify">
Jyoti
Panday, CIS India</p>
</li><li>
<p align="justify">
Gabrielle
Guillemin, Article 19</p>
</li><li>
<p align="justify">
Rebecca
MacKinnon, on behalf of UNESCO</p>
</li><li>
<p align="justify">
Giancarlo
Frosio, Center for Internet and Society, Stanford Law School</p>
</li><li>
<p align="justify">
Nicolo
Zingales, Tilburg University</p>
</li><li>
<p align="justify">
Will
Hudson, Google</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></p>
<p align="justify">
Transcript
of the workshop is available here
<a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2423-2015-11-13-ws-242-the-manila-principles-on-intermediary-liability-workshop-room-9</a></p>
<p align="justify">
Video link available here <a href="https://www.youtube.com/watch?v=kFLmzxXodjs">https://www.youtube.com/watch?v=kFLmzxXodjs</a></p>
<p align="justify"> <strong>ACCESSIBILITY</strong></p>
<p align="justify">
X.
Dynamic
Coalition
on Accessibility and Disability and Global Initiative for Inclusive
ICTs organised a workshop on <em><strong>Empowering
the Next Billion by Improving Accessibility</strong></em> at
Workshop Room 6 from 9:00 am to 10:30 am on 13 November, 2015. The
discussion focused on
the need for and ways of removing accessibility barriers that prevent
over one billion potential users from benefiting from the Internet,
including for essential services. Sunil
Abraham specifically spoke about the lack of compliance of existing
ICT infrastructure with well established accessibility standards
specifically relating to accessibility barriers in the disaster
management process. He discussed the barriers faced by persons with
physical or psychosocial disabilities. The
panelists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">
Francesca
Cesa Bianchi, G3ICT</p>
</li><li>
<p align="justify">
Cid
Torquato, Government of Brazil</p>
</li><li>
<p align="justify">
Carlos
Lauria, Microsoft Brazil</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS India</p>
</li><li>
<p align="justify">
Derrick
L. Cogburn, Institute on Disability and Public Policy (IDPP) for the
ASEAN(Association of Southeast Asian Nations) Region</p>
</li><li>
<p align="justify">
Fernando
H. F. Botelho, F123 Consulting</p>
</li><li>
<p align="justify">
Gunela
Astbrink, GSA InfoComm</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2438-2015-11-13-ws-253-empowering-the-next-billion-by-improving-accessibility-workshop-room-3</a></u></p>
<p align="justify">
Video link for Empowering the Next Billion by Improving Accessibility is available here <a href="https://www.youtube.com/watch?v=7RZlWvJAXxs">https://www.youtube.com/watch?v=7RZlWvJAXxs</a></p>
<p align="justify"> <strong>OPENNESS</strong></p>
<p align="justify">
XI.
A
workshop on <em><strong>FOSS
& a Free, Open Internet: Synergies for Development</strong></em>
was organized at Workshop Room 7 from 2:00 pm to 3:30 pm on 13
November, 2015. The discussion focused on the increasing risk to the
openness of the internet and the ability of present and future
generations to use technology to improve their lives. The panel shared
different perspectives on the future co-development of FOSS and a
free, open Internet; the threats that are emerging; and ways for
communities to surmount them. Sunil
Abraham emphasised the importance of free software, open standards,
open access and access to knowledge and the lack of this mandate in
the draft outcome document for the upcoming WSIS+10 review and called for
inclusion of the same. Pranesh Prakash further contributed to the
discussion by emphasizing the need for free and open source software
with end-to-end encryption and traffic-level encryption, based on
open standards that are decentralized and work through federated
networks. The
panellists for this discussion were the following.</p>
<ol>
<li>
<p align="justify">
Satish
Babu, Technical Community, Chair, ISOC-TRV, Kerala, India</p>
</li><li>
<p align="justify">
Judy
Okite, Civil Society, FOSS Foundation for Africa</p>
</li><li>
<p align="justify">
Mishi
Choudhary, Private Sector, Software Freedom Law Centre, New York</p>
</li><li>
<p align="justify">
Fernando
Botelho, Private Sector, heads F123 Systems, Brazil</p>
</li><li>
<p align="justify">
Sunil
Abraham, CIS
India</p>
</li><li>
<p align="justify">
Pranesh
Prakash, CIS
India</p>
</li><li>
<p align="justify">
Nnenna
Nwakanma, World Wide Web Foundation</p>
</li><li>
<p align="justify">
Yves
Miezan Ezo, open source strategy consultant</p>
</li><li>
<p align="justify">
Corinto
Meffe, Advisor to the President and Directors, SERPRO, Brazil</p>
</li><li>
<p align="justify">
Frank
Coelho de Alcantara, Professor, Universidade Positivo, Brazil</p>
</li><li>
<p align="justify">
Caroline
Burle, Institutional and International Relations, W3C Brazil Office
and Center of Studies on Web Technologies</p>
</li></ol>
<p align="justify">
Detailed
description of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals" target="_top">http://www.intgovforum.org/cms/workshops/list-of-published-workshop-proposals</a></u></p>
<p align="justify">
Transcript
of the workshop is available here
<u><a href="http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7" target="_top">http://www.intgovforum.org/cms/187-igf-2015/transcripts-igf-2015/2468-2015-11-13-ws10-foss-and-a-free-open-internet-synergies-for-development-workshop-room-7</a></u></p>
<p align="justify">
Video
link available here <a href="https://www.youtube.com/watch?v=lwUq0LTLnDs">https://www.youtube.com/watch?v=lwUq0LTLnDs</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015'>http://editors.cis-india.org/internet-governance/blog/summary-report-internet-governance-forum-2015</a>
</p>