The Centre for Internet and Society
http://editors.cis-india.org
These are the search results for the query, showing results 61 to 75.
Google Policy Fellowship Programme: Call for Applications
http://editors.cis-india.org/internet-governance/google-policy-fellowship
<b>The Centre for Internet & Society (CIS) is inviting applications for the Google Policy Fellowship programme. Google is providing a USD 7,500 stipend to the India Fellow, who will be selected by August 15, 2012.</b>
<p>The <a class="external-link" href="http://www.google.com/policyfellowship/">Google Policy Fellowship</a> offers successful candidates an opportunity to develop research and debate on the fellowship focus areas, which include Access to Knowledge, Openness in India, Freedom of Expression, Privacy, and Telecom, for a period of about ten weeks starting from August 2012 upto October 2012. CIS will select the India Fellow. Send in your applications for the position by June 27, 2012.</p>
<p>To apply, please send the following materials to <a class="external-link" href="mailto:google.fellowship@cis-india.org">google.fellowship@cis-india.org</a>:</p>
<ol><li><strong>Statement of Purpose</strong>: A brief write-up outlining your interest in and qualifications for the programme, including relevant academic, professional and extracurricular experiences. As part of the write-up, also explain what you hope to gain from participation in the programme and what research work concerning free expression online you would like to further through it. (1,200 words max.)</li><li><strong>Resume</strong></li><li><strong>Three references</strong></li></ol>
<h2>Fellowship Focus Areas</h2>
<ul><li><strong>Access to Knowledge</strong>: Studies looking at access to knowledge issues in India in light of copyright law, consumers law, parallel imports and the interplay between pervasive technologies and intellectual property rights, targeted at policymakers, Members of Parliament, publishers, photographers, filmmakers, etc.</li><li><strong>Openness in India</strong>: Studies with policy recommendations on open access to scholarly literature, free access to law, open content, open standards, free and open source software, aimed at policymakers, policy researchers, academics and the general public. </li><li><strong>Freedom of Expression</strong>: Studies on policy, regulatory and legislative issues concerning censorship and freedom of speech and expression online, aimed at bloggers, journalists, authors and the general public.</li><li><strong>Privacy</strong>: Studies on privacy issues like data protection and the right to information, limits to privacy in light of the provisions of the constitution, media norms and privacy, banking and financial privacy, workplace privacy, privacy and wire-tapping, e-governance and privacy, medical privacy, consumer privacy, etc., aimed at policymakers and the public.</li><li><strong>Telecom</strong>: Building awareness and capacity on telecommunication policy in India for researchers and academicians, policymakers and regulators, consumer and civil society organisations, education and library institutions and lay persons through the creation of a dedicated web based resource focusing on knowledge dissemination.<br /></li></ul>
<h2>Frequently Asked Questions</h2>
<ul><li><strong>What is the Google Policy Fellowship program?</strong><br />The Google Policy Fellowship program offers students interested in Internet and technology related policy issues an opportunity to spend their summer working on these issues at the Centre for Internet and Society in Bangalore. Students will work for a period of ten weeks starting from July 2012. The research agenda for the program is based on legal and policy frameworks in the region, connected to ground-level perceptions of the fellowship focus areas mentioned above.<br /></li></ul>
<ul><li><strong>Are there any age restrictions on participating in the program?</strong><br />Yes. You must be 18 years of age or older by January 1, 2012 to be eligible to participate in the Google Policy Fellowship program in 2012.<br /></li></ul>
<ul><li><strong>Are there citizenship requirements for the Fellowship?</strong><br />For the time being, we are only accepting students eligible to work in India (e.g. Indian citizens, permanent residents of India, and individuals presently holding an Indian student visa). Google cannot provide guidance or assistance on obtaining the necessary documentation to meet the criteria.<br /></li></ul>
<ul><li><strong>Who is eligible to participate as a student in Google Policy Fellowship program?</strong><br />In order to participate in the program, you must be a student. Google defines a student as an individual enrolled in or accepted into an accredited institution including (but not necessarily limited to) colleges, universities, masters programs, PhD programs and undergraduate programs. Eligibility is based on enrollment in an accredited university by January 1, 2012.<br /></li></ul>
<ul><li><strong>I am an international student. Can I apply and participate in the program?</strong><br />In order to participate in the program, you must be a student (see Google's definition of a student above). You must also be eligible to work in India (see the section on citizenship requirements above). Google cannot provide guidance or assistance on obtaining the necessary documentation to meet this criterion.</li><li><strong>I have been accepted into an accredited post-secondary school program, but have not yet begun attending. Can I still take part in the program?</strong><br />As long as you are enrolled in a college or university program as of January 1, 2012, you are eligible to participate in the program.</li><li><strong>I graduate in the middle of the program. Can I still participate?</strong><br />As long as you are enrolled in a college or university program as of January 1, 2012, you are eligible to participate in the program.</li></ul>
<h2>Payments, Forms, and Other Administrative Stuff</h2>
<h3>How do payments work?*</h3>
<p>Google will provide a stipend of USD 7,500 equivalent to each Fellow for the summer.</p>
<ul><li>Accepted students in good standing with their host organization will receive a USD 2,500 stipend payable shortly after they begin the Fellowship in August 2012.</li><li>Students who receive passing mid-term evaluations by their host organization will receive a USD 1,500 stipend shortly after the mid-term evaluation in September 2012.</li><li>Students who receive passing final evaluations by their host organization and who have submitted their final program evaluations will receive a USD 3,500 stipend shortly after final evaluations in October 2012.</li></ul>
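The three disbursements above add up to the full stipend. A minimal sketch of that arithmetic (the milestone labels are informal descriptions of the schedule, not official programme terms):

```python
# Illustrative check of the three-instalment stipend schedule described above.
# Labels are informal; amounts are taken from the schedule in the text.
installments = {
    "start (August 2012)": 2500,
    "mid-term (September 2012)": 1500,
    "final (October 2012)": 3500,
}

total_usd = sum(installments.values())
print(total_usd)  # 7500, the full USD 7,500 stipend
```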
<p>Please note: <em>Payments will be made by electronic bank transfer, and are contingent upon satisfactory evaluations by the host organization and completion of all required enrollment and other forms. Fellows are responsible for payment of any taxes associated with their receipt of the Fellowship stipend</em>.</p>
<p><strong>*</strong>While the three-step payment structure given here corresponds to the one used in the United States, disbursement of the amount may be altered as necessary.</p>
<h3>What documentation is required from students?</h3>
<p>Students should be prepared, upon request, to provide Google or the host organization with transcripts from their accredited institution as proof of enrollment or admission status. Transcripts do not need to be official (a photocopy of the original will be sufficient).</p>
<h3>I would like to use the work I did for my Google Policy Fellowship to obtain course credit from my university. Is this acceptable?</h3>
<p>Yes. If you need documentation from Google to provide to your school for course credit, you can contact Google. We will not provide documentation until we have received a final evaluation from your mentoring organization.</p>
<h2>Host Organizations<br /></h2>
<h3>What is Google's relationship with the Centre for Internet and Society?</h3>
<p>Google provides the funding and administrative support for individual fellows directly. Google and the Centre for Internet and Society are not partners or affiliates. The Centre for Internet and Society does not represent the views or opinions of Google and cannot bind Google legally.</p>
<h2>Important Dates<br /></h2>
<h3><strong>What is the program timeline?</strong></h3>
<table class="plain">
<tbody>
<tr>
<td>June 27, 2012</td>
<td>Student Application Deadline. Applications must be received by midnight.</td>
</tr>
<tr>
<td>July 18, 2012</td>
<td>Student applicants are notified of the status of their applications.</td>
</tr>
<tr>
<td>August 2012</td>
<td>Students begin their fellowship with the host organization (start date to be determined by students and the host organization); Google issues initial student stipends.</td>
</tr>
<tr>
<td>September 2012</td>
<td>Mid-term evaluations; Google issues mid-term stipends.</td>
</tr>
<tr>
<td>October 2012</td>
<td>Final evaluations; Google issues final stipends.</td>
</tr>
</tbody>
</table>
<p> </p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/google-policy-fellowship'>http://editors.cis-india.org/internet-governance/google-policy-fellowship</a>
</p>
praskrishna · 2012-05-24 · Blog Entry · Filed under: Access to Knowledge, Freedom of Speech and Expression, Public Accountability, Internet Governance, Research, Telecom, Intermediary Liability, Censorship, Openness

GNI and IAMAI Launch Interactive Slideshow Exploring Impact of India's Internet Laws
http://editors.cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws
<b>The Global Network Initiative and the Internet and Mobile Association of India have come together to explain how India’s Internet and technology laws impact economic innovation and freedom of expression. </b>
<p>The <a class="external-link" href="http://www.globalnetworkinitiative.org/">Global Network Initiative (GNI)</a>, and the <a class="external-link" href="http://www.iamai.in/">Internet and Mobile Association of India (IAMAI)</a> have launched an interactive slide show exploring the impact of existing Internet laws on users and businesses in India. The slide show created by Newsbound, and to which Centre for Internet and Society (CIS) has contributed its comments—explain the existing legislative mechanisms prevalent in India, map the challenges of the regulatory environment and highlight areas where such mechanisms can be strengthened.</p>
<p>Foregrounding the difficulties of content regulation, the slides aim to inform users and the public of the constraints of the current legal mechanisms in place, including safe harbour and notice-and-takedown provisions. Highlighting Section 79(3) and the Intermediary Liability Rules issued in 2011, the slide show identifies some of the challenges faced by Internet platforms, such as the broad interpretation of the legislation by the executive branch.</p>
<p>Challenges governing Internet platforms highlighted in the slide show include uniform Terms of Service that do not consider the type of service being provided by the platform, uncertain requirements for taking down content and compliance obligations related to information disclosure. Further challenges include over-compliance and misuse of the legal notice-and-takedown system introduced under Section 79 of the IT Act and the Information Technology (Intermediaries Guidelines) Rules, 2011.</p>
<p>The Rules were created with the purpose of providing guidelines for the ‘post-publication redressal mechanism expression as envisioned in the Constitution of India'. However, since their introduction, the Rules have been criticised extensively by both national and international media for not conforming to principles of natural justice and freedom of expression. Critics have pointed out that by not recognising the different functions performed by different intermediaries, and by not providing safeguards against misuse of such mechanisms for suppressing legitimate expression, the Rules have a chilling effect on freedom of expression.</p>
<p>Under the current Rules, the third-party provider or creator of information is not given a chance to be heard by the intermediary, nor is the intermediary required to give a reasoned decision to the creator whose content has been taken down. The take-down procedure also has no provisions for restoring removed information, such as a counter-notice filing mechanism or an appeal to a higher authority. Further, the criteria for removal of content include terms like 'disparaging' and 'objectionable', which are not defined and prima facie seem to go beyond the reasonable restrictions envisioned by the Constitution of India. With uncertain content criteria and no safeguards to prevent abuse, complainants may send frivolous complaints and suppress legitimate expression without any fear of repercussions.</p>
<p>Most importantly, the redressal mechanism under the Rules shifts the burden of censorship, previously the exclusive domain of the judiciary or the executive, onto private intermediaries. Private intermediaries often do not have sufficient legal resources to determine the legitimacy of a legal claim, resulting in over-compliance to limit liability. The slide show cites the <a href="http://editors.cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet">2011 CIS research carried out by Rishabh Dara</a>, which examined whether the Rules lead to a chilling effect on online free expression, to highlight the issue of over-compliance and self-censorship.</p>
<p>The initiative is timely, given the change of guard in India, and stresses not only the economic impact of fixing the Internet legal framework but also the larger impact on users' rights and freedom of expression. It calls for a legal environment for the Internet that enables innovation, protects the rights of users, and provides clear rules and regulations for businesses large and small.</p>
<p>See the slideshow here: <a href="http://globalnetworkinitiative.org/india">How India’s Internet Laws Can Help Propel the Country Forward</a></p>
<p><strong>Other GNI reports and resources: </strong></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Closing the Gap: Indian Online Intermediaries and a Liability System Not Yet Fit for Purpose</a></p>
<p><a href="http://www.globalnetworkinitiative.org/sites/default/files/Closing%20the%20Gap%20-%20Copenhagen%20Economics_March%202014_0.pdf">Strengthening Protections for Online Platforms Could Add Billions to India’s GDP</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws'>http://editors.cis-india.org/internet-governance/blog/gni-and-iamai-launch-interactive-slideshow-exploring-impact-of-indias-internet-laws</a>
</p>
jyoti · 2014-07-17 · Blog Entry · Filed under: Censorship, Freedom of Speech and Expression, Internet Governance, Intermediary Liability, Chilling Effect, Information Technology

Finding Needles in Haystacks - Discussing the Role of Automated Filtering in the New Indian Intermediary Liability Rules
http://editors.cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules
<b>On 25 February 2021, the Government of India notified the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. The new Rules broaden the scope of which entities can be considered intermediaries, now including curated-content platforms (such as Netflix) as well as digital news publications. This blog post analyses the rule on automated filtering in the context of the growing use of automated content moderation.</b>
<p class="p1"><span class="s1">This article first <a class="external-link" href="https://www.law.kuleuven.be/citip/blog/finding-needles-in-haystacks/">appeared</a> on the KU Leuven's Centre for IT and IP (CITIP) blog. Cross-posted with permission.</span></p>
<p class="p1"><span class="s1">----</span></p>
<p class="p1"><span class="s1">Mathew Sag in his 2018 <a href="https://scholarship.law.nd.edu/cgi/viewcontent.cgi?article=4761&context=ndlr"><span class="s2">paper</span></a> on internet safe harbours discussed how the internet resulted in a shift from the traditional gatekeepers of knowledge (publishing houses) that used to decide what knowledge could be showcased, to a system where everybody who has access to the internet can showcase their work. A “<em>content creator</em>” today ranges from legacy media companies to any person who has access to a smartphone and an internet connection. In a similar trajectory, with the increase in websites and mobile apps and the functions that they serve, the scope of what is an internet intermediary has widened all over the world. </span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1"><strong>Who is an Intermediary?</strong></span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">In India the definition of “<em>intermediary</em>” is found under Section 2(w) of the <a href="https://www.meity.gov.in/writereaddata/files/itbill2000.pdf"><span class="s2">Information Technology (IT) Act 2000</span></a>, which defines an Intermediary as <em>“with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecoms service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-marketplaces and cyber cafes”.</em> The all-encompassing nature of the definition has allowed the dynamic nature of intermediaries to be included under the definition of the Act, and the Guidelines that have been published periodically (<a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%25281%2529_0.pdf"><span class="s2">2011</span></a>, <a href="https://www.meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf"><span class="s2">2018</span></a> and <a href="https://www.meity.gov.in/writereaddata/files/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf"><span class="s2">2021</span></a>). With more websites and social media companies, and even more content creators online today, there is a need to look at ways in which intermediaries can remove illegal content or content that goes against their community guidelines.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">Along with the definition of an intermediary, the IT Act, under Section 79, provides exemptions which grant safe harbours to internet intermediaries, from liability from third-party content, and further empowers the central government to make Rules that act as guidelines for the intermediaries to follow. The Intermediary Liability Rules hence seek to regulate content and lay down safe harbour provisions for intermediaries and internet service providers. To keep up with the changing nature of the internet and internet intermediaries, India relies on the Intermediary Liability Rules to regulate and provide a conducive environment for intermediaries. In view of this provision India has as of now published three versions of the Intermediary Liability (IL) Rules. The first Rules came out in<a href="https://www.meity.gov.in/writereaddata/files/GSR314E_10511%25281%2529_0.pdf"><span class="s2"> 2011</span></a>, followed by the introduction of draft amendments to the law in<a href="https://www.meity.gov.in/writereaddata/files/Draft_Intermediary_Amendment_24122018.pdf"><span class="s2"> 2018</span></a> and finally the latest <a href="https://www.meity.gov.in/writereaddata/files/Intermediary_Guidelines_and_Digital_Media_Ethics_Code_Rules-2021.pdf"><span class="s2">2021 </span></a>version, which would supersede the earlier Rules of 2011. </span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1"><strong>The Growing Use of Automated Content Moderation </strong></span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">With each version of the Rules there seemed to be changes that ensured that they were abreast with the changing face of the internet and the changing nature of both content and content creator. Hence the 2018 version of the Rules showcase a push towards automated content filtering. The text of Rule 3(9) reads as follows: “<em>The Intermediary shall deploy technology based automated tools or appropriate mechanisms, with appropriate controls, for proactively identifying and removing or disabling public access to unlawful information or content</em>”.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">Under Rule 3(9), intermediaries were required to deploy automated tools or appropriate mechanisms to proactively identify, remove or disable public access to unlawful content. However, neither the 2018 IL Rules, nor the parent Act (the IT Act) specified which content can be deemed unlawful. The 2018 Rules also failed to establish the specific responsibilities of the intermediaries, instead relying on vague terms like “<em>appropriate mechanisms</em>” and with “<em>appropriate controls</em>”. Hence it can be seen that though the Rules mandated the use of automated tools, neither them nor the IT Act provided clear guidelines on what could be removed. </span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">The lack of clear guidelines and list of content that can be removed had left the decision up to the intermediaries to decide which content, if not actively removed, could cost them their immunity. It has been previously documented that the lack of clear guidelines in the 2011 version of the <a href="https://cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet"><span class="s2">Rules</span></a>, led to intermediaries over complying with take down notices, often taking down content that did not warrant it. The existing tendency to over-comply, combined with automated filtering could have resulted in a number of <a href="https://cis-india.org/internet-governance/how-india-censors-the-web-websci#:~:text=One%2520of%2520the%2520primary%2520ways,certain%2520websites%2520for%2520its%2520users."><span class="s2">unwarranted take downs</span></a>.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">While the 2018 Rules mandated the deployment of automated tools, the year 2020, (possibly due to the pandemic induced work from home safety protocols and global lockdowns) saw major social media companies announcing the move towards a fully automated system of content<a href="https://www.medianama.com/2020/03/223-facebook-content-moderation-coronavirus-medianamas-take/"><span class="s2"> moderation</span></a>. Though the use of automated content removal seems like the right step considering the <a href="https://www.businessinsider.in/tech/news/facebook-content-moderator-who-quit-reportedly-wrote-a-blistering-letter-citing-stress-induced-insomnia-among-other-trauma/articleshow/82075608.cms"><span class="s2">trauma </span></a>that human moderators had to go through, the algorithms that are being used now to remove content are relying on the parameters, practices and data from earlier removals made by the human moderators. More recently, in India with the emergence of the second wave of the COVID19 wave, the Ministry of Electronics and Information Technology has <a href="https://www.thehindu.com/news/national/govt-asks-social-media-platforms-to-remove-100-covid-19-related-posts/article34406733.ece"><span class="s2">asked </span></a>social media platforms to remove “<em>unrelated, old and out of the context images or visuals, communally sensitive posts and misinformation about COVID19 protocols</em>”.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1"><strong>The New IL Rules - A ray of hope?</strong></span></p>
<p class="p3"><span class="s3">The 2021 version of the IL Rules provides a more nuanced approach to the use of automated content filtering compared to the earlier version. Rule 4(4) now requires only “</span><span class="s1">significant social media intermediaries” to use automated tools to identity and take down content pertaining to “child sexual abuse material”, or “depicting rape”, or any information which is identical to a content that has already been removed through a take-down notice. The Rules define a social media intermediary as “<em>intermediary which primarily or solely enables interaction between two or more users and allows them to create, upload, share, disseminate, modify or access information using its services”</em> .The Rules also go a step further to create another type of intermediary, the significant social media intermediary. A significant social media intermediary is defined as one “<em>having a number of registered users in India above such threshold as notified by the Central Government</em>''. Hence what can be considered as a social media intermediary that qualifies as a significant one could change at any time.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s4">Along with adding a new threshold (qualifying as a significant social media intermediary) the Rules, in contrast to the 2018 version, also emphasises the need of such removal to be </span><span class="s1">proportionate to the interests of freedom of speech and expression and privacy of users. The Rules also call for “<em>appropriate human oversight</em>” as well as a periodic review of the tools used for content moderation. The Rules by using the term “<em>shall endeavor</em>” aids in reducing the pressure on the intermediary to set up these mechanisms. This also means that the requirement is now on a best effort basis, as opposed to the word “<em>shall</em>” in the 2018 version of the Rules, which made it mandatory.</span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p1"><span class="s1">Although the Rules now narrow down the instances where automated content removal can take place, the concerns around over compliance and censorship still loom. One of the reasons for concern is that the Rules still fail to require the intermediaries to set up a mechanism for redress or for appeals to such removal. Additionally, the provision that states that automated systems could remove content that have been previously taken down, creates a cause for worry as the propensity of the intermediaries to over comply and take down content has already been documented. This then brings us back to the previous issue where the social media company’s automated systems were removing legitimate news sources. Though the 2021 Rules tries to clarify certain provisions related to automated filtering, like the addition of the safeguards, the Rules also suffer from vague provisions that could cause issues related to compliance. The use of terms such as “<em>proportionate</em>”, “<em>having regard to free speech</em>” etc. fail to lay down definitive directions for the intermediaries (in this case SSMI) to comply with. Additionally, as earlier stated, being qualified as a SSMI can change at any time, either based on the change in the number of users, or the change in the threshold of users, mandated by the government. The absence of human intervention during removal, vague guidelines and fear of losing out on safe harbour provisions, add to the already increasing trend of censorship in social media. 
With the use of automated means and the fast, and almost immediate removal of content would mean that certain content creators might not even be able to post their content <a href="https://www.eff.org/wp/unfiltered-how-youtubes-content-id-discourages-fair-use-and-dictates-what-we-see-online"><span class="s2">online.</span><span class="s5"> With the use of proactive filtering through automated means the content can be removed almost immediately.</span></a></span><span class="s6"> </span><span class="s1">With India’s current trend of new internet users, some of these creators would also be <a href="https://timesofindia.indiatimes.com/business/india-business/for-the-first-time-india-has-more-rural-net-users-than-urban/articleshow/75566025.cms"><span class="s2">first time users</span></a> of the internet. </span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p3"><span class="s1"><strong>Conclusion</strong></span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p3"><span class="s1">The need for automated removal of content is understandable, based not only on the sheer volume of content but also the nightmare stories of the toll it takes on human content moderators, who otherwise have to go through hours of disturbing content. Though the Indian Intermediary Liability Guidelines have improved from the earlier versions in terms of moving away from mandating proactive filtering, there still needs to be consideration of how these technologies are used, and the laws should understand the shift in the definition of who a content creator is. There needs to be ways of recourse to unfair removal of content and a means to get an explanation of why the content was removed, via notices to the user. In the case of India, the notices should be in Indian languages as well, so that the people are able to understand them. </span></p>
<p class="p2"><span class="s1"></span></p>
<p class="p3"><span class="s1">In the absence of further clear guidelines, the perils of over-censorship by the intermediaries in order to stay out of trouble could lead to further stifling of not just freedom of speech but also access to information. In addition, the fear of content being taken down or even potential prosecution could mean that people resort to self-censorship, preventing them from exercising their fundamental rights to freedom of speech and expression, as guaranteed by the Indian Constitution. We hope that the next version of the Rules take a more nuanced approach to automated content removal and ensure adequate and specific safeguards to ensure a conducive environment for both intermediaries and content creators. </span></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules'>http://editors.cis-india.org/internet-governance/blog/finding-needles-in-haystacks-discussing-the-role-of-automated-filtering-in-the-new-indian-intermediary-liability-rules</a>
</p>
Shweta Mohandas and Torsha Sarkar · 2021-08-03 · Blog Entry · Filed under: Internet Governance, Intermediary Liability, Artificial Intelligence

European Court of Justice rules Internet Search Engine Operator responsible for Processing Personal Data Published by Third Parties
http://editors.cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties
<b>The Court of Justice of the European Union has ruled that “an internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties”. The decision adds to the conundrum of maintaining a balance between freedom of expression, protecting personal data and intermediary liability.</b>
<p style="text-align: justify; ">The ruling is expected to have considerable impact on reputation- and privacy-related takedown requests, as data subjects may now approach the operator directly to seek removal of links to web pages containing their personal data. Until now, the burden was on users to establish why data should not remain online; the ruling reverses that burden of proof, placing the obligation for content regulation on companies rather than on users.</p>
<h3>A win for privacy?</h3>
<p style="text-align: justify; ">The ECJ ruling addresses Mario Costeja González's complaint, filed in 2010 against Google Spain and Google Inc., requesting that personal data relating to him appearing in search results be protected and that data which was no longer relevant be removed. Referring to <a href="http://eur-lex.europa.eu/LexUriServ/LexUriServ.do?uri=CELEX:31995L0046:en:HTML">Directive 95/46/EC</a> of the European Parliament, the court said that Google and other search engine operators should be considered 'controllers' of personal data. Following the decision, Google will be required to consider takedown requests for personal data, even though it processes such data without distinguishing it from other information.</p>
<p style="text-align: justify; ">The decision—which cannot be appealed—raises important questions about how the ruling will be applied in practice and about its impact on the information available online in countries outside the European Union. It forces search engine operators such as Google, Yahoo and Microsoft's Bing to make judgement calls on the fairness of the information published through their services, which reach over 500 million people across the twenty-eight-nation EU bloc.</p>
<p style="text-align: justify; ">The ECJ held that search engines should, 'as a general rule,' place the right to privacy above the public's right to information. Under the verdict, links to irrelevant and out-of-date data must be erased upon request, placing search engines in the role of controllers of information, beyond the role of arbiters merely linking to data that already exists in the public domain. The verdict highlights the power of search engines to retrieve controversial information while limiting their capacity to do so in the future.</p>
<p style="text-align: justify; ">The ruling calls for a balance between the legitimate interest of internet users in accessing personal information and the data subject's fundamental rights, but does not directly address either issue. The court also recognised that the data subject's rights override the interest of internet users, with exceptions relating to the nature of the information, its sensitivity for the data subject's private life, and the role of the data subject in public life. Acknowledging that data belongs to the individual and is not the property of the company, European Commissioner Viviane Reding <a href="https://www.facebook.com/permalink.php?story_fbid=304206613078842&id=291423897690447&_ga=1.233872279.883261846.1397148393">hailed the verdict</a> as "a clear victory for the protection of personal data of Europeans".</p>
<p style="text-align: justify; ">The Court stated that if data is deemed irrelevant at the time of the case, it must be removed even if it was lawfully processed initially, and that the data subject has the right to approach the operator directly for removal of such content. The liability issue is further complicated by the fact that search engines such as Google do not publish content; rather, they point to information that already exists in the public domain, raising questions about the degree of liability they should bear for third-party content displayed through their services.</p>
<p style="text-align: justify; ">The ECJ ruling is based on the case originally filed against Google Spain. González argued that searches for his name surfaced links to two pages published in 1998 on the website of the Spanish newspaper La Vanguardia. The Spanish Data Protection Agency did not require La Vanguardia to take down the pages; however, it did order Google to remove links to them. Google appealed this decision, following which the National High Court of Spain sought advice from the European court. The designation of Google as a controller of information raises important questions about the distinction between the liability of publishers and the liability of processors of information such as search engines.</p>
<h3>The 'right to be forgotten'</h3>
<p style="text-align: justify; ">The decision also brings to the fore the ongoing debate and <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">fragmented opinions within the EU</a> on the right of the individual to be forgotten. The <a href="http://www.bbc.com/news/technology-16677370">'right to be forgotten'</a> emerged from the European Commission's wide-ranging plans to overhaul its 1995 Data Protection Directive. The proposed law would allow people to request removal of personal data, with an obligation on service providers to comply unless there were 'legitimate' reasons to do otherwise. Technology firms, rallying around issues of freedom of expression and censorship, have expressed concerns about the reach of the bill. Privacy-rights activists and European officials have upheld the notion of the right to be forgotten, highlighting the right of individuals to protect their honour and reputation.</p>
<p style="text-align: justify; ">These issues have been controversial among EU member states, with the UK's Ministry of Justice claiming the law 'raises unrealistic and unfair expectations' and having <a href="http://www.theguardian.com/technology/2013/apr/04/britain-opt-out-right-to-be-forgotten-law">sought to opt out</a> of the privacy laws. The <a href="http://curia.europa.eu/juris/document/document.jsf?text=&docid=138782&pageIndex=0&doclang=EN&mode=req&dir=&occ=first&part=1&cid=362663#Footref91">opinion of Advocate General Niilo Jääskinen</a>, that the individual's right to seek removal of content should not be upheld if the information was published legally, contradicts the ECJ's verdict. The court's move is surprising to many; as Richard Cumbley, information-management and data-protection partner at the law firm Linklaters, <a href="http://turnstylenews.com/2014/05/13/europe-union-high-court-establishes-the-right-to-be-forgotten/">puts it</a>, "Given that the E.U. has spent two years debating this right as part of the reform of E.U. privacy legislation, it is ironic that the E.C.J. has found it already exists in such a striking manner."</p>
<p style="text-align: justify; ">Beyond the economic implications of a liability regime in which search engine operators censor legal content in their results, the decision may also have a chilling effect on freedom of expression and access to information. Google <a href="http://www.theguardian.com/technology/2014/may/13/right-to-be-forgotten-eu-court-google-search-results">called the decision</a> "a disappointing ruling for search engines and online publishers in general," and said the company would take time to analyse its implications. While those implications are yet to be determined, it is important to bear in mind that although decisions like these are public, the changes that Google and other search engines will have to make to their technology, and the judgement calls they will make on the fairness of information available online, are not.</p>
<p style="text-align: justify; ">The ECJ press release is available <a href="http://curia.europa.eu/jcms/upload/docs/application/pdf/2014-05/cp140070en.pdf">here</a> and the actual judgement is available <a href="http://curia.europa.eu/juris/documents.jsf?pro=&lgrec=en&nat=or&oqp=&lg=&dates=&language=en&jur=C%2CT%2CF&cit=none%252CC%252CCJ%252CR%252C2008E%252C%252C%252C%252C%252C%252C%252C%252C%252C%252Ctrue%252Cfalse%252Cfalse&num=C-131%252F12&td=%3BALL&pcs=Oor&avg">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties'>http://editors.cis-india.org/internet-governance/blog/ecj-rules-internet-search-engine-operator-responsible-for-processing-personal-data-published-by-third-parties</a>
</p>
No publisher · jyoti · Freedom of Speech and Expression · Social Media · Internet Governance · Intermediary Liability · 2014-05-14T14:18:46Z · Blog Entry
Donald Trump is attacking the social media giants; here’s what India should do differently
http://editors.cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently
<b>For a robust and rights-respecting public sphere, India needs to ensure that large social media platforms receive adequate protections, and are made more responsible to its users.</b>
<p>This piece was first published at <a class="external-link" href="https://scroll.in/article/965151/donald-trump-is-attacking-the-social-media-giants-heres-what-india-should-do-differently">Scroll</a>. The authors would like to thank Torsha Sarkar for reviewing and editing the piece, and to Divij Joshi for his feedback.</p>
<hr />
<div id="article-contents" class="article-body">
<p>In retaliation to Twitter <a class="link-external" href="https://www.nytimes.com/2020/05/26/technology/twitter-trump-mail-in-ballots.html" rel="nofollow noopener" target="_blank">labelling</a> one of US President Donald Trump’s tweets as being misleading, the White House signed an <a class="link-external" href="https://www.whitehouse.gov/presidential-actions/executive-order-preventing-online-censorship/" rel="nofollow noopener" target="_blank">executive order</a>
on May 28 that seeks to dilute protections that social media companies
in the US have with respect to third-party content on their platforms.</p>
<p>The
order argues that social media companies that engage in censorship stop
functioning as ‘passive bulletin boards’: they must consequently be
treated as ‘content creators’, and be held liable for content on their
platforms as such. The shockwaves of the decision soon reached India,
with news coverage of the event <a class="link-external" href="https://www.business-standard.com/article/companies/trump-twitter-spat-debate-rages-on-role-of-social-media-companies-120053100055_1.html" rel="nofollow noopener" target="_blank">starting</a> to <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/feud-between-donald-trump-and-jack-dorsey-can-have-long-lasting-effects-on-how-we-consume-media-in-india/articleshow/76111556.cms" rel="nofollow noopener" target="_blank">debate</a> the <a class="link-external" href="https://economictimes.indiatimes.com/tech/internet/trumps-move-against-social-media-cos-unlikely-to-change-indias-stand/articleshow/76094586.cms?from=mdr" rel="nofollow noopener" target="_blank">consequences</a> of Trump’s order on how India regulates internet services and social media companies.</p>
<p>The
debate on the responsibilities of online platforms is not new to India,
and recently took centre stage in December 2018 when the Ministry of
Electronics and Information Technology, Meity, published a draft set of
guidelines that most online services – ‘intermediaries’ – must follow.
The draft rules, which haven’t been notified yet, propose to
significantly expand the obligations on intermediaries.</p>
<p>Trump’s
executive order, however, comes in the context of content moderation
practices by social media platforms, i.e. when platforms censor speech
of their own volition, and not because of legal requirements. The legal
position of content moderation remains relatively under-discussed
when it comes to India.</p>
<p>In contrast to
commentators who have implicitly assumed that Indian law permits content
moderation by social media companies, we believe Indian law fails to
adequately account for content moderation and curation practices
performed by social media companies. There may be adverse consequences
for the exercise of freedom of expression in India if this lacuna is not
filled soon.</p>
<h3 class="cms-block cms-block-heading">India vs US<br /></h3>
<p>A
useful starting point for the analysis is to compare how the US and
India regulate liability for online services. In the US, Section 230 of
the Communications Decency Act provides online services with broad
immunity from liability for third party content that they host or
transmit.</p>
<p>There are two critical components to what is generally referred to as Section 230.</p>
<p>First,
providers of an ‘interactive computer service’, like your internet
service provider or a company like Facebook, will not be treated as
publishers or speakers of third-party content. This system has allowed
the internet speech and economy to <a class="link-external" href="https://law.emory.edu/elj/content/volume-63/issue-3/articles/how-law-made-silicon-valley.html" rel="nofollow noopener" target="_blank">flourish</a>
since it allows companies to focus on their service without a constant
paranoia for what users are transmitting through their service.</p>
<p>The
second part of Section 230 states that services are allowed to moderate
and remove, in ‘good faith’, such third-party content that they may
deem offensive or obscene. This allows for online services to instate
their own community guidelines or content policies.</p>
<p>In India,
section 79 of the Information Technology Act is the analogous provision:
it grants intermediaries conditional ‘safe harbour’. This means
intermediaries, again like Facebook or your internet provider, are
exempt from liability for third-party content – like messages or videos
posted by ordinary people – provided their functioning meets certain
requirements, and they comply with the allied rules, known as
Intermediary Guidelines.</p>
<p>The notable and stark difference between
Indian law and Section 230 is that India’s IT Act is largely silent on
content moderation practices. As Rahul Matthan <a class="link-external" href="https://www.livemint.com/opinion/columns/shield-online-platforms-for-content-moderation-to-work-11591116270685.html" rel="nofollow noopener" target="_blank">points out</a>,
there is no explicit allowance in Indian law for platforms to take down
content based on their own policies, even if such actions are done in
good faith.</p>
<h3 class="cms-block cms-block-heading">Safe harbour</h3>
<div> </div>
<p>One
may argue that the absence of an explicit permission does not
necessarily mean that any platform engaging in content moderation
practices will lose its safe harbour. However, the language of Section
79 and the allied rules may even create room for divesting social media
platforms of their safe harbour.</p>
<p>The first such indication is
that, among the conditions to qualify for safe harbour, intermediaries
must not modify said content, must not select the recipients of
particular content, and must take information down when it is brought
to their notice by governments or courts.</p>
<p>Most of the conditions are almost a
verbatim copy of those for a ‘mere conduit’ as defined by the EU Directive on
E-Commerce, 2000. This definition was meant to encapsulate the
functioning of services like infrastructure providers, which transmit
content without exerting any real control. Thus, by adopting this
definition for all intermediaries, Indian law mostly considers internet
services, even social media platforms, to be passive plumbing through
which information flows.</p>
<p>It is easy to see how this narrow conception of online services is severely <a class="link-external" href="https://georgetownlawtechreview.org/wp-content/uploads/2018/07/2.2-Gilespie-pp-198-216.pdf" rel="nofollow noopener" target="_blank">lacking</a>.</p>
<p>Most prominent social media platforms <a class="link-external" href="http://guidelines." rel="nofollow noopener" target="_blank">remove</a> or <a class="link-external" href="https://techcrunch.com/2019/12/16/instagram-fact-checking/" rel="nofollow noopener" target="_blank">hide</a> content, <a class="link-external" href="https://about.fb.com/news/2016/06/building-a-better-news-feed-for-you/" rel="nofollow noopener" target="_blank">algorithmically curate</a> news-feeds to make users keep coming back for more, and increasingly add <a class="link-external" href="https://blog.twitter.com/en_us/topics/product/2020/updating-our-approach-to-misleading-information.html" rel="nofollow noopener" target="_blank">labels</a>
to content. If the law is interpreted strictly, these practices may be
adjudged to run afoul of the aforementioned conditions that
intermediaries need to satisfy in order to qualify for safe harbour.</p>
<h3 class="cms-block cms-block-heading">Platforms or editors?<br /></h3>
<p>For
instance, it can be argued that social media platforms initiate
transmission in some form when they pick and ‘suggest’ relevant
third-party content to users. When it comes to newsfeeds, neither the
content creator nor the consumer has as much control over how content
is disseminated or curated as the platform does. By curating newsfeeds,
social media platforms can be said, in essence, to be ‘selecting the
receiver’ of transmissions.</p>
<p>The Intermediary
Guidelines further complicate matters by specifically laying out what is
not to be construed as ‘editing’ under the law. Under rule 3(3), the
act of taking down content pursuant to orders under the Act will not be
considered as ‘editing’ of said content.</p>
<p>Since the term ‘editing’
has been left undefined beyond the negative qualification, several
social media intermediaries may well qualify as editors. They use
algorithms that curate content for their users; like traditional news
editors, these algorithms use certain <a class="link-external" href="https://www.researchgate.net/profile/Michael_Devito/publication/302979999_From_Editors_to_Algorithms_A_values-based_approach_to_understanding_story_selection_in_the_Facebook_news_feed/links/5a19cc3d4585155c26ac56d4/From-Editors-to-Algorithms-A-values-based-approach-to-understanding-story-selection-in-the-Facebook-news-feed.pdf" rel="nofollow noopener" target="_blank">‘values’</a>
to determine what is relevant to their audiences. In other words, one
can argue that it is difficult to draw a bright line between editorial
and algorithmic acts.</p>
<p>To retain their safe harbour, the counter-argument that social media
platforms can rely on is the fact that Rule 3(5) of the Intermediary
Guidelines requires intermediaries to inform users that they reserve
the right to take down user content relating to a wide variety of acts,
including content that threatens national security, or is “[...] grossly
harmful, harassing, blasphemous, [etc.]”.</p>
<p>In practice, however, the
content moderation practices of some social media companies may go
beyond these categories. Additionally, the rule does not address the
legal questions created by these platforms’ curation of news-feeds.</p>
<p>The
purpose of highlighting how Section 79 treats the practices of social
media platforms is not to argue that these
platforms should be held liable for user-generated content. Online
spaces created by social media platforms have allowed individuals to
express themselves and participate in political organisation and <a class="link-external" href="https://www.pewresearch.org/internet/2018/07/11/public-attitudes-toward-political-engagement-on-social-media/" rel="nofollow noopener" target="_blank">debate</a>.</p>
<p>A
level of immunity from liability for intermediaries is therefore
critical for the protection of several human rights, especially the
right to freedom of speech. This piece only serves to highlight that
section 79 is antiquated and unfit to deal with modern online services.
The interpretative dangers that exist in the provision create regulatory
uncertainty for organisations operating in India.</p>
<h3 class="cms-block cms-block-heading">Dangers to speech<br /></h3>
<p>These dangers may not just be theoretical.</p>
<p>Only last year, Twitter CEO Jack Dorsey was <a class="link-external" href="https://www.hindustantimes.com/india-news/twitter-ceo-jack-dorsey-summoned-by-parliamentary-panel-on-feb-25-panel-refuses-to-hear-other-officials/story-8x9OUbNBo36uvp92L5nOKI.html" rel="nofollow noopener" target="_blank">summoned</a>
by the Parliamentary Committee on Information Technology to answer
accusations of the platform having a bias against ‘right-wing’ accounts.
More recently, BJP politician Vinit Goenka <a class="link-external" href="https://www.medianama.com/2020/06/223-vinit-goenka-twitter-khalistan/" rel="nofollow noopener" target="_blank">encouraged people to file cases against Twitter</a> for promoting separatist content.</p>
<p>Recent <a class="link-external" href="https://sflc.in/sites/default/files/reports/Intermediary_Liability_2_0_-_A_Shifting_Paradigm.pdf" rel="nofollow noopener" target="_blank">interventions</a>
from the Supreme Court have imposed proactive filtration and blocking
requirements on intermediaries, but these have been limited to
reasonable restrictions that may be imposed on free speech under Article
19 of India’s Constitution. Content moderation policies of
intermediaries like Twitter and Facebook go well beyond the scope of
Article 19 restrictions, and the apex court has not yet addressed this.</p>
<p>The
Delhi High Court, in Christian Louboutin v. Nakul Bajaj, has already
highlighted criteria for when e-commerce intermediaries can stake claim
to Section 79 safe harbour protections based on the active (or passive)
nature of their services. While the order came in the context of
intellectual property violations, nothing keeps a court from similarly
finding that Facebook and Twitter play an ‘active’ role when it comes to
content moderation and curation.</p>
<p>These companies may one day
find the ‘safe harbour’ rug pulled from under their feet if a court
reads section 79 more strictly. In fact, judicial intervention may not
even be required. The threat of such an interpretation may simply be
exploited by the government, and used as leverage to get social media
platforms to toe the government line.</p>
<h3 class="cms-block cms-block-heading">Protection and responsibility<br /></h3>
<p>Unfortunately,
the amendments to the intermediary guidelines proposed in 2018 do not
address the legal position of content moderation either. More recent
developments <a class="link-external" href="https://www.medianama.com/2020/04/223-meity-information-technology-act-amendments/" rel="nofollow noopener" target="_blank">suggest</a>
that Meity may be contemplating amending the IT Act. This presents
an opportunity for a more comprehensive reworking of the Indian
intermediary liability regime than what is possible through delegated
legislation like the intermediary rules.</p>
<p>Intermediaries, rather
than being treated uniformly, should be classified based on their
function and the level of control they exercise over the content they
process. For instance, network infrastructure should continue to be
treated as ‘mere conduits’ and enjoy broad immunity from liability for
user-generated content.</p>
<p>More complex services like search engines
and online social media platforms can have differentiated
responsibilities based on the extent they can contextualise and change
content. The law should carve out an explicit permission to platforms to
moderate content in good faith. Such an allowance should be accompanied
by outlining best practices that these platforms can follow to ensure <a class="link-external" href="https://santaclaraprinciples.org/" rel="nofollow noopener" target="_blank">transparency and accountability</a> to their users.</p>
<p>For
a robust and rights-respecting public sphere, India needs to ensure
that large social media platforms receive adequate protections, and are
made more responsible to its users.</p>
<p><em>Anna Liz Thomas is a law
graduate and a policy researcher, currently working with the Centre for
Internet and Society. Gurshabad Grover manages research in the freedom
of expression and internet governance team at CIS</em>.</p>
</div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently'>http://editors.cis-india.org/internet-governance/blog/donald-trump-is-attacking-the-social-media-giants-here2019s-what-india-should-do-differently</a>
</p>
No publisher · Anna Liz Thomas and Gurshabad Grover · Content takedown · Freedom of Speech and Expression · Intermediary Liability · 2020-06-25T09:07:52Z · Blog Entry
Don't Shoot the Messenger: Speech on Intermediary Liability at 22nd SCCR of WIPO
http://editors.cis-india.org/a2k/blogs/intermediary-liability-wipo-speech
<b>This is a speech made by Pranesh Prakash at a side-event co-organized by the World Intellectual Property Organization and the Internet Society on intermediary liability, to coincide with the release of Prof. Lillian Edwards's WIPO-commissioned report on the 'Role and Responsibility of the Internet Intermediaries in the Field of Copyright'.</b>
<p>Good afternoon. I've been asked to provide a user's perspective on the question of intermediary liability. "In what cases should an Internet intermediary—a messenger—be held liable for the doings of a third party?" is the broad question. I believe that in answering it we can be guided by two simple principles: as long as intermediaries don't exercise direct editorial control, they should not be held liable; and as long as they don't instigate or encourage the illegal activity, they should not be held liable. In all other cases, attacking Internet intermediaries is generally a sign of 'shooting the messenger'.
General intermediary liability and intermediary liability for copyright infringement share a common philosophical foundation, and so I will talk about general intermediary liability first.</p>
<p>In deciding when to hold intermediaries liable, we must remember what is at stake: intermediaries are a necessary component of ensuring freedom of speech and self-expression on the World Wide Web. In this regard, we must keep in mind the joint declaration issued by <a href="http://www.cidh.oas.org/relatoria/showarticle.asp?artID=848&lID=1">four freedom of expression rapporteurs under the aegis of the Organization of American States on June 1, 2011</a>:</p>
<blockquote>
<p>Intermediary Liability</p>
<p>a. No one who simply provides technical Internet services such as providing access, or searching for, or transmission or caching of information, should be liable for content generated by others, which is disseminated using those services, as long as they do not specifically intervene in that content or refuse to obey a court order to remove that content, where they have the capacity to do so (‘mere conduit principle’).</p>
<p>b. Consideration should be given to insulating fully other intermediaries, including those mentioned in the preamble, from liability for content generated by others under the same conditions as in paragraph 2(a). At a minimum, intermediaries should not be required to monitor user-generated content and should not be subject to extra-judicial content takedown rules which fail to provide sufficient protection for freedom of expression (which is the case with many of the ‘notice and takedown’ rules currently being applied).</p>
</blockquote>
<p>It is useful to keep in mind the kind of liability we affix on offline intermediaries: would we hold a library responsible for unlawful material that a user has placed on its shelves without its encouragement?</p>
<p>Ensuring a balanced system of intermediary liability is also very important in preserving the forms of innovation we have seen online. Ensuring that intermediaries aren't always held liable for what third parties do is an essential component of encouraging new models of participation, such as Wikipedia. While Wikipedia has community-set standards with regard to copyright, obscenity, and other such issues, holding the Wikimedia Foundation (which has only around 30-40 people) itself responsible for what millions of users write on Wikipedia would hamper such new models of peer-production. This point, unfortunately, has not prevented the Wikimedia Foundation from being sued a great number of times in India, a large percentage of which are SLAPP ('strategic lawsuit against public participation') cases: if the real intention had been to remove the offending content, editing Wikipedia would have been an easy enough way of achieving that.</p>
<p>While searching for these balanced solutions, we need to look beyond Europe, at how countries like Chile, Brazil, India and others are approaching these issues. Unfortunately, this being Geneva, most of the people I see represented in this room are from the developed world, as are the examples we are discussing (France and Spain).</p>
<p>In India, for instance, the Internet Service Providers Association made it clear in 2006 (when there was an outcry over censorship of blogging platforms) that they do not want to be responsible for deciding whether something about which they have received a complaint is unlawful or not.</p>
<p>With respect to copyright and the Internet: while the Internet allows copyright infringement to be conducted more easily, it also allows infringement to be spotted more easily. Earlier, if someone copied a work, it was difficult to find out; now it is not. That balance is already ingrained. While many in the industry focus on the fact of easier infringement and thus ask for increased legal protection, such an increase is not required, since the same technological factors that enable increased infringement also enable an increased ability to detect it.</p>
<p>On the Internet, intermediaries sometimes engage in primary infringement due to the very nature of digital technology. In the digital sphere, everything is a copy. Whenever you're working on a computer, copies of the copyrighted works that show up on your screen are automatically made in your computer's RAM. Whenever you download anything from the Internet, copies of it are created en route to your computer. (That is the main reason that exceptions in the copyright laws of most countries that allow you to re-sell a book you own don't apply to electronic books.) In such cases, intermediaries must be specially protected. </p>
<p>Additionally, online activities that we take for granted, for instance search technologies, violate the copyright law of most countries. For online search to be reasonably fast (instead of taking hours per query), the searching has to be done on copies (a cache) of websites rather than on the websites themselves. For image search, it would be unreasonable to expect search companies to take licences for all the images they allow you to search through; yet not doing so might violate the copyright laws of many countries. No one, or so one would think, would argue that search engines should be made illegal, but in some countries copyright law is being used to attack such intermediaries.</p>
<p>As noted above, intermediaries are a necessary part of online free speech. Current methods of regulating copyright infringement by users via intermediaries online may well fall afoul of internationally accepted standards of human rights. Frank La Rue, the UN Special Rapporteur on Freedom of Opinion and Expression in <a href="http://www2.ohchr.org/english/bodies/hrcouncil/docs/17session/A.HRC.17.27_en.pdf">his recent report to the UN Human Rights Council</a> stated:</p>
<blockquote>
<p>While blocking and filtering measures deny access to certain content on the Internet, States have also taken measures to cut off access to the Internet entirely. </p>
<p>The Special Rapporteur is deeply concerned by discussions regarding a centralized “on/off” control over Internet traffic. In addition, he is alarmed by proposals to disconnect users from Internet access if they violate intellectual property rights. This also includes legislation based on the concept of “graduated response”, which imposes a series of penalties on copyright infringers that could lead to suspension of Internet service, such as the so-called “three-strikes law” in France and the Digital Economy Act 2010 of the United Kingdom.</p>
<p>Beyond the national level, the Anti-Counterfeiting Trade Agreement (ACTA) has been proposed as a multilateral agreement to establish international standards on intellectual property rights enforcement. While the provisions to disconnect individuals from Internet access for violating the treaty have been removed from the final text of December 2010, the Special Rapporteur remains watchful about the treaty’s eventual implications for intermediary liability and the right to freedom of expression.</p>
</blockquote>
<p>With respect to graduated response, there is very little one can add to Prof. Edwards's presentation. I would, however, like to add one further suggestion, which Prof. Ed Felten originally put forward as a 'modest proposal': corporations that make or facilitate three wrongful accusations should face the same penalty as users who are accused thrice.
The recent US strategy of seizing websites even before trial has been sufficiently criticised, so I shall not spend time on it.</p>
<p>I have yet to see good evidence as to why, for other kinds of primary or secondary liability incurred by online intermediaries, the procedures that apply to offline copyright infringement should not apply, since those procedures are usually crafted with principles of natural justice in mind.</p>
<p>The only 'international' and somewhat troublesome issue requiring resolution is that of different jurisdictions' laws applying to a single global network. However, this question is much larger than that of copyright, and a copyright-specific solution cannot be found. Thus, WIPO is not the right forum for the redress of that problem.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/blogs/intermediary-liability-wipo-speech'>http://editors.cis-india.org/a2k/blogs/intermediary-liability-wipo-speech</a>
</p>
No publisherpraneshIntermediary LiabilityIntellectual Property RightsCopyrightAccess to Knowledge2012-06-01T15:01:08ZBlog EntryDo IT Rules 2011 indirectly leads to Censorship of Internet
http://editors.cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet
<b>Pranesh Prakash along with Dr. Arvind Gupta, National Convener, BJP IT Cell and Ms.
Mishi Choudhary, Executive Director, SFLC participated in a panel discussion on censorship of the Internet on May 8, 2012.
</b>
<p>The discussion was broadcast on Yuva iTV. See the video below:</p>
<h2>Video</h2>
<p><iframe src="http://www.youtube.com/embed/KRIJRhpW-Bc" frameborder="0" height="315" width="320"></iframe></p>
<p><a class="external-link" href="http://www.youtube.com/watch?v=KRIJRhpW-Bc">Click for the video on YouTube</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet'>http://editors.cis-india.org/news/do-it-rules-indirectly-lead-to-censorship-of-internet</a>
</p>
No publisherpraskrishnaIT ActInternet GovernanceVideoIntermediary LiabilityCensorship2012-05-31T09:00:41ZNews ItemDeitY says 143 URLs have been Blocked in 2015; Procedure for Blocking Content Remains Opaque and in Urgent Need of Transparency Measures
http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015
<b>Across India on 30 December 2014, following an order issued by the Department of Telecom (DOT), Internet Service Providers (ISPs) blocked 32 websites including Vimeo, Dailymotion, GitHub and Pastebin.</b>
<p style="text-align: justify;">In February 2015, the Centre for Internet and Society (CIS) requested the Department of Electronics and Information Technology (DeitY) under the Right to Information Act, 2005 (RTI Act) to provide information clarifying the procedures for blocking in India. We have received a response from DeitY which may be <a href="http://editors.cis-india.org/internet-governance/blog/response-deity.clarifying-procedures-for-blocking.pdf" class="external-link">seen here</a>.</p>
<p style="text-align: justify;">In this post, I shall elaborate on this response from DeitY and highlight some of the accountability and transparency measures that the procedure needs. To stress the urgency of reform, I shall also touch upon two recent developments—the response from Ministry of Communication to questions raised in Parliament on the blocking procedures and the Supreme Court (SC) judgment in Shreya Singhal v. Union of India.</p>
<h2 style="text-align: justify;">Section 69A and the Blocking Rules</h2>
<p align="JUSTIFY" class="western">Section 69A of the Information Technology Act, 2000, as amended in 2008 (S69A hereinafter), grants the central government the power to issue directions for blocking public access to any information through any computer resource. In other words, it allows the government to block websites on certain grounds. The Government has notified rules laying down the procedure for blocking access online, the Information Technology (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 (Rules, 2009 hereinafter). CIS has produced a poster explaining the blocking procedure (<a href="http://cis-india.org/internet-governance/blog/blocking-websites.pdf/at_download/file">download PDF</a>, 2.037MB).</p>
<p align="JUSTIFY" class="western">There are <em>three key aspects</em> of the blocking rules that need to be kept under consideration:</p>
<h3 align="JUSTIFY" class="western">Officers and committees handling requests</h3>
<p style="text-align: justify;"><strong>Designated Officer (DO)</strong> – Appointed by the Central Government; an officer not below the rank of Joint Secretary.<br /><strong>Nodal Officer (NO)</strong> – Appointed by organizations including Ministries or Departments of the State Governments and Union Territories and any agency of the Central Government.<br /><strong>Intermediary contact</strong> – Appointed by every intermediary to receive and handle blocking directions from the DO.<br /><strong>Committee for Examination of Request (CER)</strong> – Examines each blocking request along with a printed sample of the allegedly offending information. The committee has the DO as Chairperson, with representatives from the Ministry of Law and Justice, the Ministry of Home Affairs, the Ministry of Information and Broadcasting, and the Indian Computer Emergency Response Team (CERT-In). The CER makes recommendations on each request, including recommendations to revoke blocking orders, which are taken into consideration by the Secretary, DeitY, in finally approving the request.<br /><strong>Review Committee (RC)</strong> – Constituted under rule 419A of the Indian Telegraph Rules, 1951, the RC comprises the Cabinet Secretary, the Secretary to the Government of India (Legal Affairs) and the Secretary (Department of Telecom). The RC is mandated to meet at least once every two months, record its findings, and validate that the directions issued comply with S69A(1).</p>
<h3 style="text-align: justify;">Provisions outlining the procedure for blocking</h3>
<p>Rules 6, 9 and 10 create three distinct blocking procedures, which must commence within 7 days of the DO receiving the request.</p>
<p style="text-align: justify;">a) Rule 6 lays out the first procedure, under which any person may approach the NO and request blocking; alternatively, the NO may raise a blocking request itself. Once the NO of the approached Ministry or Department of a State Government or Union Territory, or of any agency of the Central Government, is satisfied of the validity of the request, it is forwarded to the DO. Requests not sent through the NO of an organization must be approved by the Chief Secretary of the State or Union Territory, or by the Advisor to the Administrator of the Union Territory, before being sent to the DO.</p>
<p style="text-align: justify;">Upon receiving the request, the DO must acknowledge receipt within 24 hours and place the request, along with a printed copy of the alleged information, before the CER for validation. The DO must also make reasonable efforts to identify the person or intermediary hosting the information and, having identified them, issue a notice asking them to appear before the committee and submit their reply and clarifications at a specified date and time, within forty-eight hours of receipt of the notice.</p>
<p style="text-align: justify;">Foreign entities hosting the information are also informed. The CER gives its recommendations after hearing the intermediary or person clarify their position (or even in the absence of any representation), and after examining whether the request falls within the scope outlined under S69A(1). The blocking directions are issued by the Secretary, DeitY, after the DO forwards the request and the CER's recommendations. If approval is granted, the DO directs the relevant intermediary or person to block the alleged information.</p>
<p style="text-align: justify;" class="western">b) Rule 9 outlines a procedure for emergency circumstances: after establishing the necessity and expediency of blocking the alleged information, the DO submits recommendations in writing to the Secretary, DeitY. The Secretary, upon being satisfied of the justification for, necessity of, and expediency of blocking the information, may issue blocking directions as an interim measure, and must record the reasons for doing so in writing.</p>
<p style="text-align: justify;" class="western">Under such circumstances, the intermediary and the person hosting the information are not given the opportunity of a hearing. Nevertheless, the DO is required to place the request before the CER within forty-eight hours of the interim blocking directions being issued. Only upon receiving the committee's final recommendations can the Secretary pass a final order approving the request. If the request is not approved, the earlier interim order is revoked, and the intermediary or identified person is directed to unblock the information for public access.</p>
<p style="text-align: justify;" class="western">c) Rule 10 outlines the process when an order is issued by a court in India. The DO, upon receipt of a court order for blocking of information, submits it to the Secretary, DeitY, and initiates action as directed by the court.</p>
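<p>The three routes described above can be summarised in a short sketch. This is our own illustrative model of the rules, not an official specification; the function and its labels are hypothetical:</p>

```python
# Illustrative routing of a blocking request under Rules 6, 9 and 10,
# following the procedure described in the text above.
def route_request(origin, emergency=False):
    """Return the applicable rule and the first procedural step."""
    if origin == "court":
        # Rule 10: the DO forwards the court order to the Secretary, DeitY.
        return ("rule 10", "forward court order to Secretary, DeitY")
    if emergency:
        # Rule 9: interim blocking first; CER review within 48 hours.
        return ("rule 9", "DO recommends interim blocking to Secretary, DeitY")
    # Rule 6: the DO acknowledges within 24 hours, then convenes the CER.
    return ("rule 6", "DO acknowledges within 24 hours and convenes the CER")

print(route_request("court")[0])               # → rule 10
print(route_request("NO", emergency=True)[0])  # → rule 9
print(route_request("NO")[0])                  # → rule 6
```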
<h3 style="text-align: justify;" class="western">Confidentiality clause</h3>
<p style="text-align: justify;">Rule 16 mandates confidentiality regarding all requests and actions taken thereof, which renders any requests received by the NO and the DO, recommendations made by the DO or the CER and any written reasons for blocking or revoking blocking requests outside the purview of public scrutiny. More detail on the officers and committees that enforce the blocking rules and procedure can be found <a href="http://cis-india.org/internet-governance/blog/is-india2019s-website-blocking-law-constitutional-2013-i-law-procedure">here</a>.</p>
<h2>Response on blocking from the Ministry of Communication and Information Technology</h2>
<p style="text-align: justify;">The response to our RTI from the E-Security and Cyber Law Group is timely, given the recent clarification from the Ministry of Communication and Information Technology in answer to a number of questions raised by parliamentarian Shri Avinash Pande in the Rajya Sabha. The questions concerned the emergency blocking order under the IT Act, the current status of the Central Monitoring System, data privacy law and net neutrality. The Centre for Communication Governance (CCG), National Law University, Delhi, has extracted a set of 6 questions, and you can read the full article <a href="https://ccgnludelhi.wordpress.com/2015/04/24/governments-response-to-fundamental-questions-regarding-the-internet-in-india/">here</a>.</p>
<p align="JUSTIFY" class="western">The government's response, as quoted by CCG, clarifies that under rule 9 the Government has issued directions for emergency blocking of <em>a total of 216 URLs from 1st January, 2014 till date</em>, and that <em>a total of 255 URLs were blocked in 2014 and no URLs were blocked in 2015 (till 31 March 2015)</em> under S69A through the Committee constituted under the rules. Further, a total of 2091 URLs and 143 URLs were blocked in order to comply with the directions of the competent courts of India in 2014 and 2015 (till 31 March 2015) respectively. The government also clarified that the CER had recommended not to block 19 URLs in the meetings held between 1<sup>st</sup> January 2014 and the present, and that two orders have so far been issued revoking blocks on 251 URLs since 1st January 2014. Besides this, CERT-In received requests from individuals and organisations for the blocking of objectionable content, which were forwarded to the concerned websites for appropriate action; the response, however, did not specify the number of such requests.</p>
<p align="JUSTIFY" class="western">We have prepared a table summarising the information released by the government and highlighting the inconsistencies in its response.</p>
<table class="grid listing">
<colgroup> <col width="331"> <col width="90"> <col width="91"> <col width="119"> </colgroup>
<tbody>
<tr>
<td rowspan="2">
<p align="LEFT"><strong>Applicable rule and procedure outlined under the Blocking Rules</strong></p>
</td>
<td colspan="3">
<p align="CENTER"><strong>Number of websites</strong></p>
</td>
</tr>
<tr>
<td>
<p align="CENTER"><em>2014</em></p>
</td>
<td>
<p align="CENTER"><em>2015</em></p>
</td>
<td>
<p align="CENTER"><em>Total</em></p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 6 - Blocking requests from NO and others</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
<td>
<p align="CENTER">None</p>
</td>
<td>
<p align="CENTER">255</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 9 - Blocking under emergency circumstances</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">216</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Rule 10 - Blocking orders from Court</p>
</td>
<td>
<p align="CENTER">2091</p>
</td>
<td>
<p align="CENTER">143</p>
</td>
<td>
<p align="CENTER">2234</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Requests from individuals and orgs forwarded to CERT-In</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Recommendations to not block by CER</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">19</p>
</td>
</tr>
<tr>
<td>
<p align="LEFT">Number of blocking requests revoked</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">-</p>
</td>
<td>
<p align="CENTER">251</p>
</td>
</tr>
</tbody>
</table>
<p>In a <a href="http://sflc.in/deity-says-2341-urls-were-blocked-in-2014-refuses-to-reveal-more/">response </a>to an RTI filed by the Software Freedom Law Centre, DeitY said that 708 URLs were blocked in 2012, 1,349 URLs in 2013, and 2,341 URLs in 2014.</p>
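<p>The mismatch between the two sets of figures can be made concrete with simple arithmetic. In the sketch below the per-rule labels are ours, and the rule 9 total (216 URLs, which the government did not break down by year) is deliberately excluded from the per-year sums, so the comparison is only indicative:</p>

```python
# Per-year figures from the government's answer to Parliament.
blocked_2014 = {"rule 6 (executive requests)": 255,
                "rule 10 (court orders)": 2091}
blocked_2015 = {"rule 6 (executive requests)": 0,
                "rule 10 (court orders)": 143}

total_2014 = sum(blocked_2014.values())  # 255 + 2091 = 2346
total_2015 = sum(blocked_2015.values())  # 0 + 143 = 143

# DeitY's separate RTI response to SFLC put the 2014 total at 2,341 URLs.
sflc_2014 = 2341
print(total_2014, total_2015, total_2014 - sflc_2014)  # → 2346 143 5
```

<p>Even before accounting for the undated rule 9 blocks, the two tallies for 2014 disagree by 5 URLs, illustrating the inconsistency the table highlights.</p>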
<h2>Shreya Singhal v. Union of India</h2>
<p style="text-align: justify;">In its recent judgment, the SC of India upheld the constitutionality of Section 69A, holding that it was a narrowly drawn provision with adequate safeguards. The constitutional challenge on behalf of the People’s Union for Civil Liberties (PUCL) considered the manner in which blocking is done, and the arguments focused on the secrecy surrounding it.</p>
<p style="text-align: justify;">The rules may indicate that there is a requirement to identify and contact the originator of the information, though as an expert <a href="http://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">has pointed out</a>, there is no evidence of this in practice. The court stressed the importance of a written order so that writ petitions may be filed under Article 226 of the Constitution. In doing so, the court seems to have assumed that the originator or intermediary is informed, and therefore held that any procedural inconsistencies may be challenged through writ petitions. However, this recourse is rendered ineffective not only by procedural constraints but also by the confidentiality clause: the opaqueness created by rule 16 severely reins in the recourse available to the originator and the intermediary. While the court notes that rule 16, requiring confidentiality, was argued to be unconstitutional, it does not state its opinion on this question in the judgment. One expert holds the <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">view</a> that the judgment, by implication, requires that requests cannot be confidential. However, such a reading down of rule 16 is yet to be tested.</p>
<p style="text-align: justify;">Further, Sunil Abraham has <a href="http://cis-india.org/internet-governance/blog/economic-and-political-weekly-sunil-abraham-april-11-2015-shreya-singhal-and-66a">pointed</a> out, “block orders are unevenly implemented by ISPs making it impossible for anyone to independently monitor and reach a conclusion whether an internet resource is inaccessible as a result of a S69A block order or due to a network anomaly.” As no comprehensive list exists of blocked websites, or of the legal orders through which they are blocked, the public has to rely on media reports and RTI requests to understand the censorship regime in India. CIS has previously <a href="http://cis-india.org/internet-governance/blog/analysing-blocked-sites-riots-communalism">analysed</a> leaked block lists and lists received in response to RTI requests, which reveal that block orders are full of errors and that entire platforms, not just specific links, have been blocked.</p>
<p style="text-align: justify;">While the state has the power to block content, doing so in secrecy and without judicial scrutiny marks the deficiencies that remain in the procedure outlined under the blocking rules. The Court could have read down rule 16, leaving only a narrow set of exceptions; in not doing so, it perhaps overlooked an opportunity to reform the existing system. The blocking of 32 websites is an example of the opaqueness of the system of blocking orders, where the safeguards assumed by the SC are often not observed: there was no access to the recommendations made by the CER, nor to the subsequent revocation of the blocking orders. CIS filed its RTI to try to understand the grounds for blocking and the related procedures, and the response has thrown up issues that need urgent attention.</p>
<h2>Response to RTI filed by CIS</h2>
<p align="JUSTIFY" class="western">Our first question sought clarification on the websites blocked on 30 December 2014. The response received from DeitY's E-Security and Cyber Law Group reveals that the websites had been blocked because “they were being used to post information related to ISIS using the resources provided by these websites”. The response also clarifies that the directions to block were issued on <em>18-12-2014</em> and that, <em>as of 09-01-2015</em>, the sites were unblocked after an undertaking was obtained from the website owners stating their compliance with the Government and Indian laws.</p>
<p align="JUSTIFY" class="western">It is not clear whether the ATS, Mumbai, had been intercepting communications or whether someone reported these websites. If the ATS was indeed intercepting communications, then as per the rules the RC should have been informed and its recommendations sought. It is unclear whether this was the case, and the response invokes the confidentiality clause under rule 16 to avoid divulging further details. Based on our reading of the rules, court orders should be accessible to the public; without copies of the requests and complaints received, and knowledge of which organization raised them, no appeal or recourse is available to the intermediary or even the general public.</p>
<p align="JUSTIFY" class="western">We also asked for a list of all requests for the blocking of information received by the DO between January 2013 and January 2015, including copies of all files that had been accepted or rejected, and we specifically asked for a list of requests under rule 9. The response from DeitY stated that between January 1, 2015 and March 31, 2015, directions to block 143 URLs had been issued based on court orders. The response completely overlooks our request for information covering the two-year period. Nor does it cover all types of blocking orders under rules 6 and 9, or the requests forwarded to CERT-In, as we have gauged from the Ministry's response to Parliament. Contrary to the SC's assumption that the originator of the information is contacted, it is also clear from DeitY's response that only the websites had been contacted; the letter states that the “websites replied only after blocking of objectionable content”.</p>
<p align="JUSTIFY" class="western">Further, seeking clarification on the functioning of the CER, we asked for its current composition and for the dates and minutes of all its meetings, including copies of the recommendations made. The response merely quotes rule 7 as the reference for the composition and does not provide any names or other details. As per the DeitY website, Shri B.J. Srinath, Scientist-G/GC, is the appointed Designated Officer, though this needs confirmation. While we are already aware of the structure of the CER, which representatives and appointed public officers are guiding the examination of requests remains unclear. Presently, there are 3 Joint Secretaries appointed under the Ministry of Law and Justice, 19 under the Home Ministry, and 3 under the Ministry of Information and Broadcasting. Further, it is not clear which grade of scientist would be appointed to the committee from CERT-In, as the rules do not specify this. While the government has clarified in its answer to Parliament that the committee recommended not to block 19 URLs in the meetings held between 1st January 2014 and the present, it remains unclear who is taking these decisions to block and to revoke blocked URLs. The response from DeitY specifies that the CER met six times between 2014 and March 2015, but stops short of sharing any further information or copies of files on complaints and the CER's recommendations, citing rule 16.</p>
<p align="JUSTIFY" class="western">Finally, answering our question on the composition of the RC, the letter merely points to the provision providing for its composition under rule 419A of the Indian Telegraph Rules, 1951. The response clarifies that the RC has so far met once, on 7th December, 2013, under the Chairmanship of the Cabinet Secretary, with the Secretary, Department of Legal Affairs, and the Secretary, DOT. Our request for minutes of meetings and copies of the RC's orders and findings is denied by simply stating that “minutes are not available”. Under rule 419A, any direction for the interception of any message or class of messages under sub-section (2) of Section 5 of the Indian Telegraph Act, 1885, issued by the competent authority, shall contain reasons for the direction, and a copy of such order shall be forwarded to the concerned RC within seven working days. Given that the RC has met just once since 2013, it is unclear whether the RC is not functioning or whether the interception of messages is being guided through other procedures. Further, we do not yet have details or records of revocation orders or of notices sent to intermediary contacts. This restricts citizens’ right to receive information, and DeitY should work to make these records available to the public.</p>
<p align="JUSTIFY" class="western">Given the response to our RTI, the Ministry's response to Parliament and the SC judgment, we recommend that DeitY take the following steps to ensure a procedure that is just, accountable and follows the rule of law.</p>
<p align="JUSTIFY" class="western">The revocation of rule 16 needs urgent consideration for two reasons:</p>
<ol>
<li>Under Section 22 of the RTI Act, its provisions override all conflicting provisions in any other legislation.</li>
<li style="text-align: justify;">In upholding the constitutionality of S69A, the SC cites the requirement that the reasons behind blocking orders be recorded in writing, so that they may be challenged by means of writ petitions filed under <a href="http://indiankanoon.org/doc/1712542/">Article 226</a> of the Constitution of India.</li></ol>
<p style="text-align: justify;">If the blocking orders, and the meetings of the CER and RC that consider the reasons in those orders, are to remain shrouded in secrecy and unavailable through RTI requests, filing writ petitions challenging these decisions will not be possible, rendering this very important safeguard for the protection of online free speech and expression infructuous. In sum, the blocking procedures remain in need of comprehensive legislative reform, and the government should act to address the pressing need for transparency and accountability. Opacity not only curtails the strengths of democracy, it also impedes good governance. We have filed an RTI seeking a comprehensive account of the blocking procedure and the functioning of the committees from 2009 to 2015, and we shall publish any information that we receive.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015'>http://editors.cis-india.org/internet-governance/blog/deity-says-143-urls-blocked-in-2015</a>
</p>
No publisherjyotiCensorshipFreedom of Speech and ExpressionRTIIntermediary LiabilityAccountabilityFeatured69AInternet GovernanceChilling EffectTransparencyHomepageBlocking2015-04-30T07:37:40ZBlog EntryContent takedown and users' rights
http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1
<b>After Shreya Singhal v Union of India, commentators have continued to question the constitutionality of the content takedown regime under Section 69A of the IT Act (and the Blocking Rules issued under it). There has also been considerable debate around how the judgement has changed this regime: specifically about (i) whether originators of content are entitled to a hearing, (ii) whether Rule 16 of the Blocking Rules, which mandates confidentiality of content takedown requests received by intermediaries from the Government, continues to be operative, and (iii) the effect of Rule 16 on the rights of the originator and the public to challenge executive action. In this opinion piece, we attempt to answer some of these questions.</b>
<p style="text-align: justify;" class="normal">This article was first <a class="external-link" href="https://theleaflet.in/content-takedown-and-users-rights/">published</a> at the Leaflet. It has subsequently been republished by <a class="external-link" href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content">Scroll.in</a>, <a class="external-link" href="https://kashmirobserver.net/2020/02/15/content-takedown-and-users-rights/">Kashmir Observer</a> and the <a class="external-link" href="https://cyberbrics.info/content-takedown-and-users-rights/">CyberBRICS blog</a>.</p>
<p style="text-align: justify;" class="normal"><strong>Introduction</strong></p>
<p style="text-align: justify;" class="normal">Last year, several Jio users from different states <a href="https://www.medianama.com/2019/03/223-indiankanoon-jio-block/">reported</a> that sites like Indian Kanoon, Reddit and Telegram were inaccessible through their connections. While attempting to access the website, the users were presented with a notice that the websites were blocked on orders from the Department of Telecommunications (DoT). When contacted by the founder of Indian Kanoon, Reliance Jio <a href="https://in.reuters.com/article/us-india-internet-idINKCN1RF14D">stated</a> that the website had been blocked on orders of the government, and that the order had been rescinded the same evening. However, in response to a Right to Information (RTI) request, the DoT <a href="https://twitter.com/indiankanoon/status/1218193372210323456">said</a> they had no information about orders relating to the blocking of Indian Kanoon.</p>
<p style="text-align: justify;" class="normal">Alternatively, consider that the Committee to Protect Journalists (CPJ) <a href="https://cpj.org/blog/2019/10/india-opaque-legal-process-suppress-kashmir-twitter.php">expressed concern</a> last year that the Indian government was forcing Twitter to suspend accounts or remove content relating to Kashmir. They reported that over the last two years, the Indian government suppressed a substantial amount of information coming from the area, and prevented Indians from accessing more than five thousand tweets.</p>
<p style="text-align: justify;" class="normal">These instances are <a href="https://www.hindustantimes.com/analysis/to-preserve-freedoms-online-amend-the-it-act/story-aC0jXUId4gpydJyuoBcJdI.html">symptomatic</a> of a larger problem of opaque and arbitrary content takedown in India, enabled by the legal framework under the Information Technology (IT) Act. The Government derives its powers to order intermediaries (entities storing or transmitting information on behalf of others, a definition which includes internet service providers and social media platforms alike) to block online resources through <a href="https://indiankanoon.org/doc/10190353/">section 69A</a> of the IT Act and the <a href="https://meity.gov.in/writereaddata/files/Information%20Technology%20%28%20Procedure%20and%20safeguards%20for%20blocking%20for%20access%20of%20information%20by%20public%29%20Rules%2C%202009.pdf">rules</a> [“the blocking rules”] notified thereunder. Apart from this, <a href="https://indiankanoon.org/doc/844026/">section 79</a> of the IT Act and its allied rules also prescribe a procedure for content removal. <a href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">Conversations</a> with one popular intermediary revealed that the government usually prefers to use its powers under section 69A, possibly because of the opaque nature of the procedure that we highlight below.</p>
<p style="text-align: justify;" class="normal">Under section 69A, a content removal request can be sent by authorised personnel in the Central Government not below the rank of a Joint Secretary. The grounds for issuance of blocking orders under section 69A are: “<em>the interest of the sovereignty and integrity of India, defence of India, the security of the state, friendly relations with foreign states or public order or for preventing incitement to the commission of any cognisable offence relating to the above.</em>” Specifically, the blocking rules envisage the process of blocking to be largely executive-driven, and require strict confidentiality to be maintained around the issuance of blocking orders. This shrouds content takedown orders in a cloak of secrecy, and makes it impossible for users and content creators to ascertain the legitimacy or legality of the government action in any instance of blocking.</p>
<p style="text-align: justify;" class="normal"><strong>Issues</strong></p>
<p style="text-align: justify;" class="normal">The Supreme Court was called upon to determine the constitutional validity of section 69A and the allied rules in <a href="https://indiankanoon.org/doc/110813550/"><em>Shreya Singhal v Union of India</em></a>. The petitioners contended that the procedure laid down by these rules guaranteed no pre-decisional hearing to the originator of the information. Additionally, the petitioners pointed out that the safeguards built into sections 95 and 96 of the Code of Criminal Procedure (CrPC), which respectively allow state governments to ban publications and allow persons to initiate legal challenges to those actions, were absent from the blocking procedures. Lastly, the petitioners assailed rule 16 of the blocking rules, which mandates confidentiality of blocking procedures, on the ground that it affected their fundamental rights.</p>
<p style="text-align: justify;" class="normal">The Court, however, found little merit in these arguments. It held that section 69A was narrowly drawn and contained sufficient procedural safeguards: the grounds for issuing a blocking order are specifically enumerated, and the reasons for blocking must be recorded in writing, making the order amenable to judicial review. Further, the Court found that the provision for setting up a review committee saved the law from constitutional infirmity. In the Court’s opinion, the mere absence of additional safeguards, such as the ones built into the CrPC, did not render the law unconstitutional.</p>
<p style="text-align: justify;" class="normal">But do the ground realities align with the Court’s envisaged implementation of these principles? Apar Gupta, a counsel for the petitioners, <a href="https://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">pointed</a> out that there was no recorded instance of pre-decisional hearing being granted to show that this safeguard contained in the rules was actually being implemented. However, Gautam Bhatia <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">read</a> <em>Shreya Singhal </em>to make an important advance: that the right of hearing be mandatorily extended to the ‘originator’, i.e. the content creator.</p>
<p style="text-align: justify;" class="normal">Additionally, Bhatia also noted that the Court, while upholding the constitutionality of the procedure under section 69A, held that the “<em>reasons have to be recorded in writing in such blocking order so that they may be assailed in a writ petition under Article 226 of the Constitution.</em>”</p>
<p style="text-align: justify;" class="normal">There are two important takeaways from this. <em>Firstly</em>, he argued that the broad contours of the judgment invoke an established constitutional doctrine — that the fundamental right under Article 19(1)(a) does not merely include the right of expression, but also the <em>right of access to information. </em>Accordingly, the right to challenge a blocking order was not only vested in the originator or the concerned intermediary, but may rest with the general public as well. And <em>secondly</em>, by the doctrine of necessary implication, it followed that for the general public to challenge any blocking order under Article 226, the blocking orders must be made public. While Bhatia concedes that public availability of blocking orders may be an over-optimistic reading of the judgment, recent events suggest that even the commonly-expected result, i.e. content creators having the right to a hearing, has not been implemented by the Government.</p>
<p style="text-align: justify;" class="normal">Consider the <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">blocking</a> of the satirical website DowryCalculator.com in September 2019 on orders from the government. The website displayed a calculator that suggests a ‘dowry’ depending on the salary and education of a prospective groom: even if someone misses the satire, the contents of the website cannot readily be connected to any of the grounds for removal listed under section 69A of the IT Act.</p>
<p style="text-align: justify;" class="normal">Tanul Thakur, the creator of the website, was not granted a hearing despite the fact that he had publicly claimed ownership of the website at various times and that the website had been covered widely by the press. The information associated with the domain name also publicly lists Thakur’s name and contact information. Clearly, the government made no effort to contact Thakur when passing the order. Perhaps even more worryingly, when he <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">tried</a> to access a copy of the blocking order by filing an RTI request, the Ministry of Electronics and Information Technology (MeitY) cited the confidentiality rule to deny him the information.</p>
<p style="text-align: justify;" class="normal">This incident documents a fundamental problem plaguing the rules: the confidentiality clause is still being used to deny disclosure of key information on content takedown orders. The government has also used the provision to deny citizens a list of blocked websites, as responses to RTI requests have proven <a href="https://cis-india.org/internet-governance/blog/rti-application-to-bsnl-for-the-list-of-websites-blocked-in-india">time</a> and <a href="https://sflc.in/deity-provides-list-sites-blocked-2013-withholds-orders">again</a>.</p>
<p style="text-align: justify;" class="normal">Clearly, the Supreme Court’s rationale in holding section 69A and the blocking rules constitutional is not one that is implemented in reality. The confidentiality clause is preventing legal challenges to content blocking in totality: content creators are unable to access the orders, and hence are unable to understand the executive’s reasoning in ordering their content to be blocked from public access.</p>
<p style="text-align: justify;" class="normal">As we noted earlier, the grounds of issuing a blocking order under section 69A pertain to certain reasonable restrictions on expression permitted by Article 19(2), which are couched in broad terms. The government’s implementation of section 69A and the rules make it impossible for any judicial review or accountability on the conformity of blocking orders with the mentioned grounds under the rules, or any reasonable restriction at all.</p>
<p style="text-align: justify;" class="normal"><strong>The Way Forward</strong></p>
<p style="text-align: justify;" class="normal">From the opacity of proceedings under the law to the lack of information in the public domain, the Indian content takedown regime leaves a lot to be desired from both the government and the intermediaries at play.</p>
<p style="text-align: justify;" class="normal"><em>First</em>, we believe the Supreme Court’s decision in <em>Shreya Singhal v. Union of India</em> casts an obligation on the government to attempt to contact the content creator when passing a content takedown order to an intermediary. <em>Second</em>, even if the content creator is unavailable for a hearing at that instance, the confidentiality clause should not be used to prevent future disclosure of information to the content creator, so that affected citizens can access and challenge these orders.</p>
<p style="text-align: justify;" class="normal">While we wait for legal reform, intermediaries can also step up to ensure the rights of users online are upheld. On receiving formal orders, intermediaries should <a href="https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass">assess</a> the legality of the received request. This should involve ensuring that only authorised agencies and personnel have sent the content removal orders, that the order specifically mentions what provision the government is exercising the power under, and that the content removal requests relate to the grounds of removal that are permissible under section 69A. For instance, intermediaries should refuse to entertain content removal requests under section 69A of the IT Act if they relate to obscenity, a ground not covered by the provision.</p>
<p style="text-align: justify;" class="normal">The representatives of the intermediary should also push for the committee to grant a hearing to the content creator. Here, the intermediary can act as a liaison between the uploader and the governmental authorities.</p>
<p style="text-align: justify;" class="normal">The Supreme Court’s recent decision in <a href="https://indiankanoon.org/doc/82461587/"><em>Anuradha Bhasin v. Union of India</em></a><em> </em>offers a glimmer of hope for user rights online<em>. </em>While the case primarily challenged the orders imposing section 144 of the CrPC and a communication blockade in Jammu and Kashmir, the final decision does affirm the fundamental principle that government-imposed restrictions on the freedom of expression and assembly must be made available to the public and affected parties to enable challenges in a court of law.</p>
<p style="text-align: justify;" class="normal"> The judiciary has yet another opportunity to consider the provision and the rules: late last year, Tanul Thakur <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">approached</a> the Delhi High Court to challenge the orders passed by the government to ISPs to block his website. One hopes that the future holds robust reforms to the content takedown regime.</p>
<p style="text-align: justify;" class="normal"> We live in an era where the ebb and flow of societal discourse is increasingly channeled through intermediaries on the internet. In the absence of a mature, balanced and robust framework that enshrines the rule of law, we risk arbitrary modulation of the marketplace of ideas by the executive.</p>
<p style="text-align: justify;" class="normal"><em>Torsha Sarkar and Gurshabad Grover are researchers at the Centre for Internet and Society.</em></p>
<p style="text-align: justify;" class="normal"><em>Disclosure: The Centre for Internet and Society is a recipient of research grants from Facebook and Google.</em></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1'>http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1</a>
</p>
Torsha Sarkar, Gurshabad Grover · Internet Freedom · Internet Governance · Intermediary Liability · Censorship · 2020-02-17 · Blog Entry
Constitutional Analysis of the Information Technology (Intermediaries' Guidelines) Rules, 2011
http://editors.cis-india.org/internet-governance/constitutional-analysis-of-intermediaries-guidelines-rules
<b>Ujwala Uppaluri provides a constitutional analysis of the Information Technology (Intermediaries' Guidelines) Rules notified in April 2011, and examines its compatibility with Articles 14, 19, 21 of the Constitution of India.</b>
<h2>Summary of Salient Provisions</h2>
<p style="text-align: justify; ">The <b>Information Technology (Intermediaries’ Guidelines) Rules, 2011</b> (‘<b>the Intermediary Guidelines</b>’) were notified in April 2011 as rules enacted in exercise of powers conferred under section 87(2)(zg) read with Section 79 of the Information Technology Act, 2000 (as amended) (‘<b>the IT Act</b>’).</p>
<p style="text-align: justify; ">Rule 2 of the Intermediary Guidelines imports definitions for key terms from the IT Act. Notably, this includes an importation of Section 2 (w) by <b>Rule 2 (i)</b>, which defines “intermediary” broadly in the following terms:</p>
<p style="text-align: justify; ">“<i> “intermediary”, with respect to any particular electronic records, means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record and includes telecom service providers, network service providers, internet service providers, web-hosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes;</i>”<i> </i></p>
<p style="text-align: justify; ">Rule 3, whose margin note indicates that it is limited to due diligence measures to be adhered to by intermediaries, nevertheless also raises other liabilities by creating a regime to censor content, both pre-publication and after content has been made publicly available online.</p>
<p style="text-align: justify; "><b>Sub-rule (2) of Rule 3</b> inventories the classes of content which are deemed actionable, with only clause (i), clause (c), clause (e) and, arguably, clause (h) of that rule addressing the national interest, public order and security restrictions cognizable under Article 19(2) of the Constitution. The remaining grounds include private claims, such as content which “belongs to another person”<a href="#fn1" name="fr1">[1]</a>, or otherwise infringes proprietary rights<a href="#fn2" name="fr2">[2]</a>, or is “defamatory”<a href="#fn3" name="fr3">[3]</a>. Still others are terminologically indeterminate and purely subjective, the terms “grossly harmful”, “harassing” and “disparaging” being examples.</p>
<p style="text-align: justify; ">This sub-rule also includes a number of redundancies. While clause (b) refers to libelous as well as defamatory content, it is well established that Indian law does not admit of the former concept, instead dissolving the common law distinction between the two to treat them alike.<a href="#fn4" name="fr4">[4]</a> There is also clause (e), which prohibits content that is already illegal for violating the provisions of an existing statute, and the residuary phrasing of clause (b), which reaches content that is “otherwise unlawful in any manner whatever”.</p>
<p style="text-align: justify; ">The sub-rules immediately following the list in Rule 3(2) address the consequences of users publishing content listed in that rule:</p>
<p style="text-align: justify; "><b>Sub-rule (3) of rule 3</b> provides that intermediaries will not knowingly deal in any manner whatsoever, whether by hosting, publication, transmission or otherwise, with any content of the types that are listed in the previous clause.</p>
<p style="text-align: justify; "><b>Sub-rule (4) of rule 3</b> creates a complaints mechanism in respect of content incompatible with Rule 3 (2) by requiring intermediaries to disable access to offending content within 36 hours of obtaining knowledge themselves or on being brought to “actual knowledge” by an “affected person”. The Intermediaries Guidelines do nothing to clarify what would amount to “actual knowledge”, to indicate in unambiguous terms which parties would have sufficient <i>locus</i> to bring complaints in order to be deemed an “affected person” for the purposes of these provisions, or to prescribe a procedure or timeline for action by the intermediary, such that requirements such as notice to the author of the content and time for the preparation of a defence by the author and/or the intermediary are accounted for. Rule 3 (4) also requires that all information which is taken down be preserved, along with “associated records”, for a duration of at least ninety days for investigative purposes.</p>
<p style="text-align: justify; "><b>Sub-rule (5) of rule 3 </b>mandates that intermediaries inform users that non-compliance with the Intermediary Guidelines, <i>inter alia</i>, is a ground for the exercise of their right to terminate access or usage rights and remove non-compliant content.</p>
<p style="text-align: justify; ">Finally, <b>sub-rule (11) of rule 3 </b>requires intermediaries to name Grievance Officers to receive complaints on any matters relating to the computer resources made available by the intermediary, including for non-compliance or harm in terms of Rule 3 (2). This officer is bound to respond to the complaint within one month from the date of receipt of the complaint.</p>
<p style="text-align: justify; ">In the result, the Intermediary Guidelines create a two-track system by which private censorship is legitimized online. In the first place, intermediaries can take down content on their own motion where they are of the opinion that the content falls under any of the grounds enumerated in Rule 3 (2) or, alternatively, do so in response to a complaint, in terms of Rule 3 (4).</p>
<p style="text-align: justify; ">In addition to the provisions relating to censorship, the Intermediary Guidelines also provide for information to be given over to government agencies making a request with lawful authority and in writing under <b>sub-rule (7) of rule 3</b>, for data protection measures in accordance with the Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Information) Rules, 2011 notified under Section 43A of the IT Act to be adhered to (<b>sub-rule (8) of rule 3</b>) and for intermediaries to report and share information relating to cyber security with CERT-In (<b>sub-rule (9) of rule 3</b>).</p>
<h2>Areas of Infirmity</h2>
<p style="text-align: justify; ">It is doubtful whether the Intermediary Guidelines could pass constitutional muster, on several grounds:</p>
<h3>Compatibility with Article 19 (1) (a) and (2)</h3>
<p><i>(a) Applicability of Article 19 (2) to Rule 3 (2) Grounds</i></p>
<p style="text-align: justify; ">In <i>Romesh Thappar v. State of Madras</i><a href="#fn5" name="fr5">[5]</a> the Supreme Court held that the freedom of speech and expression under Article 19(1)(a) includes the freedom to propagate and disseminate ideas. It also held that very narrow and stringent limits govern the permissibility of legislative abridgment of the right of free speech. Ordinarily, any abridgement of free speech by means of censorship must be compatible with one or more of the grounds provided for under Article 19 (2), and the Supreme Court held in <i>Express Newspapers (Private) Ltd. v. Union of India</i><a href="#fn6" name="fr6">[6]</a> that limitations on the exercise of the Article 19(1)(a) right which do not fall within Article 19(2) cannot be upheld.</p>
<p style="text-align: justify; ">Further, the right to free speech applies across all media, and the internet is no exception. In <i>Secretary, Ministry of Information and Broadcasting v. Cricket Association of Bengal</i><a href="#fn7" name="fr7">[7]</a>, the Supreme Court reflected the understanding that where media are different, such that the treatment accorded to them must be different in accordance with that indicia of difference, it will treat them as such in order to uphold fundamental rights. More specifically, in <i>Ajay Goswami v. Union of India</i><a href="#fn8" name="fr8">[8]</a>, the Supreme Court opined (in <i>obiter</i>) that the internet, as a unique medium of expression, deserved a different standard of protection than other mediums that have preceded it.</p>
<p style="text-align: justify; ">Rule 3 (2) of the Intermediary Guidelines, which lists the grounds for censorship, is not compliant with Article 19 (2) for two reasons:</p>
<p style="text-align: justify; "><i>First</i>, many of the grounds mentioned have no constitutional basis whatsoever. Rule 3 (2) prohibits, <i>inter alia</i>, content which is “grossly harmful”, “harassing”, “invasive of another’s privacy”, “hateful”, “disparaging”, “grossly offensive” or “menacing”, in addition to content which is simply illegal, and should be actionable <i>ex post</i> rather than prohibited <i>ex ante </i>(content infringing intellectual property under Rule 3 (2) (d), for example). Most of the terms employed are not legal standards, but merely subjective indicators of personal sensitivities, while still others, though legal, do not figure in Article 19 (2). Since the whole scheme of the Intermediary Guidelines is premised on these extra-constitutional grounds, they are, as a whole, subject to being struck down.</p>
<p style="text-align: justify; "><i>Second</i>, the restriction is unreasonable because instead of preserving rights online in accordance with <i>Ajay Goswami</i>, the Intermediary Guidelines unjustifiably abridge the right to speak and receive information on the internet. The Intermediary Guidelines overreach in their scope, by including as actionable content which is not itself punishable when communicated via any other medium. For example, disparaging speech, as long as it is not defamatory, is not criminalised in India, and cannot be because the Constitution does not allow for it. Similarly, content about gambling in print is not unlawful, but now all Internet intermediaries are required to remove any content that promotes gambling.</p>
<p style="text-align: justify; "><i>(b) Nature of Censorship: Directness of Censorship and Legitimacy of Private and Prior Censorship</i></p>
<p style="text-align: justify; ">In judging whether a statute is constitutional, the effect that the statute will have on the fundamental rights of citizens must be examined. The Supreme Court held in <i>Bennett Coleman & Co. v. Union of India</i><a href="#fn9" name="fr9">[9]</a> that the test was to examine whether the <i>effect</i> of an impugned action was to abridge a fundamental right, notwithstanding its object.</p>
<p style="text-align: justify; ">Further, while it is true in light of the Supreme Court’s holdings in <i>Prakash Jha Productions v. Union of India</i><a href="#fn10" name="fr10">[10]</a><i> </i>that pre-censorship is permissible within the Indian constitutional scheme, this permissibility is qualified. Prior censorship may be undertaken only within closely regulated circumstances, such as under the grounds in the Cinematograph Act, 1952, and even then, only by an appropriately empowered governmental entity.</p>
<p style="text-align: justify; ">The Intermediary Guidelines create mechanisms for the abridgement of the freedom of speech which amount to indirect and unjustifiable prior censorship, contrary to Article 19 (2):</p>
<p style="text-align: justify; "><i>Firstly</i>, while the state does not itself censor under these rules, it has empowered private, commercial entities to do so <i>vide </i>the Intermediary Guidelines. These rules thus transfer the executive power of censorship to private intermediaries. This amounts to an indirect form of censorship for the purposes of the <i>Bennett Coleman </i>test, and has the result of increased censorship on the Internet because the state has granted legislative sanction to such a system, although it does not censor by itself or through a state agency. The Intermediary Guidelines, and specifically Rule 3 (4) read with Rule 3 (2), place a burden on intermediaries to decide on the lawfulness of content as a pre-condition for their statutory exemption from liability. An intermediary, on receiving a complaint, will be forced to disable access to the content posted by a user in order to ensure that it continues to receive the protection offered by Section 79 of the IT Act. Thus, the direct effect of the rules will be strict censoring of content posted online by users. The rules will directly affect the fundamental right to freedom of speech and expression guaranteed under Article 19(1)(a) of the Constitution, and unreasonable restrictions on fundamental rights imposed by a statute or executive orders are liable to be struck down as unconstitutional.</p>
<p style="text-align: justify; "><i>Secondly</i>,<i> </i>while prior censorship is permissible only in a strictly limited range of cases, the Intermediary Guidelines allow for an unrestrained and unlimited degree of prior and arguably invisible censorship. Rule 3 of the Intermediary Guidelines clearly envisages such a system of prior censorship. Whereas the consequence of passively displaying content incompatible with Rule 3(2) would be a complete waiver and dissolution of the Section 79 immunity that would ordinarily accrue to neutral intermediaries, intermediaries and complainants have no obligation to ensure the tenability of complaints and the grounds cited in them. The Intermediary Guidelines do not draw a distinction between arbitrary actions of an intermediary and take-downs subsequent to a request. Further, the inclusion of a residuary clause in Rule 3 (2) (b), allowing pre-censorship of content which is “unlawful in any manner whatever”, also indicates that the Intermediary Guidelines not only allow private censorship, but actively encourage it as the default rule rather than the exception, without any justification whatsoever.</p>
<p style="text-align: justify; "><i>(c)</i><i> Vagueness and Overbreadth: Possibility for Over-Censorship</i></p>
<p style="text-align: justify; ">Vagueness in the terms of a restriction to free speech is grounds for it to be struck down, even where the ground is apparently broadly constitutional. The Supreme Court held in <i>Sakal Papers (P) Ltd. v. Union of India</i><a href="#fn11" name="fr11">[11]</a> that the Constitution must be interpreted in order to enable citizens to enjoy their rights to the fullest measure, subject to limited permissible restrictions. In <i>Romesh Thappar</i><a href="#fn12" name="fr12">[12]</a><i> </i>the Supreme Court also held that legislation authorizing the imposition of restrictions on free speech in language wide enough to cover restrictions which are permissible as well as extra-constitutional will be held to be wholly unconstitutional.</p>
<p style="text-align: justify; ">The grounds listed in Rule 3 (2) of the Intermediary Guidelines are highly subjective, private interest grounds which are not defined either in the Intermediary Guidelines or in the IT Act itself. These include terms such as “grossly harmful”, “harassing”, “invasive of another’s privacy”, “hateful”, “disparaging”, “grossly offensive” or “menacing”. Consequently, the Intermediary Guidelines constitute unreasonable restrictions on freedom of speech, with Rule 3 (2) containing vague terms which, in addition to falling beyond the purview of Article 19(2), cover only private and subjective grounds, incapable of objective definition or application.</p>
<p style="text-align: justify; ">Further, the Intermediary Guidelines do not precisely define the term “affected person” employed in Rule 3 (4). Thus, complaints from <i>any</i> party, including those uninvolved or unaffected by content, must all be complied with, without qualification.</p>
<p style="text-align: justify; ">In the result, the vagueness of the grounds in Rule 3 (2) and the diffuse terminology of “affected person” leave the Rule 3 (2) grounds serving as placeholders for whatever claim a complainant, having no <i>locus</i> whatsoever, chooses to bring, without regard for whether it is constitutional or even legal. Online content is thus treated as presumptively illegal and take down of content as the presumptive course of action. Additionally, there is a further consequence to the vagueness and overbreadth of the terms in Rule 3 (2): because of the indeterminacy in the grounds listed thereunder, intermediaries tasked with enforcing the law will tend to err on the side of caution and censor, rather than keep speech accessible online. There is empirical evidence to show that cautious intermediaries will over-censor and over-comply with complaints in order to avoid liability under Section 79 of the IT Act.<a href="#fn13" name="fr13">[13]</a></p>
<p style="text-align: justify; "><i>(d) Contravention of International Human Rights Norms & Horizontal Application</i></p>
<p style="text-align: justify; ">The censorship regime constructed by the Intermediary Guidelines is non-compliant not only with domestic requirements under the Constitution, but also with India’s obligations under international human rights law under Articles 19 of the Universal Declaration of Human Rights (‘<b>UDHR</b>’) and the International Covenant on Civil and Political Rights (‘<b>ICCPR</b>’), under the UN Human Rights Council’s Report of the Special Rapporteur Frank La Rue on the Promotion and Protection of the Right to Freedom of Opinion and Expression (2011)<a href="#fn14" name="fr14">[14]</a> (‘<b>Special Rapporteur’s Report</b>’) and the UN Human Rights Council Resolution on Internet Freedom (2012)<a href="#fn15" name="fr15">[15]</a> (‘<b>UN Internet Freedom Resolution</b>’).</p>
<p style="text-align: justify; ">While the ICCPR as well as the UDHR guarantee a right to free speech “through any…media of…choice” in their respective Articles 19, the Special Rapporteur’s Report and the UN Internet Freedom Resolution recognize the need for special efforts to be undertaken by states to preserve free speech on the internet. The former document justifies censorship only in the most limited circumstances and makes specific mention of the commercial interests that may be implicated in delivering free speech.</p>
<p style="text-align: justify; ">Through the Intermediary Guidelines, the Indian state creates a system by which the right to free speech can be systematically violated by private and undisclosed entities, and even empowers them to do so, without imposing any constitutional safeguards whatsoever. Thus, egregious violations of the right to free speech and expression are a direct and inevitable consequence of the Intermediary Guidelines. To the degree that the Indian Supreme Court has engaged with free speech online, it appears from <i>Ajay Goswami </i>that it would apply standards consistent with international law obligations and require the Intermediary Guidelines to be rectified to meet them.</p>
<p style="text-align: justify; ">Further, the Indian Supreme Court has held, where necessary for their true enjoyment, that fundamental rights may involve a degree of horizontality in their application. In other words, private action could be guided by fundamental rights, such as in <i>Vishaka v. State of Rajasthan</i><a href="#fn16" name="fr16">[16]</a>, which evidences the Supreme Court’s willingness to hold that private entities could be held to constitutional and international human rights law standards where that is necessary for the real rather than illusory enjoyment of fundamental rights.</p>
<p style="text-align: justify; ">As a result, the Intermediary Guidelines are also liable to be struck down for their failure to recognize and account for the role of private interests while empowering them with the right to curtail fundamental rights.</p>
<h3>Compatibility with Article 21</h3>
<p style="text-align: justify; "><i> (a) Adverse Impact on Privacy (and consequently on Free Speech)</i></p>
<p style="text-align: justify; ">A constitutional right to privacy has been read into Article 21’s guarantee of life and personal liberty in several instances by the Supreme Court. The State is consequently under an obligation to refrain from interfering, whether by itself or through any of its agencies, with private lives and spaces. By the same token, laws which encourage unwarranted state or societal intrusions into private life will contravene the victim’s Article 21 right. In <i>People’s Union for Civil Liberties v. Union of India</i>,<a href="#fn17" name="fr17">[17]</a> the Supreme Court held that Article 21 privacy protected individuals against the interception and monitoring of private communications by the state in the absence of sufficient safeguards.</p>
<p style="text-align: justify; ">Also, an individual’s privacy interests in information relating to him are not dissolved merely because information is not confidential or because another entity has some property interest in that information. In <i>District Registrar and Collector, Hyderabad v. Canara Bank</i><a href="#fn18" name="fr18">[18]</a>, the Supreme Court recognized that even where the search of private documents was concerned, Article 21 protected “persons not places”, <i>i.e.</i>, that the privacy interest did not vest in property or communications but, rather, in the rightsholder himself.</p>
<p style="text-align: justify; ">The Intermediary Guidelines include no limits whatsoever on the scope of disclosures that government agencies can demand or expect to retain, in contravention of Article 21.</p>
<p style="text-align: justify; ">Specifically, Rule 3 (4), which requires data retention for a statutory minimum of ninety days of content taken down as well as “associated records”, violates users’ rights to privacy. In addition to the financial and technical burden (in storing and securing data) imposed by the Intermediary Guidelines in requiring potentially unlimited data retention by intermediaries, there is no clarity as to what or how much information precisely must be held in the form of “associated records”. Instead of subjecting data to limited and closely qualified retention by private intermediaries, and thus limiting the impairment of the fundamental right to privacy to the minimum possible degree necessary, Rule 3 (4) imposes blanket data retention requirements.</p>
<p style="text-align: justify; ">Further, Rule 3 (7), which makes any information held by an intermediary subject to being disclosed to the government upon request is also inconsistent with the requirement that the right to life and personal liberty be violated only in accordance with fair, just and reasonable procedures. Notwithstanding that Rule 3 (7) is consistent with Section 67C of the IT Act and specific rules framed in regard to the surveillance of communications, it is also unconstitutional because it fails to include any safeguards whatsoever in the process of surveillance. These would include, as minimum obligatory conditions in light of <i>PUCL</i>, the requirement that the surveilled be informed of the surveillance and be allowed to challenge its propriety <i>ex ante </i>or its procedural regularity <i>ex post</i>, or atleast administrative or judicial review <i>ex parte</i>.</p>
<p style="text-align: justify; "><i>(b) Non-compliance with Due Process and Natural Justice Requirements</i></p>
<p style="text-align: justify; ">Article 21 explicitly includes a due process guarantee. This means that the right to life and personal liberty, and its constituent rights, can be interfered with only through constitutionally consistent procedures. A cornerstone of fair procedure, compliant with the rule of law, is the notion of natural justice. Consequently, Article 21 contemplates that the procedure by which fundamental rights are curtailed will satisfy natural justice principles.</p>
<p style="text-align: justify; ">In <i>Maneka Gandhi v. Union of India</i>,<a href="#fn19" name="fr19">[19]</a> the Supreme Court held that natural justice was not a rigid or mechanical term, but one that referred to those practices and principles that would ensure<i> </i>“fair play in action”<i>.</i> In addition the Court held that all deviations<i> </i>from natural justice requirements must be supported by a sufficiently justificatory “compelling state interest”. Specifically, in <i>Union</i> <i>of</i> <i>India</i> <i>v.</i> <i>Tulsiram</i> <i>Patel</i><a href="#fn20" name="fr20">[20]</a>, the Supreme Court held that the principle of natural justice required the satisfaction of the <i>audi alteram partem</i> rule, which consisted of several requirements, including the requirement that a person against whose detriment an action is taken be informed of the case against him and be afforded a full and fair opportunity to respond. Finally, in <i>M.C. Mehta v. Union of India</i><a href="#fn21" name="fr21">[21]</a> the Supreme Court held that the absence of due notice and a reasonable opportunity to respond would vitiate any holding to the rightsholder’s detriment. <i> </i></p>
<p style="text-align: justify; ">The Intermediary Guidelines fail to satisfy the requirement of natural justice, and particularly the rights to prior notice as well as that of the affected party to a hearing:</p>
<p style="text-align: justify; ">By requiring that content be taken down swiftly (within 36 hours of complaint, under Rule 3 (4)) and by failing to require the author of the content to be informed of the complaint and its contents, the Intermediary Guidelines violate the author’s right to notice and consequently affect his/her right to prepare and present a defence at all. In practice, authors of content which is the subject of a complaint may never know of the complaint or even of the fact of the take down, given the absence of any mechanism under the rules by which they could have been informed. In a scheme for silent, invisible censorship, authors are never afforded an opportunity to challenge the take down, just as they have no opportunity to rebut the initial complaint. In addition, at any event, it is the intermediary, a biased private entity whose immunity under Section 79 of the IT Act could be called into question based on the outcome, who must make the determination as to the legality of the content.</p>
<p style="text-align: justify; ">While there is nothing to prohibit intermediaries from informing authors on the receipt of a complaint, the limited time within which action must be taken means that such intermediaries would risk liability for non-compliance with the compliant and a waiver of their Section 79 immunity, where the content is not taken down, whether because communication does not occur within the 36 hour timeframe or because an author elects to resist takedown. By creating a system in which takedowns necessarily occur in response to complaints, irrespective of their legitimacy, the Intermediary Guidelines presume and rule in favour of the complainants and in favour of (private) censorship instead of presuming in favour of the preservation of the fundamental right to free speech, or even maintaining neutrality between the two ends.</p>
<h3 style="text-align: justify; ">Compatibility with Article 14</h3>
<p style="text-align: justify; ">The guarantee of “equal protection of laws” requires equality of treatment of persons who are similarly situated, without discrimination <i>inter se</i>. It is a corollary that that persons differently situated cannot be treated alike. <i>In</i><i> E.</i><i> P.</i><i> Royappa</i><i> v. State</i><i> of</i><i> Tamil</i><i> Nadu</i><a href="#fn22" name="fr22">[22]</a><i> the</i><i> Supreme</i><i> Court</i><i> held</i><i> that arbitrary or unfair actions necessarily run counter to Article 14. The Supreme Court explained in M/S</i><i> Sharma</i><i> Transport</i><i> v.</i><i> Government</i><i> of</i><i> Andhra Pradesh</i><a href="#fn23" name="fr23">[23]</a><i> that</i> arbitrary actions are actions which are unreasonable, non-rational done capriciously or without adequate determining principle, reason or in accordance with due judgment. In addition, Article 14 also requires that state action be reasonable. I<i>n</i><i> Mahesh</i><i> Chandra</i><i> v.</i><i> Regional</i><i> Manager,</i><i> U.P.</i><i> Financial</i><i> Corporation</i><a href="#fn24" name="fr24">[24]</a><i> it was held that discretion must be exercised objectively, and that what is not fair or just will be unreasonable, and subject to being struck down as unconstitutional.</i>Additionally, Article 14 also requires that the basis upon which classifications are undertaken for the purposes of same or differential treatment be reasoned and fair. The Supreme Court held in <i>Sube Singh v. State of Haryana</i><a href="#fn25" name="fr25">[25]</a> that the state’s failure to support a classification on the touchstone of reasonability, with the existence of intelligible differentia or the rational basis of achieving a stated object, will be ground for it to be held arbitrary and unreasonable. 
Finally, all state action having the potential to curtail Article 14 must be reasonable, justifiable, undertaken in <i>exercise of </i>constitutional powers and be informed and guided by public interest. The Supreme Court held to this effect i<i>n</i><i> Kasturi</i><i> Lal</i><i> Lakshmi</i><i> Reddy</i><i> v.</i><i> State</i><i> of</i><i> Jammu</i><i> and</i><i> Kashmir</i><a href="#fn26" name="fr26">[26]</a>.</p>
<p style="text-align: justify; ">The Intermediary Guidelines contravene Article 14 on the following grounds:</p>
<p style="text-align: justify; "><i>First</i>, intermediaries who are not similarly situated are treated alike. Rule 2 (i) imports the IT Act’s omnibus definition of the term “intermediary”, such that all classes of intermediaries, ranging from intermediaries which control the architecture of the internet and the hardware which enables it to run (such as ISPs and DNS providers) to intermediaries that enable content creation, sharing and communications online (such as email clients, content aggregators, social networking services and content hosts), are empowered to censor and are required to comply with complaints regarding content. Intermediaries, for the purposes of the IT Act and the Intermediary Guidelines, thus refer to a large and disparate group of providers of services enabling access to as well as use of the Internet. Reasoned state action must recognize that their liabilities must necessarily vary with the specific type of service that each provides. The Intermediary Guidelines fail to do so, and are consequently incompatible with Article 14.</p>
<p style="text-align: justify; "><i>Second</i>, the Intermediary Guidelines treat the same or similar content across media differently, without apparent justification. More specifically, users of the internet are unfairly discriminated against. All of the Rule 3 (2) grounds which are not explicitly mentioned in Article 19 (2) in particular reflect this discriminatory, unreasoned treatment. To illustrate, the prohibition under Rule 3 (2) on the display of any content online when it relates to gambling treats speakers using the internet differently from speakers communicating this content via any other medium of communication. Given that nothing in the nature of the medium itself attaches a new or different character to the content, criminality or liability must attach to such content in a medium-neutral fashion. So, while content qualifying as seditious under law remains so across media, whether it be print, audio or video broadcast or online, the same as not the case for communications on the internet. In other words, while gambling itself may be prohibited under law, speech or expression involving it is nowhere prohibited under law. While such content is legal and protected across print and broadcasting media, the same content is liable to take down online. This would amount to discriminatory treatment of equal content <i>merely</i> because speakers choose the internet, and the speech occurred online.</p>
<p style="text-align: justify; "><i>Third</i>, the Intermediary Guidelines accord unrestrained discretion in the curtailment of fundamental rights to <i>private </i>functionaries, without any guidance whatsoever. This should have been the sole reserve of the state. In addition to the lack of guidance, the breadth of the grounds for censorship in Rule 3 (2), some of which are<i> themselves incapable of precise and non-subjective application</i>, means that private censorship can occur to an arguably unlimited degree. Expecting compliance with such terms, and attaching liability (for intermediaries) or a curtailment of fundamental rights (for generators of content), without the provision of a right to challenge or even, more fundamentally, be informed is both unreasonable and arbitrary.</p>
<p style="text-align: justify; ">Similarly, Rules 3 (4) and 3 (5) empower intermediaries to take down content without providing any realistic opportunity of hearing to its author. Intermediaries are accorded an adjudicatory role to the intermediary in deciding questions whether or not authors can access their fundamental right to free speech in the process. This role is ordinarily reserved for competent courts or administrative authorities, which are subject to constitutional checks and balances and a general obligation to preserve and promote fundamental rights. Assigning such functions to a self-interested private entity without any accountability whatsoever is both unreasonable as well as arbitrary.</p>
<p style="text-align: justify; "><i>Finally</i>, the Intermediary Guidelines fail to account for the public interest because they directly restrict the public’s freedom of speech and expression, without any justifiable reason, and privilege the personal and not necessarily constitutional sensitivities of private complainants instead. Rule 3(3) in effect vests an extraordinary power of censorship in intermediaries, entities which operate on the basis of private interest and outside the limits of administrative or even the most basic human rights control. Safeguards must apply to power-bearers to the degree and in the manner required in relation to the nature of the power, rather than its holder, if fundamental rights are to be legislatively preserved. While the Supreme Court in <i>A.K. Kraipak v. Union of India</i><a href="#_ftn27">[27]</a> extended the applicability of natural justice principles from judicial bodies alone and quasi-judicial bodies to administrative bodies as well, the applicability of such principles still remains limited to state entities. In other words, there is an acknowledged difficulty in applying public law standards to private, commercial entities.</p>
<p style="text-align: justify; ">The Intermediary Guidelines thus vest the right to abridge core fundamental rights (under Articles 14, 19 and 21) in private delegates operating outside public law controls that constrain the scope in which the power can be exercised and ensure that citizen interest can be preserved. In the alternative, they also failed to provide for other safeguards to prevent abuse to the detriment of fundamental rights private delegates of governmental power, even as they granted such powers in unlimited terms. As a result, the Intermediary Guidelines evidence thoughtless, arbitrary, unreasoned and unjust state action.</p>
<h3 style="text-align: justify; ">Vires vis á vis the Parent Act</h3>
<p style="text-align: justify; ">While it is permissible within the constitutional scheme for legislative functions of the Parliament to be delegated to a degree, they may be struck down on several grounds. In general, per <i>Indian</i><i> </i><i>Express</i><i> </i><i>Newspapers</i><i> </i><i>(Bombay)</i><i> </i><i>Pvt.</i><i> </i><i>Ltd.</i><i> </i><i>v.</i><i> </i><i>Union</i><i> </i><i>of</i><i> </i><i>India</i><a href="#_ftn28">,[28]</a> subordinate legislation can be challenged not only on any of grounds on which the parent legislation is vulnerable to challenge, but also on the grounds that it does not conform to parent statute, that it is contrary to other statutes or that it is unreasonable, in the sense that it is manifestly arbitrary. Notably, the Court also held here that subordinate legislation is liable to being struck down where it fails to conform to constitutional requirements, or, specifically that “it offends Article 14 or Article 19 (1) (a) of the Constitution”.</p>
<p style="text-align: justify; ">It is a well-accepted proposition that delegated legislation which travels outside the scope of its enabling law will not stand as valid. It was held in <i>Agricultural</i><i> </i><i>Market</i><i> </i><i>Committee</i><i> </i><i>v.</i><i> </i><i>Shalimar</i><i> </i><i>Chemical</i><i> </i><i>Works</i><i> </i><i>Ltd </i><a href="#_ftn29">[29]</a> that a delegate cannot alter the scope of the act under which it has been it has been empowered to make rules, or even of a provision or principle included there under. In <i>State</i><i> </i><i>of</i><i> </i><i>Karnataka</i><i> v</i><i>.</i><i> </i><i>Ganesh</i><i> </i><i>Kamath</i><a href="#_ftn30">[30]</a> the Supreme Court held that “it is a well settled principle of interpretation of statutes that the conferment of rule-making power by an Act does not enable the rule-making authority to make a rule which travels beyond the scope of the enabling Act or which is inconsistent there with or repugnant thereto”. Similarly, in <i>KSEB</i><i> </i><i>v.</i><i> </i><i>Indian</i><i> </i><i>Aluminium</i><i> </i><i>Company</i><a href="#_ftn31">[31]</a>, it held that“subordinate legislation cannot be said to be valid unless it is within the scope of the rule making power provided in the statute”.</p>
<p style="text-align: justify; ">The Intermediary Guidelines were enacted under Sections 79(2) and 87(2)(zg) of the Information Technology Act, 2000 (as amended). While the latter provision explicitly grants the Central Government rule-making powers by which it can lay out guidelines to be followed by intermediaries in order to comply with Section 79(2), it appears that the rules in their current form appear to have been drafted based on a misunderstanding of Section 79.</p>
<p style="text-align: justify; ">Section 79(2) itself merely clarifies the circumstances in which intermediaries can claim that intermediaries are not liable for content where they do not initiate the transmission of potentially actionable content or select its recipient, modify its contents and observe all necessary “due diligence” requirements under the IT Act and rules.</p>
<p style="text-align: justify; ">The extent to which the Intermediary Guidelines alter the intent and scope of section 79 (or other provisions of the IT Act, in some cases) clearly leaves them <i>ultra vires</i> the parent statute. The specific instances of deviation by the Intermediary Guidelines from the IT Act are listed below:</p>
<p style="text-align: justify; "><i>First</i>, Rule 3 (3) is ultra vires section 79 of the IT Act. Where this rule expressly prohibits the hosting, publication or initiation of transmission of content described in Rule 3 (2), section 79 does not intend any prohibition. All that it does is to waive the immunity otherwise accorded to intermediaries where the conditions specified are not satisfied. In other words, the section is optional, rather than mandatory and punitive: whether or not an intermediary can claim immunity will depend on whether it chooses to comply with section 79 (2).</p>
<p style="text-align: justify; "><i>Second</i>, Rule 3 (4) requires intermediaries to take steps to disable access to within 36 hours of receiving a complaint in relation thereto. This is inconsistent with section 69B of the IT Act, which lays down in detail, the procedure to be followed to disable access to information. Since section 69B is statutory law, Rule 3 (4), being mere delegated legislation, will have to yield in its favour.</p>
<p style="text-align: justify; "><i>Third</i>, Rule 3 (7) is <i>ultra</i><i> </i><i>vires</i> sections 69 and 69B, and falls outside the scope of section 79 (2). Rule 3 (7) provides that intermediaries must comply with requests for information or assistance when required to do so by appropriate authorities. This provision has no relation to the contents of section 79, which regulates intermediaries’ liability for content, and under which these rules were notified. In addition, rules have already been issued under the properly relevant sections, namely sections 69 and 69B, to provide a procedure to be followed by the government for the interception, monitoring, and decryption of information held by intermediaries. Rule 3 (7) is not consistent with the rules under sections 69 and 69B, as it removes all safeguards that those rules included. Under the Information Technology (Procedure and Safeguards for Interception, Monitoring, and Decryption) Rules 2009, for instance, permission must be obtained from the competent authority before an intermediary can be directed to provide access to its records and facilities while Rule 3 (7) makes intermediaries answerable to virtually any request from any government agency.</p>
<p align="left"><b> </b></p>
<hr align="left" size="1" width="33%" />
<p>[<a href="#fr1" name="fn1">1</a>]. Rule 3 (2) (a).</p>
<p>[<a href="#fr2" name="fn2">2</a>]. Rule 3 (2) (d).</p>
<p>[<a href="#fr3" name="fn3">3</a>]. Rule 3 (2) (b)</p>
<p>[<a href="#fr4" name="fn4">4</a>]. Section 499, Indian Penal Code, 1860 (“Defamation” is defined to include both written and spoken words).</p>
<p>[<a href="#fr5" name="fn5">5</a>]. AIR 1950 SC 124.</p>
<p>[<a href="#fr6" name="fn6">6</a>]. AIR 1958 SC 578.</p>
<p>[<a href="#fr7" name="fn7">7</a>]. AIR 1995 SC 1236.</p>
<p>[<a href="#fr8" name="fn8">8</a>].(2007) 1 SCC 170.</p>
<p>[<a href="#fr9" name="fn9">9</a>]. AIR 1973 SC 106.</p>
<p>[<a href="#fr10" name="fn10">10</a>]. (2011) 8 SCC 372.</p>
<p>[<a href="#fr11" name="fn11">11</a>]. AIR 1962 SC 305, ¶31.</p>
<p>[<a href="#fr12" name="fn12">12</a>]. <i>Supra, </i>n.5.</p>
<p>[<a href="#fr13" name="fn13">13</a>]. Centre for Internet & Society, <i>Intermediary Liability in India</i><i>: Chilling Effects on Free Expression on the Internet 2011</i> <i>available at</i> cis-india.org/internet-governance/chilling-effects-on-free-expression-on-internet/intermediary-liability-in-india.pdf.</p>
<p>[<a href="#fr14" name="fn14">14</a>]. UN Document no. A/HRC/17/27.</p>
<p>[<a href="#fr15" name="fn15">15</a>]. UN Document no. A/HRC/20/.13.</p>
<p>[<a href="#fr16" name="fn16">16</a>]. AIR 1997 SC 3011.</p>
<p>[<a href="#fr17" name="fn17">17</a>]. AIR 1997 SC 568.</p>
<p>[<a href="#fr18" name="fn18">18</a>]. (2005) 1 SCC 496.</p>
<p>[<a href="#fr19" name="fn19">19</a>]. 1978 SCR (2) 621.</p>
<p>[<a href="#fr20" name="fn20">20</a>]. AIR 1985 SC 1416.</p>
<p>[<a href="#fr21" name="fn21">21</a>]. AIR 1999 SC 2583.</p>
<p>[<a href="#fr22" name="fn22">22</a>]. AIR 1974 SC 555.</p>
<p>[<a href="#fr23" name="fn23">23</a>]. AIR 2002 SC<i> </i>322<i>.</i></p>
<p>[<a href="#fr24" name="fn24">24</a>]. AIR 1993 SC 935<i>.</i></p>
<p>[<a href="#fr25" name="fn25">25</a>]. (2001) 7 SCC 545, 548, ¶10.</p>
<p>[<a href="#fr26" name="fn26">26</a>].1980 AIR 1992.</p>
<p>[<a href="#fr27" name="fn27">27</a>]. <i>AIR</i> 1970 SC 150.</p>
<p>[<a href="#fr28" name="fn28">28</a>]. AIR 1986 SC 515.</p>
<p>[<a href="#fr29" name="fn29">29</a>]. AIR 1997 SC 2502.</p>
<p>[<a href="#fr30" name="fn30">30</a>]. (1983) 2 SCC 40.</p>
<p>[<a href="#fr35" name="fn31">31</a>]. AIR 1976 SC 1031.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/constitutional-analysis-of-intermediaries-guidelines-rules'>http://editors.cis-india.org/internet-governance/constitutional-analysis-of-intermediaries-guidelines-rules</a>
</p>
No publisher · ujwala · Internet Governance · Intermediary Liability · Information Technology · 2012-10-31T08:44:41Z · Blog Entry

Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021
http://editors.cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021
<b>The Centre for Internet & Society (CIS) presented its comments on the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (‘the rules’), which were released on 6 June, 2022 for public comments.</b>
<p style="text-align: justify; ">These comments examine whether the proposed amendments are in adherence to established principles of constitutional law, intermediary liability and other relevant legal doctrines. We thank the Ministry of Electronics and Information Technology (MEITY) for allowing us this opportunity. Our comments are divided into two parts. In the first part, we reiterate some of our comments to the existing version of the rules, which we believe holds relevance for the proposed amendments as well. And in the second part, we provide issue-wise comments that we believe need to be addressed prior to finalising the amendments to the rules.</p>
<hr />
<p style="text-align: justify; ">To access the full text of the Comments to the draft amendments to the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, <a href="http://editors.cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-it-rules-2021.pdf" class="internal-link">click here</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021'>http://editors.cis-india.org/internet-governance/blog/comments-to-draft-amendments-to-the-it-rules-2021</a>
</p>
No publisher · Anamika Kundu, Digvijay Chaudhary, Divyansha Sehgal, Isha Suri and Torsha Sarkar · Digital Media · Internet Governance · Intermediary Liability · Information Technology · 2022-07-07T02:39:28Z · Blog Entry

Comments on the Draft Rules under the Information Technology Act
http://editors.cis-india.org/internet-governance/blog/comments-draft-rules
<b>The Centre for Internet and Society commissioned an advocate, Ananth Padmanabhan, to produce a comment on the Draft Rules that have been published by the government under the Information Technology Act. In his comments, Mr. Padmanabhan highlights the problems with each of the rules and presents specific recommendations on how they can be improved. These comments were sent to the Department of Information and Technology.</b>
<h2><em>Comments on the Draft Rules under the Information Technology Act as Amended by the Information Technology (Amendment) Act, 2008</em></h2>
<p><em><strong>Submitted by the Centre for Internet and Society, Bangalore</strong></em></p>
<p><em><strong>Prepared by Ananth Padmanabhan, Advocate in the Madras High Court</strong></em></p>
<h2>Interception, Monitoring and Decryption</h2>
<h3>Section 69</h3>
<p>The section says:</p>
<ol><li>Where the Central Government or a State Government or any of its officer specially authorised by the Central Government or the State Government, as the case may be, in this behalf may, if satisfied that it is necessary or expedient so to do in the interest of the sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of any cognizable offence relating to above or for investigation of any offence, it may subject to the provisions of sub-section (2), for reasons to be recorded in writing, by order, direct any agency of the appropriate Government to intercept, monitor or decrypt or cause to be intercepted or monitored or decrypted any information generated, transmitted, received or stored in any computer resource. </li><li>The procedure and safeguards subject to which such interception or monitoring or decryption may be carried out, shall be such as may be prescribed.</li><li>The subscriber or intermediary or any person in-charge of the computer resource shall, when called upon by any agency referred to in sub-section (1), extend all facilities and technical assistance to-</li></ol>
<p>(a) provide access to or secure access to the computer resource generating, transmitting, receiving or storing such information; or</p>
<p>(b) intercept, monitor, or decrypt the information, as the case may be; or</p>
<p>(c) provide information stored in computer resource.</p>
<ol start="4"><li>The subscriber or intermediary or any person who fails to assist the agency referred to in sub-section (3) shall be punished with imprisonment for a term which may extend to seven years and shall also be liable to fine.</li></ol>
<p><strong>Recommendation #1</strong><br />Section 69(3) should be amended and the following proviso be inserted:</p>
<p class="callout">Provided that only those intermediaries with respect to any information or computer resource that is sought to be monitored, intercepted or decrypted, shall be subject to the obligations contained in this sub-section, who are, in the opinion of the appropriate authority, prima facie in control of such transmission of the information or computer resource. The nexus between the intermediary and the information or the computer resource that is sought to be intercepted, monitored or decrypted should be clearly indicated in the direction referred to in sub-section (1) of this section.</p>
<p><br /><strong>Reasons for the Recommendation </strong><br />In the case of any information or computer resource, there may be more than one intermediary who is associated with such information. This is because “intermediary” is defined in section 2(w) of the amended Act as,</p>
<p class="callout">“with respect to any electronic record means any person who on behalf of another person receives, stores or transmits that record or provides any service with respect to that record, including telecom service providers, network service providers, internet service providers, webhosting service providers, search engines, online payment sites, online-auction sites, online-market places and cyber cafes”. </p>
<p><br />The State or Central Government should not be given wide-ranging powers to enforce cooperation on the part of any such intermediary without there being a clear nexus between the information that is sought to be decrypted or monitored by the competent authority, and the control that any particular intermediary may have over such information.</p>
<p>To give an illustration, merely because some information may have been posted on an online portal, the computer resources in the office of the portal should not be monitored unless the portal has some concrete control over the nature of information posted in it. This has to be stipulated in the order of the Central or State Government which authorizes interception of the intermediary. </p>
<p><br /><strong>Recommendation #2</strong><br />Section 69(4) should be repealed.</p>
<p><br /><strong>Reasons for the Recommendation</strong><br />The closest parallels to Section 69 of the Act are the provisions in the Telegraph Rules which were brought in after the decision in PUCL v. Union of India, (1997) 1 SCC 301, famously known as the telephone tapping case.</p>
<p>Section 69(4) fixes tremendous liability on the intermediary for non-cooperation. This is violative of Article 14. Similar provisions in the Indian Penal Code and Code of Criminal Procedure, which demand cooperation from members of the public as regards production of documents, letters etc., and impose punishment for non-cooperation on their part, impose a maximum punishment of one month. It is bewildering why the punishment is 7 years imprisonment for an intermediary, when the only point of distinction between an intermediary under the IT Act and a member of the public under the IPC and CrPC is the difference in the media which contains the information.</p>
<p>Section 69(3) is akin to the duty cast upon members of the public to extend cooperation under Section 39 of the Code of Criminal Procedure by way of providing information as to the commission of any offence, or the duty, when a summons is issued by the Court or the police, to produce documents under Sections 91 and 92 of the Code of Criminal Procedure. The maximum punishment for non-cooperation prescribed by the Indian Penal Code for omission to cooperate or wilful breach of summons is only a month under Sections 175 and 176 of the Indian Penal Code. Even the maximum punishment for furnishing false information to the police is only six months under Section 177 of the IPC. When this is the case with production of documents required for the purpose of trial or inquiry, it is wholly arbitrary to impose a punishment of seven years in the case of intermediaries who do not extend cooperation for providing access to a computer resource which is merely apprehended as being a threat to national security etc. A mere apprehension, however reasonable it may be, should not be used to pin down a liability of such extreme nature on the intermediary.</p>
<p>This would also amount to a violation of Articles 19(1)(a) as well as 19(1)(g) of the Constitution, not to mention Article 20(3). To give an example, much of the information received from confidential sources by members of the press would be stored in computer resources. By coercing them, through the 7 year imprisonment threat, to allow access to this computer resource and thereby part with this information, the State is directly infringing on their right under Article 19(1)(a). Furthermore, if the “subscriber” is the accused, then section 69(4) goes against Article 20(3) by forcing the accused to bear witness against himself.</p>
<p> </p>
<h3>Draft Rules under Section 69 <br /></h3>
<p><strong>Rule 3</strong><br />Directions for interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource under sub- section (2) of section 69 of the Information Technology (Amendment) Act, 2008 (hereinafter referred to as the said Act) shall not be issued except by an order made by the concerned competent authority who is Union Home Secretary in case of Government of India; the Secretary in-charge of Home Department in a State Government or Union Territory as the case may be. In unavoidable circumstances, such order may be made by an officer, not below the rank of a Joint Secretary to the Government of India, who has been duly authorised by the Union Home Secretary or by an officer equivalent to rank of Joint Secretary to Government of India duly authorised by the Secretary in-charge of Home Department in the State Government or Union Territory, as the case may be:</p>
<p>Provided that in emergency cases – <br />(i) in remote areas, where obtaining of prior directions for interception or monitoring or decryption of information is not feasible; or <br />(ii) for operational reasons, where obtaining of prior directions for interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource is not feasible;</p>
<p>the required interception or monitoring or decryption of any information generated, transmitted, received or stored in any computer resource shall be carried out with the prior approval of the Head or the second senior most officer of the Security and Law Enforcement Agencies (hereinafter referred to as the said Security Agencies) at the Central Level and the officers authorised in this behalf, not below the rank of Inspector General of Police or an officer of equivalent rank, at the State and Union Territory level. The concerned competent authority, however, shall be informed of such interceptions or monitoring or decryption by the approving authority within three working days and that such interceptions or monitoring or decryption shall be got confirmed by the concerned competent authority within a period of seven working days. If the confirmation from the concerned competent authority is not received within the stipulated seven working days, such interception or monitoring or decryption shall cease and the same information shall not be intercepted or monitored or decrypted thereafter without the prior approval of the concerned competent authority, as the case may be. </p>
<p><br /><strong>Recommendation #3</strong><br />In Rule 3, the following proviso may be inserted:</p>
<p class="callout">“Provided that in the event of cooperation by any intermediary being required for the purpose of interception, monitoring or decryption of such information as is referred to in this Rule, prior permission from a Supervisory Committee headed by a retired Judge of the Supreme Court or the High Courts shall be obtained before seeking to enforce the Order mentioned in this Rule against such intermediary.”</p>
<p><strong>Reasons for the Recommendation </strong><br />Section 69 and the draft rules suffer from the absence of essential procedural safeguards, a result of the blanket emulation of the Telegraph Rules. Additional safeguards should have been prescribed to ensure that the intermediary is put to minimum hardship when carrying out the monitoring or granting access to a computer resource. Such actions are akin to a raid, in the sense that they can stop an online e-commerce portal from carrying out operations for a day or even more, thus affecting its revenue. It is therefore recommended that in any situation where cooperation from the intermediary is sought, prior judicial approval must be obtained. The Central or State Government cannot be the sole authority in such cases.</p>
<p>Furthermore, since access to the computer resource is required, an executive order should not suffice, and a search warrant or an equivalent which results from a judicial application of the mind (by the Supervisory Committee, for instance) should be required.</p>
<p><br /><strong>Recommendation #4</strong><br />The following should be inserted after the last line in Rule 22:</p>
<p class="callout">The Review Committee shall also have the power to award compensation to the intermediary in cases where the intermediary has suffered loss or damage due to the actions of the competent authority while implementing the order issued under Rule 3.</p>
<p><strong>Reasons for the Recommendation</strong><br />The Review Committee should be given the power to award compensation for the loss suffered by the intermediary in cases where the police use equipment or software for monitoring/decryption that causes damage to the intermediary’s computer resources or networks. The Review Committee should also be given the power to award compensation in the case of monitoring directions which are later found to be frivolous or, even worse, borne out of mala fide considerations. These provisions will act as a disincentive against the abuse of the power contained in Section 69. </p>
<p> </p>
<h2>Blocking of Access to Information</h2>
<h3>Section 69A</h3>
<p>The section provides for the blocking of websites if the government is satisfied that it is in the interests of the purposes enlisted in the section. It also provides for a penalty of up to seven years for intermediaries who fail to comply with directions under this section. <br />The rules under this section describe the procedure that has to be followed, failing which the review committee may, after due examination of the procedural defects, order an unblocking of the website.</p>
<p> </p>
<p><strong>Section 69A(3)</strong><br />The intermediary who fails to comply with the direction issued under sub-section (1) shall be punished with an imprisonment for a term which may extend to seven years and also be liable to fine.</p>
<p> </p>
<p><strong>Recommendation #5</strong><br />The penalty for intermediaries must be lessened.</p>
<p> </p>
<p><strong>Reasons for Recommendations </strong><br />The penal provision in this section, which prescribes up to seven years' imprisonment and a fine for an intermediary who fails to comply with the directions so issued, is excessively harsh. Considering that various mechanisms are available to circumvent the blocking of websites, intermediaries must be given enough time and space to administer the block effectively, and strict application of the penal provisions must be avoided in bona fide cases.</p>
<p>The criticism of Section 69 and its draft rules, in so far as intermediary liability is concerned, also applies mutatis mutandis to Section 69A and the rules under it.</p>
<p> </p>
<h3>Draft Rules under Section 69A</h3>
<p><strong>Rule 22: Review Committee</strong><br />The Review Committee shall meet at least once in two months and record its findings whether the directions issued under Rule (16) are in accordance with the provisions of sub-section (2) of section 69A of the Act. When the Review Committee is of the opinion that the directions are not in accordance with the provisions referred to above, it may set aside the directions and order for unblocking of said information generated, transmitted, received, stored or hosted in a computer resource for public access.</p>
<p><br /><strong>Recommendation #6</strong><br />A permanent Review Committee should be set up specially for the purpose of examining procedural lapses. </p>
<p><br /><strong>Reasons for Recommendation </strong><br />Rule 22 provides for a review committee which shall meet a minimum of once every two months and order the unblocking of a site if due procedures have not been followed. This means that if a site is blocked, it could take up to two months for a procedural lapse to be corrected and the site to be unblocked. Even a writ filed against the policing agencies for unfair blocking would probably take around the same time. It could also well be the case that the review committee will be overburdened with cases and may fall short of time to inquire into each. Therefore, it is recommended that a permanent Review Committee be set up to monitor procedural lapses and ensure that no blocking takes place in the first place before all the due procedural requirements are met. <br /><br /></p>
<h2>Monitoring and Collection of Traffic Data</h2>
<h3>Draft Rules under Section 69B</h3>
<p>The section provides for the monitoring of computer networks or resources if the Central Government is satisfied that the conditions mentioned therein are met.</p>
<p>The rules provide for the manner in which the monitoring will be done, the process by which the directions for the same will be issued and the liabilities of the intermediaries and monitoring officers with respect to confidentiality of the information so monitored.</p>
<p><br /><strong>Grounds for Monitoring </strong><br /><strong>Rule 4</strong><br />The competent authority may issue directions for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource for any or all of the following purposes related to cyber security:<br />(a) forecasting of imminent cyber incidents;<br />(b) monitoring network application with traffic data or information on computer resource;<br />(c) identification and determination of viruses/computer contaminant;<br />(d) tracking cyber security breaches or cyber security incidents;<br />(e) tracking computer resource breaching cyber security or spreading virus/computer contaminants;<br />(f) identifying or tracking of any person who has contravened, or is suspected of having contravened or being likely to contravene cyber security;<br />(g) undertaking forensic of the concerned computer resource as a part of investigation or internal audit of information security practices in the computer resource;<br />(h) accessing a stored information for enforcement of any provisions of the laws relating to cyber security for the time being in force;<br />(i) any other matter relating to cyber security.</p>
<p><br /><strong>Rule 6</strong><br />No direction for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource shall be given for purposes other than those specified in Rule (4).</p>
<p><br /><strong>Recommendation #7</strong><br />Clauses (a), (b), (c), and (i) of Rule 4 must be repealed.</p>
<p><br /><strong>Reasons for Recommendations </strong><br />The term “cyber incident” has not been defined, and “cyber security” has been given a circular definition. Rule 6 clearly states that no direction for monitoring and collection of traffic data or information generated, transmitted, received or stored in any computer resource shall be given for purposes other than those specified in Rule 4. Therefore, it may prima facie appear that the government is trying to lay down clear and strict safeguards when it comes to monitoring at the expense of citizens’ privacy. However, Rule 4(i) allows the government to monitor if it is satisfied that the matter is “any other matter relating to cyber security”. This may well serve as a ‘catch-all’ clause to legalise any kind of monitoring and collection, and it therefore defeats the purported intention of Rule 6 of safeguarding citizens’ interests against arbitrary and groundless intrusions of privacy. Also, the question of the degree of liability of the intermediaries or persons in charge of the computer resources for leaks of secret and confidential information remains unanswered. <br /><br /><strong>Rule 24: Disclosure of monitored data </strong><br />Any monitoring or collection of traffic data or information in computer resource by the employee of an intermediary or person in-charge of computer resource or a person duly authorised by the intermediary, undertaken in course of his duty relating to the services provided by that intermediary, shall not be unlawful, if such activities are reasonably necessary for the discharge of his duties as per the prevailing industry practices, in connection with:<br />(vi) Accessing or analysing information from a computer resource for the purpose of tracing a computer resource or any person who has contravened, or is suspected of having contravened or being likely to contravene, any provision of the Act that is likely to have an adverse impact on the services provided by the intermediary.</p>
<p><br /><strong>Recommendation #8</strong><br />Safeguards must be introduced with respect to exercise of powers conferred by Rule 24(vi). </p>
<p><br /><strong>Reasons for Recommendations </strong><br />Rule 24(vi) provides for the access, collection and monitoring of information from a computer resource for the purpose of tracing another computer resource which has contravened or is likely to contravene provisions of the Act in a way that is likely to have an adverse impact on the services provided by the intermediary. Analysis of a computer resource may reveal extremely confidential and important data, the compromise of which may cause losses worth millions. Therefore, the burden of proof for such an intrusion into the privacy of the computer resource that is first used to track the suspect computer resource should be heavy. This violation of privacy should also be weighed against the benefits accruing to the intermediary. The framing of sub-rules clearly specifying these requirements is recommended. </p>
<p><br />The disclosure of sensitive information by a monitoring agency for the purposes of ‘general trends’ and ‘general analysis of cyber information’ is uncalled for, as it dissipates information among lesser bodies that are not governed by sufficient safeguards, which could result in outright violations of citizens’ privacy.</p>
<p> </p>
<h2>Manner of Functioning of CERT-In</h2>
<h3>Draft Rules under Section 70B(5)</h3>
<p>Section 70B provides for an Indian Computer Emergency Response Team (CERT-In), which shall serve as the national agency for performing the duties prescribed by sub-section (4) of that section, in accordance with the rules as prescribed.<br />The rules provide for CERT-In’s authority, the composition of its advisory committee, its constituency, functions and responsibilities, services, stakeholders, policies and procedures, modus operandi, disclosure of information, and measures to deal with non-compliance with orders so issued. However, there are a few issues which need to be addressed, as under:</p>
<p><br /><strong>Definitions</strong><br />In these Rules, unless the context otherwise requires, “Cyber security incident” means any real or suspected adverse event in relation to cyber security that violates an explicit or implied security policy resulting in unauthorized access, denial of service/ disruption, unauthorized use of a computer resource for processing or storage of information or changes to data, information without authorization.</p>
<p><br /><strong>Recommendation #9</strong><br />The words ‘or implied’ must be excluded from Rule 2(g), which defines ‘cyber security incident’, and the term ‘security policy’ must be qualified to state which security policy is being referred to.</p>
<p><br /><strong>Reasons for Recommendation</strong><br />“Cyber security incident” means any real or suspected adverse event in relation to cyber security that violates an explicit or implied security policy resulting in unauthorized access, denial of service/disruption, unauthorized use of a computer resource for processing or storage of information or changes to data, information without authorization. </p>
<p><br />Thus, the section defines any circumstance where an explicit or implied security policy is contravened as a ‘cyber security incident’. Without clearly stating what the security policy is, an inquiry into its contravention is against an individual’s civil rights. If an individual’s actions are to be restricted for reasons of security, then the restrictions must be expressly defined and such restrictions cannot be said to be implied.</p>
<p><br /><strong>Rule 13(4): Disclosure of Information </strong><br />Save as provided in sub-rules (1), (2), (3) of rule 13, it may be necessary or expedient so to do, for CERT-In to disclose all relevant information to the stakeholders, in the interest of sovereignty or integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of an offence relating to cognizable offence or enhancing cyber security in the country.</p>
<p><br /><strong>Recommendation #10</strong><br />Burden of necessity for disclosure of information should be made heavier. </p>
<p><br /><strong>Reasons for the Recommendation</strong><br />Rule 13(4) allows the disclosure of information by CERT-In in the interests of ‘enhancing cyber security’. This enhancement, however, needs to be weighed against the detriment caused to the individual, and the burden of proof must be on CERT-In to show that disclosure was the only way of achieving the required end. </p>
<p><br /><strong>Rule 19: Protection for actions taken in Good Faith </strong><br />All actions of CERT-In and its staff acting on behalf of CERT-In are taken in good faith in fulfillment of its mandated roles and functions, in pursuance of the provisions of the Act or any rule, regulations or orders made thereunder. CERT-In and its staff acting on behalf of CERT-In shall not be held responsible for any unintended fallout of their actions.</p>
<p><br /><strong>Recommendation #11</strong><br />CERT-In should be made liable for its negligent actions, and no blanket presumption of good faith should be provided for. </p>
<p><br /><strong>Reasons for the Recommendation </strong><br />Rule 19 provides for the protection of CERT-In members for actions taken in ‘good faith’, and further shields them from responsibility for any ‘unintended fallout’ of their actions. Clearly, if highly confidential information has been called for, this rule bars any remedy for a leak of that information due to the negligence of CERT-In members. This is not permissible: an agency that calls for delicate information should also be held responsible for mishandling it, whether intentionally or negligently. Good faith can be established if the need arises, and no presumption as to good faith needs to be provided.</p>
<p> </p>
<h3>Draft Rules under Section 52</h3>
<p>These rules, entitled the “Cyber Appellate Tribunal (Salary, Allowances and Other Terms and Conditions of Service of Chairperson and Members) Rules, 2009” are meant to prescribe the framework for the independent and smooth functioning of the Cyber Appellate Tribunal. This is so because of the specific functions entrusted to this Appellate Tribunal. Under the IT Act, 2000 as amended by the IT (Amendment) Act, 2008, this Tribunal has the power to entertain appeals against orders passed by the adjudicating officer under Section 47.</p>
<p><br /><strong>Recommendation #12</strong><br />Amend the qualifications in the Information Technology (Qualification and Experience of Adjudicating Officers and Manner of Holding Enquiry) Rules, 2003, to require judicial training and experience.</p>
<p><br /><strong>Reasons for the Recommendation</strong><br />It is submitted that an examination of these rules governing the Appellate Tribunal cannot be made independent of the powers and qualifications of Adjudicating Officers who are the original authority to decide on contravention of provisions in the IT Act dealing with damage to computer system and failure to furnish information. Even as per the Information Technology (Qualification and Experience of Adjudicating Officers and Manner of Holding Enquiry) Rules, 2003, persons who did not possess judicial experience and training, such as those holding the post of Director in the Central Government, were qualified to perform functions under Section 46 and decide whether there has been unauthorized access to a computer system. This involves appreciation of evidence and is not a merely administrative function that could be carried on by any person who has basic knowledge of information technology.</p>
<p>Viewed from this angle, the qualifications of the Cyber Appellate Tribunal members should have been made much tighter in the new draft rules. The above rules, when read with Section 50 of the IT Act as amended in 2008, say nothing about the qualifications of the technical members beyond the fact that such a person shall not be appointed as a Member unless he is, or has been, in the service of the Central Government or a State Government and has held the post of Additional Secretary or Joint Secretary or an equivalent post, even though special knowledge of, and professional experience in, information technology, telecommunication, industry, management or consumer affairs has been prescribed in the Act as a requirement for any technical member.</p>
<p> </p>
<h3>Draft Rules under Section 54</h3>
<p>These Rules do not suffer from any defect, and they provide for a fair and reasonable enquiry in so far as allegations made against the Chairperson or the members of the Cyber Appellate Tribunal are concerned.</p>
<p> </p>
<h2>Penal Provisions</h2>
<h3>Section 66A</h3>
<p>Any person who sends, by means of a computer resource or a communication device,<br /> (a) any information that is grossly offensive or has menacing character; or<br /> (b) any information which he knows to be false, but for the purpose of causing annoyance, inconvenience, danger, obstruction, insult, injury, criminal intimidation, enmity, hatred or ill will, persistently by making use of such computer resource or a communication device,<br /> (c) any electronic mail or electronic mail message for the purpose of causing annoyance or inconvenience or to deceive or to mislead the addressee or recipient about the origin of such messages,<br />shall be punishable with imprisonment for a term which may extend to three years and with fine.<br />Sec. 32 of the 2008 Act inserts Sec. 66A which provides for penal measures for mala fide use of electronic resources to send information detrimental to the receiver. For the section to be attracted the ‘information’ needs to be grossly offensive, menacing, etc. and the sender needs to have known it to be false.</p>
<p>While the intention of the section – to prevent activities such as spam-sending – might be sound and even desirable, there is still a strong argument that the use of words such as ‘annoyance’ and ‘inconvenience’ (in s.66A(c)) is highly problematic. Further, something can be grossly offensive without touching upon any of the conditions laid down in Article 19(2). Without satisfying the conditions of Article 19(2), this provision would be ultra vires the Constitution.</p>
<p><br /><strong>Recommendation #13</strong><br />The section should be amended and words which lead to ambiguity must be excluded.</p>
<p><br /><strong>Reasons for the Recommendation </strong><br />What exactly could convey ‘ill will’ or cause annoyance in electronic forms of communication needs to be clarified. In some electronic forms, it is possible for the receiver to already know the content of the information; if such a possibility is ignored and annoyance nonetheless occurs, is the sender still liable? Keeping in mind the complexity of electronic modes of transmitting information, several such situations arise which the section covers only vaguely. Therefore, a stricter and more clinical approach is necessary. </p>
<p><br /><strong>Recommendation #14</strong><br />A proviso should be inserted to this section providing for specific exceptions to the offence contained in this section for reasons such as fair comment, truth, criticism of actions of public officials etc. </p>
<p> </p>
<p><strong>Reasons for the Recommendation </strong><br />The major problem with Section 66A lies in clause (c) as per which any electronic mail or electronic mail message sent with the purpose of causing annoyance or inconvenience is covered within the ambit of offensive messages. This does not pay heed to the fact that even a valid and true criticism of the actions of an individual, when brought to his notice, can amount to annoyance. Indeed, it may be brought to his attention with the sole purpose of causing annoyance to him. When interpreting the Information Technology Act, it is to be kept in mind that the offences created under this Act should not go beyond those prescribed in the Indian Penal Code except where there is a wholly new activity or conduct, such as hacking for instance, which is sought to be criminalized.</p>
<p>Offensive messages have been criminalized in the Indian Penal Code subject to the conditions specified in Chapter XXII being present. It is not an offence to verbally insult or annoy someone without anything more being done such as a threat to commit an offence, etc. When this is the case with verbal communications, there is no reason to make an exception for those made through the electronic medium and bring any electronic mail or message sent with the purpose of causing annoyance or inconvenience within the purview of an offensive message.</p>
<p> </p>
<h3>Section 66F</h3>
<p>The definition of cyber-terrorism under this provision is too wide and can cover several activities which are not actually of a “terrorist” character. <br />Section 66F(1)(B) is particularly harsh and goes much beyond acts of “terrorism” to include various other activities within its purview. As per this provision, <br />“[w]hoever knowingly or intentionally penetrates or accesses a computer resource without authorisation or exceeding authorised access, and by means of such conduct obtains access to information, data or computer database that is restricted for reasons of the security of the State or foreign relations, or any restricted information, data or computer database, with reasons to believe that such information, data or computer database so obtained may be used to cause or is likely to cause injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, public order, decency or morality, or in relation to contempt of court, defamation or incitement to an offence, or to the advantage of any foreign nation, group of individuals or otherwise, commits the offence of cyber terrorism.”</p>
<p>This provision suffers from several defects and hence ought to be repealed. </p>
<p><br /><strong>Recommendation #15</strong><br />Section 66F(1)(B) has to be repealed or suitably amended to water down the excessively harsh operation of this provision. The restrictive nature of the information that is unauthorisedly accessed must be confined to those that are restricted on grounds of security of the State or foreign relations. The use to which such information may be put should again be confined to injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order. A mere advantage to a foreign nation cannot render the act of unauthorized access one of cyber-terrorism as long as such advantage is not injurious or harmful in any manner to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order. A mens rea requirement should also be introduced whereby mere knowledge that the information which is unauthorisedly accessed can be put to such uses as given in this provision should not suffice for the unauthorised access to amount to cyber-terrorism. The unauthorised access should be with the intention to put such information to this use. The amended provision would read as follows:</p>
<p class="callout">“[w]hoever knowingly or intentionally penetrates or accesses a computer resource without authorisation or exceeding authorised access, and by means of such conduct obtains access to information, data or computer database that is restricted for reasons of the security of the State or foreign relations, with the intention that such information, data or computer database so obtained may be used to cause injury to the interests of the sovereignty and integrity of India, the security of the State, friendly relations with foreign States, or public order, commits the offence of cyber terrorism.”</p>
<p class="callout"> </p>
<p><strong>Reasons for the Recommendation </strong><br />The ambit of this provision goes much beyond information, data or computer database which is restricted only on grounds of security of the State or foreign relations and extends to “any restricted information, data or computer database”. This expression covers any government file which is marked as confidential or saved in a computer used exclusively by the government. It also covers any file saved in a computer exclusively used by a private corporation or enterprise. Even the use to which such information can be put need not be confined to those that cause or are likely to cause injury to the interests of the sovereignty and integrity of India, the security of the State, or friendly relations with foreign States. Information or data which is defamatory, amounting to contempt of court, or against decency / morality, are all covered within the scope of this provision. This goes way beyond the idea of a terrorist activity and poses serious questions. While there is no one globally accepted definition of cyberterrorism, it is tough to conceive of slander as a terrorist activity.</p>
<p>To give an illustration, if a journalist managed to unauthorisedly break into a restricted database, even one owned by a private corporation, and stumbled upon information that is defamatory in character, he would have committed an act of “cyber-terrorism.” Various kinds of information pertaining to corruption in the judiciary may be precluded from being unauthorisedly accessed on the ground that such information may be put to use for committing contempt of court. Any person who gains such access would again qualify as a cyber-terrorist. The factual situations are numerous where this provision can be put to gross misuse with the ulterior motive of muzzling dissent or freezing access to information that may be restricted in nature but nonetheless have a bearing on probity in public life etc. It is therefore imperative that this provision may be toned down as recommended above. <br /><br /></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/comments-draft-rules'>http://editors.cis-india.org/internet-governance/blog/comments-draft-rules</a>
</p>
Blog Entry, 2011-09-21: CIS Para-wise Comments on Intermediary Due Diligence Rules, 2011
http://editors.cis-india.org/internet-governance/blog/intermediary-due-diligence
<b>On February 7th 2011, the Department of Information Technology, MCIT published draft rules on its website (The Information Technology (Due diligence observed by intermediaries guidelines) Rules, 2011) in exercise of the powers conferred by Section 87(2)(zg), read with Section 79(2) of the Information Technology Act, 2000. Comments were invited from the public before February 25th 2011. Accordingly, Privacy India and Centre for Internet and Society, Bangalore have prepared the following para-wise comments for the Ministry’s consideration.</b>
<h2>A. General Objections</h2>
<p>A number of the provisions under these Rules have no nexus with their parent provision, namely s.79(2). Section 79(1) provides for exemption from liability for intermediaries. Section 79(2) thereupon states:</p>
<blockquote>
<p>79. Intermediaries not to be liable in certain cases—</p>
<blockquote>
<p>(2) The provisions of sub-section (1) shall apply if—</p>
<blockquote>
<p>(a) the function of the intermediary is limited to providing access to a communication system over which information made available by third parties is transmitted or temporarily stored or hosted; or</p>
<p>(b) the intermediary does not—</p>
<blockquote>
<p>(i) initiate the transmission,</p>
<p>(ii) select the receiver of the transmission, and</p>
<p>(iii) select or modify the information contained in the transmission;</p>
</blockquote>
<p>(c) the intermediary observes due diligence while discharging his duties under this Act and also observes such other guidelines as the Central Government may prescribe in this behalf.</p>
</blockquote>
</blockquote>
</blockquote>
<p>Therefore, by not observing any of the provisions of the Rules, the intermediary opens itself up to liability for the actions of its users. However, many of the provisions of the Rules have no rational nexus with the due diligence to be observed by an intermediary to absolve itself from liability.</p>
<h2>B. Specific Objections</h2>
<h3>Rule 2(b), (c), and (k)</h3>
<blockquote>
<p>(b) “Blog” means a type of website, usually maintained by an individual with regular entries of commentary, descriptions of events, or other material such as graphics or video. Usually blog is a shared on-line journal where users can post diary entries about their personal experiences and hobbies;</p>
<p>(c) “Blogger” means a person who keeps and updates a blog;</p>
<p>(k) “User” means any person including blogger who uses any computer resource for the purpose of sharing information, views or otherwise and includes other persons jointly participating in using the computer resource of intermediary</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>It is unclear why it is necessary to specifically target bloggers as users, while leaving out other users such as blog commenters, social network users, microbloggers, podcasters, etc. This makes the rules technologically non-neutral.</p>
<h3><strong>Recommendation</strong></h3>
<p>We recommend that these three sub-rules be deleted.</p>
<h3>Rule 3(2)</h3>
<blockquote>
<p>3. <strong>Due Diligence observed by intermediary</strong>.— The intermediary shall observe following due diligence while discharging its duties.</p>
<blockquote>
<p>(2) The intermediary shall notify users of computer resource not to use, display, upload, modify, publish, transmit, update, share or store any information that : —</p>
<blockquote>
<p>(a) belongs to another person;</p>
<p>(b) is harmful, threatening, abusive, harassing, blasphemous, objectionable, defamatory, vulgar, obscene, pornographic, paedophilic, libellous, invasive of another’s privacy, hateful, or racially, ethnically or otherwise objectionable, disparaging, relating or encouraging money laundering or gambling, or otherwise unlawful in any manner whatever;</p>
<p>(c) harm minors in any way;</p>
<p>(d) infringes any patent, trademark, copyright or other proprietary rights;</p>
<p>(e) violates any law for the time being in force;</p>
<p>(f) discloses sensitive personal information of other person or to which the user does not have any right to;</p>
<p>(g) causes annoyance or inconvenience or deceives or misleads the addressee about the origin of such messages or communicates any information which is grossly offensive or menacing in nature;</p>
<p>(h) impersonate another person;</p>
<p>(i) contains software viruses or any other computer code, files or programs designed to interrupt, destroy or limit the functionality of any computer resource;</p>
<p>(j) threatens the unity, integrity, defence, security or sovereignty of India, friendly relations with foreign states, or public order or causes incitement to the commission of any cognizable offence or prevents investigation of any offence or is insulting any other nation.</p>
</blockquote>
</blockquote>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>Firstly, such ‘standard’ terms of use [1] might make sense for one intermediary, but not for all. For instance, an intermediary such as a site with user-generated content (e.g., Wikipedia) would need different terms of use from an intermediary such as an e-mail provider (e.g., Hotmail), because the kinds of liability they accrue are different. This is similar to how the liability that a newspaper publisher accrues is different from that accrued by the post office. However, forcing standard terms of use negates this difference. Thus, these terms are impractical.</p>
<p>Secondly, read with the legal obligation of the intermediary to remove such information (contained in rule 3(3)), they vest an extraordinary power of censorship in the hands of the intermediary, which could easily lead to the stifling of the constitutionally guaranteed freedom of speech online. Analogous restrictions do not exist in other fields, e.g., against the press in India or against courier companies, and there is no justification to impose them on content posted online. Taken together, these provisions make it impossible to publish critical views about anything without the risk of being summarily censored.</p>
<p>Thirdly, while it is possible to apply Indian law to intermediaries, it is impracticable to require all intermediaries (whether in India or not) to have in their terms of use India-specific clauses such as rule 3(2)(j). Instead, it is better to merely require them to ask their users to follow all relevant laws.</p>
<p>Individual instances of how these rules are overly broad are contained in an appendix to this submission.</p>
<h3><strong>Recommendation</strong></h3>
<p>We strongly recommend the deletion of this sub-rule, except clause (e).</p>
<h3>Rule 3(3)</h3>
<blockquote>
<p>(3) The intermediary shall not itself host or publish or edit or store any information or shall not initiate the transmission, select the receiver of transmission, and select or modify the information contained in the transmission as specified in sub-rule (2).</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This sub-rule is ultra vires s.79 of the IT Act, which nowhere requires intermediaries not to “host or publish or edit or store any information”. In fact, s.79(2) merely states that by violating its provisions the intermediary loses the protection of s.79(1); it does not make it unlawful to violate s.79(2), as rule 3(3) does. Rule 3(3) is therefore ultra vires the Act.</p>
<h3><strong>Recommendation</strong></h3>
<p>This sub-rule should be deleted.</p>
<h3><strong>Rule 3(4)</strong></h3>
<blockquote>
<p>(4) The intermediary upon obtaining actual knowledge by itself or been brought to actual knowledge by an authority mandated under the law for the time being in force in writing or through email signed with electronic signature about any such information as mentioned in sub-rule (2) above, shall act expeditiously to work with user or owner of such information to remove access to such information that is claimed to be infringing or to be the subject of infringing activity. Further the intermediary shall inform the police about such information and preserve the records for 90 days</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This rule is ultra vires s.69A of the IT Act as well as the Constitution of India. Section 69A states all the grounds on which an intermediary may be required to restrict access to information [2]. It does not allow for expansion of those grounds, because it has been carefully worded to maintain its constitutional validity vis-a-vis Articles 19(1)(a) and 19(2) of the Constitution of India. The rules framed under s.69A prescribe an elaborate procedure before such censorship may be ordered. The rules under s.69A would be rendered nugatory if any person could get content removed or blocked under s.79(2).</p>
<p>This rule requires an intermediary to immediately take steps to remove access to information merely upon receiving a written request from “any authority mandated under the law”. Thus, for example, any authority can easily immunise itself from criticism on the internet by simply sending a written notice to the intermediary concerned. This is directly contrary to, and completely subverts, the legislative intent expressed in Section 69A, which lays down an elaborate procedure to be followed before any information can be lawfully blocked.</p>
<p>If any person is aggrieved by information posted online, they may seek their remedies—including the relief of injunction—from courts of law, under generally applicable civil and criminal law. Inserting a rule such as this one would take away the powers of the judiciary in India to define the line dividing permissible and impermissible speech, and vest it instead in the whims of each intermediary. This can only have a chilling effect on debates in the public domain (of which the Internet is a part) which is the foundation of any democracy.</p>
<h3><strong>Recommendation</strong></h3>
<p>This rule should be modified so that an intermediary is obliged to take steps towards removal of content only when backed by (a) an order from a court, or (b) a direction issued following the procedure prescribed by the rules framed under Section 69A.</p>
<h3>Rules 3(5), (7), (8) and (10)</h3>
<blockquote>
<p>(5) The Intermediary shall inform its users that in case of non-compliance with terms of use of the services and privacy policy provided by the Intermediary, the Intermediary has the right to immediately terminate the access rights of the users to the site of Intermediary;</p>
<p>(7) The intermediary shall not disclose sensitive personal information;</p>
<p>(8) Disclosure of information by intermediary to any third party shall require prior permission or consent from the provider of such information, who has provided such information under lawful contract or otherwise;</p>
<p>(10) The information collected by the intermediary shall be used for the purpose for which it has been collected.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>These sub-rules have no nexus with intermediary liability or non-liability under s.79(2). For instance, it is unreasonable to say that an intermediary may be held liable for the actions of its users if it does not inform its users about its right to terminate access by the user to its services. Furthermore, not all intermediaries need be websites, as sub-rule 5 assumes. An intermediary can even be an “internet service provider” or a “cyber cafe” or a “telecom service provider”, as per rule 2(j) read with s.2(1)(w) of the IT Act.</p>
<p>The requirements under sub-rules (7), (8), and (10) are rightfully the domain of s.43A and the rules made thereunder, not of s.79(2) or these rules.</p>
<h3><strong>Recommendation</strong></h3>
<p>These sub-rules should be deleted; sub-rules (7), (8), and (10) may be placed instead in the rules made under s.43A.</p>
<h3>Rule 3(9)</h3>
<blockquote>
<p>(9) Intermediary shall provide information to government agencies who are lawfully authorised for investigative, protective, cyber security or intelligence activity. The information shall be provided for the purpose of verification of identity, or for prevention, detection, investigation, prosecution, cyber security incidents and punishment of offences under any law for the time being in force, on a written request stating clearly the purpose of seeking such information.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>This provision is ultra vires ss.69 and 69B. Rules have already been issued under ss.69 and 69B which stipulate the mechanism and procedure to be followed by the government for interception, monitoring or decrypting information in the hands of intermediaries. Thus under the Interception Rules 2009 framed under Section 69, permission must first be obtained from a “competent authority” before an intermediary can be directed to provide access to its records and facilities. The current rule completely removes the safeguards contained in s.69 and its rules, and would make intermediaries answerable to virtually any request from any government agency. This is contrary to the legislative intent expressed in Section 69.</p>
<h3><strong>Recommendation</strong></h3>
<p>We recommend this sub-rule be deleted.</p>
<h3><strong>Rule 3(12)</strong></h3>
<blockquote>
<p>(12) The intermediary shall report cyber security incidents and also share cyber security incidents related information with the Indian Computer Emergency Response Team.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>The rules relating to how and when the Indian Computer Emergency Response Team may request information from intermediaries are rightfully the subject matter of s.70B(5) [3] and the rules made thereunder by virtue of the rule-making power granted by s.87(2)(yd). The subject matter of rule 3(12) is not the liability of intermediaries for third-party actions; hence there is no nexus between the rule-making power and the rule.</p>
<h3><strong>Recommendations</strong></h3>
<p>We recommend that this sub-rule be deleted.</p>
<h3>Rule 3(14)</h3>
<blockquote>
<p>(14) The intermediary shall publish on its website the designated agent to receive notification of claimed infringements.</p>
</blockquote>
<h3><strong>Comments</strong></h3>
<p>It is unclear what “infringements” are being referred to in this sub-rule; neither s.79 nor these rules provide for “infringements”. The same reasoning applied to rule 3(4) also applies here. It would be better to require the intermediary to publish on its website a method of providing judicial notice.</p>
<h3><strong>Recommendations</strong></h3>
<p>Delete this sub-rule, and replace it with a requirement for the intermediary to publish on its website a method of providing judicial notice.</p>
<h2>Footnotes</h2>
<ol><li>
<p>For instance, Section B(1) of the World of Warcraft Code of Conduct states: “When engaging in Chat, you may not: (i) Transmit or post any content or language which, in the sole and absolute discretion of Blizzard, is deemed to be offensive, including without limitation content or language that is unlawful, harmful, threatening, abusive, harassing, defamatory, vulgar, obscene, hateful, sexually explicit, or racially, ethnically or otherwise objectionable.”</p>
</li><li>
<p>It is only “in the interest of sovereignty and integrity of India, defence of India, security of the State, friendly relations with foreign States or public order or for preventing incitement to the commission of any cognizable offence relating to above” that intermediaries may be issued directions to block access to information.</p>
</li><li>
<p>Section 70B(5) states that “[t]he manner of performing functions and duties of the agency referred to in sub-section (1) shall be such as may be prescribed.”</p>
</li></ol>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/intermediary-due-diligence'>http://editors.cis-india.org/internet-governance/blog/intermediary-due-diligence</a>
</p>
No publisher · pranesh · Freedom of Speech and Expression · IT Act · Intermediary Liability · 2012-07-11T10:27:26Z · Blog Entry
Chilling Effects and Frozen Words
http://editors.cis-india.org/internet-governance/chilling-effects-frozen-words
<b>What if the real danger is not that we lose our freedom of speech and expression but our sense of humour as a nation? Lawrence Liang's op-ed was published in the Hindu on April 30, 2012. </b>
<p>While freedom of speech and expression is an individual right, its actualisation often relies on a vast infrastructure of intermediaries.</p>
<p>In the offline world, this includes newspapers, television channels, public auditoriums, etc. It is often assumed that the internet has created a more robust public sphere of speech by doing away with many structural barriers to free speech. But the fact of the matter is that even if the internet enables a shift from a ‘few to many' to a ‘many to many' model of communication, intermediaries continue to remain important players in facilitating free speech. Can one imagine free speech on the internet being the same without Twitter, social networks or YouTube?</p>
<p>One way of thinking of the infrastructure of communication is in terms of ecology, and in the ecology of speech — as in the environment — an adverse impact on any component threatens the well-being of all. The idea of cyberspace as a commons is a much cherished myth and in the early days of the internet we were perhaps given a glimpse into its utopian possibility. But we would be deluding ourselves if we believed that the problems that plague free speech in the offline world (including ownership of the avenues of speech) are absent in cyberspace. Recall that in recent times one of the most effective ways in which various governments retaliated against the leaking of official secrets on WikiLeaks was by freezing Julian Assange's PayPal account.</p>
<h3>Direct & Indirect Controls</h3>
<p>It may be useful to distinguish between direct controls on free speech and indirect or structural controls on free speech. India has had a long history of battling direct and indirect controls on free speech and with a few exceptions the interests of the press have often coincided with the interests of a robust public sphere of debate and criticism.</p>
<p>In the late 1950s and early 1960s, a number of large media houses battled restrictions imposed on the press by way of control of the number of pages of a newspaper, regulation of the size of advertisements and the price of imported newsprint. On the face of it, some of these restrictions may have seemed like commercial disputes but the Supreme Court rightly recognised that indirect controls could adversely impact the individual's right to express himself or herself as well as to receive information freely.</p>
<p>In the online context, there has also been a similar recognition of the role of intermediaries in providing platforms of speech and it is with this view in mind that a number of countries have incorporated safe harbour provisions in their information technology laws.</p>
<p>Section 79 of the Information Technology Act is one such safe harbour provision in India which provides that intermediaries shall not be liable for any third party action if they are able to prove that the offence or contravention was committed without their knowledge or that they had exercised due diligence to prevent the commission of such offence or contravention. But this safe harbour has effectively been undone with the passing of the Information Technology (Intermediaries guidelines) Rules, 2011.</p>
<p>The rules clarify what standard of due diligence has to be met by intermediaries and Sec. 3(2) of the rules obliges intermediaries to have rules and conditions of usage which ensure that users do not host, display, upload, modify, publish, transmit, update or share any information that is in contravention of the Section. This includes the all too familiar ones (defamatory, obscene, pornographic content) but also a whole host of new categories which could be invoked to restrict speech (“grossly harmful,” “blasphemous,” “harassing,” “hateful”).</p>
<p>As is well known, any restriction on speech in India has to comply with the test of reasonableness under Article 19(2) of the Constitution, and the grounds of censorship must be located within 19(2). Even though there are laws regulating hate speech in India, blasphemy is not a category under Art. 19(2) and has hitherto not been a part of Indian law. Some of the other categories, such as “grossly harmful”, suggest that the people who drafted the rules took a constitutional nap at the drafting board.</p>
<p>Sec. 3(4) of the rules provides that any intermediary who receives a notice from an aggrieved person about any violation of sub-rule (2) will have to act within 36 hours and, where applicable, ensure that the information is disabled. In the event that it fails to act or to respond, the intermediary cannot claim exemption from liability under Sec. 79 of the IT Act. It is worth noting that most intermediaries receive hundreds to thousands of requests from individuals on a daily basis asking for the removal of objectionable material. The Centre for Internet and Society conducted a “sting operation” to determine whether the criteria, procedure and safeguards for administration of takedowns as prescribed by the Rules lead to a chilling effect on free expression.</p>
<p>In the course of the study, frivolous takedown notices were sent to seven intermediaries and their response to the notices was documented. Different policy factors were permuted in the takedown notices in order to understand at what points in the process of takedown, free expression is being chilled. The takedown notices which were sent by the researcher were intentionally defective as they did not establish how they were interested parties, did not specifically identify and discuss any individual URL on the websites, or present any cause of action, or suggest any legal injury. Of the seven intermediaries to which takedown notices were sent, six over-complied with the notices, despite the apparent flaws in them.</p>
<h3>Caution</h3>
<p>Even in cases where the intermediaries challenged the validity of the takedowns, they erred on the side of caution and took down the material. While a number of intermediaries would see themselves as allies in the fight against censorship, more often than not intermediaries are also large commercial organisations whose primary concern is the protection of their business interests. In the face of any potential legal threat, especially from the government, they prefer to err on the side of caution. The people whose content was removed were not told, nor was the general public informed that the content was removed.</p>
<p>The procedural flaws (subjective determination, absence of the right to be heard, the short response time) coupled with the vague grounds on which such takedowns can be claimed, clearly point to a highly flawed situation in which we will see many more trigger happy demands for offending materials to be taken down.</p>
<p>We have already slipped into a state of being a republic of over sensitivity where any politician, religious group or individual can claim their sentiments have been hurt or they have been portrayed disparagingly, as evidenced by the recent attack and subsequent arrest of Professor Ambikesh Mahapatra of Jadavpur University for posting cartoons lampooning Mamata Banerjee.</p>
<h3>Nervous State</h3>
<p>In the era of global outsourcing it was inevitable that the state censorship machinery would also learn a lesson or two from the global trends and what better way of ensuring censorship than outsourcing it to individuals and to corporations. The renowned anthropologist, Michael Taussig, once compared the state to a nervous system and it seems that the Intermediary rules live up to the expectations of a nervous state ever ready to respond to criticism and disparaging cartoons.</p>
<p>What if the real danger is not even that we lose our freedom of speech and expression but we lose our sense of humour as a nation?</p>
<p>The evident flaws of the rules have been acknowledged even by lawmakers, with P. Rajeeve, the CPI(M) M.P., introducing a motion for the annulment of the rules. The annulment motion is going to be debated in the coming weeks and one hopes that the parliamentarians will seriously reconsider the rules in their current form.</p>
<p>When faced with conundrums of the present it is always useful to turn to history and there is reason to believe that while censorship has a very respectable genealogy in Indian thought, it has also been accompanied in equal measure by a tradition of the right to offend.</p>
<p>In his delightful reading of the <em>Arthashastra</em>, Sibaji Bandyopadhay alerts us to the myriad restrictions that existed to control Kusilavas (the term for entertainers which included actors, dancers, singers, storytellers, minstrels and clowns). These regulations ranged from the regulation of their movement during monsoon to prohibitions placed on them, ensuring that they shall not “praise anyone excessively nor receive excessive presents”. While some of the regulations appear harsh and unwarranted, Bandyopadhay says that in contrast to Plato's <em>Republic</em>, which banished poets altogether from the ideal republic, the <em>Arthashastra</em> goes so far as to grant to Kusilavas what we could now call the right to offend. Verse 4.1.61 of the <em>Arthashastra</em> says, “In their performances, [the entertainers] may, if they so wish, make fun of the customs of regions, castes or families and the practices or love affairs (of individuals)”. One hopes that our lawmakers, even if they are averse to reading the Indian Constitution, will be slightly more open to the poetic licence granted by Kautilya.</p>
<p><a class="external-link" href="http://www.thehindu.com/opinion/lead/article3367917.ece?homepage=true">Click</a> for the original published in the Hindu on April 30, 2012. Lawrence Liang is a lawyer and researcher based at Alternative Law Forum, Bangalore. He can be contacted at <a class="external-link" href="mailto:lawrence@altlawforum.org">lawrence@altlawforum.org</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/chilling-effects-frozen-words'>http://editors.cis-india.org/internet-governance/chilling-effects-frozen-words</a>
</p>
No publisher · Lawrence Liang · Freedom of Speech and Expression · Public Accountability · Internet Governance · Intermediary Liability · Censorship · 2012-04-30T07:32:17Z · Blog Entry
Centre for Internet and Society joins the Dynamic Coalition for Platform Responsibility
http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility
<b>The Centre for Internet and Society (CIS) has joined the multistakeholder cooperative engagement towards creating Due Diligence Recommendations for online platforms and Model Contractual Provisions to be enshrined in ToS. This blog provides a brief background of the role of dynamic coalitions within the IGF structure, establishes the need for the coalition, and provides an update on the action plan and next steps for interested stakeholders.</b>
<p class="callout" style="text-align: justify; ">"Identify emerging issues, bring them to the attention of the relevant bodies and the general public, and, where appropriate, make recommendations."<br />Tunis Agenda (Para 72.g)</p>
<p style="text-align: justify; ">The first United Nations Internet Governance Forum (IGF), in 2006 saw the emergence of the concept of Dynamic Coalition and a number of coalitions have been established over the years. The IGF is structured to bring together multistakeholder groups to,</p>
<p class="callout" style="text-align: justify; ">"Discuss public policy issues related to key elements of Internet governance in order to foster the sustainability, robustness, security, stability and development of the Internet."<br />Tunis Agenda (Para 72.a)</p>
<p style="text-align: justify; ">While IGF workshops allow various stakeholders to jointly analyse "hot topics" or to examine progress that such issues have undertaken since the previous IGF, dynamic coalitions are informal, issue-specific groups comprising members of various stakeholder groups. With no strictures upon the objects, structure or processes of dynamic coalitions claiming association with the IGF, and no formal institutional affiliation, nor any access to the resources of the IGF Secretariat, IGF Dynamic Coalitions allow collaboration of anyone interested in contributing to their discussions. Currently, there are eleven active dynamic coalitions at the IGF and can be divided into three distinct types—networks, working groups and Birds of Feather (BOFs).</p>
<p style="text-align: justify; ">Workshops at the IGF are content specific events that, though valuable in informing participants, are limited in their impact by being confined to the launch of a report or by the issues raised within the conference room. The coalitions on the other hand are expected to have a broader function, acting as a coalescing point for interested stakeholders to gather and analyse progress around identified issues and plan next steps. The coalitions can also make recommendations around issues, however, no mechanism has been developed so far, by which the recommendations can be considered by the plenary body. The long-term nature of coalition is perhaps, most suited to engage stakeholders in heterogeneous groups, towards understanding and cooperating around emerging issues and to make recommendations to inform policy making.</p>
<h3 style="text-align: justify; ">Platform Responsibility</h3>
<p style="text-align: justify; ">Social networks and other interactive online services, give rise to 'cyber-spaces' where individuals gather, express their personalities and exchange information and ideas. The transnational and private nature of such platforms means that they are regulated through contractual provisions enshrined in the platforms' Terms of Service (ToS). The provisions delineated in the ToS not only extend to users in spite of their geographical location, the private decisions undertaken by platform providers in implementing the ToS are not subject to constitutional guarantees framed under national jurisdictions.</p>
<p style="text-align: justify; ">While ToS serve as binding agreement online, an absence of binding international rules in this area despite the universal nature of human rights represented is a real challenge, and makes it necessary to engage in a multistakeholder effort to produce model contractual provisions that can be incorporated in ToS. The concept of 'platform responsibility' aims to stimulate behaviour in platform providers to provide intelligible and solid mechanisms, in line with the principles laid out by the UN Guiding Principles on Business and Human Rights and equip platform users with common and easy-to-grasp tools to guarantee the full enjoyment of their human rights online. The utilisation of model contractual provisions in ToS may prove instrumental in fostering trust in online services for content production, use and dissemination, increasing demand of services and ultimately consumer demand may drive the market towards human rights compliant solutions.</p>
<h3 style="text-align: justify; ">The Dynamic Coalition on Platform Responsibility</h3>
<p style="text-align: justify; ">To nurture a multi-stakeholder endeavour aimed at the elaboration of model contractual-provisions, Mr. Luca Belli, Council of Europe / Université Paris II, Ms Primavera De Filippi, CNRS / Berkman Center for Internet and Society and Mr Nicolo Zingales, Tilburg University / Center for Technology and Society Rio, initiated and facilitated the creation of the Dynamic Coalition on Platform Responsibility (DCPR). DCPR has over fifty individual and organisational members from civil society organisations, academia, private sector organisations and intergovernmental organisations and held its first meeting at the IGF in Istanbul. The meeting began with an overview of the concept of platform responsibility, highlighting relevant initiatives from Council of Europe, Global Network Initiative, Ranking Digital Rights and the Center for Democracy and Technology have undertaken in this regard. Existing issues such as difficulty in comprehension and lack of standardization of redress across rights were raised along with the fundamental lack of due process in terms of transparency across existing mechanisms.</p>
<p style="text-align: justify; ">Online platforms' compliance with human rights is often framed around the duty of States to protect human rights, and Internet companies often give insufficient consideration to the effects of their business practices on users' fundamental rights, undermining trust.</p>
<p style="text-align: justify; ">The meeting focused its efforts on a call to identify issues of process and substance, and the specific rights and challenges to be addressed by the DCPR. The procedural issues raised concerned 'responsibility' in decision-making, e.g., giving users the right to be heard and an effective remedy before an impartial decision-making body, and obtaining their consent for changes in contractual terms. Concerns were also raised around substantive rights such as privacy and freedom of expression, e.g., the disclosure of personal information and content removal, and the need to promote 'responsibility' by establishing concrete mechanisms to deal with such issues.</p>
<p style="text-align: justify; ">It was suggested that the concept of responsibility, including in cases of conflict between different rights, could be grounded in human rights case law, e.g., the jurisprudence of the European Court of Human Rights. It was also agreed that any framework evolving from this coalition would consider distinctions between users (e.g., adults, children, and people with or without continuous access to the Internet) and platforms (e.g., in terms of size and functionality).</p>
<h3 style="text-align: justify; ">Action Plan</h3>
<p style="text-align: justify; ">The participants at the DCPR meeting agreed to establish a multistakeholder cooperative engagement that will go beyond dialogue and produce concrete proposals. In particular, participants suggested developing:</p>
<ol>
<li style="text-align: justify; ">Due Diligence Recommendations: Recommendations to online platforms with regard to processes of compliance with internationally agreed human rights standards.</li>
<li style="text-align: justify; ">Model Contractual Provisions: Elaboration of a set of principles and provisions protecting platform users’ rights and guaranteeing transparent mechanisms to seek redress in case of violations.</li>
</ol>
<p style="text-align: justify; ">As a preliminary step, DCPR will ground the development of these frameworks in a compilation of existing projects and initiatives analysing the compatibility of ToS with human rights standards. Members, participants and interested stakeholders are invited to highlight and share relevant initiatives by 10 October regarding:</p>
<ol>
<li>Processes of due diligence for human rights compliance;</li>
<li>The evaluation of ToS compliance with human rights standards.</li>
</ol>
<p style="text-align: justify; ">Following this compilation, a first draft recommendation on online platforms' due diligence will be circulated on the mailing list by 30 October 2014. CIS will contribute to the drafting, which will be led by the DCPR coordinators. This draft will be open for comments via the DCPR mailing list until 30 November 2014, and we encourage you to sign up to the mailing list (<a class="external-link" href="http://lists.platformresponsibility.info/listinfo/dcpr">http://lists.platformresponsibility.info/listinfo/dcpr</a>).<br /><br />A second draft, incorporating the comments expressed via the mailing list, will be shared for comments by 10 December 2014. The final version of the recommendation will be drafted by 30 December. Subsequently, the first set of model contractual provisions will be elaborated, building upon this recommendation. A call for inputs will be issued to gather suggestions on the content of these provisions.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility'>http://editors.cis-india.org/internet-governance/blog/cis-joins-dynamic-coalition-for-platform-responsibility</a>
</p>