The Centre for Internet and Society
http://editors.cis-india.org
Content takedown and users' rights
http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1
<b>After Shreya Singhal v Union of India, commentators have continued to question the constitutionality of the content takedown regime under Section 69A of the IT Act (and the Blocking Rules issued under it). There has also been considerable debate around how the judgement has changed this regime: specifically about (i) whether originators of content are entitled to a hearing, (ii) whether Rule 16 of the Blocking Rules, which mandates confidentiality of content takedown requests received by intermediaries from the Government, continues to be operative, and (iii) the effect of Rule 16 on the rights of the originator and the public to challenge executive action. In this opinion piece, we attempt to answer some of these questions.</b>
<p style="text-align: justify;" class="normal"> </p>
<p style="text-align: justify;" class="normal">This article was first <a class="external-link" href="https://theleaflet.in/content-takedown-and-users-rights/">published</a> at the Leaflet. It has subsequently been republished by <a class="external-link" href="https://scroll.in/article/953146/how-india-is-using-its-information-technology-act-to-arbitrarily-take-down-online-content">Scroll.in</a>, <a class="external-link" href="https://kashmirobserver.net/2020/02/15/content-takedown-and-users-rights/">Kashmir Observer</a> and the <a class="external-link" href="https://cyberbrics.info/content-takedown-and-users-rights/">CyberBRICS blog</a>. </p>
<p style="text-align: justify;" class="normal"><strong>Introduction</strong></p>
<p style="text-align: justify;" class="normal">Last year, several Jio users from different states <a href="https://www.medianama.com/2019/03/223-indiankanoon-jio-block/">reported</a> that sites like Indian Kanoon, Reddit and Telegram were inaccessible through their connections. While attempting to access these websites, users were presented with a notice that they had been blocked on orders from the Department of Telecommunications (DoT). When contacted by the founder of Indian Kanoon, Reliance Jio <a href="https://in.reuters.com/article/us-india-internet-idINKCN1RF14D">stated</a> that the website had been blocked on orders of the government, and that the order had been rescinded the same evening. However, in response to a Right to Information (RTI) request, the DoT <a href="https://twitter.com/indiankanoon/status/1218193372210323456">said</a> they had no information about orders relating to the blocking of Indian Kanoon.</p>
<p style="text-align: justify;" class="normal">Similarly, consider that the Committee to Protect Journalists (CPJ) <a href="https://cpj.org/blog/2019/10/india-opaque-legal-process-suppress-kashmir-twitter.php">expressed concern</a> last year that the Indian government was forcing Twitter to suspend accounts or remove content relating to Kashmir. CPJ reported that over the last two years, the Indian government had suppressed a substantial amount of information coming from the region, and prevented Indians from accessing more than five thousand tweets.</p>
<p style="text-align: justify;" class="normal">These instances are <a href="https://www.hindustantimes.com/analysis/to-preserve-freedoms-online-amend-the-it-act/story-aC0jXUId4gpydJyuoBcJdI.html">symptomatic</a> of a larger problem of opaque and arbitrary content takedown in India, enabled by the legal framework under the Information Technology (IT) Act. The Government derives its powers to order intermediaries (entities storing or transmitting information on behalf of others, a definition which includes internet service providers and social media platforms alike) to block online resources through <a href="https://indiankanoon.org/doc/10190353/">section 69A</a> of the IT Act and the <a href="https://meity.gov.in/writereaddata/files/Information%20Technology%20%28%20Procedure%20and%20safeguards%20for%20blocking%20for%20access%20of%20information%20by%20public%29%20Rules%2C%202009.pdf">rules</a> [“the blocking rules”] notified thereunder. Apart from this, <a href="https://indiankanoon.org/doc/844026/">section 79</a> of the IT Act and its allied rules also prescribe a procedure for content removal. <a href="https://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">Conversations</a> with one popular intermediary revealed that the government usually prefers to use its powers under section 69A, possibly because of the opaque nature of the procedure that we highlight below.</p>
<p style="text-align: justify;" class="normal">Under section 69A, a content removal request can be sent by authorised personnel in the Central Government not below the rank of a Joint Secretary. The grounds for issuance of blocking orders under section 69A are: “<em>the interest of the sovereignty and integrity of India, defence of India, the security of the state, friendly relations with foreign states or public order or for preventing incitement to the commission of any cognisable offence relating to the above.</em>” Specifically, the blocking rules envisage the process of blocking to be largely executive-driven, and require strict confidentiality to be maintained around the issuance of blocking orders. This shrouds content takedown orders in a cloak of secrecy, and makes it impossible for users and content creators to ascertain the legitimacy or legality of the government action in any instance of blocking.</p>
<p style="text-align: justify;" class="normal"><strong>Issues</strong></p>
<p style="text-align: justify;" class="normal">The Supreme Court was called upon to determine the constitutional validity of section 69A and the allied rules in <a href="https://indiankanoon.org/doc/110813550/"><em>Shreya Singhal v Union of India</em></a>. The petitioners had contended that the procedure laid down by these rules guaranteed no pre-decisional hearing to the originator of the information. Additionally, the petitioners pointed out that the safeguards built into sections 95 and 96 of the Code of Criminal Procedure (CrPC), which respectively allow state governments to ban publications and persons to initiate legal challenges to those actions, were absent from the blocking procedures. Lastly, the petitioners assailed rule 16 of the blocking rules, which mandated confidentiality of blocking procedures, on the grounds that it affected their fundamental rights.</p>
<p style="text-align: justify;" class="normal">The Court, however, found little merit in these arguments. Specifically, the Court found that section 69A was narrowly drawn and had sufficient procedural safeguards: the grounds for issuing a blocking order were specifically enumerated, and the reasons for blocking a website had to be recorded in writing, making the order amenable to judicial review. Further, the Court found that the provision for setting up a review committee saved the law from constitutional infirmity. In the Court’s opinion, the mere absence of additional safeguards, such as the ones built into the CrPC, did not render the law unconstitutional.</p>
<p style="text-align: justify;" class="normal">But do the ground realities align with the Court’s envisaged implementation of these principles? Apar Gupta, a counsel for the petitioners, <a href="https://indianexpress.com/article/opinion/columns/but-what-about-section-69a/">pointed</a> out that there was no recorded instance of pre-decisional hearing being granted to show that this safeguard contained in the rules was actually being implemented. However, Gautam Bhatia <a href="https://indconlawphil.wordpress.com/2015/03/25/the-supreme-courts-it-act-judgment-and-secret-blocking/">read</a> <em>Shreya Singhal </em>to make an important advance: that the right of hearing be mandatorily extended to the ‘originator’, i.e. the content creator.</p>
<p style="text-align: justify;" class="normal">Additionally, Bhatia also noted that the Court, while upholding the constitutionality of the procedure under section 69A, held that the “<em>reasons have to be recorded in writing in such blocking order so that they may be assailed in a writ petition under Article 226 of the Constitution.</em>”</p>
<p style="text-align: justify;" class="normal">There are two important takeaways from this. <em>Firstly</em>, he argued that the broad contours of the judgment invoke an established constitutional doctrine — that the fundamental right under Article 19(1)(a) does not merely include the right of expression, but also the <em>right of access to information. </em>Accordingly, the right to challenge a blocking order is not vested only in the originator or the concerned intermediary, but may rest with the general public as well. And <em>secondly</em>, by the doctrine of necessary implication, it follows that for the general public to challenge any blocking order under Article 226, blocking orders must be made public. While Bhatia concedes that public availability of blocking orders may be an over-optimistic reading of the judgment, recent events suggest that even the commonly-expected result, i.e. that content creators have the right to a hearing, has not been implemented by the Government.</p>
<p style="text-align: justify;" class="normal">Consider the <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">blocking</a> of the satirical website DowryCalculator.com in September 2019 on orders from the government. The website displayed a calculator that suggests a ‘dowry’ depending on the salary and education of a prospective groom: even if someone misses the satire, the contents of the website do not plausibly relate to any of the grounds of removal listed under section 69A of the IT Act.</p>
<p style="text-align: justify;" class="normal">Tanul Thakur, the creator of the website, was not granted a hearing despite the fact that he had publicly claimed ownership of the website on various occasions and that the website had been covered widely by the press. The information associated with the domain name also publicly lists Thakur’s name and contact information. Clearly, the government made no effort to contact Thakur when passing the order. Perhaps even more worryingly, when he <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">tried</a> to access a copy of the blocking order by filing an RTI request, MeitY cited the confidentiality rule to deny him the information.</p>
<p style="text-align: justify;" class="normal">This incident documents a fundamental problem plaguing the rules: the confidentiality clause is still being used to deny disclosure of key information on content takedown orders. The government has also used the provision to deny citizens a list of blocked websites, as responses to RTI requests have proven <a href="https://cis-india.org/internet-governance/blog/rti-application-to-bsnl-for-the-list-of-websites-blocked-in-india">time</a> and <a href="https://sflc.in/deity-provides-list-sites-blocked-2013-withholds-orders">again</a>.</p>
<p style="text-align: justify;" class="normal">Clearly, the Supreme Court’s rationale in upholding Section 69A and the blocking rules as constitutional is not reflected in practice. The confidentiality clause is preventing legal challenges to content blocking entirely: content creators are unable to access the orders, and hence unable to understand the executive’s reasoning in ordering their content to be blocked from public access.</p>
<p style="text-align: justify;" class="normal">As we noted earlier, the grounds for issuing a blocking order under section 69A pertain to certain reasonable restrictions on expression permitted by Article 19(2), which are couched in broad terms. The government’s implementation of section 69A and the rules makes it impossible to judicially review blocking orders, or to hold the executive accountable for their conformity with the grounds mentioned in the rules, or indeed with any reasonable restriction at all.</p>
<p style="text-align: justify;" class="normal"><strong>The Way Forward</strong></p>
<p style="text-align: justify;" class="normal">From the opacity of proceedings under the law to the lack of information about them in the public domain, the Indian content takedown regime leaves a lot to be desired from both the government and the intermediaries involved. </p>
<p style="text-align: justify;" class="normal"><em>First</em>, we believe the Supreme Court’s decision in <em>Shreya Singhal v. Union of India</em> casts an obligation on the government to attempt to contact the content creator when passing a content takedown order to an intermediary. <em>Second</em>, even if the content creator is unavailable for a hearing at that instance, the confidentiality clause should not be used to prevent future disclosure of information to the content creator, so that affected citizens can access and challenge these orders.</p>
<p style="text-align: justify;" class="normal">While we wait for legal reform, intermediaries can also step up to ensure the rights of users online are upheld. On receiving formal orders, intermediaries should <a href="https://cis-india.org/internet-governance/blog/torsha-sarkar-suhan-s-and-gurshabad-grover-october-30-2019-through-the-looking-glass">assess</a> the legality of the received request. This should involve ensuring that only authorised agencies and personnel have sent the content removal orders, that the order specifically mentions what provision the government is exercising the power under, and that the content removal requests relate to the grounds of removal that are permissible under section 69A. For instance, intermediaries should refuse to entertain content removal requests under section 69A of the IT Act if they relate to obscenity, a ground not covered by the provision.</p>
<p style="text-align: justify;" class="normal">The representatives of the intermediary should also push for the committee to grant a hearing to the content creator. Here, the intermediary can act as a liaison between the uploader and the governmental authorities.</p>
<p style="text-align: justify;" class="normal">The Supreme Court’s recent decision in <a href="https://indiankanoon.org/doc/82461587/"><em>Anuradha Bhasin v. Union of India</em></a><em> </em>offers a glimmer of hope for user rights online<em>. </em>While the case primarily challenged the orders imposing section 144 of the CrPC and a communication blockade in Jammu and Kashmir, the final decision does affirm the fundamental principle that government-imposed restrictions on the freedom of expression and assembly must be made available to the public and affected parties to enable challenges in a court of law.</p>
<p style="text-align: justify;" class="normal"> The judiciary has yet another opportunity to consider the provision and the rules: late last year, Tanul Thakur <a href="https://internetfreedom.in/delhi-hc-issues-notice-to-the-government-for-blocking-satirical-dowry-calculator-website/">approached</a> the Delhi High Court to challenge the orders passed by the government to ISPs to block his website. One hopes that the future holds robust reforms to the content takedown regime.</p>
<p style="text-align: justify;" class="normal"> We live in an era where the ebb and flow of societal discourse is increasingly channeled through intermediaries on the internet. In the absence of a mature, balanced and robust framework that enshrines the rule of law, we risk arbitrary modulation of the marketplace of ideas by the executive.</p>
<p style="text-align: justify;" class="normal"><em> </em></p>
<p style="text-align: justify;" class="normal"><em>Torsha Sarkar and Gurshabad Grover are researchers at the Centre for Internet and Society.</em></p>
<p style="text-align: justify;" class="normal"><em>Disclosure: The Centre for Internet and Society is a recipient of research grants from Facebook and Google.</em></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1'>http://editors.cis-india.org/internet-governance/blog/content-takedown-and-users-rights-1</a>
</p>
Torsha Sarkar, Gurshabad Grover · Internet Freedom · Internet Governance · Intermediary Liability · Censorship · 2020-02-17 · Blog Entry
A Deep Dive into Content Takedown Timeframes
http://editors.cis-india.org/internet-governance/blog/torsha-sarkar-november-30-2019-a-deep-dive-into-content-takedown-timeframes
<b>Since the 1990s, internet usage has seen massive growth, facilitated in part by the growing importance of intermediaries that act as gateways to the internet. Intermediaries such as Internet Service Providers (ISPs), web-hosting providers, social-media platforms and search engines provide key services which propel social, economic and political development. However, these developments are also offset by instances of users engaging with these platforms in unlawful ways. The scale and openness of the internet make regulating such behaviour challenging, which in turn poses several interrelated policy questions.</b>
<p style="text-align: justify;">In this report, we consider one such question by examining the appropriate time frame for an intermediary to respond to a government content removal request. The way legislation around the world frames this answer has wide ramifications for free speech and for the ease with which intermediaries can carry out their operations. Through the course of our research, we found, for instance:</p>
<ol>
<li style="text-align: justify;">A one-size-fits-all model for illegal content may not be productive. The issue of regulating liability online contains several nuances, which must be considered for more holistic law-making. If regulation is made with only the tech incumbents in mind, it becomes incredibly burdensome for the smaller companies in the market. </li>
<li style="text-align: justify;">Determining an appropriate turnaround time for an intermediary must also consider the nature and impact of the content in question. For instance, the Impact Assessment on the Proposal for a Regulation of the European Parliament and of the Council on preventing the dissemination of terrorist content online cites research showing that one-third of all links to Daesh propaganda were disseminated within the first hour of their appearance, and three-fourths of these links were shared within four hours of their release. This was the basic rationale for the subsequent EU Terrorism Regulation, which proposed a one-hour timeframe for intermediaries to remove terrorist content.</li>
<li style="text-align: justify;">Understanding the impact of specific turnaround times on intermediaries requires the law to introduce in-built transparency reporting mechanisms. Such an exercise, performed periodically, generates useful feedback, which can, in turn, be used to improve the system.</li></ol>
<div style="text-align: justify;"> </div>
<div style="text-align: justify;"><strong>Corrigendum: </strong>Please note that in the section concerning 'Regulation on Preventing the Dissemination of Terrorist Content Online', the report mentions that the Regulation has been 'passed in 2019'. At the time of writing the report, the Regulation had only been passed in the European Parliament, and as of May 2020, is currently in the process of a trilogue. </div>
<div style="text-align: justify;"> </div>
<div style="text-align: justify;"><strong>Disclosure</strong>: CIS is a recipient of research grants from Facebook India. </div>
<div style="text-align: justify;"> </div>
<hr />
<p style="text-align: justify;"><a class="external-link" href="http://cis-india.org/internet-governance/files/a-deep-dive-into-content-takedown-frames">Click to download the research paper</a> by Torsha Sarkar (with research assistance from Keying Geng and Merrin Muhammed Ashraf; edited by Elonnai Hickok, Akriti Bopanna, and Gurshabad Grover; inputs from Tanaya Rajwade)</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/torsha-sarkar-november-30-2019-a-deep-dive-into-content-takedown-timeframes'>http://editors.cis-india.org/internet-governance/blog/torsha-sarkar-november-30-2019-a-deep-dive-into-content-takedown-timeframes</a>
</p>
torsha · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2020-06-26 · Blog Entry
Roundtable Discussion on Intermediary Liability
http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability
<b>Tanaya Rajwade participated in a roundtable discussion on intermediary liability organised by SFLC and the Dialogue in New Delhi on October 17, 2019.</b>
<p>Click to view the <a class="external-link" href="http://cis-india.org/internet-governance/files/internet-liability">agenda</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability'>http://editors.cis-india.org/internet-governance/news/roundtable-discussion-on-intermediary-liability</a>
</p>
Admin · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-10-20 · News Item
Rethinking the intermediary liability regime in India
http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india
<b>The article consolidates some of our broad thematic concerns with the draft amendments to the intermediary liability rules, published by MeitY last December.</b>
<p>The blog post by Torsha Sarkar was <a class="external-link" href="https://cyberbrics.info/rethinking-the-intermediary-liability-regime-in-india/">published by CyberBRICS</a> on August 12, 2019.</p>
<hr />
<h3 style="text-align: justify; ">Introduction</h3>
<p style="text-align: justify; ">In December 2018, the Ministry of Electronics and Information Technology (“MeitY”) released the Intermediary Liability Guidelines (Amendment) Rules (“the Guidelines”), which would significantly alter the intermediary liability regime in the country. While the Guidelines have drawn a considerable amount of attention and criticism, from the perspective of the government, the change has been overdue.</p>
<p style="text-align: justify; ">The Indian government has been determined to overhaul the pre-existing safe harbour regime since last year. The draft <a href="https://www.medianama.com/wp-content/uploads/Draft-National-E-commerce-Policy.pdf">version</a> of the e-commerce policy, which was leaked last year, also hinted at similar plans. As the effects of mass dissemination of disinformation, propaganda and hate speech around the world spill over into offline harms, governments have been increasingly looking to enact interventionist laws that place more responsibility on intermediaries. India has not been an exception.</p>
<p style="text-align: justify; ">A major source of such harmful and illegal content in India is the popular communications app WhatsApp, despite the company’s enactment of several anti-spam measures over the past few years. Last year, rumours circulating on WhatsApp prompted a series of lynchings. In May, Reuters <a href="https://in.reuters.com/article/india-election-socialmedia-whatsapp/in-india-election-a-14-software-tool-helps-overcome-whatsapp-controls-idINKCN1SL0PZ" rel="noreferrer noopener" target="_blank">reported</a> that clones and software tools were available in the market at minimal cost, allowing politicians and other interested parties to bypass these measures and continue the trend of bulk messaging.</p>
<p style="text-align: justify; ">This series of incidents has made it clear that disinformation is a very real problem, and that the current regulatory framework is not enough to address it. The government’s response, accordingly, has been to introduce the Guidelines. This rationale also finds a place in its preliminary <a href="https://www.meity.gov.in/comments-invited-draft-intermediary-rules" rel="noreferrer noopener" target="_blank">statement of reasons</a>.</p>
<p style="text-align: justify; ">While enactment of such interventionist laws has triggered fresh rounds of debate on free speech and censorship, it would be wrong to say that such laws were completely one-sided, or uncalled for.</p>
<p style="text-align: justify; ">On one hand, automated amplification and online mass circulation of purposeful disinformation, propaganda, of terrorist attack videos, or of plain graphic content, are all problems that the government would concern itself with. On the other hand, several online companies (including <a href="https://www.blog.google/outreach-initiatives/public-policy/oversight-frameworks-content-sharing-platforms/" rel="noreferrer noopener" target="_blank">Google</a>) also seem to be in an uneasy agreement that simple self-regulation of content would not cut it. For better oversight, more engagement with both government and civil society members is needed.</p>
<p style="text-align: justify; ">In March this year, Mark Zuckerberg wrote an <a href="https://www.washingtonpost.com/opinions/mark-zuckerberg-the-internet-needs-new-rules-lets-start-in-these-four-areas/2019/03/29/9e6f0504-521a-11e9-a3f7-78b7525a8d5f_story.html?utm_term=.4d177c66782f" rel="noreferrer noopener" target="_blank">op-ed</a> for the Washington Post, calling for more government involvement in the process of content regulation on its platform. While it would be interesting to consider how Zuckerberg’s view aligns with those of similarly placed companies, it would nevertheless be correct to say that online intermediaries are under more pressure than ever to keep their platforms clean of content that is ‘illegal, harmful, obscene’. And this list only grows.</p>
<p style="text-align: justify; ">That being said, criticism from several stakeholders has been sharp and clear whenever such laws are enacted – be it the ambitious <a href="https://www.ivir.nl/publicaties/download/NetzDG_Tworek_Leerssen_April_2019.pdf" rel="noreferrer noopener" target="_blank">NetzDG</a>, aimed at combating Nazi propaganda, hate speech and fake news, or the controversial new European Copyright Directive, which has been welcomed by journalists but severely critiqued by online content creators and platforms as detrimental to user-generated content.</p>
<p style="text-align: justify; ">Against the backdrop of such conflicting interests in online content moderation, it is useful to examine the Guidelines released by MeitY. In the first portion, we look at certain specific concerns with the rules; in the second, we consider what an alternative regulatory framework may look like.</p>
<p style="text-align: justify; ">Before we jump to the crux of this discussion, one important disclosure must be made about the underlying ideology of this piece. It would be unrealistic to claim that the internet should be absolutely free from regulation. Swathes of content on child sexual abuse, or terrorist propaganda, or even the hordes of death and rape threats faced by women online are and should be concerns of civil society. While that is certainly a strong driving force for regulation, this concern should not override basic considerations of human rights (including freedom of expression). These ideas are expanded in the upcoming sections.</p>
<h3 style="text-align: justify; ">Broad, thematic concerns with the Rules</h3>
<h3 style="text-align: justify; ">A uniform mechanism of compliance</h3>
<h3 style="text-align: justify; ">Timelines</h3>
<p style="text-align: justify; ">Rule 3(8) of the Guidelines mandates intermediaries, prompted by <em>a</em> <em>court order or a government notification</em>, to take down content relating to unlawful acts within 24 hours of such notification. In case they fail to do so, the safe harbour applicable to them under section 79 of the Information Technology Act (“the Act”) would cease to apply, and they would be liable. Prior to the amendment, this timeframe was 36 hours.</p>
<p style="text-align: justify; ">There is a visible lack of research to support the claim that a 24-hour compliance timeline is the optimal framework for <em>all</em> intermediaries, irrespective of the kind of services they provide, or the sizes or resources available to them. As the Mozilla Foundation has <a href="https://blog.mozilla.org/netpolicy/2018/07/11/sustainable-policy-solutions-for-illegal-content/" rel="noreferrer noopener" target="_blank">commented</a>, regulation of illegal content online simply cannot be done in a one-size-fits-all manner, nor can <a href="https://blog.mozilla.org/netpolicy/2019/04/10/uk_online-harms/" rel="noreferrer noopener" target="_blank">regulation be made</a> with only the tech incumbents in mind. While platforms like YouTube can comfortably <a href="https://www.bmjv.de/SharedDocs/Pressemitteilungen/DE/2017/03142017_Monitoring_SozialeNetzwerke.html" rel="noreferrer noopener" target="_blank">remove</a> criminally prohibited content within a span of 24 hours, this can still place a large burden on smaller companies, which may not have the necessary resources to comply within this timeframe. A few unintended consequences would arise from this situation.</p>
<p style="text-align: justify; ">One, sanctions under the Act, which include both organisational ramifications like website blocking (under section 69A of the Act) and individual liability, would affect smaller intermediaries more than bigger ones. A bigger intermediary like Facebook may be able to withstand a large fine for its failure to control, say, hate speech on its platform. That may not be true for a smaller online marketplace, or even a smaller social media site targeted at a very specific community. This compliance mechanism, accordingly, may end up strengthening the larger companies and eliminating competition from the smaller ones.</p>
<p style="text-align: justify; ">Two, intermediaries, fearing heavy criminal sanctions, would err on the side of caution. This means that decisions about whether a piece of content is illegal would be quicker and less nuanced. Legitimate speech would consequently be at risk of censorship, and intermediaries would pay <a href="https://cis-india.org/internet-governance/intermediary-liability-in-india.pdf" rel="noreferrer noopener" target="_blank">less heed</a> to the technical requirements or the correct legal procedures required for content takedown.</p>
<h3 style="text-align: justify; ">Utilization of ‘automated technology’</h3>
<p style="text-align: justify; ">Another place where the Guidelines assume that all intermediaries operating in India are on the same footing is Rule 3(9), which mandates these entities to proactively monitor for ‘unlawful content’ on their platforms. Aside from the unconstitutionality of this provision, it also assumes that all intermediaries have the requisite resources to actually set up such a tool and operate it successfully. YouTube’s ContentID, which began in 2007, had already seen a whopping <a href="https://www.blog.google/outreach-initiatives/public-policy/protecting-what-we-love-about-internet-our-efforts-stop-online-piracy/" rel="noreferrer noopener" target="_blank">100 million dollars of investment by 2018</a>.</p>
<p style="text-align: justify; ">Notably, ContentID is a tool dedicated exclusively to finding copyright violations of rights-holders, and even then it has proven far from <a href="https://www.plagiarismtoday.com/2019/01/10/youtubes-copyright-insanity/" rel="noreferrer noopener" target="_blank">infallible</a>. The Guidelines’ sweeping net of ‘unlawful’ content includes far more categories than mere violations of IP rights, and the framework assumes that intermediaries would be able to set up and run an automated tool that filters through <em>all</em> these categories of ‘unlawful content’ at one go.</p>
<h3 style="text-align: justify; ">The problems of AI</h3>
<p style="text-align: justify; ">Aside from the implementation-related concerns, there are also technical challenges associated with Rule 3(9). Supervised learning systems (like the one envisaged under the Guidelines) use training data sets for proactive filtering. If the system is taught that for ten instances of A as the input, the output is B, then the eleventh time it sees A, it will give the output B. In the lingo of content filtering, the system would be taught, for example, that nudity is bad. The next time the system encounters nudity in a picture, it would automatically flag it as ‘bad’ and violating the community standards.</p>
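The pattern-matching behaviour described above can be sketched in a few lines of Python. This is a deliberately simplified toy, not how any real moderation system works: the classifier simply memorises which label most often accompanied each input feature, so ten instances of "A → B" guarantee that the eleventh A is labelled B, with no room for context.

```python
from collections import Counter

class MajorityLabelClassifier:
    """Toy supervised 'filter': memorises which label most often
    accompanied each feature during training."""
    def __init__(self):
        self.counts = {}

    def fit(self, examples):
        # examples: iterable of (feature, label) pairs
        for feature, label in examples:
            self.counts.setdefault(feature, Counter())[label] += 1

    def predict(self, feature):
        if feature not in self.counts:
            return "unknown"
        # Return the label seen most often with this feature
        return self.counts[feature].most_common(1)[0][0]

# Ten instances of "nudity" labelled "flag": the eleventh is flagged too,
# regardless of whether the image is, say, historical war reportage.
clf = MajorityLabelClassifier()
clf.fit([("nudity", "flag")] * 10)
print(clf.predict("nudity"))         # flag
print(clf.predict("war-reportage"))  # unknown: never seen in training
```

The sketch makes the limitation concrete: the model has no notion of context, only of correlations present in its training pairs.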
<p style="text-align: justify; "><a href="https://www.theguardian.com/technology/2016/sep/08/facebook-mark-zuckerberg-napalm-girl-photo-vietnam-war" rel="noreferrer noopener" target="_blank">Except, that is not how it works in practice</a>. For every post under the scrutiny of the platform operators, numerous nuances and contextual cues act as mitigating factors, none of which, at this point, would be <a href="https://scholarship.law.nd.edu/cgi/viewcontent.cgi?referer=https://www.google.co.in/&httpsredir=1&article=1704&context=ndlr" rel="noreferrer noopener" target="_blank">understandable</a> by a machine.</p>
<p style="text-align: justify; ">Additionally, the training data used to feed the system <a href="https://www.cmu.edu/dietrich/philosophy/docs/london/IJCAI17-AlgorithmicBias-Distrib.pdf" rel="noreferrer noopener" target="_blank">can be biased</a>. A self-driving car that is fed training data from only one region of the country would learn the customs and driving norms of that particular region, and not the patterns that apply throughout the country.</p>
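The self-driving-car analogy can be made concrete with a minimal, purely illustrative sketch: a "model" that learns the most frequent norm in its training data will faithfully reproduce the sampling bias of that data when deployed elsewhere.

```python
from collections import Counter

# Hypothetical training set: every observation comes from region A,
# where traffic keeps left. Region B (keep right) is unrepresented.
observations = ["keep_left"] * 100

# The learned "norm" is just the majority pattern in the data.
model = Counter(observations).most_common(1)[0][0]
print(model)  # keep_left

# Deployed in region B, the model is wrong on every single case.
region_b_truth = ["keep_right"] * 10
errors = sum(1 for truth in region_b_truth if model != truth)
print(errors, "errors out of", len(region_b_truth))  # 10 errors out of 10
```

The same mechanism applies to content filters: a classifier trained on one community's speech norms will systematically misjudge content from communities absent from its training data.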
<p style="text-align: justify; ">Lastly, no one disputes that bias would persist even if content moderation were undertaken by a human. The difference between a human moderator and an automated one, however, is that there is a measure of accountability with the former. The decision of a human moderator can be disputed, and the moderator has a chance to explain their reasons for the removal. Artificial intelligence (“AI”), by contrast, is characterised by the algorithmic ‘<a href="http://raley.english.ucsb.edu/wp-content/Engl800/Pasquale-blackbox.pdf" rel="noreferrer noopener" target="_blank">black box</a>’ that processes inputs and generates usable outputs. Implementing workable accountability standards for such a system, including appeal and grievance redressal mechanisms in cases of dispute, is a problem the regulator must concern itself with.</p>
<p style="text-align: justify; ">In the absence of any clarity or revision, it seems unlikely that the provision would actually ever see full implementation. Neither would the intermediaries know what kind of ‘automated technology’ they are supposed to use for filtering ‘unlawful content’, nor would there be any incentives for them to actually deploy this system effectively for their platforms.</p>
<h3 style="text-align: justify; ">What can be done?</h3>
<p style="text-align: justify; ">First, more research is needed to understand the effect of compliance timeframes on the accuracy of content takedown. Several jurisdictions now operate on different compliance timeframes, and regulation would be far more holistic if the government considered the dialogue around each of them and examined what it means for India.</p>
<p style="text-align: justify; ">Second, it might be useful to consider the concept of an independent regulator as an alternative and a compromise between pure governmental regulation (which is more or less what the current system is) and self-regulation (which the Guidelines, albeit problematically, also espouse through Rule 3(9)).</p>
<p style="text-align: justify; ">The <a href="https://www.gov.uk/government/consultations/online-harms-white-paper" rel="noreferrer noopener" target="_blank">UK Online Harms White Paper</a>, an important document in the liability overhaul debate, proposes an arm’s-length regulator who would be responsible for drafting codes of conduct for online companies and for their enforcement. While the exact merits of the system are still up for debate, the concept of having a separate body to oversee, formulate and possibly <a href="https://medium.com/adventures-in-consumer-technology/regulating-social-media-a-policy-proposal-a2a25627c210" rel="noreferrer noopener" target="_blank">arbitrate</a> disputes regarding content removal is finding traction in several parallel developments.</p>
<p style="text-align: justify; ">One of the Transatlantic Working Group sessions discussed this idea in terms of having an ‘<a href="https://medium.com/whither-news/proposals-for-reasonable-technology-regulation-and-an-internet-court-58ac99bec420" rel="noreferrer noopener" target="_blank">internet court</a>’ for illegal content regulation. This would have the noted advantages of (a) formulating norms of online content in a transparent, public fashion, something previously done behind the closed doors of either the government or the tech incumbents, and (b) having specially trained professionals who would be able to dispose of matters expeditiously.</p>
<p style="text-align: justify; ">India is not unfamiliar with the idea of specialised tribunals or quasi-judicial bodies for dealing with specific challenges. In 2015, for example, the Government of India passed the Commercial Courts Act, under which specific courts were tasked with commercial disputes above a specified value. This is neither an isolated instance of the government creating new bodies to deal with a specific problem, nor need it be the last.</p>
<p style="text-align: justify; ">There is no <a href="https://www.thehindubusinessline.com/opinion/resurrecting-the-marketplace-of-ideas/article26313605.ece" rel="noreferrer noopener" target="_blank">silver bullet</a> when it comes to moderation of content on the web. However, in light of this parallel convergence of ideas, the appeal of an independent regulatory system as a sane compromise between complete government control and <em>laissez-faire</em> autonomy is worth considering.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india'>http://editors.cis-india.org/internet-governance/blog/cyber-brics-august-12-2019-torsha-sarkar-rethinking-the-intermediary-liability-regime-in-india</a>
</p>
torsha · Internet Governance · Intermediary Liability · Artificial Intelligence · 2019-08-16 · Blog Entry
Webinar on counter-comments to the draft Intermediary Guidelines
http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines
<b>CCAOI and the ISOC Delhi Chapter organised a webinar on February 11 to discuss the comments submitted to the Information Technology [Intermediary Guidelines (Amendment) Rules] 2018, and counter-comments that were due by February 14. </b>
<p>The agenda of the discussion was:</p>
<ul>
<li>A brief introduction to the counter comment process [Shashank Mishra]</li>
<li>Invited stakeholders comment on key issues and perspectives on the submissions and the points to be countered.</li>
</ul>
<p>The following people participated:</p>
<ul>
<li>Amba Kak, Mozilla</li>
<li>Rajesh Chharia, ISPAI</li>
<li>Gurshabad Grover, CIS</li>
<li>Priyanka Chaudhari, SFLC</li>
<li>Divij Joshi, Vidhi Centre for Legal Policy</li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines'>http://editors.cis-india.org/internet-governance/news/webinar-on-counter-comments-to-the-draft-intermediary-guidelines</a>
</p>
Admin · Internet Governance · Intermediary Liability · Information Technology · 2019-02-22 · News Item
2019 International Asia Conference
http://editors.cis-india.org/internet-governance/news/2019-international-asia-conference
<b>ITECHLAW organized the 2019 edition of International Asia Conference at JW Marriott hotel in Bangalore on January 31, 2019 and February 1, 2019. Sunil Abraham was a panelist in the session "Policy Making for the Emerging Tech in India".</b>
<p style="text-align: justify; ">The rush of emerging technologies such as machine learning, the Internet of Things (IoT) and virtual reality (VR) is transforming the landscape in which humans exist. Innovators of this generation are ambitious, and their contributions have significantly impacted fields like healthcare, media and entertainment, agriculture, and other service models. As these technological advancements drive new business and service models, stakeholders and governments need to ensure the security and stability of the market without stifling innovation, undermining incentives or creating obstacles. Rapidly spreading technology applications are forcing drastic changes to today’s regulatory model, posing difficult challenges for regulators. In India, the expeditiously developing start-up ecosystem and online consumer base have stirred the regulators.</p>
<p style="text-align: justify; ">Intermediary liability, surveillance, data and privacy, digital taxation, and data governance and sovereignty are the dominant topics of debate in India. The debates are not only between regulators and stakeholders; consumers are also joining in. As competition between Indian and foreign technology companies intensifies, tech policy is increasingly being raised in political parties’ run-up to the general elections as well. Over the past year, the country has witnessed some landmark judgments and contentious government proposals related to data and privacy, the implications of which have affected over-the-top (“OTT”) services, online media, social media, e-commerce platforms, IoT services, etc. The Indian regulatory framework on tech policy is becoming stricter after a very disruptive year. Tech giants like Facebook, Google, Twitter, and Amazon are themselves realising their enormous market influence. After episodes of lynching, hate speech and the like, they are participating in policy-making efforts around fake news and digital malfeasance. In this process, the legal industry is making considerable lobbying efforts for corporations to work with the government to curb digital malpractice and make the internet safer.</p>
<p style="text-align: justify; ">As the legal industry participates in the process of creating an innovator-friendly regulatory regime, it is also striving to understand disruptive technologies and adopt them for its own use. However, legal firms must understand that technology cannot do their job for clients; it can only upgrade their business model. The traditional law firm business model is out of sync with legal buyers. Effective deployment of technology will make firms more approachable to their clients.</p>
<p style="text-align: justify; ">With the growing number of technology-based start-ups in India, the country is set to become a hub for investment by big corporations. To keep attracting investors, the government needs to remove potential hindrances that may make them think twice. The government should prepare a level playing field in the market by making citizens aware of standard tech policies and fostering an innovator-friendly regulatory regime.</p>
<hr />
<p style="text-align: justify; ">For more info <a class="external-link" href="https://www.itechlaw.org/Bangalore2019">see the website</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/2019-international-asia-conference'>http://editors.cis-india.org/internet-governance/news/2019-international-asia-conference</a>
</p>
Admin · Internet Governance · Intermediary Liability · 2019-02-19 · News Item
Intermediary liability law needs updating
http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating
<b>The time has come for India to exert its foreign policy muscle. There is a less charitable name for intermediary liability regimes like Sec 79 of the IT Act — private censorship regimes. </b>
<p style="text-align: justify; ">The article was published in <a class="external-link" href="https://www.business-standard.com/article/opinion/intermediary-liability-law-needs-updating-119020900705_1.html">Business Standard</a> on February 9, 2019.</p>
<hr />
<p style="text-align: justify; ">Intermediaries get immunity from liability emerging from user-generated and third-party content because they have no “actual knowledge” until it is brought to their notice using “take down” requests or orders.</p>
<p style="text-align: justify; ">Since some of the harm caused is immediate, irreparable and irreversible, this notice-based regime is the preferred alternative to approaching courts in each case. When intermediary liability regimes were first enacted, most intermediaries acted as common carriers, i.e., they did not curate the experience of users in any substantial fashion. While some intermediaries like Wikipedia continue this common carrier tradition, others, driven by advertising revenue, no longer treat all parties and all pieces of content neutrally. Facebook, Google and Twitter do everything they can to raise advertising revenues. They make you depressed. And if they like you, they get you to go out and vote. There is an urgent need to update intermediary liability law.</p>
<p style="text-align: justify; ">In response to being summoned by multiple governments, Facebook has announced the establishment of an independent oversight board. A global free speech court for the world’s biggest online country. The time has come for India to exert its foreign policy muscle. The amendments to our intermediary liability regime can have global repercussions, and shape the structure and functioning of this and other global courts.</p>
<p style="text-align: justify; ">While with one hand Facebook dealt the oversight board, with the other hand it took down APIs that would enable press and civil society to monitor political advertising in real time. How could they do that with no legal consequences? The answer is simple — those APIs were provided on a voluntary basis. There was no law requiring them to do so.</p>
<p style="text-align: justify; ">There are two approaches that could be followed. One, as scholar of regulatory theory Amba Kak puts it, is to “disincentivise the black box”. Most transparency reports produced by intermediaries today are voluntary; there is no requirement for them under law. Our new law could require extensive transparency, with appropriate privacy safeguards, for the government, affected parties and the general public in terms of revenues, content production and consumption, policy development, contracts, service-level agreements, enforcement, adjudication and appeal. User empowerment measures in the user interface, and algorithm explainability, could also be required. The key word in this approach is transparency.</p>
<p style="text-align: justify; ">The alternative is to incentivise the black box. Here faith is placed in technological solutions like artificial intelligence. To be fair, technological solutions may be desirable for battling child pornography, where pre-censorship (deletion before content is published) is required. Fingerprinting technology is used to determine whether the content exists in a global database maintained by organisations like the Internet Watch Foundation. A similar technology called Content ID is used to pre-censor copyright infringement. Unfortunately, this is done by ignoring the flexibilities that exist in Indian copyright law to promote education, protect access to knowledge by the disabled, etc. Even within such narrow applications of technology, there have been false positives. Recently, a video of a blogger testing his microphone was identified as a pre-existing copyrighted work.</p>
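The hash-list matching described above can be sketched as follows. This is a hedged, minimal illustration of the general idea, not the actual algorithm used by Content ID or the Internet Watch Foundation (real systems use perceptual fingerprints, not exact cryptographic hashes); the blocklist contents here are hypothetical.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Exact fingerprint of a file's bytes (illustrative only)."""
    return hashlib.sha256(content).hexdigest()

# Hypothetical database of fingerprints of known-illegal material,
# analogous to the global databases the article mentions.
blocklist = {fingerprint(b"known infringing video")}

def should_block(upload: bytes) -> bool:
    """Pre-censorship check: block the upload if its fingerprint
    matches an entry in the database."""
    return fingerprint(upload) in blocklist

print(should_block(b"known infringing video"))   # True: exact match
# A single changed byte evades an exact hash entirely, which is why
# deployed systems use fuzzy perceptual fingerprints instead, and why
# those fuzzy matches in turn produce the false positives noted above.
print(should_block(b"known infringing video!"))  # False
```

The trade-off is visible even in this toy: exact matching is trivially evaded, while fuzzier matching inevitably sweeps in lawful content.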
<p style="text-align: justify; ">The goal of a policy-maker working on this amendment should be to prevent a repeat of the Shreya Singhal judgment, where sections of the IT Act were read down or struck down. To avoid similar constitutional challenges in the future, the rules should not specify any new categories of illegal content, because that would be outside the scope of the parent clause. The fifth ground in the list is sufficient: “violates any law for the time being in force”. An additional ground such as “harms minors in any way” is vague and cannot apply to all categories of intermediaries (for example, a dating site for sexual minorities). The rights of children need to be protected. But that is best done within the ongoing amendment to the POCSO Act.</p>
<p style="text-align: justify; ">As an engineer, I vote to eliminate redundancy. If there are specific offences that cannot fit in other parts of the law, those offences can be added as separate sections in the IT Act. For example, even though voyeurism is criminalised in the IT Act, the non-consensual distribution of intimate content could be criminalised, as it has been done in the Philippines.</p>
<p style="text-align: justify; ">Provisions that have to do with data retention and government access to that data for the purposes of national security, law enforcement, and also anonymised datasets for the public interest, should be in the upcoming Data Protection law. The rules for intermediary liability are not the correct place to deal with them, because data retention may also be required of intermediaries that do not handle any third-party information or user-generated content. Finally, there must be clear procedures for the reinstatement of content that has been taken down.</p>
<hr />
<p style="text-align: justify; "><i>Disclosure: The Centre for Internet and Society receives grants from Facebook, Google and Wikimedia Foundation</i></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating'>http://editors.cis-india.org/internet-governance/blog/business-standard-february-9-2019-sunil-abraham-intermediary-liability-law-needs-updating</a>
</p>
sunil · Internet Governance · Intermediary Liability · 2019-02-13 · Blog Entry
Response to the Draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018
http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018
<b>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</b>
<p>This document presents the Centre for Internet & Society (CIS) response to the Ministry of Electronics and Information Technology’s invitation to comment and suggest changes to the draft of The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018 (hereinafter referred to as the “draft rules”), published on December 24, 2018. CIS is grateful for the opportunity to put forth its views and comments. This response was sent on January 31, 2019.</p>
<p>In this response, we aim to examine whether the draft rules meet tests of constitutionality and whether they are consistent with the parent Act. We also examine potential harms that may arise from the Rules as they are currently framed and make recommendations to the draft rules that we hope will help the Government meet its objectives while remaining situated within the constitutional ambit.</p>
<p>The response can be accessed <a href="https://cis-india.org/internet-governance/resources/Intermediary%20Liability%20Rules%202018.pdf">here</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018'>http://editors.cis-india.org/internet-governance/blog/response-to-the-draft-of-the-information-technology-intermediary-guidelines-amendment-rules-2018</a>
</p>
Gurshabad Grover, Elonnai Hickok, Arindrajit Basu, Akriti · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-02-07 · Blog Entry
MediaNama roundtables on intermediary liability rules
http://editors.cis-india.org/a2k/news/medianama-roundtables-on-intermediary-liability-rules
<b>MediaNama hosted one policy round-table on Intermediary Liability protections in Bangalore and another round-table in New Delhi, to discuss inputs sought by MEITY on the amendments to Safe Harbor for platforms (payments services, content services, ISPs, etc.) in India. Centre for Internet & Society is a community partner for the event.</b>
<p style="text-align: justify; ">One round-table was held at St. Mark's Hotel in Bangalore on January 25, 2019 and the next one will be held at India Habitat Centre in New Delhi on February 7, 2019. Gurshabad Grover participated in the meeting held on January 25, 2019. Participants discussed the draft amendments to the intermediary liability rules (under Section 79 of the IT Act) and recommendations stakeholders could respond with. For more info <a class="external-link" href="https://www.medianama.com/2019/01/223-announcing-nama-event-on-the-future-of-online-safe-harbor-bangalore-delhi-ad/">click here</a>.</p>
<hr />
<p>MediaNama has posted some pieces after the discussion that may be of interest:</p>
<ul>
<li><a class="external-link" href="https://www.medianama.com/2019/02/223-namapolicy-no-clarity-on-what-constitutes-offenses-for-intermediaries-alok-prasanna-kumar/">No clarity on what constitutes offenses for intermediaries</a> (by Alok Prasanna Kumar)</li>
<li><a class="external-link" href="https://www.medianama.com/2019/02/223-regulation-of-intermediaries-nama/">Should different sizes or categories of intermediaries be regulated differently?</a> (by Nikhil Pahwa)</li>
<li><a class="external-link" href="https://www.medianama.com/2019/02/223-safe-harbor-intermediary-liability-traceability/">The Intent of Traceability is behavioral change</a> (by Nikhil Pahwa)</li>
</ul>
<p>
For more details visit <a href='http://editors.cis-india.org/a2k/news/medianama-roundtables-on-intermediary-liability-rules'>http://editors.cis-india.org/a2k/news/medianama-roundtables-on-intermediary-liability-rules</a>
</p>
Admin · Internet Governance · Intermediary Liability · 2019-02-17 · News Item
Webinar on the draft Intermediary Guidelines Amendment Rules
http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules
<b>CCAOI and the ISOC Delhi Chapter organised a webinar on January 10 to discuss the draft "The Information Technology [Intermediary Guidelines (Amendment) Rules] 2018". Gurshabad Grover was a discussant in the panel.</b>
<p>The agenda of the discussion was:</p>
<ul>
<li>A brief introduction to the draft highlighting the key issues [Shashank Mishra]</li>
<li>Invited experts sharing their view on the paper and questions asked [Nehaa Chaudhari, Paul Brooks, Arjun Sinha, Gurshabad Grover]</li>
<li>Open Discussion Q&A</li>
<li>Summarizing the session</li>
</ul>
<div>A recording of the session can be <a class="external-link" href="https://livestream.com/internetsociety/intermediaryrules">accessed here</a></div>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules'>http://editors.cis-india.org/internet-governance/news/webinar-on-the-draft-intermediary-guidelines-amendment-rules</a>
</p>
Admin · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2019-01-18 · News Item
Roundtable on Intermediary Liability and Gender Based Violence at the Digital Citizen Summit, 2018
http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018
<b>Akriti Bopanna and Ambika Tandon conducted a panel on 'Gender and Intermediary Liability' at the Digital Citizen Summit, hosted by the Digital Empowerment Foundation, on November 1, 2018 at India International Centre, New Delhi.</b>
<p class="moz-quote-pre">Ambika was the moderator for the panel, with Apar Gupta, Jyoti Pandey, Amrita Vasudevan, Anja Kovacs, and Japleen Pasricha as speakers. Click to read the <a class="external-link" href="http://cis-india.org/internet-governance/files/concept-note-digital-citizen-summit">concept note</a> and the <a class="external-link" href="http://cis-india.org/internet-governance/files/dcs-2018-agenda">agenda</a>.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018'>http://editors.cis-india.org/internet-governance/news/roundtable-on-intermediary-liability-and-gender-based-violence-at-the-digital-citizen-summit-2018</a>
</p>
Admin · Internet Governance · Intermediary Liability · 2018-11-07 · News Item
A trust deficit between advertisers and publishers is leading to fake news
http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news
<b>Transparency regulation is the need of the hour, and urgently so for election and political advertising. What do the ads look like? Who paid for them? Who was the target? How many people saw these advertisements? How many times? Transparency around viral content is also required.</b>
<p style="text-align: justify; ">The article was published in <a class="external-link" href="https://www.hindustantimes.com/analysis/a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news/story-SVNH9ot3KD50XRltbwOyEO.html">Hindustan Times</a> on September 24, 2018.</p>
<hr />
<p style="text-align: justify; ">Traditionally, we have depended on the private censorship that intermediaries conduct on their platforms. They enforce, with some degree of success, their own community guidelines and terms of service (ToS). These guidelines and ToS have historically been drafted keeping US laws in mind, since most intermediaries, including non-profits like the Wikimedia Foundation, were founded in the US.</p>
<p style="text-align: justify; ">Across the world, this private censorship regime was accepted by governments when they enacted intermediary liability laws (in India we have Section 79 of the IT Act). These laws gave intermediaries immunity from liability emerging from third-party content of which they have no “actual knowledge” unless they were informed using takedown notices. Intermediaries set up offices in countries like India, complied with some lawful interception requests, and also conducted geo-blocking to comply with local speech regulation.</p>
<p style="text-align: justify; ">For years, the Indian government has been frustrated since policy reforms that it has pursued with the US have yielded little fruit. American policy makers keep citing shortcomings in the Indian justice systems to avoid expediting the MLAT (Mutual Legal Assistance Treaties) process and the signing of an executive agreement under the US Clout Act. This agreement would compel intermediaries to comply with lawful interception and data requests from Indian law enforcement agencies no matter where the data was located.</p>
<p style="text-align: justify; ">The data localisation requirement in the draft national data protection law is a result of that frustration. As with the US, a quickly enacted data localisation policy is absolutely non-negotiable when it comes to Indian military, intelligence, law enforcement and e-governance data. For India, it also makes sense in the cases of health and financial data, with exceptions under certain circumstances. However, it does not make sense for social media platforms since they, by definition, host international networks of people. Recently, an inter-ministerial committee recommended that “criminal proceedings against Indian heads of social media giants” also be considered. However, raiding Google’s local servers when a lawful interception request is turned down, or arresting Facebook executives, will invite retaliatory trade action from the US.</p>
<p style="text-align: justify; ">While the consequences of online recruitment, disinformation in elections and fake news to undermine public order are indeed serious, are there alternatives to such extreme measures for Indian policy makers? Updating intermediary liability law is one place to begin. These social media companies increasingly exercise editorial control, albeit indirectly, via algorithms to claim that they have no “actual knowledge”.</p>
<p style="text-align: justify; ">But they are no longer mere conduits or dumb pipes as they are now publishers who collect payments to promote content. Germany passed a law called NetzDG in 2017 which requires expedited compliance with government takedown orders. Unfortunately, this law does not have sufficient safeguards to prevent overzealous private censorship. India should not repeat this mistake, especially given what the Supreme Court said in the Shreya Singhal judgment.</p>
<p style="text-align: justify; ">Transparency regulations are imperative. And they are needed urgently for election and political advertising. What do the ads look like? Who paid for them? Who was the target? How many people saw these advertisements? How many times? Transparency around viral content is also required. Anyone should be able to see all public content that has been shared with more than a certain percentage of the population over a historical timeline for any geographic area. This will prevent algorithmic filter bubbles and echo chambers, and also help public and civil society monitor unconstitutional and hate speech that violates terms of service of these platforms. So far the intermediaries have benefitted from surveillance — watching from above. It is time to subject them to sousveillance — watched by the citizens from below.</p>
<p style="text-align: justify; ">Data portability mandates and interoperability mandates will allow competition to enter these monopoly markets. Artificial intelligence regulations for algorithms that significantly impact the global networked public sphere could require – one, a right to an explanation and two, a right to influence automated decision making that influences the consumers experience on the platform.</p>
<p style="text-align: justify; ">The real solution lies elsewhere. Google and Facebook are primarily advertising networks. They have successfully managed to destroy the business model for real news and replace it with a business model for fake news by taking away most of the advertising revenues from traditional and new news media companies. They were able to do this because there was a trust deficit between advertisers and publishers. Perhaps this trust deficit could be solved by a commons-based solutions based on free software, open standards and collective action by all Indian new media companies.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news'>http://editors.cis-india.org/internet-governance/blog/hindustan-times-sunil-abraham-september-24-2018-a-trust-deficit-between-advertisers-and-publishers-is-leading-to-fake-news</a>
</p>
No publisher · sunil · Internet Governance · Intermediary Liability · Censorship · 2018-10-02T06:44:55Z · Blog Entry

Inter Movements Open Forum: Trafficking Bill
http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill
<b>On 18 May 2018, Gurshabad Grover, on behalf of CIS, presented comments on the Trafficking (Prevention, Protection and Rehabilitation) Bill 2018 at a meeting of the Inter Movements Open Forum, jointly organised by Sangram, Naz Foundation, NNSW, Tarshi and VAMP. The meeting was held at the India International Centre in New Delhi.</b>
<p style="text-align: justify;">Gurshabad's presentation was based on Swaraj's <a href="https://cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill">blogpost</a> and subsequent research by Kumarjeet that highlights certain problematic sections (36, 39, 41, 59) in the Bill which may have an adverse impact on freedom of expression, and may additionally change the landscape of intermediary liability rules in India.</p>
<p style="text-align: justify;">Read the <a class="external-link" href="http://cis-india.org/internet-governance/files/the-trafficking-bill">agenda here</a></p>
<p style="text-align: justify;">Clarification (18th August, 2018): A letter sent to the Ministry of Women and Child Development mentioned the Centre for Internet & Society as instituionally endorsing a critique of the The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018. We seek to clarify that the Centre for Internet & Society did not endorse the letter to the Ministry.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill'>http://editors.cis-india.org/internet-governance/news/inter-movements-open-forum-trafficking-bill</a>
</p>
No publisher · Admin · Internet Governance · Intermediary Liability · 2018-08-18T09:21:02Z · News Item

Indian Intermediary Liability Regime: Compliance with the Manila Principles on Intermediary Liability
http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime
<b>This report assesses the compliance of the Indian intermediary liability framework with the Manila Principles on Intermediary Liability, and recommends substantive legislative changes to bring the legal framework in line with the Manila Principles. </b>
<p><span style="text-align: justify; ">The report was edited by Elonnai Hickok and Swaraj Barooah</span></p>
<hr />
<p style="text-align: justify; ">The report is an examination of Indian laws based upon the background paper to the Manila Principles as the explanatory text on which these recommendations have been based, and not an assessment of the principles themselves. To do this, the report considers the Indian regime in the context of each of the principles defined in the Manila Principles. As such, the explanatory text to the Manila Principles recognizes that diverse national and political scenario may require different intermediary liability legal regimes, however, this paper relies only on the best practices prescribed under the Manila Principles.</p>
<p style="text-align: justify; ">The report is divided into the following sections</p>
<ul>
<li>Principle I: Intermediaries should be shielded by law from liability for third-party content</li>
<li>Principle II: Content must not be required to be restricted without an order by a judicial authority</li>
<li>Principle III: Requests for restrictions of content must be clear, be unambiguous, and follow due process</li>
<li>Principle IV: Laws and content restriction orders and practices must comply with the tests of necessity and proportionality</li>
<li>Principle V: Laws and content restriction policies and practices must respect due process</li>
<li>Principle VI: Transparency and accountability must be built into laws and content restriction policies and practices</li>
<li>Conclusion</li>
</ul>
<p style="text-align: justify; "><a class="external-link" href="http://cis-india.org/internet-governance/files/indian-intermediary-liability-regime">Download the Full report here</a></p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime'>http://editors.cis-india.org/internet-governance/blog/indian-intermediary-liability-regime</a>
</p>
No publisher · divij · Internet Governance · Intermediary Liability · Privacy · 2018-05-20T15:14:21Z · Blog Entry

A look at two problematic provisions of the draft Anti-trafficking bill
http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill
<b>This post examines two badly drafted provisions of the new Anti-Trafficking bill that have the potential to severely impinge upon the Freedom of Expression, including through a misunderstanding of intermediary liability. </b>
<p style="text-align: justify;" class="normal">On 28 Feb 2018, the Union Cabinet approved ‘The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018’ (‘the bill’) for introduction to the Parliament. This comes after a series of consultations on an earlier 2016 draft bill, that had faced its fair share of <a href="https://scroll.in/article/813268/six-counts-on-which-the-draft-anti-trafficking-bill-fails-short" target="_blank">criticism</a>. As per the Press Information Bureau <a href="http://pib.nic.in/newsite/PrintRelease.aspx?relid=176878" target="_blank">announcement</a>, the Ministry of Women and Child Development met with various stakeholders including 60 NGOs and have incorporated many of the suggestions put forth. They’ve also stated that ‘the new law will make India a leader among South Asian countries to combat trafficking.’</p>
<p style="text-align: justify;" class="normal">However, at first glance, there appear to be several issues with overbroad or vague language used in the drafting of the bill, that stretch it into potentially problematic areas. This current post will focus on two such provisions that could lead to a deleterious effect on the Freedom of Expression. As the bill is currently not publicly available, a stakeholder’s copy of the draft is being used to source these provisions. The relevant sections have been reproduced below for convenience. (Emphasis in bold is as provided by the author).</p>
<p style="text-align: justify;" class="normal"><em>Section 39: Buying or Selling of any person</em></p>
<p style="text-align: justify;" class="normal"><em>39. (l) Whoever buys or sells any person for a consideration, shall be punished with rigorous imprisonment for a term which shall not be less than seven years but may extend to ten years, and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal"><em>(2) Whoever solicits or publicises electronically, taking or distributing obscene photographs or videos or providing materials or soliciting or guiding tourists or using agents or any other form <strong>which may lead to the trafficking of a person shall be punished</strong> with rigorous imprisonment for a term which shall not be less than five years but may extend to ten years, and shall also be liable to fine which shall not be less than fifty thousand rupees but which may extend to one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal">The grammatical acrobatics of section 39(2) aside, this anti-solicitation provision is severely problematic in that it mandates punishment even for a vaguely defined action or actions that may not actually be connected to the trafficking of a person. In other words, the provision doesn’t require any of the actions to be connected to trafficking in their intent or even outcome, but only in <em>potential</em> <em>connection</em> to the outcome. At the same time, it says these ‘shall’ be punished!</p>
<p style="text-align: justify;" class="normal">This vagary that ignores actual or even probabilistic causation flies in the face of standard criminal law which requires <em>mens rea</em> along with <em>actus rea</em>. The excessively wide scope of this badly drafted provision leaves it prone to abuse. For example, currently the provision allows the following interpretation to be included: ‘Whoever publicizes electronically, by providing materials in any form, which may lead to trafficking of a person shall be punished…’. Even the electronic publicizing of an academic study on trafficking could fall under the provision as it currently reads, if it is argued that publishing studies that show the prevalence of trafficking ‘may lead to the trafficking of a person’! It is not hard to imagine that an academic study that shows trafficking numbers at embarrassingly high rates could be threatened with this provision. Similarly, any of our vast number of self-appointed moral guardians could also pull within this provision any artistic work that they may personally find offensive or ‘obscene’. Simply put, without any burden of showing a causal connect, it could be argued that <em>anything</em> ‘may lead’ to the trafficking of a person. Needless to say, this paves the way for a severe chilling effect on free speech, especially on critical speech around trafficking issues.</p>
<p style="text-align: justify;" class="normal"><em>Section 41: Offences related to media</em></p>
<p style="text-align: justify;" class="normal"><em>41. (l) Whoever commits trafficking of a person with the aid of media, including, but not limited to print, internet, digital or electronic media, shall be punished with rigorous imprisonment for a term which shall not be less than seven years but may extend to ten years and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal"><em>(2) Whoever <strong>distributes, or sells or stores</strong>, in any form in any electronic or printed form showing incidence of sexual exploitation, sexual assault, or rape for the purpose of exploitation or for coercion of the victim or his family members, or for unlawful gain <strong>shall be punished</strong> with rigorous imprisonment for a term which shall not be less than three years but may extend to seven years and shall also be liable to fine which shall not be less than one lakh rupees.</em></p>
<p style="text-align: justify;" class="normal">The drafters of this bill have perhaps overlooked the fact that unlike the physical world, the infrastructure of the electronic / digital world requires 3rd party intermediaries to handle information during most forms of electronic activities, whether it is transmission, storage or display. As it is not feasible, desirable or even practically possible for intermediaries to verify the legality of every bit of data that gets transferred or stored by the intermediary, ‘safe harbours’ are provided in law for intermediaries, protecting them from liability of the information being transmitted through them. These ensure that entities that act as architectural requirements and intermediary platforms are able to operate smoothly and without fear. If intermediaries are not granted this protection, it puts them in the unenviable position of having to monitor un-monitorable amounts of data, and face legal action for the slip-ups that are bound to happen regularly. Furthermore, there are several levels of free speech and privacy issues associated with having multiple gatekeepers on the expression of speech online. A charitable reading of the intent of a provision which does not recognise safe harbours for 3rd party intermediaries, would be that the drafters of the bill have simply not realised that users who upload and initiate transfer of information online, are not the same parties who do the actual transmission of the information.</p>
<p style="text-align: justify;" class="normal">Distribution, selling or storing of information online would require the transmission of information over intermediaries, as well as the temporary storage of such information on intermediary platforms. In India, intermediaries engaging with transmission or temporary storage of information are provided safe harbour<a href="imap://prasad@mail.cis-india.org:143/fetch%3EUID%3E/INBOX%3E176833#_ftn1">[1]</a> by Section 79 of the Information Technology Act, 2000 (‘IT Act’), so long as they:</p>
<p style="text-align: justify;" class="normal">(i) act as a mere ‘conduit’ and do not initiate the transmission, select the receiver of the transmission, or select or modify the information contained in the transmission.</p>
<p style="text-align: justify;" class="normal">(ii) exercise due diligence while discharging duties under this Act, and observes other guidelines that the Central Government may prescribe.</p>
<p style="text-align: justify;" class="normal">The Information Technology (Intermediary Guidelines) Rules, 2011, list out the nature of the due diligence to be followed by intermediaries to claim exemption under Section 79 of the IT Act.</p>
<p style="text-align: justify;" class="normal">Intermediaries will not be granted safe harbour if they have conspired, abetted, aided or induced commission of the unlawful act, or if they do not remove or disable access to information upon receiving actual knowledge, or notice from the Government, of the information that is transmitted or stored by the intermediary being used for unlawful purposes.</p>
<p style="text-align: justify;" class="normal">Thus it can be seen that the IT Act already provides an in-depth regime for intermediary liability, and given its <em>non-obstante </em>clause which states that Section 79 of the IT Act would apply “Notwithstanding anything contained in any law for the time being in force” , as well as the reiteration of the IT Act’s overriding effect via Section 81, which states that the provisions of the Act ‘shall have effect notwithstanding anything inconsistent therewith contained in any other law for the time being in force’ (barring the exercise of copyright or patent rights), it is generally considered the appropriate legal framework for this issue. However, it appears that the drafters of the 2018 Anti-trafficking bill have not considered this aspect at all, since they have not referenced the IT Act in this context in the bill, and have additionally added their own <em>non-obstante </em>clause in Section 59 of the bill:</p>
<p style="text-align: justify;" class="normal">59.<em> The provisions of this Act, shall be in addition to and not in derogation of the provisions of any other law for the time being in force and, in case of any inconsistency, the provisions of this Act shall have overriding effect on the provisions of any such law to the extent of the inconsistency.</em></p>
<p style="text-align: justify;" class="normal">So the regime as prescribed by the IT Act allows for safe harbours, whereas the regime as prescribed by the Anti-Trafficking bill does not allow for safe harbours, and both say that they would an overriding effect for any conflicting law. This legislative bumble could potentially be solved by using the settled principle that a special Act prevails over a general legislation. This is still a little tricky as they are technically both special Acts. It could be argued that given the context of the Anti-trafficking bill as focusing on trafficking, and the context of the IT Act focusing on the interface of law and technology, that for the purposes of Section 41(2) of the Anti-trafficking bill, the IT Act is the special legislation. And thus Section 79 of the IT Act should make redundant the relevant portion of Section 41(2) of the Anti-trafficking bill. This reading would require the bill to be modified so as to remove the redundancy and the conflicting portion of Section 41(2).</p>
<hr />
<p style="text-align: justify;">[1] In 2016, a division bench of the Delhi High Court held in the case of Myspace Inc vs Super Cassettes Industries Ltd that a safe harbour immunity for intermediaries was necessary as it was not technically feasible to pre-screen content from third parties, and that tasking intermediaries with this responsibility could have a chilling effect on free speech, It held that their responsibility was limited to the extent of acting upon receiving ‘actual knowledge’. Earlier, in determining what ‘actual knowledge’ refers to, in 2015 the Supreme Court of India in the landmark case of Shreya Singhal vs Union of India, required this to be in the form of a notice via a court or government order. Thus under our current law, intermediaries are granted a safe harbour from liability so long as they act upon court or government orders which notify them of content that is required to be taken down.</p>
<p style="text-align: justify;"> </p>
<p style="text-align: justify;">Clarification (18th August, 2018): A letter sent to the Ministry of Women and Child Development mentioned the Centre for Internet & Society as instituionally endorsing a critique of the The Trafficking of Persons (Prevention, Protection and Rehabilitation) Bill, 2018. We seek to clarify that the Centre for Internet & Society did not endorse the letter to the Ministry.</p>
<p>
For more details visit <a href='http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill'>http://editors.cis-india.org/internet-governance/blog/a-look-at-two-problematic-provisions-of-the-draft-anti-trafficking-bill</a>
</p>
No publisher · swaraj · Freedom of Speech and Expression · Internet Governance · Intermediary Liability · 2018-08-18T09:21:55Z · Blog Entry