Intermediary liability and Safe Harbour: On due diligence and automated filtering
This blogpost was authored by Gurshabad Grover and Anna Liz Thomas. It was first published at Law and Other Things.
Introduction
India’s intermediary liability regime flows from Section 79 of the Information Technology Act, 2000 (the “Act”), a provision that exempts intermediaries from liability for third-party content on their services, as long as certain conditions are fulfilled. Under Section 79(2)(c) of the Act, one of the conditions for an intermediary to claim safe harbour (immunity from liability for third-party content) is that it:
“observes due diligence while discharging his duties under this Act and also observes such other guidelines as the Central Government may prescribe in this behalf.” (emphasis is authors’)
This post discusses this ‘due diligence’ obligation with a focus on its scope and its relationship with the intermediary guidelines issued under the Act. We primarily analyse the arguments made by T. Prashant Reddy in Back to the Drawing Board: What should be the new direction of the intermediary liability law? (“the paper”), which was published last year in the NLUD Journal of Legal Studies.
While the paper aims to engage broadly with the question of how India’s intermediary liability regime should be reformed, this post focuses on only two of the arguments that form the basis of the paper. First, the paper suggests that ‘due diligence’ should be interpreted as a requirement separate from the intermediary guidelines (“the 2011 rules”) issued under the law. The second argument builds on the first, contending that the due diligence requirement could be understood to mean that intermediaries should engage in proactive identification and filtering of unlawful content.
We examine the two questions in that order, and then consider alternative interpretations of the due diligence requirement. We argue that (1) there are multiple ways to interpret the provision, but there may be merit in considering the ‘due diligence’ requirement as distinct from the guidelines; and that (2) even if it is a separate requirement, proactive filtering of content by intermediaries is unconstitutional, and thus cannot be the sort of ‘due diligence’ the law expects from intermediaries.
Is ‘due diligence’ a separate requirement?
Section 79 of the IT Act has long been criticised for its vague and poor drafting, including on whether the entire clause requiring ‘due diligence’ was mandatory at all. The paper suggests that ‘due diligence’ is a requirement separate from the guidelines, an interpretation it supports with two facts.
First, the paper points to the ‘and’ in Section 79(2)(c) that separates the obligation to conduct due diligence from the obligation to observe the guidelines prescribed by the Central Government, which would indicate that the two obligations are to be fulfilled separately. We should point out that while the statute can be read this way, i.e., with the two obligations being distinct, it could also be read to imply that both the ‘due diligence’ requirements and the ‘other guidelines’ are to be notified by the Government. In fact, we think that evidence for the claim that ‘due diligence’ is a separate, self-contained obligation is actually found in the word ‘also’ that follows ‘and’. If the provision were interpreted such that due diligence is only what is notified in the rules, the term ‘also’ would have no real significance. The rule against surplusage in interpretation states that “every word and every provision is to be given effect”, and that “none should be ignored.” Thus, the term ‘also’ can be understood as intentionally demarcating the ‘due diligence’ obligation from the one that obligates intermediaries to comply with the rules notified under the provision.
The second fact the paper offers in support of this interpretation lies in the legislative history of Section 79 of the Act. Section 79, as it presently exists, was the result of the amendments to the Act passed in 2008. The phrase ‘due diligence’ was retained in the text of the provision at the insistence of the Standing Committee, which submitted a report on the Bill. The Committee had contextualized the due diligence requirement in relation to the need for an explicit provision requiring the blocking and elimination of objectionable content through technical mechanisms.
However, the paper does not consider that the Committee had also specified that the reason it wanted ‘due diligence’ in the provision was that, in its opinion, “removing an enabling provision which already exists in the principal Act and leaving it to be taken care of by the possible guidelines makes no sense”. From the perspective of the Standing Committee, the due diligence provision was an enabling one, i.e., primarily meant to allow the government to make guidelines in that regard. In an enabling provision like this one, retaining the term ‘due diligence’ and adding that intermediaries have an obligation to observe ‘such other’ guidelines curbed the possibility of excessive delegation, by ensuring that any guidelines prescribed specifically concern due diligence obligations.
Note that the judgment of the Andhra Pradesh High Court in Google India Private Limited vs M/S Visaka Industries Limited in November 2016 may support the paper’s argument that the ‘due diligence’ obligation is distinct from the guidelines. In the absence of any explicit definition of ‘due diligence’ in the IT Act, the Court cited precedent that relied on dictionary meanings of due diligence and concluded that, in order to meet the requirement, an intermediary would have to prove that it “had acted as an ordinary reasonable prudent man”, which would be a “question of fact.” Perhaps the Delhi High Court was clearest on the matter in Christian Louboutin v Nakul Bajaj when it stated that “the ‘due diligence’ provided in the Act, has to be construed as being broad and not restricted merely to the guidelines themselves.”
On the other hand, as the paper notes, there are judgments like MySpace Inc. vs Super Cassettes Industries Ltd. of the Delhi High Court, which have not considered this specific question, but have treated Rule 3 of the 2011 rules as completely encapsulating ‘due diligence’. This is, of course, because of the language in the rules. While Section 79(2)(c) of the IT Act might suffer from some vagueness, Rule 3 of the 2011 rules is unequivocal in that it seeks to define the “due diligence to be observed by the intermediary.” As Chinmayi Arun notes, the notification of the rules is seen as serving to clarify the meaning of the requirement. It is no surprise that Rule 3 has become the traditionally-understood standard for fulfilling the ‘due diligence’ requirement under the law.
Overall, despite the lack of a crystal-clear answer, we agree with the paper that there is enough merit in seriously considering the ‘due diligence’ requirement as distinct from the guidelines. The paper has rightly brought up an interpretation that needs more attention in the literature and case law on intermediary liability in India.
Interpreting ‘due diligence’
It thus becomes important to ask what this ‘due diligence’ would entail for intermediaries if it is indeed distinct from the rules. The paper points to how the Committee had contextualized the due diligence requirement as a need for certain intermediaries to block and eliminate objectionable content through technical mechanisms. Using this frame of reference, Reddy suggests that the ‘due diligence’ requirement may mean that intermediaries are obligated to proactively filter objectionable content.
However, it is pertinent to note that the Standing Committee had originally intended that the ‘due diligence’ requirement be reinstated as a prerequisite for giving immunity to a specific kind of intermediary: online marketplaces and online auction sites. Its suggestions for automated tools for filtering content should then also be understood as targeted at those specific intermediaries. There is, therefore, nothing in the legislative history of Section 79(2)(c) to suggest that measures such as automated content filtering were even considered as obligations for all categories of intermediaries.
More importantly, as many have pointed out in the context of the proposed amendments to the intermediary guidelines, proactive filtering of content would be unreasonable, and its application an unconstitutional restriction on speech.
First, such a requirement would suffer from vagueness and overbreadth. There are many “automated tools” that can be used to filter content (keyword detection, hash-based content detection, machine learning, etc.), each with its own merits and demerits. Even if delegated legislation were to lend clarity to the term, such a broad interpretation of ‘due diligence’ would not be consistent with the ‘case-by-case’ evaluation that is the usual understanding of the term. Apart from the fact that all forms of automated filtering have inherent limitations, it would be impossible for certain kinds of intermediaries, like those that deal with end-to-end encrypted communications, to implement such a requirement.
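To illustrate these inherent limitations, consider a minimal sketch of keyword-based filtering, the simplest of these tools. The blocklist and messages below are entirely hypothetical, invented only to show why such filters are context-blind:

```python
# A minimal, purely illustrative sketch of naive keyword-based filtering.
# The blocklist and messages are hypothetical; real filters are more
# sophisticated, but the underlying context-blindness persists.

BLOCKLIST = {"attack"}  # a term an intermediary might be told to filter

def is_blocked(message: str) -> bool:
    """Flag a message if any blocklisted term appears in it."""
    words = message.lower().split()
    return any(term in words for term in BLOCKLIST)

messages = [
    "we attack the convoy tonight",                # arguably unlawful
    "a scathing attack on the new farm bill",      # political commentary
    "heart attack symptoms everyone should know",  # health information
]

for m in messages:
    print(is_blocked(m), "|", m)
# All three messages are flagged: the filter cannot distinguish
# incitement from commentary or medical advice, so lawful speech
# is swept up alongside unlawful content.
```

Hash-based detection and machine-learning classifiers tend to fail differently (trivial modifications to a file can evade hash matching, and classifiers routinely misfire on satire and counter-speech), but the problem of overbreadth is common to all such tools.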
The determination of whether certain acts are illegal is a public function, left to the government and the courts. A broad proactive filtering obligation on intermediaries is state censorship by proxy, and worse yet, a form of privatized law enforcement. As a matter of principle, what the state cannot do directly, it cannot do indirectly. In the context of such forms of censorship, Prof. Seth Kreimer has elucidated in detail the great dangers of “collateral damage” that come from placing restrictions on intermediaries rather than on the speaker. On its face, such censorship appears less egregious than a “frontal attack” on expression by the state, but it can have the same effects.
To understand the impact of such obligations in the context of intermediary liability, consider the even lower bar of requiring intermediaries to entertain third-party takedown notices. There is evidence from multiple jurisdictions to suggest that even third-party notice-and-takedown systems make intermediaries over-censor in order to avoid liability. When such a system existed in India before the Supreme Court’s judgment in Shreya Singhal v. Union of India, a study by Rishabh Dara found that a majority of the intermediaries to which notices were sent were over-censoring by complying with clearly frivolous takedown notices. A requirement of proactive filtering would undoubtedly cause a much amplified, unjust and disproportionate harm to the exercise of the right to freedom of expression. Furthermore, Shreya Singhal confirmed that the ‘knowledge’ of content to be taken down must be construed as being brought to the intermediary only through the medium of a court order. It therefore becomes difficult to reconcile Shreya Singhal with automated filtering being mandated by law, since this would mean that such ‘knowledge’ may be brought to the intermediary by way of an algorithm (whether or not in conjunction with human review), rather than a court order.
Far from meeting T. Prashant Reddy’s stated aims, such a reading would also concentrate more power in the hands of private companies like Facebook and Google, which already exert an undue influence over the moderation of the online public sphere.
Instead of a draconian form of ‘due diligence’, it is important to consider the range of possibilities that could inform the obligation. For instance, the UN Guiding Principles on Business and Human Rights require business enterprises to carry out human rights due diligence on a regular basis, to identify, prevent, mitigate and account for how they address their impacts on human rights. Under these principles, businesses have differentiated responsibilities based on their size, the risk of severe human rights impacts, and the nature and context of their operations. Once again, in this case, each intermediary’s performance of its due diligence obligation would be assessed on a case-by-case basis. Another interpretation could require the incorporation of safeguards into the takedown process, as Article 19 has suggested. Such safeguards could ensure that companies are transparent in their decision-making, and that users are able to challenge takedown decisions made by companies.
Conclusion
For the long-term reform of the governance of online platforms, it is important to keep in mind that this is only one of the many problems with Section 79 of the IT Act. As the paper points out, the provision has long been criticised for its “one-size-fits-all” approach to regulation, where internet service providers and social media companies are treated alike when it comes to their conditions for exemption from liability. The conditions for exemption in the provision also contribute to confusion around their application to good-faith content moderation and the curation of newsfeeds.
There is also little in the law that mandates transparency and fairness in the moderation of online content, the area in which large and closed intermediaries act most as ‘gatekeepers’ and influence the public sphere. Unfortunately, while the paper recognises these issues, it goes on to advocate for proactive and automated content filtering, which is likely to concentrate even more power in the hands of big tech companies.
There are a host of problems that contribute to the misgovernance of online platforms, including an ineffective competition law framework, the lack of consumer protection standards applicable to most ‘free’ online services, and the opacity with which community standards are applied. A step towards addressing these issues would be clearer and more comprehensive intermediary liability legislation that recognises the role of intermediaries in facilitating the right to freedom of expression, holds them accountable to users, and dismantles the unfair concentration of power in commercial interests.
The authors would like to thank Torsha Sarkar and the Editorial Board at Law and Other Things for their comments and suggestions.
Disclosure: CIS has been a recipient of research grants from Facebook and Google.