
Right to be Forgotten: A Tale of Two Judgements

Posted by Amber Sinha at Apr 07, 2017 02:27 AM
In the last few months, there have been contrasting judgments from two Indian high courts, Karnataka and Gujarat, on matters relating to the right to be forgotten. The two high courts heard pleas on issues to do with the right of individuals to have personal information redacted from the text of judgments available online, or to have such judgments removed from publicly available sources.

While one High Court (Karnataka) ordered the removal of personal details from the judgment,[1] the other (Gujarat) dismissed the plea[2]. In this post, we try to understand the global jurisprudence on the right to be forgotten, and how the contrasting judgments in India may be located within it.

Background

The ‘right to be forgotten’ has gained prominence since a matter was referred to the Court of Justice of the European Union (CJEU) in 2014 by a Spanish court.[3] In this case, Mario Costeja González had disputed that a Google search of his name continued to show results leading to an auction notice of his repossessed home. González claimed that the fact that Google continued to make available in its search results an event in his past, which had long been resolved, was a breach of his privacy. He filed a complaint with the Spanish Data Protection Agency (AEPD, in its Spanish acronym) to have the online newspaper reports about him, as well as related search results appearing on Google, deleted or altered. While the AEPD did not agree to his demand to have the newspaper reports altered, it ordered Google Spain and Google, Inc. to remove the links in question from their search results. The case was brought on appeal before the Spanish High Court, which referred the matter to the CJEU. In a judgment with far-reaching implications, the CJEU held that where information is ‘inaccurate, inadequate, irrelevant or excessive,’ individuals have the right to ask search engines to remove links containing personal information about them. The court also ruled that even if the physical servers of the search engine provider are located outside the jurisdiction of the relevant EU Member State, these rules apply if the provider has a branch office or subsidiary in the Member State.

The ‘right to be forgotten’ is something of a misnomer: when we speak of it in the context of the proposed laws in the EU, we essentially refer to the right of individuals to seek erasure of certain data that concerns them. The basis of what has now evolved into this right is contained in the 1995 EU Data Protection Directive, with Article 12 of the Directive allowing a person to seek deletion of personal data once it is no longer required.

Critical to our understanding of the rationale for how the ‘right to be forgotten’ is being framed in the EU is an appreciation of how European laws perceive the privacy of individuals. Unlike the United States (US), where privacy may be seen as a corollary of personal liberty protecting against unreasonable state intrusions, European laws view privacy as an aspect of personal dignity, and are more concerned with protection from third parties, particularly the media. The most important way in which this manifests itself is in where the burden to protect privacy rights lies. In Europe, privacy policy often dictates intervention from the state, whereas in the US, in many cases it is up to individuals to protect their own privacy.[4]

Since the advent of the Internet, both the nature and quantity of information existing about individuals have changed dramatically. This personal information is no longer limited to newspaper reports and official or government records. Our use of social media, micro-discussions on Twitter, photographs and videos uploaded by us or by others tagging us, every page or event we like, favourite or share—all contribute to our digital footprint. When we add the information created not by us but about us, by public and private bodies storing data about individuals in databases, our digital shadows begin to far exceed the data we create ourselves. It is abundantly clear that we exist in a world of Big Data, which relies on algorithms tracking the repeated behaviour of our digital selves. It is in this context that a mechanism enabling the purging of some of this digital shadow makes sense.

Further, it is not only the nature and quantity of information that has changed, but also the means through which this information can be accessed. In the pre-internet era, access to records was often made difficult by procedural hurdles. Permissions or valid justifications were required to access certain kinds of data. Even for information available in the public domain, the process of gaining access was often far too cumbersome. Now, digital information not only continues to exist indefinitely, but can also be readily accessed through search engines. It is in this context that, in a 2007 paper, Viktor Mayer-Schönberger pioneered the idea of memory and forgetting for the digital age.[5] He proposed that all forms of personal data should carry additional metadata in the form of an expiration date, switching the default from information existing endlessly to information having a temporal limit after which it is deleted. While this may be a radical suggestion, we have since seen proposals to allow individuals some control over information about them.
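Mayer-Schönberger's proposal can be illustrated with a minimal sketch in Python. The record structure, field names and purge logic below are illustrative assumptions, not drawn from his paper:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class PersonalRecord:
    """A piece of personal data carrying an expiration date as metadata."""
    subject: str
    content: str
    expires_at: datetime  # after this moment, the default is deletion

def purge_expired(store: list, now: datetime) -> list:
    """Return only the records whose expiration date has not yet passed."""
    return [r for r in store if r.expires_at > now]

# Hypothetical example: one record expires after 30 days, another after 10 years.
now = datetime(2017, 4, 7)
store = [
    PersonalRecord("A", "auction notice", now + timedelta(days=30)),
    PersonalRecord("B", "court record", now + timedelta(days=3650)),
]
remaining = purge_expired(store, now + timedelta(days=31))
print([r.subject for r in remaining])  # prints ['B']: record A has expired
```

The point of the sketch is the inversion of defaults: instead of deletion requiring an affirmative request, persistence beyond the expiry date would require one.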

In 2016, the EU released the final version of the General Data Protection Regulation (GDPR). The regulation provides for a right to erasure under Article 17, which enables a data subject to seek deletion of data.[6] Notably, except in the heading of the provision, Article 17 makes no reference to the word ‘forgetting.’ Rather, the right made available in this regulation takes the form of ‘erasure’ and ‘abstention from further dissemination.’ This is significant because what the regulation provides is not an overarching framework to enable or allow ‘forgetting’ but a limited right which may be used to delete certain data or search results. Providing a true right to be forgotten would pose issues of interpretation as to what ‘forgetting’ might mean in different contexts, and as to the extent of the measures that data controllers would have to employ to ensure it. The regulation instead attempts to provide a specific remedy which can be exercised in defined circumstances without having to engage with the question of ‘forgetting’.

The primary arguments against the ‘right to be forgotten’ stem from its conflict with the right to freedom of speech. Jonathan Zittrain has argued against the rationale that, because the right to be forgotten merely alters results on search engines without deleting the actual source, it does not curtail freedom of expression.[7] He has compared this altering of search results to letting a book remain in the library while making its catalogue entry unavailable. According to Zittrain, a better approach would be to allow data subjects to provide their side of the story and more context to the information about them, rather than allowing any kind of erasure. Unlike the US, the European approach is to balance free speech against other concerns. So while one of the exceptions in sub-clause (3) of Article 17 provides that information may not be deleted where it is necessary for the exercise of the right to free speech, free speech does not completely trump privacy as the value that must be protected. On the other hand, US constitutional law tends to give more credence to First Amendment rights and allows them to be compromised only in very limited circumstances. As per the position of the US Supreme Court in Florida Star v. B.J.F., lawfully obtained information may be restricted from publication only in cases involving a ‘state interest of the highest order’. This position would allow any potential right to be forgotten to be exercised only in the most limited of circumstances, and privacy and reputational harm would not satisfy the standard. For these reasons, the right to be forgotten as it exists in Article 17 may be unworkable in the US.

Issues in application

Significant challenges remain in the effective and consistent application of Article 17 of the GDPR. One key issue concerns how ‘personal data’ is defined and understood, and how its interpretation will affect this right in different contexts. Under the regulation, the term ‘personal data’ includes any information relating to an individual. Some ambiguity remains about whether information which may not uniquely identify a person, but identifies them as part of a small group, falls within the scope of personal data. This becomes relevant, for instance, where one seeks the erasure of information which, without referring to an individual, points towards a family. At the same time, the piece of information sought to be erased may often contain personal information about more than one individual. There is no clarity over whether the consent of all the individuals concerned should be required, and if not, on what parameters the wishes of one individual should prevail over those of the others. Another important question, as yet unanswered, is whether the same standards for removal of content should apply to private individuals and to those in public life.

The question of what constitutes personal data, and can therefore be erased, gets further complicated in cases of derived data about individuals used in statistics and other forms of aggregated content. While it would be difficult to argue that the right to be forgotten needs to extend to such forms of information, not erasing such derived content poses the risk of the primary information being inferred from it. In addition, Article 17(1)(a) provides for deletion where the data is no longer necessary for the purposes for which it was collected or used. The standards for the circumstances which satisfy this criterion are, as yet, unclear and may only be fully understood through consistent application of the law.

Finally, even once there are reasonable grounds to seek erasure of information, it is not clear how this erasure will be enforced in practice. It may not be prudent to require that all copies of the impugned data be deleted beyond recovery, to the extent technologically possible. A more reasonable solution might be to permit the data to remain available in encrypted form, much like certain records are sealed and subjected to the strictest confidentiality obligations. In most cases, it may be sufficient to ensure that the impugned data is removed from search results and database reports without actually tampering with the information as it exists. These are some of the challenges which the practical application of this right will face, and it is necessary to take them into account in enforcing the proposed regulations.
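The distinction between de-indexing and deletion can be sketched as a toy model. The record store and index below are hypothetical constructions for illustration, not the structures of any real search engine:

```python
# Toy model: a record store and a search index over it. De-indexing removes
# entries from the index while leaving the underlying record untouched.
records = {
    "judgment-42": "Full text of the judgment, including the petitioner's name.",
}
search_index = {
    "petitioner name": {"judgment-42"},  # query term -> matching record ids
}

def deindex(record_id: str) -> None:
    """Remove a record from all search results without deleting the record."""
    for matches in search_index.values():
        matches.discard(record_id)

deindex("judgment-42")
print("judgment-42" in records)         # prints True: the source survives
print(search_index["petitioner name"])  # prints set(): no longer discoverable
```

This is the shape of the remedy the CJEU granted in Costeja: the newspaper archive was left intact, and only the index entries linking a name to it were removed.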

The two Indian judgments

In the first case (before the Gujarat High Court), the petitioner entered a plea for “permanent restraint [on] free public exhibition of the judgment and order.” The judgment in question concerned proceedings against the petitioner for a number of offences, including culpable homicide amounting to murder. The petitioner had been acquitted, both by the Sessions Court and by the High Court before which he was now pleading. The petitioner’s primary contention was that despite the judgment being classified as ‘unreportable’, it was published by an online repository of judgments and was also indexed by Google search. The High Court’s decision to dismiss the petition rested on the following factors: (a) the failure of the petitioner to show any provisions in law which were attracted, or any threat to the constitutional right to life and liberty; and (b) publication on a website does not amount to ‘reporting’, as ‘reporting’ refers only to publication in law reports.

While the second point of the court’s reasoning is problematic, both in terms of the function of precedent served by reported judgments and as a basis for reducing the scope of ‘reporting’ to law reports alone, the first point is of direct relevance to our discussion. The lack of available legal provisions points to the absence of data protection legislation in India. Had there been privacy legislation addressing how personal information may be dealt with, it might have contained instructive provisions for situations like these. In the absence of such a law, the only recourse an individual has is to seek constitutional protection under one of the fundamental rights, most notably Article 21, which over the years has emerged as an infinite repository of unenumerated rights. However, rights under Article 21 are typically of a vertical nature, i.e., available only against the state. Their application in cases where a private party is involved remains questionable, at best.

In contrast, in the second case, the Karnataka High Court ruled in favour of the petitioner. Here, the petitioner’s daughter had instituted both criminal and civil proceedings against a person. The parties later arrived at a compromise, one of the conditions of which was the quashing of all the proceedings that had been initiated. The petitioner raised concerns that his daughter’s name appeared in the cause title and was easily searchable. The court, while making vague references to the ‘trend in the Western countries where they follow this as a matter of rule “Right to be forgotten” in sensitive cases involving women in general and highly sensitive cases involving rape or affecting the modesty and reputation of the person concerned’, held in the petitioner’s favour and ordered that the name be redacted from the cause title and the body of the order before release to any service provider. This second judgment is all the more problematic because, while it refers to jurisprudence in other countries, it bases the relief not on a fundamental right to privacy but on the modesty and reputation of women, which has no clear legal basis in either Indian or comparative jurisprudence.
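Mechanically, the redaction the court ordered amounts to replacing the party’s name with a neutral placeholder before the order is released. A minimal sketch follows; the name, text and placeholder are hypothetical, not taken from the actual order:

```python
import re

def redact_name(text: str, name: str, placeholder: str = "XXX") -> str:
    """Replace every occurrence of a party's name, case-insensitively,
    with a neutral placeholder before the document is released."""
    return re.sub(re.escape(name), placeholder, text, flags=re.IGNORECASE)

order = "In the matter of Jane Doe: the petitioner states that JANE DOE..."
print(redact_name(order, "Jane Doe"))
# prints: In the matter of XXX: the petitioner states that XXX...
```

As the case itself shows, the harder problem is not the string replacement but ensuring the redacted version is the only one that reaches online repositories and search indexes.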

Conclusion

The two cases above demonstrate the problem of the lack of a clear legal basis for the judiciary to employ in interpreting the right to be forgotten. Not only did the courts fail to rely on any clear legal provisions of Indian law while ruling on the existence of this right, they also did not engage in any analysis of comparative jurisprudence such as the GDPR or the Costeja judgment. Such ad-hoc jurisprudence underlines the need for data protection legislation; in its absence, divergent views are likely to be taken on this issue, without clear legal direction. Most matters concerning the right to erasure are likely to involve private parties as data controllers. In such cases, the existing jurisprudence on the right to privacy as interpreted under Article 21 may be of limited value. Further, as pointed out above, the right to be forgotten needs to be very clearly qualified by conditions, and its conflict with the right to freedom of expression under Article 19 must be addressed. It is therefore imperative that a comprehensive data protection law address these issues.


[1] Sri Vasunathan vs The Registrar, available at http://www.iltb.net/2017/02/karnataka-hc-on-the-right-to-be-forgotten/

[2] Dharmraj Bhanushankar Dave v. State of Gujarat, available at https://drive.google.com/file/d/0BzXilfcxe7yueXFJWG5mZ1pKaTQ/view.

[3] Google Spain et al v. Mario Costeja González, available at http://curia.europa.eu/juris/document/document_print.jsf?doclang=EN&docid=152065.

[4] http://www.europarl.europa.eu/RegData/etudes/STUD/2015/536459/IPOL_STU(2015)536459_EN.pdf

[5] Mayer-Schoenberger, Viktor, Useful Void: The Art of Forgetting in the Age of Ubiquitous Computing (April 2007). KSG Working Paper No. RWP07-022. Available at SSRN: https://ssrn.com/abstract=976541 or http://dx.doi.org/10.2139/ssrn.976541.

[6] Article 17 (1) states: The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:

(a) the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;

(b) the data subject withdraws consent on which the processing is based according to point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal ground for the processing;

(c) the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to Article 21(2);

(d) the personal data have been unlawfully processed;

(e) the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;

(f) the personal data have been collected in relation to the offer of information society services referred to in Article 8(1).

[7] Zittrain, Jonathan, “Don’t Force Google to ‘Forget’”, The New York Times, May 14, 2014. Available at https://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html.
