Earlier this fall, the Court of Justice of the European Union (CJEU) ruled on two cases concerning the so-called right to be forgotten. This right was established by the same court in 2014 as a way to protect users’ rights to privacy and data protection. Its interpretation and implementation have, however, created a worrisome tension with the rights to freedom of expression and information.
Access Now’s mission is to defend and extend the digital rights of users around the world. This includes protecting the rights to privacy and data protection, as well as freedom of expression and freedom of information. While we acknowledge the benefit that a well-designed and well-implemented right to be forgotten could bring to users in the EU, given the region’s robust jurisprudence on data protection, freedom of expression, and freedom of information, we oppose the development of this right outside the EU. Since 2014, we have witnessed how governments around the world have — inadvertently or otherwise — misinterpreted or implemented a “right to be forgotten” in ways that significantly harm human rights and lead to censorship.
Despite widespread reporting that these rulings are a victory for Google (a party in the cases) and freedom of expression, we found a lot of grey areas in the CJEU’s decisions that could impact human rights. We further argue that in both decisions, the CJEU grants too much power to search engines like Google to decide how content is discovered online.
In this piece, we dive into the two recent rulings on the application of this right in the EU and analyse what these decisions mean for data protection and freedom of access to information and of expression in the EU and beyond. We finalise our analysis by providing recommendations that aim to strengthen safeguards in the implementation of the right to be forgotten in the European Union.
What is the right to be forgotten?
The right to be forgotten is a data protection right which the EU Court of Justice developed in the 2014 “Google Spain” case. In this ruling, the Court established that users can ask search engines to hide certain URLs from search results when a search is conducted using their name and the content on the page the URL points to includes information that is “inadequate, irrelevant or no longer relevant, or excessive.” The right to be forgotten gives individuals the ability to exercise control over their personal data by deciding what information about them should be accessible to the public through search engines. It does not, however, give users the power to demand that the personal data be deleted from a site.
Even though the exercise of the right to be forgotten does not — and should not — lead to the deletion of content published online, it directly impedes freedom of expression and freedom of information because it makes it harder to find information online. To mitigate these risks, the Court established in 2014 that the right to be forgotten should not apply to information that is relevant to the public interest and, in addition, that the use of this right by public figures, such as celebrities or politicians, would be limited.
Unfortunately, the Court also left it up to search engines — that is to say private companies — to apply this right and to conduct the very delicate exercise of balancing the right to data protection with freedom of expression.
While the Court and EU data protection authorities provided for some criteria to help with the application of the right to be forgotten, many questions were left unanswered. This situation paved the way for the two cases heard by the Court that we analyse in this post.
In short, the Court was asked to answer fundamental questions about how and when the right to be forgotten should be applied:
- Should search engines apply this right globally, on all of their domain names, or only in the country and domain name where a user is making a right to be forgotten request? (For example, if an Italian citizen makes a request in Italy, should Google honor that request on google.it only, or on all Google domains, including google.com?)
- How is “public interest” defined?
- When is information no longer considered relevant for the public?
- Should a search engine always accept a right to be forgotten request related to sensitive information about a user?
Let’s have a look at the Court’s answers to these questions and what they mean for users’ rights.
1. GC et al v. CNIL: Google forgets you not
The case of GC et al v. CNIL concerns four individuals who requested that Google stop showing in its search results links to websites containing articles or content that third parties had published about them. Specifically, the search results in question led users to a satirical photo-montage of a local politician; to an article that described one of the individuals as a Church of Scientology public relations officer; to a judicial investigation of businessmen and political personalities; and, finally, to an article about a criminal conviction for the sexual assault of minors. Google refused to comply with their requests, arguing that the personal data of the four individuals, although sensitive, were important to the public interest and should therefore remain available to online users. After the French data protection authority (CNIL) upheld Google’s refusal, the applicants brought the case to the French Council of State (Conseil d’État), which in turn referred a list of concrete questions to the CJEU.
The right to be forgotten enables individuals to request that information be removed from the indexes of major search engines. In practice, the information is not removed from the open web, but its visibility is significantly limited. The requests Google receives sometimes come from people who wish to remove links to their criminal past from today’s search results, or from politicians and public figures who don’t want damaging information or criticism about them to reach a wide public. This case tackles both past criminal convictions and individuals active in public life.
The CJEU was asked whether the prohibition on processing special categories of personal data, such as political opinions or religious beliefs published by individual media outlets, also applies to search engines. The Court confirmed that because Google references pages and, in particular, displays links to web pages that contain the personal data in its list of search results, it can be held responsible for such processing just as any other data controller would be. The CJEU emphasised that Google is required by law to obtain consent from users before processing their sensitive data. However, according to the Court, it would be impossible in practice for the search engine to obtain consent for everything its search results present to users, and from all the individuals whose personal data is displayed. Therefore, based on the CJEU’s reasoning, users must first inform Google about the personal data they do not consent to having listed in Google’s search results, and only then is the search engine required to take action. According to the Court, unless users request that Google remove a hyperlink containing sensitive data, the company does not have to justify its processing.
The CJEU also addressed a question related to the scope of application of the right to be forgotten over time. Should data about past criminal convictions, which no longer describe a present situation, be removed from a specific search result? In such circumstances, the Court is of the opinion that a search engine still has to balance the data subject’s fundamental rights and the public’s right to freedom of information, taking into consideration complex criteria such as the nature and seriousness of the offence in question, the progress and the outcome of the proceedings, the data subject’s role in public life, and the level of public interest in the information at the time of the removal request.
Importantly, the CJEU looked closely at balancing the right to access information and freedom of expression, and the rights to privacy and data protection. It recalled its landmark Google Spain judgement, which established that while individual rights may in general override the freedom of information of online users, the balance between these fundamental rights must be assessed on a case-by-case basis. When attempting to strike a balance, the following two factors need to be taken into consideration: 1) the nature of information in question and its sensitivity for the data subject’s private life, and 2) the public’s interest in accessing the information, which may vary depending on the data subject’s role in public life.
Google becomes a private adjudicator
Balancing the individual’s right to data protection in seeking to remove hyperlinks from search results and the public’s right to know is at the heart of this judgement. Such an exercise is often context-dependent and highly nuanced and therefore challenging even for experienced national and international judges. One of the most concerning elements in the Court’s reasoning is the assumption that it should be Google, a privately owned company with multiple online services and a vested interest in having as much information available through its search engine as possible, that ultimately decides what information falls under the scope of “public interest” — or does not. In other words, a private company will be making the final call on what information is relevant for the public to know and what can be hidden from users.
This approach is not new in the freedom of expression and information debate in Europe. In recent years, we have witnessed efforts by governments in Europe to impose increasing responsibility on private companies, forcing them to police user-generated content. Just a few weeks after GC et al v. CNIL, the CJEU issued a long-anticipated judgement, Glawischnig-Piesczek v. Facebook. In this case, the CJEU concluded that, in order to remove identically worded content previously determined to be defamatory, national courts may order Facebook to monitor every single post shared by each user. Although this decision does not address the scope or interpretation of the right to be forgotten, it illustrates the ongoing pressure on companies to make judgements about information or content that may shape public knowledge and discourse.

Moreover, the CJEU’s approach directly challenges the essence of the right to freedom of information. The European Court of Human Rights has recognised the internet as one of the principal means by which individuals exercise this fundamental freedom. It is an important tool for participation in activities and discussions concerning political issues and issues of public interest. A decision made by a private company about what information is of public interest is not transparent and does not comply with rule-of-law requirements. The recent CJEU ruling on the right to be forgotten follows the dangerous trend of having Google and other tech giants take a more active role in restricting access to different types of online content and information.
It is highly problematic that the CJEU has placed the burden and responsibility of balancing users’ fundamental rights on private actors, forcing them to take up the role of private adjudicators in the online space. While we know that this situation will not be resolved easily, we call on the courts, data protection authorities, and free expression experts to provide further detailed guidance to search engines on how to implement the right to be forgotten in a rights-respectful manner.
2. CNIL v. Google: the right to be forgotten is limited to the EU — at least for now
In the second case, brought by Google against the French data protection authority (CNIL), the EU Court of Justice was asked to rule on the geographical scope of application of the right to be forgotten.
In 2015, the French regulator fined Google €100,000 for refusing to apply the right to be forgotten worldwide. In its decision, the CNIL further ordered the company to apply the right to be forgotten to all of its domain names, including google.com. Google maintained that the CNIL had the power to order its application only on the google.fr domain. In its ruling, the Court established that the right to be forgotten should be applied on all EU domain names — that is, on google.fr, but also on google.it, google.de, google.nl, and so on. The Court justified this decision by pointing out that the adoption of the General Data Protection Regulation requires a consistent and harmonised level of protection for users across the EU. The Court added, however, that each EU member state is empowered to limit this scope in order to protect freedom of information, paving the way for a patchwork of approaches across the EU which could further complicate the situation.
The Court extends a worrying invitation to European lawmakers
In its ruling resolving the matter between CNIL and Google on the scope of application of the right to be forgotten, the Court said much more. While it indicated that there are currently no obligations under EU law to apply the right to be forgotten outside the EU, the Court deemed it important to note that the EU could modify its laws to create such an obligation. This (not so) subtle nudge to European lawmakers was however followed by a warning that many countries around the world either do not recognise the right to be forgotten or “have a different approach”. Indeed, numerous countries, as well as many free expression advocates, understand that the right to be forgotten can be used as a censorship tool.
Given the grave risks linked to the misapplication and misunderstanding of the right to be forgotten, Access Now recommends that EU lawmakers do not extend the scope of application of this right outside the EU. The EU recognises both data protection and freedom of information as fundamental rights under the EU Charter. These rights are further baked into national constitutions and specific laws, with independent bodies, such as regulators and courts, tasked with enforcing and balancing them. Furthermore, as acknowledged by the Court in the two rulings analysed here, the GDPR requires balancing the right to be forgotten against the rights to freedom of expression and information. Thanks to this framework and to the jurisprudence of the EU Court of Justice and the European Court of Human Rights, the European Union has — in practice — the necessary safeguards in place to prevent abuse of the right to be forgotten as a censorship tool.
Privacy please
In another very worrying move, the Court made some problematic recommendations to search engines to ensure the application of the right to be forgotten across the EU. Google should not only hide the contested search results on all EU domain names, but also “prevent or, at the very least seriously discourage” internet users in the EU from accessing these results through other means. The Court refers to “effective measures” that search engines should take to prevent such access, without clearly stating which measures it has in mind. However, in a footnote, the Court links these “measures” to previous rulings in the field of copyright that refer to techniques requiring internet users to “reveal their identity” online. Ensuring the application of the right to be forgotten must not come at the expense of protecting anonymity online, which is necessary for many journalists and activists in Europe and around the world. To protect users’ rights online, we encourage data protection authorities to clarify this new obligation established by the Court. DPAs should in particular reaffirm that such measures shall never interfere with users’ right to anonymity when using tools like Virtual Private Networks (VPNs).
Where do we go from here?
Much has been said and written about the right to be forgotten since it was established in 2014. Courts and legislators around the world have demonstrated significant interest in this debate, and rightly so, as its impact on human rights can be significant.
With that in mind, we urge lawmakers outside the EU not to develop such a right; instead, we encourage them to focus on developing or reforming comprehensive data protection laws, which do not need to include such a right. In any case, if a similar right is to be developed, its sole purpose must be to enhance users’ control over their personal information, and it must come with appropriate safeguards, including the right to remedy. Under no circumstances must the right to be forgotten be misinterpreted or misapplied to enable the removal of online content.
In Europe, the Court has provided some much needed guidance on the application of the right to be forgotten, yet many questions remain and new ones have emerged.
First, Google and other search engines continue to be tasked with the responsibility of conducting a delicate exercise of balancing rights, when they have no democratic mandate to do so. To protect the rule of law, it must always be up to the courts and independent public regulators to interpret and to evaluate the application of the right to be forgotten. Private actors should not be put in a situation where they have a de facto judicial role over content and are required to weigh data protection against freedom of information. Until a legal and political solution to this problem going beyond the right to be forgotten is found, courts, DPAs, and free expression experts should work together to provide further guidance to search engines on how to implement the right to be forgotten in a rights-respectful manner.
Second, now that the Court has clarified that the right to be forgotten applies across all EU countries but stops at the EU’s borders, we recommend that EU lawmakers refrain from extending the scope of this right any further.
Finally, we call on DPAs to clarify what measures search engines can take to ensure the application of this right across the EU in a way that fully guarantees the rights to privacy and anonymity.
For more guidance on the application of the right to be forgotten in the EU and around the world, read our paper here.