If you choose to fly into any country in the European Union, you will be subjected to screening and profiling to see whether you are — or could be — a dangerous criminal or terrorist. This system, required under the EU Passenger Name Record (PNR) Directive, entails the collection and retention of a wide range of personal information you provide to airlines and travel agencies. This digital frisking happens regardless of who you are, where you’re from, or whether you are suspected of any crime.
Right now, the Court of Justice of the European Union (CJEU) is deliberating on a case challenging the legality of the PNR Directive, and we expect a ruling in the next several months. If the Court upholds its legality — or at least part of it — the ruling risks entrenching the use of automated systems that routinely invade our privacy and expose us to unacceptable risks, all for the illusion of enhanced security.
What has happened so far?
On January 27, the Advocate General (AG) of the CJEU published an opinion that suggests the Court should affirm the PNR Directive’s legality — a sharp U-turn from the CJEU’s jurisprudence upholding the rights to privacy and data protection.
The good news is that not all hope is lost. While the Court typically does follow the AG’s guidance, it does not do so in every case. Furthermore, even the AG recognises flaws in the way the PNR Directive is written.
Below, we analyse the AG’s opinion to show why:
- It disregards previous CJEU rulings that safeguard privacy and data protection;
- It overlooks the lack of evidence to show the Directive is fit for purpose; and
- It ignores the inherent risks of automated profiling systems.
If the Court follows the opinion, it would exacerbate existing discrimination against targeted groups, and make people already vulnerable to rights violations even more vulnerable.
Sacrificing privacy on the altar of national security
The main question before the Court: Is the PNR Directive compatible with Articles 7 and 8 of the Charter of Fundamental Rights of the EU? These provisions protect the right to privacy (Art. 7) and the right to data protection (Art. 8), guaranteeing everyone’s right to a private life, which includes your travelling habits, and the right to have your personal information protected and used fairly.
Given the vast quantity of PNR data collected, and the way these data are processed and stored, the question is whether the Directive’s approach is proportionate to its objective. Is it okay to gather, analyse, and store the data of every single air traveller to combat terrorism and serious crime?
To answer this question, the AG references the obligations stemming from Article 52 of the Charter, which details the conditions under which fundamental rights, like privacy or data protection, can be limited under EU law. Any limitation must meet three criteria: it must be prescribed by law, pursue an objective of general interest, and be necessary.
Here, the AG indicates that the PNR Directive pursues a general security objective, meaning that even though the law limits the rights to privacy and data protection, those limitations could be justified if the measures are necessary and clearly defined by law.
However, the AG takes issue with the way the PNR Directive is written, in particular with point 12 of Annex I, which he finds too vague about the type of data it allows to be collected, and which he therefore recommends removing from the law.
As the AG points out, the scope of personal data falling under this point of the Directive is not sufficiently delimited: it would allow authorities to collect an amount of information disproportionate to the law’s aim, and could lead officials to over-collect data and make discriminatory assumptions based on it.
This is an important clarification from the AG. However, it is the only section of the PNR Directive that he found incompatible with EU law. That is surprising, and not in tune with the Court’s previous decisions.
The Court has previously determined that data retention and access measures similar to those established under the PNR Directive are not compatible with EU law. The AG boldly suggests that this jurisprudence should not apply in this particular context, because the previous case concerned communications data, which in his opinion are more sensitive than passenger data. The AG nevertheless acknowledges that accessing PNR data can be very intrusive, and that it can be used to build a detailed picture of someone’s life. Yet he seems to discard his own assessment of the severity of that infringement.
Notably, the CJEU previously ruled against a specific PNR agreement with data retention schemes similar to those of the Directive at issue, finding it incompatible with EU law. Once more, the AG suggests setting aside the Court’s previous finding, asserting that it does not fully apply in this instance. Going in an entirely different direction, he concludes that most measures under the Directive are in line with EU law. And just like that, the AG reconsiders more than 15 years of case law in the name of security.
The PNR Directive seems immune to the requirement for evidence
The declared objective of the PNR Directive is to prevent, detect, investigate, and prosecute terrorist offences and serious crimes. The AG’s opinion describes PNR data as an “essential tool in the EU common plan to combat terrorism”. Yet no concrete evidence has been presented as to whether or how collecting and analysing PNR data actually serves the objective of combating terrorism.
No one has shown that mass collection and retention of PNR data is an effective response to terrorism. That was true during the negotiations for the law, and it’s still true more than five years after its adoption. To date, the PNR Directive appears to be supported by anecdotal evidence alone. Judge Thomas von Danwitz, the Judge Rapporteur in this case, expressed concern about the lack of evidence during the CJEU oral hearing in the summer of 2021 as he interrogated the European Commission’s representatives on the proportionality of the Directive. He asked whether and why airports are the only venue suitable for catching criminals, or whether there are other places where people should routinely be subjected to mass surveillance. “Why not rock concerts? Why not museum visits?”, he asked.
The Commission Evaluation Report itself does not corroborate the effectiveness of PNR data with relevant statistics. It details some cases in which a crime was resolved with the support of PNR data, but fails to demonstrate that those cases could not have been prosecuted using other existing tools and methods.
Statistically meaningful evidence is essential to invoking the principle of proportionality, which in turn would justify limiting the rights to privacy and data protection. Overlooking evidence poses a great risk to a democracy, as it encourages law-makers to base their policies on the political barometer rather than on facts or necessity.
Technical “de-biasing” won’t solve the PNR Directive’s problem with automated profiling
One of the most troubling aspects of the PNR Directive is its introduction of automated profiling systems in the EU’s border control regime, which paved the way for a new form of border surveillance.
Officials use PNR data not only to identify known criminals and criminal suspects, but also to assess whether a passenger represents a threat. Your data are checked against a set of undefined “databases” and against predetermined risk criteria, and if you are deemed a threat, you can be stopped and interrogated.
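The Directive does not publish how this matching actually works, but a purely hypothetical sketch can make the mechanism concrete. In the Python snippet below, every field name, watchlist entry, and risk criterion is invented for illustration; it only mirrors the structure the Directive describes, a watchlist check plus predetermined criteria:

```python
# Hypothetical sketch of rule-based PNR screening.
# All field names, watchlist entries, and criteria are invented for
# illustration; the Directive does not publish the actual matching logic.

WATCHLIST = {"J. DOE", "A. N. OTHER"}  # stand-in for the undefined "databases"

def matches_risk_criteria(pnr: dict) -> bool:
    """Invented 'predetermined criteria' of the kind the Directive allows."""
    return (
        pnr["payment_method"] == "cash"
        and pnr["booked_days_before_departure"] <= 1
        and pnr["one_way"]
    )

def screen_passenger(pnr: dict) -> bool:
    """Return True if the passenger is flagged for further checks."""
    if pnr["name"].upper() in WATCHLIST:
        return True
    return matches_risk_criteria(pnr)

passenger = {
    "name": "Jane Smith",
    "payment_method": "cash",
    "booked_days_before_departure": 0,
    "one_way": True,
}
print(screen_passenger(passenger))  # True: flagged on the invented criteria alone
```

Note that entirely ordinary behaviour, like paying cash for a last-minute one-way ticket, is enough to trip a flag: the “threat” determination rests wholly on whatever criteria the authorities choose to encode.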
Such automated profiling systems are inherently problematic. Civil society organisations and human rights experts, such as the UN Special Rapporteur on contemporary forms of racism, racial discrimination, xenophobia and related intolerance, have raised two main concerns: the base-rate fallacy and the intrinsic and ineradicable bias of these systems.
The base-rate fallacy refers to the high likelihood of false positives when you search for something rare (e.g. terrorists) in a very large dataset (e.g. millions of air passengers). The aforementioned Commission Evaluation Report on PNR does not provide a detailed count of all passengers whose PNR data were processed. As an indication, in 2020, 277 million people travelled by plane to, from, or within the EU, and even that figure represents a 73% decrease compared to the year before due to COVID-19 restrictions. In other words, millions of PNR records are analysed in the EU every year. According to the European Commission, 0.59% of all PNR data collected are flagged by automated systems; after individual examination, however, only 0.11% of those flagged were referred to competent authorities. Lacking a better explanation from the Commission, this suggests that 99.89% of the flagged passengers were wrongly or inaccurately flagged. Extrapolate from these numbers to the reality of PNR screening, and millions of people would be subjected to unfair and discriminatory profiling.
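Taking the Commission’s percentages at face value, and reading the 0.11% as a share of flagged passengers (the only reading consistent with the 99.89% figure above), a back-of-the-envelope calculation shows the scale of the problem. The 2020 passenger total stands in here for a typical year:

```python
# Back-of-the-envelope base-rate arithmetic using the figures quoted above.
passengers = 277_000_000   # air passengers to, from, or within the EU in 2020

flag_rate = 0.0059         # 0.59% of passengers flagged by automated systems
referral_rate = 0.0011     # 0.11% of the flagged referred after human review

flagged = passengers * flag_rate
referred = flagged * referral_rate
false_flags = flagged - referred

print(f"flagged:         {flagged:,.0f}")     # ~1,634,300 people
print(f"referred:        {referred:,.0f}")    # ~1,800 people
print(f"wrongly flagged: {false_flags:,.0f}"  # ~1,632,500 people, i.e.
      f" ({false_flags / flagged:.2%})")      # 99.89% of everyone flagged
```

Even in a year when air travel collapsed under COVID-19 restrictions, these rates imply more than 1.6 million people flagged without ever being referred to a competent authority.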
As for the intrinsic bias of automated profiling systems, research has repeatedly shown that such systems are often contaminated with problematic biases and can reinforce existing forms of oppression, racism, and exclusion. As the EU’s Fundamental Rights Agency points out, “discrimination can occur during the design and implementation of algorithms, through biases that are incorporated – consciously or not – in the algorithm, as well as when decisions are made on the basis of the information obtained”.
Even if the automated analysis is conducted on non-sensitive data only, sensitive data can be inferred by proxy (for instance, a special meal request can reveal a passenger’s religion) and result in indirect discrimination. What’s more, there are serious doubts that such systems can be effectively “de-biased” through technical means, or that procedural safeguards can mitigate the risks they pose to fundamental rights. Where the negative impact of a system cannot be mitigated, there is growing consensus that the system should be prohibited. Accordingly, the UN High Commissioner for Human Rights notes in her 2021 annual report on The Right to Privacy in the Digital Age that “uses of AI that inherently conflict with the prohibition of discrimination should not be allowed”.
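To see how proxy discrimination can arise even when a sensitive attribute is never consulted, consider this entirely synthetic simulation; the data, group labels, and correlation strengths are all invented:

```python
# Toy illustration of proxy discrimination with entirely synthetic data.
# A "neutral" rule flags a non-sensitive field (a special meal request)
# that happens to correlate with religion, a protected characteristic.
import random

random.seed(0)

def make_passenger() -> dict:
    religion = random.choice(["A", "B"])
    # Invented correlation: group B requests a special meal far more often.
    special_meal = random.random() < (0.7 if religion == "B" else 0.05)
    return {"religion": religion, "special_meal": special_meal}

passengers = [make_passenger() for _ in range(100_000)]

# The rule never looks at religion...
flagged = [p for p in passengers if p["special_meal"]]

for group in ("A", "B"):
    total = sum(p["religion"] == group for p in passengers)
    hits = sum(p["religion"] == group for p in flagged)
    print(f"group {group}: {hits / total:.1%} flagged")
# group A: ~5% flagged; group B: ~70% flagged. Religion never appears in
# the rule, yet the outcome is starkly discriminatory.
```

A rule that never mentions religion still flags one group an order of magnitude more often than the other, which is why excluding sensitive fields from the analysis is no guarantee against discriminatory outcomes.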
So how do these issues come into play in the AG’s opinion? The opinion argues that the text of the Directive provides sufficient safeguards against the risk of discrimination. The law does require that the predetermined criteria used to analyse PNR data be “objective, proportionate, and specific”. However, as the CJEU itself stated in the Digital Rights Ireland case, it is not sufficient for a law to declare a system compliant or protective; the safeguards necessary to make that a reality must actually be put in place.
The theoretical safeguards the Directive foresees are not sufficient to mitigate the impact of a system which is mathematically flawed and intrinsically biased. Overlooking this flaw would undermine the principles of proportionality and necessity, both of which are essential for any limitation on the right to privacy.
Next step: the Court ruling
The debate around the PNR Directive is much more than a legal question. It shows the negative consequences for human rights when EU policy-makers base their policies on the idea that we can solve any social issue using technology.
As we await the ruling, we ask: Will Judge von Danwitz, the Judge Rapporteur in this case, deviate from the AG’s interpretation and follow the established jurisprudence of the Court? After all, Judge von Danwitz was central to the establishment of that jurisprudence. He may once again confirm that indiscriminate mass surveillance measures have no place in the EU, including in the context of PNR.