Privacy vs. Security: Snowden’s Legacy

Privacy is a delicate issue, and the debate on data protection is complicated even though seemingly everyone wants the same thing: as much privacy as possible, combined with as much security as possible. Finding an equilibrium is an incredibly difficult challenge for governments, intergovernmental organizations and non-state actors around the world.

In April, a Regulation[1] and a Directive[2] on data protection were passed by the European Parliament. These new rules replace the Data Protection Directive of 1995 and the Framework Decision of 2008 (specifically applicable to the police and judicial sector), and aim to restore citizens' control over their own personal data. This new legal groundwork is of course an attempt to adapt to new technological advancements, and an answer to the public outcry that followed the revelations of Edward Snowden, the former NSA contractor who leaked classified documents uncovering widespread privacy intrusions by the U.S. government.

This sparked a fierce debate about privacy, data protection, anonymity and encryption.[3] The latter in particular has become a hot topic, popularized by the legal tussle between Apple and the FBI. To fully understand the ongoing debate, we have to take a closer look at the evolution of encryption.

Encryption is the process of converting information or data into a code readable only by the intended recipient. It is by no means a new phenomenon: there are indications that even the ancient Egyptians used it. Snowden, however, accentuated the desirability of digital encryption in the information age, and this triggered an important shift: companies became aware of the added value of privacy-enhancing technologies and commercialized encryption on a large scale. Just as organic labels steer consumers in the supermarket, promises of personal data protection have become an increasingly important asset in the sale of electronics.
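The core idea above — that a message becomes unreadable without a shared secret — can be sketched with a deliberately simple symmetric cipher. This is a toy repeating-key XOR, chosen for illustration only; it is trivially breakable and not how real systems such as WhatsApp's 256-bit encryption work:

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher: XOR each byte with the repeating key.
    Illustrative only -- repeating-key XOR is trivially breakable
    and must never be used to protect real data."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)                     # shared secret key
message = b"meet me at noon"
ciphertext = xor_cipher(message, key)    # unreadable without the key
plaintext = xor_cipher(ciphertext, key)  # XOR is its own inverse

assert plaintext == message
assert ciphertext != message
```

The point of the sketch is the asymmetry of knowledge: anyone holding `key` recovers the message instantly, while an outsider sees only noise — which is exactly why both consumers and criminals value the technology.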

The fact that privacy-enhancing technologies have become so commercially attractive is a first crack in the often-heard story that Apple's refusal to break its own code at the FBI's request rests merely on moral grounds. There is a second advantage for the multinational tech company: if it cannot access the data itself, it will no longer be flooded with law enforcement requests to disclose that information. These realizations cast a new light on the ethical stance of telecom corporations that advocate privacy through encryption: they have an actual economic interest in not being able to access this private data. Still, whatever the motive, as long as the effect is an increase in privacy, this ought to be good. Right?

But encryption cuts both ways: it offers protection to the law-abiding citizen in a world where gathering personal information, constructing profiles and identifying individuals is easier than ever, but it also offers criminals a way to hide their activities. It becomes increasingly difficult for law enforcement to investigate drug trade, human trafficking and terrorism when they occur on the so-called "dark web". These two sides of the same coin force us to make a difficult trade-off between security and effective law enforcement on the one hand, and the right to privacy and freedom of expression on the other.


Article 17 of the International Covenant on Civil and Political Rights protects the right to privacy, but also implicitly states that this right is not absolute:

“No one shall be subjected to arbitrary or unlawful interference with his privacy, family, home or correspondence, nor to unlawful attacks on his honour and reputation. […]”

As long as a limitation of the right is lawful and not arbitrary, it is allowed. More specifically, this means that restrictions on encryption should be provided for by a precise, public and transparent law that protects specified interests, and should be necessary[4] (not merely desirable or useful) to protect those interests.[5] Nonetheless, examples of disproportionate limitations of the right to use encryption abound. Clear cases can be found in Cuba and China, but there are also examples in more democratic countries, such as India. In April, when WhatsApp activated encryption within its application, questions were raised over the legality of its 256-bit encryption key, since India has a law that only allows the use of up to 40-bit encryption.[6] Rules like these are arguably in violation of article 17 of the ICCPR, since they indiscriminately prohibit the use of encryption by individuals and can hardly be considered necessary measures.
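To see why the gap between a 40-bit and a 256-bit key matters, some back-of-the-envelope arithmetic helps. The guess rate below is an assumption chosen for illustration; real attack speeds vary enormously, but the orders of magnitude tell the story:

```python
# Brute-force keyspace comparison for the key lengths mentioned above.
keys_40 = 2 ** 40    # ~1.1 trillion possible keys
keys_256 = 2 ** 256  # ~1.2e77 possible keys

# Assume (generously, for illustration) one billion guesses per second.
rate = 10 ** 9
seconds_40 = keys_40 / rate                          # roughly 18 minutes
years_256 = keys_256 / rate / (60 * 60 * 24 * 365)   # astronomically long

print(f"40-bit keyspace exhausted in ~{seconds_40 / 60:.0f} minutes")
print(f"256-bit keyspace needs ~{years_256:.2e} years")
```

A 40-bit key can thus be exhausted in minutes on modest hardware, while a 256-bit key is beyond brute force for any conceivable computer — which is why a legal cap at 40 bits effectively amounts to a ban on meaningful encryption.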

Another example of disproportionate restriction is so-called back-door access in certain products, such as cell phones. Many politicians around the world plead for these pre-installed weaknesses, which allow authorities to read encrypted information. However, these deliberate flaws become a threat to all users when unauthorized parties gain access to the back-doors. Furthermore, there is no proof that encryption poses a problem that authorities cannot overcome. Aside from the fact that law enforcement has many other options to gather information, a U.S. study in 2014 found that only 0.1% of criminal wiretaps were encrypted.[7] In other words, jeopardizing the safety and privacy of all users by creating a back-door to access these few encrypted wiretaps fails to meet the criterion of necessity.

Individuals should be able to use encryption technology, and should have the right to remain anonymous to protect their fundamental right to privacy and freedom of expression. This responsibility lies mainly with the government, but corporate actors play a similar role in developing stronger encryption and ensuring anonymity. As the conclusion of the report of Special Rapporteur David Kaye states, restrictions on these rights should be adopted only on a case-specific basis.[8] Even with international terrorism and (digitally) organized crime on the rise, international human rights law must prevail.

 Arthur Goemans

[1] Regulation (EU) 2016/679

[2] Directive (EU) 2016/680

[3] Although neither the Regulation nor the Directive applies to the intelligence services targeted by the Snowden revelations.

[4] Human Rights Committee, general comment No. 34 (2011)

[5] The Sunday Times v. United Kingdom, ECHR, 1979

[6] This is a legally grey area: the rules generally apply to ISPs, which WhatsApp is not, so their applicability is uncertain.

[7] Wiretap Report 2014, United States Courts.

[8] Report of the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, David Kaye