Privacy

The right to privacy is crucial for our personal security, free speech and democratic participation. It is a fundamental right in the primary law of the European Union and is recognised in numerous international legal instruments. Digital technologies have generated a new environment of potential benefits and threats to this fundamental right. As a result, defending our right to privacy is at the centre of EDRi's priorities.

06 Nov 2019

Danish data retention: Back to normal after major crisis

By IT-Pol

The Danish police and the Ministry of Justice consider access to electronic communications data to be a crucial tool for investigation and prosecution of criminal offences. Legal requirements for blanket data retention, which originally transposed the EU Data Retention Directive, are still in place in Denmark, despite the judgments from the Court of Justice of the European Union (CJEU) in 2014 and 2016 that declared general and indiscriminate data retention illegal under EU law.

In March 2017, in the aftermath of the Tele2 judgment, the Danish Minister of Justice informed the Parliament that it was necessary to amend the Danish data retention law. However, when it comes to illegal data retention, the political willingness to uphold the rule of law seems to be low – every year the revision is postponed by the Danish government with consent from Parliament, citing various formal excuses. Currently, the Danish government is officially hoping that the CJEU will revise the jurisprudence of the Tele2 judgment in the new data retention cases from Belgium, France and the United Kingdom which are expected to be decided in May 2020. This latest postponement, announced on 1 October 2019, barely caught any media attention.

However, data retention has been almost constantly in the news for other reasons since 17 June 2019, when it was revealed to the public that flawed electronic communications data had been used as evidence in up to 10,000 police investigations and criminal trials since 2012. Quickly dubbed the "telecommunications data scandal" by the media, the case has revealed severely inadequate data management practices by the Danish police over almost ten years. This is obviously very concerning for the functioning of the criminal justice system and the right to a fair trial, but also rather surprising in light of the consistent official position of the Danish police that access to telecommunications data is a crucial tool for the investigation of criminal offences. The mismatch between the public claims that access to telecommunications data is crucial, and the attention devoted to proper data management, could hardly be any bigger.

According to the initial reports in June 2019, the flawed data was caused by an IT system used by the Danish police to convert telecommunications data from different mobile service providers to a common format. Apparently, the IT system sometimes discarded parts of the data received from mobile service providers. During the summer of 2019, a new source of error was identified. In some cases, the data conversion system had modified the geolocation position of mobile towers by up to 200 metres.

Based on the new information about involuntary evidence tampering, the Director of Public Prosecutions decided on 18 August 2019 to impose a temporary two-month ban on the use of telecommunications data as evidence in criminal trials and pre-trial detention cases. Somewhat inconsistently, the police could still use the potentially flawed data for investigative purposes. Since telecommunications data are frequently used in criminal trials in Denmark, for example as evidence that the indicted person was in the vicinity of the crime scene, the two-month moratorium caused a number of criminal trials to be postponed. Furthermore, about 30 persons were released from pre-trial detention, which generated media attention even outside Denmark.

In late August 2019, the Danish National Police commissioned the consultancy firm Deloitte to conduct an external investigation of its handling of telecommunications data and to provide recommendations for improving the data management practices. The report from Deloitte was published on 3 October 2019, together with statements from the Danish National Police, the Director of Public Prosecutions, and the Ministry of Justice.

The first part of the report identifies the main technical and organisational causes of the flawed data. The IT system used for converting telecommunications data to a common format contained a timer which sometimes submitted the converted data to the police investigator before the conversion job was completed. This explains, at least at a technical level, why parts of the data received from mobile service providers were sometimes discarded. The timer error mainly affected large data sets, such as mobile tower dumps (information about all mobile devices in a certain geographical area and time period) and access to historical location data for individual subscribers.
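
The minimal sketch below illustrates the kind of race condition described in the report: a fixed timer hands over "converted" data while the conversion job is still running, silently truncating large data sets. All names and values are invented for illustration; this is not the police system's code.

```python
import threading
import time

def convert_records(raw_records, output):
    """Stand-in for the real format conversion, processing one record at a time."""
    for record in raw_records:
        time.sleep(0.01)                 # conversion takes time per record
        output.append(record.strip())

raw = [" call-1 ", " call-2 ", " call-3 "]
converted = []

worker = threading.Thread(target=convert_records, args=(raw, converted))
worker.start()

# Flawed design: deliver the result after a fixed delay instead of waiting
# for the conversion job to finish - large data sets come out truncated.
threading.Timer(0.015, lambda: print("delivered:", list(converted))).start()

worker.join()                            # robust approach: wait for completion
print("complete :", converted)
```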

The flaws in the geolocation information for mobile towers that triggered the August moratorium were traced to errors in the conversion of geographical coordinates. Mobile service providers in Denmark use two different systems for geographical coordinates, and the police use a third system internally. During a short period in 2016, the conversion algorithm was applied twice to some mobile tower data, which moved the geolocation positions by a couple of hundred metres.
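
A simplified sketch of that kind of flaw is shown below. The offset and the coordinates are invented for illustration only; real transformations between the coordinate systems used in Denmark are of course more complex.

```python
def provider_to_police(lat, lon):
    """Hypothetical constant shift between a provider's coordinate system and
    the one used internally by the police (values invented for illustration)."""
    return lat + 0.0013, lon + 0.0021    # roughly 100-200 m at Danish latitudes

tower = (55.6761, 12.5683)               # example mobile tower position

once = provider_to_police(*tower)        # correct: conversion applied a single time
twice = provider_to_police(*once)        # the 2016 flaw: conversion applied twice

print(once)    # position in the police's coordinate system
print(twice)   # displaced by a couple of hundred metres
```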

On the face of it, these errors in the IT system should be relatively straightforward to correct, but the Deloitte report also identifies more fundamental deficiencies in the police's practices for handling telecommunications data. In short, the report describes the IT systems and the associated IT infrastructure as complex, outdated, and difficult to maintain. The IT system used for converting telecommunications data was developed internally by the police and maintained by a single employee. Before December 2018, there were no administrative practices for quality control of the data conversion system, not even simple checks to ensure that the entire data set received from mobile service providers had been properly converted.
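
Even a very basic completeness check would have caught the truncation errors. The sketch below is a minimal example of such a check, assuming a simple CSV export with a header row; the file layout and names are assumptions for illustration only.

```python
import csv

def record_count(path):
    """Count data rows in a CSV export (excluding the header row)."""
    with open(path, newline="") as f:
        return sum(1 for _ in csv.reader(f)) - 1

def check_conversion(raw_path, converted_path):
    """Fail loudly if the converted file contains fewer records than the raw one."""
    raw, converted = record_count(raw_path), record_count(converted_path)
    if converted != raw:
        raise RuntimeError(f"conversion incomplete: {converted} of {raw} records")

# Example use (file names are hypothetical):
# check_conversion("provider_export.csv", "converted_for_investigator.csv")
```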

The only viable solution for the Danish police, according to the assessment in the report, is to develop an entirely new infrastructure for handling telecommunications data. Deloitte recommends that the new infrastructure should be based on standard software elements which are accepted globally, rather than internally developed systems which cannot be verified. Concretely, the report suggests using POL-INTEL, a big data policing system supplied by Palantir Technologies, for the new IT infrastructure. In the short term, some investment in the existing infrastructure will be necessary in order to improve the stability of the legacy IT systems and reduce the risk of creating new data flaws. Finally, the report recommends systematic independent quality control and data validation by an external vendor. The Danish National Police has accepted all recommendations in the report.

Deloitte also delivered a short briefing note about the use of telecommunications data in criminal cases. The briefing note, intended for police investigators, prosecutors, defence lawyers and judges, explains the basic use cases of telecommunications data in police investigations, as well as information about how the data is generated in mobile networks. The possible uncertainties and limitations of telecommunications data are also mentioned. For example, it is pointed out that mobile devices do not necessarily connect to the nearest mobile tower, so it cannot simply be assumed that the user of the device is close to the mobile tower with almost “GPS level” accuracy. This addresses a frequent critique against the police and prosecutors for overstating the accuracy of mobile location data – an issue that was covered in depth by the newspaper Information in a series of articles in 2015. Quite interestingly, the briefing note also mentions the possibility of spoofing telephone numbers, so that the incoming telephone call or text message may originate from a different source than the telephone number registered by the mobile service provider under its data retention obligation.

On 16 October 2019, the Director of Public Prosecutions decided not to extend the moratorium on the use of telecommunications data. Along with this decision, the Director issued new and more specific instructions for prosecutors regarding the use of telecommunications data. The Deloitte briefing note should be part of the criminal case (and distributed to the defence lawyer), and police investigators are required to present a quality control report to prosecutors with an assessment of possible sources of error and uncertainty in the interpretation of the telecommunications data used in the case. Documentation of telecommunications data evidence should, to the extent possible, be based on the raw data received from mobile service providers and not the converted data.

For law enforcement, the 16 October decision marks the end of the data retention crisis which erupted in public four months earlier. However, only the most immediate problems at the technical level have really been addressed, and several of the underlying causes of the crisis still lurk beneath the surface, for example the severely inadequate IT infrastructure used by the Danish police for handling telecommunications data. The Minister of Justice has announced further initiatives, including investment in new IT systems, organisational changes to improve the focus on data management, improved training for police investigators in the proper use and interpretation of telecommunications data, and the creation of a new independent supervisory authority for technical investigation methods used by the police.

Denmark: Our data retention law is illegal, but we keep it for now (08.03.2017)
https://edri.org/denmark-our-data-retention-law-is-illegal-but-we-keep-it-for-now/

Denmark frees 32 inmates over flaws in phone geolocation evidence, The Guardian (12.09.2019)
https://www.theguardian.com/world/2019/sep/12/denmark-frees-32-inmates-over-flawed-geolocation-revelations

Response from the Minister of Justice to the reports on telecommunications data (in Danish only, 03.10.2019)
http://www.justitsministeriet.dk/nyt-og-presse/pressemeddelelser/2019/justitsministerens-reaktion-paa-teledata-redegoerelser

Can cell tower data be trusted as evidence? Blog post by the journalist covering telecommunications data for the newspaper Information (26.09.2015)
https://andreas-rasmussen.dk/2015/09/26/can-cell-tower-data-be-trusted-as-evidence/

(Contribution by Jesper Lund, EDRi member IT-pol, Denmark)

25 Sep 2019

PNR complaint advances to the Austrian Federal Administrative Court

By Epicenter.works

On 19 August 2019, Austrian EDRi member epicenter.works lodged a complaint with the Austrian data protection authority (DPA) against the Passenger Name Records (PNR) Directive. After only three weeks, on 6 September, they received the response from the DPA: The complaint was rejected. That sounds negative at first, but is actually good news. The complaint can and must now be lodged with the Federal Administrative Court.

Why was the complaint rejected?

The DPA has no authority to decide whether or not laws are constitutional. Moreover, it cannot refer the matter to the Court of Justice of the European Union (CJEU), which is necessary in this case because the complaint concerns an EU Directive. It was therefore clear from the outset that the DPA would reject the complaint, but this was a necessary step that could not be skipped, as there is no other legal route to the Federal Administrative Court than via the DPA. The speed of the decision was nonetheless a positive surprise. All seven proceedings of the complainants lodged with the aid of epicenter.works were merged, and the organisation was given the power of representation. This means that epicenter.works is allowed to represent the complainants.

What are the next steps?

Meanwhile, epicenter.works is still waiting for a reply to a freedom of information (FOI) request it sent to the Passenger Information Unit (PIU) that processes the PNR data in Austria. While an answer to one request was received within a few days, another has been overdue since 23 August. The unanswered request concerns the data protection framework conditions for the PNR implementation.

epicenter.works will file the complaint with the Federal Administrative Court within four weeks. It is to be expected that the court will submit legal questions to the Court of Justice of the European Union (CJEU).

Epicenter.works
https://en.epicenter.works/

Passenger Name Records
https://en.epicenter.works/thema/pnr-0

Passenger surveillance brought before courts in Germany and Austria (22.05.2019)
https://edri.org/passenger-surveillance-brought-before-courts-in-germany-and-austria/

PNR: EU Court rules that draft EU/Canada air passenger data deal is unacceptable (26.07.2017)
https://edri.org/pnr-eu-court-rules-draft-eu-canada-air-passenger-data-deal-is-unacceptable/

(Contribution by Iwona Laub, EDRi member Epicenter.works, Austria)

25 Sep 2019

Portugal: Data retention complaint reaches the Constitutional Court

By Guest author

September 2019 brought us long-awaited developments regarding the situation of data retention in Portugal. The Justice Ombudsman decided to send the Portuguese data retention law to the Constitutional Court, following the Court of Justice of the European Union's (CJEU's) case law on blanket data retention that led to the invalidation of Directive 2006/24/EC. This decision comes after a complaint presented by EDRi observer Associação D3 – Defesa dos Direitos Digitais in December 2017.

The Ombudsman had first decided to issue an official recommendation to the government, urging it to propose a legislative solution for the problematic law that originated from the now invalidated Data Retention Directive. Faced with a refusal from the Minister of Justice to find a solution through legislative means, the Ombudsman has now decided to grant D3's original request and has referred the matter to the Constitutional Court, which will have to rule on the constitutionality of the Portuguese data retention scheme.

A few days later, the same Constitutional Court partially struck down, for the second time, a law that granted the intelligence services access to retained data. In 2015, the Constitutional Court had already declared the unconstitutionality of a similar law, after the president had requested a preventive ruling by the Court before signing it into law. However, in 2017, a new law that addressed some of the problems raised by the Constitutional Court was approved in the Parliament. As the new president opted not to request a preventive decision, the law came into force. 35 Members of Parliament (MPs) from three parties then requested a Constitutional Court ruling on the law, which has now been issued.

The fundamental reasoning of this decision is that the Portuguese Constitution forbids public authorities from accessing citizens' correspondence and telecommunications, except in the context of a criminal procedure. Given that the intelligence services have no criminal procedure competences, they cannot access such data within the existing constitutional framework. However, the Court did allow access to user location and identification data (in the context of the fight against terrorism and highly organised crime), as such data was not considered to be covered by the secrecy of communications.

This case has also led to the resignation of the original judge rapporteur, due to disagreements related to the reasoning reflected in the final version of the text of the decision.

Associação D3 – Defesa dos Direitos Digitais
https://www.direitosdigitais.pt/

Portugal: Data retention sent to the Constitutional Court (07.03.2018)
https://edri.org/portugal-data-retention-constitutional-court/

European Court overturns EU mass surveillance law (08.04.2014)
https://edri.org/european-court-overturns-eu-mass-surveillance-law/

(Contribution by Eduardo Santos, Associação D3 – Defesa dos Direitos Digitais, Portugal)


23 Sep 2019

Your mail, their ads. Your rights?

By Andreea Belu
  • In the digital space, “postal services” often snoop into your online conversations in order to market services or products according to what they find out from your chats.
  • A law meant to limit this exploitative practice is being stalled by the Council of the European Union.

We all expect our mail to be safe in the hands of a mailman. We trust that neither the post office nor the mailmen working there will sneak a peek into our written correspondence. Nor do we expect mailmen to act like door-to-door salespersons.

When we say "postal services" snoop, it is important to understand that this refers both to traditional email services such as Yahoo and to instant messaging apps like WhatsApp. While targeted ads are no longer popular among email providers, the practice is gaining momentum in instant messaging after Facebook's CEO announced plans to introduce ads in WhatsApp's Status feature.

Not just shoe ads

You might think: ”Well, what’s the harm in having shoes advertised after they’ve read the shopping chats between my friend and me?”. Short answer: it’s not just shoes.

Often, targeted ads are the result of you being profiled according to your age, location, gender, sexual orientation, political views or ethnicity. You will receive job ads based on your gender, or housing ads based on your ethnicity. Sometimes, you may be targeted because you feel anxious or worthless. Are you sure all of this will benefit you? What is more, your online mailman might be required to read all of your mail, just in case you get in trouble with the law in the future. We call this mass data retention.


The need for encrypted mail in storage *and* in transit

The WhatsApp case is a good example. Currently, WhatsApp seals the message right after you press "send". The message goes to WhatsApp's servers, is stored encrypted, and then sent to its recipient, also encrypted. This means that, technically, the mail is encrypted both in storage and in transit, and nobody can read its content. However, as Forbes points out, future ads plans might modify WhatsApp's encryption so that it would "first identify key words in sentences, like 'fishing' or 'birthday,' and send them to Facebook's servers to be processed for advertising, while separately sending the encrypted message".
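
As a rough illustration of the difference, the sketch below (using the PyNaCl library) encrypts a message end to end and then shows the kind of client-side keyword extraction described by Forbes. It is a hypothetical sketch of the concern, not a description of how WhatsApp actually works or plans to work.

```python
from nacl.secret import SecretBox
from nacl.utils import random

key = random(SecretBox.KEY_SIZE)         # shared secret between the two users
box = SecretBox(key)

message = b"Let's go fishing for your birthday"

# End-to-end model: the provider only ever relays this opaque ciphertext.
ciphertext = box.encrypt(message)

# Ad-targeting model sketched by Forbes: key words are picked out *before*
# encryption and sent to the provider in the clear, next to the ciphertext.
ad_keywords = {"fishing", "birthday"}
extracted = [w for w in message.decode().lower().split() if w in ad_keywords]

print(extracted)        # ['fishing', 'birthday'] - visible to the provider
print(len(ciphertext))  # the message content itself remains encrypted
```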

There’s a law for it, but it’s stalled by the EU Council

The ePrivacy Regulation, which is currently under negotiation, is aimed at ensuring privacy and confidentiality of our electronic communications, by complementing and particularising the rules introduced by the General Data Protection Regulation (GDPR). The EU Parliament adopted a strong position on ePrivacy that would ensure your online messages are protected both in storage and in transit (Art. 5), that would consider "consent" as the only legal basis for processing data (Art. 6), that would make privacy-by-design and privacy-by-default core principles in software design (Art. 10), and that would protect encryption from measures aimed at undermining it (Art. 17). However, the Council of the European Union is yielding to big tech lobby pressure and has drafted a position that threatens our rights and freedoms. Moreover, the text adopted by the EU Parliament in October 2017 has been stuck in the EU Council, behind closed-door negotiations, for almost two years. We have sent several letters (here, here and here) calling for the safeguarding of our communications and for the adoption of this much-needed ePrivacy Regulation.

Will our voices be heard? If you are worried about being targeted based on your private conversations, join our efforts and stay tuned for more updates coming soon.


Read more:

Your family is none of their business (23.07.2019)
https://edri.org/your-family-is-none-of-their-business/

Real-time bidding: The auction for your attention (4.07.2019)
https://edri.org/real-time-bidding-the-auction-for-your-attention/

e-Privacy Directive: Frequently Asked Questions
https://edri.org/epd-faq/

e-Privacy: What happened and what happens next (29.11.2017)
https://edri.org/e-privacy-what-happened-and-what-happens-next/

e-Privacy Mythbusting (25.10.2017)
edri.org/files/eprivacy/ePrivacy_mythbusting.pdf

23 Jul 2019

Civil society calls for a proper assessment of data retention

By Diego Naranjo

In preparation for a possible proposal for new legislation, the European Commission is conducting informal dialogues with different stakeholders to explore the possibilities of data retention legislation that complies with the rulings of the Court of Justice of the European Union (CJEU) and the European Court of Human Rights (ECtHR). As part of these dialogues, EDRi previously met with the Commission Directorate-General for Migration and Home Affairs (DG HOME) on 6 June 2019.

On 22 July 2019, 30 civil society organisations sent an open letter to the European Commission President-elect Ursula von der Leyen and Commissioners Avramopoulos, Jourová and King, urging the European Commission to commission an independent assessment of the necessity and proportionality of existing and potential legislative measures around data retention. Furthermore, the signatories asked the Commission and the Council to ensure that the debate around data retention does not prevent the ePrivacy Regulation from being adopted swiftly.

You can read the letter here, and below:

22 July 2019

By email:
President-elect von der Leyen
First Vice-President Timmermans

CC:
Commissioner Avramopoulos
Commissioner Jourová
Commissioner King

Dear First Vice-President Timmermans,
Dear President-elect von der Leyen,

The undersigned organisations represent non-governmental organisations working to protect and promote human rights in digital and connected spaces. We are writing to put forward suggestions to ensure compliance with the EU Charter of Fundamental Rights and the CJEU case law on data retention.

EU Member States (and EEA countries) have had different degrees of implementation of the CJEU ruling of 8 April 2014 invalidating the Data Retention Directive. EDRi's 2015 study reported that six Member States [1] have kept data retention laws which contained features that are similar or identical to those that were ruled to be contrary to the EU Charter. Other evidence pointed in the same direction. [2] While personal data of millions of Europeans were being stored illegally, the European Commission had not launched any infringement procedures. On 21 December 2016, the CJEU delivered its judgment in the Tele2/Watson case regarding data retention in Member States' national law. In the aftermath of this judgment, the Council Legal Service unambiguously concluded that "a general and indiscriminate retention obligation for crime prevention and other security reasons would no more be possible at national level than it is at EU level, since it would violate just as much the fundamental requirements as demonstrated by the Court's insistence in two judgments delivered in Grand Chamber." [3]

On 6 June 2019 the Council adopted “conclusions on the way forward with regard to the retention of electronic communication data for the purpose of fighting crime” which claim that “data retention is an essential tool for investigating serious crime efficiently”. The Council tasked the Commission to “gather further information and organise targeted consultations as part of a comprehensive study on possible solutions for retaining data, including the consideration of a future legislative initiative.”

While the concept of blanket data retention appeals to law enforcement agencies, it has never been shown that the indiscriminate retention of traffic and location data of over 500 million Europeans was necessary, proportionate or even effective.

Blanket data retention is an invasive surveillance measure of the entire population. This can entail the collection of sensitive information about social contacts (including business contacts), movements and private lives (e.g. contacts with physicians, lawyers, workers councils, psychologists, helplines, etc.) of hundreds of millions of Europeans, in the absence of any suspicion. Telecommunications data retention undermines professional confidentiality and deters citizens from making confidential communications via electronic communication networks. The retained data is also of high interest for criminal organisations and unauthorised state actors from all over the world. Several successful data breaches have been documented. [4] Blanket data retention also undermines the protection of journalistic sources and thus compromises the freedom of the press. Overall, it damages preconditions of open and democratic societies.

The undersigned organisations have therefore been in constructive dialogue with the European Commission services to ensure that the way forward includes the following suggestions:

  • The European Commission commissions an independent, scientific study on the necessity and proportionality of existing and potential legislative measures around data retention, including a human rights impact assessment and a comparison of crime clearance rates;
  • The European Commission and the Council ensure that the debate around data retention does not prevent the ePrivacy Regulation from being adopted swiftly;
  • The European Commission tasks the EU Fundamental Rights Agency (FRA) to prepare a comprehensive study on all existing data retention legislation and their compliance with the Charter and the CJEU/European Court of Human Rights case law on this matter;
  • The European Commission considers launching infringement procedures against Member States that enforce illegal data retention laws.

We look forward to your response and remain at your disposal to support the necessary initiatives to uphold EU law in this policy area.

Signatories:

European Digital Rights (EDRi)
Access Now
Chaos Computer Club (CCC)
Bits of Freedom
Asociatia pentru Tehnologie si Internet (ApTI)
Epicenter.works
Electronic Frontier Norway (EFN)
Dataskydd.net
Digital Rights Ireland
Digitalcourage
Privacy International
Vrijschrift
FITUG e.V.
Hermes Center for Transparency and Digital Human Rights
Access Info
Aktion Freiheit statt Angst
Homo Digitalis
Electronic Privacy Information Center (EPIC)
Iuridicum Remedium (IuRe)
La Quadrature du Net
Associação D3 – Defesa dos Direitos Digitais
IT-Political Association of Denmark (IT-Pol)
Panoptykon Foundation
Open Rights Group (ORG)
Electronic Frontier Finland (Effi ry)
Državljan D
Deutsche Vereinigung für Datenschutz e. V. (DVD)
//datenschutzraum
Föreningen för Digitala Fri- och Rättigheter (:DFRI)
AK Vorrat


[1] https://edri.org/edri-asks-european-commission-investigate-illegal-data-retention-laws/
[2] See, for example, Privacy International, 2017, National Data Retention Laws since Tele-2/Watson Judgment: https://www.privacyinternational.org/sites/default/files/2017-12/Data%20Retention_2017.pdf
[3] Council document 5884/17, paragraph 13
[4] A recent example can be found here: https://techcrunch.com/2019/06/24/hackers-cell-networks-call-records-theft/

07 Jun 2019

Data Retention: EU Commission inconclusive about potential new legislation

By Diego Naranjo

On 6 June 2019, representatives from eight civil society organisations (including EDRi members) met with officials from the European Commission (EC) Directorate-General for Migration and Home Affairs (DG HOME) to discuss data retention. This meeting, according to the EC officials, was just one in a series of meetings that DG HOME is holding with different stakeholders to discuss potential data retention initiatives that could be put forward (or not) by the next Commission. The meeting is not connected to the Council conclusions on data retention, also published on 6 June, which coincidentally task the Commission with conducting a study "on possible solutions for retaining data, including the consideration of a future legislative initiative".

Ahead of the meeting, civil society was sent a set of questions about the impact of existing and potentially new data retention legislation on individuals, how a "legal" targeted data retention could be designed, and which specific issues (data retention periods, geographical restrictions, and so on) could be included if new data retention legislation were to be proposed.

According to the Commission, there are no clear "next stages" in the process, apart from the aforementioned study that will have to be prepared following the Council conclusions on data retention published on 6 June. In addition to this study, the Commission will continue dialogues with civil society, data protection authorities, the EU Fundamental Rights Agency and Member States, which will inform potential future action (or inaction) by the EC on data retention.

Four years ago, EDRi met with DG HOME and presented a study of a set of data retention laws which were likely to be considered illegal in light of the Digital Rights Ireland case. The EC then responded to our meeting and study, saying that it would "monitor" existing data retention laws and their compliance with EU law. Four years later, no infringement proceedings have been launched against any Member State over their (quite probably) illegal data retention laws.

Read more:

EU Member States willing to retain illegal data retention (16.09.2019)
https://edri.org/eu-member-states-willing-to-retain-illegal-data-retention/

Data retention – Conclusions on retention of data for the purpose of fighting crime (27.05.2019)
http://data.consilium.europa.eu/doc/document/ST-9663-2019-INIT/en/pdf

EU Member States plan to ignore EU Court data retention rulings (29.11.2017)
https://edri.org/eu-member-states-plan-to-ignore-eu-court-data-retention-rulings/

(Contribution by Diego Naranjo, EDRi)

05 Jun 2019

Czech Constitutional Court rejects complaint on data retention

By Iuridicum Remedium

Czech EDRi member Iuridicum Remedium (IuRe) has fought for 14 years against the Czech implementation of the controversial EU Data Retention Directive, which was declared invalid by the Court of Justice of the European Union (CJEU). After years of campaigning and many hard legislative battles, the fight has come to an end: on 22 May 2019, the Czech Constitutional Court rejected IuRe's proposal to declare the Czech data retention law unconstitutional, despite the complaint being supported by 58 members of parliament from across the political spectrum.

In the Czech Republic, data retention legislation was first adopted in 2005. In March 2011, the Constitutional Court upheld IuRe's first complaint against the original data retention legislation and annulled it. In 2012, however, a new legal framework was adopted to implement the EU Data Retention Directive – which the CJEU found to contravene European law in the Digital Rights Ireland case in 2014 – and to comply with the Constitutional Court's decision. This new legislation still contained the problematic general and indiscriminate data retention, along with a number of other problems. Therefore, even in the light of the CJEU's decisions, IuRe decided to prepare a new constitutional complaint.

IuRe's complaint challenged the very principle of bulk data retention: the massive collection and storage of people's data without any link to individual suspicion of criminal activity, extraordinary events, or terrorist threats. The CJEU has already declared this general and indiscriminate data retention inadmissible in two of its decisions (Digital Rights Ireland and Tele2). Although the Czech Constitutional Court refers to both judgments several times, its decision – especially when it comes to analysing why data retention might not be in line with the Czech Constitution – does not engage with them properly.

The Constitutional Court's main argument to declare data retention constitutional is that as communications increasingly occur in the digital domain, so does crime. Even though this may be true, it is regrettable that the Constitutional Court did not develop this reasoning further and explain why it is in itself a basis for bulk data retention. The Court also ignored the fact that greater use of electronic communication also implies greater interference with privacy resulting from general data retention.

The Court further argued that personal data, even without an obligation to retain them, are kept in any case for other purposes, such as invoicing for services, responding to claims and behavioural advertising. In the Court's opinion, the fact that people give operators their "consent" to process their personal data reinforces the argument that data retention is legal and acceptable. Unfortunately, the Constitutional Court does not take into consideration that the volume, retention period and sensitivity of personal data held by operators for other purposes are quite different from those of the obligatory data retention prescribed by the Czech data retention law. Furthermore, the fact that operators already need to keep some data (for billing purposes, for example) shows that the police would not be completely left in the dark without a legal obligation to store data.

In addition to the proportionality of data retention, which has not been clarified by the Court, another issue is how "effective" data retention is at reducing crime. Statistics from 2010 to 2014 show that there was no significant increase in crime or reduction in crime detection in the Czech Republic after the Constitutional Court abolished the obligation to retain data in 2011. Police statistics presented to the Court show that data retention is not helping to combat crime in general, nor facilitating the investigation of serious crimes (such as murder) or other types of crime (such as fraud or hacking). In arguments submitted by police representatives and by the Ministry of the Interior, some examples of individual cases where the stored data helped (or where missing data hampered an investigation) were repeatedly mentioned. However, no evidence presented to the Court proved that general and indiscriminate data retention would improve the ability of the police to investigate crimes.

The Court also did not annul the partially problematic parts of the legislation, such as the data retention period (six months), the volume of data to be retained, or the overly broad range of criminal cases in which data may be requested. Furthermore, the Court has not remedied the provisions of the Police Act that allow data to be requested without court authorisation in cases of searches for wanted or missing persons or the fight against terrorism.

In its decision, the Constitutional Court acknowledges that the stored data are very sensitive and that in some cases the sensitivity of so-called "metadata" may even be greater than that of the content of the communications. Thus, the retention of communications data represents a significant threat to individuals' privacy. Despite all of this, the Court dismissed IuRe's claim to declare the data retention law unconstitutional.

IuRe disagrees with the outcome of this procedure, in which the Court has concluded that the existing Czech data retention legislation conforms with the Constitution. Considering the wide support for the complaint, IuRe will work on getting at least part of the existing arrangements changed through legislative amendments. In addition to this, we will consider the possibility of the EC launching infringement proceedings or of initiating other judicial cases, since we strongly believe that the existing bulk retention of communications data in Czech law still contravenes the aforementioned CJEU decisions on mass data retention.

Czech constitutional decision (only in Czech)
https://www.usoud.cz/fileadmin/user_upload/Tiskova_mluvci/Publikovane_nalezy/2019/Pl._US_45_17_vcetne_disentu.pdf

Proposal to revoke data retention filed with the Czech Court (10.01.2018)
https://edri.org/proposal-to-revoke-data-retention-filed-with-the-czech-court/

(Contribution by Jan Vobořil, EDRi member Iuridicum Remedium, Czech Republic)

22 May 2019

ePrivacy: Private data retention through the back door

By Digitalcourage

Blanket data retention has been prohibited in several court decisions by the European Court of Justice (ECJ) and the German Federal Constitutional Court (BVerfG). In spite of this, some of the EU Member States want to reintroduce it for the use by law enforcement authorities – through a back door in the ePrivacy Regulation.

The ePrivacy Regulation

The ePrivacy Regulation, which is currently under negotiation, is aimed at ensuring privacy and confidentiality of electronic communications, by complementing and particularising the matters covered in the General Data Protection Regulation (GDPR). Confidentiality of communications is currently covered by the ePrivacy Directive, dating back to 2002. A review of this piece of legislation is long overdue, but Member States keep delaying the process, thereby failing to update the necessary protections for online privacy in the EU.

Ever since 2017, the EU Ministers of Justice and Interior have been "deliberating" on the Tele2 judgment by the European Court of Justice. The Court had declared the blanket retention of telecommunications metadata inadmissible. Yet the EU Member States are unwilling to accept this ruling. During an informal discussion in Valletta on 26 and 27 January 2017, the Justice and Interior Ministers expressed their wish for "a common reflection process at EU level on data retention in light of the recent judgments of the Court of Justice of the European Union" (Ref. EU Council 6713/17) to implement EU-wide data retention. This process was set in motion in March 2019 by the Presidency of the Council of the European Union. A sub-group of the Council's Working Party on Information Exchange and Data Protection (DAPIX) was put in charge. From the very beginning, this reflection process has mainly served the purpose of finding opportunities to implement yet another instance of data retention at the EU level. This has been proven by documents published by EDRi member Statewatch.

Instead of complying with the clear rulings by the European Court of Justice (Tele2 and Digital Rights Ireland), the responsible ministers are doing everything they can to "resurrect" data retention, potentially using ePrivacy as a basis for a new era of data retention. In a working document (WK 11127/17), the 2017 Presidency of the EU Council concluded that, in addition to specific data retention legislation, it would be desirable for the ePrivacy Regulation to allow citizens' communications data (metadata) to be retained so that companies can use it for commercial purposes. The logic behind this is, probably, to circumvent CJEU case law by not imposing a retention obligation on companies, while still having the data available when law enforcement needs it, thanks to ePrivacy.

Private data retention

In plain words, this means: If the courts will not allow mass data retention, service providers will simply be given incentives to do so by their own choice. That is why the ePrivacy Regulation is being watered down by Member States in order to give the service providers manifold permissions to store data for a wide variety of reasons (see Article 6 of the draft ePrivacy Regulation). Those responsible are relying on the assumption that the providers’ appetite for data will be sufficient even without an explicit obligation to retain data.

The immediate problem with this type of private data retention is the fact that it weakens the protection of all users’ personal data against data hungry corporations whose main interest is making profit. What’s even worse is that, once again, a governmental function is being outsourced to private corporations. These corporations are not subject to democratic scrutiny, and they are given ever more power over the countries concerned.

In Germany, the hurdles for criminal investigators to get access to data are already very low. The e-mail provider Posteo, for example, had to pay a fine because it was unable to provide criminal investigators with the IP addresses from which a certain e-mail account had been accessed. Posteo simply had not stored those data; they were erased as soon as they were received. The court declared the fine to be justified. This decision could easily lead to a situation where private companies prefer to err on the side of caution and store even more data, just to avoid such fines.

The draft ePrivacy Regulation as proposed by the European Commission in 2017 placed relatively strict duties on service providers regarding data protection. For example, they were obliged to either erase or anonymise all data that was no longer needed. This is diametrically opposed to the goal of private data retention, and the DAPIX task force noticed it, too. As the Presidency of the EU Council stated, service providers will be given the freedom to use and store data in order to prevent "fraudulent use or abuse" – and these data could then be picked up by law enforcement conducting criminal investigations.

No data retention through the back door!

EDRi member Digitalcourage wanted to know how the German government argued with respect to the data retention issue, and submitted a request for the disclosure of related documents. Unfortunately, the request was largely denied by the Council of the European Union, long after the legal deadline had passed. The secretariat declared that disclosure would be a threat to public safety – the risk to the relationship of trust between the Member States and Eurojust, the EU agency dealing with judicial cooperation in criminal matters among the agencies of the Member States, would be too severe. Furthermore, such a disclosure would threaten ongoing criminal investigations or judicial procedures. No further details were given. Digitalcourage lodged an appeal against this dismissal but, apart from being asked for patience, has not received an answer from the European Commission. Several requests pursuant to the Freedom of Information Act have also been submitted to German ministries.

It is hard to believe that policy makers are contemplating existing and potential new surveillance laws that would clearly be illegal. However, this is exactly what the DAPIX task force is doing, and it is doing so behind closed doors. The changes it proposes can be found in the current draft ePrivacy Regulation. Digitalcourage will continue to request documents from the EU and the German government. As soon as the trilogue negotiations between the EU Council, Commission and Parliament begin, we will voice our concerns and a demand: No data retention through the back door!

This article was first published at https://digitalcourage.de/blog/2019/eprivacy-private-data-retention-through-the-back-door

Digitalcourage
https://digitalcourage.de/en

ePrivacy: Private data retention through the back door (in German, 18.04.2019)
https://digitalcourage.de/blog/2019/eprivacy-private-vorratsdatenspeicherung-durch-hintertuer

(Contribution by EDRi member Digitalcourage, Germany)

27 Feb 2019

New UK counter-terrorism law limits online freedoms

By Index on Censorship

The Counter-Terrorism and Border Security Act 2019 became law in the United Kingdom (UK) in February, after passing through UK parliament with less debate than many had hoped, while Brexit dominated the political agenda. The new law is problematic in many ways, including the way in which it limits freedom of expression and access to information online. It also creates extensive new border security powers, which include accessing information on electronic devices.

The draft law was widely criticised by civil society organisations, which led to some changes to the text. However, the changes were limited and did not do enough to safeguard freedom of expression and access to information.


The new law criminalises the publication of pictures of clothing, symbols or, for example, a flag in a way that raises "reasonable suspicion" – an expression that sets a low legal threshold – that the person publishing the picture is a member or supporter of a terrorist organisation. "Publication" includes posting on social media pictures or videos that have been taken privately at home. This could be, for example, a selfie with a poster in the background showing the symbol of a terrorist organisation.

As previously reported in the EDRi-gram, parliament’s Joint Committee on Human Rights found that this clause “risks a huge swathe of publications being caught, including historical images and journalistic articles”. United Nations rapporteur Fionnuala Ní Aoláin, in a submission that expressed serious concerns about the draft law, found that the clause risks criminalising “a broad range of legitimate behaviour, including reporting by journalists, civil society organizations or human rights activists as well as academic and other research activity.”

A related problem is that the UK authorities have admitted that at least 14 organisations that are currently listed as terrorist organisations do not meet the criteria for being on the list.

Another clause makes it a crime to watch or otherwise access information online that is likely to be useful to a person committing or preparing acts of terrorism. This includes, for example, watching such content over the shoulder of another person sitting at a computer.

After debates in parliament, the government agreed to make a change, which states that working as a journalist or carrying out academic research is an acceptable excuse for accessing material online that could be useful for terrorism. This was a positive change, but not nearly sufficient, and the clause is still very problematic. No terrorist intent is required, and if someone for example watches a terrorist video online because she or he wants to understand why people might be drawn to terrorism, the person risks a long prison sentence.

The law also introduces wide new border security powers connected to a new and vaguely defined crime of "hostile activity". Under the new powers, anyone can be stopped at the border, even if there is no suspicion that the person has been involved in hostile activity, and it is a crime not to answer border officers' questions or to hand over requested information. A draft code of practice, which will guide how border officers use the powers, specifies that information "may include passwords to electronic devices". During the first hour of questioning there is no right to a lawyer.

How this deeply concerning piece of legislation will work in practice remains to be seen. We fear that vague and overbroad provisions will lead to arbitrariness and discrimination affecting human rights defenders, journalists, or ethnic minority groups on the grounds of mere suspicion.

Index on Censorship
https://www.indexoncensorship.org/

UK counter-terrorism law would restrict freedom of expression (26.09.2018)
https://edri.org/uk-counter-terrorism-law-would-restrict-freedom-of-expression/

(Contribution by Joy Hyvarinen, EDRi observer Index on Censorship, the United Kingdom)



20 Feb 2019

FRA and EDPS: Terrorist Content Regulation requires improvement for fundamental rights

By EDRi

On 12 February 2019, the European Union Agency for Fundamental Rights (FRA) published an Opinion regarding the Regulation on preventing the dissemination of terrorist content online. On the same day, the European Data Protection Supervisor (EDPS) submitted its comments on the topic to the responsible committee in the European Parliament. These two texts complement EDRi's analysis and the previous report prepared by three UN Special Rapporteurs on the proposal.

FRA: Substantial threats for freedom of expression

In its Opinion, FRA structures its criticism around four main areas.

First, it calls for the definition of "terrorist content" to be improved. The Opinion highlights the need to add to this definition the concept of "incitement" or of giving specific instructions to commit terrorist offences. The definition of such instructions should be aligned with the Terrorism Directive and with specific actions such as "providing specific instructions on how to prepare explosives or firearms". Further, the text calls for the proposal to be limited to content disseminated to the public and for certain forms of expression, such as content that relates to educational, journalistic, artistic or research purposes, to be excluded from the Regulation's scope.

Second, FRA calls for fundamental rights safeguards to be ensured through "effective judicial supervision". Currently, there is no mention in the proposal of any "independent judicial authority in the adoption or prior to the execution of the removal order". FRA also points to the need to avoid a disproportionate impact on the freedom to conduct a business when having to react to notices for removals of terrorist content within a very short time-frame (up to one hour in the original proposal). FRA suggests instead a reaction time of 24 hours from the receipt of the removal order. Regarding safeguards in cross-border removal orders, the Opinion calls for ensuring that the authorities of the Member State where the content is hosted are "empowered to review the removal order in cases where there are reasonable grounds to believe that fundamental rights are impacted within its own jurisdiction." FRA thus encourages the EU legislator to require a notification by the issuing Member State to the host Member State – in addition to the notification to the hosting service provider – when the removal order is issued.

Third, FRA states that the proposal "does not sufficiently justify the necessity of introducing the mechanism of referrals", and suggests distinguishing between content requiring a removal order and content requiring a referral.

Fourth, the Opinion states that the proposed proactive measures of the Regulation come very close to a general monitoring obligation. This is not only prohibited by Article 15 of the EU's eCommerce Directive, but also generally incompatible with individuals' right to freedom of expression under Article 11 of the Charter of Fundamental Rights of the European Union. Thus, FRA proposes deleting from the Regulation the obligation for hosting service providers (HSPs) to introduce proactive measures.

EDPS: Concerns for the Regulation’s data retention and GDPR compliance

While the EDPS raised similar concerns regarding the definition of terrorist content and the "one hour rule", it also made some targeted comments on the potentially privacy-intrusive elements of the Regulation proposal.

In the Regulation proposal, hosting service providers are obliged to retain data related to supposed terrorist content that they delete or disable access to on their platforms. The EDPS expresses substantial doubts as to whether such obligations would be compliant with the case law of the Court of Justice of the European Union (CJEU). This assessment is based on the fact that the proposed measures, similarly to the Data Retention Directive that was struck down by the CJEU in 2014, do not lay down specific criteria regarding the time period and the access and use limitations for the retained data. The EDPS is furthermore not convinced of the overall usefulness of data retention measures in the Terrorist Content Regulation, given that the text obliges HSPs to promptly inform the competent law enforcement authorities of any evidence of terrorist offences.

On the proposal’s foreseen proactive measures, the EDPS stated that automated tools for recognising and removing content would likely fall under Article 22 of the General Data Protection Regulation (GDPR), which regulates citizens’ rights in automated decision making and profiling activities. This would, in turn, require more substantive safeguards than the ones provided in the Commission’s proposal, including case-specific information to the data subject, understandable information about how the decision was reached, and the right to obtain human intervention in any case.

The observations of the EU's most important fundamental rights institutions feed into a steady stream of criticism of the proposal. They represent noteworthy positions for policy makers in the legislative institutions, particularly in the European Parliament's LIBE, CULT and IMCO committees, which are currently adopting their positions. It is now more evident than ever that the proposed Terrorist Content Regulation needs substantive reform to live up to the Union's values and to safeguard the fundamental rights and freedoms of its citizens.

Read more:

EDRi Recommendations for the European Parliament’s Draft Report on the Regulation on preventing the dissemination of terrorist content online (December 2018)
https://edri.org/files/counterterrorism/20190108_EDRipositionpaper_TERREG.pdf

All Cops Are Blind? Context in terrorist content online (13.02.2019)
https://edri.org/context-in-terrorist-content-online/

Terrorist Content: LIBE Rapporteur’s Draft Report lacks ambition (25.01.2019)
https://edri.org/terrorist-content-libe-rapporteurs-draft-report-lacks-ambition/

CULT: Fundamental rights missing in the Terrorist Content Regulation (21.01.2019)
https://edri.org/cult-fundamental-rights-missing-in-the-terrorist-content-regulation/

Terrorist Content: IMCO draft Opinion sets the stage right for EP (18.01.2019)
https://edri.org/terrorist-content-imco-draft-opinion-sets-the-stage-right-for-ep/

(Contribution by Diego Naranjo and Yannic Blaschke)


