12 Feb 2020

Cloud extraction: A deep dive on secret mass data collection tech

By Privacy International

Mobile phones remain the most frequently used and most important digital source for law enforcement investigations. Yet it is not just what is physically stored on the phone that law enforcement is after, but also what can be accessed from it – primarily data stored in the “cloud”. This is why law enforcement is turning to “cloud extraction”: the forensic analysis of user data stored on third-party servers, typically used by device and application manufacturers to back up data. As we spend more time on social media and messaging apps and store our files with the likes of Dropbox and Google Drive, and as our phones become more secure, locked devices harder to crack and file-based encryption more widespread, cloud extraction is, as a prominent industry player puts it, “arguably the future of mobile forensics.”

The report “Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps” brings together the results of Privacy International’s open source research, technical analyses and freedom of information requests to expose and address this emerging and urgent threat to people’s rights. 

Phone and cloud extraction go hand in hand

EDRi member Privacy International has repeatedly raised concerns over the risks of mobile phone extraction from a forensics perspective and highlighted the absence of effective privacy and security safeguards. Cloud extraction goes a step further, promising access not just to what is contained within the phone, but also to what is accessible from it. Cloud extraction technologies are deployed with little transparency and very limited public understanding. This seeming “wild west” approach to highly sensitive data carries the risk of abuse, misuse and miscarriages of justice. It is also a further disincentive for victims of serious offences to hand over their phones, particularly if we lack even basic information from law enforcement about what they are doing.

The analysis of data extracted from mobile phones and other devices using cloud extraction technologies increasingly includes the use of facial recognition capabilities. Considering the volume of personal data that can be obtained from cloud-based sources such as Instagram, Google Photos and iCloud, all of which contain facial images, the ability to run facial recognition across such masses of data significantly raises the stakes. Greater urgency is therefore needed to address the risks arising from cloud extraction, especially as facial and emotion recognition is added to the software that analyses the extracted data. The fact that such recognition is potentially being applied to vast troves of cloud-stored data without any transparency or accountability is a serious concern.

What you can do

There is an absence of information about the use of cloud extraction technologies, making it unclear on what legal basis they are used and how individuals are safeguarded against abuse and misuse of their data. This is part of a dangerous trend among law enforcement agencies, and we want to ensure transparency and accountability, globally, with respect to the new forms of technology they use.

If you live in the UK, you can submit a Freedom of Information Act request to your local police to ask them about their use of cloud extraction technologies using this template: https://privacyinternational.org/action/3324/ask-your-local-uk-police-force-about-cloud-extraction. You can also use it to send a request if you are based in another country which has Freedom of Information legislation.

Privacy International
https://privacyinternational.org/

Cloud extraction technology: the secret tech that lets government agencies collect masses of data from your apps (07.01.2020)
https://privacyinternational.org/long-read/3300/cloud-extraction-technology-secret-tech-lets-government-agencies-collect-masses-data

Phone Data Extraction
https://privacyinternational.org/campaigns/phone-data-extraction

Push This Button For Evidence: Digital Forensics
https://privacyinternational.org/explainer/3022/push-button-evidence-digital-forensics

Can the police limit what they extract from your phone? (14.11.2019)
https://privacyinternational.org/news-analysis/3281/can-police-limit-what-they-extract-your-phone

Facial recognition and fundamental rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

Ask your local UK police force about cloud extraction
https://privacyinternational.org/action/3324/ask-your-local-uk-police-force-about-cloud-extraction

(Contribution by Antonella Napolitano, EDRi member Privacy International)

12 Feb 2020

Digitalcourage fights back against data retention in Germany

By Digitalcourage

On 10 February 2020, EDRi member Digitalcourage published the German government’s plea in the data retention case before the European Court of Justice (ECJ). The document, dated 9 September 2019, explains the use of retained telecommunications data by secret services, discusses whether the 2002 ePrivacy Directive might apply to various forms of data retention and which exceptions from human rights protections apply to secret service operations, and justifies the government’s plans to use data retention to solve a broad range of crimes with the example of the abduction of a Vietnamese man in Berlin by Vietnamese agents. However, that case is very specific and, even if the retained data was “useful” there, usefulness is not a valid legal basis for mass data retention and therefore cannot justify drastic interferences with the basic rights of all individuals in Germany. Finally, the German government also argues that the scope and time period of the storage make a difference to the compatibility of data retention laws with fundamental rights.

Digitalcourage calls for all existing illegal data retention laws in the EU to be declared invalid. There are no grounds for blanket and suspicion-less surveillance in a democracy under the rule of law. Whether it is content data or metadata that is being stored, data retention (the blanket and mass collection of telecommunications data) is inappropriate, unnecessary and ineffective, and therefore illegal. Where the German government argues that secret services need to use telecommunications data to protect state interests, Digitalcourage agrees with many human rights organisations that the activities of secret services can be a direct threat to the core trust between the general public and the state. The ECJ has itself called for storage to be reduced to the absolutely required minimum – and that, according to Digitalcourage, can only be fulfilled if no data is stored without individual suspicion.

Digitalcourage
https://digitalcourage.de/

Press release: EU data retention: Digitalcourage publishes and criticises the position of the German government (only in German, 10.02.2020)
https://digitalcourage.de/pressemitteilungen/2020/bundesregierung-eugh-eu-weite-vorratsdatenspeicherung

(Contribution by Sebastian Lisken, EDRi member Digitalcourage, Germany)

12 Feb 2020

Double legality check in e-evidence: Bye bye “direct data requests”

By Chloé Berthélémy

After having tabled some 600 additional amendments, members of the European Parliament Committee on Civil Liberties (LIBE) are still discussing the conditions under which law enforcement authorities in the EU should access data for their criminal investigations in cross-border cases. One of the key areas of debate is the involvement of a second authority in the access process – usually the judicial authority in the State in which the online service provider is based (often called the “executing State”).

To prevent the misuse of this new cross-border data access instrument, LIBE Committee Rapporteur Birgit Sippel’s draft Report had angered the Commission by proposing that the executing State should receive, by default, the European Preservation or Production Order at the same time as the service provider. It should then have ten days to evaluate and possibly object to an Order by invoking one of the grounds for non-recognition or non-execution – including based on a breach of the EU Charter of Fundamental Rights.

What is more, the Sippel Report proposes that if it is clear from the early stages of the investigation that a suspected person resides neither in the Member State seeking data access (the issuing State) nor in the executing State where the service provider is established, the judicial authorities of the State in which the person resides (the affected State) should also get the chance to intervene.

Notification as a fundamental element of EU judicial cooperation

The reasoning behind such a notification system is compelling: entrusting one single authority to carry out the full legality and proportionality assessment for two or even three different jurisdictions (the issuing, the executing and the affected State) is careless at best. A national prosecutor or judge alone cannot possibly take into account all national security and defence interests, the immunities, privileges and legal frameworks of the other Member States, or the special protections a suspected person may have in their capacity as a lawyer, doctor or journalist. This is especially relevant if the other Member States’ rules are different from or even incompatible with the rules governing the prosecutor’s own domestic investigation. Review by a second judicial authority with a genuine possibility to scrutinise the Order is therefore of paramount importance to ensure its legality.

The LIBE Committee is currently discussing the details of this notification process. Some of the tabled amendments unfortunately try to undermine the protections that the notification requirement would bring. For example, some try to restrict the notification to Production Orders only (when data is transmitted directly), excluding all Preservation Orders (when the data is merely frozen and must be acquired with a separate Order). Others try to limit notification to transactional data (i.e. metadata) or content data, alleging that subscriber data is somehow less sensitive and therefore needs less protection. Lastly, some propose that the notification should not have suspensive effect on the service provider’s obligation to respond to an Order – meaning that if the notified State objects to an Order but the service provider has already handed over the data, it is too late.

The Parliament should uphold the basic principles of human rights law

If accepted, some of those amendments would bring the Parliament position dangerously close to the Council’s highly problematic weak notification model which does not provide any of the necessary safeguards it is supposed to have. To ensure the human rights compliance of the procedure, notifying the executing and the affected State should be mandatory for all types of data and Orders. Notifications should be simultaneously sent to the relevant judicial authority and the online service provider, and the latter should wait for a positive reaction from the former before executing the Order. The affected State should have the same grounds for refusal as the executing State, because it is best placed to protect its residents and their rights.

There seems to be a general consensus in the European Parliament about the involvement of a second judicial authority in the issuance of Orders. Meanwhile, the Commission grits its teeth and continues to pretend that mutual trust among EU Member States is all that is needed to protect people from law enforcement overreach. So far, the Commission seems to refuse to see the tremendous risks that its “e-evidence” proposal entails – especially in a context where some Member States are subjected to Article 7 proceedings which could lead to the suspension of some of their rights as Member States, because of endangered independence of their judicial systems and potential breaches of the rule of law. Mutual trust should not serve as an excuse to undermine individuals’ fundamental right to data protection and the basic principles of human rights law.

Cross-border access to data for law enforcement: Document pool
https://edri.org/cross-border-access-to-data-for-law-enforcement-document-pool/

“E-evidence”: Repairing the unrepairable (14.11.2019)
https://edri.org/e-evidence-repairing-the-unrepairable/

EU rushes into e-evidence negotiations without common position (19.06.2019)
https://edri.org/eu-rushes-into-e-evidence-negotiations-without-common-position/

Recommendations on cross-border access to data (25.04.2019)
https://edri.org/files/e-evidence/20190425-EDRi_PositionPaper_e-evidence_final.pdf

(Contribution by Chloé Berthélémy, EDRi)

12 Feb 2020

Data protection safeguards needed in EU-Vietnam trade agreements

By Vrijschrift

On 12 February 2020, the European Parliament gave consent for the ratification of the EU-Vietnam trade and investment agreements.

The trade agreement contains two cross-border data flow commitments. The related data protection safeguards in this agreement are similar to the ones in the EU-Japan agreement, which entered into force in February 2019. Civil society organisations and academics had pointed out flaws in these safeguards.

The EU-Vietnam investment agreement contains a variant of the controversial investor-to-state dispute settlement (ISDS) mechanism. In Opinion 1/17 (on ISDS in the EU-Canada CETA agreement), the Court of Justice of the European Union found this mechanism compatible with the EU Treaties, suggesting that ISDS does not interfere with the principle of autonomy of EU law because the EU and its Member States can refuse to pay ISDS damages awards. Refusing to pay ISDS damages, however, comes with serious drawbacks.

The continued use of weak data protection safeguards is all the more disappointing as, two years ago, in January 2018, the European Commission adopted a proposal for stronger safeguards to be used in trade agreements. Consumer and digital rights organisations supported these safeguards in principle. The Commission, however, never applied them. In order to properly protect the fundamental right to data protection in the context of trade agreements, the new von der Leyen Commission should adopt the proposed stronger safeguards and actually use them.

Vrijschrift
https://www.vrijschrift.org/

EU/Vietnam Free Trade Agreement 2018/0356(NLE)
https://oeil.secure.europarl.europa.eu/oeil/popups/ficheprocedure.do?reference=2018/0356(NLE)&l=en

Weak data protection in EU-Vietnam trade agreement (06.02.2020)
https://www.vrijschrift.org/serendipity/index.php?/archives/242-Weak-data-protection-in-EU-Vietnam-trade-agreement.html

EU-Japan trade agreement not compatible with EU data protection (10.01.2018)
https://edri.org/eu-japan-trade-agreement-eu-data-protection/

The European Commission rightly decides to defend citizens’ privacy in trade discussions (28.02.2018)
https://edri.org/the-european-commission-rightly-decides-to-defend-citizens-privacy-in-trade-discussions/

Study launch: The EU can achieve data protection-proof trade agreements (13.07.2016)
https://edri.org/study-launch-eu-can-achieve-data-protection-proof-trade-agreements/

EU Court CETA ruling shows failure of ISDS reform (06.05.2019)
https://www.vrijschrift.org/serendipity/index.php?/archives/237-EU-Court-CETA-ruling-shows-failure-of-ISDS-reform.html

(Contribution by Ante Wessels, EDRi member Vrijschrift, the Netherlands)

12 Feb 2020

PI and Liberty submit a new legal challenge against MI5

By Privacy International

On 1 February 2020, EDRi member Privacy International (PI) and civil rights group Liberty filed a complaint with the Investigatory Powers Tribunal, the judicial body that oversees the intelligence agencies in the United Kingdom, against the security service MI5 in relation to how they handle vast troves of personal data.

In mid-2019, MI5 admitted, during a case brought by Liberty, that personal data was being held in “ungoverned spaces”. Much about these ungoverned spaces, and how they would effectively be “governed” in the future, remains unclear. At the moment, they are understood to be a “technical environment” in which the personal data of an unknown number of individuals is being “handled”. The use of “technical environment” suggests something more than simply a compilation of a few datasets or databases.

The longstanding and serious failings of MI5 and other intelligence agencies in relation to these “ungoverned spaces” first emerged in PI’s pre-existing case, which started in November 2015. That case challenges the processing of bulk personal datasets and bulk communications data by the UK Security and Intelligence Agencies.

In the course of these proceedings, it was revealed that PI’s data were illegally held by MI5, among other intelligence and security agencies. MI5 deleted PI’s data while the investigation was ongoing. With the new complaint PI also requested the reopening of this case in relation to MI5’s actions.

In parallel proceedings brought by Liberty against the bulk surveillance powers contained in the Investigatory Powers Act 2016 (IPA), MI5 admitted that personal data was being held in “ungoverned spaces”, demonstrating a known and continued failure to comply with both statutory and non-statutory safeguards in relation to the handling of bulk data since at least 2014. Importantly, documents disclosed in that litigation and detailed in the new joint complaint showed that MI5 had sought and obtained bulk interception warrants on the basis of misleading statements made to the relevant authorities.

The documents reveal that MI5 not only broke the law, but for years misled the Investigatory Powers Commissioner’s Office (IPCO), the body responsible for overseeing UK surveillance practices.

In this new complaint, PI and Liberty argue that MI5’s data handling arrangements result in the systematic violation of the rights to privacy and freedom of expression (as protected under Articles 8 and 10 of the European Convention on Human Rights) and under EU law. Furthermore, they maintain that the decisions to issue warrants requested by MI5, in circumstances where the necessary safeguards were lacking, are unlawful and void.

Privacy International
https://privacyinternational.org/

MI5 ungoverned spaces challenge
https://privacyinternational.org/legal-action/mi5-ungoverned-spaces-challenge

Bulk Personal Datasets & Bulk Communications Data challenge
https://privacyinternational.org/legal-action/bulk-personal-datasets-bulk-communications-data-challenge

Investigatory Powers Tribunal determination, case no. IPT/15/110/CH
https://privacyinternational.org/sites/default/files/2019-08/IPT-Determination%20-%2026September2018.pdf

Reject Mass Surveillance
https://www.libertyhumanrights.org.uk/our-campaigns/reject-mass-surveillance

MI5 law breaking triggers Liberty and Privacy International legal action (03.02.2020)
https://www.libertyhumanrights.org.uk/news/press-releases-and-statements/mi5-law-breaking-triggers-liberty-and-privacy-international-legal

(Contribution by EDRi member Privacy International)

12 Feb 2020

Dangerous by design: A cautionary tale about facial recognition

By Ella Jakubowska

This series has explored facial recognition and fundamental rights; the EU’s response; evidence about the risks; and the threat of public and commercial data exploitation. In this fifth installment, we consider a first-hand experience of the harm caused by inherently rights-violating biometric surveillance technology.

Leo Colombo Viña is the founder of a software development company and a professor of Computer Science. A self-professed tech lover, he says it was “ironic” that a case of mistaken identity with police facial recognition happened to him. What unfolded next paints a powerful picture of the intrinsic risks of biometric surveillance. Whilst Leo’s experience occurred in Buenos Aires, Argentina, his story raises serious issues for the deployment of facial and biometric recognition in the EU, too.

“I’m not the guy they’re looking for”

One day in 2019, Leo was leaving the bank mid-afternoon to take the metro back to his office. While waiting for the train, he was approached by a police officer who had received an alert on his phone that Leo was wanted for an armed robbery committed 17 years earlier. The alert had been triggered by the metro station’s facial recognition surveillance system, which had recently been the subject of a large media campaign.

His first assumption was “okay, there’s something up, I’m not the guy they’re looking for”. But when the police showed him the alert, it clearly displayed his picture and personal details. “Okay,” he thought, “what the f***?” When they told him that the problem could not be resolved there and then, and that he would have to accompany them to the police station, Leo’s initial surprise turned into concern.

Wrongful criminalisation

It turned out that whilst the picture and ID number in the alert matched Leo’s, bizarrely, the name and date of birth did not. Having never committed a crime, nor even been under investigation, Leo still does not know how his face and ID number came to be wrongfully included in a criminal suspect database. Despite subsequent legal requests from across civil society, the government has not made available any information about the processing of, storage of, or access to people’s data. This is not a unique issue: across Europe, policing technology and the processing of personal data are frighteningly opaque.

At the police station, Leo spent four hours in the bizarre position of having to “prove that I am who I am”. He says the police treated him kindly and respectfully – although he thinks that being a caucasian professional meant that they dismissed him as a threat. The evidence for this came later, when a similar false alert happened to another man who also did not have a criminal record, but who had darker skin than Leo and came from a typically poorer area. He was wrongfully jailed for six days because the system’s alert was used to justify imprisoning him – despite the fact that his name was not a match.

Undermining police authority

If the purpose of policing is to catch criminals and keep people safe, then Leo’s experience is a great example of why facial recognition does not work. Four officers spent a combined total of around 20 hours trying to resolve his issue (at the taxpayers’ expense, he points out). That doesn’t include the time spent afterwards by the public prosecutor to try and work out what went wrong. Leo recalls that the police were frustrated to be tied up with bureaucracy and attempts to understand the decision that the system had made, whilst their posts were left vacant and real criminals went free.

The police told Leo that the Commissioner receives a bonus tied to the use of the facial recognition system. They confided that it seemed to be a political move, not a policing or security improvement. Far from helping them solve violent crime – one of the reasons often given for allowing such intrusive systems – it mostly flagged non-violent issues such as witnesses who had not turned up for trials because they hadn’t received a summons, or parents who had overdue child support payments.

The implications for police autonomy are stark. Leo points out that despite swift confirmation that he was not the suspect, the police had neither the ability nor the authority to override the alert. They were held hostage to a system that they did not properly understand or control, yet they were compelled to follow its instructions and decisions without knowing how or why it had made them.

Technology is a tool made by humans, not a source of objective truth or legal authority. In Leo’s case, the police assumed early on that the match was not legitimate because he did not fit their perception of a criminal. But for others also wrongfully identified, the assumption was that they did look like a criminal, so the system was assumed to be working correctly. Global anti-racism activists will be familiar with these damaging, prejudicial beliefs. Facial recognition does not solve human bias, but rather supports it by giving discriminatory human assumptions a false sense of “scientific” legitimacy.

Technology cannot fix a broken system

The issues faced by Leo, and the officers who had to resolve his situation, reflect deeper systemic problems which cannot be solved by technology. Biased or inefficient police processes, mistakes with data entry, and a lack of transparency do not disappear when you automate policing – they get worse.

Leo has had other experiences with the fallacies of biometric technology. A few years ago, he and his colleagues experimented with developing fingerprinting software at the request of a client, but ultimately decided against it. “We realised that biometric systems are not good enough,” he says. “It feels good enough, it[’s] good marketing, but it’s not safe.” He points to the fact that he was recently able to unlock his phone using a picture of himself. “See? You are not secure.”

Leo shared his story – which quickly went viral on Twitter – because he wanted to show that “there is no magic in technology.” Because he is a software engineer, people see him as a “medieval wizard”. As he sees it, though, he is someone with the responsibility and ability to show people the truth behind government propaganda about facial recognition, starting with his own experience.

Aftermath

I asked Leo if the government considered the experiences of those who had been affected. He laughed sardonically. “No, no, absolutely not, no.” He continues: “I shouldn’t be in that database, because I didn’t commit any crime.” Yet it took the public prosecutor four months to confirm the removal of his data, and the metro facial recognition system is still in use today. Leo thinks it has been a successful marketing tool for a powerful city government wanting to assuage citizens’ safety concerns. He thinks that the people have been lied to, and that fundamentally unsafe technology cannot make the city safer.

A perfect storm of human errors, systemic policing issues and privacy violations led to Leo being included in the database, but this is by no means a uniquely Argentinian problem. The Netherlands, for example, has included millions of people in a criminal database despite their never having been charged with a crime. Leo reflects that “the system is the whole thing, from the beginning to end, from the input to the output. The people working in technology just look at the algorithms, the data, the bits. They lose the big picture. That’s why I shared my story … Just because.” We hope the EU is taking notes.

As told to Ella Jakubowska by Leo Colombo

Dismantling AI Myths and Hype (04.12.2019)
https://daniel-leufer.com/2019/12/05/dismantling-ai-myths-and-hype/

Data-driven policing: The hardwiring of discriminatory policing practices across Europe (19.11.2019)
https://www.citizensforeurope.eu/learn/data-driven-policing-the-hardwiring-of-discriminatory-policing-practices-across-europe

Facial recognition and fundamental rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

The many faces of facial recognition in the EU (18.12.2019)
https://edri.org/the-many-faces-of-facial-recognition-in-the-eu/

Your face rings a bell: Three common uses of facial recognition (15.01.2020)
https://edri.org/your-face-rings-a-bell-three-common-uses-of-facial-recognition/

Stalked by your digital doppelganger? (29.01.2020)
https://edri.org/stalked-by-your-digital-doppelganger/

Facial recognition technology: fundamental rights considerations in the context of law enforcement (27.11.2019)
https://fra.europa.eu/sites/default/files/fra_uploads/fra-2019-facial-recognition-technology-focus-paper.pdf

(Contribution by Ella Jakubowska, EDRi intern)

03 Feb 2020

Support our work by investing in a piece of e-clothing!

By EDRi

Your privacy is increasingly under threat. European Digital Rights works hard to have you covered. But there’s only so much we can do.

Help us help you. Help us get you covered.


Check out our 2020 collection!*

*The items listed below are e-clothes. That means they are electronic. Not tangible. But still very real – like many other things online.

Your winter stock(ings) – 5€
A pair of hot winter stockings can really help one get through cold and lonely winter days. Help us to fight for your digital rights by investing in a pair of these superb privacy–preserving fishnet stockings. This delight is also a lovely gift for someone special.


A hat you can leave on – 10€
Keep your head undercover with this marvellous piece of surveillance resistance. Adaptable to any temperature and – for the record – to several CCTV models, the item really lives up to its value. This hat is an indispensable accessory when visiting your favourite public space packed with facial recognition technologies.


Winter/Summer Cape – 25€
Are you feeling heroic yet? Our flamboyant Winter/Summer cape is designed to keep you warm and cool. This stylish accessory takes the weight off your shoulders – purchase it and let us take care of fighting for your digital rights!


Just another White T-Shirt – 50€
A white t-shirt can do wonders when you’re trying to blend in with a white wall. This wildly unexciting but versatile classic is one of the uncontested fundamental pillars of your privacy enhancing e-wardrobe.


THE privacy pants ⭐️ – 100€
This ultimate piece of resistance is engineered to keep your bottom warm in the coldest winter, but also airy during the hottest summer days. Its colour guarantees the ultimate tree (of knowledge) look. The item comes with a smart zipper.


Anti-tracksuit ⭐️ – 250€
Keep your digital life healthy with the anti-tracking tracksuit. The fabric is engineered to bounce out any attempt to get your privacy off track. Plus, you can impress your beloved babushka too.


Little black dress ⭐️ – 500€
Whether at a work cocktail party, a funeral, a shopping spree or a Christmas party – this dress will turn you into the centre of attention, in a (strangely) privacy-respecting manner.


Sew your own ⭐️ – xxx€
Unsure of any of the items above? Let your inner tailor free, customise your very own unique, designer garment, and put a price tag of your choice on it.



⭐️ The items of value superior to 100€ are delivered with an (actual, analog, non-symbolic) EDRi iron-on privacy patch that you can attach on your existing (actual, analog, non-symbolic) piece of clothing or accessory. If you wish to receive this additional style and privacy enhancer, don’t forget to provide us with your postal address (either via the donation form, or in your bank transfer message)!


Question? Remark? Idea? Please contact us brussels [at] edri [dot] org !

03 Feb 2020

ECtHR: Obligation on companies to identify all phone users is legal

By Diego Naranjo

On 30 January 2020, the European Court of Human Rights (ECtHR) issued its judgment in the case of Breyer v. Germany. The case was brought by Patrick Breyer (currently a Member of the European Parliament, MEP) and Jonas Breyer (hereinafter “the applicants”), who complained about the obligation introduced by the Telecommunications Act in Germany to register all customers of pre-paid SIM cards. Similar obligations have been imposed in Romania and elsewhere. In total, 15 Council of Europe (CoE) Member States require subscriber registration of pre-paid SIM customers, versus 32 that do not have such laws. The applicants claimed a violation of Articles 8 and 10 of the European Convention on Human Rights – the right to privacy and freedom of expression, respectively.

Indiscriminate collection of personal data? This time it is ruled legal.

The Court found that the applicants’ complaint was not sufficiently substantiated as regards freedom of expression and therefore analysed the application solely in relation to a potential violation of the right to private life. The Court, by six votes to one, declared that there was no violation of the right to private life. According to the majority of the Court, even though there was a clear interference with the right to private life, the interference was legitimate for reasons of public safety and the prevention of disorder or crime. It was also deemed necessary in a democratic society because it “strongly simplifies and accelerates investigation by law-enforcement agencies” and can “contribute” to more “effective law enforcement”. Furthermore, the Court added that the data stored, and the interference deriving from it, were “while not trivial, of a rather limited nature”.

But is efficiency the right approach? In the recent Advocate General (AG) Opinion on four data retention cases before the Court of Justice of the European Union (CJEU), the AG points out that the argument of efficiency cannot be allowed to water down democratic principles, and that the fight against crime (or terrorism, in that case) cannot be assessed solely in terms of “efficiency”. Indeed, installing CCTV cameras in every room of every house in order to prevent violence against women might be very “efficient”, but efficiency cannot be the ultimate reason (or even a legal basis) to implement any measure we could imagine.

Dissenting Opinion: Sensitive data and lack of effective safeguards

Fortunately, not all judges agreed. The dissenting Opinion of judge Carlo Ranzoni raises relevant questions and arguments which could well lead to a referral of the case to the Grand Chamber. In his dissenting Opinion, Ranzoni explains that he found a violation of Article 8 for several reasons. First, the measures in question are not confined to the fight against terrorism or other serious crimes (and even when investigating serious crimes, not all measures are justified). Second, even though the information stored was not sensitive in itself, the majority of the Court overlooked the possibility of the “identification of the parties to every telephone call or message exchange and the attribution of possibly sensitive information to an identifiable person”, which in his opinion makes the interference comparable to that in Benedik v. Slovenia – an interference the ECtHR did not describe as being of a “rather limited nature” (para. 5 of the dissenting Opinion).

Ranzoni further notes that the law in question allows for the storage of (and access to) the data of all SIM card subscribers, without any link to the investigation of a serious crime, for a long period of time. This is a serious interference, not a light one. However, the “crux of the case” is, according to Ranzoni, the quality of the safeguards and how effectively they can prevent abuses. In his view, the supervising authorities have no real capacity to investigate possible abuses because, as the Constitutional Court itself pointed out, “the retrieving authority does not have to give reasons for its request”, and the Federal Network Agency (which is in charge of retrieving phone users’ data from companies on behalf of requesting authorities) is therefore unable to assess whether a request is admissible (para. 22). Effective review and supervision of retrieval requests by a judicial or otherwise independent authority are thus nonexistent. Finally, according to Ranzoni, the vast majority of those affected by the interference “are left without any possibility of review”, since “it appears unrealistic [for Data Protection Authorities] to review some 35 million data sets consulted by a wide range of different authorities” (para. 25 of the dissenting Opinion).

What next?

The applicants can still request a referral of the case to the Grand Chamber, which could overturn this judgment; the dissenting Opinion provides strong arguments justifying such a referral. In the meantime, the pending cases Privacy International, C-623/17, and Ordre des barreaux francophones et germanophone et al., C-520/18, are also awaiting judgment. If the CJEU follows the AG Opinion, the obligation on private companies to perform mass blanket retention of communications data would once again be considered illegal. That would undermine some of the arguments relied on by the majority of judges in the present Breyer case (such as the “efficiency for law enforcement” argument) and may help the applicants to overturn this judgment.

Judgment: Case Breyer v. Germany
http://hudoc.echr.coe.int/eng?i=001-200442

Data retention: “National security” is not a blank cheque (29.01.2020)
https://edri.org/data-retention-national-security-is-not-a-blank-cheque/

AG’s Opinion: Mass retention of data incompatible with EU law (29.01.2020)
https://edri.org/ag-opinion-mass-retention-of-data-incompatible-with-eu-law/

(Contribution by Diego Naranjo, EDRi)

29 Jan 2020

Stalked by your digital doppelganger?

By Ella Jakubowska

In this fourth installment of EDRi’s facial recognition and fundamental rights series, we explore what could happen if facial recognition collides with data-hungry business models and 24/7 surveillance.

In the business of violating privacy

Online platforms, advertisers and data brokers already rely on amassing vast amounts of intimate user data, which they use to sell highly-targeted adverts and to “personalise” services. This is not a by-product of what they do: data monetisation is the central purpose of the Facebooks and Googles of the world.

The Norwegian Consumer Council released a report on the extent of the damage caused by these exploitative and often unlawful ad-tech ecosystems. In fact, many rights violations are exacerbated by data-driven business models, which are systemically opaque, intrusive and designed to grab as much personal data as possible. We’ve all heard the saying: “if the product is free, then you’re the product.”

Already, your interactions build an eerie (and often inaccurate and biased) digital portrait every time you send an email, swipe right or browse the web. And now, through the dystopianly-named “BioMarketing”, advertisers can put cameras on billboards which instantly analyse your face as you walk by, in order to predict your age, gender – and even your ethnicity or mood. They use this data to “personalise” the adverts that you see. So should you be worried that this real-time analysis could be combined with your online interactions? Is your digital doppelganger capable of stepping off your screen and into the street?

When your digital doppelganger turns against you

Airbnb’s discrimination against sex workers is just one example of how companies already use data from other platforms to unjustly deny services to users who, in accessing the service, have breached neither laws nor terms of service. As thousands of seemingly innocuous bits of data about you from your whole digital footprint – and now even your doorbell – are combined, inferences and predictions about your beliefs, likes, habits or identity can be easily used against you.

The recent Clearview AI scandal has cast light on a shady data company, unlawfully scraping and analysing billions of facial images from social media and other platforms and then selling this to police departments. Clearview AI’s systems were sold on the basis of false claims of effectiveness, and deployed by law enforcement with flagrant disregard for data protection, security, safeguards, accuracy or privacy. The extent of online data-gathering and weaponisation may be even worse than we thought.

Surveillance tech companies are cashing in on this biometric recognition and data hype. Spain’s Herta Security and the Netherlands’ VisionLabs are just two of the many companies using the tired (and de-bunked) “security” excuse to justify scanning everyone in shops, stations or even just walking down the street. They sell real-time systems designed to exclude “bad” people by denying them access to spaces, and reward “good” people with better deals. Worryingly, this privatises decisions that have a serious impact on fundamental rights, and enables private companies to act as judge, jury and executioner of our public spaces.

It gets worse…

Surveillance tech companies are predictably evasive about where they get their data and how they train their systems. Although both Herta Security and VisionLabs advertise stringent data protection compliance, their claims are highly questionable: Herta Security, for instance, proudly offers to target adverts based on skin colour. VisionLabs, meanwhile, say that their systems can identify “returning visitors”. It’s hard to see how they could do this without holding on to personally-identifiable biometric data without people’s consent (which would, of course, be a serious breach of data protection law).
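To see why recognising “returning visitors” implies retaining identifiable biometric data, it helps to recall how such systems generally work: each face is reduced to a numerical template (an “embedding”), and every new face is compared against a stored gallery of templates. The sketch below is a generic, simplified illustration in Python with random stand-in vectors – an assumption about the general technique, not a description of any vendor’s actual system – but it makes the point that the stored gallery is itself biometric data linked to identifiable people.

    import numpy as np

    # Simplified sketch of "returning visitor" recognition via face embeddings.
    # A real system would compute embeddings with a face-recognition model;
    # random vectors stand in for them here. The key point: the gallery of
    # stored templates is retained biometric data about identifiable people.

    rng = np.random.default_rng(seed=42)
    gallery = {f"visitor_{i:04d}": rng.normal(size=128) for i in range(1_000)}

    def cosine_similarity(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_visitor(new_embedding, gallery, threshold=0.7):
        # Return the ID of the best-matching stored template above the
        # threshold (treated as a "returning visitor"), or None otherwise.
        best_id, best_score = None, threshold
        for visitor_id, stored in gallery.items():
            score = cosine_similarity(new_embedding, stored)
            if score >= best_score:
                best_id, best_score = visitor_id, score
        return best_id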

As if this wasn’t enough, VisionLabs also enthusiastically offer to analyse the emotions of shoppers. This so-called “affect recognition” is becoming increasingly common, and is based on incredibly dubious scientific and ethical foundations. But that hasn’t stopped it being used to assess everything from whether migrants are telling the truth in immigration interviews to someone’s suitability for a job.

Aren’t we being a bit paranoid?

In theory, a collision of biometric analysis with vast data sources and 24/7 surveillance is terrifying. But would anyone really exploit your online and biometric data like this?

Thanks to facial recognition, many online platforms already know exactly what you look like, and covertly assign highly-specific categories to your profile for advertising purposes. They know if you’re depressed or have a sexually-transmitted disease. They know when you had your last period. They know if you are susceptible to impulse buys. They infer if you are lonely, or have low self-esteem.

There is evidence, too, that facial recognition systems at international borders have now been combined with predictions scraped from covert data sources in order to label travellers as potential terrorists or undocumented migrants. Combine this with the fact that automated systems consistently assess black people as more criminal than white people, even if all other variables are controlled. Your digital doppelganger – inaccuracies, discriminatory judgements and all – becomes indelibly tied to your face and body. This will help law enforcement to identify, surveil, target and control even innocent people.

The violation of your fundamental rights

Given the huge impact that biometric identification systems have on our private lives, the question is not only how they work, but whether they should be allowed to. This data-driven perma-surveillance blurs the boundary between public and private control in dangerous ways, allowing public authorities to outsource responsibility to commercially-protected algorithms, and enabling private companies to commodify people and sell this back to law enforcement. This whole ecosystem fundamentally violates human dignity, which is essential to our ability to live in security and with respect for our private lives.

The ad-tech industry is a treasure trove for biometric surveillance tech companies, who can secretly purchase the knowledge, and therefore the power, to control your access to and interactions with streets, supermarkets and banks based on what your digital doppelganger says about you, whether true or not. You become a walking, tweeting advertising opportunity and a potential suspect in a criminal database. So the real question becomes: when will Europe put its foot down?

Facial recognition and fundamental rights 101 (04.12.2019)
https://edri.org/facial-recognition-and-fundamental-rights-101/

The many faces of facial recognition in the EU (18.12.2019)
https://edri.org/the-many-faces-of-facial-recognition-in-the-eu/

Your face rings a bell: Three common uses of facial recognition (15.01.2020)
https://edri.org/your-face-rings-a-bell-three-common-uses-of-facial-recognition/

10 reasons why online advertising is broken (08.01.2020)
https://medium.com/@ka.iwanska/10-reasons-why-online-advertising-is-broken-d152308f50ec

The EU is funding dystopian artificial intelligence projects (22.01.2020)
https://www.euractiv.com/section/digital/opinion/the-eu-is-funding-dystopian-artificial-intelligence-projects/

Facial Recognition Cameras Will Put Us All in an Identity Parade (27.01.2020)
https://www.theguardian.com/commentisfree/2020/jan/27/facial-recognition-cameras-technology-police

Out of Control: How consumers are exploited by the online advertising industry (14.01.2020)
https://fil.forbrukerradet.no/wp-content/uploads/2020/01/2020-01-14-out-of-control-final-version.pdf

Amazon’s Rekognition Shows Its True Colours (15.01.2020)
https://edri.org/amazons-rekognition-shows-its-true-colors/

We’re Banning Facial Recognition. We’re Missing the Point (20.01.2020)
https://www.nytimes.com/2020/01/20/opinion/facial-recognition-ban-privacy.html

The Secretive Company That Might End Privacy as We Know It (18.01.2020)
https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html

Adtech – the reform of real time bidding has started and will continue (17.01.2020)
https://ico.org.uk/about-the-ico/news-and-events/news-and-blogs/2020/01/blog-adtech-the-reform-of-real-time-bidding-has-started/

Privacy International study shows your mental health is for sale (03.09.2019)
https://privacyinternational.org/long-read/3194/privacy-international-study-shows-your-mental-health-sale

(Contribution by Ella Jakubowska, EDRi intern)

29 Jan 2020

CJEU to decide on processing of passenger data under PNR Directive

By Gesellschaft für Freiheitsrechte

On 20 January 2020, the District Court of Cologne, Germany, referred to the Court of Justice of the European Union (CJEU) the question of whether the European Passenger Name Record (PNR) Directive violates fundamental rights. EDRi member Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights) initiated the proceedings against the Directive, which allows authorities to analyse and store the personal data of all people who take an international flight in Europe.

GFF considers the PNR Directive to violate the right to the protection of personal data and the right to respect for private and family life. The PNR Directive (Directive 2016/681) requires airlines to automatically transfer their passengers’ data records to state authorities. These data records contain a large amount of sensitive information, including the date of birth, the names of accompanying persons, the means of payment used to purchase the flight ticket and an unspecified text field which the airline fills in independently.

The data is usually stored with police authorities. In Germany, the Federal Criminal Police Office intends, in the future, to automatically compare the data records with pre-determined “criteria” – for example, criteria describing the flight behaviour of known criminals. As a result, any person whose profile happens to appear suspicious will have to expect increased police checks or even arrest. This is because the error rates of the algorithms will be considerable.
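The scale of the problem with error rates can be illustrated with a simple base-rate calculation. The short Python sketch below uses made-up figures for passenger volumes, the number of genuine suspects and the algorithm’s accuracy (none of these are official numbers from the PNR Directive or the Federal Criminal Police Office); even under such generous assumptions, the overwhelming majority of automated alerts concern innocent travellers.

    # Back-of-the-envelope illustration of the base-rate problem in automated
    # passenger screening. ALL figures are assumptions chosen for illustration,
    # not official numbers.

    passengers_per_year = 170_000_000   # assumed: passengers screened per year
    true_targets = 1_000                # assumed: genuine suspects among them
    true_positive_rate = 0.95           # assumed: 95% of genuine suspects flagged
    false_positive_rate = 0.001         # assumed: 0.1% of innocent passengers flagged

    innocent = passengers_per_year - true_targets
    flagged_guilty = true_targets * true_positive_rate
    flagged_innocent = innocent * false_positive_rate
    total_flagged = flagged_guilty + flagged_innocent

    print(f"Passengers flagged per year: {total_flagged:,.0f}")
    print(f"Of which innocent: {flagged_innocent:,.0f} "
          f"({flagged_innocent / total_flagged:.2%} of all alerts)")
    # With these assumptions, roughly 170,000 innocent travellers are flagged
    # each year, and over 99% of all alerts are false alarms.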

Strategic litigation aimed at the highest European court

In 2019, GFF, together with EDRi member epicenter.works, took legal action against the PNR Directive before German and Austrian courts and authorities. Since it is not possible to challenge the Directive directly before the CJEU, the lawsuits were chosen strategically with a view to having the matter referred to the highest European court.

In Germany, GFF supports several individuals filing complaints against the airline Deutsche Lufthansa AG for transferring their data to the German Federal Criminal Police Office. The plaintiffs before the Cologne District Court include Kathalijne Buitenweg, a member of the Dutch parliament, as well as the German net activist Kübra Gümüşay and the lawyer Franziska Nedelmann.

As expected, the Cologne District Court has now referred the case to the CJEU, as it evidently raises questions of EU law. In GFF’s view, the PNR Directive is incompatible with the EU Charter of Fundamental Rights, and the CJEU has already stopped a similar PNR agreement between the EU and Canada with its Opinion 1/15 of 26 July 2017. With the matter now before the CJEU, the mass processing of passenger data in the EU might come to an end.

The basic funding for the project is provided by the Digital Freedom Fund.

Gesellschaft für Freiheitsrechte (GFF, Society for Civil Rights)
https://freiheitsrechte.org/english/

PNR campaign site: No PNR
https://www.nopnr.eu

Passenger surveillance brought before courts in Germany and Austria (22.05.2019)
https://edri.org/passenger-surveillance-brought-before-courts-in-germany-and-austria/

EU Directive 2016/681 (PNR Directive)
https://eur-lex.europa.eu/legal-content/EN/TXT/HTML/?uri=CELEX:32016L0681&from=DE

(Contribution by EDRi member Gesellschaft für Freiheitsrechte, Germany)
