EU privacy body urges anonymization of location data for COVID-19 tracking

The European Data Protection Board (EDPB) has published guidance for the use of location data and contact tracing tools intended to mitigate the impact of the COVID-19 pandemic.

Europe’s data protection framework wraps around all such digital interventions, meaning there are legal requirements for EU countries and authorities developing tracing tools or soliciting data for a coronavirus-related purpose.

“These guidelines clarify the conditions and principles for the proportionate use of location data and contact tracing tools, for two specific purposes: using location data to support the response to the pandemic by modelling the spread of the virus so as to assess the overall effectiveness of confinement measures; [and] contact tracing, which aims to notify individuals of the fact that they have been in close proximity of someone who is eventually confirmed to be a carrier of the virus, in order to break the contamination chains as early as possible,” the EDPB writes in the document.

The European Commission and the EU parliament have already weighed in with their own recommendations in this area, including a toolbox to help guide contact tracing app developers. The Commission has also urged Member States to take a common approach to building such apps, and has been leaning on local telcos to provide ‘anonymized and aggregated’ metadata for modelling the spread of the virus across the EU.

The guideline document from the EDPB — a body made up of representatives from the EU’s national data protection agencies which helps coordinate the application of pan-EU data protection law — brings additional expert steerage for those developing digital interventions as part of a public health response to the coronavirus pandemic.

“The EDPB generally considers that data and technology used to help fight COVID-19 should be used to empower, rather than to control, stigmatise, or repress individuals,” it writes. “Furthermore, while data and technology can be important tools, they have intrinsic limitations and can merely leverage the effectiveness of other public health measures. The general principles of effectiveness, necessity, and proportionality must guide any measure adopted by Member States or EU institutions that involve processing of personal data to fight COVID-19.”

Among the body’s specific recommendations is that where location data is being considered for modelling the spread of the coronavirus, or for assessing the effectiveness of national lockdown measures, anonymizing the data is preferable — with the EDPB emphasizing that proper anonymization is not easy.

Given the inherent complexity it also recommends transparency around the anonymization methodology used. (tl;dr: there’s no security in obscurity, nor indeed accountability.)

“Many options for effective anonymisation exist, but with a caveat. Data cannot be anonymised on their own, meaning that only datasets as a whole may or may not be made anonymous,” it notes.

“A single data pattern tracing the location of an individual over a significant period of time cannot be fully anonymised. This assessment may still hold true if the precision of the recorded geographical coordinates is not sufficiently lowered, or if details of the track are removed and even if only the location of places where the data subject stays for substantial amounts of time are retained. This also holds for location data that is poorly aggregated.

“To achieve anonymisation, location data must be carefully processed in order to meet the reasonability test. In this sense, such a processing includes considering location datasets as a whole, as well as processing data from a reasonably large set of individuals using available robust anonymisation techniques, provided that they are adequately and effectively implemented.”
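To make the EDPB’s point concrete — that individual traces cannot be anonymized on their own, only datasets as a whole — here is a minimal, illustrative sketch of the kind of coarse aggregation the guidance alludes to. The grid size, time window and threshold below are hypothetical choices for illustration, not values from the guidance, and real anonymization would need far more rigorous analysis than a simple suppression threshold:

```python
from collections import Counter

def coarsen(point, cell=0.01, window=3600):
    """Snap a (lat, lon, unix_ts) point to a coarse grid cell and time bucket.
    A cell of ~0.01 degrees is roughly a 1 km grid; window is in seconds."""
    lat, lon, ts = point
    return (int(lat // cell), int(lon // cell), int(ts // window))

def aggregate_counts(points, k=10, cell=0.01, window=3600):
    """Count observations per coarse cell and suppress any cell seen fewer
    than k times -- a simple k-anonymity-style threshold. Individual traces
    never leave the function; only aggregate counts do."""
    counts = Counter(coarsen(p, cell, window) for p in points)
    return {key: n for key, n in counts.items() if n >= k}
```

Note that even this only treats one snapshot: as the EDPB stresses, a sequence of such cells over time can still re-identify someone if the aggregation is too fine.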

On contact tracing apps — aka digital tools that are designed to map proximity between individuals, as a proxy for infection risk — the EDPB urges that use of such apps be voluntary.

“The systematic and large scale monitoring of location and/or contacts between natural persons is a grave intrusion into their privacy,” it warns. “It can only be legitimised by relying on a voluntary adoption by the users for each of the respective purposes. This would imply, in particular, that individuals who decide not to or cannot use such applications should not suffer from any disadvantage at all.”

The importance of accountability is also front and center, with the EDPB saying the controller of such apps must be clearly defined.

“The EDPB considers that the national health authorities could be the controllers for such application; other controllers may also be envisaged. In any case, if the deployment of contact tracing apps involves different actors their roles and responsibilities must be clearly established from the outset and be explained to the users.”

Purpose limitation is another highlighted component. Apps need to have purposes that are “specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis (e.g., commercial or law enforcement purposes)”, it says.

So, in other words, no function creep — and no EU citizen mass surveillance via a pandemic backdoor.

The EDPB also writes that “careful consideration should be given to the principle of data minimisation and data protection by design and by default” — noting specifically that contact tracing apps “do not require tracking the location of individual users”.

Instead “proximity data should be used” for the contact tracing purpose.

“Contact tracing applications can function without direct identification of individuals,” it further emphasizes, adding that “appropriate measures should be put in place to prevent re-identification”.

The guidance aligns with the coronavirus contact tracing model devised jointly by Apple and Google — which have said they will be offering a cross-platform API for COVID-19 contact tracing based on ephemeral proximity IDs shared via Bluetooth.
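The core idea behind ephemeral proximity IDs can be sketched in a few lines. The snippet below is a simplified illustration, not the actual Apple/Google Exposure Notification scheme (whose key derivation and cryptography differ in detail): a random per-day key derives a short-lived identifier for each broadcast interval, so the IDs a phone broadcasts over Bluetooth cannot be linked to each other, or to a person, without the day key:

```python
import hmac
import hashlib
import os

def daily_key():
    """A fresh random per-day key, kept on the device.
    (Real protocols derive these keys differently.)"""
    return os.urandom(16)

def proximity_id(day_key, interval):
    """Derive a short-lived ID for one ~10-minute broadcast interval.
    Without the day key, one ID reveals nothing about the others."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def ids_for_day(day_key, intervals_per_day=144):
    """All IDs a device would rotate through in a day (144 x 10 minutes)."""
    return [proximity_id(day_key, i) for i in range(intervals_per_day)]
```

Only if a user is later confirmed infected are their day keys shared, letting other devices recompute and recognize the IDs they overheard.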

At one point the EDPB guidance appears to be leaning towards favoring such decentralized approaches to contact tracing apps, with the body writing that “the collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary”.

Later on in the guidance, though, it discusses centralized models that involve proximity data being uploaded to a server in the cloud, writing that: “Implementations for contact tracing can follow a centralized or a decentralized approach. Both should be considered viable options, provided that adequate security measures are in place, each being accompanied by a set of advantages and disadvantages.”

In Europe there is currently a big fight between different camps over whether contact tracing apps should use a centralized or decentralized model for storing and processing proximity data. A standardization effort known as PEPP-PT, backed by Germany’s Fraunhofer Institute for Telecommunications and some EU governments, wants to support centralized protocols for COVID-19 contact tracing, while a separate coalition of European academics argues, on privacy grounds, for decentralized approaches only — and has developed a protocol called DP-3T.
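The privacy argument for the decentralized camp is that exposure matching can happen entirely on the user’s device. As a hedged sketch (again simplified relative to DP-3T or the Apple/Google design, and reusing an illustrative HMAC-based ID derivation): the health authority publishes only the day keys of confirmed cases, and each phone locally regenerates the IDs those keys could have produced and intersects them with the IDs it actually overheard:

```python
import hmac
import hashlib
import os

def proximity_id(day_key, interval):
    """Illustrative derivation of a 16-byte ephemeral ID from a day key."""
    return hmac.new(day_key, interval.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

def exposure_check(observed_ids, published_infected_keys, intervals_per_day=144):
    """Runs entirely on the user's device: rebuild every ID an infected user
    could have broadcast from their published day keys, then intersect with
    the IDs this device actually heard over Bluetooth. The server never
    learns who met whom."""
    candidates = {proximity_id(k, i)
                  for k in published_infected_keys
                  for i in range(intervals_per_day)}
    return candidates & set(observed_ids)
```

In a centralized design, by contrast, the overheard IDs themselves are uploaded and the matching happens on the server — which is exactly the design choice the two camps are fighting over.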

“The current health crisis should not be used as an opportunity to establish disproportionate data retention mandates,” the EDPB warns. “Storage limitation should consider the true needs and the medical relevance (this may include epidemiology-motivated considerations like the incubation period, etc.) and personal data should be kept only for the duration of the COVID-19 crisis. Afterwards, as a general rule, all personal data should be erased or anonymised.”

The body also recommends algorithms used in contact tracing apps be audited and regularly reviewed by outside experts.

Again, a key criticism of the PEPP-PT initiative has been its lack of transparency — including its failure to publish code for external review. (Though it has said it will publish code.)

“In order to ensure their fairness, accountability and, more broadly, their compliance with the law, algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available for the widest possible scrutiny,” the EDPB writes.

Another notable piece of the guidance is for a data protection impact assessment not only to be carried out but that it be published — which marks a further push for accountability via transparency in such an unprecedented moment.

“The EDPB considers that a data protection impact assessment (DPIA) must be carried out before implementing such tool as the processing is considered likely high risk (health data, anticipated large-scale adoption, systematic monitoring, use of new technological solution). The EDPB strongly recommends the publication of DPIAs,” it writes.

Typically DPAs leave it up to data controllers to decide whether to publish a DPIA or not — in this case the strong push from the central authority is that these documents be made public where COVID-19 contact tracing apps are concerned.

Having highlighted the pros and cons of centralized vs decentralized approaches to contact tracing, the EDPB goes on to recommend that the conceptual phase of app development “should always include thorough consideration of both concepts, carefully weighing up the respective effects on data protection/privacy and the possible impacts on individuals’ rights”.

“Any server involved in the contact tracing system must only collect the contact history or the pseudonymous identifiers of a user diagnosed as infected as the result of a proper assessment made by health authorities and of a voluntary action of the user. Alternately, the server must keep a list of pseudonymous identifiers of infected users or their contact history only for the time to inform potentially infected users of their exposure, and should not try to identify potentially infected users.”

“Putting in place a global contact tracing methodology including both applications and manual tracing may require additional information to be processed in some cases. In this context, this additional information should remain on the user terminal and only be processed when strictly necessary and with his prior and specific consent,” it adds.

You can read the full document here.

Google gobbling Fitbit is a major privacy risk, warns EU data protection advisor

The European Data Protection Board (EDPB) has intervened to raise concerns about Google’s plan to scoop up the health and activity data of millions of Fitbit users — at a time when the company is under intense scrutiny over how extensively it tracks people online and for antitrust concerns.

Google confirmed its plan to acquire Fitbit last November, saying it would pay $7.35 per share for the wearable maker in an all-cash deal that valued Fitbit, and therefore the activity, health, sleep and location data it can hold on its more than 28M active users, at ~$2.1 billion.

Regulators are in the process of considering whether to allow the tech giant to gobble up all this data.

Google, meanwhile, is in the process of dialling up its designs on the health space.

In a statement issued after a plenary meeting this week the body that advises the European Commission on the application of EU data protection law highlights the privacy implications of the planned merger, writing: “There are concerns that the possible further combination and accumulation of sensitive personal data regarding people in Europe by a major tech company could entail a high level of risk to the fundamental rights to privacy and to the protection of personal data.”

Just this month the Irish Data Protection Commission (DPC) opened a formal investigation into Google’s processing of people’s location data — finally acting on GDPR complaints filed by consumer rights groups as early as November 2018, which argue the tech giant uses deceptive tactics to manipulate users in order to keep tracking them for ad-targeting purposes.

We’ve reached out to the Irish DPC — which is the lead privacy regulator for Google in the EU — to ask if it shares the EDPB’s concerns.

The latter’s statement goes on to reiterate the importance of EU regulators assessing what it describes as the “longer-term implications for the protection of economic, data protection and consumer rights whenever a significant merger is proposed”.

It also says it intends to remain “vigilant in this and similar cases in the future”.

The EDPB includes a reminder that Google and Fitbit have obligations under Europe’s General Data Protection Regulation to conduct a “full assessment of the data protection requirements and privacy implications of the merger” — and do so in a transparent way, under the regulation’s principle of accountability.

“The EDPB urges the parties to mitigate the possible risks of the merger to the rights to privacy and data protection before notifying the merger to the European Commission,” it also writes.

We reached out to Google for comment but at the time of writing it had not responded, nor answered a question asking what commitments it will make to Fitbit users regarding the privacy of their data.

Fitbit has previously claimed that users’ “health and wellness data will not be used for Google ads”.

However, big tech has a history of subsequently steamrollering founder claims that ‘nothing will change’. (See, e.g., Facebook’s WhatsApp U-turn on data-linking.)

“The EDPB will consider the implications that this merger may have for the protection of personal data in the European Economic Area and stands ready to contribute its advice on the proposed merger to the Commission if so requested,” the advisory body adds.

We’ve also reached out to the European Commission’s competition unit for a response to the EDPB’s statement.