Meta dodged a €4BN privacy fine over unlawful ads, argues GDPR complainant

A €390M privacy fine for Meta announced earlier this month in the European Union — for running behavioral ads on Facebook and Instagram in the region without a valid legal basis — was several billion euros smaller than it should have been, and orders of magnitude too tiny to be a deterrent for others going big on breaking the bloc’s privacy laws, according to the not-for-profit which filed the original complaint over Facebook’s ‘forced consent’ back in May 2018.

This week the privacy rights group, noyb, has written to the European Data Protection Board (EDPB) to raise fresh hell — arguing that the Irish regulator which issued the final decision on its complaint against Meta’s ads failed to follow the Board’s instructions to investigate the financial benefits Meta accrued from the unlawful data processing.

It argues the Irish Data Protection Commission (DPC) has failed to implement the EDPB’s binding decision from December — which instructed the regulator to both find the legal basis Meta had claimed for running behavioral ads unlawful and significantly increase the size of the fine the DPC had proposed in its earlier draft decision.

In the final decision issued earlier this month, the DPC declined to act on the Board’s direction to ascertain an estimate of the financial benefit Meta gained from targeting EU users with behavioral ads in breach of EU data protection law.

And while the Irish regulator did top up the level of the fine on Meta — to €390M, vs the €28M to €36M it had originally proposed for transparency failures — the revised penalty neither reflects the seriousness of the systematic breach of European users’ fundamental rights, per noyb, nor implements the Board’s requirement that the DPC determine the unlawful financial benefits Meta accrued from running ads that break EU privacy law.

noyb notes that, per EDPB guidelines on calculation of fines (and the text of the final decision put out by the DPC incorporating the Board’s binding decisions), the Irish regulator needed to ensure any fines “counterbalanc[e] the gains from the infringement” and also “impose a fine that exceeds that [unlawfully obtained] amount”.

“In the absence of directions, the [DPC] is unable to ascertain an estimation of the matters identified above. Accordingly, I am unable to take these matters into account for the purpose of this assessment,” is how the DPC’s Helen Dixon dryly dismissed the EDPB’s instruction — a few lines of text that essentially let Meta off the hook on what noyb calculates should have been a penalty set at the maximum possible under the EU’s General Data Protection Regulation (GDPR): 4% of annual revenue. (Or over €4BN in Meta’s case.)

noyb’s letter lays out how it has estimated the total revenue Meta generated, over the 4.5+ year infringement period, from users in the European Economic Area (EEA) — a figure it puts at circa €72.5BN. It says it’s arrived at this estimate by looking at the publicly listed company’s financial reports (and adjusting revenue figures to only reflect users in the EEA, not the European continent as a whole) — querying why the DPC’s far more numerous staff couldn’t have done the same.

“While ‘behavioural advertisement’ does not make up all the revenue of Meta’s overall advertising, it is clear that in any realistic scenario, the revenue from ‘behavioural advertisement’ in the EU overshot the maximum [possible, under GDPR] fine of €4.36BN,” noyb also argues.

In a statement, its honorary chairman, Max Schrems, adds: “By not even checking publicly available information, the DPC gifted €3.97BN to Meta.”

“It took us an hour and a spreadsheet to make the calculation,” he went on. “I am sure the Irish taxpayers would not mind having that extra cash, if a DPC employee would have just opened a search engine and done some research.”
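That spreadsheet arithmetic is simple enough to reproduce in a few lines. As a rough sketch — using the figures reported above, all of which are noyb’s own estimates rather than audited numbers — the calculation looks something like this:

```python
# Back-of-envelope reproduction of noyb's calculation.
# All figures are in € billions and are noyb's estimates as reported above,
# not audited Meta numbers.
max_fine = 4.36        # noyb's figure for the GDPR Art. 83(5) cap: 4% of annual turnover
actual_fine = 0.39     # the DPC's final €390M fine
eea_ad_revenue = 72.53 # noyb's estimated EEA ad revenue over the infringement period

# The gap between the maximum possible fine and the fine actually imposed
gifted = max_fine - actual_fine

# What Meta would keep of that revenue even if the maximum fine were applied
revenue_after_max_fine = eea_ad_revenue - max_fine

print(f"Shortfall vs maximum fine: €{gifted:.2f}BN")                      # €3.97BN
print(f"Revenue left after a full 4% fine: €{revenue_after_max_fine:.2f}BN")  # €68.17BN
```

Which is the nub of noyb’s complaint: even the maximum fine barely dents the estimated revenue, and the actual fine falls nearly €4BN short of that maximum.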

noyb’s letter also questions why the DPC apparently failed to use its statutory powers under the regulation to ask the data controller for any information required for the performance of its tasks — which could have provided it with a precise route to estimate how wealthy Meta got by unlawfully processing Europeans’ data.

“Given that SAs [supervisory authorities] can only fine based on the revenue of the last year, and the Irish DPC has taken more than 4.5 years to issue a final decision, Meta has made substantial revenue from violating the law, even if the maximum fine of 4% of the annual turnover is applied,” noyb goes on. “The estimated revenue from advertisements in the EEA of €72.53BN, would only be reduced to €68.17BN if the full 4% would be applied. This clearly makes even a maximum fine of 4% not even remotely ‘effective, proportionate and dissuasive’ in comparison to the unlawful revenue made by Meta IE [Ireland].

“Nevertheless the EDPB and the DPC are bound by Articles 83(1), (2)(k) and (5) GDPR at the same time, meaning that the maximum fine of 4% may not be overstepped but must also be used fully to comply with the conflicting requirements of the GDPR.”

So — tl;dr — even the maximum possible financial penalty under GDPR would not have been remotely dissuasive to Meta in financial terms — given how much more money it was minting by trampling all over European users’ privacy. Yet, the kicker is, Meta didn’t even get fined that (inadequate) maximum amount! Lol! 

noyb’s letter presents a neatly calculated and — frankly — damning assessment of high-profile enforcement flaws in the GDPR. Flaws that enable Big Tech to play the system by forum shopping for ‘friendly’ regulators who can find endless ways to chew the cud around complaints and spin claims of protocol and procedure into a full-blown dance of dalliance and delay. And whose convenient decisions can, at the last, be relied upon to help minimize any damage — a cynical mockery of due process that has turned the EU’s flagship data protection framework into a paper tiger where Big Tech’s users’ rights are concerned.

noyb is calling on the EDPB to take “immediate action” against the DPC — to ensure its binding decision “is fully implemented in [or, well, by] Ireland”.

“Given the clear evidence that Meta IE [Ireland] has profited from the violation of Article 6(1) GDPR in vast excess of the maximum fine of 4% under Article 83(5) GDPR and the Irish DPC’s clear breach of the binding decision in this respect, we urge the EDPB and its members to take immediate action against the Irish DPC to ensure that the EDPB decision is fully implemented in Ireland,” it urges.

However, this (meta – ha!) complaint by noyb about the outcome of its 2018 complaint over Meta’s ads most likely lands at the end of the road as far as regulators are concerned. Next stop: class-action-style litigation?

noyb’s call joins a pile of complaints (and legal actions) targeting the Irish regulator’s failure to rigorously enforce the GDPR against abusive Big Tech business models — including litigation over inaction (also vis-a-vis the behavioral ads industry) and an accusation of criminal corruption (also from noyb), to name two of the barrage of slings and arrows fired at the DPC since the GDPR came into application (on paper) and complainants started the clock on their interminable wait for enforcement. 

The DPC was contacted for comment on noyb’s complaint to the EDPB — but it declined to offer a response.

We also reached out to the EDPB. A spokeswoman for the Board told us it “takes note” of noyb’s letter — but declined further comment at this time.

It remains to be seen what action — if any — the steering body will take. Its powers are limited in this context since its competence to intervene in the GDPR enforcement process relates to any objections raised to a lead supervisor’s draft decision (as happened in the Meta ads case).

After a final decision is issued the Board does not carry out a full re-evaluation of a case. So the chance of it being able to do much more here looks slim.

EU law enshrines the independence of Member States’ data protection regulators so the Board essentially has to work with whatever it’s given in a draft decision (and/or any objections raised by other DPAs). Which is why the DPC also sees mileage in challenging the portion of the Board’s binding decision that instructed it to further investigate Meta’s data processing — as it argues that’s jurisdictional overreach.

This structure effectively means a lead DPA can do considerable work to shape GDPR outcomes that impact users all over the bloc — by, for starters, minimizing what they investigate and then, even if they do open a probe, by narrowly scoping these enquiries and limiting what they factor into their preliminary decisions.

In the case of Meta, the DPC did not provide any data on the estimated financial benefit the company amassed from its unlawful behavioral ads. Which — once again — looks terribly convenient for the tech giant.

While there’s not much Internet users can do about such a gaping enforcement gap — aside from hoping litigation funders step in and spin up more class-action style lawsuits to sue for damages on these major breaches — EU lawmakers themselves should be very concerned.

Concerned that a flagship piece of the EU’s digital rulebook — one that’s now also a key component at the heart of an expanding tapestry of regulations the bloc has been building up in recent years around data governance, to try to foster trust and get more data flowing in the hopes of fuelling a revolution in homegrown AI innovation — is proving to be such a jelly in the face of systematic law breaking.

Rules that can’t protect or correct aren’t going to impress anyone over the long run. And that means the paper tiger could yet draw blood: if GDPR enforcement failures keep stacking up, the sour taste that leaves for EU citizens tired of watching their rights trampled risks toppling trust in the whole carefully constructed ‘European project’.

Meta dodged a €4BN privacy fine over unlawful ads, argues GDPR complainant by Natasha Lomas originally published on TechCrunch

WhatsApp challenge to decision that led to $267M GDPR fine tossed by EU court

More bad news for Meta in Europe: The tech giant has lost an attempt to annul a binding decision on its messaging app, WhatsApp, taken by the European Data Protection Board (EDPB) last summer — under the bloc’s General Data Protection Regulation (GDPR) — which factored into a final decision (and hefty fine) issued by WhatsApp’s lead EU data protection supervisor, Ireland’s Data Protection Commission (DPC), just over a year ago.

Meta still has a live appeal against the WhatsApp GDPR enforcement in Ireland, under Irish law — where the DPC issued its final decision on this enquiry in September last year — so the tech giant’s legal challenge still has road to run.

But in a judgement published today, the European Union’s General Court found WhatsApp Ireland’s action for annulment to be inadmissible.

The EDPB’s binding decision led to WhatsApp being issued with a €225 million fine for breaching GDPR transparency obligations — a financial sanction that was substantially larger than the €30M to €50M originally proposed by the DPC in its draft decision, underlining how significant the Board’s interventions can be. 

Under the GDPR’s one-stop-shop mechanism, regulatory enquiries into companies whose users span multiple EU Member States are typically funnelled through a lead supervisor in the country of their main establishment in the EU (in Meta’s case, that’s Ireland). But any draft decision the lead supervisor produces must be submitted to the other EU data protection authorities for review — and if objections are lodged, a dispute resolution mechanism kicks in that can culminate in the EDPB taking a binding decision if no consensus is reached between the DPAs.

So the Board plays a crucial role in ensuring that enforcement of the bloc’s flagship data protection regulation does not stall in perpetual inter-regulatory bickering.

Just yesterday, for example, the EDPB confirmed it’s stepped in to take three more binding decisions on Meta-owned companies in relation to different GDPR complaints — against Facebook, Instagram and WhatsApp. Final decisions on those ‘legal basis’ cases are due from Ireland’s DPC early next year — and any Meta appeal against the Board’s role in those decisions looks less likely to succeed given today’s slapdown.

The EDPB also stepped in last summer to issue a binding decision on the aforementioned WhatsApp transparency enquiry after DPAs failed to agree on a number of issues. Its intervention resulted in extra woe for WhatsApp — the Board found more violations than the DPC and identified problems with how the Irish regulator had calculated the proposed fine, leading it to require that the DPC issue a larger financial sanction in its final decision.

The Board’s intervention also reduced the period of time WhatsApp was given to implement the corrective measures ordered under the enforcement — cutting it in half, down to three months from the six suggested by the DPC. So, again, its role can be significant in shaping final decisions, especially in more complex, contested GDPR cases.

But while the Board is critical to keeping GDPR enforcement flowing, it’s still the lead data protection authority that is, ultimately, responsible for taking the final decision on the cases it leads — with the stipulation that its final decision must incorporate an EDPB binding decision, if there is one.

And that nuance — the difference between a partial and a final decision — looks to be part of the reason why Meta’s attempt to annul the Board’s binding decision foundered in the EU Court, alongside the Court’s view that there was no basis to admit the action under EU procedural law.

The Court also points out that allowing the action to be heard would create a situation of two judicial proceedings (“with significant overlap”) running in parallel — given Meta is appealing the WhatsApp enforcement in Ireland by challenging the DPC’s final decision — and further notes the ability of an Irish court to make a reference to the EU’s Court of Justice if it has doubts as to the validity of the EDPB’s decision. So there is still an avenue for this issue to return to a (higher) EU court down the line.

Responding to its EU legal action being tossed today, a WhatsApp spokesperson sent us this brief statement:

This case concerns a privacy policy from four years ago that has since been updated multiple times and clearly details the industry-leading privacy protections WhatsApp provides. We still strongly disagree with the EDPB decision and will consider all available options.

WhatsApp challenge to decision that led to $267M GDPR fine tossed by EU court by Natasha Lomas originally published on TechCrunch

Transatlantic data transfers ‘deal in principle’ faces tough legal review

The political agreement reached late last month between the European Union and the US administration on a new transatlantic data transfers pact — one that aims to end years of legal uncertainty for businesses exporting data from the bloc — is not yet a done deal. The deal in principle faces scrutiny in the coming months once the full text is published, and will most likely face fresh (and fast) legal challenges if it does get adopted — so everything hinges on the detail.

Yesterday the European Data Protection Board (EDPB), which advises on compliance with EU data protection law, put out a statement signalling where it will be directing its attention when it reviews this detail — saying it will be paying “special attention to how this political agreement is translated into concrete legal proposals”.

“The EDPB looks forward to assessing carefully the improvements that the new framework may bring in light of EU law, CJEU case law and previous recommendations of the Board, once the EDPB receives all supporting documents from the European Commission,” the Board wrote.

“In particular, the EDPB will analyse whether the collection of personal data for national security purposes is limited to what is strictly necessary and proportionate. In addition, the EDPB will examine how the announced independent redress mechanism respects EEA individuals’ right to an effective remedy and to a fair trial. More specifically, the EDPB will look into whether any new authority part of this mechanism has access to relevant information, including personal data, when exercising its mission and whether it can adopt decisions binding on the intelligence services. The EDPB will also consider whether there is a judicial remedy against this authority’s decisions or inaction.”

The EDPB also warned that the political deal is not yet a legal agreement — emphasizing that data exporters must continue to comply with the case law of the bloc’s top court in the meantime, especially the CJEU’s July 2020 ruling, aka Schrems II, which struck down the last EU-US data transfers deal (the EU-US Privacy Shield).

Talking up the political deal reached last month to replace the defunct Privacy Shield, the Biden administration said the US has committed to putting in place “new safeguards” which it said would ensure that state surveillance agencies’ data-gathering activities will be “necessary and proportionate” and linked to “defined national security objectives”.

The clash between the primacy of US surveillance laws and robust EU privacy rights remains the fundamental schism — so it’s difficult to see how any new deal will be able to stand against fresh legal challenges unless it commits to putting hard limits on US mass surveillance programs.

The replacement deal will also need to create a proper avenue for EU individuals to seek and obtain redress if they believe US intelligence agencies have unlawfully targeted them. And that also looks difficult.

Last month, ahead of the announcement of the political agreement, The Hill reported on a US Supreme Court ruling in a case related to FBI surveillance that it suggested made the chance of a deal harder — as the court reinforced state secrets privilege for spying cases by finding that Congress did not eliminate this privilege when it enacted surveillance reforms in the Foreign Intelligence Surveillance Act (FISA).

“Though the opinion left open the possibility that people such as the Fazaga plaintiffs nonetheless could pursue claims based on public information about the government’s surveillance, most people need sensitive information from the government to help prove that its surveillance was illegal. The decision could make it easier for the government to shield such information from judges, and therefore harder for most people challenging surveillance to prove their claims and obtain justice in court,” the publication reported.

The need for deeper reforms of FISA has been a key call from critics of previous EU-US data transfer deals (before Privacy Shield there was Safe Harbor — which was struck down by the CJEU in 2015).

Last month the White House said the deal agreed in principle would enable EU individuals to “seek redress from a new multi-layer redress mechanism that includes an independent Data Protection Review Court that would consist of individuals chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures as needed”.

However the legal status of this “Review Court” will be key — as the EDPB’s statement underlines.

Moreover, if the US Supreme Court takes a view that essentially overrides any deal the Biden administration is promising — by making it impossible for EU individuals to obtain the information they need to bring a claim against the US government — that would undermine Europeans’ ability to actually obtain redress. And the CJEU has made it crystal clear that EU individuals subject to illegal surveillance in a third country must have a genuine and meaningful way to pursue accountability.

The EDPB’s statement elucidates exactly these concerns — with the Board flagging that any “new authority” set up under a claim of delivering redress will need “access to relevant information, including personal data” in order to be able to live up to that mission; and will also need to be able to adopt decisions that are binding on the intelligence services.

It’s worth remembering that the ‘ombudsperson’ regime tested under Privacy Shield didn’t pass muster with the CJEU — both on grounds of independence and because the ombudsperson was unable to adopt decisions binding on the intelligence services.

How different a “Data Protection Review Court” would be in those regards remains to be seen.

Max Schrems, the EU privacy campaigner who successfully brought down the last two EU-US data transfers deals, remains sceptical that the latest ‘fix’ offers anything substantially different — recently tweeting another eye-catching visual metaphor to illustrate his early assessment…

Failing genuine surveillance reform in the US, it may well be that squaring the data transfers circle proves as steep a challenge as it has the last two times around. But even if the political imperative inside the EU to do a deal overrides obvious legal gaps — as it did when the last Commission ignored concerns and adopted the Privacy Shield — that will just mean the two sides are buying time until the next CJEU strike-down.

Likely not very much time either.

While Safe Harbor stood for 15 years, Privacy Shield only lasted four — and Schrems has suggested a fresh challenge to another flawed replacement would be fast tracked into the CJEU “within months” of a final decision to adopt it. So EU lawmakers have been warned.

EU warns adtech giants over ‘legal tricks’ as it moots changes to centralize privacy oversight

The European Commission has given its clearest signal yet that it’s prepared to intervene over weak enforcement of the EU’s data protection rules against big tech.

Today the bloc’s executive also had a warning for adtech giants Google and Facebook — accusing them of choosing “legal tricks” over true compliance with the EU’s standard of “privacy by design” — and emphasizing the imperative for them to take data protection “seriously”.

Speaking at a privacy conference this morning, Vera Jourová, the EU’s commissioner for values and transparency, said enforcement of the General Data Protection Regulation (GDPR) at a national level must buck up — and become “effective” — or else it “will have to change”, warning that any “potential changes” will move toward centralized enforcement.

“When I was looking at existing enforcement decisions and pending cases, I also came to another conclusion,” she also said. “So, we have penalties or decisions against Google, Facebook, WhatsApp.

“To me this means that clearly there is a problem with compliance culture among those companies that live off our personal data. Despite the fact that they have the best legal teams, presence in Brussels and spent countless hours discussing with us the GDPR. Sadly, I fear this is not privacy by design.

“I think it is high time for those companies to take protection of personal data seriously. I want to see full compliance, not legal tricks. It’s time not to hide behind small print, but tackle the challenges head on.”

In parallel, an influential advisor to the bloc’s top court has today published an opinion stating that EU law does not preclude consumer protection agencies from bringing representative actions at a national level, following a referral by a German court in a case against Facebook Ireland. If the CJEU’s judges agree, that could open up a fresh wave of challenges to tech giants’ misuse of people’s data — without the need to funnel complaints through the single point of failure of gatekeeper regulators like Ireland’s Data Protection Commission (DPC).

Towards centralized privacy oversight?

On paper, EU law provides people in the region with a suite of rights and protections attached to their data. And while the regulation has attracted huge international attention, as other regions grapple with how to protect people in an age of data-mining giants, the problem for many GDPR critics, as it stands, is that the law decentralizes oversight of these rules and rights to a patchwork of supervisory agencies at the EU Member State level.

While this can work well for cases involving locally bounded services, major problems arise where complaints span borders within the EU — as is always the case with tech giants’ (global) services. This is because a one-stop-shop (OSS) mechanism kicks in, ostensibly to reduce the administrative burden for businesses.

But it also enables a huge get-out clause for tech giants, allowing them to forum shop for a ‘friendly’ regulator through their choice of where to locate their regional HQ. And working from a local EU base, corporate giants can use investment and job creation in that Member State as a lever to work against and erode national political will to press for vigorous oversight of their European business at the local authority level.

“In my view, it does take too long to address the key questions around processing of personal data for big tech,” said Jourová giving a keynote speech to the Forum Europe data protection & privacy conference. “Yes, I understand the lack of resources. I understand there is no pan-European procedural law to help the cross-border cases. I understand that the first cases need to be rock-solid because they will be challenged in court.

“But I want to be honest — we are in the crunch time now. Either we will all collectively show that GDPR enforcement is effective or it will have to change. And there is no way back to decentralised model that was there before the GDPR. Any potential changes will go towards more centralisation, bigger role of the EDPB [European Data Protection Board] or Commission.”

Jourová added that the “pressure” to make enforcement effective “is already here” — pointing to debate around incoming legislation that will update the EU’s rules around ecommerce, and emphasizing that, on the Digital Services Act, Member States have been advocating for enforcement change — and “want to see more central role of the European Commission”.

Point being that if there’s political will for structural changes to centralize EU enforcement among Member States, the Commission has the powers to propose the necessary amendments — and will hardly turn its nose up at being asked to take on more responsibility itself.

Jourová’s remarks are a notable step up from her approach to the thorny issue of GDPR enforcement back in summer 2020 — when, at the regulation’s two-year review mark, she was still talking about the need to properly resource DPAs so that they could “step up their work” and deliver “vigorous but uniform enforcement”, as she put it then.

Now, in the dying days of 2021 — with a still massive backlog of decisions yet to be issued on cross-border cases, some of them highly strategic, targeting adtech platforms’ core surveillance business model (Jourová’s speech noted that 809 one-stop-shop procedures have been triggered but only 290 final decisions issued) — the Commission appears to be signalling that it’s finally running out of patience on enforcement.

And that it is already eyeing a Plan B to make the GDPR truly effective.

Criticism of weak enforcement against tech giants has been a rising chorus in Europe for years. Most recently frustration with regulatory inaction led privacy campaigner Max Schrems’ not-for-profit, noyb, to file a complaint of criminal corruption against the GDPR’s most infamous bottleneck: Ireland’s DPC, accusing the regulator of engaging in “procedural blackmail” which it suggested would help Facebook by keeping key developments out of the public eye, among other eyebrow-raising charges.

The Irish regulator has faced the strongest criticism of all the EU DPAs over its role in hampering effective GDPR enforcement.

Although it’s not the only authority to be accused of creating a bottleneck by letting major complaints pile up on its desk and taking an ice age to investigate complaints and issue decisions (assuming it opens an investigation at all).

The UK’s ICO — when the country was still in the EU — did nothing about complaints against real-time-bidding’s abuse of people’s data, for example, despite sounding a public warning over behavioral ads’ unlawfulness as early as 2019. While Belgium’s DPA has taken an inordinate amount of time to issue a final decision on IAB Europe’s TCF failing to comply with the GDPR. But Ireland’s central role in regulating most of big tech means it attracts the most flak.

The sheer number of tech giants that have converged on Ireland — wooed by low corporate tax rates (likely with the added cherry of business-friendly data oversight) — gives it an outsized role in overseeing what’s done with Europeans’ data.

Hence Ireland has open investigations into Apple, Google, Facebook and many others — yet has only issued two final decisions on cross-border cases so far (Twitter last year; and WhatsApp this year).

Both of those decisions went through a dispute mechanism that’s also baked into the GDPR — which kicks in when other EU DPAs don’t agree with a draft decision by the lead authority.

That mechanism further slowed down the DPC’s enforcement in those cases — but substantially cranked up the intervention the two companies ultimately faced. Ireland had wanted to be a lot more lenient vs the collective verdict once all of the bloc’s oversight bodies had had their say.

That too, critics say, demonstrates the DPC’s regulatory capture by platform power.

An opinion piece in yesterday’s Washington Post skewered the DPC as “the wrong privacy watchdog for Europe” — citing a study by the Irish Council for Civil Liberties that found it had only published decisions on about 2% of the 164 cross-border cases it has taken on.

The number of complaints the DPC has chosen to entirely ignore — i.e. by not opening a formal investigation — or else to quietly shutter (“resolve”) without issuing a decision or taking any enforcement action is likely considerably higher. 

The agency is shielded by a very narrow application of Freedom of Information law, which applies only in relation to DPC records pertaining to the “general administration” of its office. So when TechCrunch asked the DPC, last December, how many times it had used GDPR powers such as the ability to order a ban on processing it declined to respond to our FOIs — arguing the information did not fall under Ireland’s implementation of the law.

Silence and stonewalling only go so far, though.

Calls for root and branch reform of the DPC specifically, and enforcement of the GDPR more generally, can now be heard from Ireland’s own parliament all the way up to the European Commission. And big tech’s game of tying EU regulators in knots looks as if it’s — gradually, gradually — getting toward the end of its rope.

What comes next is an interesting question. Last month the European Data Protection Supervisor (EDPS) announced a conference on the future of “effective” digital enforcement — which will take place in June 2022 — and which he said would discuss best practice and also “explore alternative models of enforcement for the digital future”.

“We are ambitious,” said Wojciech Wiewiorowski as he announced the conference. “There is much scope for discussion and much potential improvement on the way current governance models are implemented in practice. We envisage a dialogue across different fields of regulation — from data protection to competition, digital markets and services, and artificial intelligence as well — both in the EU, and Europe as a continent, but also on the global level.”

Discussion of “different” and “alternative” models of enforcement will be a focus of the event, per Wiewiorowski — who further specified that this will include discussion of “a more centralized approach”. So the EDPS and the Commission appear to be singing a similar tune on reforming GDPR enforcement.

As well as the Commission itself (potentially) taking on an enforcement role in the future — perhaps specifically on major, cross-border cases related to big tech, in order to beef up GDPR’s application against the most powerful offenders (as is already proposed in the case of the DSA and enforcing those rules against ‘very large online platforms’; aka VLOPs) — the GDPR steering and advisory body, the EDPB, also looks set to play an increasingly strategic and important role.

Indeed, it already has a ‘last resort’ decision making power to resolve disputes over cross border GDPR enforcement — and Ireland’s intransigence has led to it exercising this power for the first time.

In the future, the Board’s role could expand further if EU lawmakers decide that more centralization is the only way to deliver effective enforcement against tech giants that have become experts in exhausting regulators with bad faith arguments and whack-a-mole procedures, in order to delay, defer and deny compliance with European law.

The EDPB’s chair, Andrea Jelinek, was also speaking at the Forum Europe conference today. Asked for her thoughts on how GDPR enforcement could improve, including problematic elements like the OSS, she cautioned that change will be a “long term project”, while simultaneously agreeing there are notable “challenges” at the point where national oversight intersects with the needs of cross border enforcement.

“Enforcing at a national level and at the same time resolving cross border cases is time and resource intensive,” she said. “Supervisory authorities need to carry out investigations, observe procedural rules, coordinate and share information with other supervisory authorities. For the current system to work properly it is of vital importance that supervisory authorities have enough resources and staff.

“The differences in national administrative procedures and the fact that in some Member States no deadlines are foreseen for handling a case also creates an obstacle to the efficient functioning of the OSS.”

Jelinek made a point of emphasizing that the EDPB has been taking action to try to remedy some of the issues identified — implementing what she described as “a series of practical solutions” to tackle problems around enforcement.

She said this has included developing (last year) a co-ordinated enforcement framework to facilitate joint actions (“in a flexible and coordinated manner”) — such as launching enforcement sweeps and joint investigations.

The EDPB is also establishing a pilot project to provide a pool of experts to support investigations and enforcement activities “of significant common interest”, she noted, predicting: “This will enhance the cooperation and solidarity between all the supervisory authorities by addressing their operational needs.”

“Finally we should not forget that the GDPR is a long term project and so is strengthening cooperation between supervisory authorities,” she added. “Any transformation of the GDPR will take years. I think the best solution is therefore to deploy the GDPR fully — it is likely that most of the issues identified by Member States and stakeholders will benefit from more experience in the application of the regulation in the coming years.”

However it is already well over three years since the GDPR came into application. So many EU citizens may query the logic of waiting years more for regulators to figure out how to work together to get the job of upholding people’s rights done. Not least because this enforcement impasse leaves data-mining tech giants free to direct their vast data-enabled wealth and engineering resource at developing new ‘innovations’ — to better evade legal restrictions on what they can do with people’s data.

One thing is clear: The next wave of big tech regulatory evasion will come dressed up in claims of privacy “innovation” from the get-go.

Indeed, that is already how adtech giants like Google are trying to re-channel regulators’ attention from enforcing against their core attention-manipulation, surveillance-based business model.

Google SVP Kent Walker also took to the (virtual) conference stage this morning for a keynote slot in which he argued that the novel ad targeting technologies Google is developing under its “Privacy Sandbox” badge (such as FLoC; aka Federated Learning of Cohorts) will provide the answer to what big (ad)tech likes to claim is an inherent tension between European fundamental rights like privacy and economic growth.

The truth, as ever, is a lot more nuanced than that. For one thing, there are plenty of ways to target ads that don’t require processing people’s data. But as most of Europe’s regulators remain bogged down in a mire of corporate capture, under-resourcing, cultural cowardice/risk aversion, internecine squabbles and, at times, a sheer lack of national political will to enforce the law against the world’s wealthiest companies, the adtech duopoly is sounding cockily confident that it will be allowed to carry on and reset the terms of the game in its own interests once again.

(The added irony here is that Google is currently working under the oversight of the UK’s Competition and Markets Authority and ICO on shaping behavioral remedies attached to its Sandbox proposals — and has said that these commitments will be applied globally if the UK is minded to accept them; which does risk tarnishing the GDPR’s geopolitical shine, given the UK is no longer a member of the EU… )

For EU citizens, it could well mean that — once again — it’s up to the CJEU to come to the rescue of their fundamental rights — assuming the court ends up concurring with advocate general Richard de la Tour’s opinion today that the GDPR:

” … does not preclude national legislation which allows consumer protection associations to bring legal proceedings against the person alleged to be responsible for an infringement of the protection of personal data, on the basis of the prohibition of unfair commercial practices, the infringement of a law relating to consumer protection or the prohibition of the use of invalid general terms and conditions, provided that the objective of the representative action in question is to ensure observance of the rights which the persons affected by the contested processing derive directly from that regulation.”

Consumer protection agencies being able to pursue representative legal actions to defend fundamental rights against tech giants’ self interest — at the Member State level, and therefore, all across the EU — could actually unblock GDPR enforcement via a genuinely decentralized wave of enforcement that’s able to route around the damage of captured gatekeepers and call out big adtech’s manipulative tricks in court.

Ireland must ‘swiftly’ investigate legality of Facebook-WhatsApp data-sharing, says EDPB

Facebook’s lead regulator in the European Union must “swiftly” investigate the legality of data-sharing related to a controversial WhatsApp policy update, following an order by the European Data Protection Board (EDPB).

We’ve reached out to the Irish Data Protection Commission (DPC) for a response.

Updated terms had been set to be imposed upon users of the Facebook-owned messaging app early this year — but in January Facebook delayed the WhatsApp terms update until May after a major privacy backlash and ongoing confusion over the details of its user data processing.

Despite WhatsApp going ahead with the policy update, the ToS has continued to face scrutiny from regulators and rights organizations around the world.

The Indian government, for example, has repeatedly ordered Facebook to withdraw the new terms. While, in Europe, privacy regulators and consumer protection organizations have raised objections about how opaque terms are being pushed on users — and in May a German data protection authority issued a temporary (national) blocking order.

Today’s development follows that and is significant as it’s the first urgent binding decision adopted by the EDPB under the bloc’s General Data Protection Regulation (GDPR).

The Board did not, however, agree to order the adoption of final measures against Facebook-WhatsApp as the requesting data supervisor, the Hamburg DPA, had asked — saying that “conditions to demonstrate the existence of an infringement and an urgency are not met”.

The Board’s intervention in the confusing mess around the WhatsApp policy update follows the use of GDPR Article 66 powers by Hamburg’s data protection authority.

In May the latter ordered Facebook not to apply the new terms to users in Germany — saying its analysis found the policy granted “far-reaching powers” to WhatsApp to share data with Facebook, without it being clear what legal basis the tech giant was relying upon to be able to process users’ data.

Hamburg also accused the Irish DPC of failing to investigate the Facebook-WhatsApp data sharing when it raised concerns — hence seeking to take matters into its own hands by making an Article 66 intervention.

As part of the process it asked the EDPB to take a binding decision — asking it to take definitive steps to block data-sharing between WhatsApp and Facebook — in a bid to circumvent the Irish regulator’s glacial procedures by getting the Board to order enforcement measures that could be applied stat across the whole bloc.

However the Board’s assessment found that Hamburg had not met the bar for demonstrating the Irish DPC “failed to provide information in the context of a formal request for mutual assistance under Article 61 GDPR”, as it puts it.

It also decided that the adoption of updated terms by WhatsApp — which it nonetheless says “contain similar problematic elements as the previous version” — cannot “on its own” justify the urgency for the EDPB to order the lead supervisor to adopt final measures under Article 66(2) GDPR.

The upshot — as the Hamburg DPA puts it — is that data exchange between WhatsApp and Facebook remains “unregulated at the European level”.

Article 66 powers

The importance of Article 66 of the GDPR is that it allows EU data protection authorities to derogate from the regulation’s one-stop-shop mechanism — which otherwise funnels cross border complaints (such as those against Big Tech) via a lead data supervisor (oftentimes the Irish DPC), and is thus widely seen as a bottleneck to effective enforcement of data protection (especially against tech giants).

An Article 66 urgency proceeding allows any data supervisor across the EU to immediately adopt provisional measures — provided a situation meets the criteria for this kind of emergency intervention. Which is one way to get around a bottleneck, even if only for a time-limited period.

A number of EU data protection authorities have used (or threatened to use) Article 66 powers in recent years, since GDPR came into application in 2018, and the power is increasingly proving its worth in reconfiguring certain Big Tech practices — with, for example, Italy’s DPA using it recently to force TikTok to remove hundreds of thousands of suspected underage accounts.

Just the threat of Article 66’s use back in 2019 (also by Hamburg) was enough to encourage Google to suspend manual reviews of audio recordings captured by its voice AI, Google Assistant. (And later led to a number of major policy changes by several tech giants who had similarly been manually reviewing users’ interactions with their voice AIs.)

At the same time, Article 66 provisional measures can only last three months — and only apply nationally, not across the whole EU. So it’s a bounded power. (Perhaps especially in this WhatsApp-Facebook case, where the target is a ToS update, and Facebook could just wait out the three months and apply the policy anyway in Germany after the suspension order lapses.)

This is why Hamburg wanted the EDPB to make a binding decision. And it’s certainly a blow to privacy watchers eager for GDPR enforcement to fall on tech giants like Facebook that the Board has declined to do so in this case.

Unregulated data-sharing

Responding to the Board’s decision not to impose definitive measures to prevent data sharing between WhatsApp and Facebook, the Hamburg authority expressed disappointment — see below for its full statement — and also lamented that the EDPB has not set a deadline for the Irish DPC to conduct the investigation into the legal basis of the data-sharing.

Ireland’s data protection authority has only issued one final GDPR decision against a tech giant to date (Twitter) — so there is plenty of cause to be concerned that without a concrete deadline the ordered probe could be kicked down the road for years.

Nonetheless, the EDPB’s order to the Irish DPC to “swiftly” investigate the finer-grained detail of the Facebook-WhatsApp data-sharing does look like a significant intervention by a pan-EU body — as it very publicly pokes a regulator with a now infamous reputation for reluctance to actually do the job of rigorously investigating privacy concerns. 

Demonstrably it has failed to do so in this WhatsApp case. Despite major concerns being raised about the policy update — within Europe and globally — Facebook’s lead EU data supervisor did not open a formal investigation and has not raised any public objections to the update.

Back in January when we asked about concerns over the update, the DPC told TechCrunch it had obtained a ‘confirmation’ from Facebook-owned WhatsApp that there was no change to data-sharing practices that would affect EU users — reiterating Facebook’s line that the update didn’t change anything, ergo ‘nothing to see here’. 

“The updates made by WhatsApp last week are about providing clearer, more detailed information to users on how and why they use data. WhatsApp have confirmed to us that there is no change to data-sharing practices either in the European Region or the rest of the world arising from these updates,” the DPC told us then, although it also noted that it had received “numerous queries” from stakeholders who it described as “confused and concerned about these updates”, mirroring Facebook’s own characterization of complaints.

“We engaged with WhatsApp on the matter and they confirmed to us that they will delay the date by which people will be asked to review and accept the terms from February 8th to May 15th,” the DPC went on, referring to a pause in the ToS application deadline which Facebook enacted after a public backlash that saw scores of users signing up to alternative messaging apps, before adding: “In the meantime, WhatsApp will launch information campaigns to provide further clarity about how privacy and security works on the platform. We will continue to engage with WhatsApp on these updates.”

The EDPB’s assessment of the knotty WhatsApp-Facebook data-sharing terms looks rather different — with the Board calling out WhatsApp’s user communications as confusing and simultaneously raising concerns about the legal basis for the data exchange.

In a press release, the EDPB writes that there’s a “high likelihood of infringements” — highlighting purposes contained in the updated ToS in the areas of “safety, security and integrity of WhatsApp IE [Ireland] and the other Facebook Companies, as well as for the purpose of improvement of the products of the Facebook Companies” as being of particular concern.

From the Board’s PR [emphasis its]:

“Considering the high likelihood of infringements in particular for the purpose of safety, security and integrity of WhatsApp IE [Ireland] and the other Facebook Companies, as well as for the purpose of improvement of the products of the Facebook Companies, the EDPB considered that this matter requires swift further investigations. In particular to verify if, in practice, Facebook Companies are carrying out processing operations which imply the combination or comparison of WhatsApp IE’s [Ireland] user data with other data sets processed by other Facebook Companies in the context of other apps or services offered by the Facebook Companies, facilitated inter alia by the use of unique identifiers. For this reason, the EDPB requests the IE SA [Irish supervisory authority] to carry out, as a matter of priority, a statutory investigation to determine whether such processing activities are taking place or not, and if this is the case, whether they have a proper legal basis under Article 5(1)(a) and Article 6(1) GDPR.”

NB: It’s worth recalling that WhatsApp users were initially told they must accept the updated policy or else the app would stop working. (Although Facebook later changed its approach — after the public backlash.) While WhatsApp users who still haven’t accepted the terms continue to be nagged to do so via regular pop-ups, although the tech giant does not appear to be taking steps to degrade the user experience further as yet (i.e. beyond annoying, recurring pop-ups).

The EDPB’s concerns over the WhatsApp-Facebook data-sharing extend to what it says is “a lack of information around how data is processed for marketing purposes, cooperation with the other Facebook Companies and in relation to WhatsApp Business API” — hence its order to Ireland to fully investigate.

The Board also essentially confirms the view that WhatsApp users themselves have no hope of understanding what Facebook is doing with their data by reading the comms material it has provided them with — with the Board writing [emphasis ours]:

“Based on the evidence provided, the EDPB concluded that there is a high likelihood that Facebook IE [Ireland] already processes WhatsApp IE [Ireland] user data as a (joint) controller for the common purpose of safety, security and integrity of WhatsApp IE [Ireland] and the other Facebook Companies, and for the common purpose of improvement of the products of the Facebook Companies. However, in the face of the various contradictions, ambiguities and uncertainties noted in WhatsApp’s user-facing information, some written commitments adopted by Facebook IE [Ireland] and WhatsApp IE’s [Ireland] written submissions, the EDPB concluded that it is not in a position to determine with certainty which processing operations are actually being carried out and in which capacity.”

We contacted Facebook for a response to the EDPB’s order, and the company sent us this statement — attributed to a WhatsApp spokesperson:

“We welcome the EDPB’s decision not to extend the Hamburg DPA’s order, which was based on fundamental misunderstandings as to the purpose and effect of the update to our terms of service. We remain fully committed to delivering secure and private communications for everyone and will work with the Irish Data Protection Commission as our lead regulator in the region in order to fully address the questions raised by the EDPB.”

Facebook also claimed it has controls in place for ‘controller to processor data sharing’ (i.e. between WhatsApp and Facebook) — which it said prohibit it (Facebook) from using WhatsApp user data for its own purposes.

The tech giant went on to reiterate its line that the update does not expand WhatsApp’s ability to share data with Facebook.

GDPR enforcement stalemate

A further vital component to this saga is the fact the Irish DPC has, for years, been investigating long-standing complaints against WhatsApp’s compliance with GDPR’s transparency requirements — and still hasn’t issued a final decision.

So when the EDPB says it’s highly likely that some of the WhatsApp-Facebook data-processing being objected to is already going on, it doesn’t mean Facebook gets a pass for that — because the DPC hasn’t issued a verdict on whether or not WhatsApp has been up front enough with users.

tl;dr: The regulatory oversight process is still ongoing.

The DPC provisionally concluded its WhatsApp transparency investigation last year — saying in January that it sent a draft decision to the other EU data protection authorities for review (and the chance to object) on December 24, 2020; a step that’s required under the GDPR’s co-decision-making process.

In January, when it said it was still waiting to receive comments on the draft decision, it also said: “When the process is completed and a final decision issues, it will make clear the standard of transparency to which WhatsApp is expected to adhere as articulated by EU Data Protection Authorities.”

Over half a year later and WhatsApp users in the EU are still waiting to find out whether the company’s comms lives up to the required legal standard of transparency or not — with their data continuing to pass between Facebook and WhatsApp in the meanwhile.

The Irish DPC was contacted for comment on the EDPB’s order today and with questions on the current status of the WhatsApp transparency investigation.

It told us it would have a response later today — we’ll update this report when we get it.

Back in November the Irish Times reported that WhatsApp Ireland had set aside €77.5M for “possible administrative fines arising from regulatory compliance matters presently under investigation”. No fines against Facebook have yet been forthcoming, though.

Indeed, the DPC has yet to issue a single final GDPR decision against Facebook (or a Facebook-owned company) — despite more than three years having passed since the regulation started being applied.

Scores of GDPR complaints against Facebook’s data-processing empire — such as this May 2018 complaint against Facebook, Instagram and WhatsApp’s use of so-called ‘forced consent’ — continue to languish without regulatory enforcement in the EU because there have been no decisions from Ireland (and sometimes no investigations either).

The situation is a huge black mark against the EU’s flagship data protection regulation. So the Board’s failure to step in more firmly now — to course-correct — does look like a missed opportunity to tackle a problematic GDPR enforcement bottleneck.

That said, any failure to follow the procedural letter of the law could invite a legal challenge that unpicked any progress. So it’s hard to see any quick wins in the glacial game of GDPR enforcement.

In the meanwhile, the winners of the stalemate are of course the tech giants who get to continue processing people’s data how they choose, with plenty of time to work on reconfiguring their legal, business and system structures to route around any enforcement damage that does eventually come.

Hamburg’s deputy commissioner for data protection, Ulrich Kühn, essentially warns as much in a statement responding to the EDPB’s decision — in which he writes:

“The decision of the European Data Protection Board is disappointing. The body, which was created to ensure the uniform application of the GDPR throughout the European Union, is missing the opportunity to clearly stand up for the protection of the rights and freedoms of millions of data subjects in Europe. It continues to leave this solely to the Irish supervisory authority. Despite our repeated requests over more than two years to investigate and, if necessary, sanction the matter of data exchanges between WhatsApp and Facebook, the IDPC has not taken action in this regard. It is a success of our efforts over many years that IDPC is now being urged to conduct an investigation. Nonetheless, this non-binding measure does not do justice to the importance of the issue. It is hard to imagine a case in which, against the background of the risks for the rights and freedoms of a very large number of data subjects and their de facto powerlessness vis-à-vis monopoly-like providers, the urgent need for concrete action is more obvious. The EDPB is thus depriving itself of a crucial instrument for enforcing the GDPR throughout Europe. This is no good news for data subjects and data protection in Europe as a whole.”

In further remarks the Hamburg authority emphasizes that the Board noted “considerable inconsistencies between the information with which WhatsApp users are informed about the extensive use of their data by Facebook on the one hand, and on the other the commitments made by the company to data protection authorities not (yet) to do so”; and also that it “expressed considerable doubts about the legal basis on which Facebook intends to rely when using WhatsApp data for its own or joint processing” — arguing that the Board therefore agrees with the “essential parts” of its arguments against WhatsApp-Facebook data sharing.

Despite carrying that weight of argument, the call for action is once again back in Ireland’s court.


EU puts out final guidance on data transfers to third countries

The European Data Protection Board (EDPB) published its final recommendations yesterday setting out guidance for making transfers of personal data to third countries to comply with EU data protection rules in light of last summer’s landmark CJEU ruling (aka Schrems II).

The long and short of these recommendations — which are fairly long; running to 48 pages — is that some data transfers to third countries will simply not be possible to (legally) carry out. Despite the continued existence of legal mechanisms that can, in theory, be used to make such transfers (like Standard Contractual Clauses; a transfer tool that was recently updated by the Commission).

However it’s up to the data controller to assess the viability of each transfer, on a case by case basis, to determine whether data can legally flow in that particular case. (Which may mean, for example, a business making complex assessments about foreign government surveillance regimes and how they impinge upon its specific operations.)

Companies that routinely take EU users’ data outside the bloc for processing in third countries (like the US), which do not have data adequacy arrangements with the EU, face substantial cost and challenge in attaining compliance — in a best case scenario.

Those that can’t apply viable ‘special measures’ to ensure transferred data is safe are duty bound to suspend data flows — with the risk, should they fail to do that, of being ordered to by a data protection authority (which could also apply additional sanctions).

One alternative option could be for such a firm to store and process EU users’ data locally — within the EU. But clearly that won’t be viable for every company.

Law firms are likely to be very happy with this outcome since there will be increased demand for legal advice as companies grapple with how to structure their data flows and adapt to a post-Schrems II world.

In some EU jurisdictions (such as Germany) data protection agencies are now actively carrying out compliance checks — so orders to suspend transfers are bound to follow.

While the European Data Protection Supervisor is busy scrutinizing EU institutions’ own use of US cloud services giants to see whether high level arrangements with tech giants like AWS and Microsoft pass muster or not.

Last summer the CJEU struck down the EU-US Privacy Shield — only a few years after the flagship adequacy arrangement was inked. The same core legal issues did for its predecessor, ‘Safe Harbor’, though that had stood for some fifteen years. And since the demise of Privacy Shield the Commission has repeatedly warned there will be no quick fix replacement this time; nothing short of major reform of US surveillance law is likely to be required.

US and EU lawmakers remain in negotiations over a replacement EU-US data flows deal, but a viable outcome that can stand up to legal challenge, as the prior two agreements could not, may well require years of work, not months.

And that means EU-US data flows are facing legal uncertainty for the foreseeable future.

The UK, meanwhile, has just squeezed a data adequacy agreement out of the Commission — despite some loudly enunciated post-Brexit plans for regulatory divergence in the area of data protection.

If the UK follows through in ripping up key tenets of its inherited EU legal framework there’s a high chance it will also lose adequacy status in the coming years — meaning it too could face crippling barriers to EU data flows. (But for now it seems to have dodged that bullet.)

Data flows to other third countries that also lack an EU adequacy agreement — such as China and India — face the same ongoing legal uncertainty.

The backstory to the EU’s international data flows issues originates with a complaint made by the eponymous Max Schrems — filed in the wake of NSA whistleblower Edward Snowden’s revelations about government mass surveillance programs, so more than seven years ago — over what he argued were unsafe EU-US data flows.

His complaint specifically targeted Facebook’s business, calling on the Irish Data Protection Commission (DPC) to use its enforcement powers to suspend Facebook’s EU-US data flows.

A regulatory dance of indecision followed which finally saw legal questions referred to Europe’s top court and — ultimately — the demise of the EU-US Privacy Shield. The CJEU ruling also put it beyond legal doubt that Member States’ DPAs must step in and act when they suspect data is flowing to a location where the information is at risk.

Following the Schrems II ruling, the DPC (finally) sent Facebook a preliminary order to suspend its EU-US data flows last fall. Facebook immediately challenged the order in the Irish courts — seeking to block the move. But that challenge failed. And Facebook’s EU-US data flows are now very much operating on borrowed time.

As one of the platforms subject to Section 702 of the US’ FISA law, its options for applying ‘special measures’ to supplement its EU data transfers look, well, limited to say the least.

It can’t — for example — encrypt the data in a way that ensures it has no access to it (zero access encryption) since that’s not how Facebook’s advertising empire functions. And Schrems has previously suggested Facebook will have to federate its service — and store EU users’ information inside the EU — to fix its data transfer problem.
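Purely as an illustration of the federation idea Schrems describes — and not anything Facebook has actually built — a data-residency routing rule could be as simple as checking where a user is based before picking a storage region. All region names, and the (truncated) country list, are assumptions:

```python
# Hypothetical sketch of 'federated' storage: route EU users' records to
# EU-hosted infrastructure so their data never leaves the EEA.
# All names here are illustrative assumptions, not a real system.
EEA_COUNTRIES = {"AT", "BE", "DE", "ES", "FR", "IE", "IT", "NL", "PL", "SE"}  # truncated list

def storage_region(user_country_code: str) -> str:
    """Pick a data-residency region for a user's records."""
    if user_country_code in EEA_COUNTRIES:
        return "eu-region"   # data stays inside the EEA: no transfer occurs
    return "us-region"       # non-EEA users are unaffected by Schrems II
```

The point of such a split is that no ‘transfer tool’ is needed at all for EU users, because their personal data is never exported in the first place.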

Safe to say, the costs and complexity of compliance for certain businesses like Facebook look massive.

But there will be compliance costs and complexity for thousands of businesses in the wake of the CJEU ruling.

Commenting on the EDPB’s adoption of final recommendations, chair Andrea Jelinek said: “The impact of Schrems II cannot be underestimated: Already international data flows are subject to much closer scrutiny from the supervisory authorities who are conducting investigations at their respective levels. The goal of the EDPB Recommendations is to guide exporters in lawfully transferring personal data to third countries while guaranteeing that the data transferred is afforded a level of protection essentially equivalent to that guaranteed within the European Economic Area.

“By clarifying some doubts expressed by stakeholders, and in particular the importance of examining the practices of public authorities in third countries, we want to make it easier for data exporters to know how to assess their transfers to third countries and to identify and implement effective supplementary measures where they are needed. The EDPB will continue considering the effects of the Schrems II ruling and the comments received from stakeholders in its future guidance.”

The EDPB put out earlier guidance on Schrems II compliance last year.

It said the main modifications between that earlier advice and its final recommendations include: “The emphasis on the importance of examining the practices of third country public authorities in the exporters’ legal assessment to determine whether the legislation and/or practices of the third country impinge — in practice — on the effectiveness of the Art. 46 GDPR transfer tool; the possibility that the exporter considers in its assessment the practical experience of the importer, among other elements and with certain caveats; and the clarification that the legislation of the third country of destination allowing its authorities to access the data transferred, even without the importer’s intervention, may also impinge on the effectiveness of the transfer tool”.

Commenting on the EDPB’s recommendations in a statement, law firm Linklaters dubbed the guidance “strict” — warning over the looming impact on businesses.

“There is little evidence of a pragmatic approach to these transfers and the EDPB seems entirely content if the conclusion is that the data must remain in the EU,” said Peter Church, a Counsel at the global law firm. “For example, before transferring personal data to third country (without adequate data protection laws) businesses must consider not only its law but how its law enforcement and national security agencies operate in practice. Given these activities are typically secretive and opaque, this type of analysis is likely to cost tens of thousands of euros and take time. It appears this analysis is needed even for relatively innocuous transfers.”

“It is not clear how SMEs can be expected to comply with these requirements,” he added. “Given we now operate in a globalised society the EDPB, like King Canute, should consider the practical limitations on its power. The guidance will not turn back the tides of data washing back and forth across the world, but many businesses will really struggle to comply with these new requirements.”


Ban biometric surveillance in public to safeguard rights, urge EU bodies

There have been further calls from EU institutions to outlaw biometric surveillance in public.

In a joint opinion published today, the European Data Protection Board (EDPB) and the European Data Protection Supervisor (EDPS), Wojciech Wiewiórowski, have called for draft EU regulations on the use of artificial intelligence technologies to go further than the Commission’s proposal in April — urging that the planned legislation should be beefed up to include a “general ban on any use of AI for automated recognition of human features in publicly accessible spaces, such as recognition of faces, gait, fingerprints, DNA, voice, keystrokes and other biometric or behavioural signals, in any context”.

Such technologies are simply too harmful to EU citizens’ fundamental rights and freedoms — like privacy and equal treatment under the law — to permit their use, is the argument.

The EDPB is responsible for ensuring harmonized application of the EU’s privacy rules, while the EDPS oversees EU institutions’ own compliance with data protection law and also provides legislative guidance to the Commission.

EU lawmakers’ draft proposal on regulating applications of AI contained restrictions on law enforcement’s use of biometric surveillance in public places — but with very wide-ranging exemptions which quickly attracted major criticism from digital rights and civil society groups, as well as a number of MEPs.

The EDPS himself also quickly urged a rethink. Now he’s gone further, with the EDPB joining in with the criticism.

The EDPB and the EDPS have jointly fleshed out a number of concerns with the EU’s AI proposal — while welcoming the overall “risk-based approach” taken by EU lawmakers — saying, for example, that legislators must be careful to ensure alignment with the bloc’s existing data protection framework to avoid rights risks.

“The EDPB and the EDPS strongly welcome the aim of addressing the use of AI systems within the European Union, including the use of AI systems by EU institutions, bodies or agencies. At the same time, the EDPB and EDPS are concerned by the exclusion of international law enforcement cooperation from the scope of the Proposal,” they write.

“The EDPB and EDPS also stress the need to explicitly clarify that existing EU data protection legislation (GDPR, the EUDPR and the LED) applies to any processing of personal data falling under the scope of the draft AI Regulation.”

As well as calling for the use of biometric surveillance to be banned in public, the pair have urged a total ban on AI systems using biometrics to categorize individuals into “clusters based on ethnicity, gender, political or sexual orientation, or other grounds on which discrimination is prohibited under Article 21 of the Charter of Fundamental Rights”.

That’s an interesting concern in light of Google’s push, in the adtech realm, to replace behavioral micromarketing of individuals with ads that address cohorts (or groups) of users, based on their interests — with such clusters of web users set to be defined by Google’s AI algorithms.

(It’s interesting to speculate, therefore, whether FLoC creates a legal discrimination risk — based on how individual web users are grouped together for ad targeting purposes. Certainly, concerns have been raised over the potential for FLoC to scale bias and predatory advertising. And it’s also notable that Google avoided running early tests in Europe, likely owing to the EU’s data protection regime.)

In another recommendation today, the EDPB and the EDPS also express a view that the use of AI to infer emotions of a natural person is “highly undesirable and should be prohibited” — except for what they describe as “very specified cases, such as some health purposes, where the patient emotion recognition is important”.

“The use of AI for any type of social scoring should be prohibited,” they go on — touching on one use-case that the Commission’s draft proposal does suggest should be entirely prohibited, with EU lawmakers evidently keen to avoid any China-style social credit system taking hold in the region.

However, by failing to include a prohibition on biometric surveillance in public in the proposed regulation, the Commission is arguably risking just such a system being developed on the sly — i.e. by not banning private actors from deploying technology that could be used to track and profile people’s behavior remotely and en masse.

Commenting in a statement, the EDPB’s chair Andrea Jelinek and the EDPS Wiewiórowski argue as much, writing [emphasis ours]:

“Deploying remote biometric identification in publicly accessible spaces means the end of anonymity in those places. Applications such as live facial recognition interfere with fundamental rights and freedoms to such an extent that they may call into question the essence of these rights and freedoms. This calls for an immediate application of the precautionary approach. A general ban on the use of facial recognition in publicly accessible areas is the necessary starting point if we want to preserve our freedoms and create a human-centric legal framework for AI. The proposed regulation should also prohibit any type of use of AI for social scoring, as it is against the EU fundamental values and can lead to discrimination.”

In their joint opinion they also express concerns about the Commission’s proposed enforcement structure for the AI regulation, arguing that data protection authorities (within Member States) should be designated as national supervisory authorities (“pursuant to Article 59 of the [AI] Proposal”) — pointing out the EU DPAs are already enforcing the GDPR (General Data Protection Regulation) and the LED (Law Enforcement Directive) on AI systems involving personal data; and arguing it would therefore be “a more harmonized regulatory approach, and contribute to the consistent interpretation of data processing provisions across the EU” if they were given competence for supervising the AI Regulation too.

They are also not happy with the Commission’s plan to give itself a predominant role in the planned European Artificial Intelligence Board (EAIB) — arguing that this “would conflict with the need for an AI European body independent from any political influence”. To ensure the Board’s independence the proposal should give it more autonomy and “ensure it can act on its own initiative”, they add.

The Commission has been contacted for comment.

The AI Regulation is one of a number of digital proposals unveiled by EU lawmakers in recent months. Negotiations between the different EU institutions — and lobbying from industry and civil society — continue as the bloc works toward adopting new digital rules.

In another recent and related development, the UK’s information commissioner warned last week over the threat posed by big data surveillance systems that are able to make use of technologies like live facial recognition — although she claimed it’s not her place to endorse or ban a technology.

But her opinion makes it clear that many applications of biometric surveillance may be incompatible with the UK’s privacy and data protection framework.

Europe puts out advice on fixing international data transfers that’s cold comfort for Facebook

Following the landmark CJEU ‘Schrems II’ ruling in July, which invalidated the four-year-old EU-US Privacy Shield, European data protection regulators have today published 38-pages of guidance for businesses stuck trying to navigate the uncertainty around how to (legally) transfer personal data out of the European Union.

The European Data Protection Board’s (EDPB) recommendations focus on measures data controllers might be able to put in place to supplement the use of another transfer mechanism — so-called Standard Contractual Clauses (SCCs) — to ensure they are complying with the bloc’s General Data Protection Regulation (GDPR).

Unlike Privacy Shield, SCCs were not struck down by the court but their use remains clouded with legal uncertainty. The court made it clear SCCs can only be relied upon for international transfers if the safety of EU citizens’ data can be guaranteed. It also said EU regulators have a duty to intervene when they suspect data is flowing to a location where it will not be safe — meaning options for data transfers out of the EU have both reduced in number and increased in complexity.

One company that’s said it’s waiting for the EDPB guidance is Facebook. It’s already faced a preliminary order to stop transferring EU users’ data to the US. It petitioned the Irish courts to obtain a stay as it seeks a judicial review of its data protection regulator’s process. It has also brought out its lobbying big guns — former UK deputy PM and ex-MEP Nick Clegg — to try to pressure EU lawmakers over the issue.

Most likely the tech giant is hoping for a ‘Privacy Shield 2.0’ to be cobbled together and slapped into place to paper over the gap between EU fundamental rights and US surveillance law.

But the Commission has warned there won’t be a quick fix this time.

Changes to US surveillance law look to be necessary — which means zero chance of anything happening before the Biden administration takes the reins next year. So the legal uncertainty around EU-US transfers is set to stretch well into next year at a minimum. (Politico suggests a new data deal isn’t likely in the first half of 2021.)

In the meanwhile, legal challenges to ongoing EU-US transfers are stacking up — at the same time as EU regulators know they have a legal duty to intervene when data is at risk.

“Standard contractual clauses and other transfer tools mentioned under Article 46 GDPR do not operate in a vacuum,” the EDPB warns in an executive summary. “The Court states that controllers or processors, acting as exporters, are responsible for verifying, on a case-by-case basis and, where appropriate, in collaboration with the importer in the third country, if the law or practice of the third country impinges on the effectiveness of the appropriate safeguards contained in the Article 46 GDPR transfer tools.

“In those cases, the Court still leaves open the possibility for exporters to implement supplementary measures that fill these gaps in the protection and bring it up to the level required by EU law. The Court does not specify which measures these could be. However, the Court underlines that exporters will need to identify them on a case-by-case basis. This is in line with the principle of accountability of Article 5.2 GDPR, which requires controllers to be responsible for, and be able to demonstrate compliance with the GDPR principles relating to processing of personal data.”

The EDPB’s recommendations set out a series of steps for data exporters to take as they go through the complex task of determining whether their particular transfer can play nice with EU data protection law.

Six steps but no one-size-fits-all fix

The basic overview of the process it’s advising is as follows:

Step 1: map all intended international transfers.

Step 2: verify the transfer tools you want to use.

Step 3: assess whether there’s anything in the law/practice of the destination third country which “may impinge on the effectiveness of the appropriate safeguards of the transfer tools you are relying on, in the context of your specific transfer”, as it puts it.

Step 4: identify and adopt supplementary measure(s) to bring the level of protection up to ‘essential equivalence’ with EU law.

Step 5: take any formal procedural steps required to adopt the supplementary measure(s).

Step 6: periodically re-evaluate the level of data protection and monitor any relevant developments.
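Purely as an illustration — not legal advice — the decision flow the EDPB’s steps describe can be sketched in code; all function and parameter names here are assumptions:

```python
# Illustrative sketch of the EDPB's six-step transfer assessment as a
# simple decision flow. Not legal advice; names are assumptions.

def assess_transfer(third_country_law_ok: bool,
                    supplementary_measure_found: bool) -> str:
    """Walk the EDPB's recommended steps for a single mapped transfer."""
    # Steps 1 and 2 (mapping the transfer, verifying the transfer tool,
    # e.g. SCCs) are assumed done before calling this helper.
    # Step 3: assess destination-country law and practice.
    if third_country_law_ok:
        return "transfer may proceed (re-evaluate periodically)"  # step 6
    # Step 4: look for supplementary measures to close the gap.
    if supplementary_measure_found:
        # Step 5: take any formal procedural steps to adopt the measure,
        # then step 6: monitor and periodically re-evaluate.
        return "transfer may proceed with supplementary measures"
    # Per the EDPB: if no suitable measure exists, the flow must stop.
    return "avoid, suspend or terminate the transfer"
```

The key point is the final branch: if step 4 turns up no suitable measure, the only compliant outcome is halting the flow.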

In short, this is going to involve both a lot of work — and ongoing work. tl;dr: Your duty to watch over the safety of European users’ data is never done.

Moreover, the EDPB makes it clear that there very well may not be any supplementary measures to cover a particular transfer in legal glory.

“You may ultimately find that no supplementary measure can ensure an essentially equivalent level of protection for your specific transfer,” it warns. “In those cases where no supplementary measure is suitable, you must avoid, suspend or terminate the transfer to avoid compromising the level of protection of the personal data. You should also conduct this assessment of supplementary measures with due diligence and document it.”

In instances where supplementary measures could suffice the EDPB says they may have “a contractual, technical or organisational nature” — or, indeed, a combination of some or all of those.

“Combining diverse measures in a way that they support and build on each other may enhance the level of protection and may therefore contribute to reaching EU standards,” it suggests.

However it also goes on to state fairly plainly that technical measures are likely to be the most robust tool against the threat posed by foreign surveillance. But that in turn means there are necessarily limits on the business models that can tap in — anyone wanting to decrypt and process data for themselves in the US, for instance, (hi Facebook!) isn’t going to find much comfort here.

The guidance goes on to include some sample scenarios where it suggests supplementary measures might suffice to render an international transfer legal.

Examples include data storage in a third country where there’s no access to decrypted data at the destination and keys are held by the data exporter (or by a trusted entity in the EEA or in a third country that’s considered to have an adequate level of protection for data); the transfer of pseudonymised data — so individuals can no longer be identified (which means ensuring the data cannot be re-identified); or end-to-end encrypted data transiting third countries via encrypted transfer (again, the data must not be able to be decrypted in a jurisdiction that lacks adequate protection; the EDPB also specifies that the existence of any ‘backdoors’ in hardware or software must have been ruled out, although it’s not clear how that could be done).
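As a purely illustrative sketch of the pseudonymisation scenario — and not a claim that this alone meets the EDPB’s bar, since indirect identifiers would also need stripping — one common building block is a keyed hash whose secret never leaves the exporter. All names here are assumptions:

```python
# Minimal sketch of pseudonymisation before export: replace a direct
# identifier with a keyed hash, with the key held only by the EU-based
# exporter. Illustrative only; on its own this does not guarantee the
# EDPB's 'no re-identification' standard (indirect identifiers remain).
import hmac
import hashlib

SECRET_KEY = b"exporter-held-key"  # assumption: stored only inside the EEA

def pseudonymise(user_id: str) -> str:
    """Map an identifier to a stable pseudonym.

    Without SECRET_KEY, the importer (or a third-country authority
    compelling it) cannot reverse the mapping to the person.
    """
    return hmac.new(SECRET_KEY, user_id.encode(), hashlib.sha256).hexdigest()

record = {"user": "alice@example.com", "event": "page_view"}
exported = {**record, "user": pseudonymise(record["user"])}  # safe(r) to export
```

The same pseudonym is produced for the same input, so the exporter can still join records on its side of the border, while the importer only ever sees opaque tokens.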

Another section of the document discusses scenarios in which no effective supplementary measures could be found — such as transfers to cloud service providers (or similar) which require access to the data in the clear and where “the power granted to public authorities of the recipient country to access the transferred data goes beyond what is necessary and proportionate in a democratic society”.

Again, this is a bit of the document that looks very bad for Facebook.

“The EDPB is, considering the current state of the art, incapable of envisioning an effective technical measure to prevent that access from infringing on data subject rights,” it writes on that, adding that it “does not rule out that further technological development may offer measures that achieve the intended business purposes, without requiring access in the clear”.

“In the given scenarios, where unencrypted personal data is technically necessary for the provision of the service by the processor, transport encryption and data-at-rest encryption even taken together, do not constitute a supplementary measure that ensures an essentially equivalent level of protection if the data importer is in possession of the cryptographic keys,” the EDPB further notes.

It also makes it clear that supplementary contractual clauses aren’t any kind of get-out on this front — so, no, Facebook can’t stick a clause in its SCCs that defuses FISA 702 — with the EDPB writing: “Contractual measures will not be able to rule out the application of the legislation of a third country which does not meet the EDPB European Essential Guarantees standard in those cases in which the legislation obliges importers to comply with the orders to disclose data they receive from public authorities.”

The EDPB does discuss examples of potential clauses data exporters could use to supplement SCCs, depending on the specifics of their data flow situation — alongside specifying “conditions for effectiveness” (or ineffectiveness in many cases, really). And, again, there’s cold comfort here for those wanting to process personal data in the US (or another third country) while it remains at risk from state surveillance.

“The exporter could add annexes to the contract with information that the importer would provide, based on its best efforts, on the access to data by public authorities, including in the field of intelligence provided the legislation complies with the EDPB European Essential Guarantees, in the destination country. This might help the data exporter to meet its obligation to document its assessment of the level of protection in the third country,” the EDPB suggests in one example from a section of the guidance discussing transparency obligations.

However the point of such a clause would be for the data exporter to put up-front conditions on an importer to make it easier for them to avoid getting into a risky contract in the first place — or help them with suspending/terminating a contract if a risk is determined — rather than providing any kind of legal sticking plaster for mass surveillance. Aka: “This obligation can however neither justify the importer’s disclosure of personal data nor give rise to the expectation that there will be no further access requests,” as the EDPB warns.

Another example discussed in the document is the viability of adding clauses to try to get the importer to certify there are no backdoors in their systems which could put the data at risk.

However the EDPB warns this may just be useless, writing: “The existence of legislation or government policies preventing importers from disclosing this information may render this clause ineffective.” So the example may just have been included to try to kneecap dodgy legal advice that suggests contract clauses are a panacea for US surveillance overreach.

The EDPB’s full guidance can be found here.

We’ve also reached out to Facebook to ask what next steps it’ll be taking over its EU-US data transfers in light of the EDPB guidance and will update this report with any response.

No quick fix for transatlantic data transfers, says EC

Europe’s justice commissioner has conceded there will be “no quick fix” for EU-US data transfers in the wake of the decision by the region’s top court in July that struck down a flagship data transfer agreement which was being used by thousands of businesses.

Despite the ‘Schrems II’ judgement being the second such CJEU strike in around five years, commissioners from the EU’s executive body and US counterparts in the US Department of Commerce announced last month that they had begun discussions on a potential replacement for the now defunct EU-US Privacy Shield.

Justice commissioner, Didier Reynders, said today that talks on an ‘enhanced framework’ are continuing but he admitted there’s no fast track fix for the schism between Europeans’ fundamental rights and US surveillance law.

“There is a common willingness to fully comply with the judgement of the court — on both sides we want to find ways in which to address the issues raised by the court,” said Reynders. “We will intensify our engagement with the US in the coming weeks but we also have to recognize that the judgement raises complex issues related to the sensitive area of national security. Therefore there will be no quick fix.”

He went on to suggest that changes to US law may be needed for any Privacy Shield 2 to be possible — giving the example of the lack of a redress mechanism for EU citizens as an area where legislation may be needed — before emphasizing that any such legislative change would clearly take time (he noted, for example, that the US election is looming — which bakes natural delay into any such timeline).

“We are working with the US counterparts to evaluate the possibility of a strengthened framework — and of course it’s possible to build on existing elements but of course it’s maybe also a necessity to have legislative changes,” he said. “That’s the real question that we have with the US authorities. And that will of course have an impact on the time needed to put in place a new framework.

“It’s a real political debate; it’s not just a technical issue. And if we look at the domestic developments and debates in the US around privacy at the state and federal level but also limitation for intelligence service program there are probably more common grounds to find viable solutions than when the Privacy Shield was negotiated. You have also seen that the reaction of US authorities were constructive; they want to explore where to address the issues raised by the judgement but again sometimes, on the base of actual elements, there is maybe some legislative changes [required].”

“What we need are sustainable solutions that deliver legal certainty in full compliance with the judgement of the court,” he added. “That is also the message I have clearly passed to my EU counterparts and on which I will keep insisting.”

Reynders was speaking to the EU Parliament’s civil liberties (LIBE) committee, which was holding a hearing into the implications of the Court of Justice of the EU (CJEU) invalidating the EU-US Privacy Shield — aka the Schrems II ruling.

The chair of the European Data Protection Board (EDPB), Andrea Jelinek, had also been invited to speak, alongside Max Schrems himself, the European privacy campaigner who now has two successful strikes against EU-US data transfer mechanisms — after the CJEU invalidated Safe Harbor in 2015 and the EU-US Privacy Shield this July following his complaints. 

The discussion delved into the implications of the CJEU ruling for an alternative data transfer mechanism called Standard Contractual Clauses (SCCs) which were not invalidated by the court, even as their use for US data transfers is now larded with legal risk as a result of US surveillance overreach.

Reynders told the committee the Commission is continuing its work on modernizing SCCs to bring them into line with the EU’s General Data Protection Regulation (GDPR) framework — saying it will produce a draft version this month and is aiming to complete the process before the end of the year.

“Now that the judgement has been assured we will of course preserve the elements of the existing SCCs that have led to the court to find them valid. At the same time we will try to reflect and operationalize in all texts the additional clarification provided by the court on the conditions under which SCCs can be used — taking also fully into account the guidance issued by the EDPB that it should help companies in their compliance effort,” he added. “But of course we need to see what kind of more longer term evolution in the US [law there might be].”

Reynders said the same issues around data transfers will arise with the UK, post-Brexit — as it seeks an adequacy agreement and the Commission will have to assess its domestic laws, including infamously draconian surveillance laws — and with other third countries like China where there’s no adequacy agreement in place (nor any prospect of a finding of privacy protections that are essentially equivalent to those in the EU).

“We want to stay open to those that apply the rules,” he added.

Jelinek said the EDPB has just set up a taskforce to work on around 100 strategic complaints filed last month by Schrems’ digital rights group, noyb, that target EU-based entities across the region which are using SCCs for data transfers for Google Analytics and/or Facebook Connect integrations.

noyb argues there’s no legal basis for those transfers and that DPAs should step in and suspend them.

“We are going to work not only close together but closer together than we’ve ever done [with EU data protection authorities] to solve this issue,” said Jelinek. “We will analyze the matter and ensure that we will go together in the same direction.”

Enforcement of EU data protection law is both a duty for supervisory authorities and “a matter of credibility”, she added. “You can be sure we are investigating all together within the taskforce but again I have to tell you that enforcement… is a matter of the national supervisory authorities. Each and every supervisory authority has to enforce in their own country those complaints which are ruled with them.”

The prospect of any enforcement of Schrems’ original SCC complaint to the Irish DPC — filed some seven years ago at this point — is still a distant one, according to what he told the committee.

“Enforcement is going to be a matter of credibility,” he said. “So far the understanding is that there will be no enforcement — or no serious enforcement — that’s also the reason we have filed a couple of complaints already to make sure that there’s some movement. And I think there needs to be some kind of highlight cases where the industry feels there’s a feeling where they actually have to comply with all of this.

“I also want to throw in real short that we got a letter this week that I cannot disclose yet from the Irish data protection regulator informing us that, de facto, they will probably not pursue this case that is ongoing for seven years for the next, I would assume one or two years… We’re very sorry to see that the regulator in Ireland, despite being under a court order that they have to enforce this judgement is apparently choosing not to do so.”

We reached out to the Irish DPC for a response to Schrems’ remarks and it told us he is “wrong” in that supposition but at the time of writing the regulator had not provided any further comment. We’ll update this report if we get more.

Schrems was withering in his view of the Irish DPC’s record, telling the committee that its handling of his complaint was not a pro-privacy case but a “pro-delay case”.

“We have already said at the beginning that this case could have been done by the DPC itself. And we now get back to exactly the problems we outlined five years ago — that the DPC is now working on again,” he said.

“The bottom line is probably there’s not going to be a decision within the next two or three years — if they continue like that. Which means the original complaint I filed after Snowden will probably take up to ten years to get a first instance decision. Then we’ll have three layers of appeals in Ireland. So I’m probably going to be retired once this case is actually finally decided! I’m going to be grey and old and that’s not how fundamental rights in Europe should work — and I think we really have to work on that.”