Europe says yes to messaging interoperability as it agrees major new regime for big tech

Late Thursday the European Union secured agreement on the detail of a major competition reform that will see the most powerful, intermediating tech platforms subject to a set of up-front rules on how they can and cannot operate — with the threat of fines of up to 10% of global annual turnover should they breach requirements (or even 20% for repeat violations).

In three-way discussions between the European Council, parliament and Commission, which ran for around eight hours today, it was finally agreed that the Digital Markets Act (DMA) will apply to large companies providing “core platform services” — such as social networks or search engines — which have a market capitalisation of at least €75 billion or an annual turnover of €7.5 billion.

To be designated a so-called “gatekeeper”, and thus fall in scope of the DMA, companies must also have at least 45 million monthly end users in the EU and 10,000+ annual business users.

This puts US tech giants, including Apple, Google and Meta (Facebook), clearly in scope, while some less gigantic but still large homegrown European tech platforms — such as the music streaming platform Spotify — look set to avoid being subject to the regime as it stands. (Although other European platforms may already have, or gain, the scale to fall in scope.)

SMEs are generally excluded from being designated gatekeepers as the DMA is intended to take targeted aim at big tech.

The regulation has been years in the making — and is set to usher in a radically different ex ante regime for the most powerful tech platforms, in contrast to the after-the-fact antitrust enforcement certain giants have largely been able to shrug off to date, with no discernible impact on market share.

Frustration with flagship EU competition investigations and enforcements against tech giants like Google — and widespread concern over the need to reboot tipped digital markets and restore the possibility of vibrant competition — have been core driving forces for the bloc’s lawmakers.

Commenting in a statement Andreas Schwab, the European Parliament’s Rapporteur for the file, said: “The agreement ushers in a new era of tech regulation worldwide. The Digital Markets Act puts an end to the ever-increasing dominance of Big Tech companies. From now on, they must show that they also allow for fair competition on the internet. The new rules will help enforce that basic principle. Europe is thus ensuring more competition, more innovation and more choice for users.”

In another supporting statement, Cédric O, French minister of state with responsibility for digital, added: “The European Union has had to impose record fines over the past 10 years for certain harmful business practices by very large digital players. The DMA will directly ban these practices and create a fairer and more competitive economic space for new players and European businesses. These rules are key to stimulating and unlocking digital markets, enhancing consumer choice, enabling better value sharing in the digital economy and boosting innovation. The European Union is the first to take such decisive action in this regard and I hope that others will join us soon.”

Key requirements agreed by the EU’s co-legislators include interoperability for messaging platforms, meaning smaller platforms will be able to request that dominant gatekeeper services open up and enable their users to exchange messages, send files or make video calls across messaging apps — expanding choice and countering the typical social platform network effects that create innovation-chilling service lock-in.

That could be hugely significant in empowering consumers who object to the policies of a giant like Meta, which owns Facebook Messenger and WhatsApp, but feel unable to switch to a rival because their social graph is held by the gatekeeper. With interoperability, they could actually leave without having to give up the ability to message their friends.

There had been some debate over whether messaging interoperability would survive the trilogues. It has — although group messaging interoperability is set to be phased in over a longer period than one-to-one messaging.

Speaking to TechCrunch ahead of today’s fourth and final trilogue, Schwab emphasized the importance of the messaging interoperability provisions.

“The Parliament has always been clear that interoperability for messaging has to come,” he told us. “It will come — at the same time, it also has to be secure. If the Telecoms Regulators say it is not possible to deliver end-to-end encrypted group chats within the next nine months, then it will come as soon as it is possible, there will be no doubt about that.”

Per Schwab, messenger services that are subject to the interoperability requirement will have to open up their APIs for competitors to provide interoperable messaging for basic features — with the requirement intentionally asymmetrical, meaning that smaller messaging services which are not in the scope of the DMA will not be required to open up to gatekeepers but can themselves connect into Big Tech.

“The first basic messaging features will be user-to-user messages, video and voice calls, as well as basic file transfer (photos, videos), and then over time, more features such as group chats will come,” noted Schwab, adding: “Everything must be end-to-end encrypted.”
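Schwab’s phasing can be loosely sketched in code. This is purely illustrative — the DMA prescribes obligations, not an API, and every name below is invented — but it captures the shape of an interop bridge that accepts only phase-one features and relays end-to-end encrypted payloads opaquely:

```python
# Illustrative sketch only: the DMA sets obligations, not a concrete API.
# All names here are hypothetical.
from dataclasses import dataclass
from enum import Enum

class Feature(Enum):
    TEXT_MESSAGE = "user-to-user message"   # phase 1
    VOICE_CALL = "voice call"               # phase 1
    VIDEO_CALL = "video call"               # phase 1
    FILE_TRANSFER = "basic file transfer"   # phase 1 (photos, videos)
    GROUP_CHAT = "group chat"               # phased in later

PHASE_ONE = {Feature.TEXT_MESSAGE, Feature.VOICE_CALL,
             Feature.VIDEO_CALL, Feature.FILE_TRANSFER}

@dataclass
class InteropMessage:
    sender: str        # user ID on the smaller, non-gatekeeper service
    recipient: str     # user ID on the gatekeeper service
    feature: Feature
    ciphertext: bytes  # encrypted end-to-end on the client; the bridge
                       # relays it opaquely and never sees plaintext

def accept(msg: InteropMessage) -> bool:
    """A gatekeeper bridge would initially accept only phase-one features."""
    return msg.feature in PHASE_ONE
```

The asymmetry Schwab describes lives outside this sketch: only the gatekeeper is obliged to expose such an endpoint; the smaller service merely chooses whether to connect.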

Interoperability for social media services has been put on ice for now — with the EU co-legislators agreeing that such provisions will be assessed in the future.

In another important decision which could have major ramifications for dominant digital business models, the parliament managed to keep an amendment to an earlier version of the proposal — which means that explicit consent from users will be required for a gatekeeper to combine personal data for targeted advertising.

“Data combination and cross use will only be possible with explicit consent,” said Schwab. “This is especially true for the purpose of advertising and also applies to combination with third party data (e.g. Facebook with third parties). This means more control for users whether they want to be tracked across devices/services, even outside of the networks of Big Tech (hence the third party data), and whether they want to receive tracking ads.”

“Lastly, to avoid consent fatigue, Parliament will limit how many times Gatekeepers can ask again for consent if you refused it or withdrawn consent to these practices: Once per year. This has been very important to me — otherwise, consent would be meaningless if gatekeeper can simply spam users until they give in,” he added.
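The once-per-year re-prompt limit Schwab describes is simple enough to sketch. The rule comes from the agreed text; the code and names below are hypothetical:

```python
# Hypothetical sketch of the once-per-year re-prompt limit: a gatekeeper
# that was refused (or had consent withdrawn) must wait a year before
# asking the same user for consent again.
from datetime import datetime, timedelta, timezone

REPROMPT_INTERVAL = timedelta(days=365)

def may_reprompt(last_refusal: datetime, now: datetime) -> bool:
    return now - last_refusal >= REPROMPT_INTERVAL

refused = datetime(2023, 1, 1, tzinfo=timezone.utc)
print(may_reprompt(refused, datetime(2023, 6, 1, tzinfo=timezone.utc)))  # False
print(may_reprompt(refused, datetime(2024, 2, 1, tzinfo=timezone.utc)))  # True
```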

Another parliament-backed requirement which survived the trilogue negotiations is a stipulation that users should be able to freely choose their browser, virtual assistants or search engines when such a service is operated by a gatekeeper — meaning choice screens, not pre-selected defaults, will be the new norm in those areas for in scope platforms.

Email — another often bundled choice, and one which European end-to-end encrypted email service ProtonMail had argued should also get a choice screen — does not appear to have been included, however, with lawmakers narrowing this down to “the most important software”, as the Council put it.

Other obligations on gatekeepers in the agreed text include requirements to:

  • ensure that users have the right to unsubscribe from core platform services under similar conditions to subscription
  • allow app developers fair access to the supplementary functionalities of smartphones (e.g. NFC chip)
  • give sellers access to their marketing or advertising performance data on the platform
  • inform the European Commission of their acquisitions and mergers

And among the restrictions are stipulations that gatekeepers cannot:

  • rank their own products or services higher than those of others (aka a ban on self-preferencing)
  • reuse private data collected during a service for the purposes of another service
  • establish unfair conditions for business users
  • pre-install certain software applications
  • require app developers to use certain services (e.g. payment systems or identity providers) in order to be listed in app stores

The Commission will be solely responsible for enforcing the DMA — and it will have some leeway over whether to immediately crack down on duty-breaching tech giants, with the text allowing the possibility of engaging in regulatory dialogue to ensure gatekeepers have a clear understanding of the rules (i.e. rather than reaching straight for a chunky penalty).

Today’s agreement on a provisional text of the DMA marks almost the last milestone on a multi-year journey towards the DMA proposal becoming law. But there are still a few hoops for European lawmakers to jump through.

It’s still pending approval of the finalized legal text by the Parliament and Council (but getting consensus agreement in the first place is typically the far harder ask). Then, after that final vote, the text will be published in the EU’s Official Journal and the regulation will come into force 20 days later — with its rules starting to apply six months after that.

EU commissioners will be holding a series of — doubtless very jubilant — briefings tomorrow to flesh out the finer detail of what’s been agreed so stay tuned for more analysis…


Grindr’s $7M GDPR fine is a stark warning to adtech not to track

Grindr, a hook-up app for gay, bi, trans and queer people, has been fined around $7.1 million (65M NOK) by Norway’s data protection authority for passing user data to advertisers without consent — including highly sensitive information related to users’ sexual orientation.

Specifically, the DPA found that Grindr breached Articles 6(1) and 9(1) of Europe’s General Data Protection Regulation (GDPR).

The complaint adds to the behavioral advertising industry’s legal woes — which continue to pile up in the region.

The final size of the penalty Grindr has been hit with is somewhat reduced vs the 100M NOK (~$12.1M) the gay dating app was facing back in January — when the Datatilsynet issued a preliminary decision on the case.

The authority told TechCrunch the smaller sanction takes account of the company having lower turnover in reality than the “rough estimate” it had relied upon in January when issuing the preliminary fine.

It also said the reduction takes account of measures Grindr implemented since the complaint was filed with the aim of bringing its processing of personal data in line with GDPR’s requirements.

The DPA’s decision notes that the final fine is approximately 32% of the maximum amount possible. And since GDPR allows for fines of up to €20M or up to 4% of an entity’s total global turnover in the preceding year, whichever is higher, it suggests the US-based app’s annual revenue does not exceed €20M/$22.5M.
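That inference can be checked in a few lines of back-of-the-envelope arithmetic (all figures are approximate, including the exchange rate):

```python
# Rough check of the DPA's numbers; the exchange rate is an approximate
# contemporary NOK/EUR value, not an official figure.
NOK_FINE = 65_000_000
NOK_PER_EUR = 10.0
GDPR_FLOOR_EUR = 20_000_000   # Art. 83(5): €20M or 4% of global annual
                              # turnover, whichever is higher
FINE_SHARE_OF_MAX = 0.32      # per the DPA: fine is ~32% of the maximum

fine_eur = NOK_FINE / NOK_PER_EUR                # ≈ €6.5M
implied_max_eur = fine_eur / FINE_SHARE_OF_MAX   # ≈ €20.3M

# The implied maximum lands almost exactly on the €20M floor, i.e. the
# applicable cap was the flat €20M figure rather than a larger
# 4%-of-turnover amount.
print(round(implied_max_eur / 1e6, 1))  # ≈ 20.3
```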

The DPA describes the size of the fine as “proportionate both to the severity of the infringement and to Grindr’s financial situation”, asserting that it “does not exceed what is necessary to achieve the objectives pursued by the GDPR in the present case”.

The complaint has taken almost a year to arrive at a final decision owing — at least in part — to Grindr requesting extensions to deadlines on a number of occasions.

It’s also worth noting that this investigation was limited to the process Grindr used to obtain consent at the time of the complaint — in 2019 and up to April 2020 (when it switched to a different method).

So the lawfulness of Grindr’s current method for obtaining consent has not been investigated.

While the decision does not include any requirements that Grindr (or its ad partners) delete unlawfully obtained user data, the DPA told us that that could change in future.

It also confirmed that its investigation of Grindr’s ad partners (which the app sent user data to) is ongoing.

“Our decision does not include any erasure requirements at this time but we have also made it clear that further decisions may come at a later date if we deem it necessary,” said Tobias Judin, director for international issues at Datatilsynet. “In other words: We are not ruling out any possibilities for further enforcement at this stage.”

“Now that we have a final decision in the Grindr case, this decision will also inform those investigations,” he further confirmed of the ad partner probes. 

The penalty for Grindr tracking users without consent comes at a time when some EU lawmakers continue pressing for a ban on surveillance-based advertising — although a committee vote in the European Parliament this week did not back amending the Digital Services Act to include an outright ban on surveillance-based advertising, as some MEPs have been pressing for.

The committee did back a prohibition on using dark patterns to manipulate consent, though. So legal requirements look set to continue to tighten around how adtech can operate in the EU, with manipulative consent defaults also in lawmakers’ sights.

See also: The UK data watchdog’s recent warning to the industry that the end of tracking is nigh.

In a statement welcoming Norway’s GDPR slap-down of Grindr, the deputy DG of the European Consumer Organisation, BEUC, Ursula Pachl, said: “Grindr illegally exploited and shared its users’ information for targeted advertising, including sensitive information about their sexual orientation. It is high time the behavioural advertising industry stops tracking and profiling consumers 24/7. It is a business model which clearly breaches the EU’s data protection rules and harms consumers. Let’s now hope this is the first domino to fall and that authorities start imposing fines on other companies as the infringements identified in this decision are standard surveillance ad-tech industry practices.”

Consent breaches

Datatilsynet opened the investigation into Grindr after receiving complaints from Norway’s Consumer Council (NCC) and the European privacy campaign group, noyb, acting on behalf of an individual complainant.

Last year the NCC published an analysis of data flows from a number of popular apps (Grindr among them) showing how they share data with “unexpected third parties”, including entities in the behavioral ad industry, to highlight the extent of adtech’s lawfulness problem.

In its response to the data protection watchdog’s investigation, Grindr had claimed it had users’ consent to share their data with its advertising partners — which included Twitter-owned MoPub, Xandr (previously AppNexus), OpenX, AdColony and Smaato.

However the app did not offer users a free choice over whether to agree to its terms or not. If a Grindr user declined to accept its privacy policy during onboarding they were unable to proceed to use the app.

And while Grindr went on to change how it gathers consent — implementing a consent management platform provided by the third party OneTrust in April 2020 — as noted above this complaint focuses on how the app was obtaining consent prior to that switch.

The GDPR states that for consent to be a valid legal basis to process personal data it must be informed, specific and freely given. So the lack of any choice offered to users looks like a flagrant breach of the rules.

In seeking to avoid a sanction, Grindr also sought to argue that it did not pass information on individual users’ sexuality to advertisers — claiming it only sent generic keywords (such as “gay”, “bi” and “bi-curious”).

This is important because GDPR has specific rules for so-called “special category data” — requiring an even higher bar of explicit consent from a user if that’s the legal basis you’re claiming for processing information such as a person’s sexual orientation.

In reaching its final decision on the complaint, the Datatilsynet concluded that protections contained in Article 9 of the GDPR (which concerns “special category data”) should not be so narrowly interpreted.

“Being a Grindr user strongly indicates, and appears in most cases to accurately reflect, that the data subject belongs to a sexual minority. Furthermore, the fact that a data subject belongs to a sexual minority may lead to prejudice and discrimination even without revealing their specific sexual orientation,” it writes, adding: “The wording of Article 9 does not require a revealing of a particular ‘sexual orientation’, and the purpose behind Article 9 discourages a narrow interpretation.

“For these reasons, we find that information that a data subject is a Grindr user is data ‘concerning’ the data subject’s ‘sexual orientation’.”

Grindr had also sought to suggest that advertisers were unlikely to use categories of special category data for profiling and ad targeting — telling the DPA it would be surprised if that were the case.

Which is — to put it mildly — a surprising argument to try to make, given ample evidence from other GDPR complaints of the highly invasive profiling being carried out by the behavioral ad industry.

Not to mention the fact that a flagship industry framework that’s widely used to claim consent to process people’s data for ad targeting is facing a GDPR breach finding itself. As is the online advertising body that controls it.

In any case, Datatilsynet rejected Grindr’s dodge — pointing out that it’s irrelevant how such sensitive data might be further processed, since — under GDPR — “the sharing of personal data concerning a natural person’s ‘sexual orientation’ to advertising partners is sufficient to trigger Article 9”. (Its decision also makes it explicit that it does “not agree with the claim that a data subject’s ‘sexual orientation’ is not a category of data that could potentially be used by advertisers to target ads”.)

In another attempt to wiggle out of a GDPR slap-down, Grindr had also sought to argue that even if its advertisers — theoretically — received any sensitive personal data, they must “blind” themselves to it, per commitments in its contracts with advertisers.

Moreover it claimed many adtech companies operating in the EU have spent the last decade or so devising so-called “blinding methods” which it said obfuscate which app an ad call is coming from.

“Grindr holds that participants in the ad tech ecosystem would likely only receive a ‘blinded’ app-ID and not the corresponding app name,” the DPA explains in the decision. “According to Grindr, it is a common practice in the EU for ad networks to nullify the app name and use a random App ID in the ad call so that downstream bidders are ‘blind’ to the actual name of the app where the ad is to be served.”
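For illustration, the “blinding” Grindr describes amounts to something like the following. The field names are invented for the sketch (real RTB bid requests, e.g. under OpenRTB, carry far more data), and the sample values are made up:

```python
# Illustrative sketch of the claimed "blinding" practice: the ad network
# nullifies the app name and substitutes a random app ID before the bid
# request goes to downstream bidders. Field names and values are invented.
import uuid

def blind_ad_call(ad_call: dict) -> dict:
    blinded = dict(ad_call)
    blinded["app_name"] = None             # "nullify the app name"
    blinded["app_id"] = str(uuid.uuid4())  # random, per-call app ID
    return blinded

ad_call = {
    "app_name": "Grindr",
    "app_id": "com.grindrapp.android",
    "device_ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d",  # sample value
    "ip": "203.0.113.7",
}
downstream = blind_ad_call(ad_call)
# Note what survives blinding: per-user identifiers (advertising ID, IP)
# still travel downstream, and the upstream partner that performed the
# blinding saw the real app name in the first place.
```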

However, once again, the DPA points out this is irrelevant — given sensitive data being passed is enough to trigger Article 9 provisions.

The Datatilsynet’s decision also cites a technical report, by Mnemonic, which showed Grindr’s app name being shared with MoPub — “who further shared this within their mediation network”. And further notes that Mnemonic’s report also showed the app name was shared from Grindr to “multiple other advertising partners”.

As if that wasn’t enough, Datatilsynet further points out that Grindr’s own privacy policy “explicitly states that ‘[o]ur advertising partners are aware that such data is being transmitted from Grindr’.”

So, er… 🥴

(NB: In a further demolition of the self-serving notion of ‘blinded’ app-IDs, the DPA goes on to make the point that even if this were happening as claimed by the adtech industry it still wouldn’t comply with other requirements in the GDPR, noting: “Even if some advertising partners or other participants in the ad tech ecosystem would ‘blind’ themselves or only receive an obfuscated app ID, this is not [in] line with the principle of accountability in Article 5(2) GDPR. Grindr would have to rely on the action of advertising partners or other participants in the ad tech ecosystem to halt its sharing of the data in question.”)

The DPA’s analysis goes further in unpicking adtech’s obfuscating claims vs what’s really being done with people’s data vs what EU law actually requires. (So it’s well worth reading in full if you’re interested in devilish detail.)

The long and short of it is that Datatilsynet found Grindr did process users’ sexual orientation data, as set out in Article 9(1) — by “sharing personal data on a specific user alongside app name or app ID to advertising partners”.

And while the GDPR can allow for consent-based processing of special category data, a higher bar of “explicit” consent is required for that type of processing to be lawful. Again, the DPA found that Grindr had not obtained the required legal standard of permission from users.

Its decision further concludes that Grindr users had not “manifestly made public” information about their sexual orientation simply by merit of using the app, as the app had sought to argue (noting, for example, that it allows for an anonymous approach, letting users select a nickname and choose whether or not to upload a selfie).

“At any rate, it goes beyond the reasonable expectations of the data subject that Grindr would disclose information concerning their sexual orientation to advertising partners. Though information about someone merely being a Grindr user must be considered a special category of personal data under Article 9(1), becoming a Grindr user is not an affirmative act by the data subject to make the information public,” Datatilsynet adds.

Grindr has been contacted for comment on the sanction.

It has three weeks to lodge an appeal against the decision — if it wishes to do so.

Datatilsynet’s order is careful to specify that there may be additional issues related to Grindr’s prior or current consent mechanisms, since this investigation was limited to the scope of the complaints, which focused on the lawfulness of the consent flow previously used in the app.

“The fact that potential issues have fallen outside the scope of our investigation does not preclude those issues from being investigated in the future,” its decision notes.

In a statement commenting on the decision, Ala Krinickytė, a data protection lawyer at noyb, described it as “astonishing that the DPA has to convince Grindr that its users are LGBT+ and that this fact is not a commodity to be bartered”.

Krinickytė further summarized the Datatilsynet order thusly: “You cannot share personal data with a potentially unlimited number of partners without being able to control what happens to that data.”

And that is really the crux of the problem for surveillance-based advertising which relies upon pervasive tracking of Internet users to individually target marketing.

Even setting aside the existential problem of a lack of consent for tracking, the adtech industry does not have processes in place to control what happens to data once it’s grabbed and ‘shared’ with scores of faceless adtech entities involved in the high velocity programmatic auction process known as real-time bidding (RTB).

GDPR complaints targeting RTB’s failure to adequately protect people’s data have been sitting, unenforced, on EU regulators’ desks for years — but there are signs that the enforcement blockage is starting to shift, not least as a result of smart, smaller-scale actions such as Norway going after Grindr.

The web of adtech data flows is such a tangled one that even a relative bit player can draw in and implicate scores of others.

The adtech industry’s workaround for people’s general distaste at being stalked and creeped on through their devices and digital activity, meanwhile, has been to not actually ask for permission to track and profile them in the first place.

But — in Europe at least — that mocking ‘consent’ pantomime is finally headed for its end-game.

Whether the alternative targeting processes the industry devises will be just as cynical, manipulative and exploitative as what they’ve been doing for the past decade+ will largely depend upon regulators and lawmakers driving proper oversight of a sector that’s been allowed to flourish in the dark, rife with dark patterns and defined by its dark arts.

One negative signal is how the IAB Europe continues to try to confuse the issue by conflating ad targeting with invasive tracking — in a bid to lobby MEPs not to outlaw surveillance-based adtech.

In reality, privacy-safe targeting alternatives already exist (such as contextual ads) and have been profitable for years for companies like DuckDuckGo.

The behavioral advertising industry’s lawfulness problem is in fact directly chainlinked to its mass surveillance of Internet users.

Commenting on the Datatilsynet’s decision against Grindr in a statement, Finn Myrstad, director of digital policy in the NCC, warned: “This sends a strong signal to all companies involved in commercial surveillance. There are serious repercussions to sharing personal data without a legal basis. We call for the digital advertising industry to make fundamental changes to respect consumers’ rights.”

NB: While Norway is not an EU member it is part of the European Economic Area, and it transposed the GDPR into national law in 2018. Additionally, Grindr being a US company without a designated legal establishment in the EU opens its business to regulatory oversight by DPAs in any part of the bloc where there are concerns (and where it offers a service), rather than oversight being funnelled via a lead authority such as Ireland’s Data Protection Commission, as has happened with complaints against Google’s adtech, for example.


Italy fines Apple and Google for ‘aggressive’ data practices

Apple and Google have been fined €10 million apiece by Italy’s competition and market authority (AGCM) which has found they did not provide their users with clear enough information on commercial uses of their data — in violation of the country’s consumer code.

The regulator also accuses the pair of deploying “aggressive” practices to push users to accept the commercial processing.

Apple and Google were both contacted for a response to the AGCM’s sanction. Both said they will appeal.

Google is accused of omitting relevant information at the account creation phase and as consumers are using its services — information the regulator says should be provided in order for people to decide whether or not to consent to its use of their data for commercial ends.

The AGCM has also accused Apple of failing to immediately provide users with clear information on how it uses their information commercially when they create an Apple ID or access its digital stores, such as the App Store.

Apple’s is the rather more surprising sanction — given the company’s carefully cultivated image as a champion of consumer privacy (not to mention the premium its devices and services tend to command vs cheaper, ad-supported alternatives, such as stuff made by Google).

The Italian regulator lumps both companies’ practices together in a press release announcing the sanctions — accusing each one of being especially aggressive in pushing self-serving commercial terms on their respective users, especially at the account creation phase.

For Google, the AGCM notes that it pre-sets user acceptance of commercial processing — and also notes that the adtech giant fails to provide a clear way for users to revoke consent for these data transfers later or otherwise change their choice after the account creation step has been completed.

It also takes the view that Apple’s approach denies users the ability to properly exercise choice over its commercial use of their data, with the regulator arguing the iPhone maker’s data acquisition practices and architecture essentially “condition” the consumer to accept its commercial terms.

It’s an awkward accusation for a company that splashes major marketing cash on suggesting its devices and software are superior to alternatives (such as tech made by Google) exactly because it claims to put user privacy at the core of what it does.

In a statement, Apple rejected the AGCM’s finding — writing:

“We believe the Authority’s view is wrong and will be appealing the decision. Apple has a long-standing commitment to the privacy of our users and we work incredibly hard to design products and features that protect customer data. We provide industry-leading transparency and control to all users so they can choose what information to share or not, and how it is used.”

A Google spokeswoman also disagreed with the findings, sending this statement:

“We have transparent and fair practices in order to provide our users with helpful tools and clear information about their usage. We give people simple controls to manage their information and limit the use of personal data, and we work hard to be fully compliant with the consumer protection rules. We disagree with the Authority’s decision and we will appeal.”

The full text of the AGCM’s decisions can be found here: For Apple and Google.

The Italian regulator has had a busy few days slapping big tech: Earlier this week it issued a $230M fine (total) for Apple and Amazon over alleged collusion around the sale of Apple kit on Amazon’s Italian marketplace.

It has also been stepping up investigations of tech giants over a period of years — earlier this year it fined Facebook over similar issues with its commercial use of people’s data, while this summer it hit Google with a $123M fine related to Android Auto. It also has an open probe into Google’s display advertising business.

Other fines from the AGCM in recent years include one for Apple related to misleading iPhone users about the device’s water resistance and another for Apple and Samsung for slowing devices.

Didomi raises $40 million to help you manage customer consent

French startup Didomi has raised a $40 million Series B funding round led by Elephant and Breega. The company manages consent flows for web publishers and app developers. Didomi is already doing well in Europe with billions of consent interactions per month — it plans to expand to the U.S. with today’s funding round.

“Jawad, Raphaël and I have co-founded Didomi to make privacy easier for everyone and an obvious choice for companies. This fundraising is a major milestone on our journey to deliver on this mission,” co-founder and CEO Romain Gauthier said in a statement.

“We look forward to helping brands and publishers make customer journeys more transparent and trustworthy through a delightful consent and preferences management experience,” he added.

In recent years, many regulators have implemented new privacy-focused frameworks. Think of GDPR in Europe, for instance.

And if you live in a country that is affected by those changes, you are now well aware that you’ll get a consent popup or banner whenever you visit a new website or open an app for the first time.

I wouldn’t say that these popups are “delightful” as the best consent popup is the one that doesn’t exist because the site you’re visiting doesn’t collect and share personal data. But that’s not always possible and there are different reasons why you may need to collect data — including on this current site techcrunch dot com.

In that case, a product like Didomi can be really helpful. Taking those consent flows seriously is extremely important as you don’t want to mess up the implementation and get fined. Didomi is a developer-focused consent platform that works across many different devices. You can configure your consent flow for a desktop website, a mobile website, a mobile app or a connected TV.

Having a unified solution also means that you don’t have to ask for permission over and over again. Didomi can store and synchronize preferences across devices. Everything is auditable in case regulators want to see how you’re collecting consent.
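A minimal sketch of what such an auditable, cross-device consent log could look like, under an invented schema (Didomi’s actual data model will differ):

```python
# Hypothetical sketch of a consent management platform's storage layer:
# an append-only log keeps the full history auditable for regulators,
# while the latest event per (user, purpose) is the current choice and
# can be read back from any of the user's devices. Schema is invented.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class ConsentEvent:
    user_id: str
    purpose: str    # e.g. "analytics", "personalized_ads"
    granted: bool
    device: str     # where the choice was made
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

class ConsentStore:
    def __init__(self) -> None:
        self._log: List[ConsentEvent] = []

    def record(self, event: ConsentEvent) -> None:
        self._log.append(event)  # append-only: never rewrite history

    def current_choice(self, user_id: str, purpose: str) -> Optional[bool]:
        # Most recent event wins; None means the user was never asked.
        for ev in reversed(self._log):
            if ev.user_id == user_id and ev.purpose == purpose:
                return ev.granted
        return None

    def audit_trail(self, user_id: str) -> List[ConsentEvent]:
        return [ev for ev in self._log if ev.user_id == user_id]
```

The append-only design is the key property here: a regulator asking “what did this user agree to, and when?” gets the whole timeline, not just the latest state.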

With today’s funding round, the company wants to make its product even more developer friendly with open APIs and open-source SDKs. That doesn’t mean Didomi is for everyone, as the company focuses on premium clients in particular. Clients include Rakuten, Orange, Giphy and Weight Watchers International.

The company will also hire more people with local marketing and sales teams for different markets. Didomi plans to open offices in Germany, Spain and the U.S.

At the same time, the landscape is quickly evolving. Web browsers are gradually blocking third-party trackers and Apple now even asks you if an app can track you at the operating system level. It’s going to be interesting to see how Didomi evolves with user expectations.

German federal court quashes consent opt-outs for non-functional cookies

Yet another stake through the dark-patterned heart of consentless online tracking. Following a key cookie consent ruling by Europe’s top court last year, Germany’s Federal Court (BGH) has today handed down its own ‘Planet49’ decision — overturning an earlier appeal ruling when judges in a district court had allowed a pre-checked box to stand for consent.

That clearly now won’t wash even in Germany, where there had been confusion over the interpretation of a local law which had suggested an opt-out for non-functional cookies might be legally valid in some scenarios. Instead, the federal court ruling aligns with last October’s CJEU decision (which we reported on in detail here).

The ‘Planet49’ legal challenge was originally lodged by vzbv, a German consumer rights organization, which had complained about a lottery website, Planet49, that — back in 2013 — had required users to consent to the storage of cookies in order to play a promotional game. (Whereas EU law generally requires consent to be freely given and purpose limited if it’s to be legally valid.)

In a statement today following the BGH’s decision, board member Klaus Müller said: “This is a good judgment for consumers and their privacy. Internet users are again given more decision-making authority and transparency. So far, it has been common practice in this country for website providers to track, analyze, and market the interests and behaviors of users until they actively contradict them. This is no longer possible. If a website operator wants to screen his users, he must at least ask for permission beforehand. This clarification was long overdue.”

There is one looming wrinkle, however, in the shape of Europe’s ePrivacy reform — a piece of legislation which deals with online tracking. In recent years, European institutions have failed to reach agreement on an update to this — with negotiations ongoing and lobbyists seeking ways to dilute Europe’s strict consent standard.

Should any future reform of ePrivacy weaken the rules on tracking consent, that could undo hard-won progress to secure European citizens’ rights under the General Data Protection Regulation (GDPR), which deals with personal data more broadly.

vzbv’s statement warns about this possibility, with the consumer rights group urging the EU to “ensure that the currently negotiated European ePrivacy Regulation does not weaken these strict regulations”.

“We reject the Croatian Presidency’s proposal to allow user tracking in the future on the legal basis of a balance of interests,” added Müller. “The end devices of the consumers allow a deep insight into complex emotional, political and social aspects of a person. Protecting this privacy is a great asset. We therefore require tight and clear rules for user tracking for advertising purposes. This may only be permitted with consent or under strict conditions defined in the law.”

In the meantime, there will be legal pressure on data controllers in Germany to clean up any fuzzy cookie notices to ensure they are complying with consent requirements.

“As the implementation of these new requirements are easily visible (and technically identifiable) on the website, incompliance bears a high risk of cease-and-desist and supervisory procedures,” warns law firm TaylorWessing in a blog post commenting on the BGH decision.

Separately today, another long-running legal challenge brought by vzbv against the social networking giant Facebook — for allegedly failing to gain proper consent to process user data related to games hosted on its app platform, back in 2012 — is set to get even longer, after the BGH sought a referral on a legal question to Europe’s top court.

The German federal court is seeking clarification on whether consumer protection organizations can bring a lawsuit before the country’s civil courts seeking redress for data protection breaches. “This question is controversial in the case law of the instance courts and the legal literature,” the court notes in a press release.

We’ve reached out to Facebook for comment on the CJEU referral.

No cookie consent walls — and no, scrolling isn’t consent, says EU data protection body

You can’t make access to your website’s content dependent on a visitor agreeing that you can process their data — aka a ‘consent cookie wall’. Not if you need to be compliant with European data protection law.

That’s the unambiguous message from the European Data Protection Board (EDPB), which has published updated guidelines on the rules around online consent to process people’s data.

Under pan-EU law, consent is one of six lawful bases that data controllers can use when processing people’s personal data.

But in order for consent to be legally valid under Europe’s General Data Protection Regulation (GDPR) there are specific standards to meet: It must be clear and informed, specific and freely given.

Hence cookie walls that demand ‘consent’ as the price for getting inside the club are not only an oxymoron but run into a legal brick wall.

No consent behind a cookie wall

The regional cookie wall has been crumbling for some time, as we reported last year — when the Dutch DPA clarified its guidance to ban cookie walls.

The updated guidelines from the EDPB look intended to hammer the point home. The steering body’s role is to provide guidance to national data protection agencies to encourage a more consistent application of data protection rules.

The EDPB’s intervention should — should! — remove any inconsistencies of interpretation on the updated points by national agencies of the bloc’s 27 Member States. (Though compliance with EU data protection law tends to be a process; aka it’s a marathon, not a sprint — and on the cookie wall issue the ‘runners’ have been going around the track for a considerable time now.)

As we noted in our report on the Dutch clarification last year, the Internet Advertising Bureau Europe was operating a full cookie wall — instructing visitors to ‘agree’ to its data processing terms if they wished to view the content.

The problem we pointed out is that this wasn’t a free choice. Yet EU law requires a free choice for consent to be legally valid. So it’s interesting to note that IAB Europe has, at some point since, updated its cookie consent implementation — removing the cookie wall and offering a fairly clear (if nudged) choice to visitors to either accept or deny cookies for “aggregated statistics”…

As we said at the time the writing was on the wall for consent cookie walls.

The EDPB document includes the below example to illustrate the salient point that consent cookie walls do not “constitute valid consent, as the provision of the service relies on the data subject clicking the ‘Accept cookies’ button. It is not presented with a genuine choice.”

It’s hard to get clearer than that, really.

Scrolling never means ‘take my data’

A second area to get attention in the updated guidance, as a result of the EDPB deciding there was a need for additional clarification, is the issue of scrolling and consent.

Simply put: Scrolling on a website or digital service cannot — in any way — be interpreted as consent.

Or, as the EDPB puts it, “actions such as scrolling or swiping through a webpage or similar user activity will not under any circumstances satisfy the requirement of a clear and affirmative action” [emphasis ours].

The logical reason being that such signals are not unambiguous. (Additionally, the EDPB example raises the question of how a user would withdraw consent if such a signal were valid. By scrolling back up the same web page? Obviously that would be ridiculous and confusing.)
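The EDPB’s rule effectively reduces to a whitelist of affirmative actions: everything ambient fails, and so does anything ambiguous. A hypothetical signal classifier (the signal names here are illustrative, not taken from the guidelines):

```python
# Signals that count as a "clear and affirmative action" vs. activity
# that never can, per the EDPB's reasoning. Names are made up for
# illustration.
AFFIRMATIVE = {"accept_click", "toggle_on", "settings_save"}
AMBIENT = {"scroll", "swipe", "close_banner", "timeout", "page_visit"}


def is_valid_consent_signal(signal: str) -> bool:
    # Only a known affirmative action passes; ambient activity and
    # unknown signals fail, since consent must be unambiguous.
    return signal in AFFIRMATIVE
```

Note the default: a signal that isn’t clearly affirmative is treated as no consent at all, which is the conservative reading the guidance demands.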

Here’s the relevant example from the document:

Again, harder to get clearer than that.

So any websites still trying to drop tracking cookies the moment a site visitor scrolls the page are risking regulatory enforcement. (Reminder: GDPR fines can scale as high as €20M or 4% of global annual turnover.)

Nonetheless, recent research suggests cookie consent theatre remains rife in the EU — and it’s not limited to the ‘scroll and you’ve been tracked’ flavor of the practice.

Manipulative consent pop-ups and dark patterns also remain a major problem, with such tactics being actively deployed to undermine legal protections for EU citizens’ data.

Still, a lot of clarifying light has now been shone into this area by both regulators and courts, shrinking the operating space for bad faith actors.

A ruling by the European Court of Justice last year made it clear that active consent is required for tracking cookies, for example — and also demolished ‘pre-checking’ as a valid way of gathering consent, among other stipulations.

Plus there’s increasing pressure on regulators to actually enforce the rules — with GDPR’s two-year anniversary fast approaching.

So where consent is concerned, the rule of thumb, if you need one, is you can’t steal consent nor conceal consent. And if you wish to shortcut consent you can only do so if your shortcut is A) clearly and accurately signposted and B) you offer a similarly easy route to opt-out again. Simples.

Cookie consent tools are being used to undermine EU privacy rules, study suggests

Most cookie consent pop-ups served to Internet users in the European Union — ostensibly seeking permission to track people’s web activity — are likely to be flouting regional privacy laws, a new study by researchers at MIT, UCL and Aarhus University suggests.

“The results of our empirical survey of CMPs [consent management platforms] today illustrates the extent to which illegal practices prevail, with vendors of CMPs turning a blind eye to — or worse, incentivising — clearly illegal configurations of their systems,” the researchers argue, adding that: “Enforcement in this area is sorely lacking.”

Their findings, published in a paper entitled Dark Patterns after the GDPR: Scraping Consent Pop-ups and Demonstrating their Influence, chime with another piece of research we covered back in August — which also concluded a majority of the current implementations of cookie notices offer no meaningful choice to Europe’s Internet users — even though EU law requires one.

When consent is being relied upon as the legal basis for processing web users’ personal data, the bar for valid (i.e. legal) consent that’s set by the EU’s General Data Protection Regulation (GDPR) is clear: It must be informed, specific and freely given.

Recent jurisprudence by the Court of Justice of the European Union also further crystallized the law around cookies, making it clear that consent must be actively signalled — meaning a digital service cannot infer consent to tracking by indirect actions (such as the pop-up being closed by the user without a response or ignored in favor of interacting with the service).

Many websites use a so-called CMP to solicit consent to tracking cookies. But if it’s configured to contain pre-ticked boxes that opt users into sharing data by default — requiring an affirmative user action to opt out — any gathered ‘consent’ also isn’t legal.

Consent to tracking must also be obtained prior to a digital service dropping or accessing a cookie; only service-essential cookies can be deployed without asking first.
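That “ask first, except essentials” rule can be expressed as a simple gate in front of every cookie write. A sketch — the essential/non-essential split and the purpose names below are assumptions for illustration, not a legal taxonomy:

```python
# Cookies a service genuinely needs to function are exempt from the
# consent requirement (names here are illustrative).
ESSENTIAL_COOKIES = {"session_id", "csrf_token", "load_balancer"}


def may_set_cookie(cookie_name: str, purpose: str,
                   consented_purposes: set) -> bool:
    """Gate every cookie write: essential cookies always pass; anything
    else requires prior consent for its specific purpose."""
    if cookie_name in ESSENTIAL_COOKIES:
        return True
    return purpose in consented_purposes
```

The point of gating by *purpose* rather than a single yes/no flag is that EU consent must be specific — agreeing to analytics doesn’t cover advertising.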

All of which means — per EU law — it should be equally easy for website visitors to choose not to be tracked as to agree to their personal data being processed.

However the Dark Patterns after the GDPR study found that’s very far from the case right now.

“We found that dark patterns and implied consent are ubiquitous,” the researchers write in summary, saying that only slightly more than one in ten (11.8%) of the CMPs they looked at “meet the minimal requirements that we set based on European law” — which they define as being “if it has no optional boxes pre-ticked, if rejection is as easy as acceptance, and if consent is explicit”.
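The study’s three-part threshold translates directly into a predicate. A sketch, using an assumed dictionary shape to describe a CMP configuration (the field names are mine, not the paper’s):

```python
def meets_minimal_requirements(cmp: dict) -> bool:
    """The paper's bar, met by only 11.8% of CMPs studied: no optional
    boxes pre-ticked, rejection as easy as acceptance (measured here in
    clicks), and explicit consent."""
    return (not cmp["pre_ticked_optional"]
            and cmp["reject_clicks"] <= cmp["accept_clicks"]
            and cmp["explicit_consent"])
```

On this definition a banner with a one-click ‘accept all’ but a three-click reject path fails immediately, whatever else it does right.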

For the study, the researchers scraped the top 10,000 UK websites, as ranked by Alexa, to gather data on the most prevalent CMPs in the market — which are made by five companies: QuantCast, OneTrust, TrustArc, Cookiebot, and Crownpeak — and analyzed how the design and configurations of these tools affected Internet users’ choices. (They obtained a data set of 680 CMP instances via their method — a sample they calculate is representative of at least 57% of the total population of the top 10k sites that run a CMP, given prior research found only around a fifth do so.)

Implicit consent — aka (illegally) inferring consent via non-affirmative user actions (such as the user visiting or scrolling on the website or a failure to respond to a consent pop-up or closing it without a response) — was found to be common (32.5%) among the studied sites.

“Popular CMP implementation wizards still allow their clients to choose implied consent, even when they have already indicated the CMP should check whether the visitor’s IP is within the geographical scope of the EU, which should be mutually exclusive,” they note, arguing that: “This raises significant questions over adherence with the concept of data protection by design in the GDPR.”

They also found that the vast majority of CMPs make rejecting all tracking “substantially more difficult than accepting it” — with a majority (50.1%) of studied sites not having a ‘reject all’ button. While only a tiny minority (12.6%) of sites had a ‘reject all’ button accessible with the same or fewer number of clicks as an ‘accept all’ button.

Or, to put it another way, ‘Ohhai dark pattern design‘…

“An ‘accept all’ button was never buried in a second layer,” the researchers go on to point out, also finding that “74.3% of reject all buttons were one layer deep, requiring two clicks to press; 0.9% of them were two layers away, requiring at minimum three.”

Pre-ticked boxes were found to be widely deployed in the studied CMPs as well — despite such a setting not being legally valid. (On this they found: “56.2% of sites pre-ticked optional vendors or purposes/categories, with 54.1% of sites pre-ticking optional purposes, 32.3% pre-ticking optional categories, and 30.3% pre-ticking both”.)

They also point out that the high number of third-party trackers routinely being used by sites poses a major problem for the EU consent model — given it requires a “prohibitively long time” for users to become clearly informed enough to be able to legally consent.

The exact number of third party trackers they found being packed like sardines into CMPs varied — with between tens and several hundreds in play depending on the site.

Fifty-eight vendors was the lower quartile across the studied sites, while the upper quartile was 542 — which was also the average on implementations of QuantCast’s CMP, the highest of any provider. (And, well, just imagine the ‘friction’ involved in manually unticking all those, assuming that was one of the sites that also lacked a ‘reject all’ button… )

As the paper puts it: “Sites relied on a large number of third party trackers, which would take a prohibitively long time for users to inform themselves about clearly. Out of the 85.4% of sites that did list vendors (e.g. third party trackers) within the CMP, there was a median number of 315 vendors (low. quartile 58, upp. quartile 542). Different CMP vendors have different average numbers of vendors, with the highest being QuantCast at 542… 75% of sites had over 58 vendors. 76.47% of sites provide some descriptions of their vendors. The mean total length of these descriptions per site is 7,985 words: roughly 31.9 minutes of reading for the average 250 words-per-minute reader, not counting interaction time to e.g. unfold collapsed boxes or navigating to and reading specific privacy policies of a vendor.”
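The paper’s reading-time figure is straightforward arithmetic — 7,985 words at the average reading speed of 250 words per minute:

```python
def reading_minutes(word_count: int, wpm: int = 250) -> float:
    # Time to read a CMP's vendor descriptions, ignoring the extra
    # interaction time for unfolding boxes or visiting privacy policies.
    return word_count / wpm


# The paper's mean of 7,985 words of vendor descriptions per site works
# out to roughly 31.9 minutes of reading.
```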

A second part of the research was a field experiment with 40 participants, investigating how the eight most common CMP designs affect Internet users’ consent choices.

“We found that notification style (banner or barrier) has no effect [on consent choice]; removing the opt-out button from the first page increases consent by 22–23 percentage points; and providing more granular controls on the first page decreases consent by 8–20 percentage points,” they write in summary on that.

They argue this portion of the study supports the notion that two of the most common consent interface designs – “not showing a ‘reject all’ button on the first page; and showing bulk options before showing granular control” – make it more likely for users to provide consent, thereby “violating the [GDPR] principle of ‘freely given’”.

They also make reference to “qualitative reflections” of the participants in the paper — which were obtained via a survey after individuals’ consent choices had been registered during the field study — suggesting these responses “put into question the entire notice-and-consent model not because of specific design decisions but merely because an action is required before the user can accomplish their main task and because they appear too frequently if they are shown on a website-by-website basis”.

So, in other words, just the fact of interrupting a web user to ask them to make a choice may itself apply substantial enough pressure that it might render any resulting ‘consent’ invalid.

The study’s finding of the prevalence of manipulative designs and configurations intended to nudge or even force consent suggests Internet users in Europe are not actually benefiting from a legal framework that’s supposed to protect their digital data from unwanted exploitation — and are rather being subject to a lot of noisy, distracting and disingenuous ‘consent theatre’.

Cookie notices not only generate friction and frustration for the average Internet user, as they try to go about their daily business online, but the current situation is creating a faux veneer of compliance — atop what is actually a massive trampling of rights via what amounts to digital daylight robbery of people’s data at scale.

The problem here is that EU regulators have for years looked the other way where online tracking is concerned, failing entirely to enforce the on-paper standard.

Enforcement is indeed sorely lacking, as the researchers note. (Industry lobbying/political pressure, limited resources, risk aversion and regulatory capture, and a legacy of inaction around digital rights are all likely to blame.)

And while the GDPR only started being applied in May 2018, Europe has had regulations on data-gathering mechanisms like cookies for approaching two decades — with the paper pointing out that an amendment to the ePrivacy Directive all the way back in 2002 made it a requirement that “storing or accessing information on a user’s device not ‘strictly necessary’ for providing an explicitly requested service requires both clear and comprehensive information and opt-in consent”.

Asked about the research findings, lead author Midas Nouwens questioned why CMP vendors are selling so-called ‘compliance’ tools that allow for non-compliant configurations in the first place.

“It’s sad, but I don’t think anyone is surprised anymore by how few pop-ups comply with the GDPR,” he told TechCrunch. “What is shocking is how non-compliant interface designs are allowed by the companies that provide consent pop-ups. Why do they let their clients count scrolling as consent or bury the decline button somewhere on the third page?”

“Enforcement is really the next big challenge if we don’t want the GDPR to go down the same path as the ePrivacy directive,” he added. “Since enforcement agencies have limited resources, focusing on the popular consent pop-up providers could be a much more effective strategy than targeting individual websites.

“Unfortunately, while we wait for enforcement, the dark patterns in these pop-ups are still manipulating people into being tracked.”

Another of the researchers behind the paper, Michael Veale, a lecturer in digital rights and regulation at UCL, also expressed shock that CMP vendors are allowing their tools to be configured in ways which are clearly intended to manipulate Internet users — thereby flouting the law.

In the paper the researchers urge regulators to take a smarter approach to tackling such widespread violation, such as by making use of automated tools “to expedite discovery and enforcement” of non-compliant cookie notices, and suggest they work “further upstream” — such as by placing requirements on the vendors of CMPs “to only allow compliant designs to be placed on the market”.

“It’s shocking to see how many of the large providers of consent pop-ups allow their systems to be misconfigured, such as through implicit consent, in ways that clearly infringe data protection law,” Veale told us, adding: “I suspect data protection authorities see this widespread illegality and are not sure exactly where to start. Yet if they do not start enforcing these guidelines, it’s unclear when this widespread illegality will start to stop.”

“This study even overestimates compliance, as we don’t focus on what actually happens to the tracking when you click on these buttons, which other recent studies have emphasised in many cases mislead individuals and do nothing at all,” he also pointed out.

We reached out to the UK’s data protection watchdog, the ICO, for a response to the research — and a spokeswoman pointed us to this cookie advice blog post it published last year, saying the advice it contains “still stands”.

In the blog Ali Shah, the ICO’s head of technology policy, suggests there could be some (albeit limited) action from the regulator this year to clean up cookie consent, with Shah writing that: “Cookie compliance will be an increasing regulatory priority for the ICO in the future. However, as is the case with all our powers, any future action would be proportionate and risk-based.”

While European citizens wait for data protection regulators to take meaningful action over systematic breaches of the GDPR — including those attached to consent-less tracking of web users — there is one step European web users can take to shrink the pain of cookie consent pop-ups: The researchers behind the study have built an open source browser extension that can automatically answer pop-ups based on user-customizable preferences.

It’s called Consent-o-Matic — and there are versions available for Firefox and Chrome.

At release the tool can automatically respond to cookie banners built by the five big CMP suppliers (QuantCast, OneTrust, TrustArc, Cookiebot, and Crownpeak).

But since it’s open source, the hope is that others will build on it to expand the types of pop-ups it’s able to auto-respond to. In the absence of a legally enforced ‘Do Not Track’ browser standard, this is about as good as it gets for Internet users desperately seeking easier agency over the online tracking industry.
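Conceptually, this kind of extension amounts to a per-CMP rule set mapping a detection selector to the page elements that express each preference. The real Consent-o-Matic rules live in its repository and target the actual DOM of each CMP; the sketch below models the idea in Python with entirely made-up selectors:

```python
# Hypothetical rule set. Real rules target each CMP's actual markup;
# the CMP name and selectors below are illustrative stand-ins.
RULES = {
    "ExampleCMP": {
        "detect": "#examplecmp-banner",
        "reject": "#examplecmp-reject-all",
        "accept": "#examplecmp-accept-all",
    },
}


def plan_response(selectors_on_page, rules=RULES, preference="reject"):
    """Return (CMP name, selector to click) for the first rule whose
    detection selector appears on the page, else (None, None)."""
    for name, rule in rules.items():
        if rule["detect"] in selectors_on_page:
            return name, rule[preference]
    return None, None
```

Because the rules are data rather than code, adding support for another CMP is a matter of contributing one more entry — which is exactly what an open-source rule set invites.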

In a Twitter thread last month announcing the tool, Nouwens described the project as making use of “adversarial interoperability” as a pro-privacy tactic.

“Automating consent and privacy preferences is not new (DNT and P3P), but this project uses adversarial interoperability, rather than rely on industry self-regulation or buy-in from fundamentally opposed stakeholders (browsers, advertisers, publishers),” he observed.

However he added one caveat, reminding users to be on their guard for further non-compliance from the data suckers — pointing to the earlier research paper also flagged by Veale which found a small portion of sites (~7%) entirely ignore responses to cookie pop-ups and track users regardless of response.

So sometimes even a seamlessly automated ‘no’ to tracking might still sum to being tracked…

Microsoft’s lead EU data watchdog is looking into fresh Windows 10 privacy concerns

The Dutch data protection agency has asked Microsoft’s lead privacy regulator in Europe to investigate ongoing concerns it has attached to how Windows 10 gathers user data.

Back in 2017 the privacy watchdog found Microsoft’s platform to be in breach of local privacy laws on account of how it collects telemetry metadata.

After some back and forth with the regulator, Microsoft made changes to how the software operates in April last year — and it was in the course of testing those changes that the Dutch agency found fresh reasons for concern, discovering what it calls in a press release “new, potentially unlawful, instances of personal data processing”. 

Since the agency’s investigation of Windows 10 started, a new privacy framework has come into force in Europe — the General Data Protection Regulation (GDPR) — which means Microsoft’s lead EU privacy regulator is now the Irish Data Protection Commission (DPC), as that’s where its regional HQ is based. This is why the Dutch agency has referred its latest concerns to Ireland.

It will now be up to the Irish DPC to investigate Windows 10, adding to its already hefty stack of open files on multiple tech giants’ cross-border data processing activities since the GDPR came into force last May.

The regulation steps up the penalties that can be imposed for violations (to up to 4% of a company’s annual global turnover).

A spokeswoman for the Irish DPC confirmed to TechCrunch that it received the Dutch agency’s concerns last month. “Since then the DPC has been liaising with the Dutch DPA to further this matter,” she added. “The DPC has had preliminary engagement with Microsoft and, with the assistance of the Dutch authority, we will shortly be engaging further with Microsoft to seek substantive responses on the concerns raised.”

A Microsoft spokesperson also told us:

The Dutch data protection authority has in the past brought data protection concerns to our attention, which related to the consumer versions of Windows 10, Windows 10 Home and Pro. We will work with the Irish Data Protection Commission to learn about any further questions or concerns it may have, and to address any further questions and concerns as quickly as possible.

Microsoft is committed to protecting our customers’ privacy and putting them in control of their information. Over recent years, in close coordination with the Dutch data protection authority, we have introduced a number of new privacy features to provide clear privacy choices and easy-to-use tools for our individual and small business users of Windows 10. We welcome the opportunity to improve even more the tools and choices we offer to these end users.

The Dutch DPA advises users of Windows 10 to pay close attention to privacy settings when installing and using the software.

“Microsoft is permitted to process personal data if consent has been given in the correct way,” it writes. “We’ve found that Microsoft collect diagnostic and non-diagnostic data. We’d like to know if it is necessary to collect the non-diagnostic data and if users are well informed about this.

“Does Microsoft collect more data than they need to (think about dataminimalization as a base principle of the GDPR). Those questions can only be answered after further examination.”

During the onboarding process for Windows 10, Microsoft makes multiple requests to process user data for various reasons, including ad purposes.

It also deploys the female voice of Cortana, its digital assistant technology, to provide a running commentary on settings screens — which can include some suggestive prompts to agree to its T&Cs. “If you don’t agree, y’know, no Windows!” the human-sounding robot says at one point. It’s not clear whether the Dutch agency’s concerns extend to Microsoft’s use of Cortana to nudge users during the Windows 10 consent flow.