Firefox extends its anti-tracking protection to Android

Firefox announced today that its anti-tracking Total Cookie Protection (TCP) feature is now available on Android. The feature, which will be turned on by default, prevents cross-site tracking, so trackers won’t be able to gather data about your browsing behavior for targeted advertising.

TCP was first introduced in 2021, but it was limited to Firefox’s enhanced tracking protection (ETP) mode, so users had to manually select that security level to get protection from cookie-based tracking. Last year, the company made TCP available and turned on by default in all modes on Firefox for Windows, Mac and Linux. The browser is not getting rid of cookies entirely. Instead, it maintains a “separate cookie jar” for each site, keeping your data within that silo.
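
The “separate cookie jar” idea can be sketched as a store keyed by both the embedding site and the cookie’s origin — an illustrative model only, not Firefox’s actual implementation:

```python
from collections import defaultdict

class PartitionedCookieStore:
    """Toy model of per-site cookie partitioning (not Firefox's real code)."""

    def __init__(self):
        # Jars are keyed by (top-level site, cookie origin), so the same
        # tracker origin gets a different, independent jar under each
        # top-level site the user visits.
        self._jars = defaultdict(dict)

    def set_cookie(self, top_level_site, cookie_origin, name, value):
        self._jars[(top_level_site, cookie_origin)][name] = value

    def get_cookies(self, top_level_site, cookie_origin):
        return dict(self._jars[(top_level_site, cookie_origin)])

store = PartitionedCookieStore()
# tracker.example sets an ID while the user browses site-a.com...
store.set_cookie("site-a.com", "tracker.example", "uid", "abc123")
# ...but when the same tracker is embedded on site-b.com, its jar is
# empty, so the two visits cannot be linked into one profile.
store.get_cookies("site-b.com", "tracker.example")  # -> {}
```

With unpartitioned cookies, both lookups would hit the same jar — which is exactly the cross-site linkage TCP is designed to break.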

TCP on Android will start rolling out to users starting today and will be available to all users by next month.

In contrast to Firefox, Google has delayed its plan to deprecate third-party cookies in Chrome — now set to happen sometime in 2024.

Mozilla is also testing a new feature that generates a Firefox Relay address — an email proxy service — on the fly during a site’s sign-up process. The company says that on some sites — Firefox didn’t specify which — the tool will prompt users to use one of their existing proxy email addresses or create a new one. Mozilla says that it plans to extend this feature to all users and more sites later this year.

Firefox Relay’s masking service offers five email addresses for free. But with its premium service — which has plans starting at $0.99 per month — you can get unlimited email addresses.

Last month, Firefox for Android also got new extensions for removing tracking elements from a URL before sharing it and for listening to articles.

Firefox extends its anti-tracking protection to Android by Ivan Mehta originally published on TechCrunch

UK wants to replace cookie pop-ups with browser-based opt-outs

The UK government has published its final response to a data ‘reform’ consultation it kicked off last year, laying out how it intends to diverge from EU-based data protection rules.

At first pass, it looks like it has stepped away from some of the more extreme ‘reforms’ it had been tossing around — such as removing the right to human review of automated/AI decisions, which the consultation admits was opposed by the “vast majority” of respondents (ergo, the government writes that it “recognises the importance of appropriate safeguards, and will not pursue this proposal”; although it says it’s still considering how to amend Article 22 of the UK GDPR — so watch that space).

That said, there are still a lot of potentially wide-ranging amendments being announced in this package — such as a switch to an opt-out model for most online tracking; which the government is spinning as an end to cookie consent pop-ups but which raises plenty of wider questions — and changes to the UK’s data protection regulator that could still sum to substantial differences for the rights of citizens, businesses and other types of data processors operating in the country.

There’s plenty more incoming from the UK government on the digital policy front too — such as the sprawling Online Safety Bill, which is currently making its way through parliament, and is set to dramatically ramp up compliance demands for all sorts of businesses. So it pays to keep the wider picture in mind as the government spins its pitch of post-Brexit, rebooted data laws that will give British business a “boost” by cutting EU ‘red tape’.

tl;dr plenty of uniquely British red tape is also incoming for your digital operations.

Data Reform Bill

Top line changes the government says it’s moving forward on via the forthcoming Data Reform Bill include amendments to rules around data use for scientific research — which it says will simplify the legal requirements for this sort of processing. (Although the example cited in the departmental press release refers to a machine learning medical research project that’s been undertaken under existing UK data rules so 🤷‍♀️.)

“The Data Reform Bill will more clearly define the scope of scientific research and give scientists clarity about when they can obtain user consent to collect or use data for broad research purposes,” writes the Department for Digital, Culture, Media and Sport (DCMS) in its press release. “This removes the need for them to have the ultimate purpose of their research project finalised before collecting data. For example, scientists will be able to rely on the consent a person has given for their data to be used for ‘cancer research’ as opposed to a particular cancer study.”

The government also says it will enact a number of changes to how businesses can use personal data — including removing the need for smaller entities to have a data protection officer (DPO), or to undertake data impact assessments evaluating the risks of potential uses of personal data.

DCMS is projecting savings of more than £1 billion for businesses over ten years as a result of these deregulatory changes to the UK’s data protection regime.

It argues the (current) EU framework the UK adopted in 2018, when it transposed the bloc’s General Data Protection Regulation (GDPR) into national law, is hamstrung by being “largely one-size-fits all” — which it suggests is especially disadvantageous for small businesses, including startups and scaleups. Hence the reform talks about moving to “an outcomes-based compliance regime for data rights”.

“The government’s new data protection rules will be focused on outcomes to reduce unnecessary burdens on businesses,” says DCMS, dialling up the spin.

“Organisations will still be required to have a privacy management programme to ensure they are accountable for how they process personal data. The same high data protection standards will remain but organisations will have more flexibility to determine how they meet these standards.”

Ofc the devil will be in the detail of what these reforms boil down to in practice. And we don’t yet have full detail, as that would require sight of the planned legislation, which remains pending the government publishing a draft bill. (The government has previously confirmed it will introduce the data reform bill in the current parliamentary session.)

On the planned changes to consent for cookies — which target the universally hated cookie pop-up plague — the response to the consultation sets out a phased ‘reform’ plan, starting with legislating to remove the need for websites to display cookie banners to UK residents by permitting cookies (and similar tracking technologies) to be placed on a user’s device without explicit consent “for a small number of other non-intrusive purposes” — whatever that phrasing means (an earlier reference in the document talks about scrapping consent to drop cookies for audience measurement purposes, or for detecting faults on an organisation’s website).

But there’s more: “In the future, the government intends to move to an opt-out model of consent for cookies placed by websites. In practice, this would mean cookies could be set without seeking consent but the website must give the web user clear information about how to opt out,” continues the consultation response. “This would allow the government to realise its ambition to improve the user experience and remove the need for unnecessary cookie consent banners. The opt-out model would not apply to websites likely to be accessed by children.”

The government’s — let’s politely call it — ambitious plan here is to convince web browsers and websites to finally make a browser-based ‘do-not-track’ signal fly.

“Responses to the government’s consultation highlighted that users value privacy and want control over how their personal data is used. To address this, the government will work with industry and the regulator on browser-based and similar solutions that will help people manage their cookie and opt-out preferences,” it writes. “The government will take forward proposals that require websites to respect automated signals emitted by these technologies — and will move to an opt-out model of consent for cookies only when the government assesses these solutions are widely available for use.”

DCMS suggests that legislative changes to fully switch the UK to an opt-out for online tracking won’t take place until the necessary browser-based “technology” is “effective and readily available so people can set their online cookie preferences to opt out via automated means”.

It doesn’t give any timeframe for when that might be possible, but it pays to remember that the original ‘Do Not Track‘ proposal, to offer a convenient, browser-based opt-out from online tracking, dates back to 2009 yet still hasn’t delivered on the dream — thanks to, er, lack of industry support.
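
For context on what such an “automated signal” looks like at the protocol level: a Do Not Track-aware browser simply attaches a `DNT: 1` request header, and a cooperating site checks for it before loading trackers. The header is real; the checking function below is a minimal sketch of ours, not any site’s actual code:

```python
def tracking_allowed(headers):
    """Return False when the request carries a Do Not Track opt-out.

    Browsers participating in DNT send the header 'DNT: 1'. HTTP header
    names are case-insensitive, so normalize before looking the value up.
    """
    normalized = {name.lower(): value.strip() for name, value in headers.items()}
    return normalized.get("dnt") != "1"

tracking_allowed({"DNT": "1"})               # opted out -> False
tracking_allowed({"User-Agent": "Mozilla"})  # no signal -> True
```

The technology, in other words, was never the hard part — the hard part was getting ad-funded websites to run the `if` statement, which is precisely where DNT foundered.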

Still, the UK government is confident that British exceptionalism can finally make it happen for its small island Internet. So full marks for enthusiasm.

Another chunk of the reform is focused on changes to the Information Commissioner’s Office (ICO), aka the UK’s data protection regulator.

Here DCMS’ press release couches the plan as a “modernization” of the Office — such as by adding a chair, chief executive and a board to “make sure it remains an internationally renowned regulator”.

But the government is also planning to require that the Secretary of State approve ICO statutory codes and guidance before they are presented to parliament — and as UK policy watchers have quickly pointed out, having a political appointee shape ICO policy doesn’t sound very independent from government…

The first big test for the UK’s data reform package probably won’t be public opinion — given that data processing is an inherently wonky, complex and abstract topic, and the government is front-loading its PR around the bill with populist talk of killing cookie pop-ups and — in another announcement — cracking down on nuisance calls (by ramping up fines).

No, the big test will be what the EU will do in response. And whether or not what the government is spinning as “Brexit benefits” will result in the UK losing its precarious “adequacy” status — which is critical for scores of businesses as it allows the smooth in-flow of personal data from the bloc. And would cost UK businesses a lot more than £1BN if it were to be cast into the Brexit abyss.

Just the pure compliance costs of a loss of EU adequacy have been estimated at between £1BN and £1.6BN — so, best case scenario, that potential outcome would immediately gobble up all the ‘red tape’ savings DCMS has attached to the reforms.

The government may be — tacitly — anticipating such an outcome as the reform also talks about “empowering international trade” by striking new “data partnerships”, with priority countries listed as including the United States, Australia, the Republic of Korea and Singapore (only one of which, Korea, has an EU data adequacy agreement).

“The data reforms will support the UK government’s ambitions to strike new data partnerships with important economies and improve international data transfers which a number of technologies rely on, such as GPS navigation, smart home technology and content streaming services,” it writes. “The government’s International Data Transfer Expert Council, made up of global experts on data, will play a major role helping the UK unlock the benefits of free and secure cross-border data flows.

“The group, which combines world-leading academics, organisations such as the World Economic Forum and the Future of Privacy Forum alongside digital industry figures including Google, Mastercard and Microsoft, will be empowered to remove barriers to data flows and ensure services from smart devices to online banking can be provided more reliably, cheaply and securely.”

Another chunk of the reform is focused on public sector data use and sharing — with the government planning to clarify rules on police use of biometric data, saying it will work with policing authorities to “promote high standards and best practice in the responsible and effective use of new technologies, including supporting the development of policing-led guidance such as new codes of conduct”.

However it has decided not to move forward at this time on boosting algorithmic transparency across the public sector, generally — despite the majority of responses to the government consultation backing a proposal to introduce compulsory transparency reporting on the use of algorithms in decision-making for public sector bodies; and despite the government acknowledging that “increasing transparency of the use of algorithmic tools for decision-making in the public sector is critical for maintaining public trust”; and also despite a UN warning — back in 2018 — over the human rights risks of the Tories’ rush to implement an ‘algorithmic welfare state’.

Why is the government not doing anything on this critical issue? DCMS claims algorithmic transparency standard work is at too “early” a stage to legislate on transparency reporting. But it says it will continue pilots of the standard and gather feedback, claiming that it’s “strongly committed to algorithmic transparency” and will “explore policy enforcement options in the future”.

Commenting on the data reform package in a statement, digital secretary Nadine Dorries said:

“Today is an important step in cementing post-Brexit Britain’s position as a science and tech superpower. Our new Data Reform Bill will make it easier for businesses and researchers to unlock the power of data to grow the economy and improve society, but retains our global gold standard for data protection.

Outside of the EU we can ensure people can control their personal data, while preventing businesses, researchers and civil society from being held back by a lack of clarity and cumbersome EU legislation.”

In another supporting statement, John Edwards, the UK’s information commissioner, gave a rather more cautious assessment of what the reforms will mean in practice — writing:

“I share and support the ambition of these reforms. I am pleased to see the government has taken our concerns about independence on board. Data protection law needs to give people confidence to share their information to use the products and services that power our economy and society. The proposed changes will ensure my office can continue to operate as a trusted, fair and impartial regulator, and enable us to be more flexible and target our action in response to the greatest harms.

“We look forward to continuing to work constructively with the government as the proposals are progressed and will continue to monitor how these reforms are expressed in the Bill.”

After years of inaction against adtech, UK’s ICO calls for browser-level controls to fix ‘cookie fatigue’

In the latest quasi-throwback toward ‘do not track‘, the UK’s data protection chief has come out in favor of a browser- and/or device-level setting to allow Internet users to set “lasting” cookie preferences — suggesting this as a fix for the barrage of consent pop-ups that continues to infest websites in the region.

European web users digesting this development in an otherwise monotonously unchanging regulatory saga should be forgiven not only for any sense of déjà vu they may experience, but also for wondering if they haven’t been mocked/gaslit quite enough already where cookie consent is concerned.

Last month, UK digital minister Oliver Dowden took aim at what he dubbed an “endless” parade of cookie pop-ups — suggesting the government is eyeing watering down consent requirements around web tracking as ministers consider how to diverge from European Union data protection standards, post-Brexit. (He’s slated to present the full sweep of the government’s data ‘reform’ plans later this month so watch this space.)

Today the UK’s outgoing information commissioner, Elizabeth Denham, stepped into the fray to urge her counterparts in G7 countries to knock heads together and coalesce around the idea of letting web users express generic privacy preferences at the browser/app/device level, rather than having to do it through pop-ups every time they visit a website.

In a statement announcing “an idea” she will present this week during a virtual meeting of fellow G7 data protection and privacy authorities — less pithily described in the press release as being “on how to improve the current cookie consent mechanism, making web browsing smoother and more business friendly while better protecting personal data” — Denham said: “I often hear people say they are tired of having to engage with so many cookie pop-ups. That fatigue is leading to people giving more personal data than they would like.

“The cookie mechanism is also far from ideal for businesses and other organisations running websites, as it is costly and it can lead to poor user experience. While I expect businesses to comply with current laws, my office is encouraging international collaboration to bring practical solutions in this area.”

“There are nearly two billion websites out there taking account of the world’s privacy preferences. No single country can tackle this issue alone. That is why I am calling on my G7 colleagues to use our convening power. Together we can engage with technology firms and standards organisations to develop a coordinated approach to this challenge,” she added.

Contacted for more on this “idea”, an ICO spokeswoman reshuffled the words thusly: “Instead of trying to effect change through nearly 2 billion websites, the idea is that legislators and regulators could shift their attention to the browsers, applications and devices through which users access the web.

“In place of click-through consent at a website level, users could express lasting, generic privacy preferences through browsers, software applications and device settings – enabling them to set and update preferences at a frequency of their choosing rather than on each website they visit.”

Of course a browser-baked ‘Do not track’ (DNT) signal is not a new idea. It’s around a decade old at this point. Indeed, it could be called the idea that can’t die because it’s never truly lived — as earlier attempts at embedding user privacy preferences into browser settings were scuppered by lack of industry support.

However the approach Denham is advocating, vis-a-vis “lasting” preferences, may in fact be rather different to DNT — given her call for fellow regulators to engage with the tech industry, and its “standards organizations”, and come up with “practical” and “business friendly” solutions to the regional Internet’s cookie pop-up problem.

It’s not clear what consensus — practical or, er, simply pro-industry — might result from this call. If anything.

Indeed, today’s press release may be nothing more than Denham trying to raise her own profile since she’s on the cusp of stepping out of the information commissioner’s chair. (Never waste a good international networking opportunity and all that — her counterparts in the US, Canada, Japan, France, Germany and Italy are scheduled for a virtual natter today and tomorrow, where she implies she’ll try to engage them with her big idea.)

Her UK replacement, meanwhile, is already lined up. So anything Denham personally champions right now, at the end of her ICO chapter, may have a very brief shelf life — unless she’s set to parachute into a comparable role at another G7 caliber data protection authority.

Nor is Denham the first person to make a revived pitch for a rethink on cookie consent mechanisms — even in recent years.

Last October, for example, a US-centric tech-publisher coalition came out with what they called Global Privacy Control (GPC) — aiming to build momentum for a browser-level pro-privacy signal to stop the sale of personal data, geared toward the California Consumer Privacy Act (CCPA), though pitched as something that could have wider utility for Internet users.

By January this year they announced 40M+ users were making use of a browser or extension that supports GPC — along with a clutch of big name publishers signed up to honor it. But it’s fair to say its global impact so far remains limited. 

More recently, European privacy group noyb published a technical proposal for a European-centric automated browser-level signal that would let regional users configure advanced consent choices — enabling the more granular controls it said would be needed to fully mesh with the EU’s more comprehensive (vs CCPA) legal framework around data protection.

The proposal, for which noyb worked with the Sustainable Computing Lab at the Vienna University of Economics and Business, is called Advanced Data Protection Control (ADPC). And noyb has called on the EU to legislate for such a mechanism — suggesting there’s a window of opportunity as lawmakers there are also keen to find ways to reduce cookie fatigue (a stated aim for the still-in-train reform of the ePrivacy rules, for example).

So there are some concrete examples of what practical, less fatiguing yet still pro-privacy consent mechanisms might look like to lend a little more color to Denham’s ‘idea’ — although her remarks today don’t reference any such existing mechanisms or proposals.

(When we asked the ICO for more details on what she’s advocating for, its spokeswoman didn’t cite any specific technical proposals or implementations, historical or contemporary, either, saying only: “By working together, the G7 data protection authorities could have an outsized impact in stimulating the development of technological solutions to the cookie consent problem.”)

So Denham’s call to the G7 does seem rather low on substance vs profile-raising noise.

In any case, the really big elephant in the room here is the lack of enforcement around cookie consent breaches — including by the ICO.

Add to that, there’s the now very pressing question of how exactly the UK will ‘reform’ domestic law in this area (post-Brexit) — which makes the timing of Denham’s call look, well, interestingly opportune. (And difficult to interpret as anything other than opportunistically opaque at this point.)

The adtech industry will of course be watching developments in the UK with interest — and would surely be cheering from the rooftops if domestic data protection ‘reform’ results in amendments to UK rules that allow the vast majority of websites to avoid having to ask Brits for permission to process their personal data, say by opting them into tracking by default (under the guise of ‘fixing’ cookie friction and cookie fatigue for them).

That would certainly be mission accomplished after all these years of cookie-fatigue-generating-cookie-consent-non-compliance by surveillance capitalism’s industrial data complex.

It’s not yet clear which way the UK government will jump — but eyebrows should be raised on reading the ICO write today that it expects compliance with (current) UK law when it has so roundly failed to tackle the adtech industry’s role in cynically sicking up said cookie fatigue, having taken no action against such systemic breaches.

The bald fact is that the ICO has — for years — avoided tackling adtech abuse of data protection, despite acknowledging publicly that the sector is wildly out of control.

Instead, it has opted for a cringing ‘process of engagement’ (read: appeasement) that has condemned UK Internet users to cookie pop-up hell.

This is why the regulator is being sued for inaction — after it closed a long-standing complaint against the security abuse of people’s data in real-time bidding ad auctions with nothing to show for it… So, yes, you can be forgiven for feeling gaslit by Denham’s call for action on cookie fatigue following the ICO’s repeat inaction on the causes of cookie fatigue…

Not that the ICO is alone on that front, however.

There has been a fairly widespread failure by EU regulators to tackle systematic abuse of the bloc’s data protection rules by the adtech sector — with a number of complaints (such as this one against the IAB Europe’s self-styled ‘transparency and consent framework’) still working, painstakingly, through the various labyrinthine regulatory processes.

France’s CNIL has probably been the most active in this area — last year slapping Amazon and Google with fines of $42M and $120M for dropping tracking cookies without consent, for example. (And before you accuse CNIL of being ‘anti-American’, it has also gone after domestic adtech.)

But elsewhere — notably Ireland, where many adtech giants are regionally headquartered — the lack of enforcement against the sector has allowed for cynical, manipulative and/or meaningless consent pop-ups to proliferate as the dysfunctional ‘norm’, while investigations have failed to progress and EU citizens have been forced to become accustomed, not to regulatory closure (or indeed rapture), but to an existentially endless consent experience that’s now being (re)branded as ‘cookie fatigue’.

Yes, even with the EU’s General Data Protection Regulation (GDPR) coming into application in 2018 and beefing up (in theory) consent standards.

This is why the privacy campaign group noyb is now lodging scores of complaints against cookie consent breaches — to try to force EU regulators to actually enforce the law in this area, even as it also finds time to put up a practical technical proposal that could help shrink cookie fatigue without undermining data protection standards. 

It’s a shining example of action that has yet to inspire the lion’s share of the EU’s actual regulators to act on cookies. The tl;dr is that EU citizens are still waiting for the cookie consent reckoning — even if there is now a bit of high level talk about the need for ‘something to be done’ about all these tedious pop-ups.

The problem is that while GDPR certainly cranked up the legal risk on paper, without proper enforcement it’s just a paper tiger. And the pushing around of lots of paper is very tedious, clearly. 

Most cookie pop-ups you’ll see in the EU are thus essentially privacy theatre; at the very least they’re unnecessarily irritating because they create ongoing friction for web users who must constantly respond to nags for their data (typically to repeatedly try to deny access if they can actually find a ‘reject all’ setting).

But — even worse — many of these pervasive pop-ups are actively undermining the law (as a number of studies have shown) because the vast majority do not meet the legal standard for consent.

So the cookie consent/fatigue narrative is actually a story of faux compliance enabled by an enforcement vacuum that’s now also encouraging the watering down of privacy standards as a result of so much unpunished flouting of the law.

There is a lesson here, surely.

‘Faux consent’ pop-ups that you can easily stumble across when surfing the ‘ad-supported’ Internet in Europe include those failing to provide users with clear information about how their data will be used; or not offering people a free choice to reject tracking without being penalized (such as with no/limited access to the content they’re trying to access), or at least giving the impression that accepting is a requirement to access said content (dark pattern!); and/or otherwise manipulating a person’s choice by making it super simple to accept tracking and far, far, far more tedious to deny.

You can also still sometimes find cookie notices that don’t offer users any choice at all — and just pop up to inform that ‘by continuing to browse you consent to your data being processed’ — which, unless the cookies in question are literally essential for provision of the webpage, is basically illegal. (Europe’s top court made it abundantly clear in 2019 that active consent is a requirement for non-essential cookies.)

Nonetheless, to the untrained eye — and sadly there are a lot of them where cookie consent notices are concerned — it can look like it’s Europe’s data protection law that’s the ass because it seemingly demands all these meaningless ‘consent’ pop-ups, which just gloss over an ongoing background data grab anyway.

The truth is regulators should have slapped down these manipulative dark patterns years ago.

The problem now is that regulatory failure is encouraging political posturing — and, in a twisting double-back throw by the ICO, regulatory thrusting around the idea that some newfangled mechanism is what’s really needed to remove all this universally inconvenient ‘friction’.

An idea like noyb’s ADPC does indeed look very useful in ironing out the widespread operational wrinkles wrapping the EU’s cookie consent rules. But when it’s the ICO suggesting a quick fix after the regulatory authority has failed so spectacularly over the long duration of complaints around this issue you’ll have to forgive us for being sceptical.

In such a context the notion of ‘cookie fatigue’ looks like it’s being suspiciously trumped up; fixed on as a convenient scapegoat to rechannel consumer frustration with hated online tracking toward high privacy standards — and away from the commercial data-pipes that demand all these intrusive, tedious cookie pop-ups in the first place — whilst neatly aligning with the UK government’s post-Brexit political priorities on ‘data’.

Worse still: The whole farcical consent pantomime — which the adtech industry has aggressively engaged in to try to sustain a privacy-hostile business model in spite of beefed up European privacy laws — could be set to end in genuine tragedy for user rights if standards end up being slashed to appease the law mockers.

The target of regulatory ire and political anger should really be the systematic law-breaking that’s held back privacy-respecting innovation and non-tracking business models — by making it harder for businesses that don’t abuse people’s data to compete.

Governments and regulators should not be trying to dismantle the principle of consent itself. Yet — at least in the UK — that does now look horribly possible.

Laws like GDPR set high standards for consent which — if they were but robustly enforced — could lead to reform of highly problematic practices like behavioral advertising combined with the out-of-control scale of programmatic advertising.

Indeed, we should already be seeing privacy-respecting forms of advertising being the norm, not the alternative — free to scale.

Instead, thanks to widespread inaction against systematic adtech breaches, there has been little incentive for publishers to reform bad practices and end the irritating ‘consent charade’ — which keeps cookie pop-ups mushrooming forth, oftentimes with ridiculously lengthy lists of data-sharing ‘partners’ (i.e. if you do actually click through the dark patterns to try to understand what is this claimed ‘choice’ you’re being offered).

As well as being a criminal waste of web users’ time, we now have the prospect of attention-seeking, politically charged regulators deciding that all this ‘friction’ justifies giving data-mining giants carte blanche to torch user rights — if the intention is to fire up the G7 to send a collect invite to the tech industry to come up with “practical” alternatives to asking people for their consent to track them — and all because authorities like the ICO have been too risk averse to actually defend users’ rights in the first place.

Dowden’s remarks last month suggest the UK government may be preparing to use cookie consent fatigue as convenient cover for watering down domestic data protection standards — at least if it can get away with the switcheroo.

Nothing in the ICO’s statement today suggests it would stand in the way of such a move.

Now that the UK is outside the EU, the UK government has said it believes it has an opportunity to deregulate domestic data protection — although it may find there are legal consequences for domestic businesses if it diverges too far from EU standards.

Denham’s call to the G7 naturally includes a few EU countries (the biggest economies in the bloc) but by targeting this group she’s also seeking to engage regulators further afield — in jurisdictions that currently lack a comprehensive data protection framework. So if the UK moves, cloaked in rhetoric of ‘Global Britain’, to water down its (EU-based) high domestic data protection standards it will be placing downward pressure on international aspirations in this area — as a counterweight to the EU’s geopolitical ambitions to drive global standards up to its level.

The risk, then, is a race to the bottom on privacy standards among Western democracies — at a time when awareness about the importance of online privacy, data protection and information security has actually never been higher.

Furthermore, any UK move to weaken data protection also risks putting pressure on the EU’s own high standards in this area — as the regional trajectory would be down not up. And that could, ultimately, give succour to forces inside the EU that lobby against its commitment to a charter of fundamental rights — by arguing such standards undermine the global competitiveness of European businesses.

So while cookies themselves — or indeed ‘cookie fatigue’ — may seem an irritatingly small concern, the stakes attached to this tug of war around people’s rights over what can happen to their personal data are very high indeed.

Europe needs to back browser-level controls to fix cookie consent nightmares, says privacy group

European privacy group noyb, which recently kicked off a major campaign targeting rampant abuse of the region’s cookie consent rules, has followed up by publishing a technical proposal for an automated browser-level signal it believes could go even further to tackle the friction generated by endless ‘your data choices’ pop-ups.

Its proposal is for an automated signal layer that would enable users to configure advanced consent choices — such as only being asked to allow cookies if they frequently visit a website; or being able to whitelist sites for consent (if, for example, they want to support quality journalism by allowing their data to be used for ads in those specific cases).

The approach would offer a route to circumvent the user experience nightmare flowing from all the dark pattern design that’s made cookie consent collection so cynical, confusing and tedious — by simply automating the yeses and noes, thereby keeping interruptions to a user-defined minimum.

In the European Union cookie consent banners mushroomed in the wake of a 2018 update to the bloc’s privacy rules (GDPR) — especially on websites that rely on targeted advertising to generate revenue. And in recent years it has not been unusual to find cookie pop-ups that contain a labyrinthine hell of opacity — culminating (if you don’t just click ‘agree’) in vast menus of ‘trusted partners’ all after your data. Some of which are pre-set to share information and require the user to individually toggle each and every one off.

Such stuff is a mockery of compliance, rather than the truly simple choice envisaged by the law. So noyb’s earlier campaign is focused on filing scores of complaints against sites it believes aren’t complying with requirements to provide users with a clear and free choice to say no to their data being used for ads (and it’s applying a little automation tech there too to help scale up the number of complaints it can file).

Its follow-up here — showing how an advanced control layer that signals user choices in the background could work — shares the same basic approach as the ‘Do Not Track’ proposals first floated for baking into web browsers all the way back in 2009 but which failed to get industry buy-in. There has also been a more recent US-based push to revive the idea of browser-level privacy control — buoyed by the California Consumer Privacy Act (CCPA), which took effect at the start of last year, and includes a requirement that businesses respect user opt-out preferences via a signal from their browser.
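To make that browser-level signal concrete: under the GPC proposal a participating browser attaches a `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to page scripts), and a cooperating site treats it as a do-not-sell opt-out. A minimal server-side sketch, in Python purely for illustration; only the header name comes from the GPC proposal, while the function names and the contextual-vs-behavioral policy are invented here:

```python
def gpc_opt_out(headers):
    """True if the request carries a Global Privacy Control opt-out.

    Per the GPC proposal, a browser asserting the preference sends the
    request header "Sec-GPC: 1"; absence (or any other value) means no
    preference was expressed.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


def choose_ad_mode(headers):
    # Hypothetical policy hook: fall back to contextual ads when the
    # visitor has signalled an opt-out of personal-data sale/sharing.
    return "contextual" if gpc_opt_out(headers) else "behavioral"
```

The point of the design is that the user expresses the preference once, in the browser, and every request carries it silently; no pop-up is needed on either side.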

However noyb’s version of browser-level privacy control seeks to go further by enabling more granular controls — which it says is necessary to better mesh with the EU’s nuanced legal framework around data protection.

It points out that Article 21(5) of the GDPR already allows for automatic signals from the browser to inform websites in the background whether a user is consenting to data processing or not.

The ePrivacy Regulation proposal, a much delayed reform of the bloc’s rules around electronic privacy, has also included such a provision.

However noyb says development to establish such a signal hasn’t happened yet — suggesting that cynically manipulative consent management platforms may well have been hampering privacy-focused innovation.

But it also sees a chance for the necessary momentum to build behind the idea.

For example, it points to how Apple has recently been dialling up the notification and control it offers users of its mobile platform, iOS, to allow people to both know which third party apps want to track them and allow or deny access to their data — including giving users a super simple ‘deny all third party tracking’ option baked into iOS’ settings.

So, well, why should Internet users who happen to be browsing on a desktop device not have a set of similarly advanced privacy controls too?

EU lawmakers are also still debating the ePrivacy Regulation reform — which deals centrally with cookies — so the campaign group wants to demonstrate how automated control tech could be a key piece of the answer to so-called ‘cookie consent fatigue’; by giving users a modern toolset to shrink consent friction without compromising their ability to control what happens with their data.

In order to work as intended automated signals would need to be legally binding (to prevent adtech companies just ignoring them) — and having a clear legal basis set out in the ePrivacy Regulation is one way that could happen within fairly short order.

The chance at least is there.

There have been concerns that the ePrivacy reform — which was stalled for years — could end up weakening the EU’s data protection framework in the face of massive adtech industry lobbying. And the negotiation process to reach a final text remains ongoing. So it’s still not clear where it’s going to end up.

But, earlier this year, the European Council agreed its negotiating mandate with the other EU institutions. And, on cookies, the Council said they want companies to find ways to reduce ‘cookie consent fatigue’ among users — such as by whitelisting types of cookies/providers in their browser settings. So there is at least a potential path to legislate for an effective browser-level control layer in Europe.

For now, noyb has published a prototype and a technology specification for what it’s calling the ADPC (aka Advanced Data Protection Control). The work on the framework has been carried out by noyb working with the Sustainable Computing Lab at the Vienna University of Economics and Business.

The proposal envisages web pages sending privacy requests in a machine-readable way, with ADPC allowing the response to be transmitted using header signals or via JavaScript. noyb likens the intelligent management of queries and automatic responses such a system could support to an email spam filter.

Commenting in a statement, chairman Max Schrems said: “For Europe, we need more than just an ‘opt-out’ so that it fits into our legal framework. That’s why we call the prototype ‘Advanced’ Data Protection Control, because it’s much more flexible and specific than previous approaches.

“ADPC allows intelligent management of privacy requests. A user could say, for example, ‘please ask me only after I’ve been to the site several times’ or ‘ask me again after 3 months.’ It is also possible to answer similar requests centrally. ADPC thus allows the flood of data requests to be managed in a meaningful way.”
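Schrems’ examples translate naturally into a small amount of browser-side state. The sketch below (Python, purely illustrative: the header strings approximate the shape of noyb’s draft, and every class, method name and threshold is an assumption rather than the spec) shows how a browser could decide when a prompt is actually due and what background signal to send otherwise:

```python
import time

REASK_AFTER = 90 * 24 * 3600  # "ask me again after 3 months"


class ConsentStore:
    """Illustrative ADPC-style consent manager (not the real spec)."""

    def __init__(self, min_visits=3):
        self.min_visits = min_visits  # "ask me only after several visits"
        self.visits = {}      # site -> visit count
        self.decisions = {}   # site -> (consented, unix timestamp)

    def should_prompt(self, site, now=None):
        """Record a visit and decide whether a consent prompt is due."""
        now = time.time() if now is None else now
        self.visits[site] = self.visits.get(site, 0) + 1
        if site in self.decisions:
            _, decided_at = self.decisions[site]
            return now - decided_at > REASK_AFTER  # re-ask after 3 months
        return self.visits[site] >= self.min_visits

    def record(self, site, consented, now=None):
        self.decisions[site] = (consented, time.time() if now is None else now)

    def request_header(self, site):
        """Background signal for this site (header grammar approximated)."""
        if site not in self.decisions:
            return None  # no decision yet: send nothing
        consented, _ = self.decisions[site]
        return "ADPC: consent=cookies" if consented else "ADPC: withdraw=*"
```

The spam-filter analogy holds: routine requests are answered automatically from stored preferences, and only genuinely new or stale questions surface to the user.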

“With ADPC, we also want to show the European legislator that such a signal is feasible and brings advantages for all sides,” he added. “We hope that the negotiators of the member states and the European Parliament will ensure a solid legal basis here, which could be applicable law in a short time. What California has done already, the EU should be able to do as well.”

The Commission has been contacted for comment on noyb’s ADPC.

While there are wider industry shifts afoot to deprecate tracking cookies altogether — with Google proposing to replace current adtech infrastructure supported by Chrome with an alternative stack of (it claims) more privacy respecting alternatives (aka its Privacy Sandbox) — there’s still plenty of uncertainty over what will ultimately happen to third party cookies.

Google’s move to end support for tracking cookies is being closely scrutinized by regional antitrust regulators. And just last week the UK’s Competition and Markets Authority (CMA), which is investigating a number of complaints about the plan, said it’s minded to accept concessions from Google that would mean the regulator could order it not to switch off tracking cookies.

Moreover, even if tracking cookies do finally crumble there is still the question of what exactly they get replaced with — and how alternative adtech infrastructure could impact user privacy.

Google’s so-called ‘Privacy Sandbox’ proposal to target ads at cohorts of users (based on bucketed ‘interests’ its technology will assign them via on-device analysis of their browsing habits) has raised fresh concerns about the risks of exploitative and predatory advertising. So it may be no less important for users to have meaningful browser-level controls over their privacy choices in the future — even if the tracking cookie itself goes away.

A browser-level signal could offer a way for a web user to say ‘no’ to being stuck in an ‘interest bucket’ for ad targeting purposes, for example — signalling that they prefer to see only contextual ads instead, say.

tl;dr: The issue of consent does not only affect cookies — and it’s telling that Google has avoided running the first trials of its replacement tech for tracking cookies (FLoC, or Federated Learning of Cohorts) in Europe.


EU’s top privacy regulator urges ban on surveillance-based ad targeting

The European Union’s lead data protection supervisor has recommended that a ban on targeted advertising based on tracking Internet users’ digital activity be included in a major reform of digital services rules which aims to increase operators’ accountability, among other key goals.

The European Data Protection Supervisor (EDPS), Wojciech Wiewiórowski, made the call for a ban on surveillance-based targeted ads in reference to the Commission’s Digital Services Act (DSA) — following a request for consultation from EU lawmakers.

The DSA legislative proposal was introduced in December, alongside the Digital Markets Act (DMA) — kicking off the EU’s (often lengthy) co-legislative process which involves debate and negotiations in the European Parliament and Council on amendments before any final text can be agreed for approval. This means battle lines are being drawn to try to influence the final shape of the biggest overhaul to pan-EU digital rules for decades — with everything to play for.

The intervention by Europe’s lead data protection supervisor calling for a ban on targeted ads is a powerful pre-emptive push against attempts to water down legislative protections for consumer interests.

The Commission had not gone so far in its proposal — but big tech lobbyists are certainly pushing in the opposite direction so the EDPS taking a strong line here looks important.

In his opinion on the DSA the EDPS writes that “additional safeguards” are needed to supplement risk mitigation measures proposed by the Commission — arguing that “certain activities in the context of online platforms present increasing risks not only for the rights of individuals, but for society as a whole”.

Online advertising, recommender systems and content moderation are the areas the EDPS is particularly concerned about.

“Given the multitude of risks associated with online targeted advertising, the EDPS urges the co-legislators to consider additional rules going beyond transparency,” he goes on. “Such measures should include a phase-out leading to a prohibition of targeted advertising on the basis of pervasive tracking, as well as restrictions in relation to the categories of data that can be processed for targeting purposes and the categories of data that may be disclosed to advertisers or third parties to enable or facilitate targeted advertising.”

It’s the latest regional salvo aimed at mass-surveillance-based targeted ads after the European Parliament called for tighter rules back in October — when it suggested EU lawmakers should consider a phased in ban.

Again, though, the EDPS is going a bit further here in actually calling for one. (Facebook’s Nick Clegg will be clutching his pearls.)

More recently, the CEO of European publishing giant Axel Springer, a long time co-conspirator of adtech interests, went public with a (rather protectionist-flavored) rant about US-based data-mining tech platforms turning citizens into “the marionettes of capitalist monopolies” — calling for EU lawmakers to extend regional privacy rules by prohibiting platforms from storing personal data and using it for commercial gain at all.

Apple CEO, Tim Cook, also took to the virtual stage of a (usually) Brussels based conference last month to urge Europe to double down on enforcement of its flagship General Data Protection Regulation (GDPR).

In the speech Cook warned that the adtech ‘data complex’ is fuelling a social catastrophe by driving the spread of disinformation as it works to profit off of mass manipulation. He went on to urge lawmakers on both sides of the pond to “send a universal, humanistic response to those who claim a right to users’ private information about what should not and will not be tolerated”. So it’s not just European companies (and institutions) calling for pro-privacy reform of adtech.

The iPhone maker is preparing to introduce stricter limits on tracking on its smartphones by making apps ask users for permission to track, instead of just grabbing their data — a move that’s naturally raised the hackles of the adtech sector, which relies on mass surveillance to power ‘relevant’ ads.

Hence the adtech industry has resorted to crying ‘antitrust‘ as a tactic to push competition regulators to block platform-level moves against its consentless surveillance. And on that front it’s notable that the EDPS’ opinion on the DMA, which proposes extra rules for intermediating platforms with the most market power, reiterates the vital links between competition, consumer protection and data protection law — saying these three are “inextricably linked policy areas in the context of the online platform economy”; and that there “should be a relationship of complementarity, not a relationship where one area replaces or enters into friction with another”.

Wiewiórowski also takes aim at recommender systems in his DSA opinion — saying these should not be based on profiling by default to ensure compliance with regional data protection rules (where privacy by design and default is supposed to be the legal default).

Here too he calls for additional measures to beef up the Commission’s legislative proposal — with the aim of “further promot[ing] transparency and user control”.

This is necessary because such systems have “significant impact”, the EDPS argues.

The role of content recommendation engines in driving Internet users towards hateful and extremist points of view has long been a subject of public scrutiny. Back in 2017, for example, UK parliamentarians grilled a number of tech companies on the topic — raising concerns that AI-driven tools, engineered to maximize platform profit by increasing user engagement, risked automating radicalization, causing damage not just to the individuals who become hooked on the hateful views the algorithms feed them but cascading knock-on harms for all of us as societal cohesion is eaten away in the name of keeping the eyeballs busy.

Yet years on little information is available on how such algorithmic recommender systems work because the private companies that operate and profit off these AIs shield the workings as proprietary business secrets.

The Commission’s DSA proposal takes aim at this sort of secrecy as a bar to accountability — with its push for transparency obligations. The proposed obligations (in the initial draft) include requirements for platforms to provide “meaningful” criteria used to target ads; and explain the “main parameters” of their recommender algorithms; as well as requirements to foreground user controls (including at least one “nonprofiling” option).

However the EDPS wants regional lawmakers to go further in the service of protecting individuals from exploitation (and society as a whole from the toxic byproducts that flow from an industry based on harvesting personal data to manipulate people).

On content moderation, Wiewiórowski’s opinion stresses that this should “take place in accordance with the rule of law” — though the Commission draft has favored leaving it to platforms to interpret the law.

“Given the already endemic monitoring of individuals’ behaviour, particularly in the context of online platforms, the DSA should delineate when efforts to combat ‘illegal content’ legitimise the use of automated means to detect, identify and address illegal content,” he writes, in what looks like a tacit recognition of recent CJEU jurisprudence in this area.

“Profiling for purposes of content moderation should be prohibited unless the provider can demonstrate that such measures are strictly necessary to address the systemic risks explicitly identified by the DSA,” he adds.

The EDPS has also suggested minimum interoperability requirements for very large platforms, and for those designated as ‘gatekeepers’ (under the DMA), and urges lawmakers to work to promote the development of technical standards to help with this at the European level.

On the DMA, he also urges amendments to ensure the proposal “complements the GDPR effectively”, as he puts it, calling for “increasing protection for the fundamental rights and freedoms of the persons concerned, and avoiding frictions with current data protection rules”.

Among the EDPS’ specific recommendations are: That the DMA makes it clear that gatekeeper platforms must provide users with easier and more accessible consent management; clarification to the scope of data portability envisaged in the draft; and rewording of a provision that requires gatekeepers to provide other businesses with access to aggregated user data — again with an eye on ensuring “full consistency with the GDPR”.

The opinion also raises the issue of the need for “effective anonymisation” — with the EDPS calling for “re-identification tests when sharing query, click and view data in relation to free and paid search generated by end users on online search engines of the gatekeeper”.

ePrivacy reform emerges from stasis

Wiewiorówski’s contributions to shaping incoming platform regulations come on the same day that the European Council has finally reached agreement on its negotiating position for a long-delayed EU reform effort around existing ePrivacy rules.

In a press release announcing the development, the Commission writes that Member States agreed on a negotiating mandate for revised rules on the protection of privacy and confidentiality in the use of electronic communications services.

“These updated ‘ePrivacy’ rules will define cases in which service providers are allowed to process electronic communications data or have access to data stored on end-users’ devices,” it writes, adding: “Today’s agreement allows the Portuguese presidency to start talks with the European Parliament on the final text.”

Reform of the ePrivacy directive has been stalled for years as conflicting interests locked horns — putting paid to the (prior) Commission’s hopes that the whole effort could be done and dusted in 2018. (The original ePrivacy reform proposal came out in January 2017; four years later the Council has finally settled on its negotiating mandate.)

The fact that the GDPR was passed first appears to have upped the stakes for data-hungry ePrivacy lobbyists — in both the adtech and telco space (the latter having a keen interest in removing existing regulatory barriers on comms data so that it can exploit the vast troves of user data which Internet giants running rival messaging and VoIP services have long been able to tap).

There’s a concerted effort to try to use ePrivacy to undo consumer protections baked into GDPR — including attempts to water down protections provided for sensitive personal data. So the stage is set for an ugly rights battle as negotiations kick off with the European Parliament.

Metadata and cookie consent rules are also bound up with ePrivacy so there’s all sorts of messy and contested issues on the table here.

Digital rights advocacy group Access Now summed up the ePrivacy development by slamming the Council for “hugely” missing the mark.

“The reform is supposed to strengthen privacy rights in the EU [but] States poked so many holes into the proposal that it now looks like French Gruyère,” said Estelle Massé, senior policy analyst at Access Now, in a statement. “The text adopted today is below par when compared to the Parliament’s text and previous versions of government positions. We lost forward-looking provisions for the protection of privacy while several surveillance measures have been added.”

The group said it will be pushing to restore requirements for service providers to protect online users’ privacy by default and for the establishment of clear rules against online tracking beyond cookies, among other policy preferences.

The Council, meanwhile, appears to be advocating for a highly diluted (and so probably useless) flavor of ‘do not track’ — by suggesting users should be able to give consent to the use of “certain types of cookies by whitelisting one or several providers in their browser settings”, per the Commission.

“Software providers will be encouraged to make it easy for users to set up and amend whitelists on their browsers and withdraw consent at any moment,” it adds in its press release.

Clearly the devil will be in the detail of the Council’s position there. (The European Parliament has, by contrast, previously clearly endorsed a “legally binding and enforceable” Do Not Track mechanism for ePrivacy so, again, the stage is set for clashes.)

Encryption is another likely bone of ePrivacy contention.

As security and privacy researcher Dr Lukasz Olejnik noted back in mid-2017, the parliament strongly backed end-to-end encryption as a means of protecting the confidentiality of comms data — saying then that Member States should not impose any obligations on service providers to weaken strong encryption.

So it’s notable that the Council does not have much to say about e2e encryption — at least in the PR version of its public position. (A line in this that runs: “As a main rule, electronic communications data will be confidential. Any interference, including listening to, monitoring and processing of data by anyone other than the end-user will be prohibited, except when permitted by the ePrivacy regulation” is hardly reassuring, either.)

It certainly looks like a worrying omission given recent efforts at the Council level to advocate for ‘lawful’ access to encrypted data. Digital and human rights groups will be buckling up for a fight.

Tech-publisher coalition backs new push for browser-level privacy controls

Remember ‘Do Not Track‘? The tracker-loving adtech industry hopes you don’t recall that doomed, decade-plus attempt to bake user-friendly privacy controls into browsers. But a coalition of privacy-forward tech companies, publishers and advocacy groups has taken the wraps off a push to develop a new standard that gives Internet users a super simple way to put digital guardrails around their data.

The effort to bake in a new browser-level privacy signal to stop the sale of personal data — which has been christened Global Privacy Control (GPC) — is being led by the ex-CTO of the FTC, Ashkan Soltani, and privacy researcher Sebastian Zimmeck.

They’ve got early backing from The New York Times; The Washington Post; Financial Times; WordPress-owner Automattic; dev community Glitch; privacy search engine DuckDuckGo; anti-tracking browser Brave; Firefox maker Mozilla; tracker blocker Disconnect; privacy tool maker Abine; Digital Content Next; Consumer Reports; and digital rights group the Electronic Frontier Foundation.


“In the initial experimental phase, individuals can download browsers and extensions from Abine, Brave, Disconnect, DuckDuckGo, and EFF in order to communicate their ‘do not sell or share’ preference to participating publishers,” they write in a press release unveiling the effort.

“Additionally, we are committed to developing GPC into an open standard that many other organizations will support and are in the process of identifying the best venue for this proposal,” they add.

This ‘DNT’-esque initiative is, at least initially, being tailored toward California’s Consumer Privacy Act (CCPA) — which gives Internet users in the state the right to opt out of having their data sold on (with the potential for further strengthening if a November ballot measure, called Prop24, gets passed).

The law also requires businesses to respect user opt-out preferences via a signal from their browser — reviving the potential for a low friction, browser-level control which was what supporters of DNT always hoped it would be.

The aim for the group steering GPC is to develop a standard for a browser-level opt-out for the sale of personal data that businesses subject to CCPA would be legally compelled to respond to — assuming they succeed in getting the standard accepted as legally binding under California’s law.

“We look forward to working with AG Becerra to make GPC legally binding under CCPA,” they write on that.

We’ve reached out to AG Becerra’s office for a response on the launch. He has also just tweeted approvingly — calling the proposal “a first step towards a meaningful global privacy control that will make it simple and easy for consumers to exercise their privacy rights online”.

“CA DOJ is encouraged to see the technology community developing a global privacy control in furtherance of the CCPA and consumer privacy rights,” he added in a follow-on tweet.

At the same time — and as GPC’s name implies — the ambition is to develop a standard that’s able to flex to mesh with privacy regimes elsewhere, such as Europe’s GDPR framework (which provides citizens with a suite of protective and access rights around their data, though not a carbon-copy CCPA opt-out for the sale of data).

“While they don’t specifically call for a GPC, I think there’s a potential for EU DPAs [data protection agencies] to consider a mechanism like this as a valid way for consumers to invoke their rights under GDPR, including the objection to sale,” Soltani tells TechCrunch. “Also the spec was designed to be extensible in case the laws vary slightly from CCPA — permitting users to object to specific uses in GDPR — or even the new rights that will come about if CPRA (Prop24) passes next month.”

One big and obvious question looming over this effort is why not simply revive DNT as a vehicle for expressing the CCPA opt-out signal?

Much effort and resource has been expended over the years to try to make DNT fly. Not entirely without success: it gained widespread backing from browser makers, but fell apart on the other side of the coin, with businesses failing to comply in the absence of any legal compulsion.

However now, with robust legal regimes in place protecting people’s digital data (at least in Europe and California), you could argue there’s an opportunity to revive DNT and make it stick this time. (And, indeed, some EU parliamentarians have, in recent years, suggested Do Not Track settings could be used to express consent to processing as part of a planned reform of EU ePrivacy rules — likely with an eye on tidying up the consent pop-up clutter that’s been supercharged by GDPR compliance efforts.)

However the answer to why GPC, rather than DNT 2.0, seems to be partly related to all the baggage accumulated around Do Not Track — whose pithy call to action can still send insta-shudders down adtech exec spines. (Whereas ‘Global Privacy Control’ is certainly boring-sounding enough that it could have been dreamt up by an adtech lobbyist and may, therefore, put fewer industry noses out of joint.)

More seriously, the potential for using DNT to express opt-out signals was discussed by California lawmakers when they were drawing up CCPA, and industry feedback taken in — and the message they got back was that most businesses were ignoring it, which in turn led to a feeling that a revived DNT would just continue to be ignored.

Hence the law may demand a more precise instrument to carry the torch for user privacy, is the thinking.

We also understand the GPC effort had intended and expected to be able to use DNT as the opt-out mechanism. But in the end, given the concern around compliance, they decided a CCPA-specific mechanism was needed to circumvent this problem of businesses tuning out the broader DNT signal.

“Getting privacy online should be simple and accessible to everyone, period,” said Gabriel Weinberg, CEO & founder of DuckDuckGo in a supporting statement. “Global Privacy Control (GPC) takes us one step closer to making this vision a reality by creating a simple universal setting for users to express their preference for privacy. DuckDuckGo is proud to be a founding member of this effort and starting today, the GPC will be launching in our mobile browser and desktop browser extensions, making the setting available to over ten million consumers.”

“Mozilla is pleased to support the Global Privacy Control initiative. People’s data rights must be recognized and respected, and this is a step in the right direction. We look forward to working with the rest of the web standards community to bring these protections to everyone,” added Selena Deckelmann, VP of Firefox Desktop.

The full spec of the proposed GPC standard can be found here.
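Mechanically, the proposed signal is simple: a participating browser sends a `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` to scripts), and a covered business treats that as a do-not-sell request. A minimal server-side sketch of honoring it, assuming the header semantics in the draft spec:

```typescript
// Minimal sketch of honoring the GPC opt-out signal server-side.
// Per the proposed spec, a participating browser sends the request
// header `Sec-GPC: 1`; any other value, or its absence, means the
// user has expressed no opt-out preference.
function gpcOptOut(headers: Record<string, string>): boolean {
  // HTTP header names are case-insensitive, so normalize before matching.
  const value = Object.entries(headers).find(
    ([name]) => name.toLowerCase() === "sec-gpc"
  )?.[1];
  return value?.trim() === "1";
}

// Client-side, the same preference is readable as
// `navigator.globalPrivacyControl === true`.
```

A site subject to CCPA would check this once per request and suppress any data-sale flows when it returns true.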

Google proposes new privacy and anti-fingerprinting controls for the web

Google today announced a new long-term initiative that, if fully realized, will make it harder for online marketers and advertisers to track you across the web. This new proposal follows the company’s plans to change how cookies in Chrome work and to make it easier for users to block tracking cookies.

Today’s proposal for a new open standard extends this by looking at how Chrome can close the loopholes that the digital advertising ecosystem can use to circumvent those cookie controls. And soon, that may mean that your browser will feature new options that give you more control over how much you share without losing your anonymity.

Over the course of the last few months, Google started talking about a ‘Privacy Sandbox’ which would allow for a certain degree of personalization while still protecting a user’s privacy.

“We have a great reputation on security. […] I feel the way we earned that reputation was by really moving the web forward,” Justin Schuh, Google’s engineering director for Chrome security and privacy told me. “We provide a lot of benefits, worked on a lot of different fronts. What we’re trying to do today is basically do the same thing for privacy: have the same kind of big, bold vision for how we think privacy should work on the web, how we should make browsers and the web more private by default.”

Here is the technical side of what Google is proposing today: to prevent the kind of fingerprinting that makes your machine uniquely identifiable as yours, Google is proposing the idea of a privacy budget. With this, a browser could allow websites to make enough API calls to get enough information about you to group you into a larger cohort but not to the point where you give up your anonymity. Once a site has exhausted this budget, the browser stops responding to any further calls.

Some browsers also already implement a very restrictive form of cookie blocking. Google argues that this has unintended consequences and that there needs to be an agreed-upon set of standards. “The other browser vendors, for the most part, we think really are committed to an open web,” said Schuh, who also stressed that Google wants this to be an open standard and develop it in collaboration with other players in the web ecosystem.

“There’s definitely been a lot of not intentional misinformation but just incorrect data about how sites monetize and how publishers are actually funded,” Schuh stressed. Indeed, Google today notes that its research has shown that publishers lose an average of 52 percent of their advertising revenue when their readers block cookies. That number is even higher for news sites.

In addition, blocking all third-party cookies is not a viable solution according to Google because developers will find ways around this restriction by relying on fingerprinting a user’s machine instead. Yet while you can opt out of cookies and delete them from your browser, you can’t opt out of being fingerprinted since there’s no data stored on your machine (unless you regularly change the configuration of your laptop, the fonts you have installed and other identifiable traits that make your laptop uniquely yours).
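The reason fingerprinting needs no stored data is that a handful of ordinary configuration traits, hashed together, already yields a highly distinctive identifier. A toy sketch (the traits and the truncated hash are illustrative, not how any real tracker works):

```typescript
import { createHash } from "crypto";

// Toy fingerprint: hash a few configuration traits a site can read
// without storing anything on the machine. Real fingerprinting draws
// on many more signals (canvas rendering, audio stack, plugins, ...).
function fingerprint(traits: Record<string, string>): string {
  // Canonicalize by sorting keys so the same traits always hash the same.
  const canonical = Object.keys(traits)
    .sort()
    .map((k) => `${k}=${traits[k]}`)
    .join(";");
  return createHash("sha256").update(canonical).digest("hex").slice(0, 16);
}

const id = fingerprint({
  userAgent: "Mozilla/5.0 (X11; Linux x86_64) ...",
  screen: "2560x1440",
  timezone: "Europe/Berlin",
  fonts: "Arial,Fira Code,Noto Sans",
});
// Same configuration -> same id on every visit, cookie or no cookie.
```

Clearing cookies does nothing here; only changing the underlying configuration changes the identifier, which is exactly the asymmetry the article describes.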

What Google basically wants to do here is change the incentive structure for the advertising ecosystem. Instead of pushing advertisers to circumvent a browser’s cookie and fingerprinting restrictions, the privacy budget — in combination with the industry’s work on federated learning and differential privacy — is meant to give advertisers the tools they need without hurting publishers, while still respecting users’ privacy. That’s not an easy switch and something that, as Google freely acknowledges, will take years.

“It’s going to be a multi-year journey,” said Schuh. “What I can say is that I have very high confidence that we will be able to change the incentive structures with this. So we are committed to taking very strong measures to preserve user privacy, we are committed to combating abuses of user privacy. […] But as we’re doing that, we have to move the platform forward and make the platform inherently provide much more robust privacy protections.”

Most of the big tech companies now understand that they have a responsibility to help their users retain their privacy online. Yet at the same time, personalized advertising relies on knowing as much as possible about a given user and Google itself makes the vast majority of its income from its various ad services. It sounds like this should create some tension inside the company. Schuh, however, argued that Google’s ad side and the Chrome team have their independence. “At the end of the day, we’re a web browser, we are concerned about our user base. We are going to make the decisions that are most in their interest so we have to weigh how all of this fits in,” said Schuh. He also noted that the ad side has a very strong commitment to user transparency and user control — and that if users don’t trust the ads ecosystem, that’s a problem, too.

For the time being, though, there’s nothing here for you to try out or any bits being shipped in the Chrome browser. For now, this is simply a proposal and an effort on the Chrome team’s part to start a conversation. We should expect the company to start experimenting with some of these ideas in the near future, though.

Just like with its proposed changes to how advertisers and sites use cookies, this is very much a long-term project for the company. Some users will argue that Google could take more drastic measures and simply use its tech prowess to stop the ad ecosystem from tracking you through cookies, fingerprinting and whatever else the adtech boffins will dream up next. If Google’s numbers are correct, though, that would definitely hurt publishers and few publications are in a position to handle a 50 percent drop in revenue. I can see why Google doesn’t want to do this alone, but it does have the market position to be more aggressive in pushing for these changes.

Apple, which doesn’t have any vested interest in the advertising business, has already made this more drastic move with the latest release of Safari. Its browser now blocks a number of tracking technologies, including fingerprinting, without making any concessions to advertisers. The results of this for publishers are in line with Google’s cookie study.

As for the rest of Chrome’s competitors: Firefox has started to add anti-fingerprinting techniques as well, upstart Brave has added fingerprinting protection for all third-party content, and Microsoft’s new Edge currently focuses on cookies for tracking prevention.

By trying to find a middle path, Chrome runs the risk of falling behind as users look for browsers that protect their privacy today — especially now that there are compelling alternatives again.