New Google Play policies to cut down on ‘fleeceware,’ deepfakes, and unnecessary location tracking

Google is today announcing a series of policy changes aimed at eliminating untrustworthy apps from its Android app marketplace, the Google Play store. The changes are meant to give users more control over how their data is used, tighten subscription policies and help prevent deceptive apps and media — including those involving deepfakes — from becoming available on the Google Play Store.

Background Location

The first of these new policies is focused on the location tracking permissions requested by some apps.

Overuse of location tracking has been an area Google has struggled to rein in. In Android 10, users gained the ability to restrict apps’ access to location to only while the app is in use, similar to what’s been available on iOS. With the debut of Android 11, Google decided to give users even more control with the new ability to grant a temporary “one-time” permission to sensitive data, like location.

In February, Google said it would also soon require developers to get user permission before accessing background location data, after noting that many apps were asking for unnecessary user data. The company found that a number of these apps would have been able to provide the same experience to users if they only accessed location while the app was in use — there was no advantage to running the app in the background.

Of course, there’s an advantage for developers who collect location data: it can be sold to third parties through trackers that supply advertisers with detailed information about an app’s users, earning the developer additional income.

The new change to Google Play policies now requires that developers get approval to access background location in their apps.

But Google is giving developers time to comply. It says no action will be taken for new apps until August 2020 or on existing apps until November 2020.

“Fleeceware”

A second policy is focused on subscription-based apps. Subscriptions have become a booming business industry-wide, and they’re often a better way for apps to generate revenue than other monetization methods, like paid downloads, ads or in-app purchases.

However, many subscription apps are duping users into paying by not making it easy or obvious how to dismiss a subscription offer in order to use the free parts of an app, or not being clear about subscription terms or the length of free trials, among other things.

The new Google Play policy says developers will need to be explicit about their subscription terms, trials and offers, by telling users the following:

  • Whether a subscription is required to use all or parts of the app (and if not required, allow users to dismiss the offer easily).
  • The cost of the subscription.
  • The frequency of the billing cycle.
  • Duration of free trials and offers.
  • The pricing of introductory offers.
  • What is included with a free trial or introductory offer.
  • When a free trial converts to a paid subscription.
  • How users can cancel if they do not want to convert to a paid subscription.

That means the “fine print” has to be included on the offer’s page, and developers shouldn’t use sneaky tricks like lighter font to hide the important bits, either.

This change aims to address the rampant problem with “fleeceware” across the Google Play store. Multiple studies have shown subscription apps have gotten out of control. In fact, one study from January stated that over 600 million Android users had installed “fleeceware” apps from the Play Store. To be fair, the problem is not limited to Android. The iOS App Store was recently found to have an issue, too, with more than 3.5 million users having installed “fleeceware.” 

Developers have until June 16, 2020 to come into compliance with this policy, Google says.

Deepfakes

The final update has to do with the Play Store’s “Deceptive Behavior” policy.

This wasn’t detailed in Google’s official announcements about the new policies, but Google tells us it’s also rolling out updated rules around deceptive content and apps.

Previously, Google’s policy was used to restrict apps that tried to deceive users — like apps claiming to perform a functionally impossible task, apps that lied in their listings about their content or features, or apps that mimicked the Android OS, among others.

The updated policy is meant to better ensure all apps are clear about their behavior once they’re downloaded. In particular, it’s meant to prevent any manipulated content (aka “deepfakes”) from being available on the Play Store.

Google tells us this policy change won’t impact apps that allow users to make deepfakes that are “for fun” — like those that allow users to swap their face onto GIFs, for example. These will fall under an exception to the rule, which allows deepfakes that are “obvious satire or parody.”

However, it will take aim at apps that manipulate and alter media in a way that isn’t conventionally obvious or acceptable.

For example:

  • Apps adding a public figure to a demonstration during a politically sensitive event.
  • Apps using public figures or media from a sensitive event to advertise media altering capability within an app’s store listing.
  • Apps that alter media clips to mimic a news broadcast.

In particular, the policy will focus on apps that promote misleading imagery that could cause harm related to politics, social issues or sensitive events. The apps must also disclose or watermark the altered media if it isn’t clear the media has been altered.

Similar bans on manipulated media have been enacted across social media platforms, including Facebook, Twitter and WeChat. Apple’s App Store Developer Guidelines don’t reference “deepfakes” by name, though they do ban apps with false or defamatory information, outside of satire and humor.

Google says the apps currently available on Google Play have 30 days to comply with this change.

In Google’s announcement, the company said it understood these were difficult times for people, which is why it’s taken steps to minimize the short-term impact of these changes. In other words, it doesn’t sound like the policy changes will soon result in any mass banning or big Play Store clean-out — rather, they’re meant to set the stage for better policing of the store in the future.

India says video conference app Zoom is ‘not safe’

India said today that Zoom is “not a safe platform,” as the video conferencing service surges in popularity in many nations, including the world’s second-largest smartphone market, while billions of people remain stuck at home due to the coronavirus crisis.

“Zoom is not a safe platform,” the Cyber Coordination Centre (CyCord) of India’s ministry of home affairs said in a 16-page advisory. The advisory also includes guidelines for users who still wish to use Zoom for their private communications.

The move comes as several companies, including Google, Apple, NASA and Tesla, have urged or warned their employees against using Zoom, which has amassed over 200 million users. Germany and Taiwan have also banned the use of Zoom. The firm, with a market cap of over $40 billion, has come under scrutiny, and become the subject of a lawsuit, after several of its security and privacy lapses emerged in recent weeks.

Zoom has been trending in India in recent weeks, too, a surprise given that enterprise services rarely gain traction with consumers in the country. Several Indian ministers have also tweeted pictures showing themselves using Zoom in recent weeks.

The app has been downloaded more than 450,000 times a day in India over the last two weeks, research firm Apptopia told TechCrunch. This week, India also launched a competition for startups to develop a secure conferencing app.

Zoom chief executive Eric S. Yuan has apologized for the security lapses and pledged to prioritize users’ privacy and security over the development of new features. The firm also recently hired former Facebook security officer Alex Stamos as an advisor.

EU lawmakers set out guidance for coronavirus contacts tracing apps

The European Commission has published detailed guidance for Member States on developing coronavirus contacts tracing and warning apps.

The toolbox, which has been developed by the e-Health Network with the support of the Commission, is intended as a practical guide to implementing digital tools that track close contacts between device carriers as a proxy for infection risk. It seeks to steer Member States in a common, privacy-sensitive direction as they configure their digital responses to the COVID-19 pandemic.

Commenting in a statement, Thierry Breton — the EU commissioner for Internal Market — said: “Contact tracing apps to limit the spread of coronavirus can be useful, especially as part of Member States’ exit strategies. However, strong privacy safeguards are a pre-requisite for the uptake of these apps, and therefore their usefulness. While we should be innovative and make the best use of technology in fighting the pandemic, we will not compromise on our values and privacy requirements.”

“Digital tools will be crucial to protect our citizens as we gradually lift confinement measures,” added Stella Kyriakides, commissioner for health and food safety, in another supporting statement. “Mobile apps can warn us of infection risks and support health authorities with contact tracing, which is essential to break transmission chains. We need to be diligent, creative, and flexible in our approaches to opening up our societies again. We need to continue to flatten the curve – and keep it down. Without safe and compliant digital technologies, our approach will not be efficient.”

The Commission’s top-line “essential requirements” for national contacts tracing apps are that they’re:

  • voluntary;
  • approved by the national health authority;
  • privacy-preserving (“personal data is securely encrypted”); and
  • dismantled as soon as no longer needed.

In the document the Commission writes that the requirements on how to record contacts and notify individuals are “anchored in accepted epidemiological guidance, and reflect best practice on cybersecurity, and accessibility”.

“They cover how to prevent the appearance of potentially harmful unapproved apps, success criteria and collectively monitoring the effectiveness of the apps, and the outline of a communications strategy to engage with stakeholders and the people affected by these initiatives,” it adds.

Yesterday, setting out a wider roadmap to encourage a co-ordinated lifting of the coronavirus lockdown, the Commission suggested digital tools for contacts tracing will play a key role in easing quarantine measures.

Today’s toolbox clearly emphasizes the need to use manual contact tracing in parallel with digital contact tracing, with such apps and tools envisaged as a support for health authorities that, if widely rolled out, would allow limited resources to be focused on manual contact tracing.

“Manual contact tracing will continue to play an important role, in particular for those, such as elderly or disabled persons, who could be more vulnerable to infection but less likely to have a mobile phone or have access to these applications,” the Commission writes. “Rolling-out mobile applications on a large-scale will significantly contribute to contact tracing efforts also allowing health authorities to carry manual tracing in a more focussed manner.”

“Mobile apps will not reach all citizens given that they rely on the possession and active use of a smart phone. Evidence from Singapore and a study by Oxford University indicate that 60-75% of a population need to have the app for it to be efficient,” it adds in a section on accessibility and inclusiveness. “However, non-users will benefit from any increased population disease control the widespread use of such an app may bring.”

The toolbox also reiterates a clear message from the Commission in recent days that “appropriate safeguards” must be embedded into digital contacts tracing systems. Though it’s less clear whether all Member States are listening to memos about respecting EU rights and freedoms as they scramble for tech and data to beat back COVID-19.

“This digital technology, if deployed correctly, could contribute substantively to containing and reversing its spread. Deployed without appropriate safeguards, however, it could have a significant negative effect on privacy and individual rights and freedoms,” the Commission writes, further warning that: “A fragmented and uncoordinated approach to contact tracing apps risks hampering the effectiveness of measures aimed at combating the COVID-19 crisis, whilst also causing adverse effects to the single market and to fundamental rights and freedoms.”

On safeguards the Commission has a clear warning for EU Member States, writing: “Any contact tracing and warning app officially recognised by Member States’ relevant authorities should present all guarantees for respect of fundamental rights, and in particular privacy and data protection, the prevention of surveillance and stigmatization.”

Its list of key safeguards notably includes avoiding the collection of any location data.

“Location data is not necessary nor recommended for the purpose of contact tracing apps, as their goal is not to follow the movements of individuals or to enforce prescriptions,” it says. “Collecting an individual’s movements in the context of contact tracing apps would violate the principle of data minimisation and would create major security and privacy issues.”

The toolbox also emphasizes that such contacts tracing/warning systems be temporary and voluntary in nature — with “automated/gentle self-dismantling, including deletion of all remaining personal data and proximity information, as soon as the crisis is over”.

“The apps’ installation should be consent-based, while providing users with complete and clear information on intended use and processing,” is another key recommendation. 

The toolbox leans towards suggesting a decentralized approach, in line with earlier Commission missives, with a push for: “Safeguards to ensure the storing of proximity data on the device and data encryption.”

Though the document also includes some discussion of alternative centralized models which involve uploading arbitrary identifiers to a backend server held by public health authorities. 

“Users cannot be directly identified through these data. Only the arbitrary identifiers generated by the app are stored on the server. The advantage is that the data stored in the server can be anonymised by aggregation and further used by public authorities as a source of important aggregated information on the intensity of contacts in the population, on the effectiveness of the app in tracing and alerting contacts and on the aggregated number of people that could potentially develop symptoms,” it writes.

“None of the two options [decentralized vs centralized] includes storing of unnecessary personal information,” it adds, leaving the door open to states that might want their public health authorities to be responsible for centralized data processing.

However the Commission draws a clear distinction between centralized approaches that use arbitrary identifiers and those that store directly-identifiable data on every user — with the latter definitely not recommended.

They would have a “major disadvantage”, per the toolbox, because they “would not keep personal data processing to the absolute minimum, and so people may be less willing to install and use the app”.

“Centralised storage of mobile phone numbers could also create risks of data breaches and cyberattacks,” the Commission further warns.

Michael Veale, a backer of a decentralized protocol for COVID-19 contacts tracing that’s being developed by an EU coalition of privacy and security experts, told us: “It is good to see the document clearly lay out how you can achieve contact tracing in a decentralised, privacy-preserving way. However, some Member States might be confused, as they think that if they go for PEPP-PT [an initiative to standardize contacts tracing apps by distributing tools and processes, which is being developed by another EU coalition that previously told us it will support both centralized and decentralized approaches], they get privacy and decentralisation. In fact, PEPP-PT has removed mention of DP-3T from its website, but has not published any alternative white paper or code for scrutiny for its own system.”

We’ve reached out to PEPP-PT for comment.

Discussing cross-border interoperability requirements, the Commission’s toolbox highlights that the EU’s assorted contacts tracing apps will need to interoperate if cross-border transmission chains are to be broken. That, in turn, requires national health authorities to be technically able to exchange available information about individuals infected with and/or exposed to COVID-19.

“Tracing and warning apps should therefore follow common EU interoperability protocols so that the previous functionalities can be performed, and particularly safeguarding rights to privacy and data protection, regardless of where a device is in the EU,” it suggests.

On preventing the spread of harmful or unlawful apps the document suggests Member States consider setting up a national system of evaluation/accreditation endorsement of national apps, perhaps based on a common set of criteria (that would need to be defined).

“A close cooperation between health and digital authorities should be sought whenever possible for the evaluation/endorsement of the apps,” it writes. 

The Commission also says “close cooperation with app stores will be needed to promote national apps and promote uptake while delisting harmful apps” — putting Apple and Google squarely in the frame.

Earlier this week the pair announced their own collaboration on coronavirus contacts tracing — announcing a plan to offer an API and later an opt-in system-level contacts tracing system, based on a decentralized tracking architecture with ephemeral IDs processed locally on devices, rather than being uploaded and held on a central server.

Given the dominance of the two tech giants, their decision to collaborate on a decentralized system may effectively deprive national health authorities of the option to gain buy-in for systems that would give those publicly funded bodies access to anonymized and aggregated data for coronavirus modelling and/or tracking purposes. Which should, in the middle of a pandemic, give more than a little pause for thought.

A note in the toolbox mentions Apple and Google — with the Commission writing that: “By the end of April 2020, Member States with the Commission will seek clarifications on the solution proposed by Google and Apple with regard to contact tracing functionality on Android and iOS in order to ensure that their initiative is compatible with the EU common approach.”

Digital mapping of coronavirus contacts will have key role in lifting Europe’s lockdown, says Commission

The European Commission has set out a plan for co-ordinating the lifting of regional coronavirus restrictions that includes a role for digital tools — in what the EU executive couches as “a robust system of reporting and contact tracing”. However it has reiterated that such tools must “fully respect data privacy”.

Last week the Commission made a similar call for a common approach to data and apps for fighting the coronavirus, emphasizing the need for technical measures to be taken to ensure that citizens’ rights and freedoms aren’t torched in the scramble for a tech fix.

Today’s toolbox of measures and principles is the next step in its push to coordinate a pan-EU response.

“Responsible planning on the ground, wisely balancing the interests of protection of public health with those of the functioning of our societies, needs a solid foundation. That’s why the Commission has drawn up a catalogue of guidelines, criteria and measures that provide a basis for thoughtful action,” said EC president Ursula von der Leyen, commenting on the full roadmap in a statement.

“The strength of Europe lies in its social and economic balance. Together we learn from each other and help our European Union out of this crisis,” she added.

Harmonized data gathering and sharing by public health authorities — “on the spread of the virus, the characteristics of infected and recovered persons and their potential direct contacts” — is another key plank of the plan for lifting coronavirus restrictions on citizens within the 27 Member State bloc.

While ‘anonymized and aggregated’ data from commercial sources — such as telcos and social media platforms — is seen as a potential aid to pandemic modelling and forecasting efforts, per the plan.

“Social media and mobile network operators can offer a wealth of data on mobility, social interactions, as well as voluntary reports of mild disease cases (e.g. via participatory surveillance) and/or indirect early signals of disease spread (e.g. searches/posts on unusual symptoms),” it writes. “Such data, if pooled and used in anonymised, aggregated format in compliance with EU data protection and privacy rules, could contribute to improve the quality of modelling and forecasting for the pandemic at EU level.”
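To make the “anonymised, aggregated format” idea concrete, here is a minimal Python sketch of the aggregate-and-suppress step. The record shape and the minimum cell size are invented for this example; real telco pipelines apply far more elaborate safeguards.

```python
from collections import Counter

# Toy mobility records: (region, hour) observations with no user identifiers.
# Real operator metadata would be far richer; this shows only the
# count-then-suppress step, with a made-up threshold.
records = [
    ("Brussels", 9), ("Brussels", 9), ("Brussels", 9), ("Brussels", 9),
    ("Brussels", 9), ("Ghent", 9), ("Ghent", 9), ("Antwerp", 14),
]

K_MIN = 5  # suppress any cell counting fewer than K_MIN observations


def aggregate(rows, k_min=K_MIN):
    """Count observations per (region, hour) cell, dropping small cells.

    Cells with few observations could single out individuals, so they
    are removed entirely rather than published.
    """
    counts = Counter(rows)
    return {cell: n for cell, n in counts.items() if n >= k_min}


print(aggregate(records))  # {('Brussels', 9): 5}
```

Dropping small cells is a crude form of k-anonymity: a count of one or two in a (region, hour) bucket could correspond to an identifiable person, while a large count cannot.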

The Commission has been leaning on telcos to hand over fuzzy metadata for coronavirus modelling which it wants done by the EU’s Joint Research Centre. It wrote to 19 mobile operators last week to formalize its request, per Euractiv, which reported yesterday that its aim is to have the data exchange system operational ‘as soon as possible’ — with the hope being it will cover all the EU’s member states.

Other measures included in the wider roadmap are the need for states to expand their coronavirus testing capacity and harmonize testing methodologies — with the Commission today issuing guidelines to support the development of “safe and reliable testing”.

Steps to support the reopening of internal and external EU borders are another area of focus, with the executive generally urging a gradual and phased lifting of coronavirus restrictions.

On contacts tracing apps specifically, the Commission writes:

“Mobile applications that warn citizens of an increased risk due to contact with a person tested positive for COVID-19 are particularly relevant in the phase of lifting containment measures, when the infection risk grows as more and more people get in contact with each other. As experienced by other countries dealing with the COVID-19 pandemic, these applications can help interrupt infection chains and reduce the risk of further virus transmission. They should thus be an important element in the strategies put in place by Member States, complementing other measures like increased testing capacities.

“The use of such mobile applications should be voluntary for individuals, based on users’ consent and fully respecting European privacy and personal data protection rules. When using tracing apps, users should remain in control of their data. National health authorities should be involved in the design of the system. Tracing close proximity between mobile devices should be allowed only on an anonymous and aggregated basis, without any tracking of citizens, and names of possibly infected persons should not be disclosed to other users. Mobile tracing and warning applications should be subject to demanding transparency requirements, be deactivated as soon as the COVID-19 crisis is over and any remaining data erased.”

“Confidence in these applications and their respect of privacy and data protection are paramount to their success and effectiveness,” it adds.

Earlier this week Apple and Google announced a collaboration around coronavirus contacts tracing — throwing their weight behind a privacy-sensitive decentralized approach to proximity tracking that would see ephemeral IDs processed locally on devices, rather than being continually uploaded and held on a central server.

A similar decentralized infrastructure for Bluetooth-based COVID-19 contacts tracing had already been suggested by a European coalition of privacy and security experts, as we reported last week.

While a separate coalition of European technologists and researchers has been pushing a standardization effort for COVID-19 contacts tracing that they’ve said will support either centralized or decentralized approaches — in the hopes of garnering the broadest possible international backing.

For its part the Commission has urged the use of technologies such as decentralization for COVID-19 contacts tracing to ensure tools align with core EU principles for handling personal data and safeguarding individual privacy, such as data minimization.

However governments in the region are working on a variety of apps and approaches for coronavirus contacts tracing that don’t all look as if they will check a ‘rights respecting’ box…

In a video address last week, Europe’s lead privacy regulator, the EDPS, intervened to call for a “pan-European model ‘COVID-19 mobile application’, coordinated at EU level” — in light of varied tech efforts by Member States which involve the processing of personal data for a claimed public health purpose.

“The use of temporary broadcast identifiers and bluetooth technology for contact tracing seems to be a useful path to achieve privacy and personal data protection effectively,” said Wojciech Wiewiórowski on Monday. “Given these divergences, the European Data Protection Supervisor calls for a pan-European model ‘COVID-19 mobile application’, coordinated at EU level. Ideally, coordination with the World Health Organisation should also take place, to ensure data protection by design globally from the start.”

The Commission has not gone so far in today’s plan — calling instead for Member States to ensure their own efforts align with the EU’s existing data protection framework.

Though its roadmap is also heavy on talk of the need for “coordination between Member States to avoid negative effects” — dubbing it “a matter of common European interest”. But, for now, the Commission has issued a list of recommendations; it’s up to Member States to choose to fall in behind them or not.

With the caveat that EU regulators are watching very carefully how states handle citizens’ data.

“Legality, transparency and proportionality are essential for me,” warned Wiewiórowski, ending last week’s intervention on the EU digital response to the coronavirus with a call for “digital solidarity, which should make data working for all people in Europe and especially for the most vulnerable” — and a cry against “the now tarnished and discredited business models of constant surveillance and targeting that have so damaged trust in the digital society”.

Call for common EU approach to apps and data to fight COVID-19 and protect citizens’ rights

The European Commission has responded to the regional scramble for apps and data to help tackle the coronavirus crisis by calling for a common EU approach to boost the effectiveness of digital interventions and ensure key rights and freedoms are respected.

The European Union’s executive body wants to ensure Member States’ individual efforts to use data and tech tools to combat COVID-19 are aligned and can interoperate across borders — and therefore be more effective, given the virus does not respect national borders.

Current efforts by governments across the EU to combat the virus are being hampered by the fragmentation of approaches, it warns.

At the same time its recommendation puts a strong focus on the need to ensure that fundamental EU rights do not get overridden in the rush to mitigate the spread of the virus — with the Commission urging public health authorities and research institutions to observe a key EU legal principle of data minimization when processing personal data for a coronavirus purpose.

Specifically it writes that these bodies should apply what it calls “appropriate safeguards” — listing pseudonymization, aggregation, encryption and decentralization as examples of best practice. 

The Commission’s thinking is that getting EU citizens to trust digital efforts — such as the myriad of COVID-19 contacts tracing apps now in development — will be key to their success by helping to drive uptake and usage, which means core rights like privacy take on additional significance at a moment of public health crisis.

Commenting in a statement, commissioner for the EU’s internal market, Thierry Breton said: “Digital technologies, mobile applications and mobility data have enormous potential to help understand how the virus spreads and to respond effectively. With this Recommendation, we put in motion a European coordinated approach for the use of such apps and data, without compromising on our EU privacy and data protection rules, and avoiding the fragmentation of the internal market. Europe is stronger when it acts united.”

“Europe’s data protection rules are the strongest in the world and they are fit also for this crisis, providing for exceptions and flexibility. We work closely with data protection authorities and will come forward with guidance on the privacy implications soon,” added Didier Reynders, the commissioner for justice, in another supporting statement. “We all must work together now to get through this unprecedented crisis. The Commission is supporting the Member States in their efforts to fight the virus and we will continue to do so when it comes to an exit strategy and to recovery. In all this, we will continue to ensure full respect of Europeans’ fundamental rights.”

Since Europe has fast-followed China to become a secondary epicenter for the SARS-CoV-2 virus there has been a rush by governments, institutions and the private sector to grab data and technologies to try to map the spread of the virus and inform policy responses. The Commission itself has leant on telcos to provide anonymized and aggregated user location data for COVID-19 tracking purposes.

Some individual Member States have gone further — calling in tech companies to ask directly for resources and/or data, with little public clarity on what exactly is being provided. Some governments have even rushed out apps that apply individual-level location tracking to enforce quarantine measures.

Multiple EU countries also have contacts tracing apps in the works — taking inspiration from Singapore’s TraceTogether app, which uses Bluetooth proximity as a proxy for infection risk.

With so much digital activity going on — and huge economic and social pressure for a ‘coronavirus fix’ — there are clear risks to privacy and civil liberties. Governments, research institutions and the private sector are all mobilizing to capture health-related data and track people’s location like never before, all set against the pressing backdrop of a public health emergency.

The Commission warned today that some of the measures being taken by certain (unnamed) countries — such as location-tracking of individuals; the use of technology to rate an individual’s level of health risk; and the centralization of sensitive data — risk putting pressure on fundamental EU rights and freedoms.

Its recommendation emphasizes that any restrictions on rights must be justified, proportionate and temporary.

Any such restrictions should remain “strictly limited” to what is necessary to combat the crisis and should not continue to exist “without an adequate justification” after the COVID-19 emergency has passed, it adds.

It’s not alone in expressing such concerns.

In recent days bottom-up efforts have emerged out of EU research institutions with the aim of standardizing a ‘privacy-preserving’ approach to coronavirus contacts tracing.

One coalition of EU technologists and scientists led by institutions in Germany, Switzerland and France, is pushing a common approach that they’re hoping will get baked into such apps to limit risks. They’ve called the effort: PEPP-PT (Pan-European Privacy-Preserving Proximity Tracing).

However, a different group of privacy experts is simultaneously pushing for a decentralized method for doing the same thing (DP-3T) — arguing it’s a better fit with the EU’s data protection model, as it doesn’t require pseudonymized IDs to be centralized on a server. Instead, storage of contacts and individual infection-risk processing would be decentralized — performed locally, on the user’s device — thereby shrinking the risk of such a system being repurposed to carry out state-level surveillance of citizens.

The backers of this protocol accept it does not erase all risk, though: tech savvy hackers could, for instance, intercept the pseudonymized IDs of infected people at the point they’re broadcast to devices for local processing. (And health authorities may be more accustomed to the concept of centralizing data to secure it, rather than radically distributing it.)
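The decentralized model the experts describe can be pictured in a few lines of code. This is an illustrative toy, not the actual DP-3T specification — the `Device` class, the byte-string ID format and the match threshold are all hypothetical — but it shows the core idea: the contact log and the risk verdict never leave the phone.

```python
class Device:
    """Toy model of a phone doing decentralized exposure matching."""

    def __init__(self):
        # Ephemeral IDs overheard over Bluetooth; this log never leaves the device
        self.heard_ids = []

    def observe(self, eph_id: bytes):
        self.heard_ids.append(eph_id)

    def local_risk(self, infected_ids: set, threshold: int = 1) -> bool:
        # The server only publishes the IDs of confirmed-infected users;
        # the intersection -- and hence the risk verdict -- is computed here,
        # so no central party ever learns who met whom.
        matches = sum(1 for eid in self.heard_ids if eid in infected_ids)
        return matches >= threshold


phone = Device()
phone.observe(b"ephid-alice")
phone.observe(b"ephid-bob")

# A health authority later publishes the pseudonymous IDs of diagnosed users:
at_risk = phone.local_risk({b"ephid-bob"})  # True -- the match happens on-device
```

The centralized alternative inverts this: every phone uploads its `heard_ids` to a server, which then holds the full social graph — exactly the repurposing risk the decentralized camp objects to.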

Earlier this week, one of the technologists involved in the PEPP-PT project told us it intends to support both approaches — centralized and decentralized — in order to try to maximize international uptake, allowing developers to make their own choice of preferred infrastructure.

Though questions remain over achieving interoperability between different models.

Per its recommendation, the Commission looks to be favoring a decentralized model — as the closest fit with the EU’s rights framework.

In a section of its recommendation paper on privacy and data protection for “COVID-19 mobile warning and prevention applications” it also states a preference for “safeguards ensuring respect for fundamental rights and prevention of stigmatization” — and for “the least intrusive yet effective measures”.

The Commission’s recommendation also stresses the importance of keeping the public informed.

“Transparency and clear and regular communication, and allowing for the input of persons and communities most affected, will be paramount to ensuring public trust when combating the COVID-19 crisis,” it warns. 

The Commission is proposing a joint toolbox to be developed with EU Member States to encourage a rights-respecting, coordinated and common approach to smartphone apps for tracing COVID-19 infections — which will consist of [emphasis its]:

  • specifications to ensure the effectiveness of mobile information, warning and tracing applications from a medical and technical point of view;
  • measures to avoid proliferation of incompatible applications, support requirements for interoperability and promotion of common solutions;
  • governance mechanisms to be applied by public health authorities and in cooperation with the European Centre for Disease Control;
  • the identification of good practices and mechanisms for exchange of information on the functioning of the applications; and
  • sharing data with relevant epidemiological public bodies, including aggregated data to ECDC.

It also says it will be providing guidance for Member States that will specifically cover data protection and privacy implications — another clear signal of its concerns.

“The Commission is in close contact with the European Data Protection Board [EDPB] for an overview of the processing of personal data at national level in the context of the coronavirus crisis,” it adds.

Yesterday, following a plenary meeting of the EU data watchdogs body, the EDPB announced that it’s assigned expert subgroups to work on developing guidance on key aspects of data processing in the fight against COVID-19 — including for geolocation and other tracing tools in the context of the COVID-19 outbreak, with its technology expert subgroup leading the work.

While a compliance, e-government and health expert subgroup is also now working on guidance for the processing of health data for research purposes in the coronavirus context.

These are the two areas the EDPB said it’s prioritizing at this time, putting planned guidance for teleworking tools and practices during the current crisis on ice for now.

“I strongly believe data protection and public health go hand in hand,” said EDPB chair, Andrea Jelinek, in a statement: “The EDPB will move swiftly to issue guidance on these topics within the shortest possible notice to help make sure that technology is used in a responsible way to support and hopefully win the battle against the corona pandemic.”

The Commission also wants a common approach for modelling and predicting the spread of COVID-19 — and says the toolbox will focus on developing this via the use of “anonymous and aggregated mobile location data” (such as it has been asking EU operators to provide).

“The aim is to analyse mobility patterns including the impact of confinement measures on the intensity of contacts, and hence the risks of contamination,” it writes. “This will be an important and proportionate input for tools modelling the spread of the virus, and provide insights for the development of strategies for opening up societies again.”

“The Commission already started the discussion with mobile phone operators on 23 March 2020 with the aim to cover all Member States. The data will be fully anonymised and transmitted to the Joint Research Centre for processing and modelling. It will not be shared with third parties and only be stored as long as the crisis is ongoing,” it adds.

The Commission’s push to coordinate coronavirus tech efforts across the EU has been welcomed by privacy and security experts.

Michael Veale, a backer of the decentralized protocol for COVID-19 contacts tracing, told us: “It’s great to see the Commission recommend decentralisation as a core principle for information systems tackling COVID-19. As our DP-3T protocol shows, creating a centralised database is a wholly unnecessary and removable part of bluetooth contact tracing.”

“We hope to be able to place code online for scrutiny and feedback next week — fully open source, of course,” Veale added. “We have already had great public feedback on the protocol which we are revising in light of that to make it even more private and secure. Centralised systems being developed in Europe, such as in Germany, have not published their protocols, let alone code — perhaps they are afraid of what people will find?”

While Lukasz Olejnik, an EU-based cybersecurity advisor and privacy researcher, also welcomed the Commission’s intervention, telling us: “A coordinated approach can certainly be easier to build trust. We should favor privacy-respecting approaches, and make it clear that we are in a crisis situation. Any such crisis system should be dismantled, and it looks like the recommendations recognize it. This is good.”

The Commission intends the toolbox for moving towards a pan-European approach for COVID-19 mobile applications to be developed by April 15.

It also wants Member States to report on the actions they have taken in this area by May 31 — making their measures accessible to other Member States and the Commission for peer review.

It adds that it will assess the progress made and publish periodic reports starting in June 2020 and throughout the crisis, recommending action and/or the phasing out of measures that are no longer necessary.

Cookie consent still a compliance trash-fire in latest watchdog peek

The latest confirmation of the online tracking industry’s continued flouting of EU privacy laws which — at least on paper — are supposed to protect citizens from consent-less digital surveillance comes via Ireland’s Data Protection Commission (DPC).

The watchdog did a sweep survey of around 40 popular websites last year — covering sectors including media and publishing; retail; restaurants and food ordering services; insurance; sport and leisure; and the public sector — and in a new report, published yesterday, it found almost all failing on a number of cookie and tracking compliance issues, with breaches ranging from minor to serious.

Twenty were graded ‘amber’ by the regulator, which signals a good response and approach to compliance but with at least one serious concern identified; twelve were graded ‘red’, based on very poor quality responses and a plethora of bad practices around cookie banners, setting multiple cookies without consent, badly designed cookies policies or privacy policies, and a lack of clarity about whether they understood the purposes of the ePrivacy legislation; while a further three got a borderline ‘amber to red’ grade.

Just two of the 38 controllers got a ‘green’ rating (substantially compliant, with any concerns straightforward and easily remedied); and one more got a borderline ‘green to amber’ grade.

EU law means that if a data controller is relying on consent as the legal basis for tracking a user the consent must be specific, informed and freely given. Additional court rulings last year have further finessed guidance around online tracking — clarifying pre-checked consent boxes aren’t valid, for example.

Yet the DPC still found examples of cookie banners that offer no actual choice at all. Such as those which serve a dummy banner with a cookie notice that users can only meaninglessly click ‘Got it!’. (‘Gotcha data’, more like…)

In fact the watchdog writes that it found ‘implied’ consent being relied upon by around two-thirds of the controllers, based on the wording of their cookie banners (e.g. notices such as: “by continuing to browse this site you consent to the use of cookies”) — despite this no longer meeting the required legal standard.

“Some appeared to be drawing on older, but no longer extant, guidance published by the DPC that indicated consent could be obtained ‘by implication’, where such informational notices were put in place,” it writes, noting that current guidance on its website “does not make any reference to implied consent, but it also focuses more on user controls for cookies rather than on controller obligations”.

Another finding was that all but one website set cookies immediately on landing — with “many” of these found to have no legal justification for doing so, as the DPC determined the cookies fall outside the available consent exemptions in the relevant regulations.

It also identified widespread abuse of the concept of ‘strictly necessary’ where the use of trackers is concerned. “Many controllers categorised the cookies deployed on their websites as having a ‘necessary’ or ‘strictly necessary’ function, where the stated function of the cookie appeared to meet neither of the two consent exemption criteria set down in the ePrivacy Regulations/ePrivacy Directive,” it writes in the report. “These included cookies used to establish chatbot sessions that were set prior to any request by the user to initiate a chatbot function. In some cases, it was noted that the chatbot function on the websites concerned did not work at all.

“It was clear that some controllers may either misunderstand the ‘strictly necessary’ criteria, or that their definitions of what is strictly necessary are rather more expansive than the definitions provided in Regulation 5(5),” it adds.

Another problem the report highlights is a lack of tools for users to vary or withdraw their consent choices, despite some of the reviewed sites using so-called ‘consent management platforms’ (CMPs) sold by third-party vendors.

This chimes with a recent independent study of CMPs — which earlier this year found illegal practices to be widespread, with “dark patterns and implied consent… ubiquitous”, as the researchers put it.

“Badly designed — or potentially even deliberately deceptive — cookie banners and consent-management tools were also a feature on some sites,” the DPC writes in its report, detailing some examples of Quantcast’s CMP which had been implemented in such a way as to make the interface “confusing and potentially deceptive” (such as unlabelled toggles and a ‘reject all’ button that had no effect).

Pre-checked boxes/sliders were also found to be common, with the DPC finding ten of the 38 controllers used them — despite ‘consent’ collected like that not actually being valid consent.

“In the case of most of the controllers, consent was also ‘bundled’ — in other words, it was not possible for users to control consent to the different purposes for which cookies were being used,” the DPC also writes. “This is not permitted, as has been clarified in the Planet49 judgment. Consent does not need to be given for each cookie, but rather for each purpose. Where a cookie has more than one purpose requiring consent, it must be obtained for all of those purposes separately.”
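The per-purpose rule the DPC cites can be pictured as a small data model. A hypothetical sketch — the `ConsentRecord` class and the purpose names are illustrative assumptions, not anything prescribed by the Planet49 judgment — showing consent stored per purpose, defaulting to ‘no’, with withdrawal as easy as granting:

```python
from dataclasses import dataclass, field


@dataclass
class ConsentRecord:
    # One stored choice per *purpose* (not per cookie), with every
    # non-essential purpose defaulting to False -- no pre-checked boxes.
    choices: dict = field(default_factory=lambda: {
        "analytics": False,
        "advertising": False,
        "personalisation": False,
    })

    def grant(self, purpose: str):
        self.choices[purpose] = True

    def withdraw(self, purpose: str):
        # Withdrawing must be as straightforward as consenting
        self.choices[purpose] = False

    def may_set_cookie(self, purposes: list) -> bool:
        # A cookie serving several purposes needs consent for all of them
        return all(self.choices.get(p, False) for p in purposes)


record = ConsentRecord()
record.grant("analytics")
record.may_set_cookie(["analytics"])                 # True
record.may_set_cookie(["analytics", "advertising"])  # False: bundling not allowed
```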

In another finding, the regulator came across instances of websites that had embedded tracking technologies, such as Facebook pixels, yet their operators did not list these in responses to the survey, listing only http browser cookies instead. The DPC suggests this indicates some controllers aren’t even aware of trackers baked into their own sites.

“It was not clear, therefore, whether some controllers were aware of some of the tracking elements deployed on their websites — this was particularly the case where small controllers had outsourced their website management and development to a third party,” it writes.

The worst sector of its targeted sweep — in terms of “poor practices and, in particular, poor understanding of the ePrivacy Regulations and their purpose” — was the restaurants and food-ordering sector, per the report. (Though the finding is clearly based on a small sampling across multiple sectors.)

Despite encountering near blanket failure to actually comply with the law, the DPC, which also happens to be the lead regulator for much of big tech in Europe, has responded by issuing, er, further guidance.

This includes specifics such as: pre-checked consent boxes must be removed; cookie banners can’t be designed to ‘nudge’ users to accept, and a reject option must have equal prominence; and no non-necessary cookies may be set on landing. It also stipulates there must always be a way for users to withdraw consent — and doing so should be as easy as consenting.

All stuff that’s been clear and increasingly so at least since the GDPR came into application in May 2018. Nonetheless the regulator is giving the website operators in question a further six months’ grace to get their houses in order — after which it has raised the prospect of actually enforcing the EU’s ePrivacy Directive and the General Data Protection Regulation.

“Where controllers fail to voluntarily make changes to their user interfaces and/or their processing, the DPC has enforcement options available under both the ePrivacy Regulations and the GDPR and will, where necessary, examine the most appropriate enforcement options in order to bring controllers into compliance with the law,” it warns.

The report is just the latest shot across the bows of the online tracking industry in Europe.

The UK’s Information Commissioner’s Office (ICO) has been issuing sternly worded blog posts for months. Its own report last summer found illegal profiling of Internet users by the programmatic ad industry to be rampant — also giving the industry six months to reform.

However the ICO still hasn’t done anything about the adtech industry’s legal blackhole — leading privacy experts to denounce the lack of any “substantive action to end the largest data breach ever recorded in the UK”, as one put it at the start of this year.

Ireland’s DPC, meanwhile, has yet to issue decisions in multiple cross-border investigations into the data-mining business practices of tech giants including Facebook and Google, following scores of GDPR complaints — including several targeting their legal base to process people’s data.

A two-year review of the pan-EU regulation, set for May 2020, provides one hard deadline that might concentrate minds.

EU privacy experts push a decentralized approach to COVID-19 contacts tracing

A group of European privacy experts has proposed a decentralized system for Bluetooth-based COVID-19 contacts tracing which they argue offers greater protection against abuse and misuse of people’s data than apps which pull data into centralized pots.

The protocol — which they’re calling Decentralized Privacy-Preserving Proximity Tracing (DP-PPT) — has been designed by around 25 academics from at least seven research institutions across Europe, including EPFL, ETH Zurich and KU Leuven in Belgium.

They’ve published a White Paper detailing their approach here.

The key element is that the design entails local processing of contacts tracing and risk on the user’s device, based on devices generating and sharing ephemeral Bluetooth identifiers (referred to as EphIDs in the paper).

A backend server is used to push data out to devices — i.e. when an infected person is diagnosed with COVID-19, a health authority would sanction the upload, from the person’s device, of a compact representation of the EphIDs broadcast over the infectious period. This would be sent to other devices so they could locally compute whether there is a risk, and notify the user accordingly.

Under this design there’s no requirement for pseudonymized IDs to be centralized, where the pooled data would pose a privacy risk. That, in turn, should make it easier to persuade EU citizens to trust the system — and voluntarily download a contacts tracing app using this protocol — given it’s architected to resist being repurposed for individual-level state surveillance.
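The generate-upload-match flow described above might look roughly like this. The derivation below is a simplified assumption (plain SHA-256 in place of the paper’s actual key schedule and KDF, arbitrary epoch counts), sketched only to show why uploading a handful of day keys is a ‘compact representation’ of an entire infectious period’s worth of EphIDs:

```python
import hashlib


def next_day_key(key: bytes) -> bytes:
    # Rotate the secret: tomorrow's key is a one-way hash of today's,
    # so uploading recent keys reveals nothing about earlier days
    return hashlib.sha256(b"day-key" + key).digest()


def eph_ids_for_day(day_key: bytes, epochs: int = 96) -> list:
    # e.g. one short-lived Bluetooth identifier per 15-minute epoch
    return [hashlib.sha256(day_key + epoch.to_bytes(2, "big")).digest()[:16]
            for epoch in range(epochs)]


# On diagnosis, the patient uploads only the day keys for the infectious
# window -- the compact representation from which all EphIDs re-derive.
day0 = hashlib.sha256(b"initial-secret").digest()
infectious_day_keys = [day0, next_day_key(day0)]

# Every other device re-derives those EphIDs and checks its own local log:
published = {eid for k in infectious_day_keys for eid in eph_ids_for_day(k)}
heard_locally = {eph_ids_for_day(day0)[3]}   # an ID this phone overheard
exposed = bool(published & heard_locally)    # True -> exposure flagged on-device
```

Two day keys expand to 192 ephemeral IDs here, which is the point: the upload stays tiny while the matching — and the notification decision — happens entirely on the recipient’s phone.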

The group does discuss some other potential threats — such as those posed by tech savvy users who could eavesdrop on data exchanged locally, or decompile/recompile the app to modify elements — but the overarching contention is that such risks are small and more manageable than those of creating centralized pots of data, which risk paving the way for ‘surveillance creep’, i.e. if states use a public health crisis as an opportunity to establish and retain citizen-level tracking infrastructure.

The DP-PPT has been designed with its own purpose-limited dismantling in mind, once the public health crisis is over.

“Our protocol is demonstrative of the fact that privacy-preserving approaches to proximity tracing are possible, and that countries or organisations do not need to accept methods that support risk and misuse,” writes professor Carmela Troncoso, of EPFL. “Where the law requires strict necessity and proportionality, and societal support is behind proximity tracing, this decentralized design provides an abuse-resistant way to carry it out.”

In recent weeks governments all over Europe have been leaning on data controllers to hand over user data for a variety of coronavirus tracking purposes. Apps are also being scrambled to market by the private sector — including symptom reporting apps that claim to help researchers fight the disease. While tech giants spy PR opportunities to repackage persistent tracking of Internet users for a claimed public healthcare cause, however vague the actual utility.

The next big coronavirus tech push looks likely to be contacts-tracing apps: Aka apps that use proximity-tracking Bluetooth technology to map contacts between infected individuals and others.

This is because without some form of contacts tracing there’s a risk that hard-won gains to reduce the rate of infections by curtailing people’s movements will be reversed, i.e. once economic and social activity is opened up again. Although whether contacts tracing apps can be as effective at helping to contain COVID-19 as policymakers and technologists hope remains an open question.

What’s crystal clear right now, though, is that without a thoughtfully designed protocol that bakes in privacy by design contacts-tracing apps present a real risk to privacy — and, where they exist, to hard-won human rights. 

Torching rights in the name of combating COVID-19 is neither good nor necessary — that’s the message from the group backing the DP-PPT protocol.

“One of the major concerns around centralisation is that the system can be expanded, that states can reconstruct a social graph of who-has-been-close-to-who, and may then expand profiling and other provisions on that basis. The data can be co-opted and used by law enforcement and intelligence for non-public health purposes,” explains University College London’s Dr Michael Veale, another backer of the decentralized design.

“While some countries may be able to put in place effective legal safeguards against this, by setting up a centralised protocol in Europe, neighbouring countries become forced to interoperate with it, and use centralised rather than decentralised systems too. The inverse is true: A decentralised system puts hard technical limits on surveillance abuses from COVID-19 bluetooth tracking across the world, by ensuring other countries use privacy-protective approaches.”

“It is also simply not necessary,” he adds of centralizing proximity data. “Data protection by design obliges the minimisation of data to that which is necessary for the purpose. Collecting and centralising data is simply not technically necessary for Bluetooth contact tracing.”

Last week we reported on another EU effort — by a different coalition of technologists and scientists, led by Germany’s Fraunhofer Heinrich Hertz Institute for telecoms (HHI) — which has said it’s working on a “privacy preserving” standard for COVID-19 contacts tracing which they’ve dubbed: Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT).

At the time it wasn’t clear whether or not the approach was locked to a centralized model of handling the pseudonymized IDs. Speaking to TechCrunch today, Hans-Christian Boos, one of the PEPP-PT project’s co-initiators, confirmed the standardization effort will support both centralized and decentralized approaches to handling contacts tracing.

The effort had faced criticism from some in the EU privacy community for appearing to favor a centralized rather than decentralized approach — thereby, its critics contend, undermining the core claim to preserve user privacy. But, per Boos, it will in fact support both approaches — in a bid to maximize uptake around the world.

He also said it will be interoperable regardless of whether data is centralized or decentralized. (In the centralized scenario, he said the hope is that the not-for-profit that’s being set up to oversee PEPP-PT will be able to manage the centralized servers itself, pending proper financing — a step intended to further shrink the risk of data centralization in regions that lack a human rights framework, for example.)

“We will have both options — centralized and decentralized,” Boos told TechCrunch. “We will offer both solutions, depending on who wants to use what, and we’ll make them operable. But I’m telling you that both solutions have their merits. I know that in the crypto community there is a lot of people who want decentralization — and I can tell you that in the health community there’s a lot of people who hate decentralization because they’re afraid that too many people have information about infected people.”

“In a decentralized system you have the simple problem that you would broadcast the anonymous IDs of infected people to everybody — so some countries’ health legislation will absolutely forbid that. Even though you have a cryptographic method, you’re broadcasting the IDs to all over the place — that’s the only way your local phone can find out have I been in contact or no,” Boos went on.

“That’s the drawback of a decentralized solution. Other than that it’s a very good thing. On a centralized solution you have the drawback that there is a single operator, whom you can choose to trust or not to trust — has access to anonymized IDs, just the same as if they were broadcast. So the question is you can have one party with access to anonymized IDs or do you have everybody with access to anonymized IDs because in the end you’re broadcasting them over the network [because] it’s spoofable.”

“If your assumption is that someone could hack the centralized service… then you have to also assume that someone could hack a router, which stuff goes through,” he added. “Same problem.

“That’s why we offer both solutions. We’re not religious. Both solutions offer good privacy. Your question is who would you trust more and who would you un-trust more? Would you trust more a lot of users that you broadcast something to or would you trust more someone who operates a server? Or would you trust more that someone can hack a router or that someone can hack the server? Both is possible, right. Both of these options are totally valid options — and it’s a religious discussion between crypto people… but we have to balance it between what crypto wants and what healthcare wants. And because we can’t make that decision we will end up offering both solutions.

“I think there has to be choice because if we are trying to build an international standard we should try and not be part of a religious war.”

Boos also said the project aims to conduct research into the respective protocols (centralized vs decentralized) to compare and conduct risk assessments based on access to the respective data.

“From a data protection point of view that data is completely anonymized because there’s no attachment to location, there’s no attachment to time, there’s no attachment to phone number, MAC address, SIM number, any of those. The only thing you know there is a contact — a relevant contact between two anonymous IDs. That’s the only thing you have,” he said. “The question that we gave the computer scientists and the hackers is if we give you this list — or if we give you this graph, what could you derive from it? In the graph they are just numbers connected to each other, the question is how can you derive anything from it? They are trying — let’s see what’s coming out.”

“There are lots of people trying to be right about this discussion. It’s not about being right; it’s about doing the right thing — and we will supply, from the initiative, whatever good options there are. And if each of them have drawbacks we will make those drawbacks public and we will try to get as much confirmation and research in on these as we can. And we will put this out so people can make their choices which type of the system they want in their geography,” he added.

“If it turns out that one is doable and one is completely not doable then we will drop one — but so far both look doable, in terms of ‘privacy preserving’, so we will offer both. If one turns out to be not doable because it’s hackable or you could derive meta-information at an unacceptable risk then we would drop it completely and stop offering the option.”

On the interoperability point Boos described it as “a challenge” which he said boils down to how the systems calculate their respective IDs — but he emphasized it’s being worked on and is an essential piece.

“Without that the whole thing doesn’t make sense,” he told us. “It’s a challenge why the option isn’t out yet but we’re solving that challenge and it’ll definitely work… There’s multiple ideas how to make that work.”

“If every country does this by itself we won’t have open borders again,” he added. “And if in a country there’s multiple applications that don’t share data then we won’t have a large enough set of people participating who can actually make infection tracing possible — and if there’s not a single place where we can have discussions about what’s the right thing to do about privacy well then probably everybody will do something else and half of them will use phone numbers and location information.”

The PEPP-PT coalition has not yet published its protocol or any code. Which means external experts wanting to chip in with informed feedback on specific design choices related to the proposed standard haven’t been able to get their hands on the necessary data to carry out a review.

Boos said they intend to open source the code this week, under a Mozilla licence. He also said the project is willing to take on “any good suggestions” as contributions.

“Currently only beta members have access to it because those have committed to us that they will update to the newest version,” he said. “We want to make sure that when we publish the first release of code it should have gone through data privacy validation and security validation — so we are as sure as we can be that there’s no major change that someone on an open source system might skip.”

The lack of transparency around the protocol had caused concern among privacy experts — and led to calls for developers to withhold support pending more detail. And even to speculation that European governments may be intervening to push the effort towards a centralized model — and away from core EU principles of data protection by design and default.

As it stands, the EU’s long-standing data protection law bakes in principles such as data minimization. Transparency is another core requirement. And just last week the bloc’s lead privacy regulator, the EDPS, told us it’s monitoring developments around COVID-19 contacts tracing apps.

“The EDPS supports the development of technology and digital applications for the fight against the coronavirus pandemic and is monitoring these developments closely in cooperation with other national Data Protection Supervisory Authorities. It is firmly of the view that the GDPR is not an obstacle for the processing of personal data which is considered necessary by the Health Authorities to fight the pandemic,” a spokesman told us.

“All technology developers currently working on effective measures in the fight against the coronavirus pandemic should ensure data protection from the start, e.g. by applying data protection by design principles. The EDPS and the data protection community stand ready to assist technology developers in this collective endeavour. Guidance from data protection authorities is available here: EDPB Guidelines 4/2019 on Article 25 Data Protection by Design and by Default; and EDPS Preliminary Opinion on Privacy by Design.”

We also understand the European Commission is paying attention to the sudden crop of coronavirus apps and tools — with effectiveness and compliance with European data standards on its radar.

However, at the same time, the Commission has been pushing a big data agenda as part of a reboot of the bloc’s industrial strategy that puts digitization, data and AI at the core. And just today EURACTIV reported on leaked documents from the EU Council which say EU Member States and the Commission should “thoroughly analyse the experiences gained from the COVID-19 pandemic” in order to inform future policies across the entire spectrum of the digital domain.

So even in the EU there is a high-level appetite for data that risks intersecting with the coronavirus crisis to drive developments in a direction that might undermine individual privacy rights. Hence the fierce pushback from certain pro-privacy quarters for contacts tracing to be decentralized — to guard against any state data grabs.

For his part, Boos argues that what counts as best practice ‘data minimization’ boils down to a point of view on who you trust more. “You could make an argument [for] both [decentralized and centralized approaches] that they’re data minimizing — just because there’s data minimization at one point doesn’t mean you have data minimization overall in a decentralized system,” he suggests.

“It’s a question who do you trust? It’s who would you trust more — that’s the real question. I see the critical point of data as not the list of anonymized contacts — the critical data is the confirmed infected.

“A lot of this is an old, religious discussion between centralization and decentralization,” he added. “Generally IT oscillates between those tools; total distribution, total centralization… Because none of those is a perfect solution. But here in this case I think both offer valid security options, and then they have both different implications on what you’re willing to do or not willing to do with medical data. And then you’ve got to make a decision.

“What we have to do is we’ve got to make sure that the options are available. And we’ve got to make sure there’s sound research, not just conjecture, in heavyweight discussions: How does what work, how do they compare, and what are the risks?”

In terms of who’s involved in PEPP-PT discussions, beyond direct project participants, Boos said governments and health ministries are involved for the practical reason that they “have to include this in their health processes”. “A lot of countries now create their official tracing apps and of course those should be connected to the PEPP-PT,” he said.

“We also talk to the people in the health systems — whatever is the health system in the respective countries — because this needs to in the end interface with the health system, it needs to interface with testing… it should interface with infectious disease laws so people could get in touch with the local CDCs without revealing their privacy to us or their contact information to us, so that’s the conversation we’re also having.”

Developers with early (beta) access are kicking the tyres of the system already. Asked when the first apps making use of PEPP-PT technologies might be in general circulation, Boos suggested it could be as soon as a couple of weeks.

“Most of them just have to put this into their tracing layer and we’ve already given them enough information so that they know how they can connect this to their health processes. I don’t think this will take long,” he said, noting the project is also providing a tracing reference app to help countries that haven’t got developer resource on tap.

“For user engagement you’ll have to do more than just tracing — you’ll have to include, for example, the information from the CDC… but we will offer the skeletal implementation of an app to make starting this as a project [easier],” he said.

“If all the people that have emailed us since last week put it in their apps [we’ll get widespread uptake],” Boos added. “Let’s say 50% do I think we get a very good start. I would say that the influx from countries and I would say companies especially who want their workforce back — there’s a high pressure especially to go on a system that allows international exchange and interoperability.”

On the wider point of whether contacts tracing apps are a useful tool to help control the spread of this novel coronavirus — which has shown itself to be highly infectious, more so than flu, for example — Boos said: “I don’t think there’s much argument that isolating infection is important, the problem with this disease is there’s zero symptoms while you’re already contagious. Which means that you can’t just go and measure the temperature of people and be fine. You actually need that look into the past. And I don’t think that can be done accurately without digital help.

“So if the theory that you need to isolate infection chains is true at all, which many diseases have shown that it is — but each disease is different, so there’s no 100% guarantee, but all the data speaks for it — then that is definitely something that we need to do… The argument [boils down to] if we have so many infected as we currently have, does this make sense — do we not end up very quickly, because the world is so interconnected, with the same type of lockdown mechanism?

“This is why it only makes sense to come out with an app like this when you have broken these R0 values [i.e. how many other people one infected person can infect] — once you’ve got it under 1 and got the number of cases in your country down to a good level. And I think that in the language of an infectious disease person this means going back to the approach of containing the disease, rather than mitigating the disease — what we’re doing now.”

“The approach of contact chain evaluation allows you to put better priorities on testing — but currently people don’t have the real priority question, they have a resource question on testing,” he added. “Testing and tracing are independent of each other. You need both; because if you’re tracing contacts and you can’t get tested what’s that good for? So yes you definitely [also] need the testing infrastructure for sure.”
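The R0 threshold Boos describes can be illustrated with a toy calculation (a simple geometric model for the sake of illustration, not anything from PEPP-PT itself): each generation of infections multiplies the case count by R, so an outbreak only shrinks once R falls below 1.

```python
def cases_after(initial_cases, r, generations):
    """Project case counts generation by generation under reproduction number r."""
    cases = float(initial_cases)
    for _ in range(generations):
        cases *= r  # each infected person infects r others, on average
    return cases

# R above 1: the outbreak grows (1000 * 1.5**5 = 7593.75)
print(cases_after(1000, 1.5, 5))
# R below 1: the outbreak shrinks toward containment (1000 * 0.7**5 = 168.07)
print(cases_after(1000, 0.7, 5))
```

The point of the model is only the threshold: at R = 1 case counts hold steady, which is why Boos argues tracing apps pay off once countries have pushed R under 1 and can switch from mitigation back to containment.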

New York City bans Zoom in schools citing security concerns

As schools lie empty, students still have to learn. But officials in New York City say schools are not permitted to use Zoom for remote teaching, citing security concerns with the video conferencing service.

“Providing a safe and secure remote learning experience for our students is essential, and upon further review of security concerns, schools should move away from using Zoom as soon as possible,” said Danielle Filson, a spokesperson for the New York City Dept. of Education. “There are many new components to remote learning, and we are making real-time decisions in the best interest of our staff and students.”

Instead, the city’s Dept. of Education is transitioning schools to Microsoft Teams, which the spokesperson said has the “same capabilities with appropriate security measures in place.”

The ban will cover some 1.1 million students in more than 1,800 schools across the city’s five boroughs. The decision to ban Zoom from schools was made in part by New York City’s Cyber Command, which launched in 2018 to help keep the city’s residents safe.

Zoom did not immediately comment.

News of the ban comes after a barrage of criticism over the company’s security policies and privacy practices, as hundreds of millions of users forced to work from home during the pandemic turn to the video calling platform. On Friday, Zoom’s chief executive apologized for “mistakenly” routing some calls through China, after researchers said the setup would put ostensibly encrypted calls at risk of interception by Chinese authorities. Zoom also apologized for claiming its service was end-to-end encrypted when it was not.

Zoom also changed its default settings to enable passwords on video calls by default after a wave of “Zoombombing” attacks, which saw unprotected calls invaded by trolls and used to broadcast abusive content.

Not all schools are said to be finding the transition easy. As first reported by Chalkbeat, Zoom quickly became the popular video calling service of choice after city schools closed on March 16. But one school principal in Brooklyn warned the publication that the shift away from Zoom would make it harder to remotely teach their classes, citing the “clunkiness” of Microsoft’s service.

The city spokesperson said the department had been training schools on Microsoft Teams for “several weeks.”

But the spokesperson did not rule out an eventual return to Zoom, saying that the department “continues to review and monitor developments with Zoom,” and will update schools with any changes.

Before suing NSO Group, Facebook allegedly sought their software to better spy on users

Facebook’s WhatsApp is in the midst of a lawsuit against Israeli mobile surveillance outfit NSO Group. But before complaining about the company’s methods, Facebook seems to have wanted to use them for its own purposes, according to testimony from NSO founder Shalev Hulio.

Last year brought news of an exploit that could be used to install one of NSO’s spyware packages, Pegasus, on devices using WhatsApp. The latter sued the former over it, saying that over a hundred human rights activists, journalists and others were targeted using the method.

Last year also saw Facebook finally shut down Onavo, the VPN app it purchased in 2013 and developed into a backdoor method of collecting all manner of data about its users — but not as much as it would have liked, according to Hulio. In a document filed with the court yesterday, he states that Facebook in 2017 asked NSO Group for help collecting data on iOS devices resistant to the usual tricks:

In October 2017, NSO was approached by two Facebook representatives who asked to purchase the right to use certain capabilities of Pegasus, the same NSO software discussed in Plaintiffs’ Complaint.

The Facebook representatives stated that Facebook was concerned that its method for gathering user data through Onavo Protect was less effective on Apple devices than on Android devices. The Facebook representatives also stated that Facebook wanted to use purported capabilities of Pegasus to monitor users on Apple devices and were willing to pay for the ability to monitor Onavo Protect users. Facebook proposed to pay NSO a monthly fee for each Onavo Protect user.

NSO declined, as it claims to only provide its software to governments for law enforcement purposes. But there is a certain irony to Facebook wanting to employ against its users the very software it would later decry being employed against its users. (WhatsApp maintains some independence from its parent company but these events come well after the purchase by and organizational integration into Facebook.)

A Facebook representative did not dispute that representatives from the company approached NSO Group at the time, but said the testimony was an attempt to “distract from the facts” and contained “inaccurate representations about both their spyware and a discussion with people who work at Facebook.” We can presumably expect a fuller rebuttal in the company’s own filings soon.

Facebook and WhatsApp are, quite correctly, concerned that effective, secret intrusion methods like those developed and sold by NSO Group are dangerous in the wrong hands — as demonstrated by the targeting of activists and journalists, and potentially even Jeff Bezos. But however reasonable Facebook’s concerns are, the company’s status as the world’s most notorious collector and peddler of private information makes its righteous stance hard to take seriously.

Google rolls back SameSite cookie changes to keep essential online services from breaking

Google today announced that it will temporarily roll back the changes it recently made to how its Chrome browser handles cookies, to ensure that sites performing essential services — like banking, online grocery, government services and healthcare — won’t become inaccessible to Chrome users during the current COVID-19 pandemic.

The new SameSite rules, which the company started rolling out to a growing number of Chrome users in recent months, are meant to make it harder for sites to access cookies from third-party sites and hence track a user’s online activity. These new rules are also meant to prevent cross-site request forgery attacks.

Under Google’s new guidance, developers must explicitly allow their cookies to be read by third-party sites; otherwise, the browser will prevent those sites from accessing them.
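Concretely, the opt-in works through cookie attributes: a cookie meant for cross-site use must be marked `SameSite=None; Secure`, otherwise Chrome’s new default treats it as `SameSite=Lax` and withholds it from cross-site requests. A minimal sketch using Python’s standard `http.cookies` module (the cookie name and value are illustrative; SameSite support requires Python 3.8+):

```python
from http.cookies import SimpleCookie

# Build a cookie that should remain readable in third-party contexts.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"            # illustrative name and value
cookie["session_id"]["samesite"] = "None"  # explicit cross-site opt-in
cookie["session_id"]["secure"] = True      # SameSite=None also requires Secure

# Emit the Set-Cookie header a server would send.
header = cookie.output()
print(header)
```

Without the `SameSite=None` attribute, a cookie under the new rules is confined to first-party requests — which is exactly the behavior that broke some embedded payment, government and healthcare flows and prompted the rollback.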

Because this is a pretty major change, Google gave developers quite a bit of time to adapt their applications to it. Still, not every site is ready yet, so the Chrome team decided to halt the gradual rollout and stop enforcing these new rules for the time being.

“While most of the web ecosystem was prepared for this change, we want to ensure stability for websites providing essential services including banking, online groceries, government services and healthcare that facilitate our daily life during this time,” writes Google Chrome engineering director Justin Schuh. “As we roll back enforcement, organizations, users and sites should see no disruption.”

A Google spokesperson also told us that the team saw some breakage in sites “that would not normally be considered essential, but with COVID-19 having become more important, we made this decision in an effort to ensure stability during this time.”

The company says it plans to resume its SameSite enforcement over the summer, though the exact timing isn’t yet clear.