Hundreds of French academics sign letter asking for safeguards on contact-tracing

471 French cryptography and security researchers have signed a letter, titled “Attention StopCovid,” to raise awareness of the potential risks of a contact-tracing app. A debate in the French parliament tomorrow will cover post-lockdown plans — including the contact-tracing app StopCovid.

Among the signatories, 77 are affiliated with Inria, the French research institute that has been working on ROBERT, the contact-tracing protocol that will power the government-backed app. With this letter, it appears that Inria itself is divided over ROBERT.

“All those applications induce very important risks when it comes to protecting privacy and individual rights,” the letter says. “This mass surveillance could be done by collecting the interaction graph of individuals — the social graph. It could happen at the operating system level on the phones. Not only operating system makers could reconstruct the social graph, but the state could as well, more or less easily depending on the approaches.”

The letter also mentions a thorough analysis of centralized and decentralized implementations of contact-tracing protocols. It includes multiple attack scenarios and finds weaknesses in both the DP-3T protocol and ROBERT.

Ahead of the debate in the French parliament tomorrow, researchers say that “it is essential to thoroughly analyze the health benefits of a digital solution with specialists — there should be important evidence in order to justify the risks incurred.”

Researchers also ask for more transparency at all levels — every technical choice should be documented and justified. Data collection should be minimized and people should understand the risks and remain free not to use the contact-tracing app.

Over the past few weeks, multiple groups of researchers in Europe have been working on different protocols. In particular, DP-3T has been working on a decentralized protocol that leverages smartphones to compute social interactions. Ephemeral IDs are stored on your device, and you can choose to share them with a relay server that distributes them to the community of app users.
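
As a rough illustration of the decentralized flow, here is a minimal Python sketch of how a phone could generate and rotate ephemeral IDs from a key that never leaves the device unless the user opts in to share it. The hash construction, key sizes and rotation interval are illustrative assumptions, not the DP-3T specification.

```python
import os
import hashlib

def new_daily_key() -> bytes:
    # A fresh random key, generated and kept only on the device.
    return os.urandom(32)

def ephemeral_id(daily_key: bytes, epoch: int) -> bytes:
    # A short-lived identifier broadcast over Bluetooth. It changes every
    # epoch (e.g. every 15 minutes), so passive observers cannot link
    # broadcasts together without knowing the daily key.
    return hashlib.sha256(daily_key + epoch.to_bytes(4, "big")).digest()[:16]

# Only after a positive diagnosis, and only with the user's consent, are
# daily keys uploaded to the relay server for other phones to check against.
```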

PEPP-PT has been backing a centralized protocol that uses pseudonymization to match contacts on a central server. A national authority manages the central server, which could lead to state surveillance if the protocol isn’t implemented properly. ROBERT is a variant of PEPP-PT designed by French and German researchers.

While the French government has always been cautious about the upsides of a contact-tracing app, there’s been little debate about the implementation. Inria, with official backing from the French government, and Fraunhofer released specifications for the ROBERT protocol last week.

Many (including me) have called out various design choices because you have to trust that your government isn’t doing anything nefarious without telling you — a centralized approach requires a lot of faith from end users, as the government holds a lot of data about your social interactions and your health. Sure, it’s pseudonymized, but it’s not anonymized, despite what the ROBERT specification document says.

Moreover, ROBERT doesn’t leverage Apple and Google’s contact-tracing API that is in the works. France’s digital minister, Cédric O, has been trying to put some pressure on Apple over Bluetooth restrictions with a Bloomberg interview. Given that Apple and Google provide an API for decentralized implementations, they have little incentive to bow to French pressure.

On Sunday, Germany announced that it would abandon its original plans for a centralized architecture in favor of a decentralized approach, leaving France and the U.K. as the two remaining backers of a centralized approach.

France’s data protection watchdog CNIL released a cautious analysis of ROBERT, saying that the protocol could be compliant with GDPR. But it says it will need further details on the implementation of the protocol to give a definitive take on StopCovid.

The European Data Protection Supervisor (EDPS) also said on Twitter that the debate in front of the French parliament is particularly important. “Decisions will have an impact not only on the immediate future but as well on years to come,” they say.

Brave accuses European governments of GDPR resourcing failure

Brave, a maker of a pro-privacy browser, has lodged complaints with the European Commission against 27 EU Member States for under resourcing their national data protection watchdogs.

It’s asking the European Union’s executive body to launch an infringement procedure against Member State governments, and even refer them to the bloc’s top court, the European Court of Justice, if necessary.

“Article 52(4) of the GDPR [General Data Protection Regulation] requires that national governments give DPAs the human and financial resources necessary to perform their tasks,” it notes in a press release.

Brave has compiled a report to back up the complaints — in which it chronicles a drastic shortage of tech expertise and budget resource among Europe’s privacy agencies to enforce the region’s data protection framework.

A lack of proper resources to ensure the regulation’s teeth can clamp down on bad behavior — as the law’s drafters intended — has been a long-standing concern.

In the February annual report of the Irish data watchdog — the agency that regulates most of big tech in Europe — the lack of any decisions in major cross-border cases against a roll-call of tech giants loomed large, despite plenty of worthy filler, with reams of stats included to illustrate the massive caseload of complaints the agency is now dealing with.

Ireland’s decelerating budget and headcount in the face of rising numbers of GDPR complaints is a key concern highlighted by Brave’s report.

Per the report, half of EU data protection agencies have what it dubs a small budget (sub €5M), while only five of Europe’s 28 national GDPR enforcers have more than 10 “tech specialists”, as it describes them.

“Almost a third of the EU’s tech specialists work for one of Germany’s Länder (regional) or federal DPAs,” it warns. “All other EU countries are far behind Germany.”

“Europe’s GDPR enforcers do not have the capacity to investigate Big Tech,” is its top-line conclusion.

“If the GDPR is at risk of failing, the fault lies with national governments, not with the data protection authorities,” said Dr Johnny Ryan, Brave’s chief policy & industry relations officer, in a statement. “Robust, adversarial enforcement is essential. GDPR enforcers must be able to properly investigate ‘big tech’, and act without fear of vexatious appeals. But the national governments of European countries have not given them the resources to do so. The European Commission must intervene.”

It’s worth noting that Brave is not without its own commercial interest here. It absolutely has skin in the game, as a provider of privacy-sensitive adtech.

Ryan has also been a key instigator of a number of strategic GDPR complaints — such as those filed against certain widespread adtech industry practices. Enforcement against programmatic advertising’s use of real-time bidding would very likely be of commercial benefit to Brave, given it’s engineered to operate a different model.

But such commercial interest in robust and active GDPR enforcement doesn’t undermine Brave’s core beef — aka: that regulatory inaction is linked to DPA under-resourcing.

Indeed, the UK’s ICO has itself, er, blogged multiple times about the systemic problem of unlawful adtech — repeatedly calling for the industry to reform, but not actually doing anything when it doesn’t.

It’s just this sort of ‘soft soap’ from regulators — words, instead of firm GDPR enforcement — that’s in Brave’s sights. Nor is it alone in complaining about the lack of GDPR ‘bite’; independent privacy campaigns and researchers have dubbed ongoing regulatory inaction as a “disastrous” failure that’s undermining the rule of law.

We reached out to the Irish Data Protection Commission, the European Data Protection Board (EDPB), the European Data Protection Supervisor (EDPS) and the European Commission for comment on Brave’s report and to ask whether they believe GDPR is functioning as intended.

A major milestone is looming with the regulation’s two year birthday falling next month — which will be concentrating minds within EU institutions.

A spokesman for the EDPS pointed us to this joint document with the EDPB — which was adopted in mid February, ahead of this wider evaluation process for GDPR.

In a section of the document on enforcement the assessment finds “increased attention and effort toward enforcement of data protection laws by most SAs” [supervisory authorities], with the EDPB noting that: “The new enforcement tools provided by the GDPR and the SAs made use of a wide range of corrective measures, i.e. not only administrative fines but also warnings and reprimands”.

On fines specifically, the evaluation notes that between May 25, 2018 and November 30, 2019, a total of 22 EU/EEA data protection agencies made use of this corrective power — with 785 fines issued overall (although around 110 of which relate to infringements that predate GDPR coming into force).  

“Only 8 SAs have not imposed any administrative fine yet although most of them have ongoing proceedings that might lead to imposing an administrative fine in the near future,” they further note.

In terms of what fines have been issued for, they write that most related to principles relating to processing of personal data (Art. 5 GDPR); lawfulness of processing (Art. 6 GDPR); valid consent (Art. 7 GDPR); processing of special categories of personal data (Art. 9 GDPR); transparency and rights of the data subjects (Art. 12 to 22 GDPR); security of processing and data breaches (Art. 32 to 34 GDPR).

We’ll update this report with any other responses to Brave’s report. We’ve also asked the Commission if it will be instigating infringement proceedings against any Member States.

As noted above, the Commission will publish a review of GDPR next month, as the regulation reaches its second anniversary. And while plenty of compliance activity is undoubtedly taking place away from flashy headlines — such as data impact assessments and accelerated data breach notifications, which will provide plenty of filler for the looming Commission report — the biggest ongoing criticism attached to GDPR is the lack of perceived action over major cross-border complaints. And, therefore, the lack of enforcement against major platforms and tech giants.

A $57M fine for Google by France’s CNIL back in January 2019 stands as something of a lone exception on the major-financial-penalties-for-tech-giants front.

However, fines seem a poor lever to spur reform of resource-rich tech giants. Just look at the $5BN fine Facebook negotiated with domestic regulators in the US — a tiny price-tag for its earlier flouting of US regulatory requirements. tl;dr: fines — even record-breaking ones — are a line-of-business expense for platforms operating at this level.

So it’s worth noting that some high-profile interventions/warnings by EU DPAs — which did not involve any actual financial penalties — have netted some tangible changes to how voice assistant AI systems function.

Last summer, for example, it emerged that the Hamburg data protection authority, in Germany, had informed Google of its intention to use Article 66 powers of the GDPR to begin an “urgency procedure” — which allows a DPA to order data processing to stop if it believes there’s “an urgent need to act in order to protect the rights and freedoms of data subjects”.

Just the warning that it was about to unbox that power appeared to be enough to spark action from Google which suspended manual (human) audio reviews of Google Assistant across the whole of Europe.

There were similar process changes from Apple and Amazon — following regional press and regulatory attention. (Global changes, in the case of Apple.)

So the picture around GDPR enforcement is a little more nuanced than just ‘hey DPAs, show us the money’.

Nonetheless, Ireland remains an obvious ‘one-stop’ bottleneck for the functioning of regulation — making the agency an eye-catching pinata for those who like to claim GDPR isn’t working.

The DPC cannot remain in this critical limbo forever, of course, no matter how concerned it evidently is that its decisions stand up to tech giants’ lawyerly nitpickings and future judicial review.

Decisions in the more than 20 cross-border cases stuck on its desk — including complaints against Apple, Facebook, Google, LinkedIn, Twitter and TechCrunch’s own parent, Verizon Media, to name a few — must flow eventually. And, per earlier comments, pretty quickly now — given the first decisions were slated for ‘early’ this year. (Expect the coronavirus crisis to provide some cover for any further administrative delay.)

Whatever those crux decisions look like, critics will still be able to shoot back that they’ve come too late to be truly effective, though.

Germany ditches centralized approach to app for COVID-19 contacts tracing

Germany has U-turned on building a centralized COVID-19 contacts tracing app — and will instead adopt a decentralized architecture, Reuters reported Sunday, citing a joint statement by chancellery minister Helge Braun and health minister Jens Spahn.

In Europe in recent weeks, a battle has raged between different groups backing centralized vs decentralized infrastructure for apps being fast-tracked by governments which will use Bluetooth-based smartphone proximity as a proxy for infection risk — in the hopes of supporting the public health response to the coronavirus by automating some contacts tracing.

Centralized approaches that have been proposed in the region would see pseudonymized proximity data stored and processed on a server controlled by a national authority, such as a healthcare service. However concerns have been raised about allowing authorities to scoop up citizens’ social graph, with privacy experts warning of the risk of function creep and even state surveillance.

Decentralized contacts tracing infrastructure, by contrast, means ephemeral IDs are stored locally on device — and only uploaded with a user’s permission after a confirmed COVID-19 diagnosis. A relay server is used to broadcast infected IDs — enabling devices to locally compute if there’s a risk that requires notification. So social graph data is not centralized.
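
To make the distinction concrete, here is a minimal Python sketch of the on-device matching step a decentralized design implies. The derivation function, key sizes and epoch count are illustrative assumptions, not any particular protocol’s specification.

```python
import hashlib

def ephemeral_id(daily_key: bytes, epoch: int) -> bytes:
    # Re-derive the short-lived ID a diagnosed user's phone would have
    # broadcast during a given epoch (illustrative construction).
    return hashlib.sha256(daily_key + epoch.to_bytes(4, "big")).digest()[:16]

def exposed(infected_daily_keys: list, observed_ids: set, epochs_per_day: int = 96) -> bool:
    # Runs entirely on the handset: compare IDs derived from the published keys
    # of diagnosed users against the IDs this phone recorded over Bluetooth.
    # The server never learns who met whom.
    return any(
        ephemeral_id(key, epoch) in observed_ids
        for key in infected_daily_keys
        for epoch in range(epochs_per_day)
    )
```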

The change of tack by the German government marks a major blow to PEPP-PT, a homegrown standardization effort led by Germany’s Fraunhofer Institute that had been aggressively backing centralization — while claiming to ‘preserve privacy’ on account of not tracking location data — and that had claimed the German government as a major early backer, despite later saying it would support decentralized protocols too.

As we reported earlier, the effort faced strident criticism from European privacy experts — including a group of academics developing a decentralized protocol called DP-3T — who argue p2p architecture is truly privacy preserving. Concerns were also raised about a lack of transparency around who is behind PEPP-PT and the protocols they claimed to support, with no code published for review.

The European Commission, meanwhile, has also recommended the use of decentralized technologies to help boost trust in such apps in order to encourage wider adoption.

EU parliamentarians have also warned regional governments against trying to centralize proximity data during the coronavirus crisis.

But it was Apple and Google jumping into the fray earlier this month by announcing joint support for decentralized contacts tracing that was the bigger blow — with no prospect of platform-level technical restrictions being lifted. iOS limits background access to Bluetooth for privacy and security reasons, so national apps that do not meet this decentralized standard won’t benefit from API support — and will likely be far less usable, draining battery and functioning only if actively running.

Nonetheless PEPP-PT told journalists just over a week ago that it was engaged in fruitful discussions with Apple and Google about making changes to their approach to accommodate centralized protocols.

Notably, the tech giants never confirmed that claim. They have only since doubled down on the principle of decentralization for the cross-platform API for public health apps — and system-wide contacts tracing which is due to launch next month.

At the time of writing PEPP-PT’s spokesman, Hans-Christian Boos, had not responded to a request for comment on the German government withdrawing support.

Boos previously claimed PEPP-PT had around 40 governments lining up to join the standard. However in recent days the momentum in Europe has been going in the other direction. A number of academic institutions that had initially backed PEPP-PT have also withdrawn support.

In a statement emailed to TechCrunch, the DP-3T project welcomed Germany’s U-turn. “DP-3T is very happy to see that Germany is adopting a decentralized approach to contact tracing and we look forward to its next steps implementing such a technique in a privacy preserving manner,” the group told us.

Berlin’s withdrawal leaves France and the UK the two main regional backers of centralized apps for coronavirus contacts tracing. And while the German U-turn is certainly a hammer blow for the centralized camp in Europe the French government appears solid in its support — at least for now.

France has been developing a centralized coronavirus contacts tracing protocol, called ROBERT, working with Germany’s Fraunhofer Institute and others.

In an opinion issued Sunday, France’s data protection watchdog, the CNIL, did not take active issue with centralizing pseudonymized proximity IDs — saying EU law does not in principle forbid such a system — although the watchdog emphasized the need to minimize the risk of individuals being re-identified.

It’s notable that France’s digital minister, Cédric O, has been applying high profile public pressure to Apple over Bluetooth restrictions — telling Bloomberg last week that Apple’s policy is a blocker to the virus tracker.

Yesterday O was also tweeting to defend the utility of the planned ‘Stop Covid’ app.

We reached out to France’s digital ministry for comment on Germany’s decision to switch to a decentralized approach but at the time of writing the department had not responded.

In a press release today the government highlights the CNIL view that its approach is compliant with data protection rules, and commits to publishing a data protection impact assessment ahead of launching the app.

If France presses ahead, it’s not clear how the country will avoid its app being ignored or abandoned by smartphone users who find it irritating to use. (Although it’s worth noting that Google’s Android platform has a substantial share of the French market, at circa 80% vs 20% for iOS, per Kantar.)

A debate in the French parliament tomorrow is due to include discussion of contacts tracing apps.

We’ve also reached out to the UK’s NHSX — which has been developing a COVID-19 contacts tracing app for the UK market — and will update this report with any response.

In a blog post Friday the UK public healthcare unit’s digital transformation division said it’s “working with Apple and Google on their welcome support for tracing apps around the world”, a PR line that entirely sidesteps the controversy around centralized vs decentralized app infrastructures.

The UK has previously been reported to be planning to centralize proximity data — raising questions about the efficacy of its planned app too, given iOS restrictions on background access to Bluetooth.

“As part of our commitment to transparency, we will be publishing the key security and privacy designs alongside the source code so privacy experts can ‘look under the bonnet’ and help us ensure the security is absolutely world class,” the NHSX’s Matthew Gould and Dr Geraint Lewis added in the statement.

Apple and Google update joint coronavirus tracing tech to improve user privacy and developer flexibility

Apple and Google have provided a number of updates about the technical details of their joint contact tracing system, which they’re now exclusively referring to as an “exposure notification” technology, since the companies say this is a better way to describe what they’re offering. The system is just one part of a contact tracing effort, they note, not the entire thing. Changes include modifications to the API that the companies say provide stronger privacy protections for individual users, as well as changes to how the API works that they claim will enable health authorities building apps on top of it to develop more effective software.

The additional measures being implemented to protect privacy include changing the cryptography mechanism for generating the keys used to trace potential contacts. They’re no longer specifically bound to a 24-hour period, and they’re now randomly generated instead of derived from a so-called ‘tracing key’ that was permanently attached to a device. In theory, with the old system, an advanced enough attack with direct access to the device could potentially be used to figure out how individual rotating keys were generated from the tracing key, though that would be very, very difficult. Apple and Google clarified that it was included for the sake of efficiency originally, but they later realized they didn’t actually need this to ensure the system worked as intended, so they eliminated it altogether.

The new method makes it even more difficult for a would-be bad actor to determine how the keys are derived, and then attempt to use that information to track specific individuals. Apple and Google’s goal is to ensure this system does not link contact tracing information to any individual’s identity (except for the individual’s own use), and this should help further ensure that’s the case.
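
For a sense of what this change means in practice, here is a minimal Python sketch contrasting the two approaches. The key lengths and derivation are illustrative assumptions, not the published Exposure Notification cryptography specification.

```python
import os
import hmac
import hashlib

# Earlier draft (as described above): daily keys derived from one permanent
# per-device tracing key, so anyone who obtained that key could in principle
# link all of a device's daily keys together.
tracing_key = os.urandom(32)

def old_style_daily_key(day_number: int) -> bytes:
    return hmac.new(tracing_key, day_number.to_bytes(4, "big"), hashlib.sha256).digest()[:16]

# Updated design: each day's key is generated independently at random, so
# there is no long-lived master secret tying them together.
def new_style_daily_key() -> bytes:
    return os.urandom(16)
```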

The companies will now also be encrypting any metadata associated with specific Bluetooth signals, including the strength of signal and other info. This metadata can theoretically be used in sophisticated reverse identification attempts, by comparing the metadata associated with a specific Bluetooth signal with known profiles of Bluetooth radio signal types as broken down by device and device generation. Taken alone, it’s not much of a risk in terms of exposure, but this additional step means it’s even harder to use that as one of a number of vectors for potential identification for malicious use.

It’s worth noting that Google and Apple say this is intended as a fixed length service, and so it has a built-in way to disable the feature at a time to be determined by regional authorities, on a case-by-case basis.

Finally on the privacy front, any apps built using the API will now be provided exposure time in five minute intervals, with a maximum total exposure time reported of 30 minutes. Rounding these to specific 5 minute duration blocks and capping the overall limit across the board helps ensure this info, too, is harder to link to any specific individual when paired with other metadata.
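
A rough sketch of that rounding-and-capping step; the exact rounding rule is an assumption, since the companies haven’t specified it.

```python
def reported_exposure_minutes(raw_seconds: int) -> int:
    # Quantize the measured exposure into 5-minute blocks and cap the total
    # at 30 minutes, making the reported figure harder to tie back to one
    # specific encounter. Whether the real API rounds up, down or to the
    # nearest block is an assumption here.
    blocks = round(raw_seconds / 60 / 5)
    return min(blocks * 5, 30)
```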

On the developer and health authority side, Apple and Google will now be providing signal strength information in the form of Bluetooth radio power output data, which will provide a more accurate measure of distance between two devices in the case of contact, particularly when used with existing received signal strength info from the corresponding device that the API already provides access to.

Individual developers can also set their own parameters for how strong a signal, and for what duration, will trigger an exposure event. This is better for public health authorities because it allows them to be specific about what level of contact actually defines a potential exposure, since official guidance from health agencies varies by geography. Similarly, developers can now determine how many days have passed since an individual contact event, which might alter their guidance to a user (i.e., if it’s already been 14 days, measures would be very different from if it’s been two).
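
Here is a hypothetical sketch of how a health authority’s app might express those tunable thresholds; the parameter names, units and defaults are invented for illustration and are not the actual API surface.

```python
from dataclasses import dataclass

@dataclass
class ExposureConfig:
    # Thresholds a health authority might tune to match local guidance.
    min_duration_minutes: int = 15
    max_attenuation_db: int = 55   # higher attenuation = weaker signal = likely farther apart
    relevant_days: int = 14

def is_notifiable(duration_minutes: int, attenuation_db: int,
                  days_since_contact: int, config: ExposureConfig) -> bool:
    # A contact only triggers an exposure notification if it was recent
    # enough, close enough and long enough under the authority's policy.
    return (days_since_contact <= config.relevant_days
            and attenuation_db <= config.max_attenuation_db
            and duration_minutes >= config.min_duration_minutes)
```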

Apple and Google are also changing the encryption algorithm used to AES, from the HMAC system they were previously using. The reason for this switch is that the companies have found that AES encryption, which can be accelerated using on-board hardware in many mobile devices, makes the API more energy efficient and gives it less of a performance impact on smartphones.

As we reported Thursday, Apple and Google also confirmed that they’re aiming to distribute the beta seed version of the OS update that will support this system next week. On Apple’s side, the update will support any iOS hardware released over the course of the past four years running iOS 13. On the Android side, it would cover around 2 billion devices globally, Google said.

Coronavirus tracing: Platforms vs governments

One key outstanding question is what will happen in the case of governments that choose to use centralized protocols for COVID-19 contacts tracing apps, with proximity data uploaded to a central server — rather than opting for a decentralized approach, which Apple and Google are supporting with an API.

In Europe, the two major EU economies, France and Germany, are both developing contacts tracing apps based on centralized protocols — the latter planning deep links to labs to support digital notification of COVID-19 test results. The UK is also building a tracing app that will reportedly centralize data with the local health authority.

This week Bloomberg reported that the French government is pressuring Apple to remove technical restrictions on Bluetooth access in iOS, with the digital minister, Cedric O, saying in an interview Monday: “We’re asking Apple to lift the technical hurdle to allow us to develop a sovereign European health solution that will be tied to our health system.”

Meanwhile, a German-led standardization push around COVID-19 contacts tracing apps, called PEPP-PT — which has so far only given public backing to a centralized protocol, despite claiming it will support both approaches — said last week that it wants changes made to the Google-Apple API to accommodate centralized protocols.

Asked about this issue an Apple spokesman told us it’s not commenting on the apps/plans of specific countries. But the spokesman pointed back to a position on Bluetooth it set out in an earlier statement with Google — in which the companies write that user privacy and security are “central” to their design.

Judging by the updates to Apple and Google’s technical specifications and API framework, as detailed above, the answer to whether the tech giants will bow to government pressure to support state centralization of proximity social graph data looks to be a strong no.

The latest tweaks look intended to reinforce individual privacy and further shrink the ability of outside entities to repurpose the system to track people and/or harvest a map of all their contacts.

The sharpening of Apple and Google’s nomenclature is also interesting in this regard — with the pair now talking about “exposure notification” rather than “contact tracing” as the preferred terminology for the digital intervention. This shift of emphasis suggests they’re keen to avoid any risk of their role being (mis)interpreted as supporting broader state surveillance of citizens’ social graphs, under the guise of a coronavirus response.

Backers of decentralized protocols for COVID-19 contacts tracing — such as DP-3T, a key influence for the Apple-Google joint effort that’s being developed by a coalition of European academics — have warned consistently of the risk of surveillance creep if proximity data is pooled on a central server.

Apple and Google’s change of terminology doesn’t bode well for governments with ambitions to build what they’re counter branding as “sovereign” fixes — aka data grabs that do involve centralizing exposure data. Although whether this means we’re headed for a big stand off between certain governments and Apple over iOS security restrictions — a la Apple vs the FBI — remains to be seen.

Earlier today, Apple and Google’s EU privacy chiefs also took part in a panel discussion organized by a group of European parliamentarians which specifically considered the question of centralized vs decentralized models for contacts tracing.

Asked about supporting centralized models for contacts tracing the tech giants offered a dodge, rather than a clear ‘no’.

“Our goal is to really provide an API to accelerate applications. We’re not obliging anyone to use it as a solution. It’s a component to help make it easier to build applications,” said Google’s Dave Burke, VP of Android engineering.

“When we build something we have to pick an architecture that works,” he went on. “And it has to work globally, for all countries around the world. And when we did the analysis and looked at different approaches we were very heavily inspired by the DP-3T group and their approach — and that’s what we have adopted as a solution. We think that gives the best privacy preserving aspects of the contacts tracing service. We think it’s also quite rich in epidemiological data that we think can be derived from it. And we also think it’s very flexible in what it could do. [The choice of approach is] really up to every member state — that’s not the part that we’re doing. We’re just operating system providers and we’re trying to provide a thin layer of an API that we think can help accelerate these apps but keep the phone in a secure, private mode of operation.”

“That’s really important for the expectations of users,” Burke added. “They expect the devices to keep their data private and safe. And then they expect their devices to also work well.”

DP-3T’s Michael Veale was also on the panel — busting what he described as some of the “myths” about decentralized contacts tracing vs centralized approaches.

“The [decentralized] system is designed to provide data to epidemiologists to help them refine and improve the risk score — even daily,” he said. “This is totally possible. We can do this using advanced methods. People can even choose to provide additional data if they want to epidemiologists — which is not really required for improving the risk score but might help.”

“Some people think a decentralized model means you can’t have a health authority do that first call [to a person exposed to a risk of infection]. That’s not true. What we don’t do is we don’t tag phone numbers and identities like a centralized model can to the social network. Because that allows misuse,” he added. “All we allow is that at the end of the day the health authority receives a list separate from the network of whose phone number they can call.”

MEP Sophie in ‘t Veld, who organized the online event, noted at the top of the discussion that they had also invited PEPP-PT to join the call, but said no one from the coalition had been able to attend the video conference.

The changing face of employment law during a global pandemic

Prompted by Jeff Bezos’s plans to test all Amazon employees for the virus that causes COVID-19, we wondered whether employers can mandate employee testing, regardless of symptoms. The issue pits public safety against personal privacy, but limited testing availability has rendered the question somewhat moot.

But as the World Health Organization and U.S. Centers for Disease Control and Prevention have noted, asymptomatic COVID-19 carriers can spread the virus without realizing they’re infected. To learn more about workers’ rights in this arena, we spoke to Tricia Bozyk Sherno, counsel at Debevoise & Plimpton, who focuses on employment and general commercial litigation.

The answer, for now, is not entirely straightforward, though updates from the U.S. Equal Employment Opportunity Commission could make the situation clearer going forward as more tests are made available and state governments begin pushing to reopen businesses.

Sherno offered a fair amount of insight into the EEOC’s updated guidance and made some predictions about how things may look for both employers and workers going forward.

TechCrunch: Prior to the COVID-19 pandemic, what sorts of laws governed an employer’s ability to test employees for infectious diseases?

Tricia Bozyk Sherno: Covered employers (employers with 15 or more employees) must comply with the requirements of the Americans with Disabilities Act (ADA), which limits an employer’s ability to make disability-related inquiries or require medical examinations. (Note that certain states may also have similar statutes in place.) Generally, disability-related inquiries and medical examinations are prohibited by the ADA except in limited circumstances. A “medical examination” is a procedure or test that seeks information about an individual’s physical or mental impairments or health — so infectious disease testing would fall into this category.

Germany’s COVID-19 contacts tracing app to link to labs for test result notification

A German research institute that’s involved in developing a COVID-19 contacts tracing app with the backing of the national government has released some new details about the work which suggests the app is being designed as more of a ‘one-stop shop’ to manage coronavirus impacts at an individual level, rather than having a sole function of alerting users to potential infection risk.

Work on the German app began at the start of March, per the Fraunhofer-Gesellschaft institute, with the Federal Ministry of Education and Research and the Federal Ministry of Health funding an initial feasibility study.

In a PDF published today, the research organization reveals the government-backed app will include functionality for health authorities to directly notify users about a COVID-19 test result if they’ve opted in to get results this way.

It says the system must ensure that only people who test positive for the virus make their measurement data available, to avoid incorrect data being input. For the purposes of “this validation process”, it envisages that “a digital connection to the existing diagnostic laboratories is implemented in the technical implementation”.
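
The document doesn’t spell out how that lab connection would validate results, but a common pattern would be for the app to accept a data upload only when the positive result carries a valid signature from a connected laboratory. A hypothetical Python sketch (using the cryptography package; the payload format and key scheme are assumptions, not the German design):

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey

def result_is_lab_certified(result_payload: bytes, lab_signature: bytes,
                            lab_public_key: Ed25519PublicKey) -> bool:
    # Accept a "share my data" action only if the positive test result is
    # signed by a connected diagnostic laboratory, so users cannot feed
    # incorrect results into the system.
    try:
        lab_public_key.verify(lab_signature, result_payload)
        return True
    except InvalidSignature:
        return False
```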

“App users can thus voluntarily activate this notification function and thus be informed more quickly and directly about their test results,” it writes in the press release (which we’ve translated from German with Google Translate) — arguing that such direct digital notification of tests results will mean that no “valuable time” is lost to curb the spread of the virus.

Governments across Europe are scrambling to get Bluetooth-powered contacts tracing apps off the ground, with apps also in the works from a number of other countries, including the UK and France, despite ongoing questions over the efficacy of digital contacts tracing vs such an infectious virus.

The great hope is that digital tools will offer a route out of economically crippling population lockdowns by providing a way to automate at least some contacts tracing — based on widespread smartphone penetration and the use of Bluetooth-powered device proximity as a proxy for coronavirus exposure.

Preventing a new wave of infections as lockdown restrictions are lifted is the near-term goal. Although — in line with Europe’s rights frameworks — use of contacts tracing apps looks set to be voluntary across most of the region, with governments wary about being seen to impose ‘health surveillance’ on citizens, as has essentially happened in China.

However, if contacts tracing apps end up larded with features that deep-link into national health systems, that raises questions about how optional their use will really be.

One example of potential function creep: an earlier proposal by a German consortium of medical device manufacturers, laboratories, clinics, clinical data management systems and blockchain solution providers for a blockchain-based Digital Corona Health Certificate — touted as being able to generate “verifiable, certified test results that can be fed into any tracing app” to cut down on false positives — claimed to have backing from the City of Cologne’s public health department.

In March, Der Spiegel also reported on a large-scale study being coordinated by the Helmholtz Center for Infection Research in Braunschweig, to examine antibody levels to try to determine immunity across the population. Germany’s Robert Koch Institute (RKI) was reportedly involved in that study — and has been a key operator in the national contacts tracing push.

Both RKI and the Fraunhofer-Gesellschaft institute are also involved in the parallel German-led pan-EU standardization effort for COVID-19 contacts tracing apps (called PEPP-PT) that’s been the leading voice for apps to centralize proximity data with governments/health authorities, rather than storing it on users’ devices and performing risk processing locally.

As we reported earlier, PEPP-PT and its government backers appear to be squaring up for a battle with Apple over iOS restrictions on Bluetooth.

PEPP-PT bases its claim of being a “privacy-preserving” standard on not backing protocols or apps that use location data or mobile phone numbers — with only arbitrary (but pseudonymized) proximity IDs shared for the purpose of tracking close encounters between devices and potential coronavirus infections.

It has claimed it’s agnostic between centralization of proximity data vs decentralization, though so far the only protocol it’s publicly committed to is a centralized one.

Yet, at the same time, regional privacy experts, the EU parliament and even the European Commission have urged national governments to practice data minimization and decentralization when it comes to COVID-19 contacts tracing, in order to boost citizen trust by shrinking associated privacy risks.

If apps are voluntary, citizens’ trust must be earned, not assumed — that is the key argument. Without substantial uptake, the utility of digital contacts tracing seems doubtful.

Apple and Google have also come down on the decentralized side of this debate — outing a joint effort last week for an API and later opt-in system-wide contacts tracing. The first version of their API is slated to be in developers’ hands next week.

Meanwhile, a coalition of nearly 300 academics signed an open letter at the start of this week warning that centralized systems risked surveillance creep — voicing support for decentralized protocols such as DP-3T, another contact tracing protocol being developed by a separate European coalition that has been highly critical of PEPP-PT.

And while PEPP-PT claimed recently to have seven governments signed up to its approach, and 40 more in the pipeline, at least two of the claimed EU supporters (Switzerland and Spain) had actually said they would use a decentralized approach.

The coalition has also been losing support from a number of key research institutions which had initially backed its push for a “privacy-preserving” standard, as controversy around its intent and lack of transparency has grown.

Nonetheless the two biggest EU economies, Germany and France, appear to be digging in behind a push to centralize proximity data — putting Apple in their sights.

Bloomberg reported earlier this week that the French government is pressurizing Apple to remove Bluetooth restrictions for its COVID-19 contacts tracing app which also relies on a ‘trusted authority’ running a central server (we’ve covered the French ROBERT protocol in detail here).

It’s possible Germany and France are sticking to their centralized guns because of wider plans to pack more into these contacts tracing apps than simply Bluetooth-powered alerts — as suggested by the Fraunhofer document.

Access to data is another likely motivator.

“Only if research can access sufficiently valid data is it possible to create forecasts that are the basis for planning further steps against the spread of the virus,” the institute goes on. (Though, as we’ve written before, the DP-3T decentralized protocol sets out a path for users to opt in to share proximity data for research purposes.)

Another strand that’s evident from the Fraunhofer PDF is sovereignty.

“Overall, the approach is based on the conviction that the state healthcare system must have sovereignty over which criteria, risk calculations, recommendations for action and feedback are in one such system,” it writes, adding: “In order to achieve the greatest possible usability on end devices on the market, technical cooperation with the targeted operating system providers, Google and Apple, is necessary.”

Apple and Google did not respond to requests for comment on whether they will be making any changes to their API as result of French and German pressure.

Fraunhofer further notes that “full compatibility” between the German app and the centralized one being developed by French research institutes Inria and Inserm was achieved in the “past few weeks” — underlining that the two nations are leading this particular contacts tracing push.

In related news this week, Europe’s Data Protection Board (EDPB) put out guidance for developers of contacts tracing apps which stressed an EU legal principle related to processing personal data that’s known as purpose limitation — warning that apps need to have purposes “specific enough to exclude further processing for purposes unrelated to the management of the COVID-19 health crisis (e.g., commercial or law enforcement purposes)”.

Which sounds a bit like the regulator drawing a line in the sand to warn states that might be tempted to turn contacts tracing apps into coronavirus immunity passports.

The EDPB also urged that “careful consideration” be given to data minimisation and data protection by design and by default — two other key legal principles baked into Europe’s General Data Protection Regulation, albeit with some flex during a public health emergency.

However the regulatory body took a pragmatic view on the centralization vs decentralization debate — saying both approaches are “viable” in a contacts tracing context, with the key caveat that “adequate security measures” must be in place.

EU privacy body urges anonymization of location data for COVID-19 tracking

The European Data Protection Board (EDPB) has published guidance for the use of location data and contacts tracing tools intended to mitigate the impact of the COVID-19 pandemic.

Europe’s data protection framework wraps around all such digital interventions, meaning there are legal requirements for EU countries and authorities developing tracing tools or soliciting data for a coronavirus related purpose.

“These guidelines clarify the conditions and principles for the proportionate use of location data and contact tracing tools, for two specific purposes: using location data to support the response to the pandemic by modelling the spread of the virus so as to assess the overall effectiveness of confinement measures; [and] contact tracing, which aims to notify individuals of the fact that they have been in close proximity of someone who is eventually confirmed to be a carrier of the virus, in order to break the contamination chains as early as possible,” the EDPB writes in the document.

The European Commission and the EU parliament have already weighed in with their own recommendations in this area, including a toolbox to help guide contacts tracing app developers. The Commission has also urged Member States to take a common approach to building such apps. And has been leaning on local telcos to provide ‘anonymized and aggregated’ metadata for modelling the spread of the virus across the EU.

The guideline document from the EDPB — a body made up of representatives from the EU’s national data protection agencies which helps coordinate the application of pan-EU data protection law — brings additional expert steerage for those developing digital interventions as part of a public health response to the coronavirus pandemic.

“The EDPB generally considers that data and technology used to help fight COVID-19 should be used to empower, rather than to control, stigmatise, or repress individuals,” it writes. “Furthermore, while data and technology can be important tools, they have intrinsic limitations and can merely leverage the effectiveness of other public health measures. The general principles of effectiveness, necessity, and proportionality must guide any measure adopted by Member States or EU institutions that involve processing of personal data to fight COVID-19.”

Among the body’s specific recommendations are that where location data is being considered for modelling the spread of the coronavirus or assessing the effectiveness of national lockdown measures then anonymizing the data is preferable — with the EDPB emphasizing that proper anonymization is not easy.

Given the inherent complexity it also recommends transparency around the anonymization methodology used. (tl;dr: there’s no security in obscurity, nor indeed accountability.)

“Many options for effective anonymisation exist, but with a caveat. Data cannot be anonymised on their own, meaning that only datasets as a whole may or may not be made anonymous,” it notes.

“A single data pattern tracing the location of an individual over a significant period of time cannot be fully anonymised. This assessment may still hold true if the precision of the recorded geographical coordinates is not sufficiently lowered, or if details of the track are removed and even if only the location of places where the data subject stays for substantial amounts of time are retained. This also holds for location data that is poorly aggregated.

“To achieve anonymisation, location data must be carefully processed in order to meet the reasonability test. In this sense, such a processing includes considering location datasets as a whole, as well as processing data from a reasonably large set of individuals using available robust anonymisation techniques, provided that they are adequately and effectively implemented.”
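
To illustrate the kind of processing the EDPB is pointing at, here is a simplified Python sketch that aggregates raw location pings into coarse space/time buckets and suppresses sparsely populated ones. The cell size, time bucket and threshold are arbitrary; as the guidance stresses, real anonymisation has to be assessed on the dataset as a whole, not assumed from a recipe like this.

```python
from collections import defaultdict

def aggregate_visits(pings, cell_size_deg=0.05, hours_per_bucket=6, k=50):
    # pings: iterable of (user_id, lat, lon, hour) tuples.
    # Snap each ping to a coarse spatial cell and time bucket, then release
    # only how many *distinct* people appeared in buckets meeting a minimum
    # threshold; sparsely populated buckets are suppressed entirely.
    buckets = defaultdict(set)
    for user_id, lat, lon, hour in pings:
        cell = (round(lat / cell_size_deg), round(lon / cell_size_deg), hour // hours_per_bucket)
        buckets[cell].add(user_id)
    return {cell: len(users) for cell, users in buckets.items() if len(users) >= k}
```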

On contact tracing apps — aka digital tools that are designed to map proximity between individuals, as a proxy for infection risk — the EDPB urges that use of such apps be voluntary.

“The systematic and large scale monitoring of location and/or contacts between natural persons is a grave intrusion into their privacy,” it warns. “It can only be legitimised by relying on a voluntary adoption by the users for each of the respective purposes. This would imply, in particular, that individuals who decide not to or cannot use such applications should not suffer from any disadvantage at all.”

The importance of accountability is also front and center, with the EDPB saying the controller of such apps must be clearly defined.

“The EDPB considers that the national health authorities could be the controllers for such application; other controllers may also be envisaged. In any cases, if the deployment of contact tracing apps involves different actors their roles and responsibilities must be clearly established from the outset and be explained to the users.”

Purpose limitation is another highlighted component. Apps need to have purposes that are “specific enough to exclude further processing for purposes unrelated to the management of the COVID- 19 health crisis (e.g., commercial or law enforcement purposes)”, it says.

So, in other words, no function creep — and no EU citizen mass surveillance via a pandemic backdoor.

The EDPB also writes that “careful consideration should be given to the principle of data minimisation and data protection by design and by default” — noting specifically that contact tracing apps “do not require tracking the location of individual users”.

Instead “proximity data should be used” for the contacts tracing purpose.

“Contact tracing applications can function without direct identification of individuals,” it further emphasizes, adding that “appropriate measures should be put in place to prevent re-identification”.

The guidance aligns with the coronavirus contacts tracing model devised jointly by Apple and Google — which have said they will be offering a cross-platform API for COVID-19 contacts tracing based on ephemeral proximity IDs shared via Bluetooth.

At one point the EDPB guidance appears to be leaning towards favoring such decentralized approaches to contacts tracing apps, with the body writing that “the collected information should reside on the terminal equipment of the user and only the relevant information should be collected when absolutely necessary”.

Although later on in the guidance it discusses centralized models that involve proximity data being uploaded to a server in the cloud, writing that: “Implementations for contact tracing can follow a centralized or a decentralized approach. Both should be considered viable options, provided that adequate security measures are in place, each being accompanied by a set of advantages and disadvantages.”

In Europe there is currently a big fight between different camps over whether contacts tracing apps should use a centralized or decentralized model for storing and processing proximity data. A standardization effort known as PEPP-PT, backed by Germany’s Fraunhofer Institute for Telecommunications and some EU governments, wants to support centralized protocols for COVID-19 contacts tracing, while a separate coalition of European academics, which has developed a protocol called DP-3T, wants only decentralized approaches on privacy grounds.

“The current health crisis should not be used as an opportunity to establish disproportionate data retention mandates,” the EDPB warns. “Storage limitation should consider the true needs and the medical relevance (this may include epidemiology-motivated considerations like the incubation period, etc.) and personal data should be kept only for the duration of the COVID-19 crisis. Afterwards, as a general rule, all personal data should be erased or anonymised.”

The body also recommends algorithms used in contacts tracing apps be audited and regularly reviewed by outside experts.

Again, a key criticism of the PEPP-PT initiative has been around lack of transparency — including its failure to publish code for external review. (Though it has said it will be publishing code.)

“In order to ensure their fairness, accountability and, more broadly, their compliance with the law, algorithms must be auditable and should be regularly reviewed by independent experts. The application’s source code should be made publicly available for the widest possible scrutiny,” the EDPB writes.

Another notable piece of the guidance is for a data protection impact assessment not only to be carried out but that it be published — which marks a further push for accountability via transparency in such an unprecedented moment.

“The EDPB considers that a data protection impact assessment (DPIA) must be carried out before implementing such tool as the processing is considered likely high risk (health data anticipated large-scale adoption, systematic monitoring, use of new technological solution). The EDPB strongly recommends the publication of DPIAs,” it writes.

Typically DPAs leave it up to data controllers to decide whether to publish a DPIA or not — in this case the strong push from the central authority is that these documents are made public where COVID-19 contacts tracing apps are concerned.

Having highlighted the pros and cons of centralized vs decentralized approaches to contacts tracing, the EDPB goes on to recommend that the conceptual phase of app development “should always include thorough consideration of both concepts carefully weighing up the respective effects on data protection/privacy and the possible impacts on individuals rights”.

“Any server involved in the contact tracing system must only collect the contact history or the pseudonymous identifiers of a user diagnosed as infected as the result of a proper assessment made by health authorities and of a voluntary action of the user. Alternately, the server must keep a list of pseudonymous identifiers of infected users or their contact history only for the time to inform potentially infected users of their exposure, and should not try to identify potentially infected users.”

“Putting in place a global contact tracing methodology including both applications and manual tracing may require additional information to be processed in some cases. In this context, this additional information should remain on the user terminal and only be processed when strictly necessary and with his prior and specific consent,” it adds.

You can read the full document here.

ForgeRock nabs $93.5M for its ID management platform, gears up next for an IPO

For better or worse, digital identity management services — the process of identifying and authenticating users on networks to access services — has become a ubiquitous part of interacting on the internet, all the more so in the recent weeks as we have been asked to carry out increasingly more of our lives online.

Used correctly, they help ensure that it’s really you logging into your online banking service; used badly, you feel like you can’t innocently watch something silly on YouTube without being watched yourself. Altogether, they are a huge business: worth $16 billion today according to Gartner but growing at upwards of 30% and potentially as big as $30.5 billion by 2024, according to the latest forecasts.

Now, a company called ForgeRock, which has built a platform that is used to help make sure that those accessing services really are who they say are, and help organizations account for how their services are getting used, is announcing a big round of funding to continue expanding its business amid a huge boost in demand.

The company is today announcing that it has raised $93.5 million in funding, a Series E it will use to continue expanding its product and take it to its next step as a business, specifically investing in R&D, cloud services and its ForgeRock Identity Cloud, and general global business development.

The round is being led by Riverwood Capital, and Accenture Ventures, as well as previous investors Accel, Meritech Capital, Foundation Capital and KKR Growth, also participated.

Fran Rosch, the startup’s CEO, said in an interview that this will likely be its final round of funding ahead of an IPO, although given the current state of affairs, with a lot of M&A, there is no timing set for when that might happen. (Notably, the company had said its last round of funding — $88 million in 2017 — would be its final ahead of an IPO, although that was under a different CEO.)

This Series E brings the total raised by the company to $230 million. Rosch confirmed it was raised as a material upround, although he declined to give a valuation. For some context, the company’s last post-money valuation was $646.5 million, per PitchBook, so this round would value the company at more than $730 million.

ForgeRock has annual recurring revenues of more than $100 million, with annual revenues also at over $100 million, Rosch said. It operates in an industry heavy with competition, with some of the others vying for pole position in the various aspects of identity management including Okta, LastPass, Duo Security and Ping Identity.

But within that list it has amassed some impressive traction. In total it has 1,100 enterprise customers, who in turn collectively manage 2 billion identities through ForgeRock’s platform, with considerably more devices also authenticated and managed on top of that.

Customers include the likes of the BBC — which uses ForgeRock to authenticate and log not just 45 million users but also the devices they use to access its iPlayer on-demand video streaming service — Comcast, a number of major banks, the European Union and several other government organizations. ForgeRock was originally founded in Norway about a decade ago, and while it now has its headquarters in San Francisco, it still has about half its employees and half its customers on the other side of the Atlantic.

Currently, ForgeRock provides businesses with identity management services including password and username creation, identity governance, directory services, and privacy and consent gates, which those businesses in turn provide both to their human customers and to devices accessing their services. But we’re in a period of change right now when it comes to identity management. ForgeRock stays away from direct-to-consumer password management services, and Rosch said there are no plans to move into that area.

These days, we’ve become more aware of privacy and data protection. Sometimes that awareness has come for the wrong reasons, such as giant security breaches that have leaked some aspect of our personal information, or news stories uncovering how our information has unwittingly been used in ostensibly ‘legit’ commercial schemes, or in other ways we never imagined it would be.

Those developments, combined with advances in technology, are very likely to lead us to a place over time where identity management will become significantly more shielded from misuse. These could include more ubiquitous use of federated identities, “lockers” that store our authentication credentials that can be used to log into services but remain separate from their control, and potentially even applications of blockchain technology.

All of this means that while a company like ForgeRock will continue to provide its current services, it’s also investing big in what it believes will be the next steps that we’ll take as an industry, and society, when it comes to digital identity management — something that has had a boost of late.

“There are a lot of interesting things going on, and we are working closely behind the scenes to flesh them out,” Rosch said. “For example, we’re looking at how best to break up data links where we control identities to get access for a temporary period of time but then pull back. It’s a powerful trend that is still about four to five years out. But we are preparing for this, a time when our platform can consume decentralised identity, on par with logins from Google or Facebook today. That is an interesting area.”

He notes that the current market, where there has been an overall surge in demand for online services as people stay home to slow the spread of the coronavirus pandemic, has seen big boosts in specific verticals.

Its largest financial services and banking customers have seen traffic up by 50%, digital streaming has been up by 300%, and government services have also been spiking, in part because many services that hadn’t been online are now developing online presences or seeing much more traffic from digital channels than before. Unsurprisingly, its customers in hotel and travel, as well as retail, have seen drops, he added.

“ForgeRock’s comprehensive platform is very well-positioned to capitalize on the enormous opportunity in the Identity & Access Management market,” said Jeff Parks, co-founder and managing partner of Riverwood Capital, in a statement. “ForgeRock is the leader in solving a wide range of workforce and consumer identity use cases for the Global 2000 and is trusted by some of the largest companies to manage millions of user identities. We have seen the growth acceleration and are thrilled to partner with this leadership team.” Parks is joining the board with this round.

Europe’s PEPP-PT COVID-19 contacts tracing standard push could be squaring up for a fight with Apple and Google

A coalition of EU scientists and technologists that’s developing what’s billed as a “privacy-preserving” standard for Bluetooth-based proximity tracking, as a proxy for COVID-19 infection risk, wants Apple and Google to make changes to an API they’re developing for the same overarching purpose.

The Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) uncloaked on April 1, calling for developers of contacts tracing apps to get behind a standardized approach to processing smartphone users’ data to co-ordinate digital interventions across borders and shrink the risk of overly intrusive location-tracking tools gaining momentum as a result of the pandemic.

PEPP-PT said today it has seven governments signed up to apply its approach to national apps, with a claimed pipeline of a further 40 in discussions about joining.

“We now have a lot of governments interacting,” said PEPP-PT’s Hans-Christian Boos, speaking during a webinar for journalists. “Some governments are publicly declaring that their local applications will be built on top of the principles of PEPP-PT and also the various protocols supplied inside this initiative.

“We know of seven countries that have already committed to do this — and we’re currently in conversation with 40 countries that are in various states of onboarding.”

Boos said a list of the governments would be shared with journalists, though at the time of writing we haven’t seen it. But we’ve asked PEPP-PT’s PR firm for the info and will update this report when we get it.

“The pan-European approach has worked,” he added. “Governments have decided at a speed previously unknown. But with 40 more countries in the queue of onboarding we definitely have outgrown just the European focus — and to us this shows that privacy as a model and as a discussion point… is a statement and it is something that we can export because we’re credible on it.”

Paolo de Rosa, the CTO at the Ministry of Innovation Technology and Digital Transformation for the Italian government, was also on the webinar — and confirmed its national app will be built on top of PEPP-PT.

“We will have an app soon and obviously it will be based on this model,” he said, offering no further details.

PEPP-PT’s core ‘privacy-preserving’ claim rests on the use of system architectures that do not require location data to be collected. Rather, devices that come near each other would share pseudonymized IDs — which could later be used to send notifications to an individual if the system calculates an infection risk has occurred. An infected individual’s contacts would be uploaded at the point of diagnosis — allowing notifications to be sent to other devices they had come into contact with.

Boos, a spokesman for and coordinator of PEPP-PT, told TechCrunch earlier this month the project will support both centralized and decentralized approaches. The former meaning IDs are uploaded to a trusted server, such as one controlled by a health authority; the latter meaning IDs are held locally on devices, where the infection risk is also calculated — a backend server is only in the loop to relay info to devices.
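
To make the distinction concrete, here is a minimal, purely illustrative sketch of the decentralized flow described above: simplified ephemeral IDs, a relay server that only redistributes the IDs of infected users, and matching done on the device itself. The names and structures are hypothetical, and real protocols such as DP-3T or the Apple-Google API use rotating cryptographic identifiers, Bluetooth signal measurements and far more careful key handling.

```python
# Illustrative sketch only: a heavily simplified decentralized contacts-tracing
# flow. All names and structures here are hypothetical, not any real protocol.
import secrets

def new_ephemeral_id() -> bytes:
    """Generate a random ephemeral ID to broadcast over Bluetooth."""
    return secrets.token_bytes(16)

class Device:
    """A phone running a hypothetical contacts-tracing app."""

    def __init__(self):
        # IDs this device broadcasts (e.g. one per day), and IDs it has heard.
        self.own_ids = [new_ephemeral_id() for _ in range(14)]
        self.observed_ids = set()

    def observe(self, ephemeral_id: bytes):
        """Record an ID broadcast by a nearby device; kept locally only."""
        self.observed_ids.add(ephemeral_id)

    def check_exposure(self, published_infected_ids) -> bool:
        """Decentralized matching: the relay server only redistributes the IDs
        of users who tested positive; the intersection with locally observed
        IDs, i.e. the risk calculation, happens on the device."""
        return bool(self.observed_ids & set(published_infected_ids))

# Alice and Bob come into proximity; Alice later tests positive and uploads
# her own ephemeral IDs, which the health authority's relay server publishes.
alice, bob = Device(), Device()
bob.observe(alice.own_ids[0])

published = alice.own_ids                 # no location data, no social graph
print(bob.check_exposure(published))      # True: Bob gets a local alert
print(alice.check_exposure(bob.own_ids))  # False: no recorded contact
```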

It’s just such a decentralized contacts tracing system that Apple and Google are collaborating on supporting — fast-following PEPP-PT last week by announcing a plan for cross-platform COVID-19 contacts tracing via a forthcoming API, and then a system-wide, opt-in capability for Bluetooth-based proximity tracking.

That intervention, by the only two smartphone platforms that matter when the ambition is mainstream app adoption, is a major development — putting momentum in the Western world behind decentralized contacts tracing for responding digitally to the coronavirus crisis, certainly at the platform level.

In a resolution passed today the European parliament also called for a decentralized approach to COVID-19 proximity tracking. MEPs are pushing for the Commission and Member States to be “fully transparent on the functioning of contact tracing apps, so that people can verify both the underlying protocol for security and privacy, and check the code itself to see whether the application functions as the authorities are claiming”. (The Commission has previously signalled a preference for decentralization too.)

However, backers of PEPP-PT aren’t giving up on a “privacy-preserving” centralized option — which some in their camp are dubbing ‘pseudo-decentralized’ — with Boos claiming today that discussions are ongoing with Apple and Google about making changes to their approach.

As it stands, contacts tracing apps that don’t use a decentralized infrastructure won’t be able to carry out Bluetooth tracking in the background on Android or iOS — as the platforms limit how general apps can access Bluetooth — meaning users of such apps would have to keep the app open and active all the time for proximity tracking to function, with associated (negative) impacts on battery life and device usability.

There are also (intentional) restrictions on how contacts tracing data could be centralized, as a result of the relay server model being deployed in the joint Apple-Google design.

“We very much appreciate that Google and Apple are stepping up to making the operating system layer available — or putting what should be the OS actually there, which is the Bluetooth measurement and the handling of crypto and the background running of such tasks which have to keep running resiliently all the time — if you look at their protocols and if you look at whom they are provided by, the two dominant players in the mobile ecosystem, then I think that from a government perspective especially, or from lots of government perspectives, there is many open points to discuss,” said Boos today.

“From a PEPP-PT perspective there’s a few points to discuss because we want choice and implementing choice in terms of model — decentralized or centralized on top of their protocol creates actually the worst of both worlds — so there are many points to discuss. But contrary to the behavior that many of us who work with tech companies are used to Google and Apple are very open in these discussions and there’s no point in getting up in arms yet because these discussions are ongoing and it looks like agreement can be reached with them.”

It wasn’t clear what specific changes PEPP-PT wants from Apple and Google — we asked for more detail during the webinar but didn’t get a response. But the group and its government backers may be hoping to dilute the tech giants’ stance to make it easier to create centralized graphs of Bluetooth contacts to feed national coronavirus responses.

As it stands, Apple and Google’s API is designed to block contact matching on a server — though there might still be ways for governments (and others) to partially work around the restrictions and centralize some data.

We reached out to Apple and Google with questions about the claimed discussions with PEPP-PT. At the time of writing neither had responded.

As well as Italy, the German and French governments are among those that have indicated they’re backing PEPP-PT for national apps — which suggests powerful EU Member States could be squaring up for a fight with the tech giants, along the lines of Apple vs the FBI, if pressure to tweak the API fails.

Another key strand to this story is that PEPP-PT continues to face strident criticism from privacy and security experts in its own backyard — including after it removed a reference to a decentralized protocol for COVID-19 contacts tracing that’s being developed by another European coalition (called DP-3T), comprised of privacy and security experts. Coindesk reported on the silent edit to PEPP-PT’s website yesterday.

Backers of DP-3T have also repeatedly queried why PEPP-PT hasn’t published code or protocols for review to-date — and even gone so far as to dub the effort a ‘trojan horse’.

ETH Zürich’s Dr. Kenneth Paterson, who is both a part of the PEPP-PT effort and a designer of DP-3T, couldn’t shed any light on the exact changes the coalition is hoping to extract from ‘Gapple’ when we asked.

“They’ve still not said exactly how their system would work, so I can’t say what they would need [in terms of changes to Apple and Google’s system],” he told us in an email exchange.

Today Boos couched the removal of the reference to DP-3T on PEPP-PT’s website as a mistake — which he blamed on “bad communication”. He also claimed the coalition is still interested in including the former’s decentralized protocol within its bundle of standardized technologies. So the already sometimes fuzzy lines between the camps continue to be redrawn. (It’s also interesting to note that press emails to Boos are now being triaged by Hering Schuppener, a communications firm that sells publicity services including crisis PR.)

“We’re really sorry for that,” Boos said of the DP-3T excision. “Actually we just wanted to put the various options on the same level that are out there. There are still all these options and we very much appreciate the work that colleagues and others are doing.

“You know there is a hot discussion in the crypto community about this and we actually encourage this discussion because it’s always good to improve on protocols. What we must not lose sight of is… that we’re not talking about crypto here, we’re talking about pandemic management and as long as an underlying transport layer can ensure privacy that’s good enough because governments can choose whatever they want.”

Boos also said PEPP-PT would finally be publishing some technical documents this afternoon — opting to release information some three weeks after its public unveiling and on a Friday evening (a 7-page ‘high level overview’ has since been put on their Github here — but still a far cry from code for review) — while making a simultaneous plea for journalists to focus on the ‘bigger picture’ of fighting the coronavirus rather than keep obsessing over technical details. 

During today’s webinar some of the scientists backing PEPP-PT talked about how they’re testing the efficacy of Bluetooth as a proxy for tracking infection risk.

“The algorithm that we’ve been working on looks at the cumulative amount of time that individuals spend in proximity with each other,” said Christophe Fraser, professor at the Nuffield Department of Medicine and Senior Group Leader in Pathogen Dynamics at the Big Data Institute, University of Oxford, offering a general primer on using Bluetooth proximity data for tracking viral transmission.

“The aim is to predict the probability of transmission from the phone proximity data. So the ideal system reduces the requested quarantine to those who are the most at risk of being infected and doesn’t give the notification — even though some proximity event was recorded — to those people who’re not at risk of being infected.”

“Obviously that’s going to be an imperfect process,” he went on. “But the key point is that in this innovative approach we should be able to audit the extent to which that information and those notifications are correct — so we need to actually be seeing, of the people who have been sent the notification, how many of them actually were infected. And of those people who were identified as contacts, how many weren’t.

“Auditing can be done in many different ways for each system but that step is crucial.”
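
As a rough illustration of the cumulative-time idea Fraser describes (and only that), here is a toy risk filter in Python. The threshold, event format and scoring are hypothetical; a real model would also weight Bluetooth signal strength as a distance proxy and draw on epidemiological evidence, which is exactly what the auditing step is meant to refine.

```python
# Illustrative sketch only: a toy version of cumulative-exposure notification.
# The threshold and event format are hypothetical, not any deployed system.
from collections import defaultdict

# Hypothetical cut-off: below this cumulative contact time, no notification.
EXPOSURE_THRESHOLD_MINUTES = 15

def contacts_to_notify(proximity_events, threshold=EXPOSURE_THRESHOLD_MINUTES):
    """Given (contact_id, duration_minutes) events recorded around an infected
    individual, notify only the contacts whose cumulative contact time crosses
    the threshold, so quarantine requests go to those most at risk rather than
    to anyone with a fleeting proximity event."""
    totals = defaultdict(float)
    for contact_id, minutes in proximity_events:
        totals[contact_id] += minutes
    return {contact for contact, minutes in totals.items() if minutes >= threshold}

# Two brief brushes past Carol vs. one long meeting with Dave.
events = [("carol", 3), ("carol", 4), ("dave", 40)]
print(contacts_to_notify(events))  # {'dave'}
```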

Evaluating the effectiveness of the digital interventions will be vital, per Fraser — whose presentation could have been interpreted as making a case for public health authorities to have fuller access to contacts graphs. But it’s important to note that DP-3T’s decentralized protocol makes clear provision for app users to opt-in to voluntarily share data with epidemiologists and research groups to enable them to reconstruct the interaction graph among infected and at risk users (aka to get access to a proximity graph).

“It’s really important that if you’re going to do an intervention that is going to affect millions of people — in terms of these requests to [quarantine] — that that information be the best possible science or the best possible representation of the evidence at the point at which you give the notification,” added Fraser. “And therefore as we progress forwards that evidence — our understanding of the transmission of the virus — is going to improve. And in fact auditing of the app can allow that to improve, and therefore it seems essential that that information be fed back.”

None of the PEPP-PT aligned apps that are currently being used for testing or reference are interfacing with national health authority systems, per Boos — though he cited a test in Italy that’s been plugged into a company’s health system to run tests.

“We have supplied the application builders with the backend, we have supplied them with sample code, we have supplied them with protocols, we have supplied them with the science of measurement, and so on and so forth. We have a working application that simply has no integration into a country’s health system — on Android and on iOS,” he noted.

On its website PEPP-PT lists a number of corporate “members” as backing the effort — including the likes of Vodafone — alongside several research institutions including Germany’s Fraunhofer Heinrich Hertz Institute for telecoms (HHI) which has been reported as leading the effort.

The HHI’s executive director, Thomas Wiegand, was also on today’s call. Notably, his name initially appeared on the authorship list of DP-3T’s white paper. However, on April 10 he was removed from the README and authorship list, per its Github document history. No explanation for the change was given.

During today’s press conference Wiegand made an intervention that seems unlikely to endear him to the wider crypto and digital rights community — describing the debate around which cryptography system to use for COVID-19 contacts tracing as a ‘side show’ and expressing concern that what he called Europe’s “open public discussion” might “destroy our ability to get ourselves as Europeans out of this”.

“I just wanted to make everyone aware of the difficulty of this problem,” he also said. “Cryptography is only one of 12 building blocks in the system. So I really would like to have everybody go back and reconsider what problem we are in here. We have to win against this virus… or we have another lockdown or we have a lot of big problems. I would like to have everybody to consider that and to think about it because we have a chance if we get our act together and really win against the virus.”

The press conference had an even more inauspicious start after the Zoom call was disrupted by racist spam in the chat. Right before this Boos had kicked off the call saying he had heard from “some more technically savvy people that we should not be using Zoom because it’s insecure — and for an initiative that wants security and privacy it’s the wrong tool”.

“Unfortunately we found out that many of our international colleagues only had this on their corporate PCs so over time either Zoom has to improve — or we need to get better installations out there. It’s certainly not our intention to leak the data on this Zoom,” he added.

Security lapse exposed Clearview AI source code

Since it exploded onto the scene in January after a newspaper exposé, Clearview AI quickly became one of the most elusive, secretive, and reviled companies in the tech startup scene.

The controversial facial recognition startup allows its law enforcement users to take a picture of a person, upload it, and match it against its alleged database of 3 billion images, which the company scraped from public social media profiles.

But for a time, a misconfigured server exposed the company’s internal files, apps and source code for anyone on the internet to find.

Mossab Hussein, chief security officer at Dubai-based cybersecurity firm SpiderSilk, found the repository storing Clearview’s source code. Although the repository was protected with a password, a misconfigured setting allowed anyone to register as a new user to log in to the system storing the code.

The repository contained Clearview’s source code, which could be used to compile and run the apps from scratch. The repository also stored some of the company’s secret keys and credentials, which granted access to Clearview’s cloud storage buckets. Inside those buckets, Clearview stored copies of its finished Windows, Mac, and Android apps, as well as its iOS app, which Apple recently blocked for violating its rules. The storage buckets also contained early, pre-release developer app versions that are typically only for testing, Hussein said.

The repository also exposed Clearview’s Slack tokens, according to Hussein, which, if used, would have allowed password-less access to the company’s private messages and communications.

Clearview has been dogged by privacy concerns since it was forced out of stealth following a profile in The New York Times, but its technology has gone largely untested and the accuracy of its facial recognition tech unproven. Clearview claims it only allows law enforcement to use its technology, but reports show that the startup courted users from private businesses like Macy’s, Walmart and the NBA. But this latest security lapse is likely to invite greater scrutiny of the company’s security and privacy practices.

When reached for comment, Clearview founder Hoan Ton-That claimed his company “experienced a constant stream of cyber intrusion attempts, and have been investing heavily in augmenting our security.”

“We have set up a bug bounty program with HackerOne whereby computer security researchers can be rewarded for finding flaws in Clearview AI’s systems,” said Ton-That. “SpiderSilk, a firm that was not a part of our bug bounty program, found a flaw in Clearview AI and reached out to us. This flaw did not expose any personally identifiable information, search history or biometric identifiers,” he said.

Clearview AI’s app for iOS did not need a log-in, according to Hussein. He took several screenshots to show how the app works. In this example, Hussein used a photo of Mark Zuckerberg.

Ton-That accused the research firm of extortion, but emails between Clearview and SpiderSilk paint a different picture.

Hussein, who has previously reported security issues at several startups, including MoviePass, Remine and Blind, said he reported the exposure to Clearview but declined to accept a bounty, saying that signing the bounty terms would have barred him from publicly disclosing the security lapse.

It’s not uncommon for companies to use bug bounty terms and conditions or non-disclosure agreements to prevent the disclosure of security lapses once they are fixed. But experts told TechCrunch that researchers are not obligated to accept a bounty or agree to disclosure rules.

Ton-That said that Clearview has “done a full forensic audit of the host to confirm no other unauthorized access occurred.” He also confirmed that the secret keys have been changed and no longer work.

Hussein’s findings offer a rare glimpse into the operations of the secretive company. One screenshot shared by Hussein showed code and apps referencing the company’s Insight Camera, which Ton-That described as a “prototype” camera, since discontinued.

A screenshot of Clearview AI’s app for macOS. It connects to Clearview’s database through an API. The app also references Clearview’s former prototype camera hardware, Insight Camera.

According to BuzzFeed News, one of the firms that tested the cameras is New York City real estate firm Rudin Management, which trialed a camera at two of its city residential buildings.

Hussein said that he found some 70,000 videos in one of Clearview’s cloud storage buckets, taken from a camera installed at face-height in the lobby of a residential building. The videos show residents entering and leaving the building.

Ton-That explained that, “as part of prototyping a security camera product we collected some raw video strictly for debugging purposes, with the permission of the building management.”

TechCrunch could not ascertain from which building the videos were taken. A representative from Rudin Management did not return our emails.

One of the videos from a camera in a lobby of a residential building, recording residents (blurred by TechCrunch) as they pass by.

Clearview has come under intense scrutiny since its January debut. It’s also attracted the attention of hackers.

In February, Clearview admitted to customers that its client list was stolen in a data breach — though it claimed its servers were “never accessed.” Clearview also left several of its cloud storage buckets containing its Android app unprotected.

Vermont’s attorney general’s office has already opened an investigation into the company for allegedly violating consumer protection laws, and police departments have been told to stop using Clearview, including in New Jersey and San Diego. Several tech companies, including Facebook, Twitter and YouTube, have demanded that Clearview stop scraping their sites.

In an interview with CBS News in February, Ton-That defended his company’s practices. “If it’s public and it’s out there and could be inside Google’s search engine, it can be inside ours as well,” he said.