UK’s coronavirus tracing app strategy faces fresh questions over transparency and interoperability

The UK’s data protection watchdog confirmed today that the government still hasn’t given it sight of a key legal document attached to the coronavirus contacts tracing app being developed by the NHSX, the digital transformation branch of the country’s National Health Service.

Under UK and EU law, a Data Protection Impact Assessment (DPIA) can be a legal requirement in instances where there are high rights risks related to the processing of people’s information.

Last month the European Data Protection Board strongly recommended publication of DPIAs in the context of coronavirus contacts tracing apps. “The EDPB considers that a data protection impact assessment (DPIA) must be carried out before implementing such tool as the processing is considered likely high risk (health data anticipated large-scale adoption, systematic monitoring, use of new technological solution). The EDPB strongly recommends the publication of DPIAs,” the pan-EU data protection steering body wrote in the guidance.

Giving evidence to the human rights committee today, UK information commissioner Elizabeth Denham confirmed that her department, the ICO, is involved in advising the government on the data protection elements of the app’s design. She said the agency has been provided with some technical documents for review thus far. But, under committee questioning, she reserved any firmer assessment of the rights impacts of the government’s choice of app design and architecture — saying the ICO still hasn’t seen the DPIA.

“I think that is on the verge of happening,” she said when asked if she had any idea when the document would be published or provided to the ICO for review.

“Having that key document — and the requirement for the NHSX to do that, and provide that to me and to the public — is a really important protection,” Denham added. “Especially when everything’s happening at pace and we want the public to take up such an app, to help with proximity and notification.

“The privacy notice and the DPIA will both need to be shared with us and I do know that NHSX plans to also publish that so that they can show the public — be transparent and accountable for what they’re doing.”

The NHSX has given a green light for the ICO to audit the app in future, she also told the committee.

Coronavirus contacts tracing applications are a new technology which, in the UK’s case, entails repurposing the Bluetooth signals emitted by smartphones to measure device proximity as a proxy for infection risk. The digital tracing process opens a veritable Pandora’s box of rights risks, with health data, social graph and potentially location information all in the mix, alongside overarching questions about how effective such technology will prove in battling the coronavirus.
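The core proximity measurement is a signal-strength heuristic. As a rough illustration only (not the NHSX app’s actual calibration or risk model), a received Bluetooth signal strength (RSSI) can be converted into an approximate distance with a log-distance path loss model, and an encounter flagged as relevant if two devices appear close for long enough. All constants here are illustrative assumptions:

```python
def estimate_distance(rssi_dbm, measured_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate device separation in metres from a Bluetooth RSSI reading.

    Uses the log-distance path loss model. measured_power_dbm is the
    expected RSSI at 1 metre; both constants are illustrative assumptions,
    not values taken from any real tracing app.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_exponent))


def risky_contact(rssi_dbm, duration_s, max_distance_m=2.0, min_duration_s=900):
    """Flag a proximity event as epidemiologically relevant if the devices
    appear closer than max_distance_m for at least min_duration_s (here,
    an assumed 2 metres for 15 minutes)."""
    return estimate_distance(rssi_dbm) <= max_distance_m and duration_s >= min_duration_s
```

In practice RSSI varies wildly with device model, orientation and obstacles, which is one reason real-world risk scoring is considerably more involved than this sketch.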

Yesterday the BBC reported that the NHSX will trial the tracing app on the Isle of Wight this week.

“As we see the trial in the Isle of Wight we’ll all be very interested to see the results of that trial and see if it’s working the way that the developers have intended,” added Denham.

At a separate parliamentary committee hearing last week NHSX CEO, Matthew Gould, told MPs that the app could be “technically” ready to deploy nationally within two to three weeks, following the limited geographical trial.

He also said the app will iterate — with future versions potentially asking users to share location data. So while the NHSX has maintained that only pseudonymized data will be collected and held centrally — where it could be used for public health “research” purposes — there remains a possibility that data could be linked to individual identities, such as if different pieces of data are combined by state agencies and/or if the centralized store of data is hacked and/or improperly accessed.

Privacy experts have also warned of the risk of ‘mission creep’ down the tracing line.

Today the Guardian reported that the government is in talks with digital identity startups about building technology to power so-called ‘immunity passports’, as another plank of its digital response to the coronavirus. Per the report, such a system could combine facial recognition technology with individual coronavirus test results so a worker could verify their COVID-19 status prior to entrance to a workplace, for example. (A spokeswoman for Onfido confirmed to TechCrunch that it’s in discussions with the government but added: “As you’d expect these are confidential until publicly shared.”)

Returning to the coronavirus tracing app, the key point is that the government has opted for a system design that centralizes proximity events on an NHSX-controlled server — when or if a user elects to self-report COVID-19 symptoms (or does so after getting a confirmed diagnosis).

This choice to centralize proximity event processing elevates not just privacy and security questions but also wider human rights risks, as the committee highlighted in a series of questions to Denham and Gould today — pointing out, for example, that Denham and the ICO have previously suggested that decentralized architectures would be preferable for such high rights risk technology.

On that Denham said: “Because I’m the information commissioner, if I were to start with a blank sheet of paper [it] would start with a decentralized system — and you can understand, from a privacy and security perspective, why that would be so. But that does not, in any way, mean that a centralized system can’t have the same kind of privacy and security protections. And it’s up to the government — it’s up to NHSX — to determine what kind of design specifications the system needs.

“It’s up to government to identify what those functions and needs are and if those lead to a centralized system then the question that the DPIA has to answer is why centralized? And my next question would be how are the privacy and security concerns addressed?  That’s what a DPIA is. It’s about the mitigation of concerns.”

Apple and Google are also collaborating on a cross-platform API that will support the technical functioning of decentralized national tracing apps, as well as baking decentralized, opt-in, system-wide contacts tracing into their own platforms.

The tech giants’ backing for decentralized tracing apps raises interoperability questions and technical concerns for governments that choose to go the other way and pool data.
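To see why the two camps diverge, it helps to sketch what “decentralized” means in practice: the server only publishes the ephemeral identifiers of confirmed cases, and each phone checks locally whether it has overheard any of them, so no proximity graph ever leaves the device (unlike the centralized NHSX design, where self-reported contact events are uploaded to a server). The sketch below is a deliberately simplified illustration, not the Apple-Google protocol, which derives rolling identifiers from daily keys:

```python
def local_exposure_check(observed_ids, published_infected_ids):
    """Decentralized matching sketch: the phone keeps the ephemeral
    identifiers it has overheard via Bluetooth and compares them against
    a downloaded list of identifiers belonging to confirmed cases.

    Only the match result exists on the device; the server never learns
    who was near whom. Returns the matching identifiers, sorted.
    """
    return sorted(set(observed_ids) & set(published_infected_ids))
```

A phone that had overheard identifiers `["a", "b", "c"]` would, on downloading a published case list containing `"b"`, learn of one exposure without reporting anything upstream.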

In additional details for the forthcoming Exposure Notification API, released today, the tech giants stipulate that apps must gain user consent to access the API; should only gather the minimum information necessary for exposure notification, and only use it for a COVID-19 response; and can’t access, or even seek permission to access, a device’s Location Services — meaning no uploading location data (something the NHSX app may ask users to do in future, per Gould’s testimony to a different parliamentary committee last week; he also confirmed today that users will be asked to input the first three letters of their postcode).
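The postcode detail is a deliberately coarse location signal: the first three characters of a UK postcode identify only a broad district, not an address. A trivial sketch of that truncation (the helper name is ours, not the app’s):

```python
def postcode_prefix(postcode):
    """Return the first three characters of a UK postcode, the coarse
    district-level identifier the NHSX app reportedly asks users to
    enter, per Gould's testimony. Normalizes spacing and case first.
    """
    return postcode.replace(" ", "").upper()[:3]
```

For example, `"sw1a 1aa"` reduces to `"SW1"`, a district shared by many thousands of addresses.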

A number of European governments have now said they will use decentralized systems for digital contacts tracing — including Germany, Switzerland and the Republic of Ireland.

The European Commission has also urged the use of privacy preserving technologies — such as decentralization — in a COVID-19 contacts tracing context.

Currently, France and the UK remain the highest profile backers of centralized systems in Europe.

But, interestingly, Gould gave the first sign today of a UK government ‘wobble’ — saying it’s not “locked” to a centralized app architecture and could change its mind if evidence emerged that a different choice would make more sense.

Though he also made a point of laying out a number of reasons that he said explained the design choice, and — in response to a question from the committee — denied the decision had been influenced by the involvement of a cyber security arm of the UK’s domestic intelligence agency, GCHQ.

“We are working phenomenally closely with both [Apple and Google],” he said. “We are trying very hard in the context of a situation where we’re all dealing with a new technology and a new situation to try and work out what the right approach is — so we’re not in competition, we’re all trying to get this right. We are constantly reassessing which approach is the right one — and if it becomes clear that the balance of advantage lies in a different approach then we will take that different approach. We’re not irredeemably wedded to one approach; if we need to shift then we will… It’s a very pragmatic decision about what approach is likely to get the results that we need to get.”

Gould claimed the (current) choice of a centralized architecture was taken because the NHSX is balancing privacy needs against the need for public health authorities to “get insight” — such as about which symptoms subsequently lead to people testing positive; or which contacts are more risky (“what the changes are between a contact, for example, three days before symptoms develop and one day before symptoms develop”).

“It was our view that a centralized approach gave us… even on the basis of the system I explained where you’re not giving personal data over — to collect some very important data that gives serious insight into the virus that will help us,” he said. “So we thought that in that context, having a system that both provided that potential for insight but which also, we believe provided serious protections on the privacy front… was an appropriate balance. And as the information commissioner has said that’s really a question for us to work out where that balance is but be able to demonstrate that we have mitigations in place and we’ve really thought about the privacy side as well, which I genuinely believe we have.”

“We won’t lock ourselves in. It may be that if we want to take a different approach we have to do some heavy duty engineering work to take the different approach but what I wanted to do was provide some reassurance that just because we’ve started down one route doesn’t mean we’re locked into it,” Gould added, in response to concern from committee chair, Harriet Harman, that there might only be a small window of time for any change of architecture to be executed.

In recent days the UK has faced criticism from academic experts over its choice of app architecture, and the government risks looking increasingly isolated in choosing such a bespoke system — which includes allowing users to self-report having COVID-19 symptoms; something the French system will not allow, per a blog post by the digital minister.

Concerns have also been raised about how well the UK app will function technically, as it will be unable to plug directly into the Apple-Google API.

International interoperability is also emerging as a priority issue for the UK, in light of the Republic of Ireland’s choice to go for a decentralized system.

Committee MP Joanna Cherry pressed Gould on that latter point today. “It is going to be a particular problem on the island of Ireland, isn’t it?” she said.

“It raises a further question of interoperability that we’ll have to work through,” admitted Gould.

Cherry also pressed Denham on whether there should be specific legislation and a dedicated oversight body and commissioner, to focus on digital coronavirus contacts tracing — to put in place clear legal bounds and safeguards and ensure wider human rights impacts are considered alongside privacy and security issues.

Denham said: “That’s one for parliamentarians and one for government to look at. My focus right now is making sure that I do a fulsome job when it comes to data protection and security of the data.”

Returning to the DPIA point, the government may not have a legal requirement to provide the document to the ICO in advance of launching the app, according to one UK-based data protection expert we spoke to. He agreed, though, that there’s a risk of ministers looking hypocritical if, on the one hand, they claim to be very ‘open and transparent’ in the development of the app — a claim Gould repeated in his evidence to the committee today — yet, at the same time, aren’t fully involving the ICO (given it hasn’t had access to the DPIA), especially in light of what he called the government’s wider “dismal” record on transparency.

Asked whether he’d expect a DPIA to have been shared with the ICO in this context and at this point, Tim Turner, a UK-based data protection trainer and consultant, told us: “It’s a tricky one. NHSX have no obligation to share the DPIA with the ICO unless it’s under prior consultation where they have identified a high risk and cannot properly manage or prevent it. If NHSX are confident that they’ve assessed and managed the risks effectively, even though that’s a subjective judgement, ICO has no right to demand it. There’s also no obligation to publish DPIAs in any circumstances. So it comes down to issues of right and wrong rather than legality.

“Honestly, I wouldn’t expect NHSX to publish it because they don’t have to,” he added. “If they think they’ve done it properly, they’ve done what’s required. That’s not to say they haven’t done it properly, I have no idea. I think it’s an example of where the concept of data ethics bumps into reality — it would be a breach of the GDPR [General Data Protection Regulation] not to do a DPIA, but as long as that’s happened and we don’t have an obvious personal data breach, ICO has nothing to complain about. Denham might expect organisations to behave in a certain way or give her information that she wants to see, but if an organisation’s leadership wants to stick rigidly to what the law says, her expectations don’t have any powers to back them up.”

On the government’s claim to openness and transparency, Turner added: “This isn’t a transparent government. Their record on FOI [Freedom of Information] is dismal (and ICO’s record on enforcing to do something about that is also dismal). It’s definitely hypocritical of them to claim to be transparent on this or indeed other important issues. I’m just saying that NHSX can fall back on not having an obligation to do it. They should be more honest about the fact that ICO isn’t involved and not use them as a shield.”

Decrypted: Chegg’s third time unlucky, Okta’s new CSO, Rapid7 beefs up cloud security

Ransomware is getting sneakier and smarter.

The latest example comes from ExecuPharm, a little-known but major outsourced pharmaceutical company that confirmed it was hit by a new type of ransomware last month. The incursion not only encrypted the company’s network and files; the hackers also exfiltrated vast amounts of data from the network. The company was handed a two-for-one threat: pay the ransom and get your files back, or don’t pay and the hackers will post the files to the internet.

This new tactic is shifting how organizations think of ransomware attacks: it’s no longer just a data-recovery mission; it’s now also a data breach. Companies are torn between taking the FBI’s advice not to pay the ransom and the fear that their intellectual property (or other sensitive internal files) will be published online.

Because millions are now working from home, the surface area for attackers to get in is far greater than it was, making the threat of ransomware higher than ever before.

That’s just one of the stories from the week. Here’s what else you need to know.

THE BIG PICTURE


Chegg hacked for the third time in three years

Education giant Chegg confirmed its third data breach in as many years. The latest break-in affected past and present staff after a hacker made off with 700 names and Social Security numbers. It’s a drop in the ocean when compared to the 40 million records stolen in 2018 and an undisclosed number of passwords taken in a breach at Thinkful, which Chegg had just acquired in 2019.

Those 700 names account for about half of its 1,400 full-time employees, per a filing with the Securities and Exchange Commission. But Chegg’s refusal to disclose further details about the breach — beyond a state-mandated notice to the California attorney general’s office — makes it tough to know exactly what went wrong this time.

Security lapse at India’s Jio exposed coronavirus symptom checker results

Since the start of the outbreak, governments and companies have scrambled to develop apps and websites that can help users identify COVID-19 symptoms.

India’s largest cell network Jio, a subsidiary of Reliance, launched its coronavirus self-test symptom checker in late March, just before the Indian government imposed a strict nationwide lockdown to prevent the further spread of the coronavirus. The symptom checker allows anyone to check their symptoms from their phone or Jio’s website to see if they may have become infected with COVID-19.

But a security lapse exposed one of the symptom checker’s core databases to the internet without a password, TechCrunch has found.

Jio’s coronavirus symptom checker. One of its databases exposed users’ responses. (Image: TechCrunch)

Security researcher Anurag Sen found the database on May 1, just after it was first exposed, and asked TechCrunch to notify the company. Jio quickly pulled the system offline after TechCrunch made contact. It’s not known if anyone else accessed the database.

“We have taken immediate action,” said Jio spokesperson Tushar Pania. “The logging server was for monitoring performance of our website, intended for the limited purpose of people doing a self-check to see if they have any COVID-19 symptoms.”

The database contains millions of logs and records from April 17 through to the time the database was pulled offline. Although the server contained a running log of website errors and other system messages, it also ingested vast amounts of user-generated self-test data. Each self-test was logged in the database and included a record of who took the test — such as “self” or a relative — along with their age and gender.

The data also included each person’s user agent, a short string describing the user’s browser version and operating system. User agents are normally used to serve a website correctly, but they can also be used to track a user’s online activity.
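To illustrate why a user agent is trackable, here is a minimal Python sketch that pulls the OS and browser details out of one. The example string is made up for illustration, not a record from the exposed database, and real fingerprinting libraries handle far more edge cases:

```python
import re

# An illustrative (made-up) user agent string, not data from the Jio database.
ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 "
      "(KHTML, like Gecko) Chrome/81.0.4044.122 Safari/537.36")

def parse_user_agent(ua: str) -> dict:
    """Extract a rough OS and browser fingerprint from a user agent string."""
    os_match = re.search(r"\(([^)]+)\)", ua)  # text inside the first parentheses
    browser_match = re.search(r"(Chrome|Firefox|Edg)/([\d.]+)", ua)
    return {
        "os": os_match.group(1) if os_match else "unknown",
        "browser": browser_match.group(1) if browser_match else "unknown",
        "version": browser_match.group(2) if browser_match else "unknown",
    }

print(parse_user_agent(ua))
```

Combined with other signals, even these few fields narrow a visitor down considerably, which is why user agents stored alongside health answers add to the privacy exposure.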

The database also contains individual records of those who signed up to create a profile, allowing users to update their symptoms over time. These records contained the answers to each question asked by the symptom checker, including what symptoms they were experiencing, who they had been in contact with, and what health conditions they might have.

Some of the records also contained the user’s precise location, but only if the user allowed the symptom checker access to their browser or phone’s location data.

We’ve posted a redacted portion of one of the records below.

A redacted portion of the exposed database. (Image: TechCrunch)

From one sample of data we obtained, we found thousands of users’ precise geolocation from across India. TechCrunch was able to identify people’s homes using the latitude and longitude records found in the database.

Most of the location data is clustered around major cities, like Mumbai and Pune. TechCrunch also found users in the United Kingdom and North America.

The exposure could not have come at a more critical time for the Indian telecoms giant. Last week Facebook invested $5.7 billion for a near-10% stake in Jio Platforms, valuing the Reliance subsidiary at about $66 billion.

Jio did not answer our follow-up questions, and the company did not say if it will inform those who used the symptom tracker of the security lapse.

New bill threatens journalists’ ability to protect sources

Online child exploitation is a horrific crime that requires an effective response. A draft bill, first proposed by Sen. Lindsey Graham (R-SC) in January, intends to provide exactly that. However, technology experts warn the bill not only fails to meet the challenge, but creates new problems of its own. My job is to enable journalists to do their work securely — to communicate with others, research sensitive stories and publish hard-hitting news. This bill would significantly harm journalists’ ability to protect their sources.

Under the Eliminating Abusive and Rampant Neglect of Interactive Technologies (or EARN IT) Act, a government commission would define best practices for how technology companies should combat this type of material. On the surface, EARN IT proposes an impactful approach. A New York Times investigation in September found that “many tech companies failed to adequately police sexual abuse imagery on their platforms.” The investigation highlighted features, offered by these companies, that provide “digital hiding places for perpetrators.”

In reality, the criticized features are exactly the same ones that protect our privacy online. They help us read The Washington Post in private and ensure we only see authentic content created by the journalists. They allow us to communicate with each other. They empower us to express ourselves. And they enable us to connect with journalists so the truth can make the page. This raises the question of whether the bill will primarily protect children or primarily undermine free speech online.

It should be pointed out that EARN IT does not try to ban the use of these features. In fact, the bill does not specifically mention them at all. But if we look at how companies would apply the “best practices,” it becomes clear that the government intends to make these features difficult to provide, and to discourage companies from offering them or expanding their use. By accepting EARN IT, we will give up our ability — and our children’s future abilities — to enjoy online, social, connected and private lives.

Our digital life is protected by the same features that allow some bad people to do bad things online.

Four of the “best practices” relate to requiring companies to have the ability to “identify” child sexual abuse material. Unfortunately, it’s not possible to identify this material without also having the ability to identify any and all other types of material — like a journalist communicating with a source, an activist sharing a controversial opinion or a doctor trying to raise the alarm about the coronavirus. Nothing prevents the government from later expanding the bill to cover other illegal acts, such as violence or drugs. And what happens when foreign governments want to have a say in what is “legal” and what is not?

Our digital life is protected by the same features that allow some bad people to do bad things online. They protect us as we visit The Washington Post website, use the Signal app to contact one of its journalists or use the Tor Browser to submit information to their anonymous tip line. These features all enable privacy, a core component of the journalistic process. They enable journalists to pursue and tell the truth, without fear or favor. And not just in the U.S., but globally. We should empower and enable this work, not sabotage it by removing crucial capabilities, even in the name of child protection.

The same New York Times investigation found that law enforcement agencies devoted to fighting online child exploitation “were left understaffed and underfunded, even as they were asked to handle far larger caseloads.” The National Center for Missing and Exploited Children (NCMEC), established by Congress in 1984 to reduce child sexual exploitation and prevent child victimization, “was ill equipped for the expanding demands.” It’s worth asking, then, why EARN IT does not instead empower these agencies with additional resources to solve crimes.

We must consider the possibility that this bill fails to achieve its stated goal. That it will not protect children online, and will instead harm their digital presence and ability to speak freely. Everyone deserves good security, and it’s on us to find ways to prevent harm without compromising our digital rights. To force companies to weaken our protections to give law enforcement greater insight would be the equivalent of forcing people to live without locks and curtains in their homes. Are we willing to go that far?

That’s not to say we have to accept no solution. But it can’t be this one.

Cliqz pulls the plug on a European anti-tracking alternative to Google search

Cliqz, a Munich-based anti-tracking browser with private search baked in that has sought to offer a local alternative to Google powered by its own search index, is shutting down — claiming this arm of its business has been blindsided by the coronavirus crisis.

The bigger challenge is of course competing in a market so dominated by Google.

In Europe, where the tech giant’s search engine commands a market share approaching 95%, trying to lure users to an alternative ecosystem is difficult at the best of times, and a pandemic is certainly not that.

“We didn’t see a pandemic coming,” Cliqz wrote in a farewell blog post yesterday. “We didn’t expect that a virus could have impact on Cliqz. And even just one and a half months ago, we completely underestimated what this would do to the economy and even more so to the political priorities. It became clear to us in the last weeks, that all political initiatives to create an independent European digital infrastructure have been stalled or postponed for years. Covid-19 is overshadowing everything. This is not a climate where we will have any meaningful discussion about a public funding of a solution like Cliqz.”

It’s been a long road for Cliqz, which was founded back in 2008 — initially focused on German-speaking markets. The browser was a fork of Mozilla’s Firefox, and Cliqz went on to take investment from Mozilla in 2016, when it was eyeing an expansion into more markets.

In 2017 it acquired the Ghostery anti-tracking tool, which had around 8 million users at the time, with the aim of combining algorithmic and blocklist anti-tracking approaches. But the wider challenge for Cliqz’s browser+search effort was not a lack of tech but the difficulty of building broad backing for its alternative approach.

The farewell blog post says the company failed to raise enough awareness among mainstream web users to convince them to step off Alphabet’s beaten path. But it’s also true that, in recent years, mainstream browsers have been baking in anti-tracking and steadily upping their own splashy privacy claims.

Even Google has said it will phase out third-party cookie tracking in its Chrome browser — so the available space for ‘easy’ differentiation around privacy is shrinking, unless you can clearly and powerfully articulate the key technical nuances and complex wider market dynamics of how user data is passed around in the background.

There is also ongoing regulatory failure in Europe around privacy, despite a recently updated data protection framework, with many national watchdogs failing to grasp the nettle of rampant unlawful online tracking.

The lack of GDPR enforcement against major tech and adtech platforms also means there’s been less succour for businesses making privacy-respecting choices than they might have been led to expect, having read the rules on paper.

“We failed to make people truly aware of the problem; we failed to reach a scale that would allow our search engine to be self-financing,” Cliqz writes. “We have reached several hundred thousand daily users. But — and this is the disadvantage of running our own technology — this is not enough to run a search engine, to cover our costs. And most of all, we failed to convince the political stakeholders, that Europe desperately needs an own independent digital infrastructure.”

While the Cliqz browser and search is being shuttered, the company is not closing down entirely — and a spokesman confirmed Ghostery will continue.

Cliqz investor, Hubert Burda Media, which holds a majority stake in the business, said Thursday that the resulting “restructuring” of the business will affect 45 employees — “for whom individual solutions are currently being sought”.

“The 100% Cliqz subsidiary Ghostery, headed by Jeremy Tillman, will continue to bundle Cliqz’s expertise in the area of anti-tracking,” it wrote. “In addition, a team of experts will be formed from Cliqz, which will take care of technical issues such as artificial intelligence, search and the influence of technology on media.”

Burda added that it’s looking at a possible integration of Cliqz’s MyOffrz unit — aka the division that had sought to monetize use of the anti-tracking browser via contextually targeted (and thus privacy sensitive) ads.

In a wider statement on the restructuring, Burda CEO Paul-Bernhard Kallen said: “We have invested in Cliqz for years because we believe that Europe needs its own digital infrastructure to stay fit for the future. Without the necessary political structures at European level for this, however, we will not be able to overcome the superiority of the tech giants from the USA and China. In addition, the Corona pandemic is unlikely to lead to a far-reaching innovation program in Europe in the foreseeable future, so that we can no longer drive this path alone. I very much regret this because the basic idea of establishing a counterweight to the USA and China in the European search sector is still the right one.”

Fairphone teams up with /e/OS on a box-fresh ‘deGoogled’ handset

The makers of the world’s most ethical smartphone, the Fairphone 3, have teamed up for a version of the device with even less big tech on board.

The Netherlands-based device maker has partnered with France’s /e/OS to offer a ‘de-Googled’ version of its latest handset, which ships out of the box running an Android (AOSP) fork built atop LineageOS, itself a fork of CyanogenMod (remember them?) (via Engadget).

“The deGoogled Fairphone 3 is most likely the first privacy conscious and sustainable phone,” runs the blurb on /e/OS’ website. “It combines a phone that cares for people and planet and an OS and apps that care for your privacy.”

A pithy explainer of its “privacy by design ecosystem” — and the point of “Android without Google” — further notes: “We have removed many pieces of code that send your personal data to remote servers without your consent. We don’t scan your data in your phone or in your cloud space, and we don’t track your location hundred times a day or collect what you’re doing with your apps.”

When the Fairphone 3 launched last September it came with Android 9 preloaded. But the company touted a post-launch update that would make it easy for buyers to wipe Google services off their slate and install the Android Open Source Project, which it recommended for advanced users.

The new /e/OS flavor offers a third OS option.

Per Engadget, Fairphone said it polled members of its community asking which alternative OS to offer and /e/OS got more votes than a number of others. The company also highlighted /e/OS’ privacy by design as a factor in the choice, lauding how it shuts down “unwanted data flows”, meaning users have more control over what their phone is doing.

The /e/OS flavor of the Fairphone 3 ships from May 6, priced at just under €480 — a €30 premium on the Googley flavor of Android you get on the standard Fairphone 3.

Existing owners of Fairphone’s third gen handset can manually install /e/OS gratis via an installer on its website.

When the Fairphone 3 launched last year the company told us only around 5% of Fairphone users opt to go full open source — which suggests the /e/OS Fairphone 3 will be a niche choice even among these discerning buyers.

UK privacy and security experts warn over coronavirus app mission creep

A number of UK computer security and privacy experts have signed an open letter raising transparency and mission creep concerns about the national approach to develop a coronavirus contacts tracing app.

The letter, signed by around 150 academics, follows a similar letter earlier this month signed by around 300 academics from across the world, who urged caution over the use of such tech tools and called for governments that choose to deploy digital contacts tracing to use privacy-preserving techniques and systems.

“We urge that the health benefits of a digital solution be analysed in depth by specialists from all relevant academic disciplines, and sufficiently proven to be of value to justify the dangers involved,” the UK academics write now, directing their attention at NHSX, the digital arm of the National Health Service which has been working on building a digital contacts tracing app since early March.

“It has been reported that NHSX is discussing an approach which records centrally the de-anonymised ID of someone who is infected and also the IDs of all those with whom the infected person has been in contact. This facility would enable (via mission creep) a form of surveillance.”

Yesterday the NHSX’s CEO, Matthew Gould, was giving evidence to the UK parliament’s Science and Technology committee. He defended the approach it’s taking — claiming the forthcoming app uses only “a measure of centralization”, and arguing that it’s a “false dichotomy” to say decentralized is privacy secure and centralized isn’t.

He went on to describe a couple of scenarios he suggested show why centralizing the data is necessary in the NHSX’s view. But in the letter the UK academics cast doubt on the validity of the central claim, writing that “we have seen conflicting advice from different groups about how much data the public health teams need”.

“We hold that the usual data protection principles should apply: collect the minimum data necessary to achieve the objective of the application,” they continue. “We hold it is vital that if you are to build the necessary trust in the application the level of data being collected is justified publicly by the public health teams demonstrating why this is truly necessary rather than simply the easiest way, or a ‘nice to have’, given the dangers involved and invasive nature of the technology.”

Europe has seen fierce debate in recent weeks over the choice of app architecture for government-backed coronavirus contacts tracing apps — with different coalitions forming to back decentralized and centralized approaches, and some governments pressuring Apple over the decentralized design of the cross-platform API for national coronavirus contacts tracing apps it’s developing with Android-maker Google.

Most of the national apps in the works in the region are being designed to use Bluetooth proximity as a proxy for calculating infection risk — with smartphone users’ devices swapping pseudonymized identifiers when near each other. However privacy experts are concerned that centralized stores of IDs risk creating systems of state surveillance as the data could be re-identified by the authority controlling the server.

Alternative decentralized systems have been proposed, using a p2p system with IDs stored locally. Infection risk is also calculated on device, with a relay server used only to push notifications out to devices — meaning social graph data is not systematically exposed.

Although this structure does require the IDs of people who have been confirmed infected to be broadcast to other devices — meaning there’s a potential for interception and re-identification attacks at a local level.
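The decentralized exchange described above can be sketched in a few lines of Python. This is an illustrative toy under simplifying assumptions (no key rotation, no timing windows, no transport security), not the code of any actual national protocol:

```python
import secrets

class Device:
    """Toy model of a phone in a decentralized contact-tracing scheme."""

    def __init__(self):
        self.my_ids = []        # ephemeral IDs this phone has broadcast
        self.seen_ids = set()   # IDs overheard from nearby phones

    def broadcast_id(self) -> bytes:
        """Generate a fresh pseudonymous identifier to advertise over Bluetooth."""
        eph_id = secrets.token_bytes(16)
        self.my_ids.append(eph_id)
        return eph_id

    def observe(self, eph_id: bytes):
        """Record an identifier overheard from a nearby phone."""
        self.seen_ids.add(eph_id)

    def check_exposure(self, infected_ids: list) -> bool:
        """Infection risk is computed on-device: the relay server only
        distributes the IDs of confirmed-infected users, never who met whom."""
        return any(i in self.seen_ids for i in infected_ids)

alice, bob = Device(), Device()
bob.observe(alice.broadcast_id())   # phones in proximity swap identifiers
# Alice tests positive and shares her broadcast IDs via the relay server;
# Bob's phone downloads that list and matches it locally.
assert bob.check_exposure(alice.my_ids)
```

Because matching happens on each handset, no server ever sees the contact graph, though as noted the broadcast list of infected users’ IDs is itself a target for local interception and re-identification.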

At this stage it’s fair to say that the momentum in Europe is behind decentralized approaches for the national contacts tracing apps. Notably Germany’s government switched from previously backing a centralized approach to decentralized earlier this week, joining a number of others (including Estonia, Spain and Switzerland) — which leaves France and the UK the highest profile backers of centralized systems for now.

France is also seeing expert debate over the issue. Earlier this week a number of French academics signed a letter raising concerns about both centralized and decentralized architectures — arguing that “there should be important evidence in order to justify the risks incurred” of using any such tracking tools.

In the UK, key concerns being attached to the NHSX app are not only the risk of social graph data being centralized and reidentified by the state — but also scope/function creep.

Gould said yesterday that the app will iterate, adding that future versions could ask people to voluntarily give up more data such as their location. And while the NHSX has said use of the app will be voluntary, if multiple functions get baked in that could raise questions over the quality of the consent and whether mission creep is being used as a lever to enforce public uptake.

Another concern is that a public facing branch of the domestic spy agency, GCHQ, has also been involved in advising on the app architecture. And yesterday Gould dodged the committee’s direct questions on whether the National Cyber Security Centre (NCSC) had been involved in the decision to select a centralized architecture.

There may be more concerns on that front, too. Today the HSJ reports that health secretary Matt Hancock recently granted new powers to the UK’s intelligence agencies which mean they can require the NHS to disclose any information that relates to “the security” of the health service’s networks and information systems during the pandemic.

Such links to database-loving spooks are unlikely to quell privacy fears.

There is also concern about how involved the UK’s data watchdog has been in the detail of the app’s design process. Last week the ICO’s executive director, Simon McDougall, was reported to have told a public forum he had not seen plans for the app, although the agency put out a statement on April 24 saying it was working with NHSX “to help them ensure a high level of transparency and governance”.

Yesterday Gould also told the committee the NHSX would publish data protection impact assessments (DPIAs) for each iteration of the app, though none has yet been published.

He also said the software would be “technically” ready to launch in a few weeks’ time — but could not confirm when the code would be published for external review.

In their letter, the UK academics call on NHSX to publish a DPIA for the app “immediately”, rather than dropping it right before deployment, to allow for public debate about the implications of its use and so that the claimed security and privacy safeguards can be publicly scrutinized.

The academics are also calling for the unit to publicly commit to no database or databases being created that would allow de-anonymization of users of the system (other than those self reporting as infected), and which could therefore allow the data to be used for constructing users’ social graphs.

They also urge the NHSX to set out details on how the app will be phased out after the pandemic has passed — in order “to prevent mission creep”.

Asked for a commitment on the database point, an NHSX spokesman told us that’s a question for the UK’s Department of Health and Social Care and/or the NCSC — which won’t salve any privacy concerns around the government’s wider plans for app users’ data.

We also asked when the NHSX will be publishing a DPIA for the app. At the time of writing we were still waiting for a response.

France postpones parliament debate on contact-tracing app

Earlier today at the National Assembly, French Prime Minister Édouard Philippe gave a lengthy speech about post-lockdown measures in France ahead of a debate with deputies. The government originally planned to discuss France’s contact-tracing app StopCovid. It broadened the subject of the debate to post-lockdown policies a couple of days ago.

But Philippe said that it was too early to discuss the rollout of StopCovid. According to him, as the app is not ready yet, it would be difficult to vet the advantages and risks of a contact-tracing app.

The section of Philippe’s speech on StopCovid wasn’t long, so here’s everything he had to say on contact tracing:

“Could we or should we leverage the extraordinary resources of digital tools in order to be more efficient? A European consortium has laid groundwork that should lead to the creation of the StopCovid app — the app would only be useful in addition to the health investigations that I described. Those investigations, whether they are physical or by phone, are essential but present a flaw.

They are sometimes hampered by the inability to establish transmission chains in urban areas, in the densest areas and in particular in public transit. It’s hard to notify the person who was in the same metro train at 7:46 on line 12. You don’t know them, they don’t know you and [public transport company] RATP doesn’t know either of you.

That’s the goal of the StopCovid project. It would make it possible to notify people who crossed the path of a positive individual in order to indicate what to do next without, of course, collecting any data on the identity of the person you encountered.

A large number of political leaders, from all parties, including the president of the National Assembly himself, told me about their concerns about this type of tool, and about the questions it would inevitably raise when it comes to civil liberties and individual liberties. Those questions, as I have said before, are justified. They should be asked, they should be debated. I even think that it should lead to a vote.

For now, given the uncertainty around this application, it would be quite difficult for me to tell you if it works and how it works exactly. I have no doubt that engineers will work tirelessly and will deliver on this project.

But, as that is not yet the case, I think that holding a debate would be premature. But I want to reaffirm my pledge. When the application currently in development is working, and before its release, we will hold a debate dedicated to StopCovid followed by a vote.

French Prime Minister Edouard Philippe delivers a speech as he presents the government’s plan to exit from the lockdown situation at the French National Assembly (Photo by David Nivière / Pool / AFP / Getty Images)

While it seems like the French government is going back to the drawing board, the Digital Ministry told TechCrunch that it is still going ahead with its controversial ROBERT protocol.

“I can confirm that French authorities only work on a solution that avoids the circulation of a list (even encrypted) of Covid+ people as in the Robert protocol. The discussions with Germany haven’t stopped to find the best architecture,” a spokesperson for the Digital Ministry told TechCrunch.

Last week, research institutes Inria and Fraunhofer released the specifications of the protocol that the French and German governments planned to use at the time. Named ROBERT, it is a centralized protocol that uses pseudonymization to match contacts on a central server. A national authority manages the central servers.

But you have to trust that your government isn’t doing anything nefarious on its servers without telling you — a centralized design could lead to abuse and even state surveillance.

On Sunday, Germany reversed its stance and abandoned ROBERT altogether. German authorities are now working on a decentralized protocol that leverages smartphones to compute social interactions. Ephemeral IDs are stored on your device, and you can choose to share them with a relay server, which distributes them to the community of app users.

Yesterday, hundreds of French cryptography and security researchers signed a letter to raise awareness about the potential risks of a contact-tracing app. “It is essential to thoroughly analyze the health benefits of a digital solution with specialists — there should be important evidence in order to justify the risks incurred,” the letter said.

The French government is now also betting heavily on health investigators, who will contact everybody who has tested positive and list their social interactions. Investigators can then contact those other individuals and tell them to get tested as soon as possible.

TechCrunch’s Natasha Lomas contributed to this report

UK’s coronavirus contacts tracing app could ask users to share location data

More details have emerged about a coronavirus contacts tracing app being developed by UK authorities. NHSX CEO, Matthew Gould, said today that future versions of the app could ask users to share location data to help authorities learn more about how the virus propagates.

Gould, who heads up the digital transformation unit of the UK’s National Health Service, was giving evidence to the UK parliament’s Science & Technology Committee today.

At the same time, ongoing questions about the precise role of the UK’s domestic spy agency in key decisions about the NHSX’s choice of a centralized app architecture mean privacy concerns are unlikely to go away — with Gould dodging the committee’s questions about GCHQ’s role.

A basic version of the NHSX’s coronavirus contacts tracing app is set to be tested in a small geographical region in the next 1-2 weeks, per Gould — who said “technically” it would be ready for a wider rollout in 2-3 weeks’ time.

Although he emphasized that any launch would need to be part of a wider government strategy which includes extensive testing and manual contacts tracing, along with a major effort to communicate to the public about the purpose and importance of the app as part of a combined response to fighting the virus.

In future versions of the app, Gould suggested users could be asked to contribute additional data — such as their location — in order to help epidemiologists identify infection hot spots, while emphasizing that such extra contributions would be voluntary.

“The app will iterate. We’ve been developing it at speed since the very start of the situation but the first version that we put out won’t have everything in it that we would like,” he said. “We’re quite keen, though, that subsequent versions should give people the opportunity to offer more data if they wish to do so.

“So, for example, it would be very useful, epidemiologically, if people were willing to offer us not just the anonymous proximity contacts but also the location of where those contacts took place — because that would allow us to know that certain places or certain sectors or whatever were a particular source of proximity contacts that subsequently became problematic.”

“If people were willing to do that — and I suspect a significant proportion of people would be willing to do that — then I think that would be very important data because that would allow us to have an important insight into how the virus was propagated,” he added.

For now, the basic version of the contacts tracing app the NHSX is devising is not being designed to track location. Instead, it will use Bluetooth as a proxy for infection risk, with phones that come into proximity swapping pseudonymized identifiers that may later be uploaded to a central server to calculate infection risk related to a person’s contacts.

Bluetooth proximity tracking is now being baked into national contacts tracing apps across Europe and elsewhere, although app architectures can vary considerably.

The UK is notable for being one of now relatively few European countries that have opted for a centralized model for coronavirus contacts tracing, after Germany switched its choice earlier this week.

France is also currently planning to use a centralized protocol. But countries including Estonia, Switzerland and Spain have said they will deploy decentralized apps — meaning infection risk calculations will be performed locally, on device, and social graph data will not be uploaded to a central authority.

Centralized approaches to coronavirus contact tracing have raised substantial privacy concerns as social graph data stored on a central server could be accessed and re-identified by the central authority controlling the server.

Apple and Google’s joint effort on a cross-platform API for national coronavirus contacts tracing apps is also being designed to work with decentralized approaches — meaning countries that want to go against the smartphone platform grain may face technical challenges such as battery drain and usability issues.

The committee asked Gould about the NHSX’s decision to develop its own app architecture, which means having to come up with workarounds to minimize issues such as battery drain because it won’t just be able to plug into the Apple-Google API. Yesterday the unit told the BBC how it’s planning to do this, while conceding its workaround won’t be as energy efficient as being able to use the API.

“We are co-operating very closely with a range of other countries. We’re sharing code, we’re sharing technical solutions and there’s a lot of co-operation but a really key part of how this works is not just the core Bluetooth technology — which is an important part of it — it’s the backend and how it ties in with testing, with tracing, with everything else. So a certain amount of it necessarily has to be embedded in the national approach,” said Gould, when asked why NHSX is going to the relative effort and hassle of developing its own bespoke centralized system rather than making use of protocols developed elsewhere.

“I would say we are sensibly trying to learn international best practice and share it — and we’ve shared quite a lot of the technological progress we’ve made in certain areas — but this has to embed in the wider UK strategy. So there’s an irreducible amount that has to be done nationally.”

On not aligning with Apple and Google’s decentralized approach specifically, he suggested that waiting for their system-wide contact tracing product to be released — due next month — would “slow us down quite considerably”. (During the committee hearing it was confirmed the first meeting relating to the NHSX app took place on March 7.)

While on the wider decision not to adopt a decentralized architecture for the app, Gould argued there’s a “false dichotomy” that decentralized is privacy secure and centralized isn’t. “We firmly believe that both our approach — though it has a measure of centralization in as much as you’re uploading the anonymized identifiers in order to run the cascades — nonetheless preserves people’s privacy in doing so,” he said.

“We don’t believe that’s a privacy endangering step. But also by doing so it allows you to see the contact graph of how this is propagating and how the contacts are working across a number of individuals, without knowing who they are. That allows you to do certain important things that you couldn’t do if it was just phone to phone propagation.”

He gave the example of detecting malicious use of contacts tracing, which he said is helped by being able to acquire social graph data. “One of the ways you can do that is looking for anomalous patterns. Even if you don’t know who the individuals are you can see anomalous propagation, which the approach we’ve taken allows,” he said. “We’re not clear that a decentralized approach allows.”

Another example he gave was a person declaring themselves symptomatic and a cascade being run to notify their contacts and then that person subsequently testing negative.

“We want to be able to release all the people that have been given an instruction to isolate previously on the basis of [the false positive person] being symptomatic. If it was done in an entirely decentralized way that becomes very difficult,” he suggested. “Because it’s all been done phone to phone you can’t go back to those individuals to say you don’t have to be locked down because your index case turned out to be negative. So we really believe there are big advantages the way we’re doing it. But we don’t believe it’s privacy endangering.”
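Gould's revocation argument boils down to the server retaining which contacts were notified on behalf of which anonymized index case. A toy sketch of that centralized bookkeeping, with all names illustrative rather than taken from the NHSX design:

```python
from collections import defaultdict

class CentralServer:
    """Toy model of a centralized cascade: because the server stores which
    index case triggered which notifications, it can later 'release' those
    contacts if the index case tests negative."""

    def __init__(self):
        # anonymized index-case ID -> contact IDs notified on their behalf
        self.cascades = defaultdict(set)

    def report_symptoms(self, index_id: str, contact_ids: set) -> set:
        """Index case uploads the IDs they encountered; server notifies them."""
        self.cascades[index_id] |= contact_ids
        return set(contact_ids)  # contacts instructed to isolate

    def report_negative_test(self, index_id: str) -> set:
        """Index case tests negative: the server can tell the same contacts
        to stand down, because it retained the cascade."""
        return self.cascades.pop(index_id, set())

server = CentralServer()
isolating = server.report_symptoms("idx-1", {"c-1", "c-2"})
released = server.report_negative_test("idx-1")
assert released == isolating == {"c-1", "c-2"}
```

The trade-off the article describes is visible in the sketch: the `cascades` mapping is exactly the centrally held social graph data that privacy critics object to.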

Responding to the latter claim, Dr Michael Veale — a lecturer in digital rights and regulation at UCL who is also one of the authors of a decentralized protocol for contacts tracing, called DP-3T, that’s being adopted by a number of European governments — told us: “It is trivial to extend a decentralised system to allow individuals to upload ‘all clear’ keys too, although not something that DP-3T focussed on building in because to my knowledge, it is only the UK that wishes to allow these cascades to trigger instructions to self-isolate based on unverified self-reporting.”

In the decentralized scenario, “individuals would simply upload their identifiers again, flagging them as ‘false alarm’, they would be downloaded by everyone, and the phones of those who had been told to quarantine would notify the individual that they no longer needed to isolate”, Veale added — explaining how a ‘false alarm’ notification could indeed be sent without a government needing to centralize social graph data.
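Veale's "false alarm" extension can likewise be sketched without any server-side graph: the cleared identifiers are simply published alongside the infected ones, and each phone reconciles them locally. A hypothetical illustration (not DP-3T code):

```python
def exposure_status(heard_ids: set, infected_ids: set, cleared_ids: set) -> str:
    """Runs on-device. Identifiers republished in cleared_ids cancel the
    earlier exposure alerts they triggered."""
    active = (infected_ids - cleared_ids) & heard_ids
    return "isolate" if active else "all clear"

heard = {"id-a", "id-b"}
# An index case self-reports symptoms and publishes "id-a":
assert exposure_status(heard, {"id-a"}, set()) == "isolate"
# The index case then tests negative and republishes "id-a" as a false alarm:
assert exposure_status(heard, {"id-a"}, {"id-a"}) == "all clear"
```

In this design the "release" notification never requires the authority to know who was told to isolate; each phone works it out for itself.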

The committee also asked Gould directly whether UK spy agency, GCHQ, was involved in the decision to choose a centralized approach for the app. The BBC reported yesterday that experts from the cyber security arm of the spy agency, the National Cyber Security Centre (NCSC), had aided the effort.

At first pass Gould dodged the question. Pressed a second time he dodged a direct answer, saying only that the NCSC were “part of the discussions in which we decided to take the approach that we’ve taken”.

“[The NCSC] have, along with a number of others — the Information Commissioner’s Office, the National Data Guardian, the NHS — been advising us. And as the technical authority for cyber security I’m very glad to have had the NCSC’s advice,” he also said.

“We have said we will open source the software, we have said we will publish the privacy model and the security model that’s underpinning what we’re going to do,” he added. “The whole model rests on people having randomized IDs so the only point in the process at which they need to say to us who they are is when they need to order a test having become symptomatic because it’s impossible to do that otherwise.

“They will have the choice both to download the app and turn it on but also to upload the list of randomized IDs of people they’ve been in touch with. They will also have the choice at any point to delete the app and all the data that they haven’t shared with us up to that point with it. So I do believe that what we’ve done is respectful of people’s privacy but at the same time effective in terms of being able to keep people safe.”
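The rotating randomized IDs Gould describes are a standard construction: derive short-lived broadcast identifiers from a random key, so overheard identifiers cannot be linked to a person, while the key holder can still reproduce them for matching. A generic sketch of that idea, not the NHSX scheme:

```python
import hashlib
import secrets

def daily_key() -> bytes:
    """A fresh random key, generated on the phone."""
    return secrets.token_bytes(16)

def ephemeral_id(key: bytes, interval: int) -> bytes:
    """Derive the identifier broadcast during a given time interval.
    Successive intervals yield identifiers that look unrelated to observers."""
    return hashlib.sha256(key + interval.to_bytes(4, "big")).digest()[:16]

key = daily_key()
# Different intervals produce different-looking identifiers...
assert ephemeral_id(key, 0) != ephemeral_id(key, 1)
# ...but the same key and interval always reproduce the same identifier,
# which is what allows later matching against observed IDs.
assert ephemeral_id(key, 0) == ephemeral_id(key, 0)
```

Deleting the app and its local data, as Gould describes, would discard the key, severing any remaining link between the person and their past identifiers.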

Gould was unable to tell the committee when the app’s code will be open sourced, or even confirm it would happen before the app was made available. But he did say the unit is committed to publishing data protection impact assessments — claiming this would be done “for each iteration” of the app.

“At every stage we will do a data protection impact assessment, at every stage we’ll make sure the information commissioner knows what we’re doing and is comfortable with what we’re doing, so we will proceed carefully and make sure what we do is compliant,” he said.

At another point in the hearing, Lilian Edwards, a professor of law, innovation and society at Newcastle Law School who was also giving evidence, pointed out that the Information Commissioner’s Office’s executive director, Simon McDougall, told a public forum last week that the agency had not in fact seen details of the app plan.

“There has been a slight information gap there,” she suggested. “This is normally a situation with an app that is high risk, involving very sensitive personal data — where there is clearly a GDPR [General Data Protection Regulation] obligation to prepare a Data Protection Impact Assessment — where one might have thought that prior consultation and a formal sign off by the ICO might have been desirable.”

“But I’m very gratified to hear that a Data Protection Impact Assessment is being prepared and will be published and I think it would be very important to have a schedule on that — at least at some draft level — as obviously the technical details of the app are changing from day to day,” Edwards added.

We’ve reached out to the ICO to ask if it’s seen plans for the app or any data protection impact assessment now.

During the committee hearing, Gould was also pressed on what will happen to data sets uploaded to the central server once the app has been retired. He said such data sets could be used for “research purposes”.

“There is the possibility of being able to use the data subsequently for research purposes,” he said. “We’ve said all along that the data from the app — the app will only be used for controlling the epidemic, for helping the NHS, public health and for research purposes. If we’re going to ask people if we can keep their data for research purposes we will make that abundantly clear and they’ll have the choice on whether to do so.”

Gould followed up later in the session by adding that he didn’t envisage such data sets being shared with the private sector. “This is data that will be probably under the joint data controllership of DHSC and NHS England and Improvement. I see no context in which it would be shared with the private sector,” he said, adding that UK law does already criminalize the reidentification of anonymized data.

“There are a series of protections that are in place and I would be very sorry if people started talking about sharing this data with the private sector as if it was a possibility. I don’t see it as a possibility.”

In another exchange during the session Gould told the committee the app will not include any facial recognition technology, although he was unable to entirely rule out some role for the tech in future public health-related digital coronavirus interventions, such as certification of immunity.