
15 June 2020

Will the government's contact tracing app be legal?

Swee Leng Harris

SWEE LENG HARRIS: There are concerns that people will be discriminated against


Like health services in many other countries, the UK’s National Health Service (NHS) is developing a contact tracing app to help stop the spread of Covid-19. Contact tracing as such is not unique to coronavirus – it has also been used in the response to Ebola, for example – but the idea of using digital technology instead of or alongside human contact tracers is relatively new.


The government has stressed that using the app will be voluntary, but at present there is no legal protection in the UK against people being coerced into downloading and using it. Employers could, for example, require workers to use the app as a condition of reopening workplaces – something that has already happened in Australia.


There are related concerns about discrimination. In March, there were allegations of NHS workers being evicted over fears about Covid-19. Against that backdrop, some landlords may well require tenants to use the app in order to rent a home. If access to housing depends on using the app, is it really still voluntary?


The app’s algorithm makes an automated decision to notify users based on a risk score it assigns to them. If someone’s risk is deemed high enough, they are directed to further guidance on how to self-isolate and take any other precautions.
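The NHS has not published the full detail of its risk-scoring model, but in outline the decision is a threshold check made without human review. The sketch below, in Python, is a hypothetical illustration only: the scoring formula, the threshold value and the field names are assumptions invented for the example, not the app’s actual logic.

    # Hypothetical sketch of an automated exposure-notification decision.
    # The scoring formula, threshold and field names are invented for
    # illustration; they are not the NHS app's actual model.
    from dataclasses import dataclass

    @dataclass
    class Contact:
        duration_minutes: float   # how long two devices were near each other
        proximity_weight: float   # closer contact gives a higher weight (0 to 1)

    RISK_THRESHOLD = 100.0        # invented cut-off for the example

    def risk_score(contacts):
        # Sum a simple duration-times-proximity score over recorded contacts.
        return sum(c.duration_minutes * c.proximity_weight for c in contacts)

    def should_notify(contacts):
        # The automated decision: no human reviews it before the user is
        # directed to self-isolation guidance.
        return risk_score(contacts) >= RISK_THRESHOLD

    if __name__ == "__main__":
        contacts = [Contact(45, 0.9), Contact(90, 0.8)]
        print("Notify user" if should_notify(contacts) else "No notification")

Whatever the real model looks like, the point is the same: the direction to self-isolate follows mechanically from a score crossing a threshold, with no human in the loop.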


From what the NHS has said, it appears that there is no human oversight of this process. The government’s Data Protection Impact Assessment (DPIA) for the app confirms this, and states that such automated decisions are authorised by the Health Service (Control of Patient Information) Regulations 2002. This kind of automated decision-making is also governed by Article 22 of the GDPR.


But on closer inspection, the legal grounds for this are shakier than they seem – the lack of human intervention raises important questions about the legal safeguards for rights and freedoms.


Michael Veale argues that the 2002 regulations only authorise data processing, not decision-making. Data processing includes a range of activities, such as collecting, using or disseminating data – but this is distinct from automated decisions that result from such processing. So, while the regulations provide a lawful basis for the app’s data processing, they do not authorise automated decisions.


Furthermore, even if the 2002 regulations did authorise automated decisions, the GDPR requires that any authorising law also provide safeguards for rights, freedoms and legitimate interests – and the regulations fail this test. For example, they contain no safeguard to prevent people from being required to use the app in order to access services – a protection of the kind proposed in Professor Lilian Edwards’ draft bill.


By contrast, in Australia, there is a Ministerial Determination and proposed primary legislation to ensure people are not required to download a contact tracing app and cannot be denied access to services or work because of a refusal to use it. Consequently, in the example noted above, the employer was ultimately unable to force its workers to use the Australian app.


The UK government might argue that its app falls within an exception in the GDPR because users have consented to download and use it. However, if there are no legal protections against people being effectively coerced into using the app for fear of losing work or being denied key services – and people are in fact so coerced – then that consent-based justification for the automated decision is in doubt.


Putting to one side questions about the UK app’s merits or efficacy, relying on the 2002 regulations as authorising law when those regulations fall far short of the GDPR’s requirements is a poor precedent for the government to set. This legal problem should be remedied by legislation that authorises the automated decisions while providing safeguards for people’s rights, freedoms and legitimate interests.


Swee Leng Harris is Principal, Data & Digital Rights, at Luminate, and a Visiting Senior Research Fellow at the Policy Institute, King’s College London.
