Questions and answers about coronavirus and the UK economy

Do we make informed decisions when sharing our personal data?

The use of digital technologies to fight Covid-19 raises questions about public access to private data. It also shines a light on how people make economic decisions with regards to their personal data and information.

In the fight against Covid-19, policy-makers have increasingly explored the role that digital technologies could play in reducing the rate of infection. Proposed technological solutions such as symptom- and contact-tracing apps rely heavily on accessing citizens’ personal data. While exceptional times often lead to calls for exceptional measures, there are concerns about the willingness and ability of people to trade personal information for personal freedoms. 

People may have concerns about the protection of their personal data, or they may not see the full costs and benefits of sharing it. If a sizeable share of society is unwilling or unable to provide their data, this would call into question the viability of large-scale data-based policy measures. There is also the risk of setting a precedent for surveillance that persists once the pandemic is over.

What is the relevance of different factors influencing decisions about personal data, and where are there gaps in the research evidence?

What does the evidence from economic research tell us?

  • The combined value to society of sharing certain personal data is much larger than the sum of the individual valuations (Coyle et al, 2020). For example, the more people provide their location history in combination with their health and isolation status, the higher the potential to reduce the basic reproduction number. A study in South Korea shows that public disclosure of extensive private information could reduce the economic costs by 50% (Argente et al, 2020). 
  • There seems to be a higher willingness to share data when the health of people is concerned and data are accessed by public health providers (Ipsos MORI, 2016). This is also supported by the fact that less than 3% of patients registered with the NHS have opted out of making their confidential patient data available for research and planning (NHS Digital, 2019).
  • The fear that some personal data will become identifiable stops some people from sharing their data in the first place. According to the UK’s communications regulator Ofcom (2019), around two-thirds of adults in the UK have concerns about hacking and security (for example, theft of personal information or private information being made public), while almost half worry about data privacy (for example, unconsented processing of personal information or government surveillance). 
  • Digital markets make it very difficult for people to make informed decisions about their desired level of privacy, as they often do not know what data are being collected about them, and for what purpose (Acquisti et al, 2016). Evidence shows that most adults in the UK know little or nothing about how governments use their data, or how much data they hold (Boyon and Wallard, 2019). When transparency is low and little detail or scientific evidence is provided, fewer people are willing to provide their personal data to fight the pandemic.
  • People are more willing to provide personal data when the risk of infection is higher. A survey of UK adults finds that at present only around 45% would ‘definitely install’ a contact tracing app. But this number rises to 58% if there were a confirmed infection within the community, and 63% in the case of a confirmed infection among close contacts (Abeler et al, 2020). Around 80% of smartphone owners would have to install such an app to reach sufficient coverage. This compares with less than 60% who have installed WhatsApp, the UK’s most used app. 
  • The value of personal data is context-dependent, and privacy can either enhance or reduce social wellbeing (Acquisti et al, 2016). While privacy and security are often presented as a direct trade-off, this is at best a simplistic and sometimes counter-productive view of highly complex issues. 
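The importance of reaching high adoption can be illustrated with a simple back-of-the-envelope calculation (a sketch, not taken from the studies cited above): a proximity-based app can only record a contact if both people involved have it installed, so under the simplifying assumption that installation is independent across individuals, the share of contacts covered scales roughly with the square of the adoption rate.

```python
# Illustrative sketch only: assumes app installation is independent across
# individuals, so a contact between two random people is recorded only if
# both have the app -- coverage of contacts ~ adoption rate squared.

def contact_coverage(adoption_rate: float) -> float:
    """Approximate share of contacts occurring between two app users."""
    return adoption_rate ** 2

for rate in (0.45, 0.58, 0.80):
    print(f"adoption {rate:.0%} -> contacts covered {contact_coverage(rate):.0%}")
# adoption 45% -> contacts covered 20%
# adoption 58% -> contacts covered 34%
# adoption 80% -> contacts covered 64%
```

On this rough logic, even the 80% adoption figure cited above would capture only around two-thirds of contacts, which helps explain why coverage thresholds for contact-tracing apps are set so high.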

How reliable is the evidence? 

Survey questions are often vague and fail to distinguish between different types of data and scenarios of personal freedoms. For example, a recent survey shows that half of adults in the UK would support the use of phone tracking to locate and also penalise those who did not comply with the restrictions (Ipsos MORI, 2020). It is not clear what type of penalisation respondents were supposed to have in mind. 

The same survey shows that 65% would be willing to use ‘mobile phone roaming data’ to ‘track people diagnosed with Covid-19 and their close contacts’. The study concludes that the ‘majority of Britons support government using mobile data for surveillance to tackle the crisis’ (Ipsos MORI, 2020). Again, it is unclear whether respondents understood what ‘roaming data’ involves, or what form of tracking they had in mind. 

Most empirical evidence on privacy preferences and use of personal data is based on rapidly implemented surveys. Sample sizes are generally small (around 1,000-2,000), although some attention has been given to making them representative of the general population. Considering the small sample sizes of existing surveys, only limited conclusions can be drawn regarding differences across regions, ethnicity or income. 

What further evidence is needed? 

It is not clear how effective automated contact-tracing via apps actually is. Findings from Iceland, where almost 40% of the population have installed such an app, indicate that the impact of automated tracing has been limited (Johnson, 2020).

While personal data have significant economic value, it is unknown how the data could be used to create greater public good beyond the immediate uses to track and trace the virus. Survey evidence highlights that two-thirds of people in the UK would be willing to have their data de-identified and made available for university researchers to better prepare for future pandemics (Abeler et al, 2020).

Our understanding of the potential economic and social costs of a major government data breach is limited. This matters as potential costs are higher whenever data are centrally collected and held, as in the case of the initial version of the proposed NHS app. 

There is a risk of treating ‘personal data’ as a catch-all term, when in fact there are important differences between types of data. While some people might be willing to share limited location data, one cannot assume that the same applies to phone numbers of friends and relatives, credit card histories, medical records or the use of facial recognition tracking in public spaces. We also need more evidence on what type of data people are willing to trade for what purposes (for example, reopening schools, workplaces, gyms, museums, foreign travel, etc). 

Many factors can influence the willingness of people to install (and keep) a contact tracing app, including the type of technology used, types of data that are accessed, and the types of personal freedoms offered in exchange. We need a better understanding of specific trade-offs that people are willing to make. Some of the most pressing questions include:

  • Where data will be stored (on device or centrally).
  • Whether symptoms and confirmed infections are self-reported or entered by a doctor.
  • Whether location data will be used to prosecute those who leave their home when they should self-isolate.
  • What the rate of false positives and negatives is. 
  • Whether precise location history is recorded based on GPS or only proximity to others based on Bluetooth.
  • Whether data have expiration dates after which they are automatically deleted.

Evidence shows that people are often not fully aware of the type and amount of information companies and governments are collecting about them. This is important because the modern economy is a digital economy where personal data are a key factor of production. When people are not fully aware of what is happening with their personal data, their decisions about their desired level of privacy can be biased. 

While many people welcome the idea of using data to tackle the current pandemic (as in the fight against terrorism), there is a risk that they do not have a full understanding of what type of data is used for what purpose, who has access to it and for how long. More evidence is urgently needed, as some people have started to call for contact tracing apps to be installed ‘by default’.

Where can I find out more? 

Why is contact tracing useful? Johannes Abeler and colleagues discuss issues around data protection, personal freedoms and economic costs of the lockdown. 

Exit through the app store? In this evidence review, the Ada Lovelace Institute concludes that there is not enough evidence to date that would support use of digital contact tracing. 

Tracking the global response to COVID-19: Privacy International provides a detailed global tracker of known measures (public and private) that trade privacy for people’s freedoms. 

Society needs to be consulted about tech solutions surrounding COVID-19: Edgar Whitley explores the technological choices related to privacy when politicians need to make decisions on behalf of society. 

Big data versus COVID-19: opportunities and privacy challenges: Scott Marcus describes how digital technologies can help to fight the pandemic, what the risks of using big data are and what governments should do to mitigate them. 

Who are experts on this question?

  • Dr Johannes Abeler is an Associate Professor in the Department of Economics at the University of Oxford. He has led a large survey on attitudes towards contact-tracing apps in the UK and other countries and has also written about preferences to share private information. 
  • Dr Jeni Tennison, Vice President and Chief Strategy Adviser at the Open Data Institute. Some of her work explores issues on data ethics and privacy, data ecosystems, and the value of data. 
  • Prof Alessandro Acquisti is a Professor of Information Technology and Public Policy at the Heinz College, Carnegie Mellon University. He has written extensively on the trade-offs involving economics, privacy and technology. 
  • Dr Marion Oswald is Vice-Chancellor’s Senior Fellow in Law at the University of Northumbria. In her research she explores the interaction between law and digital technology, with a focus on the legal, ethical and social issues arising from sharing personal data. 
  • Prof Peter Fussey is a professor of Sociology at the University of Essex. His research explores a wide range of issues (and their interconnectedness), including surveillance, digital technology, security, human rights, and society. 

Author: David Nguyen

Published on: 27th May 2020

Last updated on: 25th Jun 2020

Funded by

UKRI Economic and Social Research Council