
Surveillance as a First Response?

Desperate times call for desperate measures, or so the adage goes. And these are desperate times. With a rising death toll, increased unemployment, and a virus that continues to befuddle us all, it is no wonder that our governments, weakened by years of austerity, are reaching for extraordinary tools to flatten the curve. But while measures may be desperate, they should also be appropriate, lawful and just. It’s unlikely that the NHS's current plan to build a large-scale Covid-19 datastore meets any of those requirements.

 

Last March, the NHS announced a new plan to build a datastore that aggregates Covid-19 health data: ‘To provide a single source of truth about the rapidly evolving situation, data will then be integrated, cleaned, and harmonised in order to develop the single and reliable information that is needed to support decision-making.’ The datastore will, furthermore, facilitate data sharing between the NHS, government bodies and private partners, specifically Microsoft, Amazon, Faculty, Palantir and Google.


The plan is not without its critics. One Whitehall source described the amount of data held by the store as ‘unprecedented’ and said it was collected ‘with insufficient regard for privacy, ethics or data protection’. In a legal opinion, Ryder, Craven, Sarathy and Naik conclude that the plan itself ‘does not comply, thus far, with data protection principles’.


Critics further raised concerns about the companies the NHS is partnering with:

  • Amazon has come under attack for not sufficiently protecting and compensating its warehouse workers. 

  • Palantir received bad press for its role in the deportation of immigrants from the US, as well as for its far-reaching surveillance technology. It is, moreover, unclear why Palantir is willing to do this work for just one pound. What does the company hope to gain in return?

  • A previous data sharing agreement between Google’s former sister company DeepMind (now Google Health) and various NHS Trusts resulted in a nationwide scandal.

  • Faculty, an artificial intelligence startup, received seven government contracts in the past 18 months. It is unclear how the company, which previously worked on the Vote Leave campaign, obtained those contracts, and how its personal ties to Downing Street’s elite factored into those decisions.


To ease our unease, the NHS promises to comply with the General Data Protection Regulation. It promises that all data will remain under the control of the NHS, will only be used for the Covid-19 response and will be anonymised. Once the pandemic ends, the data will be destroyed or returned to its previous databases. The announcement further mentions ‘strict contractual agreements’ between the NHS and its partners, but leaves vague what those agreements entail. Finally, it states that the NHS will follow ‘established principles of openness and transparency’.


However, we already know how easily promises can be broken. The aforementioned DeepMind hoovered up 1.6 million patient records through data sharing agreements with the Royal Free NHS Foundation Trust. The company promised never to integrate the data with other Google datasets. Yet that is exactly what happened when DeepMind merged into Google Health, just one year later.


It does not bode well that it took Guardian journalists only two weeks to get their hands on documents that challenge the promises made by the NHS. They found that, even though the data is anonymised, the datastore may still include personal information, such as gender, postcodes and symptoms. The documents also cast doubt on how much control the NHS will have over the data: ‘the Guardian was able to see confidential documents used by Palantir, Faculty and NHSX officials to plan, develop and execute the Covid-19 datastore.’
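A toy sketch helps show why anonymisation alone offers such weak protection. The Python below uses entirely made-up records (not the datastore’s actual schema): once quasi-identifiers like gender, postcode and age remain, a dataset stripped of names can still single a person out.

```python
# Toy illustration with made-up records (not the datastore's actual schema):
# "anonymised" data that keeps quasi-identifiers can still single people out.
records = [
    {"gender": "F", "postcode": "SW1A", "age_band": "30-39", "symptoms": "cough"},
    {"gender": "M", "postcode": "SW1A", "age_band": "30-39", "symptoms": "fever"},
    {"gender": "F", "postcode": "E1",   "age_band": "60-69", "symptoms": "fever"},
]

# Someone who merely knows that their neighbour is a woman in her thirties
# living in SW1A can filter the "anonymous" dataset down to a single record:
matches = [r for r in records
           if r["gender"] == "F"
           and r["postcode"] == "SW1A"
           and r["age_band"] == "30-39"]

print(len(matches))            # 1 -- the record is unique, so anonymity is gone
print(matches[0]["symptoms"])  # ...and the sensitive attribute is exposed
```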


But the main problem lies not in the details provided; it’s that, even though the plans were announced well over a month ago, we still know far too little to judge whether the promises made are being upheld. Attempts by the legal campaign group Foxglove to obtain more information through conventional channels have failed, and the group is now preparing to go to court.


Key questions

The effective use of data can help us formulate adequate public health responses and assist in the coordination of our actions. But do we really need to surrender control over our most sensitive data to big tech in order to stay healthy? Tony Blair’s think tank would certainly like you to think so. It went so far as to say that ‘surveillance is a price we pay for fighting Covid-19’. Privacy, apparently, is a luxury we can no longer afford. But the choice between health and privacy is a false one. In fact, the erosion of our data rights may impair rather than improve our health, if it makes us less likely to confide in our doctors and to trust our governments.


Before we bargain away our future freedoms in an ill-fated attempt to save the present, we may want to ask ourselves and our elected officials whether the proposed measures address a real need. Then we should continue to ask what the opportunity costs of the proposed plans are, what risks they pose to our democratic values and to those most vulnerable, whom we can hold accountable when things go wrong, and what such accountability might look like.


Let’s go through these questions in turn!


Is there a real need for this solution?

Anyone who has ever been anxious knows that anxious brains love to hoard data. But before we let our anxieties determine public policy, we may want to ask what problem we hope to solve, whether the proposed solution is indeed solving that problem and whether it’s the best way to do so.


What are we not doing while we are doing this?

Solving problems is great, but with limited resources we need to prioritise needs. While we’re busy building datastores and contact-tracing apps, many other needs go ignored. Those include the need for more testing; the known needs of the (recently) unemployed, who can no longer afford to pay rent; the seasonal workers who live in such close proximity to one another that they are at greater risk of exposure; the victims of domestic abuse; and the essential workers who continue to work without proper protection or proper compensation.


Sometimes we can address many needs at once; sometimes we cannot, and we are faced with difficult trade-offs. Understanding the opportunity costs of new policies is vital.


How does outsourcing shift power from the public to the private sector?

The NHS did not merely propose to build surveillance technology; it is outsourcing both the development of the platforms and the processing of our data to private partners. How does doing so shift the balance of power from the public sector to the private sector? What measures are in place to prevent the tools being built from locking us in? Just as it’s hard to leave Facebook because we cannot take our social graph with us, it may prove equally hard to leave a data platform built by Palantir without rendering the underlying data unusable. Finally, who owns the rights to the insights and services built on top of that data?


In short, by outsourcing to third parties, what long-term dependencies do we create that cannot simply be undone?


Who is in control?

Closely related to the question of power is the question of who is in control. Who gets to decide what data can be collected, how that data can be accessed and what it can be used for? Not just in theory, but also in practice. Do the governments currently setting up datastores have sufficient knowledge and public support to make these decisions? Or will they leave the de facto decision-making and control to the corporations they hired to build the infrastructure? What role do individual NHS Trusts play in deciding this? Do physicians get a say in how their patients’ data is used? What about the patients themselves?


Who is most at risk? And how do we protect them?

When estimating the risks of data sharing efforts, it’s not enough to anonymise data (if that’s even possible), and it’s not enough to rely on individual consent alone. For one, a lot of data about us also describes someone else: DNA data about you also describes your family members. But more than that, even when data is only about a single person, and even when that data is fully anonymised, it can still be used to infer attributes about someone else. Consider, for instance, the sharing of Fitbit data with health insurance companies in return for benefits. If all the healthy people decide to share their data, but the less healthy people do not, what inferences could an insurance company make about the latter group? And how would that affect their insurance premiums?
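To make that inference concrete, here is a back-of-the-envelope sketch in Python, with entirely made-up numbers: once sharing correlates with health, the mere fact that someone did not share becomes a signal an insurer can act on via Bayes’ rule.

```python
# Toy illustration (made-up numbers): how opting out can itself leak information.
p_healthy = 0.60          # prior: share of healthy people in the population
p_share_healthy = 0.80    # fraction of healthy people who opt in to sharing
p_share_unhealthy = 0.20  # fraction of less healthy people who opt in

# Total probability of not sharing:
# P(no share) = P(no share | healthy) P(healthy) + P(no share | unhealthy) P(unhealthy)
p_no_share = (1 - p_share_healthy) * p_healthy \
           + (1 - p_share_unhealthy) * (1 - p_healthy)

# Bayes' rule: P(healthy | no share)
p_healthy_given_no_share = (1 - p_share_healthy) * p_healthy / p_no_share

print(f"Prior P(healthy): {p_healthy:.0%}")                                     # 60%
print(f"Posterior P(healthy | did not share): {p_healthy_given_no_share:.0%}")  # ~27%
# The prior of 60% drops to roughly 27%: refusing to share has itself become
# a health signal that could feed straight into a premium calculation.
```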


Before embarking on large-scale data sharing, we need to understand who this puts at risk and in what ways. A risk analysis, as well as measures to protect the most vulnerable, should be in place before a single line of code is written.


What is the exit strategy?

For how long will the data be collected, and what happens when that period ends? If the exit strategy depends on the pandemic ending, what criteria are used to determine when the pandemic is indeed over?


Have alternatives been considered?

We have access to alternative data governance models, such as data trusts and data commons. These would enable us to entrust our data rights to parties whose sole responsibility is to look out for our best interests, rather than the interests of a company. Such models grant us greater collective control over our data, while still allowing it to be shared with appropriate users, for specific purposes. When evaluating whether any approach to solving a problem is the right one, these alternatives need to be considered as well.


Are the measures transparent? Who is accountable?

Emergencies call for rapid responses, but the public should still be involved in the decision-making and be given the documentation needed to understand the agreements in place. In addition, an external auditor should be appointed to evaluate the legality of the data collection, access and use.


Get involved

  • Together with Beyond Return, I’m preparing to take action to highlight the need for the UK to bring these questions to the forefront of health technology discourse. Follow me on Twitter (@AnoukRuhaak) or send me an email (hello@anoukruhaak.com) to receive updates.

  • Join Beyond Return, a global movement imagining and coordinating towards new economies and modes of society (@BeyondReturnOrg).

  • Donate to Foxglove, to help them get answers and transparency.

  • Donate to Open Democracy, working to hold governments’ coronavirus responses to account.

  • Donate to Open Rights Group, to support their work on upholding digital rights.

 

Anouk Ruhaak

As a Mozilla Fellow embedded with AlgorithmWatch, Anouk creates new models of data governance for the public good, with a focus on data trusts and data commons. She has a background in political economics and software development, and founded several communities in the tech space. In addition to her data governance work, she is an organiser with Beyond Return and co-founded Radical Engineers.


Follow Anouk on Twitter: @AnoukRuhaak
