You can find an app for just about anything these days. Have you ever wanted to communicate with a ghost? There’s an app for that. Or perhaps you would rather receive a virtual hug from a stranger? Well, there’s an app for that too. And my personal favourite, an electric shaving app, which simulates the dull task of shaving and is as pointless as it sounds. Of course, these are some silly examples, but they illustrate a movement in our society towards tech “solutionism”: the idea that every problem, no matter how outlandish, has a technological fix. There really is an app for everything, it seems. It comes as no surprise that in the midst of the current global crisis, contact tracing apps have emerged as a new hero in the fight against coronavirus. Increasingly desperate governments across the globe are pinning their hopes on these apps – which promise to automate the process of contact tracing – as a way of getting us out of lockdown sooner and restarting our draining economies.
In the UK, the NHS has developed a new app that the government says will be central to its “test track and trace” strategy. The app, they claim, will help save lives, reduce pressure on the NHS and return people to normal life. Despite the current whirlwind of chaos, one thing has become clear: in order to tackle this pandemic, and indeed pandemics of the future, we will need to come up with innovative solutions to the problems they pose. Technology could play a huge part in this. However, much like the shaving app example, the technologies currently being put forward may prove to be totally useless – a mere pretence that something is being done. Worse still, there are growing concerns that these systems, if left to propagate unchecked, would allow unprecedented surveillance of society at large.
The prospect of a technological quick fix – to automate the laborious process of contact tracing – has certainly been an attractive one, with more than 30 systems being developed across the globe. The UK is currently piloting its own contact tracing app on the Isle of Wight. The health secretary Matt Hancock has put a great deal of faith in the technology, telling Isle of Wight residents: “By downloading the app you are protecting your own health, you are protecting the health of your loved ones and the health of your community”. If the trial is successful – it’s too early to tell just yet – the app will be rolled out to the rest of the country in a week or so, along with an army of 18,000 contact tracers to help bolster the system.
How exactly can a smartphone app hope to achieve such results? It works by using low-energy Bluetooth signals to detect when two phones come into close contact with each other. Randomised identifying numbers, which supposedly include no personal information, are exchanged between the two devices – a kind of “digital handshake”. This information is stored on both phones for 28 days, building up a contact list. If a user develops coronavirus symptoms they can notify the app, which uploads the list of their contacts to the NHS system. A risk-scoring algorithm is used to decide which contacts are potentially infected, based on the strength of the Bluetooth signal (to measure proximity) and the duration of that contact. The NHS would then notify potentially infected individuals to self-isolate or seek testing. Proponents of such technology say this will allow cases to be identified earlier and drastically reduce transmission of the virus in countries that choose to adopt it.
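To make the risk-scoring idea concrete, here is a toy sketch in Python. The thresholds, weights and function names below are invented purely for illustration – the NHS has not published its algorithm in this form – but the shape is the same: combine signal strength (a proxy for proximity) with exposure time, and notify contacts above some cut-off.

```python
# Illustrative sketch only: toy risk score for a Bluetooth contact event.
# The thresholds and weights here are invented, NOT the NHS algorithm.

def risk_score(rssi_dbm: float, duration_minutes: float) -> float:
    """Higher score = riskier contact. RSSI values are typically negative;
    readings closer to 0 mean the phones were physically closer."""
    # Map RSSI to a rough proximity weight: -55 dBm or stronger ~ very close.
    if rssi_dbm >= -55:
        proximity = 1.0
    elif rssi_dbm >= -70:
        proximity = 0.6
    else:
        proximity = 0.2
    # Longer exposure scales risk linearly, capped at 30 minutes.
    exposure = min(duration_minutes, 30) / 30
    return proximity * exposure

def should_notify(contacts, threshold=0.5):
    """Return the (rssi, duration) pairs whose score crosses the threshold."""
    return [c for c in contacts if risk_score(*c) >= threshold]

# A close 30-minute contact scores 1.0; a distant 5-minute one barely registers.
print(should_notify([(-50, 30), (-90, 5)]))  # → [(-50, 30)]
```

Even this toy version hints at the limitation discussed later in the piece: RSSI is a noisy stand-in for distance, so any such scoring inherits that noise.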
However, the use of contact tracing technology has sparked a global debate about the security and ethics of mass population surveillance. On the surface, the NHS app appears pretty innocuous in terms of data collection. Most countries, including the UK, are currently not using GPS information, as this would be in breach of privacy laws in most European democracies. Bluetooth signals cannot determine your location, and all the IDs sent between handsets are “anonymised”. The UK government has stressed that the app doesn’t collect user data that could identify individuals. Ian Levy, the technical director of the National Cyber Security Centre (NCSC), which has been advising the government on the creation of the app, said in a blog post: “the system ends up with a list of devices that have been near each other, even though they’re anonymous.” He also said the app “doesn’t have any personal information about you, it doesn’t collect your location and the design works hard to ensure that you can’t work out who has become symptomatic”. Despite assurances that the app will not ask for personal details, users will be required to input the first part of their postcode. Officials say this is to allow the NHS to track the spread of coronavirus and identify hotspots more easily.
At the heart of the debate over population surveillance is not the data itself, but where it will be stored. To centralise or to decentralise, that is the question on everybody’s lips. On the 10th of April, Apple and Google revealed their bitter rivalry had been “set aside for the greater good” and that they were working together on an API for digitalised contact tracing. This decentralised system would allow programmers to create their own apps – where most of the necessary data would be stored on an individual’s phone only, not collected in large databases. This would put privacy at the heart of the technology, and reduce the risk of big data getting into the wrong hands – a move praised by the UK data watchdog as “best practice compliance” with data protection regulations.
Yet the UK government has backed away from Apple and Google’s decentralised approach, boldly positioning itself as one of the few outliers in the world to choose control over privacy. This decision comes after German officials were forced to backtrack from a centralised approach, due to a huge backlash from civil liberty groups and a public worried about the threat of surveillance. The UK government has instead opted to create its own app, which would store anonymised lists of contacts in a centralised database, accessible to the NHS. The big question is: why has the UK opted for increased surveillance when other viable, privacy-protecting options exist? It’s a very risky move indeed.
Matthew Gould, the CEO of NHSX, explained that the justification for having a centralised database was to give greater insight into how the disease spreads and help make the app more efficient. Gould says that without a centralised system to capture data – postcodes, time of incident, strength and duration of signal – this valuable information would be lost.
He added:
In other words, they want to capture this data to inform epidemiological modelling, which will in turn drive public policy decisions.
The message from the government is quite clear: give up a little bit of your data to the experts and save lives. Now is not the time to be concerning yourselves with data privacy issues – think of the “greater good”. However, leaving it up to the experts didn’t work so well the first time. A failed herd immunity experiment and the decision not to go into lockdown straight away helped push the UK towards the unfortunate title of highest death toll in Europe. These are certainly not the only factors that contributed to the UK’s high death rate, but with the benefit of hindsight we now know these decisions were the wrong ones. One of the government’s chief scientific advisors also flouted his own lockdown rules to shag another man’s wife, but that’s a whole other story. Hilarious though it is, it illustrates rather candidly that leaving our data in unscrupulous hands might not be the best idea.
Instead, should we not heed the advice of those who know a thing or two about security and privacy? A joint statement from 200 security and privacy experts in the UK said that the collecting of data in central databases “would enable (via mission creep) a form of surveillance”. They added:
Matt Hancock has been quick to push back against these privacy concerns, saying that “all data will be handled according to the highest ethical and security standards”. Despite these assurances, there are troubling clues as to what lies ahead.
During the UK parliament’s Science & Technology Committee hearing on April 28th, Gould said that they were considering allowing users to share their location data in future versions of the app, in order to help epidemiologists identify infection hotspots. “The app will iterate. We’ve been developing it at speed since the very start of the situation but the first version that we put out won’t have everything in it that we would like,” he said. “We’re quite keen, though, that subsequent versions should give people the opportunity to offer more data if they wish to do so”. Worryingly, Gould also admitted that the data they harvest will not be deleted and may be used at a later date for “research purposes”.
The government’s primary defence against privacy concerns is that all the data will be given to them “anonymously”. There is always a risk, however, that a database full of anonymous ID numbers attached to personal information can be re-identified – allowing someone to work out who users are and who they associate with. This is precisely why the decentralised system proposed by Apple and Google produces new ID numbers for each user each day, to thwart any attempt at re-identification by a bad actor.
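The idea behind rotation can be shown with a minimal sketch. The derivation below is invented for illustration and is not the actual Apple/Google Exposure Notification key schedule, but it captures the principle: each day’s ID is derived from a secret that never leaves the phone, so an eavesdropper who only sees the broadcast IDs cannot link Monday’s to Tuesday’s.

```python
# Minimal sketch of daily-rotating identifiers. The derivation here is
# illustrative only, not the real Exposure Notification specification.
import hashlib
import hmac
import os

def daily_id(device_secret: bytes, day_number: int) -> str:
    """Derive that day's broadcast ID from a device-local secret."""
    return hmac.new(device_secret,
                    str(day_number).encode(),
                    hashlib.sha256).hexdigest()[:16]

secret = os.urandom(32)          # generated on, and never leaves, the phone
id_monday = daily_id(secret, 18500)   # arbitrary example day numbers
id_tuesday = daily_id(secret, 18501)
# Without the secret, the two IDs look unrelated and cannot be linked.
```

The same property also explains the centralisation worry: once all IDs sit in one database alongside postcodes and contact times, linkage becomes a question of analysis rather than cryptography.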
In a paper discussing these privacy concerns, Levy says: “In all large scale social graphs of this sort, there is a risk that seemingly innocuous data can be analysed to identify particular subgroups of the population”. And while at present he says the risk of re-identification is low, he also added: “The risk comes as more data is added to the graph, or commingled with it… the addition of more data to the graph nodes needs careful consideration.” It makes perfect sense that the more data they have on you, the easier you are to identify. They know where you live, who you hang out with and potentially your location at any given time. That random string of numbers hiding your identity becomes a thin veil. A spokesperson for NHSX denied there are plans to de-anonymise data, now or in the future. Maybe they won’t, but who is to say that a hacker can’t? Earlier this week, the Joint Committee on Human Rights heard evidence that users’ identities could in theory be re-identified by hackers. Hell, the government could even sell your data off to private parties; big data is worth big money – just ask Facebook!
There is also a very real danger that what starts as a voluntary app could soon become mandatory for public and social engagement. The government can justify its snooping powers as a necessity to keep people safe. Edward Snowden, the whistleblower who exposed the US National Security Agency’s mass surveillance programmes, said that increased surveillance amid the coronavirus outbreak could lead to a long-lasting erosion of civil liberties: “Five years later the coronavirus is gone, this data’s still available to them – they start looking for new things”. Alex Gladstein, from the Human Rights Foundation, expressed similar concerns, saying officials could add more invasive features over time. “This is a slippery slope that leads to you being colour-coded”, he said.
Just think of China’s social credit system. Any employer, retailer or even government could start requiring citizens to display the status of their app in order to access certain goods and services – a kind of “immunity passport”. It’s easy to dismiss claims that this will happen in a democratic society such as ours. However, wherever there is fear there is an opportunity to take advantage. Ever heard of the Patriot Act? Post-9/11, the US passed the act, a broad set of laws that massively expanded government surveillance powers. The UK passed emergency powers without anyone batting an eyelid. Essential though some of these powers are to keep us safe, the chaos that ensues from a global pandemic is the perfect opportunity to sneak in powers that overstep the mark. In Taiwan, for example, there were reports of people getting visits from the police if they failed to report their location to the authorities.
Is the government’s promise of a return to freedom in exchange for our privacy even worth it? Well, its decision to go for a control-centric rather than privacy-centric model could completely undermine the effectiveness of the contact tracing technology in the first place. The effectiveness of such apps is largely dependent on the number of people using them. The government is banking on the fact that, so far, the terrified public have done what has been asked of them out of civic duty. Are they willing to go one step further and relinquish their data for the good of the country?
Possibly not. The UK public, already in uproar over the government’s decision to use a centralised database to store data and track their movements, are unlikely to warm to the app. Oxford academics have estimated that 60% of the UK population would need to be using the app for it to work effectively. This is currently not being achieved anywhere in the world. Singapore was one of the first countries to introduce such technology, but was only able to convince 20% of the population to sign up for its TraceTogether app, far below that 60% threshold. Only 3.5 million of Australia’s 25 million people have downloaded their contact tracing app, even though it uses the decentralised system considered far safer than ours. A recent poll in the UK did suggest as many as two thirds of Brits would download the app, which would be enough to make it work. Whether the app works well enough to convince them to keep using it is another matter though. My guess is it won’t.
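The arithmetic behind those uptake figures is unforgiving: a contact is only logged if both people carry the app, so under the simplifying assumption that installs are independent and evenly spread, the fraction of contact events covered falls with the square of uptake:

```python
# Back-of-envelope: a contact is detectable only if BOTH parties have the
# app, so coverage of contact events scales roughly as uptake squared
# (a simplification: it assumes installs are independent and uniform).
def contact_coverage(uptake: float) -> float:
    return uptake ** 2

for uptake in (0.20, 0.35, 0.60):
    print(f"{uptake:.0%} uptake -> {contact_coverage(uptake):.0%} of contacts covered")
# 20% uptake -> 4% of contacts covered
# 35% uptake -> 12% of contacts covered
# 60% uptake -> 36% of contacts covered
```

On this rough model, Singapore’s 20% uptake covers only around one in twenty-five contact events – before even accounting for phones left at home or Bluetooth’s quirks – which goes some way to explaining why no country has yet made the technology decisive on its own.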
Singapore has been used by advocates of the technology to justify its success. And it’s true, they did enjoy success in suppressing the virus early on. However, Jason Bay, one of the lead developers of the TraceTogether app, was quick to quash the “excitement” around the technology. Writing in a recent blog post, he says:
He continues: “Any attempt to believe otherwise, is an exercise in hubris, and technology triumphalism”. Bay makes it clear that there should not be an over-reliance on technology, and says that traditional, manual contact tracing has a far greater role to play – as indeed was the case in Singapore. That’s not to say technology won’t play its part in the future, but we’re dealing with this pandemic in the here and now.
The Ada Lovelace Institute has recently published a review of the technical and societal implications of using technology to transition out of the coronavirus crisis. As the institute’s director, Carly Kind, explains:
With the technology being so new, much of the hype around contact tracing apps, it seems, comes from anecdotal evidence rather than hard science. A team of researchers at the KU Leuven Institute for the Future in Belgium attempted to find some empirical evidence of their benefits. They were unable to do so, concluding that the evidence of their benefit was “limited”.
The technology is just not the perfect solution some have tried to dress it up as. It has severe limitations: Bluetooth signals cannot determine whether there is a protective barrier or window between two adjacent people, for example. Likewise, they cannot determine the ventilation of a room. The app in its current guise won’t be able to work with the decentralised apps being created by Germany, Switzerland and many other countries across the world. Travellers from the UK may require quarantining when going abroad, or even be barred from travelling completely. And while Gould says the NHS is “co-operating very closely with a range of other countries” to find workarounds to these issues, they could save themselves a lot of bother and just use the system already created by Apple and Google, as most of the world is already doing. The UK wants to be a pioneer in this regard, but it seems that we’re playing catch-up. Over-65s, the segment of the population most at risk from the disease, are also the least likely to own smartphones. In this case, will the government be handing out phones like Werther’s Originals to the elderly? Unlikely. And of course there’s the risk that people under- or over-report their symptoms. A phone app of this kind simply cannot apply the judgement a human contact tracer can. There will be many false negatives and false positives, eventually leading to a complete loss of trust and subsequent abandonment of the technology. It could be dead in the water before it even has the chance to prove its worth.
If the pilot trial on the Isle of Wight is a success, you can be sure that the app will be keeping its watchful eye over us until a vaccine arrives, if not long after. Although the effectiveness of contact tracing apps is up for debate, they should not be ruled out completely. As with any new technology, it may be many years until we realise its full potential, and the system will need a lot of tweaking. The government has at least made some acknowledgment of this fact, as Gould himself said: “We need to level with the public on this, that when we launch it, it won’t be perfect and as our understanding of the virus develops, so will the app.” However, for the technology to triumph and be accepted by all, it must have the appropriate safeguards – particularly if the UK decides to stick with its centralised model, where, in a post-lockdown world, there is a high risk of individuals with coronavirus being subjected to stigma, harassment or dismissal.
According to the UK’s data protection watchdog, the UK government has still not published a Data Protection Impact Assessment (DPIA) for the use of the app. The DPIA would outline possible impacts on privacy and human rights relating to the use of the app, a legal requirement under UK/EU law for any organisation embarking on “high risk” data processing. Despite the European Data Protection Board saying it “strongly recommends” the publication of DPIAs, the government has yet to do so. Without absolute clarity and transparency about privacy issues, the public is likely to reject the app.
On May 7th, the Joint Committee on Human Rights warned the government that the contact tracing app could fall foul of privacy and human rights law. They urged the government to enact robust legal protection for what that data will be used for, who will have access to it and how it will be safeguarded from hacking. This would go some way to ensure the mass surveillance of personal data did not result in a human rights violation. They also said legislation should impose strict purpose, access and time limitations – most importantly a sunset clause to mitigate the danger of these apps becoming mandatory in the future. For decades now, policymakers have worked extremely hard to ensure that the healthcare industry conforms to strict standards of safety and efficacy to keep everyone safe. It’s not unreasonable to expect the same standards to be applied for this technology.
It would be wise to follow the example set by Australia, which has published legislation alongside its app that aims to protect the rights of users. The Australian government has also agreed to destroy the data once the pandemic is over. Professor Lilian Edwards, an academic at Newcastle University who specialises in internet law, has written a draft bill that aims to protect the privacy of individuals who use contact tracing apps. The bill includes provisions such as the right to refuse to install tracing apps and the right not to provide employers with your health status if asked. “We have a precedent of previous pandemics leading to a mass extension in state surveillance… that is my worry,” said Edwards. One legal firm has said that “a centralised smartphone system – which is the current UK Government proposal – is a greater interference with fundamental rights and would require significantly greater justification to be lawful. That justification has not yet been forthcoming”. This means the government’s use of the app could actually be illegal. Ravi Naik, a solicitor at the firm, also said it was “inevitable” that a legal challenge to the UK’s system would be launched.
It feels rather Orwellian that the government, in times of crisis, can exert such overreaching powers with no safeguards for our privacy or data protection. These concerns have been voiced by researchers at KU Leuven, who say: “In times of crisis and urgency, ethical and privacy issues may be overlooked, leading to unintended consequences that are later difficult or impossible to undo”. Similarly, in an open letter, 300 of the world’s leading international researchers said they were “concerned that some ‘solutions’ to the crisis may, via mission creep, result in systems which would allow unprecedented surveillance of society at large”. They called for greater transparency, urging the government to make all protocols privacy-preserving by design and available for public scrutiny.
“The best defence we have against ‘creep’ is transparency and an assurance that what we will do is open,” says Gould. “We have said we will open source the software, we have said we will publish the privacy model and the security model that’s underpinning what we’re going to do”. This transparency is happening too slowly though. Harriet Harman, the chair of the Joint Committee on Human Rights, said government assurances on privacy were “not enough”. “Parliament was able quickly to agree to give the government sweeping powers. It is perfectly possible for parliament to do the same for legislation to protect privacy,” she said.
There are no magic bullets to get us through this pandemic, and this technological solution does not appear to be one. It’s not that digital contact tracing shouldn’t be done, but these apps are not the finished article yet. They cannot substitute for teams of human contact tracers. Perhaps contact tracing apps are one for the future. Phil Booth, co-ordinator of medConfidential, a medical data-protection group, sums it up nicely: “contact tracing and immunity passports are all really apps for the next pandemic”. If we use them prematurely, without the necessary safeguards, not only could we cock up the easing of lockdown, we might also sleepwalk our way into an era of mass population surveillance.