When it comes to contact tracing apps, however, our fabled love-hate relationship with big tech comes into sharper focus, and these apps appear to be testing its boundaries. Will we embrace them as offering a way out of the crisis to our ‘new normal’, or will suspicions, particularly around privacy, prove too strong?
How we feel about the answer to this question may well depend on where we live and on the approach proposed, and we are seeing some interesting variations in both. One distinction that is emerging is between centralized and decentralized architectures for tracing apps. The centralized approach involves a central server into which alerts from users are received, held and sent out. Examples include the UK’s NHSX app (which is still being tested) and the Australian app. In contrast, with the decentralized approach (including apps built on Google and Apple’s tracing API technology), alerts are passed directly between user devices with no central repository. The decentralized approach is currently favoured by a growing number of countries, including Germany, Switzerland, Canada and Ireland, among others.
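To make the distinction concrete, the decentralized pattern can be sketched as follows. This is a minimal illustration only, not the actual Google/Apple Exposure Notification API: the class and method names (`Device`, `rotate_id`, `report_positive`, `check_exposure`) are hypothetical. The key point it shows is that devices exchange short-lived random identifiers directly, a diagnosed user uploads only their own broadcast identifiers, and the matching happens on each user's device rather than on a central server.

```python
import secrets

class Device:
    """Illustrative sketch of a device in a decentralized tracing scheme."""

    def __init__(self):
        self.my_ids = []        # ephemeral IDs this device has broadcast
        self.heard_ids = set()  # IDs overheard from nearby devices

    def rotate_id(self):
        """Generate a fresh random ephemeral ID (rotated frequently in practice)."""
        eid = secrets.token_hex(8)
        self.my_ids.append(eid)
        return eid

    def observe(self, other_id):
        """Record an ID broadcast by a nearby device (e.g. over Bluetooth)."""
        self.heard_ids.add(other_id)

    def report_positive(self):
        """On diagnosis, the user publishes only their own broadcast IDs."""
        return list(self.my_ids)

    def check_exposure(self, published_ids):
        """Matching happens on-device: compare published IDs to the local log."""
        return bool(self.heard_ids & set(published_ids))

# Two devices pass within range of each other.
alice, bob = Device(), Device()
bob.observe(alice.rotate_id())

# Alice tests positive and publishes her ephemeral IDs; no central
# server ever learns who was near whom.
published = alice.report_positive()

print(bob.check_exposure(published))       # Bob finds a match locally
print(Device().check_exposure(published))  # an uninvolved device does not
```

A centralized design differs at exactly the last step: the overheard identifiers would be uploaded to a server, which performs the matching and sends out alerts, creating the single high-value target the privacy debate centres on.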
At the heart of the debate are the issues of privacy and trust, with concerns being voiced about the possibility of even anonymised identifiers being re-identified, of data being used for other purposes and of the risk of attack. These concerns are stronger with a centralized model, of course, where a successful attack offers greater rewards. Data Protection Authorities (including the UK’s Information Commissioner) and the European Data Protection Board have expressed a preference for a decentralized model, whilst noting that both can (with an emphasis on ‘can’) be compliant with data protection laws.
However, some have questioned whether European data protection laws are in fact fit for purpose in this case. In Australia, additional legislation has already been passed to amend the Privacy Act 1988, including provisions to ensure employers cannot make downloading the app a condition of returning to work, as well as provisions covering the obtaining of consent. There have been some calls in other countries, including the UK, for more specific legal protections to be implemented.
In the UK and European Union, additional laws should not be necessary for tracing apps. The General Data Protection Regulation (the Regulation), which celebrated its second birthday on 25 May, has very wide-ranging requirements impacting both public authorities and private companies. The Regulation contains central principles on data minimisation, purpose limitation, the use of special category data (such as health data) and security, to name a few. Guidance has also been emerging on how these requirements should apply in the context of tracing apps, yet concerns remain.
A primary point of contention is the heavy reliance on principles in the Regulation, especially when combined with regulators repeatedly saying they will take a “risk-based approach”. This leaves a little too much wiggle room and ambiguity to convince everyone. A further complication is that many of the key provisions of the Regulation around employee data, and around the use of data by public health authorities in the public interest or in health emergencies, are left for local law confirmation, so it is natural to see some variation in approach between Member States.
On balance, however, it is clear that there are laws in place that can be applied to contact tracing apps and that should provide sufficient safeguards and controls. It therefore seems unlikely that we will see more specific regulation for these apps in Europe, even though the last few years have seen constant calls for tech regulation, including for biometrics, facial recognition and artificial intelligence. Member States and regulators are understandably reluctant, unless strictly necessary, to rush in new laws that raise such big and complex questions around transparency and liberty and that need careful debate and consideration.
One might question, in the face of the potential freedoms that tracing app technologies offer, whether public mistrust currently lies less with big tech and more with government.
An interesting reality, therefore, is that the rollout of these apps will in itself amount to an experiment in direct democracy. Governments reportedly need 60% of the population to download their apps for these to work effectively, an ominous number when it comes to turnout: in the European elections, such turnout was last achieved in 1979. By comparison, in the last UK general election turnout was 67.3%, the second-highest of the past two decades, excluding the Brexit referendum. Achieving a similar outcome in this context may be a tall order, given that not everyone even has a smartphone for the apps to work on. What will the results of this civic choice really indicate, if anything, about our new relationship with technology and our dependence upon it?
We may all feel we are living through a strange experiment with uncertain outcomes at the moment. Unquestionably, however, we are seeing large technology companies stepping up to the plate with options, technologies, solutions and examples of privacy by default and design in practice. The pandemic has also brought a period of reflection for many. As we reassess our values, the factors that are key to data governance, such as fairness, transparency and accountability, remain as pertinent as ever. But in a post-pandemic world, perhaps the tech backlash of recent years won’t ring quite so loud.