Coronavirus and freedoms – S. Carrasco

Health services are being stretched, and tech solutions seem to be a good way to keep us safe from coronavirus, but we don't have to accept every limit to our freedom.

The coronavirus pandemic is putting health services in many countries to a stern test. It has brought to light deficiencies such as under-resourced healthcare services and disparate data held by health institutions, even within the same country. Countries are trying to solve these problems, especially through technology, but they have not had enough time to properly evaluate how their solutions affect citizens' rights.

Tech solutions are being explored, but they could increase inequality

There has been talk of using geolocation data from telecoms companies, developing applications for mobile phones that can provide both information and monitoring of their users, using artificial intelligence to predict demand and anticipate needs, and tracking contacts through a whole series of different systems. There has been very little coordination among bodies responsible for public health in developing and implementing these solutions, although this should have happened at the beginning of the crisis.

Technology is seen as a must for monitoring containment measures effectively. It can be implemented quickly and can respond to the demands of health and resource management authorities. However, it poses a number of privacy-related problems, especially surrounding the processing of the data obtained and the guarantees concerning its use. We must also remember that technology can only be used in this way if people have the right devices, and this can widen the gap between those who have smartphones and those who don't. It is the disadvantaged who would be affected the most.

Measures may create a false sense of security while eroding freedoms

As the pandemic progresses, the debate has shifted from the use of aggregated and theoretically anonymized data obtained from operators towards the search for legal authorization for individualized control of people. Countries such as the Netherlands have determined that a monitoring app needs a user base of nearly 60% in order to be effective, and, given the low uptake in countries such as Singapore (where only 16% of citizens have installed it), some are considering making installation compulsory.
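To see why the adoption figures above matter so much, here is a back-of-the-envelope illustration (mine, not the author's): if each person installs a tracing app independently with probability p, a given contact between two people can only be registered when both have it, i.e. with probability p². The function and figures below are purely illustrative.

```python
# Illustrative only: assumes everyone adopts the app independently and
# uniformly, which real populations do not.
def pairwise_coverage(adoption_rate: float) -> float:
    """Probability that BOTH people in a random contact have the app."""
    return adoption_rate ** 2

for rate in (0.16, 0.60):  # roughly the Singapore and Netherlands figures cited above
    print(f"adoption {rate:.0%} -> contacts covered {pairwise_coverage(rate):.1%}")

# adoption 16% -> contacts covered 2.6%
# adoption 60% -> contacts covered 36.0%
```

Even at 60% adoption, under this simplified model barely a third of contacts would be traceable, which is why such high installation targets are discussed at all.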

Other measures might include movement control through QR codes to facilitate a gradual easing of the lockdown, and the use of temperature cameras with facial recognition to detect potentially sick people. In many cases these types of measures have unacceptable margins of error, create a false sense of security, and come with a guaranteed violation of privacy.

In the end, the debate that has arisen places citizens in the false dilemma of having to choose between privacy and the fight against the disease. A fight which, as has been announced, requires the efforts of everyone, and for which everyone will have to make concessions.

Public authorities know how the social consciousness of the masses works, especially in critical situations like this one. One glance at previous debates about content control or the security of encryption tools in the wake of terrorist attacks makes it clear how external situations can change users' awareness of the value of their rights, and how far they are ready to go.

Society likely to accept limits to its freedoms, but it may not be easy to get them back

In the present situation, it seems likely that we will accept a transfer of rights that would not have been possible under ordinary circumstances. Accepting having our locations tracked or our faces added to a facial recognition database may seem appropriate in exchange for a feeling of security. The problem is that giving up rights is simple, but getting them back once the situation returns to normal can be complicated.

Due to the duration and gravity of the situation, society is now more likely to change its perception of normality. And despite having systems of regulations with guarantees, in the end many of the measures adopted by different countries limit rights in a disproportionate way.

We have forgotten basic principles such as privacy by design and by default, and the need to minimize data, because institutions are saying that all information is strictly necessary. And many applications, developed by both individuals and institutions, have been designed not to protect their users but to obtain their data.

It is difficult to speak of real conscious consent in this case, especially when we are talking about tools that are being provided to citizens by healthcare services. Faced with symptoms that can indicate an illness, most people will not hesitate to use any applications necessary, without really paying attention to privacy policies or how their data might be used.

European framework flexible enough but guarantees should not be forgotten

Regulations are flexible enough to allow exceptional measures in cases such as this one. As different data protection authorities have pointed out, legislation has not been repealed in response to this pandemic, and we must take it into account. The European framework is different from those of other countries that have been held up as examples, such as China, South Korea or Singapore. The gravity of the situation should not be used as an excuse to forget existing guarantees, because things can still be done well.

It is true that in the last few days pan-European projects have been put forward, focused on sharing data and information of all kinds, and on developing decentralised systems with privacy guarantees for the traceability of contacts using Bluetooth. But this did not happen until the situation had already reached its limits. It is also important to remember that technology should not be the only tool we use: it is just one more pillar of the set of measures to be adopted.
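For readers curious about how such decentralised Bluetooth systems can trace contacts without a central register of who met whom, the sketch below illustrates the general idea behind DP-3T-style designs. It is a heavy simplification written for this article, not the actual protocol, and every function and variable name in it is invented for illustration: phones broadcast short-lived identifiers derived from a secret daily key, observations are stored only on the device, and only the keys of users who test positive (and consent) are ever published.

```python
# A much-simplified, illustrative sketch of decentralised contact tracing.
# Not a real protocol or API; names are invented for this example.
import hmac, hashlib, os

EPOCHS_PER_DAY = 96  # e.g. one rotating identifier per 15-minute window

def daily_key() -> bytes:
    """Secret key generated on the phone each day; it stays on the device
    unless the user tests positive and chooses to publish it."""
    return os.urandom(32)

def ephemeral_ids(key: bytes) -> list:
    """Short-lived identifiers actually broadcast over Bluetooth."""
    return [hmac.new(key, f"epoch-{i}".encode(), hashlib.sha256).digest()[:16]
            for i in range(EPOCHS_PER_DAY)]

# Device A broadcasts; device B only stores what it hears, locally.
key_a = daily_key()
heard_by_b = set(ephemeral_ids(key_a)[:4])  # B was near A for four epochs

# If A later tests positive, only A's daily key is published. B re-derives
# the identifiers on its own device and checks for overlap; no central
# server ever learns who met whom.
published_keys = [key_a]
exposed = any(eid in heard_by_b
              for key in published_keys
              for eid in ephemeral_ids(key))
print("possible exposure:", exposed)  # True in this toy run
```

The design choice the author alludes to is precisely this: matching happens on the user's own phone, so the data that leaves the device is minimal and purpose-bound.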

Previous outbreaks never prepared us for this

Despite the situation experienced with SARS in 2003 and MERS in 2015, the reality is that the current pandemic has exceeded all predictions. It is being fought with a healthcare system that lacks resources and is not properly coordinated. What is the point of using artificial intelligence to make predictions without a data model that permits the integration of information from different governments? If the base is not correct, any projection made from it will be flawed.

It is undeniable that we are facing an exceptional situation, one that requires extraordinary measures. But these must be limited in time, and the data processed must serve equally limited purposes. What will happen when everything ends? This is the question we must ask ourselves.

Sergio Carrasco is a lawyer specializing in Artificial Intelligence and Privacy.

This article is reproduced here with the kind authorization of Rights International Spain and Liberties. You can read the original version at: https://www.liberties.eu/en/news/our-freedoms-must-be-guaranteed-under-any-measures-to-fight-coronavirus/19139

To quote this article, please use the following reference: Carrasco (2020). “Coronavirus and freedoms”, Observatory on contemporary crises, April 29, 2020, URL: https://crisesobservatory.es/coronavirus-and-freedoms-s-carrasco/

 

The OCC publishes a wide range of opinions that are meant to help our readers think about International Relations. This publication reflects the views only of the author, and neither the OCC nor Saint Louis University can be held responsible for any use which may be made of the opinion of the author and/or the information contained therein.