Faced with the pandemic, the public authorities must not rush into a technical response that is as deceptive as it is dangerous. The use of the StopCovid application combines ethical, political and health risks.
Op-ed. On Tuesday and Wednesday, deputies and senators will have their say on the StopCovid application currently under development. What societal problems does this application raise? What ethical and political questions does it generate? Without claiming to cover every problem this project entails, it is nonetheless necessary to measure its scope beyond the short term. For what its insufficiently questioned generalization could help establish is a mode of biopolitical governmentality.
For several weeks now, we have been living through a major health crisis that destabilizes us collectively, confronting us with the fear of being infected, of infecting others and our loved ones, or with the fear of death itself. This is one of the reasons why we no doubt accept confinement so obediently. These two factors, growing uncertainty and growing fear, are likely to create the conditions for strong acceptance of a system presented as an effective way to contain the pandemic.
If the StopCovid application is accompanied by a promotional discourse, one risk would be that of offering a technical solution to a problem that is not technical, with all the limits we know this type of answer to have, formulated urgently and in a context that encourages a warlike posture. A parallel can be drawn here with the generalization of video surveillance to fight crime or terrorism: it too (although the context is obviously very different) is a technical response to major societal and political crises. These challenges call for assessing what is effective and what is not, what is legitimate and what is not.
The very name of the application is meant to be reassuring: if a person tests positive for the virus, it will be possible, together with them, to inform everyone who crossed their path in the previous days. Such an application is not compulsory. Its users choose to install it and can deactivate it at any time. But beneath its apparent virtues lie high risks of generating more insecurity than security. For it to be effective, it would have to be adopted by everyone at the same time; otherwise it would risk creating thoroughly biased information situations.
Without going into the detail of these technical problems, which have already been well documented, we quickly see that beyond the questions raised by the very architecture of the solution (high risk of cyber-hacking, difficulty of modeling the risks of a virus whose epidemiological parameters are still largely unknown, fallible anonymization protocols), the choice of the Bluetooth technology it plans to use raises questions: the difficulty of precisely qualifying the proximity between people makes it impossible to guarantee real exposure to the risk of contamination. The very structure of the device answers the old cybernetic dream of developing machines that communicate with one another. Yet this mode of communication, carried out by means of a signal, cannot take place without creating errors of perception.
As IT specialists explain in this regard, signal strength varies with context, notably with the chips in the phones. The latest iPhone will send a strong signal; an old smartphone, a weak one. Another risk would be to leave out people who do not have a smartphone, or who prefer a phone used only to make calls. Relying on smartphones would immediately rule out the nearly 13 million French people who do not own one, especially children and the elderly.
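The imprecision the specialists describe can be illustrated with the log-distance path-loss model commonly used to turn a Bluetooth signal-strength reading (RSSI) into a distance estimate. A minimal sketch, in which the parameter values are illustrative assumptions rather than calibrated measurements:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm, path_loss_exponent=2.0):
    """Convert a Bluetooth RSSI reading (dBm) into an estimated distance
    (meters) using the log-distance path-loss model.

    tx_power_dbm: the RSSI expected at 1 m, which depends on the phone's
    chip and antenna; path_loss_exponent: ~2.0 in free space, higher
    indoors. Both values here are illustrative assumptions.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

# The same -70 dBm reading, interpreted for two different handsets:
strong_chip = rssi_to_distance(-70, tx_power_dbm=-55)  # recent phone, strong emitter
weak_chip = rssi_to_distance(-70, tx_power_dbm=-65)    # older phone, weak emitter

print(f"strong chip: {strong_chip:.1f} m, weak chip: {weak_chip:.1f} m")
# strong chip: 5.6 m, weak chip: 1.8 m
```

The same reading thus corresponds to roughly 6 metres for one phone and under 2 metres for another: the app cannot tell a risky contact from a harmless one without knowing hardware and environment parameters it does not have.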
All this knowing that, to be effective, adoption of the application would have to be massive and extremely fast. If the critical mass of users is not reached, it could have the opposite effect of giving its users a false sense of security. The absence of a notification could even be perceived as “a disappearance of the health risk” (1), paradoxically increasing the risk of spreading the virus. What, too, of the risks taken in public space by trusting this application rather than applying the barrier gestures of physical distancing, whose effectiveness nobody disputes and which should be reinforced by the visualization of modeled data, allowing a better understanding of the need to adjust distances to situations. Microdroplet simulations by scientists from the Japanese Association for Infectious Diseases are very instructive in this regard. Wide dissemination of these modeling experiments would allow a majority of people to adapt their physical distance to the context (depending on whether you pass a person who is walking or running, for example).
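The critical-mass argument can be made concrete with back-of-the-envelope arithmetic: a contact can only be traced if both people involved run the app, so under the simplifying assumption that installations are independent and uniform across the population, the share of traceable contacts grows only as the square of the adoption rate. A hypothetical sketch:

```python
def traceable_contact_share(adoption_rate):
    """Share of person-to-person contacts in which BOTH parties run the
    app, assuming installations are independent and uniformly distributed
    across the population (a deliberate simplification)."""
    return adoption_rate ** 2

# Even seemingly high adoption leaves most contacts invisible to the app:
for adoption in (0.2, 0.4, 0.6):
    share = traceable_contact_share(adoption)
    print(f"adoption {adoption:.0%} -> traceable contacts {share:.0%}")
# adoption 20% -> traceable contacts 4%
# adoption 40% -> traceable contacts 16%
# adoption 60% -> traceable contacts 36%
```

Even at 60% adoption, under this assumption, roughly two contacts out of three would go unrecorded, which is why the absence of a notification says so little about the absence of risk.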
The balance of our societies threatened
The reality of this application and the context of its use call for a step back, which helps us identify what is at stake in the adoption of any innovation. For, let us remember, we never encounter technology per se, but the social, economic and political contexts in which it takes place. And discursive regimes are always engaged in the generalization of this or that technology. The proliferation of security discourse, for example, often illustrates the ambition of state powers to reaffirm a weakened legitimacy by acting on the symptoms of crises rather than on their causes. In this case, helpless in the face of the pandemic but forced to react quickly, the public authorities are diverting toward the technical field a response that should come through science and democratic debate.
Moreover, the urgency surrounding the adoption of such a technical system is troubling, and recalls that moment in our recent history when the French state, in the aftermath of the 2015 attacks, generalized mass surveillance by carrying state-of-emergency measures over into ordinary legislation. We know this from our experience of the fight against terrorism: the supposedly temporary nature of attacks on freedoms can quickly turn into a permanent exception (as has been the case with anti-terrorism laws), especially when political stakes that remain national are compounded by the private economic stakes of powerful transnational companies.
Finally, the desire to use the StopCovid application raises major political and societal questions. Should the application be generalized to other uses and over the longer term, mechanisms of discrimination against certain populations could arise (under the guise of “risk selection”) that would undermine fundamental rights such as freedom of movement or medical confidentiality, regardless of the precautions its designers may take today. It is care and ethics that we need, not answers likely to favor the extension of a “high-tech” biopolitics that could endure well beyond the announced end of confinement. It therefore seems important to us to present here some recommendations.
First of all, the operating rules of democracy need to be revisited. While there will be a debate and a vote in the National Assembly on April 28 on the government’s plan, and while it is now certain that the same procedure will be followed in the Senate, this is not a vote on a bill presented by the government, which would then have been subject to preliminary examination by the Council of State and could afterwards have been submitted to the Constitutional Council. That would have been by far the most suitable procedure under article 34 of the Constitution.
Next, it seems important to us to reaffirm that technologies, used wisely, constitute a formidable tool of health policy. If they are not tied to any filing or repression apparatus, if they study large-scale data to reach a finer epidemiological understanding, then these technologies are welcome, in the name of knowledge and the common good. Statistics have, as we know, been an effective tool of health policy. Concretely, this is already the case for Covid-19, the public authorities having made a genuinely legitimate and benevolent use of technologies: open data, for example, has helped model the spread of the virus.
If new technologies are now essential to any public policy, placing at the center of a state strategy an application whose effectiveness is more than doubtful, and above all inversely proportional to the ethical, political and health risks it entails, would be catastrophic for the evolution of our society. The exceptional times we are living through must not push us to settle for a technical response as deceptive as it is dangerous, nor make us renounce our democratic demands. On the contrary, they must encourage us to reaffirm the solidarity and confidence a society needs to keep reinventing itself, by stimulating the social imagination that nourishes it.
(1) See Adrienne Brotons and Paul Christophle.
Signatories: Pierre-Antoine Chardel, philosopher and sociologist, professor at Institut Mines-Télécom Business School and member of the Interdisciplinary Institute for Contemporary Anthropology (CNRS / EHESS), Valérie Charolles, economist and philosopher, researcher at Institut Mines-Télécom Business School and associate researcher at the Interdisciplinary Institute for Contemporary Anthropology (CNRS / EHESS), Mireille Delmas-Marty, jurist, honorary professor at the Collège de France and member of the Academy of Moral and Political Sciences, Asma Mhalla, consultant in digital economy and lecturer at Sciences Po Paris.