Facial Recognition and Social Distrust

Newsletter December 2020

At the 20th edition of its meetings, held on 13th November 2020, the Values and Policies of Personal Information research Chair explored the expansion and deployment of facial recognition.

The dialogue, like the news of the past few weeks, highlighted the fact that, beyond the technological, legal and organisational issues, this technology raises questions primarily related to the choice of a model of society. These notably include numerous questions regarding its trustworthiness (a weak objection, since false-positive and false-negative errors can be expected to diminish as the systems mature), ethical and gender biases, procedures for the protection of minors, and the proper execution of a risk analysis.
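To make the trustworthiness point concrete, here is a minimal illustrative sketch (not drawn from the article, and with purely hypothetical scores) of how the false-positive and false-negative rates of a face-matching system are typically measured against a decision threshold:

```python
def error_rates(genuine_scores, impostor_scores, threshold):
    """Compute false-negative and false-positive rates at a given threshold.

    genuine_scores: similarity scores for pairs of the SAME person
    impostor_scores: similarity scores for pairs of DIFFERENT people
    A pair is declared a "match" when its score >= threshold.
    """
    false_negatives = sum(1 for s in genuine_scores if s < threshold)
    false_positives = sum(1 for s in impostor_scores if s >= threshold)
    fnr = false_negatives / len(genuine_scores)
    fpr = false_positives / len(impostor_scores)
    return fnr, fpr

# Hypothetical example scores: raising the threshold trades false
# positives for false negatives, which is why a single "accuracy"
# figure is not enough to judge a system's trustworthiness.
genuine = [0.91, 0.85, 0.78, 0.60]
impostor = [0.30, 0.55, 0.72, 0.20]
print(error_rates(genuine, impostor, 0.7))  # (0.25, 0.25)
```

The same system can thus look "accurate" or "error-prone" depending on where the threshold is set, which is one reason raw performance claims are a weak basis for policy.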

Hence, the responses cannot be considered solely from a technical or legal perspective; they are inevitably bound up with their overarching dimension, which is political.

A Political Response Before All

Thus, in parallel with the examination of the legislative draft on global security tabled on 20th October 2020,[1] the French Home Office published its white paper on security[2] on 16th November 2020. Echoing what the Secretary of State for digital transition and electronic communications Cédric O had suggested,[3] the document – comprising no less than 332 pages – proposes “experimenting with facial recognition in public spaces, in order to harness that technology on the technical, operational and legal levels for the purpose of protecting the French people.”[4]

While other institutional stakeholders (the city of Portland, in the United States, has prohibited the use of facial recognition for law enforcement and commercial purposes since 10th September 2020[5]) and private-sector actors (IBM,[6] Amazon[7] and Microsoft[8] adopted a moratorium and refuse to sell their facial recognition technology to government institutions until its impact on fundamental rights has been assessed) are choosing an alternative path, the white paper champions a very different option: intensification. That intensification would materialise through experiments and the creation of a biometric database of faces for criminal-justice purposes, “which would include, on one hand, as is the case today, the reference images, but also, on the other hand, the comprehensive set of unidentified latents, in the same manner as what currently exists for fingerprints and genetic traces.”[9] The aim is to expand the opportunities for recognition through technical systems, and eventually to extend this to behavioural recognition. The goal would thus no longer be only to “identify or locate a person through biometric recognition, but to analyse contexts or detect scenes that may constitute a danger for the public or that correspond to tortious or criminal conducts.”[10]

Regarding authentication, the facial recognition technologies that make it possible to verify that a person is indeed who she claims to be seem, for the most part, efficient and consistent with our democratic values. It can be noted that, in most cases, the person is aware that her biometric data are being processed. This is the case, for example, when she passes through border control zones in airports via automated identity-control gates named PARAFE (Passage Automatisé Rapide aux Frontières Extérieures, an automated fast-track border-crossing system).[12] Regarding identification, namely the ability to find a person within a group, some purposes, depending on the contexts involved and the options chosen, can raise enforcement problems, or even risks of non-compliance with legal requirements.
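The structural difference between the two regimes can be sketched as follows. This is an illustrative toy (an assumption, not the systems discussed here): `similarity` stands in for a real face-embedding comparison, but the 1:1 versus 1:N contrast is the point.

```python
def similarity(template_a, template_b):
    # Toy stand-in: real systems compare face-embedding vectors.
    return 1.0 - abs(template_a - template_b)

def verify(claimed_template, live_template, threshold=0.9):
    """Authentication: compare against ONE enrolled template (1:1)."""
    return similarity(claimed_template, live_template) >= threshold

def identify(live_template, gallery, threshold=0.9):
    """Identification: search a WHOLE gallery for the best match (1:N)."""
    best_id, best_score = None, 0.0
    for person_id, template in gallery.items():
        score = similarity(template, live_template)
        if score > best_score:
            best_id, best_score = person_id, score
    return best_id if best_score >= threshold else None

gallery = {"alice": 0.42, "bob": 0.77}
print(verify(0.42, 0.43))        # 1:1 check -> True
print(identify(0.76, gallery))   # 1:N search -> 'bob'
```

Verification only answers "is this the person she claims to be?", whereas identification scans everyone enrolled, which is why the two raise such different legal questions.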

The various judgments handed down at the end of 2020 provide in this regard a broad overview of the limitations affecting the government's use of devices that capture biometric data, and of the need for strict compliance with the GDPR.

As a first illustration of the challenges related to authentication, we will review the Conseil d'État's prescriptions regarding the implementation of the ALICEM application. We will then examine the use of facial recognition by the Welsh police, which was sanctioned by the Court of Appeal in London, and conclude by analysing the decision taken by the French high court regarding the preventive use of drones to detect public gatherings.

The ALICEM Application

In France, the enforcement of personal data legislation is being tested, as illustrated by the judgment handed down by the Conseil d'État on 4th November 2020 regarding ALICEM (Authentification en LIgne CErtifiée sur Mobile), a mechanism for certified online authentication on mobile phones. Sitting with its 9th and 10th chambers combined, the high administrative court confirmed the compatibility of the application with the GDPR.[13]

As a reminder, the ALICEM application was authorised by decree No. 2019-452 of 13th May 2019.[14] Pending the arrival, in summer 2021, of the electronic identity card, ALICEM allows French nationals holding a biometric passport, and foreign nationals holding a biometric residence permit, to establish their identity with a security level equivalent to the one provided by the electronic identity document they possess, namely the high guarantee level as defined in the eIDAS Regulation.[15] Before creating that digital identity, the person must create her account and prove that she is who she claims to be. For this purpose, she must film herself in real time while performing three actions: smiling, turning her head and blinking. The video is then used by the ANTS (Agence Nationale des Titres Sécurisés, the French secure documents agency) to verify that it matches the person holding the phone (dynamic facial recognition) and to extract a photo that is compared with the one included in her passport or residence permit.
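The enrolment flow described above can be sketched as a short runnable outline. The helpers here are toy stand-ins (assumptions), not the ANTS implementation: a real system would run liveness detection and face matching on actual video frames rather than on the toy templates used below.

```python
REQUIRED_ACTIONS = {"smile", "turn_head", "blink"}

def enrol(observed_actions, video_face_template, document_face_template,
          threshold=0.9):
    """Dynamic facial recognition followed by a document match."""
    # 1. Liveness: all three requested actions must be observed live,
    #    showing the video is of a present person, not a replayed photo.
    if not REQUIRED_ACTIONS.issubset(observed_actions):
        return False
    # 2. Compare the face seen in the video with the photo stored in
    #    the biometric passport / residence-permit chip (toy metric).
    similarity = 1.0 - abs(video_face_template - document_face_template)
    return similarity >= threshold

print(enrol({"smile", "turn_head", "blink"}, 0.50, 0.51))  # True
print(enrol({"smile", "blink"}, 0.50, 0.51))  # False: one action missing
```

The CNIL's objection, discussed next, bears precisely on step 2: whether requiring this biometric comparison, with no non-biometric alternative, leaves the user's consent genuinely free.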

The CNIL’s Rationale: Lack of Free Consent

In its deliberation No. 2018-342 of 18th October 2018,[16] the CNIL had considered that “refusing the processing of biometric data impairs the account activation, and limits entirely the scope of the initial consent given during account creation.” Engaging in a strict scrutiny of the necessity principle, the CNIL pointed out that “the requirement to use a biometric mechanism to check a person’s identity for the purpose of reaching the high guarantee level of the electronic identity, as defined in the eIDAS Regulation, has not been established, in particular in view of the possibility of using alternative verification mechanisms.”

Such alternative solutions could include: a face-to-face meeting on the premises of a public institution;[17] a manual verification of the video and the photograph on the document;[18] or a live video call with an ANTS agent. The CNIL concluded that “the consent to the processing of biometric data cannot be considered as being free.”

The Conseil d'État’s Perspective: The Collection of Biometrics is Required by the High Guarantee Level and the Purpose of the Processing

The Conseil d'État adopts a more “adaptable” approach in its judgment of 4th November 2020, following the action for annulment of decree No. 2019-452 of 13th May 2019 brought by the association La Quadrature du Net. The high court considers that “it is not clear from the file documents that, on the date of the contested decree, in order to create electronic identifiers, there were other means of authenticating the identity of the user in a completely dematerialised manner while obtaining the same level of guarantee as the facial recognition system.” Therefore, the collection of biometrics carried out by ALICEM “should be considered as required by the purpose of that processing.” Furthermore, the Conseil d'État points out that persons are not compelled to use ALICEM: they can use France Connect, another public service for online identification, which does not require the processing of biometric data. Hence, the users “should not be considered as suffering an injury within the meaning of the General Data Protection Regulation” (GDPR).

That interpretation of the GDPR does, however, raise questions about the use of the ALICEM application by people who are unwilling to consent to the processing of their biometric data. It can be understood only to the extent that the electronic identity market is currently under construction, and that the private sector is expected (or at least hoped) to provide, shortly, mobile identity solutions that also offer a high guarantee level within the meaning of the eIDAS Regulation. In that respect, the ANSSI recently published a public call for comments on the requirements framework applicable to providers of remote identity-verification systems for natural persons.[19] In that first version, the framework provides for the possibility of verifying the identity of a natural person[20] either through automatic mechanisms, or through “a human interaction.”

Difficulties in the Enforcement of Legal Rules by the Welsh Police

For its part, last summer, the Court of Appeal in London ruled against the Welsh police[21] on the grounds that its “on-the-fly” use of a facial recognition system was unlawful. The judges considered that “too much discretion is currently left to individual police officers,”[22] pointing out the lack of clear guidance on the conditions governing who should be included in the “watch list”, and on the places where the police deploy facial recognition.[23] The appellate court also criticised the Welsh police for not having done enough to ensure that the software does not have any racial or gender bias,[24] and for not having correctly assessed the impact of that technology on data protection under the Data Protection Act 2018.[25]

Nonetheless, it should be noted that, while the Court of Appeal considered that, in that case, the use of facial recognition was not sufficiently supervised, it did not call into question the use of the technology itself.

What About Drones?

In France, the Conseil d'État took a similar line, recalling the legal framework governing the use of drones by the Paris Police Prefecture.[26] The goal here was to ensure compliance with lockdown measures during the state of health emergency, and more specifically to use those flying devices to detect public gatherings infringing the measures in force in certain public spaces and, if need be, to disperse any such gathering. Yet, even though the drones did not record any personal data but simply transmitted images,[27] the Conseil d'État recalled that this processing of personal data should have been authorised by a regulatory text (either an order of one of the relevant ministers, or a decree adopted after a reasoned and published opinion of the CNIL), in accordance with the provisions of Article 31 of the French Data Protection Act (loi Informatique et Libertés).[28]

The Conseil d'État therefore logically concluded that, “considering that it implies a risk of seeing arise uses that would infringe the personal data protection rules,” the implementation of this data processing on behalf of the State “without the prior involvement of a regulatory text that would authorise its creation, and set the terms of use […] and the guarantees that should accompany it clearly constitutes a serious and manifestly unlawful breach of the right to privacy.”

It should be pointed out here that the images transmitted by the drones, which were viewed in real time, were used neither to establish the violations nor to identify their authors.[29] Hence, there was no plan to use a facial recognition system.

However, the use of those remote-controlled devices is not unrelated to facial recognition, as that technology can easily be combined with other mechanisms. As the CNIL points out, “unlike, for example, the video capture and processing systems, which require the implementation of physical mechanisms, facial recognition is a software feature that can be implemented into existing systems.”[30]

It is therefore important, in order to protect our privacy and our freedoms of movement and assembly, to consider the possibility of a gradual drift through an accumulation of technologies deployed across the board. Steps must be taken to ensure that intrusive technologies, combined with one another, are not stacked up without prior overall reflection and hindsight.

The various questions related to facial recognition therefore remain socially and politically open, as both the public debate that has been initiated and the arrangements for implementation, and for reconciling our different freedoms, are far from complete.

A matter of trust, once again, or, to be more exact, of unquestionable distrust, as long as the necessary proofs and signs of trust are lacking.


Claire Levallois-Barth, Lecturer in Law at Télécom Paris, Coordinator of the VP-IP research Chair of the Institut Mines-Télécom (IMT)


[1] Legislative draft on global security brought forward on 20th October 2020 by Jean‑Michel FAUVERGUE, Alice THOUROT, Christophe CASTANER, Olivier BECHT, Yaël BRAUN‑PIVET, Pacôme RUPIN, some members of group La République en Marche et apparentés and the members of group Agir, http://www.assemblee-nationale.fr/dyn/15/textes/l15b3452_proposition-loi#.

[2] Ministry of the Interior, “White Paper on Domestic Security” (“Livre blanc sur la sécurité intérieure”) of 16th November 2020, https://www.interieur.gouv.fr/fr/Actualites/L-actu-du-Ministere/Livre-blanc-de-la-securite-interieure.

[3] Cédric O: “We need to experiment facial recognition in order for our manufacturers to move forward” (“Expérimenter la reconnaissance faciale est nécessaire pour que nos industriels progressent”), newspaper “Le Monde”, 14th October 2019, https://www.lemonde.fr/economie/article/2019/10/14/cedric-o-experimenter-la-reconnaissance-faciale-est-necessaire-pour-que-nos-industriels-progressent_6015395_3234.html.

[4] “White Paper on Domestic Security” (“Livre blanc sur la sécurité intérieure”), aforementioned, p. 263. The goal would also be to “measure the deployment complexities at the scale of large networks, in terms of computing load, of cost of deployment hardware, and of evaluating the different algorithm categories.”

[5] “Portland passes broadest facial recognition ban in the US”, by Rachel Metz, CNN Business, 10th September 2020, https://edition.cnn.com/2020/09/09/tech/portland-facial-recognition-ban/index.html.

[6] https://www.heidi.news/sciences/ibm-met-un-coup-d-arret-a-ses-activites-de-reconnaissance-faciale.

[7] “Amazon’s Shareholders Say Stop to Facial Recognition” (“Les actionnaires d’Amazon disent stop à la reconnaissance faciale”), 18th January 2019, https://www.presse-citron.net/les-actionnaires-damazon-disent-stop-la-reconnaissance-faciale/.

[8] “Racial Discrimination: In turn, Microsoft Takes a Stand Against Police Abuses” (“Discrimination raciale : Microsoft fait front à son tour contre les dérives policières”), 12th June 2020, https://global.techradar.com/fr-fr/news/microsoft-police-reconnaissance-faciale-black-lives-matter.

[9] “White Paper on Domestic Security” (“Livre blanc sur la sécurité intérieure”), aforementioned, p. 259-260.

[10] “White Paper on Domestic Security” (“Livre blanc sur la sécurité intérieure”), aforementioned, p. 264.

[11] “Facial Recognition: Advertisements Targeted Just For Your Eyes?” (“Reconnaissance faciale : des pubs ciblées rien que pour vos yeux ?”), 10th November 2017, https://www.frandroid.com/editoid/469929_reconnaissance-faciale-des-pubs-ciblees-rien-que-pour-vos-yeux.

[12] “The Nice Côte d'Azur Airport Installs PARAFE Airlock Gates with Facial Recognition” (“L'Aéroport Nice Côte d'Azur installe des sas PARAFE à reconnaissance faciale”), Aéroport de Nice, 17th July 2018, https://www.nice.aeroport.fr/Passagers/Actualites/L-aeroport-installe-des-sas-PARAFE-a-reconnaissance-faciale.

[13] Conseil d'État, 9th and 10th chambers assembled, 04/11/2020, 432656, unpublished in the Lebon Report, https://www.legifrance.gouv.fr/ceta/id/CETATEXT000042499854?tab_selection=cetat&searchField=ALL&query=432656&searchType=ALL&juridiction=TRIBUNAL_CONFLIT&juridiction=CONSEIL_ETAT&juridiction=COURS_APPEL&juridiction=TRIBUNAL_ADMINISTATIF&sortValue=DATE_DESC.

[14] Decree No. 2019-452 of 13th May 2019 authorising the creation of an electronic identification mechanism referred as “Authentification en ligne certifiée sur mobile” (“Certified online authentication on mobile phone”), French Official Gazette No. 0113 of 16th May 2019, https://www.legifrance.gouv.fr/loda/id/JORFTEXT000038475477/2019-07-10/.

[15] Regulation (EU) No. 910/2014 of the European Parliament and of the Council of 23rd July 2014 on electronic identification and trust services for electronic transactions in the internal market and repealing Directive 1999/93/EC, OJEU L 257/73 of 28th August 2014 (eIDAS Regulation).

[16] CNIL’s deliberation No. 2018-342 of 18th October 2018 giving notice on a proposed decree authorising the creation of an automated processing allowing to authenticate a digital identity by electronic means, designated ALICEM, for “Application de lecture de l’identité d’un citoyen en mobilité” (“Application for reading the identity of a citizen in a mobile context”), and modifying the Code governing the entry and residence of foreigners and asylum (request for advisory opinion No. 18008244), French Official Gazette of 17th May 2019, https://www.legifrance.gouv.fr/cnil/id/CNILTEXT000038477075/.

[17] At the prefecture, the city hall or another public service that is directly open to citizens.

[18] Identity verification carried out by an agent based on a video sent to the ANTS’s server.

[19] ANSSI, public call for comments regarding the requirement framework applicable to the providers of remote identity verification systems (Prestataires de Vérification d’Identité à Distance, PVID), 30th November 2020, https://www.ssi.gouv.fr/actualite/appel-public-a-commentaires-sur-le-referentiel-dexigences-applicables-aux-prestataires-de-verification-didentite-a-distance-pvid/.

[20] ANSSI, public call for comments regarding the requirement framework applicable to PVID, aforementioned, p. 6: “The present repository imposes requirements applicable to the providers of remote identity verification services, those services being asynchronous, synchronous with a human interaction, synchronous without a human interaction, internal or external”.

[21] Court of Appeal (Civil Division), appeal from the High Court of Justice Queen’s Bench Division (Administrative Court), Cardiff District Registry, Haddon-Cave LJ and Swift J, [2019] EWHC 2341 (Admin), Case No.: C1/2019/2670, 11/08/2020, https://t.co/L8cgiXjzYz?amp=1 (PDF).

[22] “Too much discretion is currently left to individual police officers”, §91, Court of Appeal (Civil Division), 11/08/2020, aforementioned.

[23] “It is not clear who can be placed on the watchlist nor is it clear that there are any criteria for determining where AFR [facial recognition technology] can be deployed”, §91, Court of Appeal (Civil Division), 11/08/2020, aforementioned.

[24] “We would hope that, as AFR [facial recognition technology] is a novel and controversial technology, all police forces that intend to use it in the future would wish to satisfy themselves that everything reasonable which could be done had been done in order to make sure that the software used does not have a racial or gender bias”, §201, Court of Appeal (Civil Division), 11/08/2020, aforementioned.

[25] “The inevitable consequence of those deficiencies is that, notwithstanding the attempt of the DPIA to grapple with the Article 8 issues, the DPIA failed properly to assess the risks to the rights and freedoms of data subjects and failed to address the measures envisaged to address the risks arising from the deficiencies we have found, as required by section 64(3)(b) and (c) of the DPA [Data Protection Act] 2018”, §153, Court of Appeal (Civil Division), 11/08/2020, aforementioned.

[26] Conseil d'État’s order of 18th May 2020, association “La Quadrature du Net” and Human Rights League, Nos. 440442 and 440445, https://www.conseil-etat.fr/ressources/decisions-contentieuses/dernieres-decisions-importantes/conseil-d-etat-18-mai-2020-surveillance-par-drones.

[27] Indeed, the drones were not equipped with a memory card, so no recording nor any kind of image storage was carried out.

[28] See section 31-I of the Data Protection Act (loi “Informatique et Libertés”), that states that “Is authorised through an order of one of the relevant ministers, taken after a reasoned opinion published by the CNIL, any processing of personal data implemented on behalf of the State that: 1. Is related to national security, defence or public security; or 2. Has the purpose of preventing, investigating, identifying or prosecuting criminal offences, or executing criminal convictions or security measures. The CNIL’s opinion shall be published with the order authorising the related processing.”

[29] See section 11 of the Conseil d'État’s order of 18th May 2020, aforementioned: “The intended purpose of the contentious mechanism is not to ascertain violations nor to identify their authors, but to inform the headquarters of the Paris Police Prefecture to make it possible to decide, in due course, to deploy an on-site intervention unit tasked with conducting the dispersal of the gathering in question or evacuating places closed to the public in order to put a stop to or prevent the public disturbance that arises from any disrespect of the health security rules.”

[30] CNIL, “Facial Recognition: Calling For a Public Debate Commensurate With the Stakes Involved” (“Reconnaissance faciale : pour un débat à la hauteur des enjeux”), 15th November 2019, p. 4, https://www.cnil.fr/fr/reconnaissance-faciale-pour-un-debat-la-hauteur-des-enjeux.
