Abstract
Facial Recognition Technology (FRT) has become a powerful tool for law enforcement agencies, enabling the automated identification of individuals through their unique facial features. Its growing use across Europe has raised fundamental legal and ethical concerns, particularly regarding privacy and data protection. This article analyses the implications of FRT under European Union (EU) law, evaluates the shortcomings of current regulation—especially the Artificial Intelligence Act (AI Act)—and proposes guidelines for a comprehensive legal framework.
1. Introduction
FRT is increasingly deployed in criminal investigations to identify suspects, victims, or missing persons. By 2021, 11 out of 27 EU Member States had implemented FRT in law enforcement contexts. However, the European Court of Human Rights (ECtHR) has ruled that unregulated use of FRT—for example, analysing CCTV or social media footage to identify protesters—violates fundamental rights. This demonstrates the urgent need for stricter regulation to balance security benefits with privacy safeguards.
2. Technical Foundations of FRT
FRT is a biometric technology that operates in two main modes:
- Identification (1:n) – comparing one face against a database.
- Verification (1:1) – matching one face with another to confirm identity.
Advances in Artificial Intelligence and deep learning have made FRT far more accurate and widely applicable. Yet, challenges such as algorithmic bias, false positives, and discrimination—especially against minorities and women—remain significant.
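The two matching modes can be illustrated with a minimal sketch. Modern FRT systems typically reduce a face image to a numeric embedding and compare embeddings by a similarity score; the function names, toy 3-dimensional vectors, and the 0.8 threshold below are illustrative assumptions, not a real FRT implementation.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(probe, reference, threshold=0.8):
    """1:1 verification: does the probe match one claimed identity?"""
    return cosine_similarity(probe, reference) >= threshold

def identify(probe, database, threshold=0.8):
    """1:n identification: best-matching identity above the threshold, or None."""
    best_id, best_score = None, threshold
    for identity, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Toy embeddings stand in for real biometric templates.
db = {"alice": [1.0, 0.1, 0.0], "bob": [0.0, 1.0, 0.2]}
probe = [0.95, 0.15, 0.05]
print(verify(probe, db["alice"]))  # True
print(identify(probe, db))         # alice
```

The sketch also shows where false positives originate: any probe whose similarity to a stored template exceeds the threshold is reported as a match, so threshold choice directly trades off false positives against missed identifications.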
3. FRT as Multi-Step Data Processing
The use of FRT in law enforcement follows three stages:
- Data Collection – capturing facial images from CCTV, bodycams, or databases.
- Data Analysis – transforming images into biometric templates and matching them against databases.
- Data Exploitation – using results in investigations, as evidence, or for sharing with other agencies.
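The three stages above can be modelled as a simple pipeline in which each step is logged as a distinct processing act, mirroring the legal point that every stage constitutes a separate interference requiring its own justification. This is a purely illustrative sketch; the stage names follow the article, while the function and placeholder strings are assumptions.

```python
def run_frt_pipeline(source: str) -> tuple[str, list[str]]:
    """Model the three FRT stages, logging each as a separate processing act."""
    log = []

    # Stage 1: data collection -- capture a facial image from a source
    # such as CCTV, a bodycam, or an existing database.
    image = f"image<{source}>"
    log.append("collection")

    # Stage 2: data analysis -- transform the image into a biometric
    # template and match it against reference databases.
    template = f"template<{image}>"
    log.append("analysis")

    # Stage 3: data exploitation -- use the match result in an
    # investigation, as evidence, or share it with other agencies.
    result = f"match<{template}>"
    log.append("exploitation")

    return result, log

result, log = run_frt_pipeline("cctv_feed")
print(log)  # ['collection', 'analysis', 'exploitation']
```

Keeping a per-stage audit record is one way a system could support the proportionality review that each stage must separately withstand.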
Each stage interferes with the right to privacy under Article 8 of the European Convention on Human Rights (ECHR) and Articles 7–8 of the Charter of Fundamental Rights of the EU (CFREU). Such interferences must therefore be strictly necessary, legally justified, and proportionate.
4. Regulation: AI Act and Beyond
The EU AI Act (effective August 2024) classifies FRT as a high-risk AI system. It prohibits most uses of real-time FRT in public spaces, except in exceptional cases such as preventing terrorist attacks, finding missing persons, or investigating serious crimes. Nevertheless, the AI Act provides only baseline rules. Member States must still create specific laws defining when and how law enforcement can lawfully collect, analyse, and exploit biometric data.
5. Conclusion
FRT represents a new kind of biometric data analysis with significant implications for fundamental rights. While it offers powerful tools for criminal investigations, its deployment requires strict legal frameworks that regulate every stage of data processing. Without such regulation, the use of FRT in law enforcement remains legally problematic.

Source: Simmler, M., & Canova, G. (2025). Facial recognition technology in law enforcement: Regulating data analysis of another kind. Computer Law & Security Review, 56, 106092. https://doi.org/10.1016/j.clsr.2024.106092