Doesn’t facial recognition surveillance make us safer?
Jeremy Peckham - Research Lead at the AI, Faith and Civil Society Commission & Mohammed Ahmed - Research Manager at the AI, Faith and Civil Society Commission
Background and Context
Live Facial Recognition (LFR) technology, which scans facial images in real time and compares them against watchlists of persons of interest, has become a contentious issue worldwide. The UK serves as a critical case study, highlighting both the benefits and risks associated with its adoption. While proponents argue that LFR enables swift identification of suspects, critics warn of its implications for privacy, civil liberties, and systemic bias.
LFR in Practice: The UK Experience
In 2024, the use of LFR surged in England and Wales, with the Metropolitan Police deploying it 117 times between January and August, compared with 32 instances in the prior three years. Arrests aided by LFR reached 360 by October 2024, underscoring its utility in law enforcement. However, high-profile incidents such as the wrongful identification of Shaun Thompson, a Black anti-knife crime activist, highlight the risks of misidentification. Studies of systems deployed by the Met and South Wales Police found that more than 86% of the matches flagged were false positives, with errors disproportionately affecting racial minorities.
Globally, jurisdictions such as the US and the EU have taken a more restrictive approach. The EU's Artificial Intelligence Act, enacted in 2024, introduced a near-total ban on LFR, permitting it only for serious crimes and with prior judicial approval. Conversely, the UK continues to expand LFR's scope, deploying it at events such as Beyoncé's Cardiff concert and during public disturbances.
Broader Implications
The UK’s extensive use of LFR reflects a broader trend of integrating artificial intelligence into public surveillance. Countries such as China and Russia have heavily invested in LFR, often sacrificing individual freedoms for state control. In contrast, democratic nations face the challenge of balancing public safety with the protection of civil liberties. The “revolving door” between regulatory agencies and technology firms, as seen in the UK with the former Biometrics Commissioner joining Facewatch, raises concerns about oversight independence.
Human Values Risk Analysis
Truth and Reality – LOW RISK
No direct impact on truth and reality
Privacy and Freedom – HIGH RISK
Mass surveillance scans biometric data indiscriminately, infringing on individuals' right to privacy. Weak regulatory frameworks and self-policing by tech firms undermine public trust.
Authentic Relationships – MEDIUM RISK
LFR systems often exhibit racial and gender bias, exacerbating inequalities and eroding trust between over-surveilled communities and the authorities.
Moral Autonomy – MEDIUM RISK
Awareness of pervasive surveillance may deter lawful activities, including protest and free expression, constraining individuals' freedom to act on conscience.
Dignity of Work – LOW RISK
No direct impact on dignity of work
Cognition and Creativity – LOW RISK
No direct impact on cognition and creativity
Policy Recommendations
- Comprehensive Regulation:
Enact specific legislation to govern LFR use, mandating transparency, accountability, and stringent safeguards. Require judicial oversight for deployments, akin to the EU’s AI Act.
- Independent Oversight:
Establish a robust, independent body to regulate LFR technologies, replacing the fragmented oversight system in the UK. Prohibit conflicts of interest, such as former regulators joining LFR firms.
- Bias Mitigation:
Mandate regular audits of LFR algorithms to identify and correct racial and gender biases. Require diverse datasets to train AI models, ensuring equitable outcomes.
- Targeted Deployment:
Restrict LFR use to specific scenarios, such as locating missing persons or preventing terrorism, with clear limitations on data retention.
- Public Awareness and Education:
Conduct public consultations to gauge societal acceptance and concerns regarding LFR. Develop educational campaigns to increase awareness of privacy rights and the ethical implications of AI technologies.
- Global Collaboration:
Align with international standards on AI governance to prevent the UK from becoming an outlier. Participate in multilateral discussions to ensure LFR use respects human rights globally.
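The "Bias Mitigation" recommendation above calls for regular audits of LFR algorithms. As a minimal sketch of one metric such an audit might compute (all function names, thresholds, and data here are hypothetical, not any agency's actual audit tool), the following measures the false-positive rate among flagged matches for each demographic group and checks whether the gap between groups stays within a chosen tolerance:

```python
# Illustrative audit sketch: per-group false-positive rates among LFR alerts.
# The 10% disparity tolerance is an arbitrary placeholder, not a standard.
from collections import defaultdict

def false_positive_rates(records):
    """records: iterable of (group, flagged_as_match, was_true_match) tuples.
    Returns, per group, the share of flagged matches that were wrong."""
    flagged = defaultdict(int)
    false_pos = defaultdict(int)
    for group, is_match, is_true in records:
        if is_match:
            flagged[group] += 1
            if not is_true:
                false_pos[group] += 1
    return {g: false_pos[g] / flagged[g] for g in flagged}

def audit(records, max_disparity=0.10):
    """Return (rates, passed): passed is False when the worst and best
    group rates differ by more than max_disparity."""
    rates = false_positive_rates(records)
    disparity = max(rates.values()) - min(rates.values())
    return rates, disparity <= max_disparity

# Hypothetical trial data: (demographic group, system flagged a match, match was correct)
trial = [
    ("group_a", True, True), ("group_a", True, False), ("group_a", False, False),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
]
rates, passed = audit(trial)
```

The rate-per-alert framing mirrors how civil society groups such as Big Brother Watch have reported LFR accuracy; a fuller audit would also examine false-negative rates and the representativeness of the training data, as the recommendation notes.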
References
Police Use of Live Facial Recognition Technology, House of Commons Library, 2024
Big Brother is Watching You, March 2024
Big Brother Watch, statistics on live facial recognition, May 2023