The United Kingdom is unique in Europe for its large-scale deployment of real-time facial-recognition systems in public spaces. This week, hundreds of cameras were set up at the Notting Hill Carnival in London, an event that draws millions of attendees. According to police, the goal is to identify and intercept wanted individuals by scanning faces in large crowds and matching them against a database of suspects.
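In broad terms, such systems convert each detected face into a numerical “embedding” and compare it against the embeddings of people on a watchlist; a match is declared when the similarity clears a preset threshold. The minimal Python sketch below illustrates that matching step only — the watchlist, embeddings, and threshold are hypothetical placeholders, not details of the Met’s actual pipeline.

```python
import numpy as np

# Hypothetical watchlist: one L2-normalised 128-d embedding per suspect.
# A real system would produce these with a face-embedding model.
rng = np.random.default_rng(0)
watchlist = {f"suspect_{i}": rng.normal(size=128) for i in range(3)}
watchlist = {name: v / np.linalg.norm(v) for name, v in watchlist.items()}

MATCH_THRESHOLD = 0.6  # illustrative operating point, not a deployed value

def match_face(face_embedding: np.ndarray) -> str | None:
    """Return the best-matching watchlist ID, or None if no cosine
    similarity clears the alert threshold."""
    probe = face_embedding / np.linalg.norm(face_embedding)
    best_name, best_score = None, -1.0
    for name, ref in watchlist.items():
        score = float(probe @ ref)  # cosine similarity of unit vectors
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= MATCH_THRESHOLD else None

# A probe close to suspect_1 should raise an alert; an unrelated face should not.
noisy_probe = watchlist["suspect_1"] + 0.05 * rng.normal(size=128)
print(match_face(noisy_probe))           # -> suspect_1
print(match_face(rng.normal(size=128)))  # -> None
```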
The technology has been hailed as an “effective policing tool” by Metropolitan Police Commissioner Mark Rowley, who noted that it has resulted in over 1,000 arrests since the start of 2024. Its use has grown considerably over the past three years: according to the NGO Liberty, some 4.7 million faces were scanned in 2024 alone, and UK police have deployed live facial recognition about 100 times since late January, a stark increase from just 10 deployments between 2016 and 2019.
Privacy and Ethical Concerns
The technology’s expanding deployment has drawn significant controversy and criticism. Organizations like Big Brother Watch argue that this mass data capture “treats us like a nation of suspects.” Rebecca Vincent, its interim director, highlighted the lack of a legal framework, stating that without legislation, police are essentially “left to write their own rules.”
There are also concerns about the technology’s use by private entities. Retailers, including supermarkets and clothing stores, are using a service called Facewatch to combat shoplifting. This service compiles a list of suspected offenders and alerts staff if one enters a store. Critics argue that this removes the possibility of “living anonymously” in a city and can have major implications for political and cultural life, as people often don’t know they are being profiled.
Human rights groups have likewise challenged police use of the technology. Eleven organizations, including Human Rights Watch, wrote to the Metropolitan Police Commissioner urging him not to deploy it during the Notting Hill Carnival, arguing that it “unfairly target[s]” the Afro-Caribbean community and exhibits racial bias. In one concrete case, Shaun Thompson, a 39-year-old Black man, said he was wrongly stopped by police after being misidentified by one of the cameras and has since brought a legal challenge against the force.
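Misidentifications of this kind are a direct consequence of where the match threshold is set: a face match is probabilistic, so lowering the threshold flags more innocent passers-by while raising it lets more genuine watchlist matches slip through. The toy sketch below, using entirely synthetic score distributions rather than measured figures from any deployed system, makes the trade-off concrete.

```python
import numpy as np

# Purely synthetic similarity scores, for illustration only: "impostor"
# scores come from faces not on the watchlist, "genuine" scores from
# faces that are. Real systems estimate these distributions empirically.
rng = np.random.default_rng(1)
impostor_scores = rng.normal(loc=0.30, scale=0.10, size=100_000)
genuine_scores = rng.normal(loc=0.75, scale=0.10, size=1_000)

for threshold in (0.50, 0.60, 0.70):
    false_alert_rate = np.mean(impostor_scores >= threshold)
    miss_rate = np.mean(genuine_scores < threshold)
    print(f"threshold={threshold:.2f}  "
          f"false alerts: {false_alert_rate:.3%}  "
          f"missed matches: {miss_rate:.2%}")
```

At the scale reported above, with millions of faces scanned in a year, even a false-alert rate of a fraction of a percent translates into thousands of wrongful flags; and if score distributions differ across demographic groups, as independent audits of face-recognition algorithms have repeatedly found, a single global threshold produces unequal error rates between those groups.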
The UK’s human rights regulator has also weighed in, stating that the Metropolitan Police’s policy on using the technology is “unlawful” and “incompatible” with human rights law. This stands in contrast to the European Union, whose AI Act prohibits real-time facial recognition in public spaces, with narrow exceptions for serious crimes and threats such as terrorism.

