Notting Hill face-recognition technology will be used without bias

In a letter to the commissioner, 11 groups said the technology is a 'mass surveillance tool that treats all carnival-goers as potential suspects' and has 'no place at one of London's biggest cultural celebrations'.
The letter also said that LFR technology was 'less accurate for women and people of colour' in certain settings.
Responding to the concerns, Sir Mark said the technology will help locate any dangerous individuals attending Notting Hill carnival over the August bank holiday weekend.
He wrote that when the technology was used at the carnival in 2016 and 2017, it 'did not build public confidence', but has since 'significantly improved' and now performs to a 'much higher standard'.
Sir Mark acknowledged concerns about bias in facial recognition technology, adding that the force has selected the algorithm it uses 'with care' and knows how to use it in a non-discriminatory way.
It comes after the letter, signed by groups including Liberty and Big Brother Watch, said there is 'no clear legal basis' for Scotland Yard's use of LFR.
The letter added: 'Notting Hill Carnival is an event that specifically celebrates the British African Caribbean community, yet the MPS (Metropolitan Police Service) is choosing to use a technology with a well-documented history of inaccurate outcomes and racial bias.'
Rebecca Vincent, interim director at Big Brother Watch, said she is 'deeply disappointed' that the Met 'has chosen to dig its heels in' after the call to scrap the 'Orwellian' technology.
She added: 'We all want criminals off the streets, but turning (the) carnival into a mass police line-up is not the way to do it.'
About 7,000 officers and staff will be deployed each day over the weekend.
LFR cameras will be used by police at the carnival to search for people who are marked as wanted on the Police National Computer.
Meanwhile, a UK retail facial recognition system has reported its highest-ever monthly total of suspect alerts, its operators say.
In July 2025, Facewatch sent 43,602 alerts to subscriber retail stores – more than 10,000 suspects flagged every week for the first time, and a 134.8% increase on July 2024 (18,564).
Over the 12 months to July 31, Facewatch said it recorded 407,771 alerts in total, with current live data already showing the rising trend continuing into August.
Nick Fisher, chief executive of Facewatch, said: 'July's record numbers are a further stark warning that retailers and their employees are facing unprecedented levels of criminal activity, including violent and aggressive behaviour.'
A spokeswoman for Big Brother Watch said: 'This technology turns shoppers into walking barcodes and makes us a nation of suspects, with devastating consequences for people's lives when it inevitably makes mistakes.'
Related Articles

Metropolitan Police's policy over live facial recognition ‘unlawful'

South Wales Guardian • 3 hours ago

The Equality and Human Rights Commission (EHRC) has said the UK's biggest police force's rules and safeguards over using the tool 'fall short' and could have a 'chilling effect' on individuals' rights when used at protests.

The concerns come as the Met is set to deploy LFR, which captures people's faces in real time via CCTV cameras, at this year's Notting Hill Carnival over the August bank holiday weekend. Metropolitan Police commissioner Sir Mark Rowley has already sought to reassure campaign groups that the technology will be used without bias.

The EHRC has been given permission to intervene in an upcoming judicial review over LFR, brought by privacy campaigner and Big Brother Watch director Silkie Carlo and anti-knife crime community worker Shaun Thompson. They are bringing the legal challenge claiming Mr Thompson was 'grossly mistreated' after LFR wrongly identified him as a criminal last year.

EHRC chief executive John Kirkpatrick said the technology, when used responsibly, can help combat serious crime and keep people safe, but the biometric data being processed is 'deeply personal'.

'The law is clear: everyone has the right to privacy, to freedom of expression and to freedom of assembly. These rights are vital for any democratic society,' he said. 'As such, there must be clear rules which guarantee that live facial recognition technology is used only where necessary, proportionate and constrained by appropriate safeguards.

'We believe that the Metropolitan Police's current policy falls short of this standard. The Met, and other forces using this technology, need to ensure they deploy it in ways which are consistent with the law and with human rights.'

The watchdog said it believes the Met's policy is 'unlawful' because it is 'incompatible' with Article 8 (right to privacy), Article 10 (freedom of expression) and Article 11 (freedom of assembly and association) of the European Convention on Human Rights.
Big Brother Watch interim director Rebecca Vincent said the EHRC's involvement in the judicial review was hugely welcome in the 'landmark legal challenge'.

'The rapid proliferation of invasive live facial recognition technology without any legislation governing its use is one of the most pressing human rights concerns in the UK today,' she said. 'Live facial recognition surveillance turns our faces into barcodes and makes us a nation of suspects who, as we've seen in Shaun's case, can be falsely accused, grossly mistreated and forced to prove our innocence to authorities.

'Given this crucial ongoing legal action, the Home Office and police's investment in this dangerous and discriminatory technology is wholly inappropriate and must stop.'

It comes after Home Secretary Yvette Cooper last week defended plans to expand LFR across the country to catch 'high-harm' offenders. Last month, the Metropolitan Police announced plans to expand its use of the technology across the capital: police bosses said LFR will now be used up to 10 times per week across five days, up from the current four times per week across two days.

The Metropolitan Police has been contacted for comment.

Metropolitan Police's policy over live facial recognition ‘unlawful'

Leader Live • 3 hours ago

I now think police use of live facial recognition will make us safer – here's why you should think so too

The Guardian • 4 hours ago

I was a Metropolitan police officer for more than 30 years and policed Notting Hill carnival for many of them, from the anti-police violence of the 1970s as a constable, to being a chief inspector bronze commander in the 1990s. In the 2000s, when the rightwing press succeeded in removing 'an openly gay police commander who was soft on drugs' (I advocated alternatives to arresting people for small amounts of cannabis) as the cop in charge of Brixton, 'the capital of Black Britain', local people signed a 5,000-signature petition and held a rally at Brixton town hall demanding my reinstatement. So, when Mark Rowley recruited me in 2023 as an 'access all areas' non-executive director of the Met, I was not a 'no-risk hire', as the commissioner put it.

I am still concerned about the culture of the Met and the disproportionate use of police powers against black people, especially 'stop and search', but I am no longer concerned about the way the Met uses live facial recognition (LFR).

Millions of people attend Notting Hill carnival every year and succeed in their sole aim of having a wonderful time, but having so many people in such a small area attracts a tiny minority intent on exploiting the situation to commit crime, including violence and sexual offences. LFR provides a non-discriminatory way of identifying some of those offenders on their way to the event with zero impact on anyone else.

Past failings, which meant live facial recognition was more likely to misidentify black people and women, have been eradicated at the settings used by the Met, and it would be self-defeating if the force used this technology in any other way; false positives are currently running at one in 33,000 for everyone. The system 'sees' people's faces, compares them with specific police databases of facial images, and immediately discards the live images if there isn't a match.
It's like watching TV without recording it, unless there's a match, and if there is anyone else in the shot, they are automatically pixelated out. The types of image databases used are, for example, of people on the sex offenders register who have restrictions on where they should be, and those wanted for serious criminal offences. A match results in officers engaging the individual in a conversation to establish exactly who they are and what they are doing there.

It would be much better if there were a consolidated, statutory framework that specifically addressed live facial recognition, instead of the many and various legal checks and balances currently in place, but that failure lies at the door of successive governments. The Met doesn't get everything right, but I trust those responsible for the development and deployment of live facial recognition, and compared with other police crime prevention measures, such as blanket, no-suspicion 'stop and search', it is much less intrusive.

The days when we used to say, 'trust me, I'm a police officer' are sadly in the past, but things are improving. As a former police whistleblower, I wouldn't say it if I didn't believe it.

Brian Paddick is a peer, a former senior police officer and a non-executive adviser for the Metropolitan police
