Latest news with #LindseyChiswick


ITV News
5 days ago
Live facial recognition to be expanded across the UK with more police forces using the technology
Live facial recognition (LFR) will be expanded across the UK as part of a government remodel of neighbourhood policing. The government says the technology will be introduced to catch 'high-harm' offenders, with new rules set to be drawn up this autumn to ensure 'safeguards and oversight', while balancing public and expert views on how it should be used. The technology is already in use in some areas of the country, but Wednesday's announcement will see a further ten vans equipped with cameras rolled out over the coming weeks across seven police forces: Greater Manchester, West Yorkshire, Bedfordshire, Surrey, Sussex, Thames Valley and Hampshire. Currently, the rules say LFR can only be used to check against a police watchlist of wanted criminals, suspects and those subject to bail or court order conditions, such as sex offenders. Police compile this list before each deployment of LFR technology and say it is intelligence-led and targeted to the specific area in which they are working. Signs are positioned to alert the public that LFR is being used, and the cameras scan the faces of every individual who walks by. These scans are compared against the criminal watchlist and, if there is no match, are automatically deleted. When the technology thinks a person matches someone on the list, it alerts a team of officers, showing them a comparison of the photos so they can decide whether to stop the person. The home secretary, Yvette Cooper, said deployments would be focused on identifying sex offenders or 'people wanted for the most serious crimes who the police have not been able to find'. Addressing specific concerns over its accuracy and use, the Home Office added: 'The algorithm being used in the vans has been independently tested and will only be operated in specific circumstances and with robust oversight.' Privacy campaigners have previously voiced concerns about a potential lack of regulation and transparency over the expansion of the technology.
There have also been concerns raised over racial bias in the system and the accuracy with which it operates under certain conditions. The technology was tested by the National Physical Laboratory (NPL), which looked at whether the algorithm had any racial bias. The NPL concluded that the threshold at which the machine alerts officers to a match, which can be set by police forces, does include a racial bias at certain low levels of confidence. Above a certain level, however, their research showed no racial bias was present. Police forces insist the technology is now never operated below this level of confidence in a match, and all data on deployments, including the threshold at which the technology was operating that day, is publicly available online. Forces already deploying live facial recognition have used it to arrest rape, domestic abuse, knife crime and robbery suspects, as well as sex offenders breaching their conditions. Lindsey Chiswick, National Police Chiefs' Council (NPCC) lead for facial recognition, said it had already been used 'to great success, locating thousands of wanted offenders, or others breaching their bail conditions'. She added: 'I am confident that the increased use of this technology will continue to support the safety of communities across the country moving forward.' South Wales Police, who have been using the technology for some years and are co-ordinating this national rollout, said they understood the public's concerns. They highlighted, however, that in their own force the use of LFR had 'never resulted in a wrongful arrest' and there had 'been no false alerts for several years as the technology and our understanding has evolved'. As well as concerns about bias, campaign group Big Brother Watch is worried about what the wider rollout of this technology means for privacy. Interim director Rebecca Vincent told ITV News the timing of this rollout was concerning. "We don't yet have any legislation or framework to govern the use of this.
The Home Secretary has said that's coming, but why the rush to get more vans on the street when we don't yet have that oversight and accountability?" She added: "Sometimes critics of groups like ours say 'nothing to hide, nothing to fear'. That's not quite right - we have the presumption of innocence in this country. What live facial recognition does is reverse that. It treats us as a nation of suspects until proven otherwise."


BBC News
12-07-2025
Croydon's fixed facial recognition cameras spark debate
The summer heat always brings Croydon town centre to life, with a positive, lively atmosphere. However, some say the mood changes noticeably as the sun sets. Traders say they are forced to hire extra security to prevent thefts, and that drug-related gang crime is prevalent. The Met Police is using live facial recognition (LFR) to look for criminals, with fixed cameras being installed in Croydon for the first time anywhere in London. The force says the technology will keep the streets safer, but Big Brother Watch said the scale of the surveillance was "alarming". LFR cameras scan faces in real time and help police find those who are wanted. In March, the Met announced it would install fixed facial recognition cameras on North End and London Road in Croydon, after previously trialling mobile devices. The fixed LFR cameras look like regular CCTV but will only be switched on when the technology is in use, the Met says. LFR has already been used in several London boroughs, including during the King's Coronation in 2023. The Met says it has arrested more than 1,000 people using the technology so far across London, including 93 registered sex offenders and several others in breach of court orders. So far this year, LFR has scanned 1.5 million faces in London, leading to 459 arrests – roughly one for every 3,300 scans. However, more than half of the Met's so-called "true matches" did not result in an arrest. Lindsey Chiswick, who leads the Met's facial recognition programme, said: "This technology is making London safer by removing dangerous offenders. It's saving officers valuable time and delivering quicker, more accurate results." 'After 4pm, it's different' Live facial recognition cameras are something that shopkeeper Mohammed Kamzi welcomes, although he has concerns. Mohammed has worked on North End for four years, a time he said has seen rising crime and waning police visibility. "It's ok in the morning, we can manage with two staff members," said Mohammed, who owns U Fone. "After 4pm, it's different.
We need three or four people working because someone has to be ready to chase thieves while another watches the shop." "We see crime here all the time," he added. He said that although the LFR cameras had benefits in terms of safety, he feels some customers aren't comfortable with them: "They feel anxious, like they've done something wrong." Shane Barrett, a local resident, believes the technology could help, but questions its placement. He said: "Croydon's getting worse, there are stabbings every other week. The cameras might help, but the stabbings happen on side streets and around Surrey Street, not North End." Shopper Helen Matthews said she wanted more clarity from the police. "I can see it being useful, but we don't know how it works or when it'll be used." 'Dystopian effect' Some civil liberties groups also have deep concerns. Madeleine Stone, a senior advocacy officer at Big Brother Watch, said the new cameras would have "a chilling and dystopian effect on the high street". Charlie Whelton, from human rights group Liberty, said the technology was being used without proper legal oversight. "It's a regulatory wild west," he said. Mr Whelton warned there is currently no unified legislation guiding its use nationwide, and the cameras can be used to watch anyone. The Met insists it has strict safeguards in place. It says biometric data is permanently deleted if someone isn't on the watchlist, and added that independent testing by the National Physical Laboratory found the system to be accurate and showed no significant bias based on race or gender. A Met spokesperson said: "We're committed to making London safer by using technology to target the most dangerous offenders. We continue to engage with the public to explain how the technology works and to reassure people that strong privacy protections are in place."


BBC News
04-07-2025
Facial recognition cameras help make 1,000 arrests, Met says
Live facial recognition technology (LFR) is helping the Met stay ahead of criminals at a time "where money is tight," according to the force's director of intelligence. Lindsey Chiswick, the lead for LFR at the Met and nationally, said more than 1,000 wanted criminals had been arrested since January 2024 using the tool, including paedophiles, rapists and violent offenders. She said it "would be madness" if officers did not keep pace with available technology in order to protect the public. Privacy campaigners say there's been an "alarming escalation" in police use of LFR, which maps a person's unique facial features and matches them against faces on watch lists. Since the start of 2024, a total of 1,035 arrests have been made using live facial recognition, including 93 registered sex offenders. Of those arrested, 773 have been charged or cautioned. The tool is also being used to check up on people who have court conditions imposed, including sex offenders and stalkers. These include 73-year-old David Cheneler, a registered sex offender, who was picked up on LFR cameras in January in Denmark Hill, south-east London, with a six-year-old girl. "Her mother had no idea about his offending history," says Ms Chiswick. "Without LFR that day, officers probably wouldn't have seen him with that child, or thought anything was amiss." Cheneler was jailed for two years for breaching his Sexual Harm Prevention Order, which banned him from being alone with young children, and for possessing an offensive weapon.
But some have expressed concerns over the Met's increasing use of LFR, and plans for a pilot scheme in Croydon, south London, where fixed cameras will be mounted on street furniture from September, instead of being used by a team in a mobile van. The Met says the cameras will only be switched on when officers are using LFR in the area. Green Party London Assembly member Zoe Garbett previously described the pilot as "subjecting us to surveillance without our knowledge". Interim director of Big Brother Watch, Rebecca Vincent, said it represented "an alarming escalation" and that the technology is "more akin to authoritarian countries such as China". Ms Chiswick said while she understood concerns, she believed the Met was taking "really small, careful steps". She added: "I think criminals are exploiting that technology and I think it would be madness if we didn't keep pace with that and allow criminals to have access to tools which police didn't. The world is moving fast, we need to keep up." We joined police on a recent LFR deployment in Walthamstow, where a mobile van was parked up in an area between the Tube station and the market on the high street, a hot spot for theft and robbery. Police told me the bespoke watch list, created for each deployment, had been compiled around 5 o'clock that morning, contained 16,000 names of wanted offenders, and would be deleted at the end of the day. But before I even reached the van, or spotted the sign alerting that live facial recognition cameras were being used, officers had already spotted me. My face had been scanned, and flagged as a potential match to my photo, which police had earlier added to their system so we could demonstrate how it works. The officers' handsets bleeped as the two images blinked up on their screens. In this case, it was a 0.7 match. Anything less than 0.64, they said, is considered unreliable and the captured image is deleted. "I've got no wish to put technology on the streets of London that is inaccurate or biased," Ms Chiswick told me.
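The decision rule officers describe (a score of 0.7 alerting them, anything below 0.64 discarded) can be sketched in a few lines. This is purely an illustration of the threshold logic reported here, not the Met's actual system; the function name and return values are assumptions.

```python
# Hypothetical sketch of the threshold rule described by officers:
# scores below 0.64 are treated as unreliable and the captured image
# is deleted; scores at or above it raise an alert for human review.

ALERT_THRESHOLD = 0.64  # figure quoted by police in this report

def handle_scan(similarity_score: float) -> str:
    """Decide the outcome of one face scan given its best watchlist score."""
    if similarity_score >= ALERT_THRESHOLD:
        return "alert_officers"   # officers compare photos and decide on a stop
    return "delete_biometrics"    # non-matches are deleted immediately

# The reporter's staged test scored 0.7, above the threshold.
print(handle_scan(0.70))  # alert_officers
print(handle_scan(0.50))  # delete_biometrics
```

Note that the alert is only a prompt: per the article, officers always make the final decision on whether to stop the person.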
She said that the threshold had been selected after tests by the National Physical Laboratory. "All algorithms have some level of bias in them. The key thing for police to understand is how to operate it in order to ensure there is no bias at that level." The two images of my face were also flagged to the team inside the mobile van. They showed me their monitors, where the cameras were scanning all the faces in the crowd, before quickly pixelating them. If it is not a match, officers told me, their biometric data is immediately deleted. When someone on the list is identified by the system, police officers then take over. "Live facial recognition is meant to work alongside your natural policing skills," explains Supt Sarah Jackson, "so you should be able to approach those people that have been activated by the cameras, go and talk to them and ascertain if the cameras are correct." But what happens when the cameras get it wrong? How do those people react? "That does happen on very few occasions," Supt Jackson acknowledged. "There's always the possibility to upset people in any walk of life if they're stopped by police. But by and large, people are happy." Ms Chiswick said since January this year, there had been 457 arrests and seven false alerts. During the Walthamstow deployment, police told me eight arrests were made, including for drug offences, stalking and driving offences, and that there were no false alerts. It's not just campaigners who are concerned over potential misidentification. Local resident Christina Adejumo approached me to ask why she'd just seen a man being handcuffed in the middle of the street. When I explained he'd been picked up by the live facial recognition cameras, she told me she thought they were a good idea, but questioned their accuracy. "It can be said, 'sorry, it's not you', but the embarrassment of what happened that day cannot be taken away from him." Ann Marie Campbell said: "I think it's a good idea because of public safety."
She also hoped it would help tackle pickpocketing. "This is very good to prevent crimes," Ansar Qureshi agreed. Is he worried about privacy? "I don't mind, because I don't have anything to hide," he told me. Caroline Lynch said she was "disgusted" by the technology. "I don't feel safer, no. It's just more and more 'Big Brother'." She insisted she'd rather the money was spent on safety measures, including putting more police on the streets. "I can get on to the Tube at 12 o'clock at night and there's absolutely no-one there to protect us." Earlier this year, the Met Commissioner Sir Mark Rowley warned that the force is "a shrinking organisation" and faces losing around 1,700 officers, PCSOs and staff by the end of the year without more money from government. "We're in an environment where money is tight," Ms Chiswick said. "We're having to make some difficult choices, and we know that technology and data can help us be more precise." But Rebecca Vincent, interim director of Big Brother Watch, said there was a lack of parliamentary oversight and scrutiny of LFR. "It's a massive privacy violation for people going about their daily life. There is no primary legislation governing this invasive technology. It means that police are being left to write their own rules." She said it was unclear whether in future officers might be given access to other data, such as driving licences or passports. Ms Chiswick said human rights and data protection laws, as well as guidance from the College of Policing, helped police to understand how the technology should be used. She said the watchlist was always intelligence-led, and only included those who were wanted by police or the courts, or who were subject to court-imposed conditions. She told me that she believed there was "provision" for LFR to be used to search for vulnerable missing children, which might be considered less invasive to their privacy than making a public appeal, but that this had not yet been done.
Emily Keaney, deputy commissioner of regulatory policy at the Information Commissioner's Office (ICO), said these cameras can help to prevent and detect crime, but added that "its use must be necessary, proportionate and meet expectations of fairness and accuracy". She said they were working with the Met to review the safeguards in place and would be "closely monitoring" its use. She added: "LFR is an evolving technology and, as our 2025 AI and biometrics strategy sets out, this is a strategic priority for the ICO. In 2019, we published guidance for police forces to support responsible governance and use of facial recognition technology. We'll continue to advise government on any proposed changes to the law, ensuring any future use of facial recognition technology remains proportionate and publicly trusted." The Home Office said facial recognition was "a crucial tool to keep the public safe that can identify offenders more quickly and accurately, with many serious criminals already brought to justice through its use". It added: "All police forces using this technology are required to comply with existing legislation on how and where it is used. We will set out our plans for the future use of facial recognition technology in the coming months, including the legal framework and safeguards which will ensure it is properly used."


The Sun
03-07-2025
The new secret weapon nailing dangerous criminals in the street including sick paedo who befriended family
AT first glance, it looks like just another white van parked up by the side of the road. But the unremarkable Iveco truck is a secret weapon for cops in their drive to nail dangerous fugitives. Covered in cameras, it houses a bank of screens and hi-tech computer equipment — the centrepiece of a Met Police Live Facial Recognition (LFR) deployment. And it plays a crucial role in safeguarding children and the vulnerable from sex offenders, stalkers and violent criminals. Seven cameras perched on top of the vehicle constantly scan the pedestrians walking by — feeding the images into a computer which flags up suspects wanted by cops. It also looks out for people on the run or subject to court orders for a range of offences, including sex crimes, to ensure they have not committed any breach. Known haunts The Sun joined a Met Live Facial Recognition team on a deployment close to Upton Park station in London's East End — and I got to test out the kit. I handed over a copy of my Press card featuring my grisly mug, which was fed into the database alongside 16,000 genuine custody images of wanted criminals and those subject to court orders. It means that when I later come into view of the truck's cameras, alerts inside start going off like a pinball machine. My features registered a high score on the biometrics system, which works using facial measurements. Happily, the cops inside knew it was all for my report. Met Central Ops director Lindsey Chiswick says: 'If you're not subject to a current order or wanted for a criminal offence, then the tech will ignore you. It is taking some really dangerous people off the streets.' The LFR van at Upton Park was sitting close to a Greggs bakery — a famous target for shoplifters. One officer told me: 'Businesses love to see us turn up. Nobody will be shoplifting at Greggs here today.'
New stats released today show 1,035 wanted criminals were arrested by the Met between January 1 and June 20 thanks to this new facial recognition kit. They included more than 100 offenders who carried out serious acts of violence such as rape, stalking and domestic abuse. Some 93 of them were registered sex offenders, including David Cheneler, 73, who was flagged up by LFR in Denmark Hill, South London, in January while walking with a six-year-old girl. Cheneler, who was jailed for nine years in 2019 for offences against children, had befriended the girl's mother on his release from prison and offered to pick the youngster up from school as a 'favour'. The mum had no idea of his appalling history. Cheneler was found to have a knife when stopped by the LFR team — and checks revealed that he was in breach of a Sexual Offences Prevention Order stopping him from being alone with a child under 14. The paedophile, from Lewisham, South London, was sentenced to two years at Kingston Crown Court in May after admitting breaching the terms of his order and possessing an offensive weapon. Lindsey Chiswick says: 'The tech flagged him up. Without it, Cheneler would still be walking the streets.' A violent robber who targeted a Rolex watch, 22-year-old Adenola Akindutrie, had false ID documents and used a fake Irish accent on cops when he was flagged up by LFR in Stratford, East London, in April. He was arrested and fingerprints proved his identity. The villain is now behind bars awaiting sentence. Shoplifter Darren Dubarry, 50, was also caught on camera in May because he was wanted for theft — and found to be in possession of designer clothing he had stolen that same day. Eight arrests were made during the Upton Park deployment attended by The Sun last month, with 14 people stopped in total.
A second LFR team operating that day in Tooting, South London, made two arrests. The NeoFace system checks 28 facial measurements, including eyes, nose, mouth and head size. Each person is given a score, and the threshold for an alert is anything over 0.64. After I had my image added to the system, it clocked me instantly and I scored 1.47. If a person is of no interest, then the pixelated image caught on LFR is immediately deleted. Even though the vans themselves are inconspicuous, LFR vehicles have a sign on the back saying Live Facial Recognition in operation — and there are warning notices at either end of the recognition zones. If someone flags up on the system, an alarm in the vehicle is activated and the person is pulled aside. Sunglasses and baseball caps are no barrier to the technology. And if you are wanted for an offence and hoping fat jabs may have changed your features, then forget it. You will still be spotted. In general, around one in five of those serving protection orders is found to be in breach of their conditions and arrested on the spot. Intelligence reports on those stopped but not in breach go to offender managers to make them aware of the criminals' movements. The list of 16,000 subjects of interest in Upton Park is based on reports of their known haunts. An officer on the team says: 'Criminals are transient, but it's possible some of the 16,000 could be in the area. Some of the custody images might be five years old, but we have even matched a custody image which was taken 24 years ago. People might age, put on weight or lose it, but they will still flag up because certain facial features don't change a lot.' The LFR system was first trialled by the Met in 2016 and by South Wales Police the following year. It is expanding across the country, with eight forces having used it. Freedom of information data revealed LFR scanned 4.7 million faces in England and Wales last year — twice the number in 2023.
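Comparing a fixed set of facial measurements against a custody image can be illustrated with a toy similarity function. NeoFace's actual algorithm is proprietary and its scores (such as the reporter's 1.47) are not a simple cosine similarity; everything below, including the cosine measure and the made-up vectors, is an assumed stand-in purely to show the shape of the comparison.

```python
import math

# Toy illustration only: compare two vectors of 28 facial measurements,
# as described in the article, using cosine similarity as an assumed
# stand-in for the proprietary NeoFace scoring.

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Similarity in [-1, 1]; 1.0 means the measurement vectors align exactly."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

probe = [0.5] * 28      # hypothetical measurements from a live camera frame
candidate = [0.5] * 28  # hypothetical custody-image measurements on a watchlist

print(round(cosine_similarity(probe, candidate), 2))  # identical vectors: 1.0
```

Because ratios of facial measurements stay relatively stable, a comparison like this can still match an old custody image, which is consistent with the officer's claim of matching one taken 24 years ago.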
False alerts are rare — around one in 33,000 scans — and officers provide a safeguard by making the ultimate decision on whether to stop someone. Civil liberties groups have raised fears of a Big Brother era being ushered in without legal checks and balances. However, 83 per cent of the public backs the system.
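Combining two figures from this section gives a rough sense of scale: a back-of-envelope sketch, assuming the quoted one-in-33,000 false-alert rate applied uniformly to last year's 4.7 million scans (the article does not combine these numbers itself).

```python
# Back-of-envelope estimate from two figures quoted in this article:
# ~4.7 million LFR scans in England and Wales last year, and a
# false-alert rate of roughly one in 33,000 scans.
scans_last_year = 4_700_000
false_alert_rate = 1 / 33_000

expected_false_alerts = scans_last_year * false_alert_rate
print(round(expected_false_alerts))  # roughly 142 false alerts over the year
```

Each such alert still goes to an officer, who decides whether to approach the person, so a false alert does not automatically mean a wrongful stop.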