Latest news with #DenmarkHill


BBC News
22-07-2025
- Health
- BBC News
The 'bionic bishop' who broke every bone in his face
"I think it made me look very differently at life... at what really matters... the importance of one's health and wellbeing, not taking it for granted... and to value each moment."

I'm sitting in Southwark Cathedral with its bishop, the Right Reverend Christopher Chessun, as he reflects on the night that changed his life. Last September, the taxi he was travelling in came to an abrupt halt - he didn't. He smashed his face against the taxi, breaking every bone in his face apart from his lower jaw.

"I had no face after the accident," he tells me. "The injuries were extensive, I wasn't aware of them at the time, I was just aware of my face crumbling."

'Cornflake face'

The Rt Rev Chessun was rushed to St Thomas' Hospital, then to a specialist maxillofacial trauma team at King's College Hospital in Denmark Hill, south-east London. Leading the team that night was Professor Kathy Fan. The bishop remembers their first conversation.

"She said 'Your face is cornflakes' but she said my airbag had protected my skull, my brain and my neck. She said 'It's my job to put it back together'... and that's what she set out to do."

Professor Fan told him she wanted to make the repairs in one long operation - it would be complicated but she had one aim in mind.

"She told me: 'When I do it, I don't want people staring at you and thinking what's different? I want them to listen to what you're saying because you're a bishop'.

"So again there was that massive sense of trust, confidence, the skill, expertise, the wisdom, the experience... all those things came into play and it took away that sense of trauma for me."

'Face is identity'

Professor Fan told the BBC both eye sockets were damaged, his entire left cheekbone had dropped and his "entire upper jaw was hanging loose". Most of the bones in his face were broken. She was keen to get the bishop back working again before the busy Christmas period and praised the efforts of the whole team.

She said: "I have a pretty amazing job.
I always think it's a real privilege to work on people's faces: people trust us.

"We have the ability to try and put people back when they've been unfortunate enough to be injured.

"Our face is our identity - people look at us and make judgments about us so it's important to recreate someone's identity."

There were moments when Bishop Chessun wondered if he'd be able to return to work, but three months later, just before Christmas, he was back at Southwark Cathedral. His face is held together with numerous pins and plates - I ask if he considers himself to be the 'bionic bishop'.

"Something like that, I think I am," he laughs, and recalls a recent trip to the Holy Land when he feared his face might set off the airport scanners. It didn't.

Nine months on, he is nearly fully recovered and cannot praise enough the medical teams who gave him back his face.

"People look at your face - this is how they make contact with you - so your facial identity is a crucial part of things.

"I think that sense of being supported by the prayers of those in my diocese, those who knew me, those who cared for me, made an enormous difference.

"I think not just to morale but to confidence and sense of wellbeing. I had an underlying feeling that all would be well."


BBC News
04-07-2025
- BBC News
Facial recognition cameras help make 1,000 arrests, Met says
Live facial recognition technology (LFR) is helping the Met stay ahead of criminals at a time "where money is tight," according to the force's director of intelligence.

Lindsey Chiswick, the lead for LFR at the Met and nationally, said more than 1,000 wanted criminals had been arrested since January 2024 using the tool, including paedophiles, rapists and violent offenders.

She said it "would be madness" if officers did not keep pace with available technology in order to protect the public.

Privacy campaigners say there's been an "alarming escalation" in police use of LFR, which maps a person's unique facial features and matches them against faces on watch lists.

Since the start of 2024, a total of 1,035 arrests have been made using live facial recognition, including 93 registered sex offenders. Of those, 773 have been charged or cautioned.

The tool is also being used to check up on people who have court conditions imposed, including sex offenders and stalkers.

Those arrested include 73-year-old David Cheneler, a registered sex offender, who was picked up on LFR cameras in January in Denmark Hill, south-east London, with a six-year-old girl.

"Her mother had no idea about his offending history," says Ms Chiswick. "Without LFR that day, officers probably wouldn't have seen him with that child, or thought anything was amiss."

Cheneler was jailed for two years for breaching his Sexual Harm Prevention Order, which banned him from being alone with young children, and for possessing an offensive weapon.
But some have expressed concerns over the Met's increasing use of LFR, and plans for a pilot scheme in Croydon, south London, where fixed cameras will be mounted on street furniture from September, instead of being operated by a team in a mobile van.

The Met says the cameras will only be switched on when officers are using LFR in the area.

Green Party London Assembly member Zoe Garbett previously described the pilot as "subjecting us to surveillance without our knowledge".

Interim director of Big Brother Watch, Rebecca Vincent, said it represented "an alarming escalation" and that the technology is "more akin to authoritarian countries such as China".

Ms Chiswick said while she understood the concerns, she believed the Met was taking "really small, careful steps".

She added: "I think criminals are exploiting that technology and I think it would be madness if we didn't keep pace with that and allow criminals to have access to tools which police didn't.

"The world is moving fast, we need to keep up."

We joined police on a recent LFR deployment in Walthamstow, where a mobile van was parked up in an area between the Tube station and the market on the high street, a hot spot for theft.

Officers told me the bespoke watch list, created for each deployment, had been compiled around 5 o'clock that morning, contained 16,000 names of wanted offenders, and would be deleted at the end of the day.

But before I even reached the van, or spotted the sign warning that live facial recognition cameras were being used, officers had already spotted me.

My face had been scanned and flagged as a potential match to my photo, which police had earlier added to their system so we could demonstrate how it worked.

The officers' handsets bleeped as the two images blinked up on their screens.

In this case, it was a 0.7 match. Anything less than 0.64, they said, is considered unreliable and the captured image is deleted.

"I've got no wish to put technology on the streets of London that is inaccurate or biased," Ms Chiswick told me.
She said that the threshold had been selected after tests by the National Physical Laboratory.

"All algorithms have some level of bias in them. The key thing for police to understand is how to operate it in order to ensure there is no bias at that level."

The two images of my face were also flagged to the team inside the mobile van. They showed me their monitors, where the cameras were scanning all the faces in the crowd before quickly pixelating them.

If it is not a match, officers told me, their biometric data is immediately deleted. When someone on the list is identified by the system, police officers then take over.

"Live facial recognition is meant to work alongside your natural policing skills," explains Supt Sarah Jackson, "so you should be able to approach those people that have been activated by the cameras, go and talk to them and ascertain if the cameras are correct".

But what happens when the cameras get it wrong? How do those people react?

"That does happen on very few occasions," Supt Jackson acknowledged.

"There's always the possibility to upset people in any walks of life if they're stopped by police. But by and large, people are happy."

Ms Chiswick said since January this year, there had been 457 arrests and seven false alerts. During the Walthamstow deployment, police told me eight arrests were made, including for drug offences, stalking and driving offences, and that there were no false alerts.

It's not just campaigners who are concerned over potential misidentification. Local resident Christina Adejumo approached me to ask why she'd just seen a man being handcuffed in the middle of the street.

When I explained he'd been picked up by the live facial recognition cameras, she told me she thought they were a good idea, but questioned their accuracy.

"It can be said, 'sorry, it's not you', but the embarrassment of what happened that day cannot be taken away from him."

Ann Marie Campbell said: "I think it's a good idea because of public safety."
She also hoped it would help tackle pickpocketing.

"This is very good to prevent crimes," Ansar Qureshi agreed. Is he worried about privacy? "I don't mind, because I don't have anything to hide," he told me.

Caroline Lynch said she was "disgusted" by the technology. "I don't feel safer, no. It's just more and more 'Big Brother'."

She insisted she'd rather the money was spent on safety measures, including putting more police on the streets.

"I can get on to the Tube at 12 o'clock at night and there's absolutely no-one there to protect us."

Earlier this year, the Met Commissioner Sir Mark Rowley warned that the force is "a shrinking organisation" and faces losing around 1,700 officers, PCSOs and staff by the end of the year without more money from government.

"We're in an environment where money is tight," Ms Chiswick said. "We're having to make some difficult choices, and we know that technology and data can help us be more precise."

But Rebecca Vincent, interim director of Big Brother Watch, said there was a lack of parliamentary oversight and scrutiny of LFR.

"It's a massive privacy violation for people going about their daily life.

"There is no primary legislation governing this invasive technology. It means that police are being left to write their own rules."

She said it was unclear whether in future officers might be given access to other data, such as driving licences or passports.

Ms Chiswick said human rights and data protection laws, as well as guidance from the College of Policing, helped police to understand how the technology should be used.

She said the watchlist was always intelligence-led, and only included those who were wanted by police or the courts, or who were subject to court-imposed conditions.

She told me that she believed there was "provision" for LFR to be used to search for vulnerable missing children, which might be considered less invasive to their privacy than making a public appeal, but that this had not yet been done.
Emily Keaney, deputy commissioner of regulatory policy at the Information Commissioner's Office (ICO), said these cameras can help to prevent and detect crime, but added that "its use must be necessary, proportionate and meet expectations of fairness and accuracy".

She said the ICO was working with the Met to review the safeguards in place and would be "closely monitoring" its use.

She added: "LFR is an evolving technology and, as our 2025 AI and biometrics strategy sets out, this is a strategic priority for the ICO.

"In 2019, we published guidance for police forces to support responsible governance and use of facial recognition technology.

"We'll continue to advise government on any proposed changes to the law, ensuring any future use of facial recognition technology remains proportionate and publicly trusted."

The Home Office said facial recognition was "a crucial tool to keep the public safe that can identify offenders more quickly and accurately, with many serious criminals already brought to justice through its use".

It added: "All police forces using this technology are required to comply with existing legislation on how and where it is used.

"We will set out our plans for the future use of facial recognition technology in the coming months, including the legal framework and safeguards which will ensure it is properly used."
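The match-handling rule the BBC reporters describe (a similarity score against a watchlist; anything below 0.64 is treated as unreliable and the biometric data is deleted, while a score at or above it raises an alert for a human officer to review) can be sketched in a few lines. This is an illustrative sketch only: the 0.64 threshold and the delete-or-alert behaviour come from the article, but the function name, score format and return values are hypothetical, not the Met's actual system.

```python
# Illustrative sketch of the threshold rule quoted in the article.
# Assumptions (not from the Met): similarity scores are floats in [0, 1],
# one per watchlist entry, and handle_capture is a hypothetical name.

MATCH_THRESHOLD = 0.64  # reliability cut-off quoted by officers

def handle_capture(similarity_scores):
    """Return ('alert', best_score) if any watchlist comparison clears
    the threshold, else ('deleted', None) to mirror the immediate
    deletion of non-matching biometric data."""
    best = max(similarity_scores, default=0.0)
    if best >= MATCH_THRESHOLD:
        return ("alert", best)   # a human officer reviews the match
    return ("deleted", None)     # non-match: captured image discarded

print(handle_capture([0.31, 0.7]))  # the reporter's 0.7 match -> alert
print(handle_capture([0.2, 0.5]))   # best score below 0.64 -> deleted
```

In the reporter's case the 0.7 score cleared the threshold, which is why the officers' handsets bleeped; a face scoring below 0.64 against every watchlist entry would simply be pixelated and its data dropped.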


BBC News
22-05-2025
- BBC News
Sex offender spotted in Denmark Hill facial recognition operation
A registered sex offender has been jailed for two years after he was spotted walking with a six-year-old girl during a Live Facial Recognition (LFR) police operation.

David Cheneler, 73, was found with the child in Denmark Hill, south-east London, on 10 January following an alert from LFR cameras.

Checks confirmed that Cheneler was in breach of his Sexual Offences Prevention Order (SOPO), which prohibited him from being alone with a child under 14. He was also in possession of a lock-knife.

Cheneler, of Lewisham, was sentenced at Kingston Crown Court on Tuesday, having previously pleaded guilty to breaching the conditions of his SOPO and possessing an offensive weapon.

'Mother completely unaware'

The Met Police uses LFR in pre-agreed locations in London to capture footage of people passing by and compare their faces against a database of wanted people.

When a match is detected, the system generates an alert. An officer will then review the match and decide if they wish to speak to the individual.

Det Con Adam Pearce said: "Although there were no allegations made towards David Cheneler on this occasion, it's possible if he hadn't been identified using this technology, he could have gone on to abuse this child.

"Her mother was completely unaware of his offending history, and along with her young daughter were both taken advantage of by Cheneler, who abused their trust."

The Met Police said officers established that Cheneler had picked the child up from school as a favour for her mother and had done so twice before, having built a relationship with them both over the course of a year.

Lindsey Chiswick, the Met's lead for LFR, said the technology could be used to stop people on a watch-list who have conditions they must adhere to.

"Without this technology, Cheneler may have had the opportunity to cause further harm," she said.

The SOPO was originally imposed in 2019, along with a nine-year prison sentence for 21 child abuse offences.


Daily Mail
21-05-2025
- Daily Mail
Moment Met police's facial recognition camera catches 73-year-old paedophile with six-year-old girl as he is jailed for two years
This is the chilling moment cops snared an armed paedophile walking a six-year-old girl home after identifying him with hi-tech facial recognition cameras.

Convicted sex offender David Cheneler, 73, 'could have gone on to abuse' the child after befriending her mother, who was unaware of his horrific crimes spanning six decades.

The bearded predator was caught after triggering a live facial recognition camera deployed on Denmark Hill, in south-east London. The technology films people walking past and generates an alert for police if there is a match with a watchlist of offenders.

After being matched on the system of wanted criminals, Cheneler was stopped by officers in the street on January 10. He was later found to have had a flip knife hidden in his belt buckle.

In body-worn footage released by the Metropolitan Police, the pensioner immediately confesses: 'I shouldn't be with that child.

'But I've only taken her from school to her mum's. We've made a mistake, we got the wrong bus.'

Cheneler had picked the girl up from school as a 'favour' for her mother. He had done this twice before after building a relationship with them both over the course of a year, the Met Police said.

Unbeknownst to her, Cheneler has a slew of convictions for sexual offences against children. In 2010 he was convicted of 15 counts of indecent assault on a female under 16 and five of gross indecency with a child between 1968 and 1993. He was jailed for nine years.

Further checks confirmed he was in breach of a 2019 sexual offences prevention order - issued following his prison term - which barred him from being alone with any child under 14.

Detective Constable Adam Pearce of the Met's local policing team in south-east London, who led the investigation, said: 'This is a prime example of how the Met is using technology to remove dangerous offenders from our streets, and Live Facial Recognition remains an important tool in protecting Londoners.
'Although there were no allegations made towards David Cheneler on this occasion, it is possible if he hadn't been identified using this technology, he could have gone on to abuse this child.

'Her mother was completely unaware of his offending history, and along with her young daughter, were both taken advantage of by Cheneler who abused their trust.'

Lindsey Chiswick, the Met's lead for live facial recognition (LFR), said: 'The Met is committed to making London safer, using data and technology to identify offenders that pose a risk to our communities.

'This is a prime example of the variety of uses for LFR. The tool is not only used to find those wanted, but also to stop people on a watch list who have conditions they must adhere to.

'These interventions are crucial. Without this technology, Cheneler may have had the opportunity to cause further harm.'

Cheneler, of Lewisham, admitted breaching the conditions of his sexual offences prevention order, as well as possessing an offensive weapon, at Wimbledon Magistrates' Court on January 13. He was jailed for two years at Kingston Crown Court.


The Sun
21-05-2025
- The Sun
Moment paedo arrested after being caught with six-year-old girl on street – before cops make terrifying discovery
THIS IS the moment two police officers caught a known paedophile using Live Facial Recognition (LFR) technology.

The sex offender was picked up by officers, who made a chilling discovery when they reached him.

David Cheneler, 73, was found with a six-year-old girl in the shocking footage. The incident occurred on January 10 this year, in the busy Denmark Hill area of London.

Officers can be heard telling him to 'keep his voice down' while he shouted and held an arm up to a female police officer.

The known sex offender said that he was taking the girl 'from school to her mum' and claimed that they had taken the 'wrong bus'.

One cop responded by reminding Cheneler that he was banned from being alone with any child below the age of 14, due to a Sexual Offences Prevention Order (SOPO).

Cheneler tried bargaining with the officers in the video, captured by a body cam, by offering to show them 'something you won't probably find'. In exchange, he asked the officers to 'undo him' - a reference to his handcuffs.

Following that, the registered sex offender made a chilling admission, saying: 'It's a little knife, I've got on me.'

Officers retrieved a lock knife from his belt buckle before arresting him for breaching the order. The cop who had told Cheneler to keep his voice down read him his rights before hauling him to the station.

The LFR technology had been fitted on a police van in Denmark Hill and was scanning for known offenders in the area.

Further enquiries by cops uncovered that Cheneler had picked up the girl as a favour to her mother - something he had done twice before, after building up a relationship with them over a year. However, the unnamed girl's mother was unaware of the sex offender's SOPO.

Cheneler was jailed for two years on May 20, after pleading guilty to possessing an offensive weapon and breaking the terms of his SOPO.
The Sexual Offences Prevention Order was imposed in 2019, after he was convicted of 15 counts of indecent assault on a girl under 16. He had also been convicted of five counts of gross indecency with a child between 1968 and 1993, which led to a nine-year prison term.

After Cheneler's arrest in 2025, the Met's lead for Live Facial Recognition said that the operation wouldn't have been possible without facial recognition.

Lindsey Chiswick said: 'The Met is committed to making London safer, using data and technology to identify offenders that pose a risk to our communities.

'This is a prime example of the variety of uses for LFR. The tool is not only used to find those wanted, but also to stop people on a watch list who have conditions they must adhere to.

'These interventions are crucial. Without this technology, Cheneler may have had the opportunity to cause further harm.'

Detective Constable Adam Pearce, who led the investigation, added: 'This is a prime example of how the Met is using technology to remove dangerous offenders from our streets, and Live Facial Recognition remains an important tool in protecting Londoners.

'Although there were no allegations made towards David Cheneler on this occasion, it's possible if he hadn't been identified using this technology, he could have gone on to abuse this child.

'Her mother was completely unaware of his offending history, and along with her young daughter, were both taken advantage of by Cheneler who abused their trust.'