
West Midlands Police hail knife crime fall as new law comes in
West Midlands Police said it recorded 380 serious youth violence knife offences between April 2023 and the end of March 2024, falling to 309 in the period from April 2024 to the end of March 2025, a drop of 18%.
'Huge amount of work' needed
Separate figures released by the Home Office, following targeted police action in seven high-risk areas, showed the West Midlands saw a 25% fall in knife-enabled robbery in the nine months from October 2024. The overall reduction across all seven areas was 6%.
WMP Chief Constable Craig Guildford said the force had seen "impressive reductions in knife crime", adding: "While it's certainly pleasing, there's still a huge amount of work for policing and wider society to do to tackle this problem.

"It's part of our ongoing commitment to make the West Midlands a safer place for everyone."

The force released the figures as a ban on ninja swords came into force under Ronan's Law, named after Ronan Kanda, who was killed in Wolverhampton on 29 June 2022 in a case of mistaken identity.

In July 2024, the West Midlands police area recorded the highest rate of knife crime offences in England and Wales, but the force said a new policy had seen increased arrests.

By May this year, reported knife crime had fallen by 6%, with the region second highest after London.

Related Articles


The Independent
Police conclude investigation into death of Gogglebox's George Gilbey
Police have concluded their investigation into the death of Gogglebox's George Gilbey, with two men who were arrested facing no further police action.

The 40-year-old had been 'working on a roof when he fell through a plastic skylight, landing on the ground below' in Shoeburyness in Essex last year, a previous inquest hearing was told.

Mr Gilbey, of Clacton-on-Sea, suffered traumatic injuries to his head and torso, and died on March 27 last year.

Essex Police said on Wednesday that two men, who were arrested on suspicion of gross negligence manslaughter, will face no further action from the force.

In a statement the force said: 'Essex Police have confirmed their investigation into the death of George Gilbey last year has concluded.

'Two men who were arrested and released under investigation in connection with the incident in Shoebury on March 27, 2024 will face no further action from Essex Police.

'The thoughts of everyone at Essex Police remain with George's family and friends.

'The criminal investigation remains open and will be led by the Health and Safety Executive.'

HSE inspector Natalie Prince said: 'We have been a part of this inquiry from the outset and we will continue to thoroughly investigate George's tragic death as the lead agency.

'This will aim to establish if there have been any breaches of health and safety law.

'We are in regular contact with George's family and our thoughts remain with them at this time.'

Mr Gilbey was best known for appearing on the Channel 4 series Gogglebox, where participants watch and comment on TV shows from the previous week.

The reality star also appeared on the 14th series of Celebrity Big Brother in 2014, reaching the final.


The Sun
Hero mum killed by falling branch 'pushed daughter, 5, out of the way' before tragic death on walk with husband and kids
A MOTHER who was killed in front of her two young children when a tree branch fell on her has been hailed as a hero.

Named locally as Madia Kauser, the mum is said to have pushed her five-year-old daughter out of the way of the falling branch.

She was sadly killed while taking her daughter and son for an evening stroll near Witton Country Park in Blackburn, Lancashire, on Monday.

The little girl was reportedly in a pushchair that the brave mother managed to shove out of the way before being crushed by the falling branch. The woman's young daughter thankfully survived unharmed because of her mum's noble actions.

Former mayor and local councillor Zamir Khan MBE, an uncle in the family, told the Daily Mail that her body was still with the coroner.

He added: "Her little girl told me her mother pushed her out of the way as the branch fell.

"The older boy was walking with his father in front and could not believe what happened.

"It is very hard for the children. I do not think they will ever walk in that park again."

The woman's husband was said to be just a few feet away with the couple's son when the tragedy unfolded. He rushed to offer aid to his wife but nothing could be done to save her, and she was sadly pronounced dead at the scene after paramedics battled to save her.

The remainder of the tree has since been removed by Blackburn with Darwen council, and the local authority has also trimmed back nearby tree branches.

Tributes have begun to pour in for the hero mother who "would do anything for her children". Social media tributes have described the heroic mum as a "lovely person", with the local Muslim community coming together to remember her.

A gathering was held at Blackburn's Madina mosque on Wednesday where condolences were offered for the tragic mum and her family. Family members from across the country travelled to attend the gathering.

A local social media group hosted a message for the mum, reading: "Please keep this mother and her young children in your prayers.

"May Allah give them Sabrun Jameel. Condolences can be paid to the family at Madina Masjid on Oak St from 11am."

The local authority said: "Blackburn with Darwen borough council is deeply saddened to confirm that a member of the public has tragically died following an incident in Witton Park, when a large tree branch fell."


The Sun
3 minutes ago
- The Sun
My son, 16, killed himself over terrifyingly realistic deepfake... as sick 'nudifying' apps sweep YOUR child's classroom
LOOKING out of the window of her school bus, a 13-year-old girl is distracted by the laughs of the two boys sitting in front of her as they flick through pictures on a phone.

She peers over the backs of the lads' seats - only to recoil with horror at the source of their amusement: a photo of her walking into their school canteen, completely naked.

The image is, quite literally, the stuff of nightmares. Yet while it looks no less real than any other photograph, it is actually a deepfake image created by a 'nudifying' app, which has stripped the girl's school uniform from her body.

A Sun investigation can reveal this disturbingly realistic artificial intelligence (AI)-based technology - which is used by millions and shockingly accessible on major social media sites - is sweeping British schools, putting children at risk of bullying, blackmail, and even suicide.

While innocent youngsters are being 'nudified', or having their faces realistically planted onto naked bodies, teachers are being digitally 'stripped' by their students as crude 'banter'.

Outside of the school playground, there are further disturbing problems, with data from the UK charity Internet Watch Foundation (IWF) revealing that reports of AI-generated child sexual abuse imagery have quadrupled in a year.

In the encrypted depths of the internet, perverts are sharing sick AI 'paedophile manuals', detailing how to use artificial intelligence tools to create child sexual abuse material (CSAM).

Some sickos are even creating deepfake nudes of schoolchildren to coerce them into forking over large sums of money - and for some victims, the consequences can be deadly.

This February, 16-year-old schoolboy Elijah Heacock took his own life in the US after being blackmailed for more than £2,000 over an AI-generated naked picture of himself.

'I remember seeing the picture, and I thought, 'What is that? That's weird. That's like a picture of my child, but that's not my child',' grieving mum Shannon tells The Sun.

The photo of Elijah - a music lover who, at 14, had started volunteering to feed the homeless in Kentucky with his twin sister, Palin - was discovered on his phone after his death.

It was terrifyingly realistic - yet Shannon immediately saw signs it was fake.

'The District Attorney was like, 'No, it's a real photo',' continues the mum, who had never before heard of 'sextortion' - where criminals blackmail a victim over sexual material.

'And we were like, 'No, it's not.' The photo almost looked like somebody sitting in a cloud.'

She adds: 'He had abs. Eli did not have abs - bless his heart, he thought he did.'

Millions using sick apps

As parents like Shannon have learned at a tragic cost, deepfakes are often so realistic-looking that experts can't tell they are AI-generated.

On popular messaging apps, kids who wouldn't dream of law-breaking on Britain's streets are sharing fake nudes of their teen crushes - unaware it's illegal.

British law prohibits the creation, or sharing, of indecent images of children, even if they are artificially made. Teens who do so for a 'laugh' face up to 10 years behind bars.

In spite of this, 'nudifying' apps and websites that are being accessed by millions of people every month are being advertised on social media, listed by Google, and discussed avidly on the dark web.

Our investigation found one website encouraging users to 'undress' celebrities - clearly, without their consent - with its 'gem'-based prices starting at £14 per bundle.

'Our tool can undress anyone in seconds,' boasts the site.
Another, offering one free 'picture undress' per day, tells users they can strip a photo of 'a desired person'. And a third brags: 'Let our advanced AI bring your fantasies to life.'

Reviews on such sites paint - if possible - an even more horrific image.

'I can create secret images of the woman I like,' wrote one user, in his 40s, of another site. 'The sense of guilt is irresistible.'

Schools in crisis

For schools, the rise of nudifying apps has provided a near-existential challenge.

Experts warn senior staff are desperate to solve issues internally to avoid reputational damage, while teachers face career-threatening problems when fake photos of them are shared.

'The challenges of technology that nudifies photos or creates deepfake nude images is a problem most secondary schools and colleges around the country are now grappling with,' says top UK criminal defence lawyer Marcus Johnstone, who specialises in sex crime. 'I'll bet it's a live issue in every classroom.'

Marcus, managing director of Cheshire-based PCD Solicitors, adds that he is seeing "an ever-increasing number of children being accused of crimes because of this 'nudify' technology".

'The schools don't want this information coming out,' he claims. 'The last thing they want is to have their school in the local press having a problem with lads at the school 'nudifying' girls and it's going around the school, around the internet.

'Parents of prospective children going there would go crackers.

'They'd say, 'Well, I'm not sending my kid there'.'

Safeguarding expert Kate Flounders tells us: 'The impact [of deepfake nudes] is enormous. For staff, it can be career-ending, even if the image is found to have been AI-generated.'

Calls for crackdown

In April, Dame Rachel de Souza, the Children's Commissioner for England, called on the UK Government to ban apps that use AI to produce sexually explicit images of children.

Her comments came as the IWF's analysts, who have special permission to hunt down and remove repulsive CSAM online, confirmed 245 reports in 2024 - a staggering 380 per cent increase on 2023.

Schoolchildren now fear that 'anyone' could 'use a smartphone as a way of manipulating them by creating a naked image using these bespoke apps', said Dame Rachel.

British teens who have fallen victim to such technology have been calling the NSPCC's Childline counselling service, with one girl revealing she has been left with severe anxiety.

The 14-year-old said boys at her school had created 'fake pornography' of her and other girls. They'd then sent the explicit content to 'loads' of group chats.

'They were excluded for a bit, and we had a big assembly about why it was wrong, but after that the school told us to forget what happened,' the traumatised girl told Childline. 'I can't forget, though.

'People think that they saw me naked, and I have to see these boys every day.'

Nearly six months on from Elijah's death, Shannon, a cheerleading coach, is struggling to deal with the loss of her beloved son, who is believed to have been targeted in a 'sextortion' scam by a man in Nigeria.

'We're not doing very well right now,' she admits, adding jokingly: 'Elijah was an amazing brother who drove everyone insane.'

She continues: 'He was our tornado. Our house is so quiet and it's sad.'

Shannon is calling for parents to chat to their kids about AI technology, with many mums and dads clueless about the explicit apps infiltrating their children's classrooms.

'Talk to your kids, and read about it,' she urges.
'Our children are in a war that we're not invited to.'

Sinister creeps

Kate, CEO of the Safeguarding Association, has encountered UK-based cases where the images of schoolchildren and teachers were altered using 'freely available' apps.

'The issue is, once the image is out there, it is nigh on impossible to get it off,' she warns.

'I am aware of one case where a female student was subject to this, managed to have the image removed, only for it to resurface several years later when she was in college.

'The trauma was enormous for her.'

Of course, some children, and adults, create such content for more sinister reasons.

'Nudifying' services - many with brazen terms like 'porn', 'undress', 'X' and 'AI' in their names - have been promoted in thousands of adverts on leading social media platforms.

'These apps are far too easy to access and exploit,' says Rani Govender, Policy Manager for Child Online Safety at the NSPCC, which also wants such apps to be banned.

In June, Meta - the tech giant behind Facebook, Instagram and WhatsApp - announced it was suing the maker of CrushAI, an app that can create sexually explicit deepfakes.

How sick 'nudifying' apps work

THE technology behind 'nudifying' apps - used by children across Britain - is trained on 'massive datasets of real explicit imagery', explains AI consulting expert Wyatt Mayham.

'These 'nudifying' apps primarily use generative AI models like GANs (Generative Adversarial Networks) or newer, more sophisticated diffusion models,' he tells The Sun.

'The AI learns the patterns and textures of the human body, allowing it to 'inpaint' (fill in) or 'outpaint' (extend) a provided image, effectively stripping the clothing from a photo of a fully-clothed person and generating a realistic nude depiction.'

Referring to the rise in AI-generated CSAM among UK schoolchildren, Wyatt, CEO of Northwest AI Consulting, adds: 'The danger goes far beyond a 'prank'.

'This is a new form of scalable, psychological abuse.

'For perpetrators, it's a low-risk, high-impact weapon for bullying, revenge, and control.

'More sinisterly, it's a powerful tool for 'sextortion'.

'A perpetrator can generate a realistic fake nude of a victim and then use it as leverage to extort money or, more commonly, to coerce the victim into providing real explicit images.'

Jurgita Lapienytė, Editor-in-Chief at Cybernews, warns that AI tools are 'advancing quickly'.

She tells us: 'Most of the apps are hard to stop because they use anonymous hosting and payments, often outside the UK.

'Social media giants and tech companies are not moving fast enough to block or report these tools, and current content monitoring often fails to catch them before damage is done.'

Meta alleged the firm had attempted to 'circumvent Meta's ad review process and continue placing' adverts for CrushAI after they were repeatedly removed for violating its rules.

The giant added it was taking further steps to 'clamp down' on 'nudifying' apps, including creating new detection technology and sharing information with other tech firms.

But experts warn that adverts for such apps - often hosted anonymously and offshore - will only continue to pop up on a plethora of social networks as technology outpaces the law.

Many of these adverts disguise their offerings as 'harmless' photo editor apps. Others, however, are more forthcoming.

Our investigation found a sponsored advert, launched on Meta's platforms a day earlier, for an AI-based photo app that boasted: 'Undress reality... AI has never been this naughty.'
Meta has since removed the ad.

And on Google, we were able to access 'nudifying' tools at the click of a button. One search alone, made from a UK address, brought up two of these tools on the first page of results.

Former FBI cyberspy-hunter Eric O'Neill tells us: 'AI-generated explicit content is widely traded on the dark web, but the real threat has moved into the light.

'These 'nudify' apps are being advertised on mainstream platforms - right where kids are.

'Today's teens don't need to navigate the dark web.

'With a few taps on their phone, they can generate and share explicit deepfakes instantly.'

The process - which can destroy victims' lives - takes 'seconds', says Eric, now a cybersecurity expert and author of the upcoming book, Spies, Lies, and Cybercrime.

He continues: 'A single photo - say, from a school yearbook or social post - can be fed into one of dozens of freely available apps to produce a hyper-realistic explicit image.'

Legal loophole

Although most 'nudifying' tools contain disclaimers or warnings prohibiting their misuse, experts say these do little to prevent users from acting maliciously.

Deepfake nudes shared online among teens are at risk of being sold on the dark web - where predators prowl chat forums for 'AI lovelies' and 'child girlies'.

A report published last year by UK-based Anglia Ruskin University's International Policing and Public Protection Research Institute (IPPPRI) reveals the horrors of such forums.

One user chillingly wrote: 'My aim is to create a schoolgirl set where she slowly strips.'

Another dreamed of a 'paedo version of Sims combined with real AI conversational and interactive capabilities', while others called the vile creators of AI-generated CSAM 'artists'.

Shockingly, perverts can now even digitally 'stitch' children's faces onto existing video content - including real footage of youngsters being previously sexually abused.

In many cases, girls are the target.

'At the NSPCC, we know that girls are disproportionately targeted, reflecting a wider culture of misogyny - on and offline - that must urgently be tackled,' says Rani.

'Young girls are reaching out to Childline in distress after seeing AI-generated sexual abuse images created in their likeness, and the emotional impact can be devastating.'

'Digital assault'

Earlier this year, the UK Government announced plans to criminalise the creation - not just the sharing - of sexually explicit deepfakes, which experts have praised as a 'critical step'.

The change in law will apply to images of adults, with child imagery already covered.

The Government will also create new offences for taking intimate images without consent, and for the installation of equipment with the intent to commit these offences.

How to get help

EVERY 90 minutes in the UK a life is lost to suicide.

It doesn't discriminate, touching the lives of people in every corner of society, from the homeless and unemployed to builders and doctors, reality stars and footballers.

It's the biggest killer of people under the age of 35, more deadly than cancer and car crashes. And men are three times more likely to take their own life than women.

Yet it's rarely spoken of, a taboo that threatens to continue its deadly rampage unless we all stop and take notice, now.

If you, or anyone you know, needs help dealing with mental health problems, the following organisations provide support:
A Google spokesperson told The Sun: 'While search engines provide access to the open web, we've launched and continue to develop ranking protections that limit the visibility of harmful, non-consensual explicit content.

'Specifically, these systems restrict the reach of abhorrent material like CSAM and content that exploits minors, and are effective against synthetic CSAM imagery as well.'

Meta said it has strict rules against content depicting nudity or sexual activity - even if AI-generated - with users able to report violations of their privacy in imagery or videos. It also does not allow the promotion of 'nudify' services.