Pledge to halt closure of Peterborough skills hub 'ambiguous'

BBC News · 09-05-2025
A campaigner described a council promise to scrap plans to close a skills hub for vulnerable adults as a "stay of execution" and said it had done little to provide long-term reassurance.

Hayley Janceski, whose brother Richard is one of the 29 people supported by the Industrial Hub in Peterborough, said a pledge to "halt the proposal" was "ambiguous".

Peterborough City Council had said it could withdraw funding from the hub, which gives training and work experience to people with autism and learning disabilities.

It has since said it will work with service users and their families to "look at how a new service could be developed" to provide "greater opportunities and a financial saving".
The authority said it faced a budget gap of more than £20m in 2025/26, and closing the hub, which is part of the City College's Day Opportunities programme, could save £500,000. Hundreds signed a petition opposing the move.

Ms Janceski criticised a letter to service users which said the hub "will not close for now".
"When you're telling an autistic person or a person with learning disabilities that something won't happen for now, it's ambiguous," she said."If this letter is supposed to provide reassurance, it does the opposite. If anything, it's more distressing. "It's not closing for now, but when?"She also criticised the suggestion in the council's letter that users should be supported to "be more independent, stay healthy and happy [and] find good jobs", calling it a "real insult"."These are the best jobs they've ever had," she said. "My brother's been there since 2018. He's never been more happy."
Some users needed carers or supported living, she added, making independence or certain kinds of employment an "unachievable goal".

The hub, based in Hampton, provides training and work experience in woodworking, garden maintenance, recycling, painting and crafting.

Shabina Qayyum, a Labour councillor who is the cabinet member for public health and adult social care, asked council officers to stop pursuing the closure of the hub and instead look at how to transform the service.

"I can safely say that the decision to close the hub has been taken off the table for now and a period of thoughtful interaction with all involved will now take place," she said.
"I realise the distress this has caused those who have used the hub and their families."She added that the council's commitment to encouraging "independence" and "meaningful employment" for people with learning disabilities was "driving the change we want to make to this service", as well as the need to save money.Ms Janceski said Qayyum "has been fighting for the Industrial Hub"."We have a really good councillor there on our side, but I think it's just a stay of execution," she said.Peterborough City Council said it will still be spending £5m to provide day opportunities for people with learning disabilities, including £1.7m with City College.
Follow Peterborough news on BBC Sounds, Facebook, Instagram and X.

Related Articles

Government must stop children using VPNs to dodge age checks on porn sites, commissioner demands

The Independent · 25 minutes ago

England's children's commissioner has demanded that the government stop children from using virtual private networks (VPNs) to get around age verification on porn sites.

Calling for change, Dame Rachel de Souza warned it is "absolutely a loophole that needs closing" as she released a new report, which found the proportion of children saying they have seen pornography online has risen in the past two years, with most likely to have stumbled upon it accidentally.

VPNs are tools that route internet users' connections through remote servers, enabling them to hide their real IP address and location and to appear to be browsing from another country. This means the Online Safety Act, which now forces platforms to check users' ages when they attempt to access some adult content, can be dodged.

After sites such as PornHub, Reddit and X introduced age verification requirements last month, VPNs became the most downloaded apps, according to the BBC. A government spokesperson told the broadcaster that there are no plans to ban VPNs as they are legal tools for adults.

Dame Rachel told Newsnight: "Of course, we need age verification on VPNs – it's absolutely a loophole that needs closing and that's one of my major recommendations." She called on ministers to look at requiring VPNs "to implement highly effective age assurances to stop underage users from accessing pornography".

More than half (58 per cent) of respondents to the commissioner's survey said that, as children, they had seen pornography involving strangulation, while 44 per cent reported seeing a depiction of rape – specifically someone who was asleep.

Made up of responses from 1,020 people aged between 16 and 21, the report also found that while children were on average aged 13 when they first saw pornography, more than a quarter (27 per cent) said they were 11, and some reported being aged "six or younger".

The research suggested four in 10 respondents felt girls can be "persuaded" to have sex even if they say no at first, and that young people who had watched pornography were more likely to think this way.

The report, a follow-on from research by the Children's Commissioner's office in 2023, found a higher proportion (70 per cent) of people saying they had seen online pornography before turning 18, up from 64 per cent of respondents two years ago. Boys (73 per cent) were more likely than girls (65 per cent) to report seeing online pornography.

A majority (59 per cent) of children and young people said they had seen pornography online by accident – a rise from 38 per cent in 2023.

Dame Rachel said her research is evidence that harmful content is being presented to children through dangerous algorithms, rather than them seeking it out. She described the content young people are seeing as "violent, extreme and degrading" and often illegal, and said her office's findings must be seen as a "snapshot of what rock bottom looks like".

Dame Rachel said: "This report must act as a line in the sand. The findings set out the extent to which the technology industry will need to change for their platforms to ever keep children safe.

"Take, for example, the vast number of children seeing pornography by accident. This tells us how much of the problem is about the design of platforms, algorithms and recommendation systems that put harmful content in front of children who never sought it out."
The research was done in May, ahead of new online safety measures coming into effect last month, including age checks to prevent children accessing pornography and other harmful content.

A Department for Science, Innovation and Technology spokesperson told the BBC that "children have been left to grow up in a lawless online world for too long" and "the Online Safety Act is changing that". However, responding to Dame Rachel's remarks on VPNs, they added that there are no plans to ban them, "but if platforms deliberately push workarounds like VPNs to children, they face tough enforcement and heavy fines".

Londonderry: US flag saved from bonfire returned to school

BBC News · 26 minutes ago

A historical US flag stolen from the grounds of a school built on the site of a former American naval base has been returned after it was recovered from a bonfire in Londonderry.

The flag was taken from Foyle College on the city's Limavady Road in early August.

On Monday, independent Derry City and Strabane District councillor Gary Donnelly said he believed the flag had been removed from the bonfire after efforts to have it returned.

Foyle College confirmed on Tuesday that the flag had now been handed back to the school. The school thanked those involved in securing the safe return of the flag.

"We hope its safe return will play a part in improving mutual understanding across our shared society and assist efforts to build a more peaceful future," a statement said.

The school said that, "given the sensitivity surrounding this process", it would be making no further comment.

The flag was gifted to the school by members of the former US naval communications base. It was last officially flown at the base in November 1963 to mark President Kennedy's death and, more than half a century later, in 2019 it was presented to Foyle College, which had moved to the site the year before.

Police have said they are investigating the placing of materials, including flags and wreaths, on the bonfires, which were lit in the Creggan and Bogside areas of Derry on Friday night, as sectarian hate crimes and sectarian hate incidents.

On Monday it was reported that a last-ditch attempt to save a flag stolen from Londonderry's Protestant cathedral from being burned on a bonfire in the city had failed.

Labour ‘playing gesture politics' with online safety, says Molly Russell's father

Telegraph · 39 minutes ago

Labour is 'playing gesture politics' over protecting children from online harms, the father of Molly Russell has said.

In an exclusive article for The Telegraph, Ian Russell warned that eight years after his 14-year-old daughter's death, social media platforms were still bombarding children with the same kind of suicide and self-harm content that led her to take her life.

He claimed he had been met with 'radio silence' from Sir Keir Starmer and Peter Kyle, the Technology Secretary, over the six months since they personally assured him they would look again at toughening up the Online Safety Act.

'Sticking plaster ideas'

And he accused the Government of being 'more interested in playing performative gesture politics with sticking plaster ideas' – such as two-hour caps on children's app use that would do little to tackle their exposure to online harms.

Mr Russell is demanding the Government strengthen the Act by replacing Ofcom's 'timid' codes of practice with clear outcomes and targets for the social media companies to wipe their sites clean of harmful content such as suicide material.

Research published on Tuesday by the Molly Rose Foundation, the charity set up in his daughter's memory, found TikTok and Instagram were still deluging teenagers with 'industrial levels' of dangerous suicide and self-harm content. The study claimed more than 90 per cent of the videos recommended to potentially vulnerable teenagers were promoting or glorifying suicide or self-harm.

Molly took her own life after being bombarded with 16,000 'destructive' posts – including 2,100 on Instagram – encouraging self-harm, anxiety and even suicide in her final six months. The coroner at her inquest concluded she died from an act of self-harm while suffering from depression and 'the negative effects of online content' which had 'more than minimally contributed' to her death.

Mr Russell, who chairs the foundation, said: 'It is staggering that eight years after Molly's death, incredibly harmful suicide, self-harm and depression content like she saw is still pervasive across social media.

'Ofcom's recent child safety codes do not match the sheer scale of harm being suggested to vulnerable users, and ultimately do little to prevent more deaths like Molly's.

'For over a year, this entirely preventable harm has been happening on the Prime Minister's watch and where Ofcom have been timid, it is time for him to be strong and bring forward strengthened, life-saving legislation without delay.'

The researchers used accounts opened with a registered age and identity of a 15-year-old girl who had previously engaged with suicide, self-harm and depression material. The study was conducted in the weeks leading up to the implementation of the Online Safety Act, which requires companies to prevent and remove such content. However, the research suggested the platforms' algorithms were still driving content deemed harmful towards teenagers.

Videos were classified as harmful if they either promoted or glorified suicide or self-harm, referred to suicide or self-harm ideation, or otherwise featured highly intense themes of hopelessness, misery and despair.

Almost all (96 per cent) of the algorithmically recommended videos watched on TikTok's For You Page contained content that was likely to be harmful, particularly when viewed cumulatively or in large amounts. Some 97 per cent of Instagram short-form videos (known as Reels) contained themes likely to be harmful, particularly when being recommended and consumed in large amounts.
'The findings suggest safeguards were still not in place on either TikTok and Instagram, and that in the immediate period before regulation took effect, children could still be exposed to a substantial risk of reasonably foreseeable but preventable harm,' said the charity's report.

More than half (55 per cent) of recommended harmful posts on TikTok's For You Page included references to suicide and self-harm ideation, and 16 per cent referred to suicide methods.

The charity said the harmful content was achieving 'disturbing' levels of interest. One in 10 of the videos deemed harmful by researchers on TikTok's For You Page had been liked at least one million times. On Instagram Reels, one in five harmful recommended videos had been liked more than 250,000 times.

The Technology Secretary said: 'These figures show a brutal reality – for far too long, tech companies have stood by as the internet fed vile content to children, devastating young lives and even tearing some families to pieces.

'But companies can no longer pretend not to see. The Online Safety Act, which came into effect earlier this year, requires platforms to protect all users from illegal content and children from the most harmful content, like promoting or encouraging suicide and self-harm. Forty-five sites are already under investigation.

'Ofcom is also considering how to strengthen existing measures, including by proposing that companies use proactive technology to protect children from self-harm content and that sites go further in making algorithms safe.'

Meanwhile, a study by the Children's Commissioner found that children are more likely to view porn on Elon Musk's X than on dedicated adult sites. Dame Rachel de Souza found that children as young as six are being exposed to more porn since the Online Safety Act became law than they were before. That included illegal violent porn, such as strangulation and non-consensual sex.

Social media and networking sites accounted for 80 per cent of the main sources by which children viewed porn. Dame Rachel said this easy access was influencing children's attitudes towards women, meaning nearly half of them believed girls who said no could be persuaded to have sex.

X, formerly Twitter, remained the most common source, outstripping dedicated porn sites. The gap between the number of children seeing pornography on X and those seeing it on dedicated porn sites has widened (45 per cent versus 35 per cent in 2025, compared to 41 per cent versus 37 per cent in 2023). Snapchat accounted for 29 per cent, Instagram 23 per cent, TikTok 22 per cent, and YouTube 15 per cent.

The research, based on 1,020 young people aged 16 to 21, found 70 per cent of children had seen porn before the age of 18, an increase from 64 per cent in 2023, when the Online Safety Act received royal assent.

In her report, published on Tuesday, Dame Rachel said: 'Violent pornography is easily accessible to children, exposure is often accidental and often via the most common social media sites, and it is impacting children's behaviours and beliefs in deeply concerning ways.

'This report must be a line in the sand. It must be a snapshot of what was – not what will be.'

A TikTok spokesman said: 'Teen accounts on TikTok have 50+ features and settings designed to help them safely express themselves, discover and learn, and parents can further customise 20+ content and privacy settings through family pairing.
'With over 99 per cent of violative content proactively removed by TikTok, the findings don't reflect the real experience of people on our platform, which the report admits.'

A Meta spokesman said: 'We disagree with the assertions of this report and the limited methodology behind it. Tens of millions of teens are now in Instagram teen accounts, which offer built-in protections that limit who can contact them, the content they see, and the time they spend on Instagram.

'We continue to use automated technology to remove content encouraging suicide and self-injury, with 99 per cent proactively actioned before being reported to us. We developed teen accounts to help protect teens online and continue to work tirelessly to do just that.'

Is Starmer ready to take decisive measures to save our children?

By Ian Russell

Eight years on from my daughter Molly's death, we continue to lose the battle against the untold harm being inflicted by tech giants. Every week in the UK, we lose at least another teenager to suicide where technology plays a role. However, the unfathomable reality is that I'm less convinced than ever that our politicians will do what's necessary to stop this preventable harm in its tracks.

This week, the Molly Rose Foundation released deeply disturbing new research showing that in the weeks before the Online Safety Act took effect, Instagram and TikTok's algorithms continued to recommend the type of toxic material that cost my daughter her life. Vulnerable teenagers continue to be bombarded with suicide, self-harm and intense depression material on a near industrial scale.

We should be in no doubt why this is still happening. The harms on social media are the direct result of business models that actively prioritise user engagement and a race for market share. Children's safety continues to be seen as an optional extra, and Ofcom's desperately unambitious implementation of the Online Safety Act will do little to change the commercial incentives that continue to cost children's lives.

This preventable harm is happening on this Government's watch. Six months ago, I met the Prime Minister and told him that further urgent action was necessary.

Ofcom 'timid and unambitious'

I told him that parents were heartened by Labour's commitment to strengthen the Online Safety Act in opposition. That they were encouraged by the Technology Secretary's recognition that the Act was 'uneven and unsatisfactory'.

Crucially, I explained that swift and decisive action was necessary to fix structural issues with the Online Safety Act, still the most effective and quickest way to protect children from widespread harm while also enabling them to enjoy the benefits of life online, and to arrest the sticking plaster approach that Ofcom has adopted to implementation.

I don't have confidence in the regulator's approach. Ofcom has proven to be desperately timid and unambitious, and seems determined to take decisions that are stacked in favour of tech companies rather than victims.

For all the regulator's breathless claims to be 'taming toxic algorithms', buried in the detail of their plans is an expectation that the likes of TikTok and Instagram will only need to spend £80,000 fixing the algorithms that helped kill Molly, and that our research shows are continuing to cause widespread and pervasive harm today. This is pocket money to platforms making tens of billions every year.
It sends the clearest of signals to the tech giants that the current regime expects them to pay lip service to online safety but doesn't really expect them to implement the achievable changes to prioritise safety over profit.

Six months after I met the Prime Minister and Technology Secretary Peter Kyle, and received a personal assurance from them that they would look again at this issue, all I have received from Number 10 is radio silence. Meanwhile, the Technology Secretary appears to be more interested in playing performative gesture politics with sticking plaster ideas like two-hour app caps that those who work in online safety immediately recognise will do little to meaningfully move the dial.

Public support for Act is strong

Despite the recent predictable howls of protest from free speech activists and tech libertarians, public support for the Online Safety Act remains strong. Our polling suggests that 80 per cent of adults want the Act to be strengthened, with a growing despair from parents and the public that our politicians seem unable or unwilling to protect our children, families and wider society from preventable harm.

With all this in mind, my message to Sir Keir Starmer is clear. Is he prepared to take the decisive measures necessary to strengthen regulation and take on the tech companies which are a threat to children's safety? Will he listen to the public, bereaved families and civil society to deliver a comprehensive strengthening of the Online Safety Act, knowing that the majority of people in this country will be cheering him on?

The alternative is that he leaves children and families at risk from largely unchecked but inherently preventable harm. Regulation that is well-meaning but isn't up to the job will not save the lives that it must.
