
Latest news with #illegalcontent

Ofcom investigates 4chan and porn site over suspected child safety breaches

The Independent

4 days ago


Ofcom has launched a string of investigations into 4chan, a porn site operator and several file-sharing platforms over suspected failures to protect children and prevent illegal content. The media regulator is looking into whether the sites have broken new rules under the Online Safety Act, which require online services to crack down on child sexual abuse material and put strong safety measures in place for UK users.

The platforms under investigation include 4chan, porn provider First Time Videos, which runs two websites, and seven file-sharing services, including Krakenfiles, Nippybox, Nippydrive, Nippyshare, Nippyspace and Yolobit. Ofcom said it received complaints about illegal activity on 4chan and potential sharing of child abuse images on the file-sharing sites. It also said none of the services responded to legal information requests.

The First Time Videos probe will examine whether it has 'highly effective' age checks to stop children viewing porn, which is now a legal requirement for such sites. A spokesperson said: 'We have received complaints about the potential for illegal content and activity on 4chan, and possible sharing of child sexual abuse material on the file-sharing services.'

If the companies are found to have broken the law, Ofcom can impose fines of up to £18 million or 10% of global turnover, and even seek court orders to block UK access. The regulator said more enforcement action is expected as wider parts of the Online Safety Act come into force from the end of July.

4chan and porn sites investigated by Ofcom

BBC News

4 days ago


The online message board 4chan is being investigated by the UK communications regulator over failure to comply with recently introduced online safety rules. Ofcom says it has received complaints over potential illegal content on the website, which has not responded to its requests for information. Under the Online Safety Act, online services must assess the risk of UK users encountering illegal content and activity on their platforms, and take steps to protect them from it. Ofcom is also investigating porn provider First Time Videos over its age verification checks, and seven file-sharing services over potential child sexual abuse material. 4chan has been contacted for comment.

Ofcom says it requested 4chan's risk assessment in April but has not had any response. The regulator will now investigate whether the platform "has failed, or is failing, to comply with its duties to protect its users from illegal content". It would not say what kind of illegal content is involved. Ofcom has the power to fine companies up to 10% of their global revenues, or £18m - whichever is the greater number.

4chan has often been at the heart of online controversies in its 22 years, including misogynistic campaigns and conspiracy theories. Users are anonymous, which can often lead to extreme content being posted. It was the subject of an alleged hack earlier this year, which took parts of the website down for over a week.

Seven file-sharing services also failed to respond to requests for information from the regulator. They include Krakenfiles, Nippybox, Nippydrive, Nippyshare, Nippyspace and Yolobit. Ofcom also says it has received complaints over potential child sexual abuse material being shared on these platforms.

Separately, porn provider First Time Videos, which runs two websites, is being investigated over whether it has adequate age checks in place to stop under-18s accessing its content. Sites which host age-restricted content must have "robust" age checks in place by July. Ofcom does not specify exactly what this means, but some platforms have been trialling age verification using facial scanning to estimate a user's age. Social media expert Matt Navarra told BBC News earlier this year that facial scanning could become the norm in the UK.

Risto Bergman: Telecoms worker who created AI-generated images of child sex abuse is sentenced

Sky News

19 May 2025


A telecoms worker who created AI-generated images of child sex abuse has been sentenced. Risto Bergman, 42, who is originally from Finland, used a legitimate artificial intelligence (AI) app to make indecent images of young girls being abused. Scotland's Crown Office and Procurator Fiscal Service (COPFS) said the images were "so realistic" that they could be taken for authentic photographs.

Bergman was said to have used sexually descriptive search terms while using the app. COPFS said AI then generated the "distressing images" by drawing upon a "digital library" of hundreds of pictures of real child abuse which had been previously shared online by paedophiles. Images discovered on a computer storage unit found in Bergman's Paisley flat included some described as category A - depicting the most extreme type of child sex abuse.

Bergman, who has since moved from Renfrewshire to Argyll and Bute, last month pleaded guilty to making indecent photographs or pseudo-photographs of children. He returned to Paisley Sheriff Court on Monday, where he was handed an 18-month Community Payback Order (CPO). His name was also added to the sex offenders' register.

David Bernard, procurator fiscal for north Strathclyde, said Bergman had created illegal material that both exploited children and perpetuated abuse. He said: "This is by no means a victimless crime. Bergman's depraved actions effectively encouraged those who abuse children to continue their activities.

"Artificial intelligence apps draw upon online images of real children being subjected to sexual abuse. Behind every AI-generated 'pseudo-photograph' of abuse are real-life child victims."
