
Latest news with #OnlineSafety

Digital paedophilia is still child abuse, even if the child is virtual — Haezreena Begum Abdul Hamid

Malay Mail

5 hours ago

JULY 24 — Digital paedophilia refers to the sexual exploitation of children in virtual spaces, through tools like artificial intelligence, deepfakes, cartoons, and virtual reality. These images may be digitally generated, and the children portrayed may not even exist in real life. Yet the intent is deeply exploitative, the gratification it feeds is criminal, and the harm it causes, both individually and socially, is undeniable.

While this form of abuse doesn't always involve contact with a child, it normalises the sexualisation of minors, perpetuates a dangerous fantasy, and fuels demand in underground networks. As a criminologist, I've seen how what begins as 'just digital content' can desensitise consumers and eventually lead to real-world offending. It systematically erodes the legal, ethical, and societal safeguards established to protect the rights and dignity of children.

This threat is amplified by the pervasive culture of online sharing. Many individuals, including parents, are drawn to posting their daily lives on social media, treating platforms like Facebook and TikTok as digital diaries. While these platforms foster connection and self-expression, they also expose users — particularly children — to serious risks. It is disturbingly easy to download images from Facebook and extract videos from TikTok, even without the content owner's knowledge or consent. Once uploaded, these digital footprints can be stolen, altered, and weaponised for exploitation.

Digital paedophilia can take many forms: AI-generated child sexual abuse material (CSAM) that mimics realistic children in sexual scenarios; deepfakes that superimpose children's faces — often taken from social media — onto adult bodies in explicit content; illustrated pornography (such as hentai or lolicon) that sexualises underage characters; and virtual reality simulations that allow users to act out child abuse fantasies in immersive settings.

What makes this even more dangerous is that it is easy to access, easy to share, and hard to trace. The perpetrators hide behind screens, usernames, and encrypted platforms. And yet the damage is far from virtual.

Malaysia has made commendable strides with the Sexual Offences Against Children Act 2017, the Penal Code, and the Communications and Multimedia Act 1998 — all of which provide mechanisms to combat traditional forms of child sexual abuse, including live-streamed exploitation. However, these laws still fall short when it comes to digital creations. Many of these materials are not technically illegal under current definitions of CSAM, because no real child was involved in their creation. But does that make them any less harmful? I would argue not. These depictions, no matter how 'virtual', are created with the intent of sexualising minors and should be recognised as forms of child sexual exploitation.

Other countries have taken this step. The UK, Canada, and Australia have criminalised virtual child pornography, recognising that a legal definition limited to real, identifiable victims is inadequate in the face of emerging technology. Malaysia must consider doing the same.

The harms of digital paedophilia are not hypothetical. Every day, images of real children — often taken innocently by their parents and shared online — are stolen, manipulated, and repurposed into sexual content. This is a profound violation of dignity, privacy, and safety. These children may never know what was done to their image, but the psychological and reputational damage can be lifelong.

Unlike Denmark, which recognises an individual's legal right to their own image and personal likeness through robust copyright and data protection laws, Malaysia currently lacks explicit legal provisions granting individuals — particularly children — ownership and control over the use of their personal images. This legal vacuum leaves Malaysian children especially vulnerable to digital exploitation, with limited recourse or protection under current frameworks.

Moreover, digital abuse fuels cognitive distortions in those with paedophilic tendencies. Studies show that repeated exposure to virtual CSAM lowers inhibition, increases desensitisation, and can serve as a gateway to contact offences. As a society, we cannot afford to wait until a physical child is harmed before acting. The damage begins long before that.

An effective and urgent response is therefore required. This includes amending existing laws to explicitly cover digitally created and AI-generated child sexual abuse material: we must criminalise intent and impact, not just physical involvement. Social media platforms, app developers, and AI companies must also be held accountable for detecting, reporting, and removing exploitative content — technological innovation must not come at the expense of child safety.

Digital literacy should no longer be seen as optional. Parents, children, and educators must be equipped to understand the risks of oversharing online and how personal images can be misused. Every policy, investigation, and reform effort must place the child at the centre — not merely as a legal category, but as a human being deserving of dignity, protection, and justice. In addition, Malaysia must strengthen its cooperation with global agencies such as Interpol, Aseanapol, and other cross-border cybercrime task forces to track offenders and dismantle transnational networks.

Digital paedophilia sits at the intersection of technological progress and moral regression. As artificial intelligence and virtual reality continue to advance, so too does the capacity to simulate harm, commodify children, and conceal abuse behind layers of code. But just because the abuse is virtual does not mean it is any less real. The law must evolve to meet this challenge, because every child, whether real or digitally represented, deserves to be safe from sexual exploitation.

* Dr Haezreena Begum Abdul Hamid is a Criminologist and Senior Lecturer at the Faculty of Law, University of Malaya. She can be reached at [email protected]

* This is the personal opinion of the writer or publication and does not necessarily represent the views of Malay Mail.

Watchdog must fine social media companies that are slow to remove racism after Jess Carter abuse, says culture secretary

Sky News

3 days ago

  • Politics

The online safety regulator should use its powers to fine social media companies that fail to remove racism quickly, Culture Secretary Lisa Nandy told Sky News, after concerns were raised by England defender Jess Carter.

Carter has declared herself ready to play in the Women's European Championship semi-final against Italy on Tuesday after speaking out on the hate she has faced online during the tournament.

Players have expressed frustration that they are having to use their platform to pressure the tech firms, given how often footballers have had to deal with racist abuse. The Online Safety Act should now compel the companies to act.

"We've introduced new laws so that platforms are under a legal obligation to take down that sort of disgusting content immediately," Ms Nandy told Sky News. "And they can be pursued through fines, through Ofcom, if they don't do it.

"It's now up to those platforms and up to Ofcom to fulfil those roles that we've given them and make sure that this is stamped out online, that it's dealt with very quickly."

But Kick It Out chairman Sanjay Bhandari told Sky News on Sunday that "it's got worse on social media, not better", singling out Elon Musk's X and Mark Zuckerberg's Instagram. Neither company has responded to requests for comment, including via a public X post.

England defender Lucy Bronze said "online abuse is getting worse and worse" in women's football.

Ms Nandy said: "The racial abuse that's been directed at Jess Carter is utterly disgusting and unfortunately is too common for women at the top of their game, not just in football but across sport as a whole.

"We're considering as a government what more we can do to protect women players who reach those levels of exposure."

The government has made dealing with sports issues a priority, with legislation passed today to introduce an independent regulator for men's football. The watchdog aims to ensure clubs are run sustainably and are accountable to their fans.

Ms Nandy said: "There are now protections in law for fans and for clubs to make sure that we have really fit and proper owners; that there is somebody who can tackle rogue owners when problems arise; that we get a proper financial flow to ensure the sustainability of clubs throughout the football pyramid; and to make sure that fans are put back at the heart of the game where they belong."

The Premier League remains concerned the regulator could harm the success of its competition through unintended consequences.

IYKYK: Here Are the Popular Teen 'Texting Codes' Every Parent Should Know

Yahoo

16-07-2025

  • General

Millennial parents are no strangers to acronyms. In fact, Millennials and Gen Xers are credited with making "LOL" (laughing out loud) so popular on instant messenger that it eventually earned a spot in the Oxford English Dictionary in 2011. [1] (Take that, Gen Alpha!) But even with their impressive acronym cred, parents of today's teens are finding their kids texting in a mix of letters and words that may as well be an entirely different language. (IYKYK, am I right?) And while most of the acronyms are harmless, some forms of messaging are not. Specifically, 'texting codes' can signal cases of cyberbullying and serious mental health concerns in teens.

Acronyms vs. 'Texting Codes'

While an acronym is the first letter of each word in a phrase, Titania Jordan, Chief Parent Officer of online safety company Bark Technologies, explains that texting codes are a combination of acronyms, characters, words, and even emojis that represent hidden meanings. As a result, texting codes can be much harder for parents to understand, which unfortunately is exactly the point.

'Acronyms are [used] for ease of typing, as it's just quicker to tap out 'ILY' instead of 'I love you,'' Jordan says. 'Text codes are different. They can be used to cover your tracks in case someone is monitoring your messages.'

Because texting codes are meant to look like harmless symbols or slang words, parents are more likely to overlook them. For example, parents may not be aware that '🍃' is code for "marijuana", or that 'seggs' is a code word for "sex".

With that said, the use of codes can also simply be a way kids choose to connect, explains Erin Walsh, author of It's Their World: Teens, Screens, and the Science of Adolescence and co-founder of Spark & Stitch Institute. 'Texting codes certainly can be used to avoid adult detection of risky behaviors,' Walsh says. 'But they can also just be shorthand ways for young people to build connections with friends and demonstrate belonging to a group.'

Popular Acronyms and Meanings

New acronyms pop up every day, according to Jordan, but here are some of the most common ones used by kids:

BRB - "Be right back"
BTW - "By the way"
FOMO - "Fear of missing out"
GOAT - "Greatest of all time"
GTG - "Got to go"
GR8 - "Great"
IMO - "In my opinion"
ISO - "In search of"
IYKYK - "If you know you know" (meant to imply that there's an inside joke)
ILY - "I love you"
IRL - "In real life"
JK - "Just kidding"
KMS - "Kill myself"
KYS - "Kill yourself"
L8R - "Later"
LMAO - "Laughing my ass off"
LOL - "Laugh(ing) out loud"
NP - "No problem"
OMW - "On my way"
OFC - "Of course"
ROTF - "Rolling on the floor" (typically in laughter)
SMH - "Shaking my head" ("I don't believe it" or "that's so dumb")
STFU - "Shut the f**k up"
TBH - "To be honest"
TYVM - "Thank you very much"
WYD - "What you doing?"
WTF - "What the f**k?"
WYA - "Where you at?"
WUF - "Where you from?"

Popular Texting Codes and Meanings

These code-like acronyms have underlying meanings that kids may want to keep hidden:

ASL - "Age/sex/location"
CD9 or Code 9 - "Parents are around"
DTF - "Down to f*ck"
FBOI - "F*ck boy" (or a guy just looking for sex)
FWB - "Friends with benefits"
LMIRL - "Let's meet in real life"
NP4NP - "Naked pic for naked pic"
POS - "Parent over shoulder"
TDTM - "Talk dirty to me"

Concerning Texting Codes Parents Should Never Ignore

Experts agree the rise of acronyms and codes that refer to self-harm or mental health struggles is alarming, and they should be taken seriously. In fact, the latest research suggests that social media codes can be used to identify tweens and teens at risk for suicide, which makes it critical for parents to be able to spot concerning conversations. [2]

According to Jordan, these are the codes that should raise immediate red flags if you see them appear in any social media posts involving your teen:

KMS - "Kill myself"
KYS - "Kill yourself"
STFU - "Shut the f**k up"
Unalive - "Kill" or "dead"
Sewerslide - "Suicide"
Grippy sock vacation - "A stay in a psychiatric treatment facility"
I had pasta tonight - "I had suicidal thoughts"
I finished my shampoo and conditioner at the same time - "I'm having suicidal thoughts"

'If someone's commenting 'KYS' on your child's Instagram or texting it to them, it's potentially a sign of bullying,' Jordan warns. 'It could be causing negative effects on their sense of self-worth and their mental health.'

STFU ("shut the f*ck up") can be used as an expression of disbelief between friends, but it can also signal cyberbullying when used publicly on social media.

How to Support Your Teen

Experts give the caveat that simply knowing what these codes mean doesn't always reveal the context in which they're being used. 'A single acronym or code rarely tells the whole story,' Walsh says. For example, 'KMS' can signal serious suicidal ideation, but it's also used to describe trivial moments of embarrassment or annoyance in personal text exchanges.

Walsh emphasizes that continued communication will help you discern between a cause for concern and simply a need for some digital-age skill-building. She suggests the following:

Don't assume the worst. Ask your child for an explanation or background of what you've seen before you launch into a lecture. 'It is okay for there to be long silences as your child sorts through their feelings about online interactions,' Walsh says. Their reflection will shed the best light on the meaning behind what you've seen.

Avoid becoming a 'spy.' "A quick 'Gotcha!' reaction to concerning acronyms or codes can create confusion, increase conflict, and may even encourage more secrecy as teens try to avoid adult surveillance and punishment," Walsh says.

Let your child know you're there to help. Receiving text codes related to self-harm or suicide can raise a host of difficult questions for teens, Walsh says. For example, 'Is my friend serious?' 'Should I talk to someone about this?' or 'What should I do next?' Reassuring your child that you are there to support them will foster honest conversations to determine next steps.

Read the original article on Parents

Ofcom boss: Tech firms not given much power over how to protect children online

Yahoo

13-07-2025

  • Business

Technology companies are not being given much power over measures to provide greater protection to children online, the head of Ofcom has said as she defended upcoming reforms.

The regulator announced last month that sites containing potentially harmful content, like porn sites, will have to perform age checks on users under the Online Safety Act, reforms which apply to dedicated adult sites as well as to social media, search and gaming services.

Ian Russell, who has been campaigning for improved online safety since his 14-year-old daughter Molly took her own life after viewing harmful content on social media, said Ofcom needs to 'act within the bounds of the Act in the strongest possible way' and communicate weaknesses in the legislation to the Government.

Ofcom's chief executive Dame Melanie Dawes told the BBC's Sunday with Laura Kuenssberg: 'We've set out about five or six things that we think can work, like facial checks and using things where you've already been checked for your age, like credit cards or open banking.

'We said (to tech companies) you decide what works for your platform but we will be checking whether it's effective and those that don't put in those checks will hear from us with enforcement action.'

Responding to the suggestion that Ofcom is giving companies a lot of power over how they implement measures, Dame Melanie said: 'No, we're not giving them that much power actually. What I'm saying is that when they're putting in age checks they need to work out what's going to work on their service.

'But, let me be really clear, what we are demanding to protect children and what does come in force at the end of this month, they're going to need to tame those algorithms so that not just porn and suicide and self-harm material must not be shown but also violent content, dangerous challenges, misogyny, all of that must not be fed actively to kids on their feeds.'

Pressed on why those types of content are not being blocked altogether, the chief executive said: 'What Parliament decided was that there should be an absolute block on suicide and self-harm material and pornography for under-18s and, then, what we've done is also add to that other types of content that we think is really harmful for children.'

She added: 'I'm not a politician and I think it's incredibly important that Ofcom respects the role that we have, which is to implement the laws that we've been given.

'If Parliament decides to widen those towards mis- and disinformation, or wider issues around addiction for the kids, for example, then of course, Ofcom stands ready to implement that.'

Mr Russell said on the programme that it 'sounds promising' but the proof will be in what happens in practice. He said: '(Ofcom) need to act within the bounds of the Act in the strongest possible way.

'They're sitting in the middle, pushed on one side by families who've lost people like me and pushed on the other side by the power of the big tech platforms.

'I also think it's really important that Melanie starts to talk back to Government because Ofcom is clear about where the Act is weak, and she needs to push back and communicate those weaknesses to the Government so that we can make change where necessary.'

He said the charity he set up in his daughter's name, the Molly Rose Foundation, will be monitoring how harmful content online is reduced.

Any company that fails to comply with the checks by July 25 could be fined or could be made unavailable in the UK through a court order.

Transport Secretary Heidi Alexander said those changes in the law are 'really important', adding it was now up to technology companies to put in 'robust safeguards' for children using their platforms.

But she suggested it was not the end of ministers' plans, telling the BBC's Sunday with Laura Kuenssberg: 'We are very clear as a Government that this is the foundation for a safer online experience for children, but it is not the end of the conversation.

'Peter Kyle, the Technology Secretary, has been clear that he wants to look at things such as addictive habits and how we create healthier habits for children online, in the same way as we talk about healthier physical habits for children.'

Ministers 'will keep under review what is required', Ms Alexander added.

Ofcom research found that 8% of eight to 14-year-olds in the UK had visited an online porn site or app on smartphones, tablets or computers within a month.

Last month, the regulator said it had launched a string of investigations into 4chan, a porn site operator and several file-sharing platforms over suspected failures to protect children, after it received complaints about illegal activity and potential sharing of child abuse images.

A report looking into the use and effectiveness of age assurance methods will be published by Ofcom next year.

