Latest news with #OnlineSafety


Sky News
2 days ago
- Politics
- Sky News
Watchdog must fine social media companies that are slow to remove racism after Jess Carter abuse, says culture secretary
The online safety regulator should use its powers to fine social media companies that fail to quickly remove racism, Culture Secretary Lisa Nandy told Sky News, after concerns were raised by England defender Jess Carter.

Carter has declared herself ready to play in the Women's European Championship semi-final against Italy on Tuesday after speaking out about the hate she has faced online during the tournament. Players have expressed frustration that they are having to use their platform to pressure the tech firms, given how often footballers have had to deal with racist abuse. The Online Safety Act should now be compelling the companies to take action.

"We've introduced new laws so that platforms are under a legal obligation to take down that sort of disgusting content immediately," Ms Nandy told Sky News. "And they can be pursued through fines, through Ofcom, if they don't do it. It's now up to those platforms and up to Ofcom to fulfil those roles that we've given them and make sure that this is stamped out online, that it's dealt with very quickly."

But Kick It Out chairman Sanjay Bhandari told Sky News on Sunday that "it's got worse on social media, not better", singling out Elon Musk's X and Mark Zuckerberg's Instagram. Neither company has responded to requests for comment, including via a public X post. England defender Lucy Bronze said "online abuse is getting worse and worse" in women's football.

Ms Nandy said: "The racial abuse that's been directed at Jess Carter is utterly disgusting and unfortunately is too common for women at the top of their game, not just in football but across sport as a whole. We're considering as a government what more we can do to protect women players who reach those levels of exposure."

The government has made dealing with sports issues a priority, with legislation passed today to introduce an independent regulator for men's football. The watchdog aims to ensure clubs are run sustainably and are accountable to their fans. Ms Nandy said: "There are now protections in law for fans and for clubs to make sure that we have really fit and proper owners; that there is somebody who can tackle rogue owners when problems arise; that we get a proper financial flow to ensure the sustainability of clubs throughout the football pyramid; and to make sure that fans are put back at the heart of the game where they belong."

The Premier League remains concerned the regulator could harm the success of its competition through unintended consequences.
Yahoo
7 days ago
- General
- Yahoo
IYKYK: Here Are the Popular Teen 'Texting Codes' Every Parent Should Know
Millennial parents are no strangers to acronyms. In fact, Millennials and Gen Xers are credited with making "LOL" (laughing out loud) so popular on instant messenger that it eventually earned a spot in the Oxford English Dictionary in 2011. [1] (Take that, Gen Alpha!) But even with their impressive acronym cred, parents of today's teens are finding their kids texting in a mix of letters and words that may as well be an entirely different language. (IYKYK, am I right?) And while most of the acronyms are harmless, some forms of messaging are not. Specifically, "texting codes" can signal cases of cyberbullying and serious mental health concerns in teens.

Acronyms vs. 'Texting Codes'

While an acronym is the first letter of each word in a phrase, Titania Jordan, Chief Parent Officer of online safety company Bark Technologies, explains texting codes as a combination of acronyms, characters, words, and even emojis that represent hidden meanings. As a result, texting codes can be much harder for parents to understand, which unfortunately is exactly the point.

"Acronyms are [used] for ease of typing, as it's just quicker to tap out 'ILY' instead of 'I love you,'" Jordan says. "Text codes are different. They can be used to cover your tracks in case someone is monitoring your messages."

Because texting codes are meant to look like harmless symbols or slang words, parents are more likely to overlook them. For example, parents may not be aware that "🍃" is code for marijuana, or that "seggs" is a code word for sex. That said, the use of codes can also simply be a way kids choose to connect, explains Erin Walsh, author of It's Their World: Teens, Screens, and the Science of Adolescence and co-founder of Spark & Stitch Institute. "Texting codes certainly can be used to avoid adult detection of risky behaviors," Walsh says. "But they can also just be shorthand ways for young people to build connections with friends and demonstrate belonging to a group."

Popular Acronyms and Meanings

New acronyms pop up every day, according to Jordan, but here are some of the most common ones used by kids:

BRB - "Be right back"
BTW - "By the way"
FOMO - "Fear of missing out"
GOAT - "Greatest of all time"
GTG - "Got to go"
GR8 - "Great"
IMO - "In my opinion"
ISO - "In search of"
IYKYK - "If you know you know" (meant to imply that there's an inside joke)
ILY - "I love you"
IRL - "In real life"
JK - "Just kidding"
KMS - "Kill myself"
KYS - "Kill yourself"
L8R - "Later"
LMAO - "Laughing my ass off"
LOL - "Laugh(ing) out loud"
NP - "No problem"
OMW - "On my way"
OFC - "Of course"
ROTF - "Rolling on the floor" (typically in laughter)
SMH - "Shaking my head" ("I don't believe it" or "that's so dumb")
STFU - "Shut the f**k up"
TBH - "To be honest"
TYVM - "Thank you very much"
WYD - "What you doing?"
WTF - "What the f**k?"
WYA - "Where you at?"
WUF - "Where you from?"

Popular Texting Codes and Meanings

These code-like acronyms have underlying meanings that kids may want to keep hidden:

ASL - "Age/sex/location"
CD9 or Code 9 - "Parents are around"
DTF - "Down to f*ck"
FBOI - "F*ck boy" (or a guy just looking for sex)
FWB - "Friends with benefits"
LMIRL - "Let's meet in real life"
NP4NP - "Naked pic for naked pic"
POS - "Parent over shoulder"
TDTM - "Talk dirty to me"

Concerning Texting Codes Parents Should Never Ignore

Experts agree the rise of acronyms and codes that refer to self-harm or mental health struggles is alarming, and they should be taken seriously.
In fact, the latest research suggests that social media codes can be used to identify tweens and teens at risk for suicide, which makes it critical for parents to be able to spot concerning conversations. [2] According to Jordan, these are the codes that should raise immediate red flags if they appear in social media posts involving your teen:

KMS - "Kill myself"
KYS - "Kill yourself"
STFU - "Shut the f**k up"
Unalive - "Kill" or "dead"
Sewerslide - "Suicide"
Grippy sock vacation - "A stay in a psychiatric treatment facility" / "mental breakdown"
I had pasta tonight - "I had suicidal thoughts"
I finished my shampoo and conditioner at the same time - "I'm having suicidal thoughts"

"If someone's commenting 'KYS' on your child's Instagram or texting it to them, it's potentially a sign of bullying," Jordan warns. "It could be causing negative effects on their sense of self-worth and their mental health."

STFU ("shut the f*ck up") can be used as an expression of disbelief between friends, but it can also signal cyberbullying when used publicly on social media.

How to Support Your Teen

Experts give the caveat that simply knowing what these codes mean doesn't always reveal the context in which they're being used. "A single acronym or code rarely tells the whole story," Walsh says. For example, "KMS" can signal serious suicidal ideation, but it's also used to describe trivial moments of embarrassment or annoyance in personal text exchanges.

Walsh emphasizes that continued communication will help you discern between a cause for concern and simply a need for some digital-age skill-building. She suggests the following:

Don't assume the worst. Ask your child for an explanation or background of what you've seen before you launch into a lecture. "It is okay for there to be long silences as your child sorts through their feelings about online interactions," Walsh says. Their reflection will shed the best light on the meaning behind what you've seen.

Avoid becoming a "spy." "A quick 'Gotcha!' reaction to concerning acronyms or codes can create confusion, increase conflict, and may even encourage more secrecy as teens try to avoid adult surveillance and punishment," Walsh says.

Let your child know you're there to help. Receiving text codes related to self-harm or suicide can raise a host of difficult questions for teens, Walsh says. For example, "Is my friend serious?" "Should I talk to someone about this?" or "What should I do next?" Reassuring your child that you are there to support them will foster honest conversations to determine next steps.

Read the original article on Parents.
Yahoo
13-07-2025
- Business
- Yahoo
Ofcom boss: Tech firms not given much power over how to protect children online
Technology companies are not being given much power over measures to provide greater protection to children online, the head of Ofcom has said as she defended upcoming reforms.

The regulator announced last month that, under the Online Safety Act, sites containing potentially harmful content, such as porn sites, will have to perform age checks on users; the reforms apply to dedicated adult sites as well as social media, search and gaming services.

Ian Russell, who has been campaigning for improved online safety since his 14-year-old daughter Molly took her own life after viewing harmful content on social media, said Ofcom needs to 'act within the bounds of the Act in the strongest possible way' and communicate weaknesses in the legislation to the Government.

Ofcom's chief executive Dame Melanie Dawes told the BBC's Sunday with Laura Kuenssberg: 'We've set out about five or six things that we think can work, like facial checks and using things where you've already been checked for your age, like credit cards or open banking. We said (to tech companies) you decide what works for your platform but we will be checking whether it's effective and those that don't put in those checks will hear from us with enforcement action.'

Responding to the suggestion that Ofcom is giving companies a lot of power over how they implement measures, Dame Melanie said: 'No, we're not giving them that much power actually. What I'm saying is that when they're putting in age checks they need to work out what's going to work on their service. But, let me be really clear, what we are demanding to protect children and what does come in force at the end of this month they're going to need to tame those algorithms so that not just porn and suicide and self-harm material must not be shown but also violent content, dangerous challenges, misogyny, all of that must not be fed actively to kids on their feeds.'

Pressed on why those types of content are not being blocked altogether, the chief executive said: 'What Parliament decided was that there should be an absolute block on suicide and self-harm material and pornography for under-18s and, then, what we've done is also add to that other types of content that we think is really harmful for children.'

She added: 'I'm not a politician and I think it's incredibly important that Ofcom respects the role that we have, which is to implement the laws that we've been given. If Parliament decides to widen those towards mis- and disinformation, or wider issues around addiction for the kids, for example, then of course, Ofcom stands ready to implement that.'

Mr Russell said on the programme that it 'sounds promising' but the proof will be in what happens in practice. He said: '(Ofcom) need to act within the bounds of the Act in the strongest possible way. They're sitting in the middle, pushed on one side by families who've lost people like me and pushed on the other side by the power of the big tech platforms. I also think it's really important that Melanie starts to talk back to Government because Ofcom is clear about where the act is weak and she needs to push back and communicate those weaknesses to the Government so that we can make change where necessary.'

He said the charity he set up in his daughter's name, the Molly Rose Foundation, will be monitoring how harmful content online is reduced. Any company that fails to comply with the checks by July 25 could be fined or could be made unavailable in the UK through a court order.
Transport Secretary Heidi Alexander said those changes in the law are 'really important', adding it was now up to technology companies to put in 'robust safeguards' for children using their platforms.

But she suggested it was not the end of ministers' plans, telling the BBC's Sunday with Laura Kuenssberg: 'We are very clear as a Government that this is the foundation for a safer online experience for children, but it is not the end of the conversation. Peter Kyle, the Technology Secretary, has been clear that he wants to look at things such as addictive habits and how we create healthier habits for children online in the same way as we talk about healthier physical habits for children.'

Ministers 'will keep under review what is required', Ms Alexander added.

Ofcom research found that 8% of eight to 14-year-olds in the UK had visited an online porn site or app on smartphones, tablets or computers in a month. Last month, the regulator said it had launched a string of investigations into 4chan, a porn site operator and several file-sharing platforms over suspected failures to protect children, after it received complaints about illegal activity and potential sharing of child abuse images.

A report looking into the use and effectiveness of age assurance methods will be published by Ofcom next year.


BBC News
13-07-2025
- Business
- BBC News
We're looking at further online safety rules, says minister
The government is considering further action to keep children safe online and will not "sit back and wait" on the issue, a cabinet minister has said.

Transport Secretary Heidi Alexander told the BBC new age-verification rules beginning later this month would have a "really important" impact. She said the regulations, to be overseen by media regulator Ofcom, would not be the "end of the conversation" on online safety.

Ofcom boss Melanie Dawes vowed to rigorously enforce the new requirements, adding the regulator "means business". But she acknowledged Ofcom may require further legal powers in order to keep pace with the rapidly developing impact of artificial intelligence (AI).

Under new powers introduced by the Online Safety Act, passed under the previous Tory government, Ofcom will require internet companies to conduct stricter age verification methods to check whether a user is under 18. A new code of practice, to apply from 25 July, will also require platforms to change algorithms affecting what is shown in children's feeds to filter out harmful content.

At the last election, Labour committed to "build on" the previous government's law and consider further measures to keep children safe. But it is yet to publish fresh legislation of its own, with ministers arguing the existing set of new regulations needs to be rolled out first.

'Addictive habits'

Speaking to Sunday with Laura Kuenssberg, Alexander said the new rules would bring in "really robust safeguards" to ensure proper age verification. But she added: "We are very clear as a government that this is the foundation for a safer online experience for children, but it is not the end of the conversation."

She said Technology Secretary Peter Kyle was looking at further action in a number of areas, including how to address "addictive habits" among children, although she did not provide further details. "We're not going to be a government that sits back and waits on this, we want to address it," she added.

Ofcom's chief executive told the programme the new rules would mean tech platforms would have to change their content algorithms "very significantly". Ms Dawes said the regulator would give websites some flexibility when deciding which age-verification tools to use, but pledged that those failing to put adequate checks in place "will hear from us with enforcement action".

However, she acknowledged some newer forms of AI "may not" be covered by powers contained in the existing legislation. "There may need to be some changes to the legislation to cover that," she added.