
School vows fix after student spread lewd AI pictures
A SCHOOL in Johor admitted that it had been slow in taking action when one of its students was accused of spreading AI-generated pornographic images of his classmates, Sin Chew Daily reported.
Foon Yew High School chairman Tong Sing Chuan acknowledged that the Chinese independent school had been sluggish in addressing the matter and promised reforms.
'Many parents are concerned about this matter. The board is also saddened (that this happened).
'I stress that the board will not accept or tolerate any form of wrongdoing,' he said.
Earlier, it was reported that a 16-year-old male student from the school was arrested in April for using AI to create and sell pornographic images of his female schoolmates and alumni.
About 30 to 40 female students and alumni are believed to have become victims.
Following the incident, the principal resigned on April 12.
> A woman in China called off her wedding after her fiance was found to be one of dozens of men who had sex with the infamous cross-dressing 'Red Uncle', China Press reported.
Chinese cyberspace was inundated with posts about a man dubbed Red Uncle, who was rumoured to have lured 1,691 heterosexual men into his home for sexual encounters.
He then recorded and distributed the intimate encounters online.
Netizens then went about identifying the men involved.
One man, who made many visits to Red Uncle, was nicknamed 'Jacket Man' because he wore a leather jacket in the videos.
Netizens identified his social media profile and found pre-wedding photos he had taken with his fiancee, and began sharing them.
The woman reportedly called off her wedding and will be getting herself tested for sexually transmitted infections.
The 38-year-old Red Uncle, whose surname is Jiao, was arrested by Nanjing police for spreading obscene videos online.
The police denied that Jiao had encounters with 1,691 men but did not give a specific figure.
(The above articles are compiled from the vernacular newspapers (Bahasa Malaysia, Chinese and Tamil dailies). As such, stories are grouped according to the respective language/medium. Where a paragraph begins with a >, it denotes a separate news item.)
Related Articles


The Star
China school to expel woman for sex with Ukraine man that 'hurts national dignity'
A Chinese student is facing expulsion from university for having casual sex with a Ukrainian gamer. The student, surnamed Li, has been accused of 'hurting national dignity', sparking online accusations of excessive punishment and rights infringement.

An official from Dalian Polytechnic University in northeastern China's Liaoning province said on July 8 that the institution planned to expel Li for an act of 'misconduct' on December 16, 2024. The 21-year-old was reported to have had a one-night stand with 37-year-old former Counter-Strike player Danylo Teslenko, known as 'Zeus', while he was attending an event in Shanghai last December.

Teslenko posted intimate videos and photos of himself and Li in his fan group, and reportedly called her an 'easy girl'. The footage was leaked by his Chinese fans, and Li's information, including her real name, family background, and social media accounts, was doxxed. Some men reportedly harassed the university to punish Li.

But many people said it was the reaction of the university that they found shocking. Not only did the university divulge Li's full name, it also cited disciplinary regulations stating that Li's behaviour saw her 'socialise with foreigners improperly and undermine national dignity and the school's reputation'. Many protested against the school over its excessive punishment and the infringement of Li's privacy. 'Her private life is none of the school's business,' one online observer said.

It was widely reported that Teslenko was married with a child and that Li had a boyfriend at the time of the fling. However, Teslenko said he was neither married nor in a relationship when he was intimate with Li. He also expressed regret for posting the videos and denied the accusations that he had ever called Chinese girls 'easy'. A UK gaming news website reported that what Teslenko had said in Russian while hugging Li in the video may have been misconstrued as meaning he thought Chinese girls were 'easy'.

The school's statement said that Li could appeal the decision by September 7. Li has yet to respond to the school's decision.

Jin Lin, a lawyer from Beijing's Xidong Law Firm, told the mainland media outlet Fengmian News that the school's decision does not have legal grounds. According to the Provisions on the Administration of Students in Regular Institutions of Higher Education, students can only be expelled in eight circumstances, none of which match Li's situation. More than one lawyer advised Li to seek legal help and sue the school.

The case also triggered accusations of bias against female students. In similar and even more serious cases involving male students, full names were rarely revealed, and some perpetrators received minor punishments. One online observer said: 'We still do not know Uncle Red's full name, but the school cannot wait to publish a female student's name for living her own life.' Uncle Red is a 38-year-old cross-dresser from eastern China's Jiangsu province who went viral worldwide for tricking hundreds of men into having sex with him and filming them.

'She is the biggest victim in this case. The Ukrainian man posted her face without her consent. The men witch-hunted her. Her school further hurt her by expelling her and publishing her name. They all owe her an apology,' said another.

The university had not responded to the public comments by the time of writing. - SOUTH CHINA MORNING POST


The Sun
AI-powered 'nudify' apps fuel deadly wave of digital blackmail
WASHINGTON: After a Kentucky teenager died by suicide this year, his parents discovered he had received threatening texts demanding $3,000 to suppress an AI-generated nude image of him.

The tragedy underscores how so-called sextortion scams targeting children are growing around the world, particularly with the rapid proliferation of 'nudify' apps -- AI tools that digitally strip off clothing or generate sexualized imagery.

Elijah Heacock, 16, was just one of thousands of American minors targeted by such digital blackmail, which has spurred calls for more action from tech platforms and regulators. His parents told US media that the text messages ordered him to pay up or an apparently AI-generated nude photo would be sent to his family and friends.

'The people that are after our children are well organized,' John Burnett, the boy's father, said in a CBS News interview. 'They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child.'

US investigators were looking into the case, which comes as nudify apps -- which rose to prominence targeting celebrities -- are being increasingly weaponized against children.

The FBI has reported a 'horrific increase' in sextortion cases targeting US minors, with victims typically males between the ages of 14 and 17. The threat has led to an 'alarming number of suicides,' the agency warned.

Instruments of abuse

In a recent survey, Thorn, a non-profit focused on preventing online child exploitation, found that six percent of American teens have been a direct victim of deepfake nudes.

'Reports of fakes and deepfakes -- many of which are generated using these 'nudifying' services -- seem to be closely linked with reports of financial sextortion, or blackmail with sexually explicit images,' the British watchdog Internet Watch Foundation (IWF) said in a report last year. 'Perpetrators no longer need to source intimate images from children because images that are convincing enough to be harmful -- maybe even as harmful as real images in some cases -- can be produced using generative AI.'

The IWF identified one 'pedophile guide' developed by predators that explicitly encouraged perpetrators to use nudifying tools to generate material to blackmail children. The author of the guide claimed to have successfully blackmailed some 13-year-old girls.

The tools are a lucrative business. A new analysis of 85 websites selling nudify services found they may be collectively worth up to $36 million a year. The analysis from Indicator, a US publication investigating digital deception, estimates that 18 of the sites made between $2.6 million and $18.4 million over the six months to May. Most of the sites rely on tech infrastructure from Google, Amazon, and Cloudflare to operate, and remain profitable despite crackdowns by platforms and regulators, Indicator said.

'Whack-a-mole'

The proliferation of AI tools has led to new forms of abuse impacting children, including pornography scandals at universities and schools worldwide, where teenagers created sexualized images of their own classmates.

A recent Save the Children survey found that one in five young people in Spain have been victims of deepfake nudes, with those images shared online without their consent. Earlier this year, Spanish prosecutors said they were investigating three minors in the town of Puertollano for allegedly targeting their classmates and teachers with AI-generated pornographic content and distributing it in their school.

In the United Kingdom, the government this year made creating sexually explicit deepfakes a criminal offense, with perpetrators facing up to two years in jail. And in May, US President Donald Trump signed the bipartisan 'Take It Down Act,' which criminalizes the non-consensual publication of intimate images, while also mandating their removal from online platforms.

Meta also recently announced it was filing a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules to post ads on its platforms.

But despite such measures, researchers say AI nudifying sites remain resilient. 'To date, the fight against AI nudifiers has been a game of whack-a-mole,' Indicator said, calling the apps and sites 'persistent and malicious adversaries.' - AFP

