
Latest news with #IWF

Sports Bill empowers women with greater representation and responsibility

Time of India | 4 hours ago

By Mirabai Chanu

Earlier this year, when I got a letter from the Indian Weightlifting Federation (IWLF) asking me to become the chairperson of its Athletes' Commission, it came as a pleasant surprise. Coming as it did at a time when our sport urgently needs to find the next generation of weightlifters to win global medals for India, I accepted the position alongside Commonwealth Games gold medallist S. Sathish Kumar, who is the vice-chairman of the commission.

My four-year term will be a big opportunity to express how women feel about weightlifting, where the intricacies of the sport are vastly different for women than for men. Weightlifting hinges on explosive power, and every muscle in a woman's body is impacted. So it is important to build an ecosystem that makes it favourable for women to take up such a physically demanding sport.

The Khelo Bharat Niti and the Sports Governance Bill mooted by the Union sports ministry complement each other. We have had a sports code that has worked well, but sport is dynamic, and policies and laws therefore need to change for the better. The good thing is that the Khelo Bharat Niti is aimed at the greater good of the youth at large, and the Bill will ensure ease of doing sporting business so that there are no hiccups and athletes wishing to excel at the highest level can simply focus on their training.

The inclusion of women in governance should not be seen as an intrusion. Gender neutrality is at the heart of the Olympic movement, and we now have the first woman as president of the International Olympic Committee (IOC). Since Tokyo 2020, the focus on women in global sport has grown sharper. At Los Angeles 2028, there will be more medals at stake for women than for men. It is a significant signal that National Olympic Committees and their constituents will need to align themselves accordingly and ensure women get the respect they deserve on and off the field.

The Bill mandates that at least four women be on the executive committee of national sports bodies. It is a good start towards ensuring that women with international sporting experience are represented and, more importantly, heard. Going forward, I expect 50% participation of women in executive committees.

Governance does not mean ticking boxes to superficially satisfy rules and regulations. Since the Bill is wholly athlete-centric, it also means taking into consideration small details that are often overlooked. Women, by nature, are meticulous, and when armed with administrative powers they will surely be responsible and result-oriented. At least that is the way I see myself in my new role in the IWLF. Having meaningful conversations and creating a strategy that translates into growth and excellence should be the objective.

Since 2014, there has been a definitive shift in mindset towards sports in India. The Khelo India initiative is now at the fountainhead of the sporting revolution in the country, but what has contributed to growth and excellence is the focus on infrastructure development, sports science and the government's deep desire to back athletes to shine. I have personally seen the transformation at the Netaji Subhas National Institute of Sports in Patiala.

The Sports Governance Bill will only make sure we take rapid strides towards winning more medals at the continental and global levels without compromising on the tenets of good governance. Since the 2016 Rio Olympics, we have witnessed a remarkable rise in the performance and presence of women athletes on the global stage. The momentum continued with even greater impact at the 2020 Tokyo Olympics and the 2024 Paris Olympics, where Indian women athletes consistently delivered inspiring performances and carried the hopes of a nation. Their growing success not only reflects the evolving landscape of Indian sports but also highlights the importance of creating more inclusive opportunities.

Although I have personally never faced any harassment or abuse, I am keen that the 'Safe Sports Policy' is strictly adopted by national sports federations and their affiliates. We still live in a male-dominated world, and protection of the girl child is non-negotiable.

(Mirabai Chanu is an Olympic medal-winning weightlifter)

AI-powered 'nudify' apps fuel deadly wave of digital blackmail

GMA Network | 18-07-2025

WASHINGTON, United States - After a Kentucky teenager died by suicide this year, his parents discovered he had received threatening texts demanding $3,000 to suppress an AI-generated nude image of him. The tragedy underscores how so-called sextortion scams targeting children are growing around the world, particularly with the rapid proliferation of "nudify" apps -- AI tools that digitally strip off clothing or generate sexualized imagery.

Elijah Heacock, 16, was just one of thousands of American minors targeted by such digital blackmail, which has spurred calls for more action from tech platforms and regulators. His parents told US media that the text messages ordered him to pay up or an apparently AI-generated nude photo would be sent to his family and friends.

"The people that are after our children are well organized," John Burnett, the boy's father, said in a CBS News interview. "They are well financed, and they are relentless. They don't need the photos to be real, they can generate whatever they want, and then they use it to blackmail the child."

US investigators were looking into the case, which comes as nudify apps -- which rose to prominence targeting celebrities -- are being increasingly weaponized against children. The FBI has reported a "horrific increase" in sextortion cases targeting US minors, with victims typically males between the ages of 14 and 17. The threat has led to an "alarming number of suicides," the agency warned.

Instruments of abuse

In a recent survey, Thorn, a non-profit focused on preventing online child exploitation, found that six percent of American teens have been a direct victim of deepfake nudes.

"Reports of fakes and deepfakes -- many of which are generated using these 'nudifying' services -- seem to be closely linked with reports of financial sextortion, or blackmail with sexually explicit images," the British watchdog Internet Watch Foundation (IWF) said in a report last year. "Perpetrators no longer need to source intimate images from children because images that are convincing enough to be harmful -- maybe even as harmful as real images in some cases -- can be produced using generative AI."

The IWF identified one "pedophile guide" developed by predators that explicitly encouraged perpetrators to use nudifying tools to generate material to blackmail children. The author of the guide claimed to have successfully blackmailed some 13-year-old girls.

The tools are a lucrative business. A new analysis of 85 websites selling nudify services found they may be collectively worth up to $36 million a year. The analysis from Indicator, a US publication investigating digital deception, estimates that 18 of the sites made between $2.6 million and $18.4 million over the six months to May. Most of the sites rely on tech infrastructure from Google, Amazon, and Cloudflare to operate, and remain profitable despite crackdowns by platforms and regulators, Indicator said.

'Whack-a-mole'

The proliferation of AI tools has led to new forms of abuse impacting children, including pornography scandals at universities and schools worldwide, where teenagers created sexualized images of their own classmates. A recent Save the Children survey found that one in five young people in Spain have been victims of deepfake nudes, with those images shared online without their consent.

Earlier this year, Spanish prosecutors said they were investigating three minors in the town of Puertollano for allegedly targeting their classmates and teachers with AI-generated pornographic content and distributing it in their school.

In the United Kingdom, the government this year made creating sexually explicit deepfakes a criminal offense, with perpetrators facing up to two years in jail. And in May, US President Donald Trump signed the bipartisan "Take It Down Act," which criminalizes the non-consensual publication of intimate images, while also mandating their removal from online platforms.

Meta also recently announced it was filing a lawsuit against a Hong Kong company behind a nudify app called Crush AI, which it said repeatedly circumvented the tech giant's rules to post ads on its platforms.

But despite such measures, researchers say AI nudifying sites remain resilient. "To date, the fight against AI nudifiers has been a game of whack-a-mole," Indicator said, calling the apps and sites "persistent and malicious adversaries." —Agence France-Presse

AI-Driven 'Nudify' Apps Fuelling Digital Blackmail Across the World

The Wire | 18-07-2025

AI 'nudify' apps blackmailing minors rising in number

Express Tribune | 17-07-2025

AI-powered 'nudify' apps fuel deadly wave of digital blackmail

Time of India | 17-07-2025
