
Latest news with #NCII

Malaysia aims 100 cybersecurity experts accredited as C-CISO this year

The Sun · Business · 2 days ago

KUALA LUMPUR: The Ministry of Digital is targeting at least 100 cybersecurity experts to be accredited as Certified Chief Information Security Officers (C-CISO) by the end of this year, said Minister Gobind Singh Deo.

He said that through the C-CISO Certification Programme, the ministry is committed to producing more talent and expertise in the field to strengthen the public sector's preparedness against cyber threats. The initiative, he said, was implemented through a strategic collaboration with the EC-Council and the Human Resource Development Corporation (HRD Corp), with the first seven participants receiving the certification today.

'Our target is that by the end of this year, we can produce a total of 100 C-CISOs as a start and, moving forward, we want to see more people showing interest and participating in this programme,' he told a press conference after launching the Cyber Security Professional Capability Development Programme here today.

He said the programme would also be expanded to other sectors to ensure local talent in cybersecurity could continue to be developed. Also present were Digital Ministry secretary-general Fabian Bigar, CyberSecurity Malaysia chief executive officer Datuk Dr Amirudin Abdul Wahab and EC-Council president Sanjay Bavisi.

Earlier in his speech, Gobind Singh said the certification programme is one of the key components supporting the implementation of the Cyber Security Act 2024 (Act 854), especially among National Critical Information Infrastructure (NCII) entities.

'As a Chief Information Security Officer (CISO), the person holds a strategic role in ensuring the organisation's compliance with the provisions under Act 854.

'The responsibilities of the CISO include formulating cybersecurity policies, implementing technical controls, risk management and organisational preparedness in dealing with cyber incidents, in addition to serving as a strategic link between the government, industry, technology providers and the NCII community,' he said.

He said a CISO also plays a crucial role in shaping and driving a security-first culture within an organisation by promoting continuous training and certification, while ensuring that all systems and technologies in use adhere to established security standards.

The C-CISO programme covers five main domains: governance, security audit, data protection, operations management and strategic planning. Gobind described the programme as a long-term investment in the development of the country's digital leadership.

'If in the past the strength of a country was measured through its military, today it depends on the security of and trust in digital systems. Digital defence is the main pillar of the country's prosperity and stability,' he said.

APD: 11 children victims of child exploitation investigation in Austin, man arrested in case

Yahoo · General · 6 days ago

AUSTIN (KXAN) — Eleven children were victims of a child exploitation investigation in Austin, and a 19-year-old man was arrested in the case, Austin Police Department Child Exploitation Unit Sgt. Russell Weirich said during a media briefing Thursday.

Jack Bullington was charged in 10 of the cases after APD said it received those reports in September 2024. He was accused of posting 'explicit images' of the juveniles on social media; photos of the victims were 'cropped' onto images of 'nude bodies' that were then altered by artificial intelligence (AI), Weirich said.

Police received three cyber tips generated by the National Center for Missing and Exploited Children (NCMEC). Afterward, police obtained a search warrant for the social media account associated with Bullington.

The NCMEC is 'a nonprofit organization created in 1984 by child advocates with the purpose to help find missing children, fight child sexual exploitation and prevent child victimization,' said Jennifer Newman, executive director at the NCMEC. 'Every day NCMEC receives a constant flow of horrific child sexual abuse and exploitative material into the cyber tip line. Once received, we review and assess that information, add value to the report and refer it out to the appropriate law enforcement agency,' Newman said during the briefing.

The investigation determined that Bullington 'attained photographs of the victims from a variety of social media sites and platforms when the victims were younger than 18 years of age.' He would then share those images on social media with 'another individual located overseas,' Weirich said.

According to NCMEC, the nonprofit saw a 1,300% increase in cyber tip line reports involving generative AI technology, from 4,700 reports in 2023 to more than 67,000 in 2024.

'We know it's scary, and we want you to know you're not alone,' Newman said. She added that NCMEC has free resources to take down 'nude or sexually exploitative imagery that may be online.'

This comes after the U.S. Senate passed the TAKE IT DOWN Act in February, which criminalizes the publication of non-consensual intimate imagery (NCII). The act, also called S.4569, was introduced by Senators Ted Cruz, R-Texas, and Amy Klobuchar, D-Minnesota.

'In terms of prevention and talking to your kids, that communication is such a key part of this. And really opening that door and having these discussions early and often with your kids is really the biggest tool in the toolbox when we talk about online child sexual exploitation. Also making sure that your kids know that you're a safe space to come to, that you're not going to respond with anger or, you know, be overly upset, that they are turning to you because they're upset,' Newman said.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

Trump hails cooperative effort at anti-revenge porn bill signing: 'Bipartisanship is still possible'

Yahoo · Politics · 19-05-2025

President Donald Trump said the cooperation he witnessed to get the Take It Down Act into law was one of the greatest moments of bipartisanship he has seen. The president signed the bill, which punishes internet abuse involving nonconsensual, explicit imagery, during an outdoor ceremony in the White House Rose Garden Monday afternoon, joined by first lady Melania Trump, who has been championing the issue since her husband's inauguration.

"This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused through non-consensual intimate imagery, or NCII," the first lady said from the Rose Garden Monday afternoon.

"Artificial intelligence and social media are digital candy for the next generation," she added. "Sweet, addictive and engineered to have an impact on the cognitive development of our children. But unlike sugar, these new technologies can be weaponized to shape beliefs, and sadly affect emotions and even be deadly."

Trump reiterated the importance of the new legislation during the signing ceremony Monday afternoon from the White House. He also touted "a level of bipartisanship" he's never seen before to get the legislation across the finish line, citing the work of the first lady as a big catalyst.

"We've shown that bipartisanship is possible," Trump said shortly before he signed the new act. "I mean, it's the first time I've seen such a level of bipartisanship, but it's a beautiful thing to do. I'm not even sure you realize, honey, you know, a lot of the Democrats and Republicans don't get along so well. You've made them get along, and she didn't even know about that. She didn't know we had a problem. She didn't know we had a problem. She got it."

The Take It Down Act is a bill introduced in the Senate by Sens. Ted Cruz, R-Texas, and Amy Klobuchar, D-Minn., that would make it a federal crime to publish, or threaten to publish, nonconsensual intimate imagery, including "digital forgeries" crafted by artificial intelligence. The bill unanimously passed the Senate in February, and passed in the House of Representatives in April with a vote of 409-2.

The law would require penalties of up to three years in prison for sharing nonconsensual intimate images, authentic or AI-generated, involving minors, and two years in prison for those images involving adults. It also would require penalties of up to two and a half years in prison for threat offenses involving minors, and one and a half years in prison for threats involving adults. The bill requires social media companies, like Snapchat, TikTok, Instagram and similar platforms, to put procedures in place to remove such content within 48 hours of notice from the victim.

AI-generated images known as "deepfakes" often involve editing videos or photos of people to make them look like someone else by using artificial intelligence. Deepfakes hit the public's radar in 2017 after a Reddit user posted realistic-looking pornography of celebrities to the platform, opening the floodgates to users employing AI to make images look more convincing and widely shared in the following years. Right now, nearly every U.S. state has a law protecting people from nonconsensual intimate image violations, but the laws vary in classification of crime and penalty.

In March, the first lady spoke on Capitol Hill for the first time since returning to the White House to participate in a roundtable with lawmakers and victims of revenge porn and AI-generated deepfakes. The first lady invited 15-year-old Elliston Berry, whose high school peers used AI to create nonconsensual imagery of her and spread it across social media.

"It's heartbreaking to witness young teens, especially girls, grappling with the overwhelming challenges posed by malicious online content, like deepfakes," Trump said. "This toxic environment can be severely damaging. We must prioritize their well-being by equipping them with the support and tools necessary to navigate this hostile digital landscape. Every young person deserves a safe online space to express themselves freely, without the looming threat of exploitation or harm."

Berry, a Texas native, told the roundtable she was just 14 years old when she realized in 2023 that "a past Instagram photo with a nude body and my face attached made from AI" was circulating on social media. "Fear, shock and disgust were just some of the many emotions I felt," Berry said. "I felt responsible and began to blame myself and was ashamed to tell my parents, despite doing nothing wrong. As I attended school, I was scared of the reactions of someone or someone could recreate these photos."

"We need to hold big tech accountable to take action," the young woman continued. "I came here today to not only promote this bill, but to fight for the freedom of so many survivors, millions of people, male, female, teenage children, kids all are affected by the rise of this image-based sexual abuse. This is unacceptable. The Take It Down Act will give a voice to the victims and provide justice."

Another young girl, Francesca Mani of New Jersey, recounted that she also was just 14 when she and other peers found deepfake images of themselves online. "Teenagers might not know all the laws, but they do know when something is wrong," Mani said. "Schools need to take immediate, serious action to ensure that AI exploitation, harassment and deepfake abuse are met with real consequences."

The first lady invited the young women as her special guests for Trump's first address to a joint session of Congress in March. Sharing nonconsensual and AI-generated explicit images on social media and the internet has not just affected young girls, as young boys and adults also face similar crimes. A woman named Breeze Liu told the roundtable that she worked tirelessly to remove AI-generated images of herself that landed on a pornography site in 2020, when she was 24 years old.

Republican South Carolina state Rep. Brandon Guffey also joined the group of lawmakers and the first lady in March, recounting how his 17-year-old son committed suicide in 2022 after he was caught up in a sextortion scam. "I lost my oldest son, Gavin Guffey, to suicide," he shared. "We quickly found out that he was being extorted online. That someone pretending to be a young female at another college requested images to be shared back and forth. And as soon as he shared those images, he took his life. It was an hour and 40 minutes from the time that he was contacted until the time that he took his life."

Meanwhile, during the first Trump administration, Melania Trump hosted virtual roundtables on foster care as part of her "Be Best" initiative and focused on strengthening the child welfare system. The "Be Best" initiative also focused on online safety. "As first lady, my commitment to the 'Be Best' initiative underscores the importance of online safety," she said. "In an era where digital interactions are integral to daily life, it is imperative that we safeguard children from mean-spirited and hurtful online behavior."

The first lady, in March, said the bill "represents a powerful step toward justice, healing and unity."

How the Take It Down Act signed by Trump works

The National · Politics · 19-05-2025

Victims of non-consensual, intimate online images, including deepfake AI-generated content, will soon have new legal options to have the content removed after US President Donald Trump signed what has become known as the Take It Down Act.

First lady Melania Trump, who has been a major proponent of the legislation, gave a speech just before Mr Trump signed the bill into law. 'Over the past few months I have met with brave survivors, deeply loving families and advocates who know first hand the emotional and psychological toll of NCII and deepfake abuse,' she said at the bill-signing ceremony outside the White House. 'Many thanks to both parties for passing this legislation.'

According to the US Senate Committee on Commerce, Science and Transportation, the Take It Down Act criminalises the publication of non-consensual intimate imagery, often referred to as 'revenge porn'. The law requires that social media sites or other content-hosting websites, along with service providers, 'remove such content within 48 hours of notice from victims'. Just before signing the bill into law, Mr Trump said it increases penalties and introduces civil liabilities for online platforms that do not act to take such content down. The act also includes provisions related to content generated with artificial intelligence tools.

According to The 19th, a non-profit newsroom focused on gender, politics and policy, internet platforms will have approximately one year to establish a process by which users can report the non-consensual content.

Though Take It Down passed almost unanimously in the US House of Representatives and the Senate, the act is not without critics. The Electronic Frontier Foundation (EFF), a non-profit group promoting civil liberties in the tech world, has voiced frequent concerns. 'Good intentions don't make good laws,' the EFF said in a news release when the act was first introduced in January. It said the legislation's 48-hour deadline would put too much burden on smaller websites and service providers, making it more likely that they would comply quickly, rather than accurately, to avoid litigation. 'Instead, services will rely on automated filters – infamously blunt tools that frequently flag legal content, from fair-use commentary to news reporting,' the EFF said. 'Take It Down is the wrong approach to helping people whose intimate images are shared without their consent. We can help victims of online harassment without embracing a new regime of online censorship.'

The commerce committee, however, insists that the act is narrowly tailored to uphold the First Amendment and, in turn, prevent an effect on 'lawful speech'. According to the committee, the Take It Down Act also has the support of more than 120 organisations and companies, including Meta, Snap, Google, Microsoft, TikTok and X. Linda Yaccarino, chief executive of X, attended the bill-signing ceremony.

In February, as Take It Down legislation was gaining momentum, the EFF continued to oppose the bill, pointing out that victims of non-consensual intimate imagery already had legal options. 'In addition to 48 states that have specific laws criminalising the distribution of non-consensual pornography, there are defamation, harassment and extortion statutes that can all be wielded against people abusing NCII,' it said. 'Congress should focus on enforcing and improving these existing protections.'

The Take It Down Act is not the first law aimed at protecting reputations from being unfairly compromised. In 2014, the European Union enacted what has become known as a 'right to be forgotten' policy, which makes it easier for people to request deletion of certain private data collected by digital entities. Much like the Take It Down Act, however, the 'right to be forgotten' is not without critics, and has been subject to legal challenges in parts of the world.

Trump signs the Take It Down Act into law

The Verge · Politics · 19-05-2025

President Donald Trump signed the Take It Down Act into law, enacting a bill that will criminalize the distribution of nonconsensual intimate images (NCII), including AI deepfakes, and require social media platforms to promptly remove them when notified. The bill sailed through both chambers of Congress with several tech companies, parent and youth advocates, and first lady Melania Trump championing the issue. But critics, including a group that has made it its mission to combat the distribution of such images, warn that its approach could backfire and harm the very survivors it seeks to protect.

The law makes publishing NCII, whether real or AI-generated, criminally punishable by up to three years in prison, plus fines. It also requires social media platforms to have processes to remove NCII within 48 hours of being notified and 'make reasonable efforts' to remove any copies. The Federal Trade Commission is tasked with enforcing the law, and companies have a year to comply.

Under any other administration, the Take It Down Act would likely see much of the pushback it does today from groups like the Electronic Frontier Foundation (EFF) and Center for Democracy and Technology (CDT), which warn the takedown provision could be used to remove or chill a wider array of content than intended, as well as threaten privacy-protecting technologies like encryption, since services that use it would have no way of seeing (or removing) the messages between users. But actions by the Trump administration in his first 100 days in office, including breaching Supreme Court precedent by firing the two Democratic minority commissioners at the FTC, have added another layer of fear for some of the law's critics, who worry it could be used to threaten or stifle political opponents. Trump, after all, said during an address to Congress this year that once he signed the bill, 'I'm going to use that bill for myself, too, if you don't mind, because nobody gets treated worse than I do online. Nobody.'

The Cyber Civil Rights Initiative (CCRI), which advocates for legislation combating image-based abuse, has long pushed for the criminalization of nonconsensual distribution of intimate images (NDII). But the CCRI said it could not support the Take It Down Act because it may ultimately provide survivors with 'false hope.' On Bluesky, CCRI President Mary Anne Franks called the takedown provision a 'poison pill … that will likely end up hurting victims more than it helps.'

'Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII,' they wrote. 'Platforms attempting to identify authentic complaints may encounter a sea of false reports that could overwhelm their efforts and jeopardize their ability to operate at all.'

In an interview with The Verge, Franks expressed concern that it could be 'hard for people to parse' the takedown provision. 'This is going to be a year-long process,' she said. 'I think that as soon as that process has happened, you'll then be seeing the FTC being very selective in how they treat supposed non-compliance with the statute. It's not going to be about putting the power in the hands of depicted individuals to actually get their content removed.'

Trump, during his signing ceremony, dismissively referenced criticism of the bill. 'People talked about all sorts of First Amendment, Second Amendment… they talked about any amendment they could make up, and we got it through,' he said.

Legal challenges to the most problematic parts may not come immediately, however, according to Becca Branum, deputy director of CDT's Free Expression Project. 'It's so ambiguously drafted that I think it'll be hard for a court to parse when it will be enforced unconstitutionally' before platforms have to implement it, Branum said. Eventually, users could sue if they have lawful content removed from platforms, and companies could ask a court to overturn the law if the FTC investigates or penalizes them for breaking it; it just depends on how quickly enforcement ramps up.
