
Latest news with #CyberTipline

Malaysia's fight against CSAM: Why shared responsibility is key

Borneo Post

6 days ago

KUALA LUMPUR (July 20): The internet has transformed how children learn, play, and connect, but it has also opened the door to new and deeply disturbing forms of harm. Among the most serious of these is the growing circulation of Child Sexual Abuse Material (CSAM), which represents a profound violation of a child's safety, dignity, and rights. In Malaysia and around the world, alarming spikes in online child sexual exploitation and abuse have made it clear that this is no longer a hidden problem. It has become a public crisis that requires urgent, collective action.

The Kempen Internet Selamat (KIS) plays a pivotal role in encouraging public involvement and legal reform in the fight against CSAM. The campaign emphasises that online safety requires ordinary citizens to understand the signs, know how to report abuse, and demand accountability from platforms and policymakers. Educating the public is a vital first step in dismantling the silence that allows online child exploitation to persist.

CSAM is a severe violation of a child's dignity and safety. In 2023 alone, there were 32 million reports of CSAM worldwide, with over 90 percent of the images being self-generated, often through coercion, manipulation, or blackmail. Alarmingly, cases involving very young children, even those between the ages of 3 and 6, have been on the rise. According to the National Center for Missing & Exploited Children (NCMEC), Malaysia recorded 197,659 reports of suspected CSAM through its CyberTipline in 2024. That same year, the Internet Watch Foundation Malaysia reported 8,600 actionable cases.

One of the reasons this crisis persists is the silence that surrounds it. Many adults hesitate to report CSAM when they encounter it, unsure whether they are allowed to, afraid of the stigma, or worried about making things worse for the child involved. This silence from surrounding adults enables the harm to continue and further traumatises the victims. As Sarawak Women for Women Society (SWWS) member Gill Raja aptly puts it, 'If we don't take appropriate action, we are complicit.' Inaction allows CSAM to circulate, and that can lead to further exploitation, blackmail, and psychological trauma for the victim.

Reporting CSAM is the first step towards taking it down. Yet many people, including victims, fear that reporting could draw more attention or lead to further harm. 'This is why trusted, child-friendly channels to report are so important. Accessible, confidential reporting options must be widely known and easy to use,' Gill emphasises.

Gill also warns of the risks of turning a blind eye: 'The child could continue to be exploited and abused to create more material if those doing this remain in contact with them or have passed on their details to others.'

She reminds us that failing to act means becoming part of the problem: 'If we don't take appropriate action, we are complicit in harming them. We need to protect each other's children to make the internet a safe place.'

The good news is that everyone has a role to play in ending this. Shared responsibility is not just a slogan; it is the only viable solution. Parents, teachers, corporations, social media platforms, government agencies, non-profits, and everyday citizens all have unique roles and tools they can mobilise to fight this crisis.
The internet may be vast and borderless, but so is society's capacity to protect children when everyone acts together.

For individuals, reporting CSAM is a critical first step. Safe and confidential channels exist but remain underutilised due to a lack of public awareness. Malaysians can report abuse directly via the Childline Foundation portal, which connects to the Internet Watch Foundation's global takedown system. The Talian Kasih 15999 hotline and Cyber999 portal also offer accessible, sometimes anonymous, options.

As Gill highlights, 'We need more awareness and easy access so as soon as people see CSAM, they can easily see how to report. This requires a stronger response from social media platforms than we currently have.

'Every report is a crucial step in reducing stress on a child and shows that you care and are standing by them,' she says.

At the community level, adults and caretakers must normalise discussions around online safety. Children need to be taught, in age-appropriate and culturally relevant ways, how to protect themselves, recognise risks, and seek help. Equally, the adults in their lives, including parents, teachers, and guardians, must have the knowledge and tools to respond appropriately when abuse is disclosed.

Gill notes that current efforts fall short: 'We need to reach all children in age, language, and culturally appropriate ways that effectively engage them, plus informing the adults in their lives too.' She stresses the need for training not only for children but also for adults, who must understand 'how they may inadvertently put their children at risk by sharing photos online or how young people are themselves being sucked into viewing and sometimes producing CSAM.'

Media and tech platforms also bear tremendous responsibility. Safety-by-design should no longer be optional. Platforms must be required to proactively screen, detect, and remove CSAM. They must offer easy-to-use reporting tools that children and adults can find without difficulty. While some platforms are making progress, others have scaled back moderation just as AI-generated CSAM is on the rise. As Gill observes, 'Some major platforms have recently cut back on their vetting processes just as we are seeing a surge of material being produced, including using AI. This is unacceptable.'

Laws and policies must also evolve rapidly. While Malaysia has the Sexual Offences Against Children Act 2017, there is no legal requirement for platforms or ISPs to take down or report CSAM promptly. Nor are there age-verification or parental consent mechanisms for online access. These loopholes allow predators to exploit vulnerable users and make law enforcement's job more difficult.

That is why advocates are pushing for a harmonised legal framework that outlines the responsibilities of both public institutions and private companies. A designated national lead agency with the resources and authority to coordinate efforts across sectors is essential. This body could ensure consistent reporting mechanisms, facilitate international cooperation, and manage end-to-end victim support systems, including helplines, counselling, and legal assistance.

Access to psychological care, legal aid, and rehabilitation must be expanded to help victims cope before trauma becomes permanent. Services must be inclusive and sensitive to each child's age, gender, ability, and background. A single, toll-free, 24/7 national child helpline staffed by trained professionals could be a lifeline.
Prevention efforts should also include nationwide digital literacy campaigns that teach children and adults about healthy online behaviour, consent, and boundaries.

Ultimately, protecting children online is not the sole responsibility of parents, teachers, or the police. It is a collective duty. 'Today we are part of a huge global internet "village",' says Gill. 'We have to give children a path back. Every time we educate, report, and intervene, we are part of the solution.'

This is not just about fighting abuse. It is about defending every child's right to grow up free from exploitation and fear. Every child deserves that chance, and every adult has the power to make it happen.

Cellebrite DI (CLBT) Expands its Partnership with NCMEC

Yahoo

10-07-2025

On June 25, Cellebrite DI Ltd. (NASDAQ:CLBT) announced an expansion of its partnership with the National Center for Missing and Exploited Children (NCMEC) to help speed up investigations involving crimes against children. A key part of this collaboration is the integration of NCMEC's CyberTipline hash value list into the company's main forensic software, Cellebrite Inseyets. This list contains digital fingerprints, called hashes, of about 10 million files that have been confirmed as child sexual abuse material. The integration allows investigators to instantly match files found on suspects' devices to known CSAM.

The integration is part of Cellebrite's 'Operation Find Them All' initiative. The program was launched in early 2024 and helps agencies use technology to rescue children and catch offenders. Cellebrite is a software company that provides a Digital Intelligence platform designed to support legally sanctioned digital investigations.

While we acknowledge the potential of CLBT as an investment, we believe certain AI stocks offer greater upside potential and carry less downside risk. Disclosure: None. This article was originally published at Insider Monkey.
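The 'digital fingerprints' mentioned above are hashes: short, fixed-length values computed from a file's bytes, so byte-identical copies of a file always yield the same value. The following is a minimal, hypothetical sketch of that idea only; the article does not specify which hash algorithms NCMEC's list uses, and this is not Cellebrite's implementation.

```python
import hashlib


def file_fingerprint(path: str) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Byte-identical copies always produce the same fingerprint, which is what
# makes matching against a list of known hash values possible, e.g.:
# file_fingerprint("copy_a.bin") == file_fingerprint("copy_b.bin")
```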

Indore man arrested for sharing child pornography videos online

Times of India

27-06-2025

Indore: A 60-year-old man from the Khajrana area of Indore was arrested by the State Cyber Cell, Indore, for allegedly downloading and circulating child pornography videos on WhatsApp. The arrest followed a complaint received by India's Cyber Tipline from the United States-based WhatsApp Inc., which flagged the accused for sharing illegal and obscene content involving minors. Authorities are in the process of informing the concerned school and have stated that necessary action will be initiated in this regard.

The accused allegedly downloaded prohibited obscene videos involving minors and circulated them to other users. In an attempt to evade legal consequences, he formatted the mobile phone used to commit the crime.

Indore SP Sabyasachi Saraf said that the cyber cell team took action as part of an ongoing drive to track down offenders involved in previous cyber crimes. He said that based on the complaint, which contained information about a video involving the sexual abuse of a minor girl being downloaded and shared, a case was registered under Section 67B of the Information Technology Act.

"Given the seriousness of the offence, the cyber cell carried out a thorough technical investigation, which led to the identification and arrest of Irshad Ahmed, 60, resident of Khajrana," he said. "During interrogation, the accused admitted to receiving a child pornography video on WhatsApp, which he subsequently downloaded and shared with another individual. The mobile phone used in the crime has been seized. Since the accused attempted to destroy evidence by formatting the phone, an additional section has been added to the case," he said.

Authorities have also issued safety advisories urging the public not to click on suspicious blue-coloured links circulating on social media, to avoid joining WhatsApp or Telegram groups that circulate child pornography or pornographic videos, and to refrain from uploading, downloading or sharing any obscene content on social media platforms, as doing so constitutes a legal offence. Anyone who comes across obscene content involving minors on any social media platform is urged to report it through the cybercrime reporting portal or call the cyber helpline number 1930.

Accelerating Child Exploitation Investigations: Cellebrite Integrates Data from the National Center for Missing and Exploited Children (NCMEC)

Business Upturn

24-06-2025

TYSONS CORNER, Va., June 24, 2025 (GLOBE NEWSWIRE) — Cellebrite (NASDAQ: CLBT), a global leader in premier Digital Investigative solutions for the public and private sectors, today announced the expansion of its relationship with the National Center for Missing and Exploited Children (NCMEC) that will help speed up investigations involving crimes against children. NCMEC's CyberTipline hash value list is now integrated within Cellebrite's flagship digital forensics software, Cellebrite Inseyets, allowing public safety agencies to immediately pinpoint known child sexual abuse material (CSAM) files – speeding up time to evidence and justice for victims and survivors of abuse.

The hash value list contains approximately 10 million files reported by electronic service providers to NCMEC, which have been confirmed to depict apparent CSAM. Instead of spending hours reviewing data to locate CSAM on suspected offenders' devices, this integration allows digital forensic examiners and investigators around the world to match CSAM files instantly. This provides investigators with the evidence needed to arrest and prosecute offenders and, in parallel, limits law enforcement's exposure to the material, which helps protect their mental health.

'This integration represents a critical leap forward in our efforts to protect children and hold offenders accountable,' said John Shehan, Senior Vice President, Exploited Children Division & International Engagement at NCMEC. 'We're proud to strengthen our nine-year partnership with Cellebrite in the fight to end online child exploitation.'

'Any tool that speeds up time to evidence is critical for our teams,' said Ben Morrison, the Washington Internet Crimes Against Children (ICAC) Task Force Commander. 'Digital evidence is the holy grail in ICAC investigations, and this integration means getting to more cases and protecting more kids.'

New Hampshire ICAC Task Force Commander Eric Kinsman adds, 'We are very excited about this integration. When a known CSAM match is made, it adds to the probable cause in an investigation, which greatly increases our chances to arrest an offender, ensuring they are no longer a danger in our community.'

'Our mission is in lockstep with NCMEC, and it's an honor to partner with them and help the heroes working these cases on the front lines,' said David Gee, Cellebrite's chief marketing officer. 'This integration will be a game changer and will undoubtedly save and prevent our most vulnerable from the most heinous crimes.'

This integration, available to Cellebrite Design Partners for early access now and generally available the week of June 30, 2025, is part of Cellebrite's 'Operation Find Them All' (OFTA) initiative. The landmark program is helping public safety agencies use technology to protect children – alongside strategic partners including NCMEC, The Exodus Road and Raven. Since launching in January 2024, OFTA has assisted in numerous investigations that have helped rescue hundreds of victims and resulted in the arrests of dozens of perpetrators. OFTA is playing an important, active, ongoing role in helping to further investigations where NCMEC is assisting public safety agencies in cases involving missing and endangered children.
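As an illustration of the kind of workflow described above, the sketch below loads a list of known hash values into a set, hashes every file extracted from a device, and flags matches by path and digest alone, so an examiner need not open the flagged files. This is a minimal sketch under stated assumptions: the plain-text list format, the use of SHA-256, and the file paths are hypothetical and do not describe Cellebrite Inseyets' internals or the actual format of NCMEC's CyberTipline hash value list.

```python
import hashlib
from pathlib import Path


def load_known_hashes(list_path: str) -> set:
    """Load known hash values; assumed format is one lowercase hex digest per line."""
    with open(list_path, "r", encoding="utf-8") as f:
        return {line.strip().lower() for line in f if line.strip()}


def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hex digest of a file, reading it in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()


def scan_extraction(root: str, known: set) -> list:
    """Hash every file under an extracted device image and return (path, digest)
    pairs whose digest appears in the known-hash set, without rendering any file."""
    matches = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            digest = sha256_of(path)
            if digest in known:
                matches.append((str(path), digest))
    return matches


if __name__ == "__main__":
    # Hypothetical paths, for illustration only.
    known = load_known_hashes("known_hash_list.txt")
    for file_path, digest in scan_extraction("device_extraction/", known):
        print(f"MATCH {digest} -> {file_path}")
```

A production pipeline would also need to handle provider-specific and perceptual hash types as well as chain-of-custody logging, which are beyond the scope of this sketch.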
References to Websites and Social Media Platforms

References to information included on, or accessible through, websites and social media platforms do not constitute incorporation by reference of the information contained at or available through such websites or social media platforms, and you should not consider such information to be part of this press release.

About Cellebrite

Cellebrite's (Nasdaq: CLBT) mission is to enable its global customers to protect and save lives by enhancing digital intelligence and accelerating justice in communities around the world. Cellebrite's AI-powered Case-to-Closure (C2C) platform enables customers to lawfully access, collect, analyze and share digital evidence in legally sanctioned investigations while preserving data privacy. Thousands of public safety organizations, intelligence agencies and businesses rely on the Company's cloud-ready digital forensic and investigative solutions to close cases faster and safeguard communities. To learn more, visit the Cellebrite website and find us on social media @Cellebrite.

About NCMEC

The National Center for Missing & Exploited Children is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization. NCMEC works with families, victims, private industry, law enforcement, and the public to assist with preventing child abductions, recovering missing children, and providing services to deter and combat child sexual exploitation.

Contacts: Media: Victor Cooper, Sr. Director of Global Corporate Communications, [email protected], +1 404.510.2823

Two men from Indore & Bhopal held for sharing child sexual abuse material online

Times of India

19-06-2025

Indore: Acting against online child sexual abuse material (CSAM), the Indore cell of the state cyber police on Thursday arrested two men for sharing obscene videos involving minor girls through social media.

The accused were identified based on information received from Meta Platforms Inc. (Facebook and WhatsApp) through the Cyber Tipline and NCMEC. They were found to be active members of social media groups engaged in distributing child pornography. Acting on this data, two separate cases were registered under Section 67B of the IT Act. The complaints were received via the Ministry of Home Affairs' Cyber Tipline, following which a detailed technical analysis was conducted.

During the investigation of the first case, the cyber team traced the Facebook activity to Santosh Thakur, 38, a resident of Bag Sewania, Bhopal. He was found to be sharing CSAM through Facebook Messenger and also deleted content to destroy evidence. He was arrested, and his mobile phone and SIM card used in the offence were seized.

In the second case, the accused was identified as Kamal Sharma, 35, a resident of Shri Ram Nagar, Indore. He was found to be part of groups that circulated sexually explicit content involving minors on WhatsApp. The investigation revealed that he downloaded and shared child pornographic videos and later deleted them in an attempt to destroy evidence. His mobile device and SIM card were also confiscated.

Both accused admitted to being part of online groups that exchanged objectionable material. As a result, additional sections for evidence destruction have been added to the cases.

Authorities have issued a public advisory urging citizens not to click on suspicious links, especially those marked with a blue icon, and to refrain from joining any group involved in sharing pornographic or child abuse material. Sharing, uploading or downloading such content is a serious criminal offence. Citizens are also advised to secure their social media accounts using two-factor authentication, avoid unknown friend requests, and never share personal information online. Anyone who finds such content is encouraged to file a complaint through the cybercrime reporting portal or call the cyber helpline 1930.
