Latest news with #ContentForum


New Straits Times
28-07-2025
- Business
Content Forum becomes first Malaysian partner in Google's flagger programme
KUALA LUMPUR: Google has partnered with the Communications and Multimedia Content Forum of Malaysia (Content Forum) to strengthen online safety through its global Priority Flagger programme.

The move makes the Content Forum the first Malaysian organisation to join the initiative, which allows select partners to identify and report harmful content directly to Google and YouTube via dedicated review channels.

Operating under the purview of the Malaysian Communications and Multimedia Commission (MCMC), the Content Forum will now assist in flagging content that potentially violates platform policies, with consideration for local cultural contexts.

Google Malaysia country director Farhan Qureshi said the collaboration reflects the importance of tapping into local knowledge to create a safer digital environment.

"By working with organisations like the Content Forum, we are adding a crucial layer of local expertise, which deepens our ability to respond to harmful content with relevance and precision," he said.

The Priority Flagger programme enables trusted local agencies and non-governmental organisations (NGOs) to alert Google about problematic material across platforms such as Search, Maps, Play, Gmail, and YouTube. These reports receive priority review due to the flaggers' industry expertise.

As a Priority Flagger, Content Forum will also participate in policy discussions and feedback sessions with Google, helping shape platform governance.

Content Forum chief executive officer Mediha Mahmood said the onboarding marked a meaningful advancement in the country's approach to content regulation.

"It allows us to move beyond dialogue into action, ensuring that harmful content is flagged and reviewed with the urgency it deserves.

"This collaboration reflects our continued role in setting industry standards, empowering communities, and contributing to a safer digital ecosystem through collective responsibility."

Content Forum is a self-regulatory industry body designated under the Communications and Multimedia Act 1998. It represents stakeholders ranging from broadcasters and advertisers to content creators, internet service providers, and civic groups.


The Sun
25-07-2025
Accountability matters in age of influence
AMID growing concern over attention-seeking stunts and misleading digital content, the Communications and Multimedia Content Forum of Malaysia (Content Forum) is calling for greater accountability from influencers and content creators across platforms.

Influence does not just attract followers; it can activate real emotions, real reactions and, sometimes, real consequences. While many create to entertain or inform, others may use their platforms to provoke, manipulate or even weaponise their audience. Misleading narratives, staged scenarios and undisclosed promotions can lead to confusion and distress, or trigger public reactions far beyond the screen. In some cases, influence is used not just to gain attention, but to attack, deceive or deflect accountability. When content crosses into that territory, the damage is no longer merely digital; it becomes real.

Understanding the line between content and conduct

While storytelling is a vital part of digital creativity, creators must distinguish between entertainment and manipulation. When content mimics crisis, danger or trauma – without context or disclosure – audiences are drawn into a version of reality that may not exist.

Accountability does not end online – the law still applies

Examples from around the world have shown that digital stunts – whether faking emergencies, impersonating officials or creating dangerous public scenes – can and do result in prosecution. In Malaysia, acts that mislead or alarm the public may fall under laws addressing public mischief, misuse of communication networks or false reporting.

Integrity is the real currency of influence, not controversy

The Content Forum is an industry forum registered under the Malaysian Communications and Multimedia Commission (MCMC) and designated under the Communications and Multimedia Act 1998 to oversee and promote self-regulation of content over the electronic networked medium. It consists of key players in the content industry, such as advertisers, advertising agencies, broadcasters, content creators/distributors, audiotext hosting service providers, internet service providers and civic groups.

As Malaysia's self-regulatory body under the Communications and Multimedia Act, the Content Forum represents a broad spectrum of stakeholders – from platforms and broadcasters to advertisers, creatives and civil society. Members agree that the long-term health of the content ecosystem depends on creators who understand the difference between attention and integrity.

What the public can do

Viewers are encouraged to engage critically with what they see online:
• Pause before sharing: Ask yourself if the content is factual, exaggerated or harmful.
• Question motives: Is this post informing or just provoking a reaction?
• Don't reward dishonesty: Avoid boosting content that plays on fear or falsehood.
• Report responsibly: Use platform tools to flag content that deceives or endangers.
• Expect better: Hold creators to higher standards – for both creativity and credibility.

The Content Forum has joined the Priority Flagger programme across both Google and YouTube to reinforce efforts to create a safer digital environment in Malaysia. The programme was introduced as a way for participating local government agencies and non-governmental organisations to flag potentially harmful or problematic content on certain Google products and services. Due to their specialised industry knowledge across a variety of subject matters, these organisations have a higher degree of accuracy when flagging violative content.

Operating under the purview of the MCMC, the Content Forum serves as a self-regulatory industry body promoting responsible content practices across electronic networked media. As part of the Priority Flagger programme, it will extend its expertise to help identify potentially policy-violating content across YouTube and Google, taking into consideration local cultural contexts. As a participating organisation, it will gain access to a dedicated intake channel to inform Google of potential policy violations, which will be prioritised for review, and will take part in discussions and feedback sessions about Google and YouTube content policies.

Google government affairs and public policy manager for Malaysia and Indonesia Arianne Santoso (left) and Content Forum CEO Mediha Mahmood commemorate the onboarding of the Content Forum into the Google and YouTube Priority Flagger programmes.


New Straits Times
16-07-2025
- Business
Content Forum welcomes establishment of Malaysian Media Council
KUALA LUMPUR: The Communications and Multimedia Content Forum of Malaysia (Content Forum) has welcomed the establishment of the Malaysian Media Council (MMC).

Congratulating the newly appointed founding board members, the Content Forum described the council's formation as a significant milestone in Malaysia's media landscape.

It said the MMC's commitment to ethical journalism, media independence, and inclusive representation marks an important step towards strengthening public trust in the media.

"As an established industry-led self-regulatory organisation under the Communications and Multimedia Act 1998, the Content Forum shares the MMC's aspirations for a content ecosystem built on transparency and accountability.

"With over two decades of experience in implementing the Malaysian Communications and Multimedia Content Code and facilitating content-related complaints across broadcasting, online, and emerging platforms, the Content Forum understands the power and responsibility that comes with credible self-regulation," it said in a statement today.

The Content Forum also commended the ministry for its foresight and leadership in championing a collaborative, industry-driven model of media governance.

It urged its members and industry partners to support the MMC's growth by participating in its membership drive.

"A vibrant and united media sector can only flourish when all stakeholders contribute to shaping its direction," the Content Forum said.

Meanwhile, Content Forum chairman Rafiq Razali said the council's creation represents an important moment for the country's media ecosystem and a chance to raise the bar for credible and independent journalism.

He said the council's role would meaningfully complement the Content Forum's by elevating standards, promoting accountability, and safeguarding the public interest.

"We are proud to support this new chapter and look forward to working in tandem with the MMC to shape a healthier, more responsible media landscape," he said.


New Straits Times
09-07-2025
- Politics
The silence of 'digital bystanders' amplifies hate online
IN my work at the intersection of content regulation and media ethics, I've come to believe that the most dangerous element in an online space is not always the person posting hateful content. It's the silence that surrounds it.

To the algorithms curating our online experiences, silence on harmful content suggests engagement or approval. No reports, no objections, no friction? It must be fine. And to people within those digital communities, silence can feel like social permission. If no one's saying anything, maybe it isn't that bad. Maybe it's even true.

This is how hate gets normalised — not with a bang, but with a shrug.

The phenomenon is the digital version of the bystander effect: the more people who witness harm, the less likely any one person is to intervene. Online, that passivity is multiplied and masked by anonymity.

Research shows that in extremist chat groups, more than 80 per cent of users never post anything themselves. They don't initiate hate, but they're there — watching, clicking, sharing. In doing so, they help keep those ecosystems alive.

In Malaysia, research by the Southeast Asia Regional Centre for Counter-Terrorism has found that radical content spreads fastest not on public platforms, but in private or semi-private spaces — unmoderated Telegram groups, closed chat rooms and fringe forums. These aren't necessarily echo chambers of hardened extremists. They're often filled with regular people: friends, colleagues and acquaintances who may disagree with what they see, but stay silent nonetheless.

It's no surprise, then, that many countries have turned to deplatforming as a response. Takedowns, content moderation, algorithm tweaks — essential tools in any regulatory arsenal. But they're not silver bullets. And, without thoughtful communication, they can backfire.

Remove extremist content too swiftly, and its creators are recast as martyrs. Ban a channel without explanation, and you leave behind a vacuum that quickly fills with conspiracy theories. Kick bad actors off a platform and they don't disappear — they just migrate to more opaque, harder-to-monitor spaces. Simply removing content doesn't remove its influence.

This is where counterspeech becomes not just useful, but necessary. Counterspeech means responding to harmful content with facts, empathy, questions or alternative narratives. It works best when it's fast, authentic and comes from peers, not just authorities. The idea is not to out-shout hate, but to interrupt it — early, calmly and effectively.

And there's data to back it up. In Sri Lanka, a PeaceTech Lab pilot found that engagement with hate content dropped by 46 per cent when counterspeech was introduced early. In Germany, civic volunteers who replied to hateful YouTube comments within the first hour helped reduce hate-driven threads by 17 per cent. These aren't massive interventions. They're small, consistent disruptions, and they matter.

At the Content Forum, we're building on this idea in many of our initiatives — from suicide content guidelines to training with influencers, and in ongoing efforts to educate children and parents about digital friction and media literacy. The goal is to give more people the tools and confidence to speak up before harm escalates.

Still, the question remains: why don't more people do it? In every training I run, I hear the same three reasons: "I don't want to be attacked", "It's not my place", and "It won't make a difference".

The truth is, counterspeech doesn't require you to win the argument. You don't need to craft the perfect reply or go viral with your response. You just need to say something. Even a simple "Are we sure this is okay?" is enough to interrupt the flow. It breaks the momentum. It breaches the echo chamber, and often, that's all it takes.

Whether we realise it or not, silence online isn't just abstinence — it's influence. It tells the algorithm: this is fine. It tells the community: no one minds. It tells the extremist: no one will stop you.

But that doesn't have to be the message we keep sending. If harmful beliefs thrive in silence, then perhaps disruption begins not with noise or outrage, but with clarity, courage and consistency. We don't need everyone to be loud; social media is loud enough as it is. We just need more people to stop being quiet.


The Star
03-06-2025
- Business
Social media platforms urged to join Content Forum to ensure accountability of content, says Fahmi
PETALING JAYA: All social media platforms should be part of the Content Forum to ensure accountability for their impact on society, says Datuk Fahmi Fadzil.

The Communications Minister said that social media providers should be responsible for the content shared on their platforms after raking in billions of ringgit in advertising earnings.

"We must make sure these platforms join Content Forum, so that they need to be aware of the impact they have on Malaysian society.

"They cannot profit from our loss," he said when launching the Suicide Content Guidelines at Menara Star here on Tuesday (June 3).

He added that such guidelines were important alongside participation in initiatives by the Content Forum.

Fahmi also said he was hopeful that the establishment of the Malaysian Media Council would further provide media organisations with a platform to comprehensively discuss and address various issues, including online harms.

At a press conference later, he also told reporters that the platforms should not "wash their hands" of content that could fuel online harms, including suicide and cyberbullying.

He said that currently only TikTok has become a member of the forum, and urged Meta to follow suit.

"It's actually in their interest that they participate so that they can help to develop certain best practices, codes of conduct," he added.

Fahmi said the problem of social media platforms not collaborating with the authorities was also experienced by other countries in the Asia-Pacific region and Asean, despite various laws being in place.

"Many of these social media platforms feel that they are bigger than the laws of the countries in the Asia-Pacific region.

"So it's not just a matter of laws but the attitude of these platforms," he added.

The guidelines are an industry-led, national framework that sets standards for how suicide-related topics are covered and shared on both traditional and digital platforms.

Also present at the launch were Content Forum chairman Rafiq Razali, Star Media Group chairman Tan Sri Wong Foon Meng, and Star Media Group chief executive officer Chan Seng Fatt, among others.