
Fake political content spikes in Canada ahead of federal election
More than one in four Canadians has been exposed to fake political content on social media in the lead-up to the April 28 federal election, according to new research that warns of a sharp rise in online disinformation and fraud.
A report released Friday by the Media Ecosystem Observatory (MEO) describes a 'dramatic acceleration' in misleading content, ranging from deepfake videos to scam investment ads masquerading as news articles.
Researchers say much of the material is more sophisticated, more polarising, and harder for voters to detect than in previous elections.
The study found a growing number of Facebook ads impersonating trusted news brands to promote fraudulent cryptocurrency schemes.
Many of these ads use fake headlines and doctored videos to trick users into clicking links leading to scam websites.
'This is not simply low-effort misinformation – it's highly produced, visually convincing, and engineered to look like legitimate political coverage,' said Aengus Bridgman, executive director of the MEO. 'We're seeing platforms flooded with content that targets both the political system and the public's trust in media.'
The current election marks Canada's first national vote since Meta, Facebook's parent company, blocked Canadian news content across its platforms in response to the Online News Act (Bill C-18), which requires tech giants to compensate publishers for news content shared online.
Despite the ban, more than half of Canadians still report receiving political information via Facebook, according to the research.
'Users often don't realise they're not consuming verified news,' Bridgman said. 'They might follow political memes, cultural commentary pages, or candidate posts and leave feeling informed – but that's not the same as receiving fact-checked reporting.'
The report argues that the absence of credible news has created an opening for lower-quality, polarising, and fraudulent content to take hold.
Among the most concerning trends, the report identified a series of deepfake videos falsely depicting Prime Minister Mark Carney endorsing a cryptocurrency investment programme. The clips, styled to mimic CBC or CTV news segments, contain fabricated interviews and false claims about new government policies.
One widely circulated fake headline read: 'Mark Carney announces controversial retaliatory tariff plan in response to Trump's devastating tariff hikes this week'. The link led users to a scam site asking for personal financial information.
Between 4 and 9 April, a Facebook page named Money Mindset purchased five French-language ads featuring a deepfake of Carney. The ads ran for only a few hours but reportedly received up to 10,000 impressions and cost around C$1,000.
'These imposter ads and fake videos undermine the credibility of both the political leaders and the news organisations being mimicked,' the report stated.
Canada's federal task force on Security and Intelligence Threats to Elections (Site) confirmed that foreign interference remains a concern, particularly from China, Russia, and Iran. Last week, Site revealed an operation linked to China on the Chinese-language platform WeChat, though it concluded the activity did not have material influence.
The MEO report, however, finds that most of the disinformation it tracked originates from domestic sources focused on financial scams rather than electoral manipulation.
'These scams aren't necessarily designed to change votes,' Bridgman said. 'But they do erode public trust and further confuse the information environment at a critical time.'
While Meta says such ads violate its policies and encourages users to report scams, researchers argue enforcement remains inconsistent. Many ads evade detection by not identifying themselves as political, which keeps them out of Meta's public ad library.
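(For illustration only: the public archive Meta maintains can be queried through its documented Ad Library API, but it surfaces only ads that buyers declared as political or issue advertising, which is the gap researchers describe. A minimal sketch of such a query follows; the access token and search term are placeholders, and field names follow Meta's public documentation and may change.)

import requests

# Minimal sketch: query Meta's Ad Library API for declared political/issue ads
# reaching Canada that mention a given term. Undeclared scam ads, like those
# described in the MEO report, would not appear in these results.
ACCESS_TOKEN = "YOUR_AD_LIBRARY_TOKEN"  # placeholder; requires Ad Library API access
URL = "https://graph.facebook.com/v19.0/ads_archive"

params = {
    "access_token": ACCESS_TOKEN,
    "ad_type": "POLITICAL_AND_ISSUE_ADS",  # only ads declared political/issue
    "ad_reached_countries": '["CA"]',
    "search_terms": "Mark Carney",         # placeholder search term
    "fields": "page_name,ad_delivery_start_time,impressions,spend,ad_snapshot_url",
    "limit": 25,
}

resp = requests.get(URL, params=params, timeout=30)
resp.raise_for_status()
for ad in resp.json().get("data", []):
    print(ad.get("page_name"), ad.get("ad_delivery_start_time"), ad.get("ad_snapshot_url"))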
'This is the kind of content that would never pass broadcasting standards on TV,' said Bridgman. 'And yet Facebook serves these fake Carney ads to thousands of users across the country in the middle of a federal election. It feels dystopian.'
Meta said it continues to invest in technology and enforcement tools to stop scams and impersonations, calling it an 'ongoing industry-wide challenge'.
But researchers say more stringent oversight is needed, especially in the absence of reliable news content on major platforms.
'We've effectively handed the information space over to unregulated actors,' Bridgman said. 'And it's the public who pays the price.'