
Analysis: UK politics blunts antitrust action against Google
LONDON: Britain's competition regulator has finally come up with a plan to control Google's huge search business, but a shift in the political wind in favour of big tech and the money it invests makes it more bark than bite.
The Competition and Markets Authority spent years setting up a regime to intervene in the operations of tech giants such as Google, Apple and Amazon, saying it needed special expertise and powers to drive competition in the digital economy.
But just as it received those new powers, Britain's Labour government said its push to grow the economy meant tough regulation was now out of favour.
The CMA, chaired by a former Amazon executive, has touted a targeted approach as the way to meet its goal of reining in big tech without throttling investment from an industry that has spent tens of billions of pounds in Britain.
On Tuesday, it proposed designating Google as having "strategic market status" in search, a label that would give the regulator the power to impose conditions on the U.S. tech firm, such as changing the way it ranks search results or offering users more choice.
Competition experts said the designation was no surprise, coming long after similar moves in the United States and the European Union.
"Everyone has been at the search rodeo for years, there are EC (European Commission) decisions, U.S. judgements," Cristina Caffarra, a competition economist, said. "What the CMA is doing is purely performative."
Nonetheless, the CMA's first designation is being closely watched by tech groups, lawyers, and business owners to see how it operates in the new political climate.
In announcing the proposals, CMA Chief Executive Sarah Cardell, mindful of the political pressure, was careful to stress the regulator's "targeted and proportionate actions" in a sector innovating at breakneck speed through artificial intelligence.
Lawyer Ronan Scanlan, a partner at Steptoe International and former deputy director at the CMA, said Britain's Digital Markets, Competition and Consumers Act gave the CMA broad powers, but in practice it didn't have the political capital to make grand interventions.
"The DMCC Act, which was billed as this revolutionary new tool that the CMA could wield, has arrived three years too late and is becoming a bit of an albatross around its neck," he said.
"It's up against huge players like Google, Apple, Amazon, with a lot of political connections, and now - in a new political reality - somehow has to try to extricate itself with the minimum amount of damage."
The CMA's delicate balancing act is made harder by U.S. President Donald Trump's muscular defence of U.S. business interests, and Scanlan said the regulator would want to see what happens to Google in the United States.
TOUGHER PROPOSALS
Some of the measures the CMA is proposing, such as choice screens for consumers to easily opt for alternative search engines, have been around for decades.
Others, such as changing the ranking of results to limit Google favouring its own services, could have more impact if they are confirmed in the CMA's final decision in October.
Tom Smith, a competition lawyer at Geradin Partners and a former CMA legal director, said there was a question mark over political support for some of the regulator's tougher proposals, but thought it was trying to stick to its guns.
"Given the new context, it's still implementing the regime properly," he said, adding that the U.S. Department of Justice had proposed measures that could lead to a breakup of Google, particularly in its search and advertising businesses.
"The idea that the CMA is going too far by putting in a choice screen, it's quite ludicrous."
Despite that, Alphabet-owned Google warned it may not bring new features and services to Britain if the regulator goes ahead with the proposals, and said "proportionate, evidence-based regulation" was needed if Britain was to grow its economy.
Google, which employs around 7,000 people in Britain, accounts for more than 90 per cent of all general search queries in the country, with more than 200,000 businesses relying on its search advertising to reach their customers.
But according to submissions to the CMA from the likes of flights and hotel website Skyscanner and the recommendation platform Checkatrade, that dominance may have enabled it to favour its own services over their offerings, and they want regulatory intervention.
Silicon Valley has been wary of the CMA since 2023, when it blocked Microsoft's $69 billion acquisition of "Call of Duty" maker Activision Blizzard. Having sparked fury from the U.S. companies, it then tore up its own rule book to approve the deal after Microsoft made some changes.
Its second investigation under its new powers is examining mobile operating systems, targeting Google and Apple.
Previous CMA investigations had pointed to Amazon as the likely subject of the third strategic market status investigation, which was due to be announced this summer. On Tuesday, however, the CMA pushed that case back to next year.