
How Technology Can Enable an Overdue Rethink of Corporate Lending
Related Articles


The Guardian
2 hours ago
The Guardian view on Britain's AI strategy: the risk is that it is dependency dressed up in digital hype
There was a time when Britain aspired to be a leader in technology. These days, it seems content to be a willing supplicant – handing over its data, infrastructure and public services to US tech giants in exchange for the promise of a few percentage points of efficiency gains. Worryingly, the artificial intelligence strategy of Sir Keir Starmer's government appears long on rhetoric, short on sovereignty and built on techno-utopian assumptions.

Last week Peter Kyle, the technology secretary, was promoting the use of AI-generated discharge letters in the NHS. The tech, he said, will process complex conversations between doctors and patients, slashing paperwork and streamlining services. Ministers say that by applying AI across the public sector, the government can save £45bn.

But step back and a more familiar pattern emerges. As Cecilia Rikap, a researcher at University College London, told the Politics Theory Other podcast, Britain risks becoming a satellite of the US tech industry – a nation whose public infrastructure serves primarily as a testing ground and data source for American AI models hosted on US-owned cloud computing networks. She warned that the UK should not become a site of 'extractivism', in which value – whether in the form of knowledge, labour or electricity – is supplied by Britain but monetised in the US.

It's not just that the UK lacks a domestic cloud ecosystem. It's that the government's strategy does nothing to build one. The concern is that public data, much of it drawn from the NHS and local authorities, will be shovelled into models built and trained abroad. The value captured from that data – whether in the form of model refinement or product development – will accrue not to the British public, but to US shareholders.

Even the promise of job creation appears shaky. Datacentres, the physical backbone of AI, are capital-intensive, energy-hungry, and each one employs only about 50 people. Meanwhile, Daron Acemoglu, the MIT economist and Nobel laureate, offers a still more sobering view: far from ushering in a golden age of labour augmentation, today's AI rollout is geared almost entirely toward labour displacement. Prof Acemoglu sees a fork: AI can empower workers – or replace them. Right now, it is doing the latter. Ministerial pledges of productivity gains may just mean fewer jobs – not better services.

The deeper problem is one of imagination. A government serious about digital sovereignty might build a public cloud, fund open-source AI models and create institutions capable of steering technological development toward social ends. Instead, we are offered efficiency-by-outsourcing – an AI strategy where Britain provides the inputs and America reaps the returns. In a 2024 paper, Prof Acemoglu challenged Goldman Sachs' 10-year forecast that AI would lead to global growth of 7% – about $7tn – and estimated instead under $1tn in gains. Much of this would be captured by US big tech.

There's nothing wrong with harnessing new technologies. But their deployment must not be structured in a way that entrenches dependency and hollows out public capacity. The Online Safety Act shows digital sovereignty can enforce national rules on global platforms, notably on porn sites. But current turmoil at the Alan Turing Institute suggests a deeper truth: the UK government is dazzled by American AI and has no clear plan of its own. Britain risks becoming not a tech pioneer, but a well-governed client state in someone else's digital empire.


Reuters
4 hours ago
Gas demand at two of the top US LNG plants declines
HOUSTON, Aug 18 (Reuters) - Two of the United States' largest liquefied natural gas export plants recorded major declines in demand for natural gas on Monday, suggesting parts of their facilities might be down, according to data from financial firm LSEG.

Cheniere's (LNG.N) Sabine Pass export facility in Texas, which uses up to 4.5 billion cubic feet of natural gas per day (bcfd), was down to 3.7 bcfd, and Sempra's (SRE.N) Cameron LNG plant in Louisiana, which processes 2 bcfd, was down to 1.3 bcfd, according to LSEG data. Cheniere declined to comment, while Sempra did not immediately respond to a request for comment.

Sabine Pass is the United States' largest LNG plant and Cameron the fourth largest. Together they have helped keep the U.S. the world's largest LNG exporter since 2023. Monday's drop in consumption from those two facilities pulled the day's total feedgas demand down to 14.7 bcfd, the lowest level in two months, according to LSEG data.

U.S. natural gas futures fell about 1% on Monday morning, with front-month gas futures for September delivery on the New York Mercantile Exchange falling 2 cents to $2.90 per million British thermal units.
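As a back-of-envelope illustration of how an outage inference like the one above can be drawn from feedgas flows, here is a minimal Python sketch using the capacities and Monday flows reported in the article. The 85% alert threshold is an assumption for illustration, not LSEG's methodology.

```python
# Rough utilization check for LNG feedgas flows, using the figures
# reported above (capacities and Monday flows in bcfd). The alert
# threshold is an illustrative assumption, not an industry rule.

PLANTS = {
    "Sabine Pass": {"capacity_bcfd": 4.5, "flow_bcfd": 3.7},
    "Cameron LNG": {"capacity_bcfd": 2.0, "flow_bcfd": 1.3},
}

ALERT_THRESHOLD = 0.85  # flag plants running well below nameplate feedgas

for name, p in PLANTS.items():
    utilization = p["flow_bcfd"] / p["capacity_bcfd"]
    status = "possible outage" if utilization < ALERT_THRESHOLD else "normal"
    print(f"{name}: {utilization:.0%} of capacity ({status})")

# Output:
# Sabine Pass: 82% of capacity (possible outage)
# Cameron LNG: 65% of capacity (possible outage)
```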


Reuters
4 hours ago
Big Tech, power grids take action to rein in surging demand
August 18 - As data centers become an increasing driver of U.S. power demand, operators and power companies are seeking ways to better integrate them into the power network.

Power plant developers and network operators are scrambling to keep up with the demands of new data centers that need electricity day and night. The U.S. Department of Energy forecasts 20 GW of new data center load by 2030 and predicts data centers will consume 6.7%-12% of total U.S. power production by 2028, up from 4.4% in 2023.

Grid operators were already overloaded with renewable energy applications, and surging requests from new data center projects are leading to connection delays and holding back growth. The pace at which utilities, developers, and state regulators can accelerate their processes for siting, permitting, and building new infrastructure will "very likely act as a constraint on data center growth in the near to medium term," consultancy Energy and Environmental Economics (E3) said in a report on grid strains in Virginia.

As grid capacity dwindles, availability of power and grid infrastructure has become a critical driver of site selection for data center developers. Larger data center complexes are being developed, with some surpassing 1 GW, equivalent to one large nuclear reactor.

"We are having to make geographic choices based on the fact that we need to move pretty quickly to deploy the infrastructure that we need," Bobby Hollis, VP of energy at Microsoft, told Reuters Events. "If one location is moving significantly slower than another because of interconnection or transmission line requirements, then we might have to go to another location."

Developers typically favor sites with competitive clean energy supply, transmission availability and low development costs, and development is soaring in Texas and Northern Virginia, where end-users can benefit from some of the lowest power prices in the United States.

As AI demand soars, the European Union is implementing measures to increase the energy efficiency of data centers, but there is no such centralized push in the United States. U.S. utilities and regional network operators are implementing a patchwork of incentives and market mechanisms to minimize the impact of data center load. Meanwhile, data center operators are working on innovative technologies and systems to increase energy efficiency and to respond flexibly to grid needs.

Power market controls

Grid operators are having to spend more on power infrastructure to meet the growing demands of AI, with knock-on effects for the wider public. Dominion Energy, Northern Virginia's primary utility, proposed in April a new rate class for high-load users, including data centers, to reduce the cost burden for residential customers. Dominion had 40 GW of data center capacity under contract by the end of 2024, including planned facilities, up from 21 GW six months prior. Dominion also proposed higher electricity rates for other customers to cover its costs and expand clean power capacity.

[CHART: Forecast US data center electricity demand]

PJM, the operator of the largest U.S. network, spanning much of the eastern U.S. and parts of the Midwest, is using market tools like capacity auctions and demand response to manage growing load from hyperscale customers like data centers.
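As a rough illustration of the price-responsive flexibility these market tools reward, consider a minimal decision rule in Python. This is a hypothetical sketch, not any operator's actual dispatch logic; the price thresholds and load figures are invented.

```python
# Hypothetical price-responsive dispatch rule for a flexible data center
# load, illustrating the kind of behavior capacity markets and high price
# caps reward. Thresholds and loads are invented for illustration.

SHIFT_PRICE = 200.0   # $/MWh above which deferrable jobs are postponed
SHED_PRICE = 1000.0   # $/MWh above which load moves to on-site generation

def dispatch(spot_price_mwh: float, flexible_load_mw: float) -> str:
    """Decide what to do with the flexible share of the load."""
    if spot_price_mwh >= SHED_PRICE:
        return f"shed {flexible_load_mw} MW to on-site generation"
    if spot_price_mwh >= SHIFT_PRICE:
        return f"defer {flexible_load_mw} MW of batch/AI training jobs"
    return "run all workloads from the grid"

for price in (45.0, 350.0, 2500.0):
    print(f"${price:,.0f}/MWh -> {dispatch(price, flexible_load_mw=30.0)}")
```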
In California, grid operator CAISO uses incentives like flexible load, demand response payments and time-of-use rates to ease the impact of large customers. In Texas, soaring demand for clean power and data centers has prompted grid operator ERCOT to introduce stricter regulation for large consumers. Large users must be able to ride through grid faults and quickly resume consumption, while high price caps reward flexible consumption by offering revenue to data center operators able to cut load, shift load to other time periods, or use on-site generation during peak demand periods.

With better data and AI, grid infrastructure could handle more load, said Rich Voorberg, President of Siemens Energy North America, a power technology group. Greater access to data allows grid partners to use tools such as digital twins to assess whether existing infrastructure can be optimized to expand overall capacity. "In pockets we are already seeing it," Voorberg said. "We're consulting with different grids on how to better optimize the grid."

As grid capacity dwindles, some tech groups and data center developers are seeking to co-locate new power generation with data center sites to reduce grid connection delays and reduce exposure to regional markets and grid bottlenecks.

Efficiency savings

To ease grid pressure, data center operators are adopting more efficient, low-emission designs, which will have an impact on future power demand. Microsoft is investing in liquid and natural air cooling to reduce energy and water use, plus AI-driven tools to maximize computing efficiency. "We're dealing with constraints like the rest of the marketplace," Hollis said.

[CHART: Impact on data centers of new cooling technologies vs air cooling]

Amazon Web Services (AWS) is optimizing its data center mechanical systems and designing proprietary high-efficiency components. AWS focuses on performance metrics such as Power Usage Effectiveness (PUE) and water usage effectiveness. "To increase efficiency, AWS uses different cooling techniques, including free air cooling depending on the location and time of year, as well as real-time data to adapt to weather conditions," the company said. "AWS' latest data center design seamlessly integrates optimized air-cooling solutions alongside liquid cooling capabilities for the most powerful AI chipsets."

Innovative cooling mechanisms are a major focus of energy reduction. Data center group Digital Realty is shifting from air cooling to liquid cooling through the use of a closed loop in which the fluid is recirculated and which supports higher-density racks. "We're aggressively shifting toward liquid cooling," said Aaron Binkley, Digital Realty's VP of Sustainability. The company has a global portfolio of around 170 data centers and supports liquid cooling in more than half of its facilities, he said.

Variations in load profiles between fluctuating AI and crypto consumption and the flatter load of cloud services also offer opportunities to manage and optimize power use. Data centers are sometimes seen as a resource for the grid, with the possibility to shift loads by pausing activities or moving them to off-peak hours. "The great thing about the AI data centers is they are inherently more flexible in terms of workload (…).
It creates some opportunities to lessen the overall impact they have and thereby provide benefits back," said David Porter, EPRI's VP of Electrification & Sustainable Energy Strategy.

PUE is currently the dominant industry metric, but operators are now evaluating performance based on how much useful computing, whether AI tokens (units of model output) or mathematical calculations known as floating point operations (FLOPS), is delivered per watt. Meanwhile, some data centers are exploring using direct current (DC) rather than alternating current (AC) to cut conversion losses. Swiss power technology group ABB created a demonstration site in Zurich operating entirely on DC, which increased energy efficiency by 10% while reducing installation costs by 20% and lowering investment costs for electrical components by 15%. "This is a new trend… we are also speaking with our partners," said Massimiliano Cifalitti, president of ABB's Smart Power division.
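To make the two efficiency views concrete, here is a minimal sketch. PUE is the standard ratio of total facility power to IT equipment power; all power and throughput figures below are invented for illustration.

```python
# Sketch of the efficiency metrics discussed above. PUE is the standard
# ratio of total facility power to IT equipment power; the figures here
# are hypothetical.

it_power_mw = 80.0         # servers and network gear
facility_power_mw = 104.0  # IT load plus cooling, lighting, conversion losses

pue = facility_power_mw / it_power_mw
print(f"PUE: {pue:.2f}")   # 1.30; an ideal facility approaches 1.0

# The newer "useful work per watt" view: throughput delivered per unit
# of total facility power, e.g. AI tokens per second per megawatt.
tokens_per_second = 2_000_000  # hypothetical aggregate AI throughput
tokens_per_s_per_mw = tokens_per_second / facility_power_mw
print(f"Throughput: {tokens_per_s_per_mw:,.0f} tokens/s per MW")
```

On these assumed numbers, a 10% cut in non-IT overhead, the kind of gain ABB reports for its all-DC site, would lower facility power from 104 MW to about 101.6 MW for the same IT load, improving PUE from 1.30 to about 1.27.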