
Beware of AI energy hyperbole: scientists already have a solution

Telegraph

a day ago



Scientists from three British universities are jointly developing an atom-thick graphene chip that slashes energy use for computing and AI data centres by over 90pc, radically changing the trajectory of global electricity demand over the next quarter century. It promises a future where semiconductors are so energy efficient that we will have to recharge our mobile phones just once a week. A good laptop battery will run for 80 hours.

'We're very confident that we will be able to cut electricity use for computing by 90pc, and perhaps even by five times more than that,' said Sir Colin Humphreys, the project leader and professor of materials science at Queen Mary University of London. 'We expect to have a prototype that works by 2029, and we should be manufacturing millions of working devices by 2032-2033,' he said.

Queen Mary has teamed up with the University of Nottingham and the James Watt Nanofabrication Centre at the University of Glasgow, assembling the most advanced collective knowledge on 2D graphene semiconductors on the planet. They are backed by grants from the Engineering and Physical Sciences Research Council.

'It is very much a global race, but we have the world lead in graphene. The Chinese are pouring huge sums into this and have been trying to reproduce our technology but can't yet do it,' said Prof Humphreys. 'They asked me to come to China and more or less said "name your price". I declined the offer. The technology has huge military implications,' he said.

If graphene can deliver this quantum jump before the end of the decade, and be rolled out at scale in the early 2030s, Google, Microsoft, Meta, Amazon and the giant hyperscalers will not need extra fleets of nuclear reactors and gas plants to run their AI data centres. Shell, BP and the European drilling majors may come to regret their Faustian pivot back to natural gas, a strategy that is in essence one giant bet on AI energy hyperbole.
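As a back-of-the-envelope check on the battery claims above, a 90pc cut in computing's energy draw stretches a fixed battery roughly tenfold. A minimal sketch, assuming a laptop battery of about eight hours today with most of the drain going to computation (the eight-hour baseline is an assumption, not a figure from the article):

```python
# Illustrative arithmetic only; the 8-hour baseline is an assumption.
baseline_hours = 8
energy_cut = 0.90  # the 90pc saving cited by Prof Humphreys

# Cutting energy use by 90pc leaves a tenth of the drain,
# so the same battery lasts ten times as long.
stretched_hours = baseline_hours / (1 - energy_cut)
print(round(stretched_hours))  # 80 -- matching the '80 hours' claim
```

On those assumptions, the quoted 80-hour figure is internally consistent with the 90pc saving.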
Russia is the latest country to jump on the data centre bandwagon, eyeing a resurrection play for Gazprom after the loss of 140bn cubic metres of annual pipeline sales to Europe. Alexei Chekunkov, the minister for the Russian Far East, told the St Petersburg economic forum this week that power-hungry computers could save the industry. 'All this gas is lying unused underground. The question of what to do with it is very urgent, so let's think about it for AI and blockchain generation,' he said.

It was the same talk this spring at the CERAWeek energy conference in Houston, Texas, where it was an article of faith that AI and large language models will require a vast expansion of fossil-fired power for decades to come, mostly from gas but also from coal if Donald Trump gets his way. Trump has signed an executive order to 'turbocharge coal mining', proclaiming that America will need to double electricity output to drive its AI supremacy. 'We're ending Joe Biden's war on beautiful, clean coal once and for all. All those plants that have been closed are going to be opened,' he said.

We may avert this dystopian disaster after all. The magic lies in the unique properties of graphene, a flat sheet of carbon atoms first isolated in a Nobel prize-winning discovery in Manchester in 2004. 'Graphene conduction electrons don't go through the material like copper and silicon: they glide along the surface like an ice-skater. That is why it is the best conductor in the world,' said Prof Humphreys.

More chips can be packed into a data processing hub, and stacked in layers, without voracious needs for water cooling. 'They are diverting whole rivers in the US to build data centres,' he said. Google alone used 6bn gallons of water last year to cool its operations. ChatGPT uses half a litre for every 100-word request, and much of this is happening in areas under water stress.

You can grow graphene using methane as the raw material.
It is plentiful, harmless and can be entirely home-made. 'There is no dependence on other countries for supplies,' he said.

The technical problem with graphene is that it has no 'band gap', the property that enables semiconductors to switch on and off rapidly. The project has found a way to overcome this by adding layers of indium selenide. This is the secret sauce.

The research arms of the US army and navy are backing a rival 2D technology at Penn State University, using molybdenum disulfide to build a computer. That project published its findings two weeks ago in Nature, which also point to dramatic savings in energy use.

Whoever gets there first, the 80-year age of silicon is over, and so is the old model of TSMC, Intel and the incumbent semiconductor industry. Trying to miniaturise chips down to the frontier of two or three nanometres (nm) is the last expensive gasp of a technology that will be obsolete in a few years. 'This is going to put silicon out of business. We have reached its atomic limits. At the end of the day, it is inefficient and won't be able to compete,' said Prof Humphreys.

I wrote recently about an entirely different approach to AI data centres by the global nanotechnology institute IMEC, which uses superconductors to slash energy use by orders of magnitude. It involves soaking standard 28nm chips in liquid helium at minus 269C and keeping them cold by cryogenic cooling. This lets you stack chips a hundred layers high for the extreme demands of AI without causing the copper wires to overheat.

In the meantime, we face an energy crunch until we get over the hump. AI computing demands are doubling every six months. Typical data centres consumed eight kW per rack three years ago; Nvidia's latest GB200 chip needs 120 kW per rack for training ChatGPT. Data centres are already consuming 20pc of Ireland's electricity.
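For what it's worth, the per-rack figures quoted above imply their own growth rate, which can be checked with a little arithmetic. A hypothetical sketch (the calculation is mine, not from the article):

```python
import math

# Rack-power figures quoted in the article: ~8 kW per rack three
# years ago vs ~120 kW for Nvidia's GB200 today.
old_kw, new_kw, months = 8, 120, 36

doublings = math.log2(new_kw / old_kw)   # ~3.9 doublings in 3 years
months_per_doubling = months / doublings
print(round(months_per_doubling, 1))     # about 9.2
```

So per-rack power has doubled roughly every nine months on these numbers; the six-month doubling cited for overall AI computing demand is a separate (aggregate) measure.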
National Grid expects commercial power demand from British data centres to rise sixfold over the next decade, most of it concentrated around London, where it is hardest to deliver extra power.

But wild talk that computing will gobble up half the world's electricity by 2040 is crude extrapolation and likely to prove another Malthusian scare, much like the alarmism four years ago over lithium and critical minerals, before that bubble popped and prices collapsed by 80pc. There is a problem for some rare earths and strategic minerals, but not because they are scarce: it is because the West fell asleep while China locked up the processing industry and the immediate supply chain to gain political leverage. That can and will be fixed.

By the same token, the energy needs of advanced computing will also be fixed, and without requiring a mad dash for coal and gas. Technologists will once again save us from our own incorrigible stupidity.
