Latest news with #AI-optimised


Time of India
13 hours ago
- Business
- Time of India
Little-known AI powerhouse rockets up Fortune 500 by 206 spots, surging faster than any other company this year
Amid a year of tech job cuts, volatile chip markets, and a rush to expand AI infrastructure, one lesser-known firm leapt the most on the Fortune 500: Super Micro Computer, as per a report.

Super Micro Computer Surges 206 Spots on Fortune 500 Amid AI Boom
The San Jose-based IT hardware company climbed 206 positions to No. 292, more than any other firm on this year's Fortune 500 list, reported Fortune. It more than doubled its revenue to $14.99 billion, a 110% gain on last year, and recorded $1.15 billion in earnings, the highest one-year profit growth in its industry, as per the report. Super Micro Computer's rise stems mainly from its strategic position at the intersection of AI, cloud computing, and data centre infrastructure, currently the three fastest-growing areas in technology, according to Fortune.

CEO Charles Liang's Innovation-First Playbook
The company's cofounder and CEO, Charles Liang, has said the firm is focused on innovation, such as early-to-market compatibility with Nvidia chips and customisable server hardware built to handle diverse, high-performance workloads, reported Fortune.

Partnering with the Tech Giants
According to the report, Super Micro Computer has close partnerships with Nvidia and Intel, which have made it a preferred vendor for companies building AI-optimised infrastructure. Recently, the firm was even chosen by tech billionaire Elon Musk's xAI team to develop a 750,000-square-foot data centre in Memphis, reported Fortune.

Scaling Up in the US and Going Green
The company also plans to grow its server production capacity in the United States as demand for AI has increased, as per the report.
Super Micro Computer has also invested in green computing, branding its systems as energy-efficient alternatives in an industry under rising scrutiny for its environmental impact, according to Fortune. A spokesperson for Super Micro Computer said, 'We are investing in people, processes, and systems to scale our foundation, advancing our leadership in liquid cooling technology, and delivering Data Center Building Block Solutions to achieve and surpass our revenue targets,' as quoted in the report.

FAQs
What is Super Micro Computer?
A once lesser-known IT hardware company based in San Jose, now one of the fastest-growing names on the Fortune 500 due to its role in AI infrastructure, as per the Fortune report.

How is it connected to Nvidia?
Super Micro integrates Nvidia's chips early, allowing it to quickly deliver hardware that supports the latest AI technologies.


Malaysian Reserve
2 days ago
- Business
- Malaysian Reserve
AI PULSE UNVEILS GDEPIN: THE WORLD'S FIRST DECENTRALISED GPU COMPUTE MODEL THAT POWERS AI
WASHINGTON, June 2, 2025 /PRNewswire/ — AI Pulse, the cutting-edge decentralised AI compute platform, officially announces the launch of GDePIN, a new computing model that merges GPU (Graphics Processing Unit) power with DePIN (Decentralised Physical Infrastructure Networks). Developed through five years of deep R&D and strategic collaboration between blockchain and AI experts, GDePIN introduces a first-of-its-kind framework that allows anyone, from everyday smartphone users to enterprise GPU operators, to earn passive income by contributing their idle computing power to fuel the AI revolution.

Solving the AI Compute Crisis with a Crowd-Powered Solution
As global AI adoption accelerates, demand for high-performance compute power has surged, pushing GPU prices to historic highs and creating widespread supply shortages. Traditional cloud providers and centralised GPU farms are increasingly unable to meet this exploding demand. AI Pulse is addressing this problem head-on with a distributed, scalable, and user-powered alternative.

'GDePIN isn't just a technical innovation; it's a movement that redistributes the future of AI computing back into the hands of the people,' said Robert Julian Carl, CEO of AI Pulse. 'We're democratising access to AI earnings and empowering communities worldwide to participate in building the AI infrastructure of tomorrow.'

What Is GDePIN?
GDePIN stands for GPU + DePIN, an industry-first model that transforms decentralised physical infrastructure into a high-throughput, AI-optimised computing grid. Each GDePIN unit leverages a dedicated Nvidia H100 GPU or a combination of consumer-grade computing resources, all coordinated through AI Pulse's proprietary supercomputing algorithm. This algorithm balances workloads across thousands of distributed devices, whether high-performance servers, laptops, or even smartphones, enabling efficient, secure, and dynamic resource utilisation.
At the same time, the platform's DePIN layer reclaims idle processing power from contributors and routes it into active AI workloads such as model training, inference, scientific computing, and even cryptocurrency mining.

How Contributors Earn: The GDePIN Ecosystem in Action
GDePIN's incentive-driven architecture ensures that anyone with a device and an internet connection can become a contributor and start earning. Here's how it works:

1. Connect & Contribute: Users register on the AI Pulse platform and install a lightweight connector that links their device, be it a GPU server, laptop, or mobile phone, to the AI Pulse network.
2. Smart Allocation of Tasks: The platform's AI algorithm evaluates each device's specs and allocates suitable workloads automatically. High-performance devices may process AI model training, while smaller devices assist with auxiliary or parallelisable tasks.
3. Fair Rewards & Blockchain Settlement: Contributors earn tokens based on the volume, quality, and reliability of the compute resources they provide. All transactions are executed transparently via blockchain smart contracts, ensuring real-time visibility, fairness, and auditability.
4. Compounding Ecosystem Growth: As more contributors join, AI Pulse's compute grid grows exponentially, enabling it to handle more commercial workloads. In turn, this generates more demand, more compute purchases, and more earnings for contributors.

This win-win cycle ensures a stable and growing revenue stream for participants while making high-quality AI compute power more affordable and accessible to developers, researchers, and enterprises.

Built in Washington D.C., Designed for the World
AI Pulse is headquartered in Washington D.C., but its mission is global. The team behind the platform comprises blockchain pioneers and AI specialists with over five years of active development experience.
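AI Pulse has not published the internals of its allocation algorithm, so the following is only an illustrative sketch of the "Smart Allocation of Tasks" step: routing a device to a workload tier based on its specs. The device classes, thresholds, and workload names are all hypothetical.

```python
# Hypothetical sketch of spec-based workload routing; thresholds and
# tier names are illustrative, not AI Pulse's actual algorithm.
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    gpu_vram_gb: float  # 0 for devices without a discrete GPU
    cpu_cores: int

def assign_workload(device: Device) -> str:
    """Route a device to a workload tier based on its specs."""
    if device.gpu_vram_gb >= 40:   # data-centre class GPU (e.g. H100)
        return "model-training"
    if device.gpu_vram_gb >= 8:    # consumer gaming GPU
        return "inference"
    if device.cpu_cores >= 4:      # laptop / desktop CPU
        return "parallel-batch"
    return "auxiliary"             # phones, low-power devices

fleet = [
    Device("h100-server", gpu_vram_gb=80, cpu_cores=64),
    Device("gaming-laptop", gpu_vram_gb=8, cpu_cores=8),
    Device("low-power-device", gpu_vram_gb=0, cpu_cores=2),
]
for d in fleet:
    print(d.name, "->", assign_workload(d))
```

In a real decentralised network the scheduler would also weigh reliability history, bandwidth, and current load, which this sketch omits.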
The company has strategically positioned itself as the only provider of a fully functional blockchain-AI compute leasing solution, backed by proprietary GDePIN technology. Since 2020, AI Pulse has invested heavily in research, resulting in a robust, scalable platform that meets the practical needs of the AI industry without compromising decentralisation, efficiency, or economic inclusion.

Join the AI Compute Revolution
AI Pulse invites all device owners, from students with gaming laptops to tech professionals with server farms, to become part of the GDePIN ecosystem. By contributing your idle compute power, you not only earn passive income but also help accelerate the AI progress shaping our collective future. To register as a contributor or learn more, visit


France 24
02-05-2025
- Business
- France 24
Will artificial intelligence use all our electricity?
The city of Espoo lies 200 km west of Helsinki. The panorama around the city is what you might expect of Finland – evergreen forests, crisp lakes and ice thawing at the end of winter. But the skyline is about to be occupied by another kind of building: a data centre. Excavators and all sorts of machines rumble on in the background under the watchful eye of Alistair Speirs, Senior Director of Azure Global Infrastructure at Microsoft.

'A data centre is the home of the cloud. It's an industrial facility that holds thousands and thousands of servers. It's powering all the cloud services that we use for work, for home, gaming, streaming, financial services, education and even medicine,' he says, pointing at the construction site behind him.

This future data centre campus aims to be one of the most innovative yet for the American tech giant. 'This data centre will be one of the most sustainable data centres in Europe,' the engineer explains. 'Excess heat that's generated from the data centre will go on to heat around 250,000 houses in the area.'

It was this promise that made the project so attractive to local authorities, who have welcomed data centres on their soil. 'This data centre will be our main source of clean heating,' says Kai Mykkänen, mayor of Espoo and Finland's former Minister of Climate and the Environment.

The vast, sparsely populated areas and long, cold winters make Finland an ideal location for these behemoths, which have to install energy- and water-hungry cooling mechanisms in warmer climates. What's more, the local electricity grid is well equipped. 'Finland has a huge amount of clean electricity. We've been able to scale up electricity production during the past ten years from non-fossil sources, better than in most areas in Europe. And we have a very strong national grid connection. So it was possible to find a location where there is a strong grid, enough electricity, and then also a need for this excess heat,' says the mayor.
But these data centres are not so easy to host all over Europe. In Ireland, for example, the country was forced in 2023 to halt the construction of new centres in the Greater Dublin area: in one year, the existing data centres had consumed as much electricity as all of the country's urban housing.

European startups join the race for energy
Not all energy networks are adapted to such high electricity demand. And it's not going to get any easier. In its latest report, dated 10 April 2025, the International Energy Agency (IEA) estimates that by 2030, global data centre electricity consumption will have doubled, and even quadrupled for AI-optimised data centres. So how can we cope with such a surge in demand, especially at a time when Europe has pledged to halve its carbon emissions compared to 1990, on the way to its goal of being climate-neutral by 2050?

For many tech companies, much of the energy cost comes from developing and training AI models. Some European players are trying to optimise the resources they have to cut the electricity bill. That's the case of Kyutai, a French AI research lab. In their Parisian office, facing Les Halles and the Bourse de Commerce, Kyutai developers work in open source: they make their research available so that others wanting to replicate their models can access it, use it, and avoid repeating the same processes of trial and error they went through. And that saves energy.

'Our aim with open source is to ensure that the cost of training is worth it, by reducing the cost of experimentation for other structures, whether they're start-ups or other competing labs,' says Alexandre Défossez, Chief Exploration Officer at the startup, tapping out lines of code on his computer. 'Some big American companies are focused on performance at all costs. Us European players, we have access to and use far fewer computing resources than the Americans. So that also forces us to be a bit clever about what we do,' he adds.
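The IEA projection cited above (global data centre consumption roughly doubling by 2030, and quadrupling for AI-optimised centres) implies steep annual growth rates. A quick back-of-the-envelope calculation, assuming a five-year horizon from 2025 to 2030 (the horizon is an assumption for illustration, not a figure from the article):

```python
# Implied compound annual growth from the IEA's 2030 projections:
# overall consumption doubles, AI-optimised consumption quadruples.
# The 5-year horizon (2025-2030) is assumed for illustration.
years = 5
overall_cagr = 2 ** (1 / years) - 1   # doubling over 5 years
ai_cagr = 4 ** (1 / years) - 1        # quadrupling over 5 years
print(f"Overall data centres: {overall_cagr:.1%}/yr")   # ~14.9%/yr
print(f"AI-optimised centres: {ai_cagr:.1%}/yr")        # ~32.0%/yr
```

Sustaining roughly 15-32% annual growth in electricity demand is precisely what strains grids like Dublin's, where capacity additions take years to build.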
Thirty-five nuclear power plants to cover our needs
Engineer Marlène de Bank is focused on finding solutions to reconcile AI energy consumption with the climate transition. De Bank researches digital tech at The Shift Project, a French think tank working on decarbonising the economy. 'The demand for AI at European level is 35 gigawatts. 1 gigawatt is one nuclear power plant. So we're going to have to add the equivalent of 35 nuclear power stations. If it's something else, that means we'll have to add 35 times that equivalent. We also have to find it,' she explains.

The think tank is currently drawing up a report assessing the impact of AI on the climate transition. At the moment, they are undecided as to whether AI is entirely good or bad for the environment: AI can help us do things like prevent natural disasters or manage waste, but it can also help optimise oil extraction. De Bank believes that a lot depends on what AI is used for. 'It's like a car. To travel one kilometre, you can choose to bike instead. So your carbon emissions at the end of the year depend on how many times you've driven and how many times you've cycled. For AI, it's going to depend on how often you use AI and how often you use your brain instead.'
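De Bank's rule of thumb (1 GW of demand per nuclear plant) can be sanity-checked with capacity factors, the fraction of nameplate output a source actually delivers over time. The capacity factors and turbine size below are typical published ranges, not figures from the article:

```python
# Back-of-the-envelope check of "35 GW = 35 nuclear plants".
# Capacity factors are typical illustrative values, not article figures.
ai_demand_gw = 35.0            # European AI demand, per De Bank

nuclear_unit_gw = 1.0          # one large reactor, per the article
nuclear_capacity_factor = 0.90 # reactors run near full output
plants = ai_demand_gw / (nuclear_unit_gw * nuclear_capacity_factor)
print(f"Nuclear plants needed: {plants:.0f}")            # ~39

# Meeting the same demand with onshore wind (~0.35 capacity factor,
# 5 MW turbines) needs far more nameplate capacity:
wind_capacity_factor = 0.35
turbine_mw = 5.0
turbines = ai_demand_gw * 1000 / (turbine_mw * wind_capacity_factor)
print(f"5 MW wind turbines needed: {turbines:.0f}")      # 20000
```

This is the "35 times that equivalent" point in rough numbers: intermittent sources need several units of installed capacity for each unit of steady demand.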


Techday NZ
28-04-2025
- Business
- Techday NZ
Lenovo unveils AI-optimised storage for faster enterprise ROI
Lenovo has announced a significant expansion of its data storage portfolio with the release of new AI-optimised storage solutions aimed at supporting enterprise AI and data modernisation strategies. The new portfolio consists of 21 ThinkSystem and ThinkAgile models and is designed to help organisations address challenges related to AI deployment, virtualisation, and sustainability. The updates encompass storage arrays, software-defined infrastructure, and new solutions intended to deliver the improved efficiency, performance, and scalability required by modern data centres.

"The new Lenovo Data Storage Solutions help businesses harness AI's transformative power with a data-driven strategy that ensures scalability, interoperability, and tangible business outcomes powered by trusted infrastructure," said Scott Tease, Vice President and General Manager of Infrastructure Solutions Product Group at Lenovo. "The new solutions help customers achieve faster time to value no matter where they are on their IT modernisation journey with turnkey AI solutions that mitigate risk and simplify deployment."

The range includes the introduction of AI Starter Kits for the Lenovo Hybrid AI Platform, pre-configured options designed to accelerate enterprise adoption of AI, notably for retrieval-augmented generation (RAG) workflows. Also launched is the ThinkAgile HX Series GPT-in-a-Box solution, equipped with Lenovo Neptune Liquid Cooling technology. Lenovo states that this is the industry's first liquid-cooled hyperconverged infrastructure (HCI) appliance, capable of providing up to 25% energy savings over previous generations.

"This refresh is an imperative step in supporting local organisations achieve their AI ambitions. We know AI investment in ANZ is accelerating, growing fourfold in 2025, yet our CIO Playbook highlights that ROI remains a key barrier to AI adoption," said Sumir Bhatia, President, Asia Pacific Infrastructure Solutions Group, Lenovo.
"We are assisting in speeding up ROI, particularly with our industry-first liquid-cooled HCI appliance, which will yield up to 25% energy savings over previous generations."

Among the products launching as part of this portfolio refresh is the Lenovo ThinkAgile SDI V4 Series, described as a full-stack, turnkey solution intended to streamline IT infrastructure and facilitate computing for data-driven workloads, including inferencing for AI large language models (LLMs).

The ThinkSystem Storage Arrays are also part of the release, with claims of up to three times faster performance and power consumption reductions translating to up to 97% energy savings and a 99% improvement in storage density over legacy hardware based on 10K HDDs. The converged ThinkAgile and ThinkSystem hybrid cloud and virtualisation solutions now allow independent scaling of compute and storage, with Lenovo reporting potential software licensing cost reductions of up to 40% by enabling additional storage without incurring extra licensing charges.

The ThinkAgile HX Series GPT-in-a-Box offerings, which use Neptune Liquid Cooling, are engineered for turnkey AI inferencing suitable for distributed applications across edge and cloud environments. The solution is positioned to reduce energy usage, which Lenovo states will increase return on investment and allow organisations to achieve data-driven outcomes faster.

The AI Starter Kits for the Lenovo Hybrid AI Platform come as pre-configured packages combining compute, storage, GPUs, and networking components. The kits are intended to be scalable, adaptable to various organisational needs, and to minimise the complexity involved in deploying enterprise AI services. Storage arrays configured with the AI Starter Kits support unified file, block, and object storage, leveraging SSD flash technology to accelerate time to insight from enterprise data.
Bringing computation capabilities closer to the source of enterprise data is a growing requirement for AI model training and inference tasks. Lenovo's new ThinkAgile SDI V4 Series and ThinkSystem Storage Arrays are positioned as full-stack, AI-ready infrastructure options for organisations at the start of their AI journey. Additionally, the new hybrid cloud and virtualisation solutions are designed to offer flexibility and operational simplicity.

For virtualisation, the Lenovo ThinkAgile Converged Solution for VMware brings together features from the ThinkAgile VX Series and ThinkSystem DG Series storage arrays. The aim is to provide a unified hybrid cloud platform capable of supporting diverse storage workloads, facilitated by integrated lifecycle management and operational features.

The new ThinkAgile V4 series now includes what Lenovo describes as the industry's first liquid-cooled HCI appliance, specifically targeting the efficiency challenges of high-powered AI workloads. The company asserts up to 25% energy savings compared with the previous generation, and the hardware is positioned as a repeatable solution for swift AI integration.

On the security side, Lenovo's storage solutions include features such as Premier Enhanced Storage Support, which provides expert support and rapid response for IT teams. Additionally, the ThinkSystem DG and DM storage arrays gain new AI-powered autonomous ransomware protection that employs machine learning to proactively identify and mitigate cyber threats. Lenovo Data Storage Solutions also include the company's XClarity systems management software, offering comprehensive security, management, encryption, and compliance features for storage management across enterprise environments.


Arabian Business
22-04-2025
- Business
- Arabian Business
Khazna launches UAE's first AI-optimised data centre
Khazna Data Centers (Khazna), a global leader in digital infrastructure, announced breaking ground on two brand new facilities in the UAE, marking the continued expansion of its data centre footprint in the country. The two new centres, AUH4 in Mafraq and AUH8 in Masdar City, are located in Abu Dhabi and will join QAJ1, the quickly progressing facility in Ajman that is billed as the region's first AI-optimised data centre. The rising number of data centres is expected to play a crucial role in supporting the growth of digital and AI-based services across the UAE.

Supporting UAE's AI ambitions
'The UAE economy is transforming rapidly as industries across the board continue to embed AI into more of their critical processes. This is creating unprecedented capacity demand for AI-optimised infrastructure, and we're proud to be meeting this demand,' said Hassan Alnaqbi, CEO, Khazna. 'The establishment of AUH4 and AUH8, as well as the strong progress we are making on QAJ1, reaffirm our position as a key enabler of the future economy that is currently being built in the UAE,' he said.

The company said AUH4 and AUH8 will substantially expand the region's cloud hosting capacity, providing a combined 60MW. Due for completion in December 2026 and August 2026 respectively, the facilities use a modular design architecture to improve efficiency during the build phase, with minimal waste and faster construction timelines. By employing adiabatic free cooling to improve cooling efficiency, these data centres are set to deliver industry-leading Power Usage Effectiveness (PUE) figures for the region, the company said.

The steel structure for the QAJ1 facility in Ajman is complete, and the initial project phase is due for completion in December 2026, Khazna said. With large-scale AI infrastructure a strategic priority for the Emirates, the aggressive construction timelines of all three facilities will help the UAE meet its ambitious digital transformation targets.
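The PUE metric Khazna cites is the standard ratio of a facility's total energy draw to the energy consumed by the IT equipment alone: a perfect score is 1.0, and anything above it is cooling, power conversion, and other overhead. A minimal sketch with illustrative numbers (Khazna has not published PUE figures for these sites):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total energy / IT energy (ideal = 1.0)."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative numbers only, not Khazna's actual figures.
# Conventional mechanical cooling in a hot climate carries heavy overhead:
print(round(pue(17_000, 10_000), 2))  # 1.7
# Adiabatic free cooling trims that overhead substantially:
print(round(pue(12_500, 10_000), 2))  # 1.25
```

This is why cooling technology dominates the PUE conversation in Gulf climates: the IT load is fixed by the customers, so the only lever is shrinking the numerator's overhead.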