OpenAI finance chief sees firm selling AI data centre services to offset ChatGPT costs

OpenAI is considering eventually helping other businesses tap into the data centres and physical infrastructure needed for artificial intelligence, potentially creating a new revenue line that could offset some of the ChatGPT maker's immense costs.
The service would be loosely inspired by the success Amazon.com found in renting out its spare cloud computing capacity to other companies, OpenAI chief financial officer Sarah Friar said in an interview on Wednesday.
OpenAI is not 'actively looking' at such an effort today because it is focused on securing computing capacity for its own operations, she said, but 'I do think about it as a business down the line, for sure.'
In recent years, OpenAI has gained expertise on how to design and set up data centres to optimise workloads for AI. The company now sees an opportunity to capitalise on that know-how. It also wants to become more directly involved in the process rather than rely solely on third-party vendors.
'If all we do is buy from others, all we're doing is giving them our IP because they're learning how to build AI infrastructure,' Friar said.
OpenAI CEO Sam Altman. Photo: Reuters
OpenAI has raised tens of billions of dollars to pay for advanced chips and data centres to build and operate cutting-edge AI services. The company is also working with SoftBank Group and Oracle on an ambitious infrastructure venture called Stargate, with plans to build massive data centres in the US and abroad.

Related Articles

Trump's 'excessive' US$515 million fraud penalty dismissed by appeal court

South China Morning Post, 5 hours ago
An appeal court has thrown out the massive civil fraud penalty against US President Donald Trump, ruling on Thursday in New York state's lawsuit accusing him of exaggerating his wealth. The decision came seven months after the Republican returned to the White House.

A panel of five judges in New York's mid-level Appellate Division said the verdict, which stood to cost Trump more than US$515 million and rock his real estate empire, was 'excessive'. After finding that Trump engaged in fraud by flagrantly padding financial statements that went to lenders and insurers, Judge Arthur Engoron ordered him last year to pay US$355 million in penalties. With interest, the sum has topped US$515 million. The total – combined with penalties levied on some other Trump Organization executives, including Trump's sons Eric and Donald Jnr – now exceeds US$527 million, with interest.

'While the injunctive relief ordered by the court is well crafted to curb defendants' business culture, the court's disgorgement order, which directs that defendants pay nearly half a billion dollars to the State of New York, is an excessive fine that violates the Eighth Amendment of the United States Constitution,' Judges Dianne T. Renwick and Peter H. Moulton wrote in one of several opinions shaping the appeal court's ruling.

Engoron also imposed other punishments, such as banning Trump and his two eldest sons from serving in corporate leadership for a few years. Those provisions have been on pause during Trump's appeal, and he was able to hold off collection of the money by posting a US$175 million bond.

Tech war: DeepSeek hints China close to unveiling home-grown 'next generation' AI chips

South China Morning Post, 7 hours ago

Chinese artificial intelligence start-up DeepSeek said that China will soon have home-grown 'next generation' chips for its AI stack, fanning speculation over breakthroughs China may have achieved. In a one-line note on its official WeChat account explaining the 'UE8M0 FP8 scale' of its newly released model V3.1, the Hangzhou-based firm said that the model was particularly designed 'for the home-grown chips to be released soon'. It did not specify the vendor of these chips or whether they would be used for training AI models or for inference.

In a technical paper explaining V3.1, which integrates reasoning and non-reasoning modes into one model, DeepSeek said the model was trained 'using the UE8M0 FP8 scale data format to ensure compatibility with microscaling data formats'. The disclosure hints that China has made key progress in building a self-sufficient AI stack consisting of domestic technologies, a development that could help the country shrug off US chip export restrictions.

FP8, or 8-bit floating point, is a data format that reduces numerical precision to speed up AI training and inference by using less memory and bandwidth. UE8M0, a variant with eight bits for the exponent and none for the mantissa, could further increase training efficiency and in turn reduce hardware requirements, as storing values in eight bits rather than 32 can cut memory use by up to 75 per cent. DeepSeek's use of these formats, if combined with China's domestic chips, could translate into a new breakthrough in hardware-software coordination.

The revelation marks a bold claim from the company, which has been relatively quiet since it shocked the world with the release of its R1 reasoning model in January 2025 and its V3 model in December 2024. DeepSeek said its V3 model was trained on 2,048 Nvidia H800 chips. It did not disclose the chips it used to train R1 or V3.1.
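The formats described above can be illustrated with a short, hypothetical Python sketch. It assumes, as in common microscaling proposals, that a UE8M0 byte is read as a power of two with an exponent bias of 127; the bias value, the function names and the rounding rule are illustrative assumptions rather than details taken from DeepSeek's paper, and the final lines simply work through the 8-bit-versus-32-bit arithmetic behind the 75 per cent figure.

import math

BIAS = 127  # assumed exponent bias, as in common microscaling proposals

def e8m0_decode(byte_value: int) -> float:
    # An exponent-only byte encodes a pure power-of-two scale: 2 ** (E - bias).
    return 2.0 ** (byte_value - BIAS)

def e8m0_encode(scale: float) -> int:
    # Round a positive scale to the nearest representable power of two,
    # clamped to the 0-255 range an 8-bit field can hold.
    exponent = round(math.log2(scale))
    return max(0, min(255, exponent + BIAS))

# Round trip: a block scale of 0.25 encodes to byte 125 and decodes back exactly.
byte = e8m0_encode(0.25)
print(byte, e8m0_decode(byte))  # 125 0.25

# Memory arithmetic behind the "up to 75 per cent" claim:
# 8 bits per value instead of 32 bits per value.
print(f"saving vs FP32: {100 * (1 - 8 / 32):.0f}%")  # saving vs FP32: 75%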
