
Chinese chip makers, cloud providers rush to embrace homegrown DeepSeek
SHANGHAI, Feb 5 (Reuters) - Chinese companies, from chip makers to cloud service providers, are rushing to support DeepSeek's artificial intelligence models, spurring analysts to hail a "watershed moment" for the industry.
Moore Threads and Hygon Information Technology (688041.SS), which make AI chips and aim to compete with Nvidia, said on Monday their computing clusters and accelerators would be able to support DeepSeek's R1 and V3 models.
"We pay tribute to DeepSeek," Moore Threads headlined its post on WeChat, adding that progress by the firm's models using domestically made graphics processing units (GPUs) could "set on fire" China's AI industry.
On Saturday, Huawei Technologies (HWT.UL), which also has its own line of AI chips, said it was working with AI infrastructure start-up SiliconFlow to make DeepSeek's models available to customers on its Ascend cloud service.
Their performance was comparable to that of models run on high-end global chips, it added.
The news that Huawei had integrated DeepSeek's models with its Ascend chips marked a "watershed moment," Bernstein analysts said in a note on Sunday.
"DeepSeek demonstrates that competitive large language models (LLM) can be deployed on China's 'good enough' chips, easing reliance on cutting-edge U.S. hardware," they added, citing Ascend and planned chips from Cambricon and Hygon.
Alibaba (9988.HK), Baidu (9888.HK) and Tencent's (0700.HK) cloud arms have also said they have made DeepSeek's models accessible via their services.
Last month, DeepSeek launched a free AI assistant that it says uses less data at a fraction of the cost of existing services.
Within a few days, its app overtook U.S. rival ChatGPT in downloads from Apple's App Store, triggering a global selloff in tech shares.
The company earlier drew attention in global AI circles with a research paper in December that said the training of DeepSeek-V3 required less than $6 million worth of computing power from Nvidia's H800 chips, versus the billions of dollars spent by the likes of tech giants Meta and Microsoft.
China has welcomed its success, turning the startup based in the eastern city of Hangzhou, and the firm's founder, Liang Wenfeng, into pop culture celebrities.
Microsoft's (MSFT.O) and Amazon's (AMZN.O) cloud services have also started offering DeepSeek's models, but several countries, such as Italy and the Netherlands, have blocked or are investigating DeepSeek's AI app over privacy concerns.

Related Articles


Reuters
Nvidia chips make gains in training largest AI systems, new data shows
SAN FRANCISCO, June 4 (Reuters) - Nvidia's (NVDA.O) newest chips have made gains in training large artificial intelligence systems, new data released on Wednesday showed, with the number of chips required to train large language models dropping dramatically.

MLCommons, a nonprofit group that publishes benchmark performance results for AI systems, released new data about chips from Nvidia and Advanced Micro Devices (AMD.O), among others, for training, in which AI systems are fed large amounts of data to learn from. While much of the stock market's attention has shifted to the larger market for AI inference, in which AI systems handle questions from users, the number of chips needed to train the systems remains a key competitive concern. China's DeepSeek claims to have created a competitive chatbot using far fewer chips than U.S. rivals.

The results were the first MLCommons has released on how chips fared at training AI systems such as Llama 3.1 405B, an open-source AI model released by Meta Platforms (META.O) that has a large enough number of what are known as "parameters" to give an indication of how the chips would perform at some of the most complex training tasks in the world, which can involve trillions of parameters.

Nvidia and its partners were the only entrants to submit data on training that large model, and the data showed that Nvidia's new Blackwell chips are, on a per-chip basis, more than twice as fast as the previous generation of Hopper chips. In the fastest results for Nvidia's new chips, 2,496 Blackwell chips completed the training test in 27 minutes. It took more than three times as many of Nvidia's previous-generation chips to achieve a faster time, according to the data.
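The "more than twice as fast per chip" claim above follows from simple arithmetic on the reported figures. A rough sketch, where the Hopper numbers are assumed placeholders consistent with "more than three times as many chips" finishing in a slightly faster time (not exact MLCommons data):

```python
# Rough per-chip throughput comparison based on the reported benchmark figures.
# Hopper chip count and time are illustrative assumptions, not published results.

blackwell_chips, blackwell_minutes = 2496, 27
hopper_chips, hopper_minutes = 2496 * 3, 25  # assumed: >3x the chips, faster time

# Same training job either way, so per-chip speed is proportional to
# 1 / (number of chips * time taken).
blackwell_per_chip = 1 / (blackwell_chips * blackwell_minutes)
hopper_per_chip = 1 / (hopper_chips * hopper_minutes)

speedup = blackwell_per_chip / hopper_per_chip
print(f"per-chip speedup: {speedup:.2f}x")  # roughly 2.8x under these assumptions
```

Any plausible Hopper figures satisfying the article's description give a per-chip ratio above 2, matching the "more than twice as fast" characterisation.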
In a press conference, Chetan Kapoor, chief product officer for CoreWeave, which collaborated with Nvidia to produce some of the results, said there has been a trend in the AI industry toward stringing together smaller groups of chips into subsystems for separate AI training tasks, rather than creating homogenous groups of 100,000 chips or more. "Using a methodology like that, they're able to continue to accelerate or reduce the time to train some of these crazy, multi-trillion parameter model sizes," Kapoor said.


Daily Record
'AI scientist' discovers that common medication could kill cancer cells
Commonly used non-cancer drugs could help in the treatment of the disease, an 'AI scientist' has discovered. It seems that technology is reaching new heights, as an AI-powered 'scientist' has made a significant discovery. Working alongside human researchers, the AI model GPT-4 (not to be confused with ChatGPT) has suggested that combinations of cheap and safe drugs could also be effective at treating cancer.

The research team, led by the University of Cambridge, used the GPT-4 large language model (LLM) to sift through extensive heaps of scientific literature in order to identify potential new cancer drugs. In research results published in the Journal of the Royal Society Interface, it was found that drugs for conditions such as high cholesterol and alcohol dependence could potentially kill cancer cells.

The researchers asked GPT-4 to identify potential new drug combinations that could have an impact on a type of breast cancer cell commonly used in medical research. They instructed the 'AI scientist' to avoid standard cancer drugs and identify medications that would attack cancer cells without harming healthy cells. They also prompted the AI model to prioritise drugs that were affordable and approved by regulators.

When GPT-4 had made its suggestions, the chosen drugs were tested by human scientists to measure their effectiveness against breast cancer cells. Three of the 12 drug combinations suggested by GPT-4 worked better than current breast cancer drugs. The AI model then learned from these tests and suggested a further four combinations, three of which also showed promising results.

Simvastatin (commonly used to lower cholesterol) and disulfiram (used in alcohol dependence) stood out against breast cancer cells. While these drugs are not traditionally associated with cancer care, they could be used as potential cancer treatments, although they would first have to go through extensive clinical trials.
The researchers have emphasised that AI is not a replacement for scientists, but that supervised AI researchers have the potential to accelerate discovery in areas like cancer research. Models like GPT-4 have been known to return results that aren't true. But in scientific research, these incorrect suggestions, which are known as hallucinations, can still lead to new ideas that are worth testing.

'Supervised LLMs offer a scalable, imaginative layer of scientific exploration, and can help us as human scientists explore new paths that we hadn't thought of before,' said Professor Ross King from Cambridge's Department of Chemical Engineering and Biotechnology, who led the research.

'This is not automation replacing scientists, but a new kind of collaboration,' added co-author Dr Hector Zenil from King's College London. 'Guided by expert prompts and experimental feedback, the AI functioned like a tireless research partner, rapidly navigating an immense hypothesis space and proposing ideas that would take humans alone far longer to reach. This study demonstrates how AI can be woven directly into the iterative loop of scientific discovery, enabling adaptive, data-informed hypothesis generation and validation in real time.'


Reuters
Most emerging market currencies set to hold on to gains
BENGALURU/JOHANNESBURG, June 4 (Reuters) - Most emerging market currencies will hold the gains they have made this year, or extend them against a retreating dollar, in the next six months as traders ditch the U.S. exceptionalism trade that fuelled the greenback's dream run, a Reuters poll of FX strategists found.

At the start of the year, emerging market currencies looked set for a rough ride on expectations of U.S. economic strength, delayed Federal Reserve interest rate cuts and trade tensions. But they have since defied expectations as U.S. President Donald Trump's broader-than-expected but erratically implemented tariffs, together with a deteriorating fiscal outlook, have sparked a flight from the dollar and U.S. assets.

That is expected to continue, with more than half the currencies polled forecast to trade in tight ranges or gain, while the rest were expected to give back only a small portion of this year's strong gains, according to a May 30-June 4 poll of more than 50 foreign exchange strategists.

"The path of least resistance is a mildly weaker dollar at the moment," said Christopher Turner, head of FX strategy at ING. "We think (the decline) will be sort of modest and gradual and that should keep the mindset for investors to buy EM currencies on dips and that's kind of what we're seeing at the moment."

Separately, the dollar has become a preferred funding currency as Trump's trade war fuels recession fears and outflows from U.S. assets. The EM carry trade - borrowing in low-yielding currencies to invest in higher-yielding EM ones - has long attracted investors chasing returns. High-yielders like the South African rand and Brazilian real are up around 6.0% and 10.0% respectively this year. The real was predicted to lose only about 2.0%, while the rand is likely to trade in a tight range over the next six months.
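The carry-trade mechanics mentioned above reduce to simple arithmetic: the investor earns the yield gap between the two currencies, plus or minus the currency move. A minimal sketch, where every rate and FX move is an assumed example value, not market data:

```python
# Hypothetical illustration of the EM carry-trade arithmetic described above.
# All numbers are assumed example values, not actual market rates.

funding_rate = 0.004   # assumed 0.4% annual cost to borrow a low-yield currency
em_yield = 0.105       # assumed 10.5% annual yield on the EM investment
fx_move = -0.02        # assumed 2% depreciation of the EM currency

# Approximate one-year return: yield earned, minus funding cost,
# plus the currency move (negative if the EM currency weakens).
carry_return = em_yield - funding_rate + fx_move
print(f"approximate carry return: {carry_return:.1%}")
```

The sketch shows why the trade "has long attracted investors chasing returns" yet carries FX risk: a currency slide larger than the yield gap turns the position negative.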
"I think the trend for emerging market currency outperformance can continue in the second half of this year, but there are downside risks to be wary of as well," said Lee Hardman, senior currency economist at MUFG, referring to trade disruption and the potential hit to global growth.

The Turkish lira, the weakest-performing emerging market currency so far this year, is projected to soften by another 8.0%, from 39 per dollar to 42.8 per dollar, over the next six months.

In Asia, the heavily managed Chinese yuan is expected to stay rangebound despite widespread concerns about weak demand in its economy and a standoff with Washington over tariff policy and export controls. The Indian rupee, Korean won and Thai baht are all expected to gain just under 1% by the end of November, pointing to steady but modest appreciation.

"The big risk we see short-term for emerging market currencies is the risk of a turnaround in dollar sentiment," said Nick Rees, head of Macro Research at Monex Europe. "We do expect longer-term depreciation, but by the same token, we think the dollar looks too cheap on a fundamentals basis right now."

(Other stories from the June Reuters foreign exchange poll)