
Latest news with #DeepSeek

DeepSeek Pushes Out V3.1 Update as Nvidia Dominates AI Hardware

Yahoo

10 hours ago

  • Business


China's DeepSeek just pushed out an update to its flagship AI model, showing it's not backing down in the global AI race. The new version, V3.1, comes with a longer context window, essentially letting the model remember and process more information in a single query. The company made the announcement on its official WeChat account Tuesday. It's a timely move. Back in January, DeepSeek's R1 model shook U.S. markets, catching traders off guard and underscoring China's ambition to build its own AI champions. But while U.S. rivals like Nvidia (NASDAQ:NVDA) keep surging ahead, powering everything from ChatGPT to cloud supercomputers, DeepSeek has struggled to get its hands on the high-end processors needed to train bigger, faster models. That's partly why its next big model, R2, remains delayed. Even so, V3.1 shows DeepSeek is still iterating quickly despite supply hurdles. It's a reminder that while Nvidia dominates the hardware side of AI, Chinese firms are racing to prove they can innovate on the software side and keep the pressure on U.S. players. This article first appeared on GuruFocus.

OpenAI CEO Sounds Alarm On China's Next-Gen AI Advances: "I Am Worried"

NDTV

16 hours ago

  • Business


Sam Altman, CEO of OpenAI, has expressed concerns that the United States may be underestimating China's advancements in next-generation artificial intelligence. In a recent media briefing, he highlighted the complexity of the US-China AI race, suggesting it's not just about who's ahead but involves multiple layers like inference capacity, research, and product development. "I'm worried about China," he said. "There's inference capacity, where China probably can build faster. There's research, there's product; a lot of layers to the whole thing. I don't think it'll be as simple as: Is the U.S. or China ahead?" he added, as reported by CNBC. Mr Altman also admitted that China's progress, particularly with open-source models like DeepSeek and Kimi K2, influenced OpenAI's decision to release its open-weight models, gpt-oss-120b and gpt-oss-20b. "It was clear that if we didn't do it, the world was gonna head to be mostly built on Chinese open source models. That was a factor in our decision, for sure. Wasn't the only one, but that loomed large," the CEO revealed. Notably, these text-only models are designed to be lower-cost options, allowing developers, researchers, and companies to download, run locally, and customise them. The larger model, gpt-oss-120b, has 117 billion parameters and can run on a single 80GB GPU, matching or exceeding the performance of OpenAI's o4-mini model on key benchmarks. The smaller model, gpt-oss-20b, has 21 billion parameters and can operate on devices with as little as 16GB of RAM, making it accessible for developers with limited hardware resources. During the briefing, Mr Altman also questioned the effectiveness of US export controls on semiconductors, noting that China could find workarounds, such as building its own chip fabrication facilities. "My instinct is that doesn't work. You can export-control one thing, but maybe not the right thing… maybe people build fabs or find other workarounds," he said. "I'd love an easy solution. But my instinct is: That's hard," he added. Mr Altman's comments come as the US government fine-tunes its approach to limiting China's advancements in AI. China's tech giants are pivoting towards self-reliance, investing heavily in domestic semiconductor development. One notable example is Huawei's push into high-end AI chips, particularly the Ascend 910C. The chip is designed to match the performance of Nvidia's flagship H100 and is poised to fill the gap left by US export restrictions. Industry experts warn that these export controls may ultimately harm US companies more than China, driving innovation in China's semiconductor sector while limiting US firms' access to the lucrative Chinese market.

Sam Altman admits Chinese AI pushed OpenAI to release its own open-weight models

India Today

17 hours ago

  • Business


OpenAI made a big move earlier this month. Ahead of GPT-5's release on August 7, the company released two open-weight models, the first time the AI startup had launched open-weight models since GPT-2 in 2019. OpenAI CEO Sam Altman has now accepted that Chinese rivals played a key role in this decision.

The two open-weight models launched were gpt-oss-120b and gpt-oss-20b. The former is a large model meant to work with data systems and high-end laptops. On the other hand, gpt-oss-20b can work on most laptops, desktops, and even on phones with relatively modest hardware, per OpenAI. These models can be run locally and customised by researchers, developers, and companies.

Earlier this year, DeepSeek shook the AI space with its R1 model. Not only did R1 match the likes of ChatGPT and Gemini, it was also open-source, giving users much more freedom compared to closed AI models like ChatGPT. This sparked fears within OpenAI. Sam Altman claimed that releasing open-weight models became a necessity to avoid Chinese dominance. He told CNBC, 'It was clear that if we didn't do it, the world was gonna head to be mostly built on Chinese open-source models.' Altman accepted that this was a serious reason behind the release. However, he clarified that it was not the only one.

What open-weight really means: An open-weight model like gpt-oss is not the same as an open-source model like DeepSeek R1. In an open-weight model, users are given access to 'weights'. Weights refer to the characteristics or elements that are used to train a Large Language Model (LLM). For certain queries, the AI gives more weight to certain words or sequences. An open-weight model gives users access to these weights, so developers can see them and how they're used in the AI's outputs. However, the way the AI was trained and the information used to train the model remain restricted. Thus, the code and the data used to train gpt-oss are not available to the public.

OpenAI's open-weight models can be crucial. Developers can not only understand the weights of the models, but use them locally or add them to pre-existing programs. This will help negate dependence on Chinese open-source models, strengthening the US position. The move comes at a time when the US government is concerned about China's rise in the AI race. The Trump administration has even put stricter restrictions on advanced chip sales to Beijing.

Is AI Market Forming a Bubble? Nasdaq-100 ETF in Focus

Yahoo

19 hours ago

  • Business


OpenAI CEO Sam Altman has recently suggested that the artificial intelligence (AI) industry may be in a bubble, as quoted on CNBC. He explained that while AI represents one of the most significant technological shifts in decades, the excitement around it has led to overinflated expectations from investors. Altman likened the current environment to the dot-com boom of the late 1990s, which was hit hard when many Internet companies failed to turn the euphoria into profits.

Warnings From Industry Leaders

Altman's remarks mirror concerns raised by other influential figures in business and finance. Alibaba co-founder Joe Tsai, Bridgewater Associates founder Ray Dalio, and Apollo Global Management's chief economist Torsten Slok have all cautioned that AI valuations may be overheating. Slok has even argued that the present AI surge could be more inflated than the Internet bubble, pointing out that today's most valuable companies in the S&P 500 are more stretched in valuation than they were during the 1990s. Note that between March 2000 and October 2002, the Nasdaq lost nearly 80% of its value.

Analysts Divided on the Bubble Narrative

Not all analysts think that the entire AI market has entered bubble territory. Some experts argue that the fundamentals of AI and semiconductor supply chains remain strong and that the long-term growth prospects justify continued investment. However, many fear that capital is being invested in companies with weaker fundamentals, which may cause problems later on.

Rising Competition From Low-Cost Chinese Peers

The concerns intensified earlier this year when the 'Magnificent Seven' fell from grace due to factors such as new, lower-cost AI entrants (e.g., DeepSeek) and questions about individual companies' ability to handle broader macro uncertainty. DeepSeek, a Chinese startup developing AI models, revealed in late January that training its R1 model cost just $5.6 million, significantly less than the $100 million required to train OpenAI's GPT-4 model. Alibaba (BABA), meanwhile, introduced the QwQ-32B model, an AI system that rivals DeepSeek but requires only a fraction of the data. Such advancements triggered doubts about whether the huge capital investments deployed by U.S. tech majors to develop AI technologies will generate the expected returns at all.

Rocky Journey of ChatGPT-Fame OpenAI

Although the credibility of these claims has been questioned, the development has raised questions about whether current spending levels in AI are sustainable. Despite OpenAI's huge success and its annual recurring revenue projected to top $20 billion this year, the company remains unprofitable. The rollout of its latest GPT-5 model has also been anything but smooth, with some users finding it less intuitive than expected.

Nasdaq-100 ETF in Focus

Most AI biggies have exposure to the Nasdaq-100-based exchange-traded fund (ETF) Invesco QQQ Trust, Series 1 (QQQ). The P/E ratio of QQQ stands at 59.27X, against a 10-year range of 19.7X to 59.46X and a 10-year median of 25.8X, which underscores the overvaluation concerns associated with QQQ. However, the price-to-book (P/B) ratio of QQQ is currently 3.6X, the lowest value in its 10-year range; the 10-year median P/B is 6.03X. Moreover, with the Fed likely to cut rates in the coming months amid a weakening labor market, the growth stocks in QQQ should see some tailwinds. Hence, a sudden crash of the AI euphoria (if there is any) may not hurt QQQ that hard. Still, investors should be mindful of relentless AI investing going forward; their portfolios may need diversification at the current juncture. Note that the annualized return of QQQ is 18.78% over the past 10 years and 21.23% over the past three years (due to the AI rally).
This article was originally published on Zacks Investment Research.

China's DeepSeek Releases V3.1, Boosting AI Model's Capabilities

Bloomberg

19 hours ago



DeepSeek announced what appeared to be an update to its older V3 artificial intelligence model on Tuesday, declaring an enhanced version ready for testing. V3.1 has a longer context window, according to a DeepSeek post to its official WeChat group, meaning it can consider a larger amount of information for any given query. That could allow it to maintain longer conversations with better recall, for example. The Hangzhou-based startup didn't offer much more detail on the update and hasn't posted documentation to major platforms, including Hugging Face.
