
Latest news with #Saimemory

Intel, SoftBank Form Venture for AI-Focused DRAM Chips in Japan

Yahoo · Business · 5 days ago

June 2 - SoftBank Group (SFTBY) and Intel (NASDAQ:INTC) are partnering to launch a new memory chip company called Saimemory, aimed at developing next-generation DRAM for artificial intelligence data centers in Japan. The venture will focus on stacked DRAM technology, which is expected to enhance AI computing performance while reducing energy consumption. A prototype is targeted within two years, with commercial rollout anticipated in the latter half of the decade.

SoftBank will invest 3 billion yen in the 10 billion yen project. Additional backing may come from Japan's Riken research institute and chip substrate maker Shinko Electric Industries. Intel will contribute engineering support and industry know-how, while patents from the University of Tokyo and other institutions will help accelerate development.

Saimemory will handle design and intellectual property while outsourcing production, a move that may support Japan's efforts to regain a foothold in the global chip race. The companies say the chips are being designed to meet rising demand for energy-efficient AI systems across Japanese data centers. This article first appeared on GuruFocus.

Intel and SoftBank collaborate on power-efficient HBM substitute for AI data centers, says report

Yahoo · Business · 5 days ago

American chip giant Intel has partnered with Japanese tech and investment powerhouse SoftBank to build a stacked DRAM substitute for HBM. According to Nikkei Asia, the two industry behemoths have set up Saimemory to build a prototype based on Intel technology and patents from Japanese academia, including the University of Tokyo. The company is targeting a completed prototype and a mass-production viability assessment by 2027, with an end goal of commercialization before the end of the decade.

Most AI processors use HBM, or high-bandwidth memory, chips, which are well suited to temporarily storing the massive amounts of data that AI GPUs process. However, these ICs are complex to manufacture and relatively expensive; they also run hot and draw comparatively more power. The partnership aims to solve this by stacking DRAM chips and then finding a way to wire them more efficiently, with the goal of halving the stacked DRAM chip's power consumption versus a comparable HBM chip. If the effort succeeds, SoftBank says it wants priority access to the supply of these chips.

At the moment, only three companies produce the latest HBM chips: Samsung, SK hynix, and Micron. Insatiable demand for AI chips means HBM supply can be hard to come by, so Saimemory aims to corner the market with its substitute, at least for Japanese data centers. This would also mark the first time in over 20 years that Japan has aimed to become a major memory chip supplier. Japanese firms dominated the market in the 1980s, manufacturing about 70% of the global supply, but the rise of South Korean and Taiwanese competitors pushed many Japanese memory chip makers out of the market.

This is not the first time a semiconductor company has experimented with 3D stacked DRAM. Samsung announced plans for 3D and stacked DRAM as early as last year, while another company, NEO Semiconductor, is working on 3D X-DRAM. However, those efforts focus on enlarging the capacity of each chip, with memory modules targeted to reach 512GB. Saimemory, by contrast, is aiming for reduced power consumption, something data centers sorely need as AI power consumption increases annually.
