Latest news with #AIaccelerators


Globe and Mail
18 hours ago
- Business
- Globe and Mail
Prediction: 1 Artificial Intelligence (AI) Stock That Could Join the Trillion-Dollar Club
Key Points
- AMD stock has to double less than twice to reach $1 trillion.
- Its growing success with AI accelerators could make it a stronger competitor in that market.
10 stocks we like better than Advanced Micro Devices ›

Advanced Micro Devices (NASDAQ: AMD) has evolved into a semiconductor powerhouse in recent years. Under the leadership of Lisa Su, it overtook longtime rival Intel in the PC market. Although Nvidia's success in the artificial intelligence (AI) accelerator market initially took AMD by surprise, AMD's efforts to catch up have made it an increasingly important company in that market. Those efforts have also made AMD a prime candidate to join the 10 companies that now have a market cap above $1 trillion. Here's how it can reach that milestone, and why the path might be easier than many investors assume.

Where AMD stands now

At first glance, AMD might appear far from that milestone, since its $280 billion market cap puts it only 28% of the way toward the goal. However, that is closer to $1 trillion than it might seem: at the current market cap, the stock has to double in value less than two times to get there. Moreover, a simple increase in popularity could close the gap. Although its price-to-earnings (P/E) ratio of 99 might make it appear pricey, it currently sells at a forward P/E ratio of 44. Thus, if it achieves some of the popularity that has boosted Palantir, a stock that sells at 623 times its earnings, multiple expansion alone could take it there.

Reaching $1 trillion through business growth

More importantly, AMD is in a strong position to reach a $1 trillion market cap even if such hype does not materialize. The company's data center segment, which designs AI accelerators, generated just over $6.9 billion in revenue in the first half of 2025, around 46% of AMD's total. In comparison, Nvidia's data center segment made up 89% of that company's revenue in its most recent quarter. Admittedly, AMD is significantly behind Nvidia in the AI accelerator market, and while AMD's MI350 chip has generated some interest due to its lower cost, it is hardly a threat to Nvidia's dominance. However, AMD plans to release the MI400 next year. Paired with AMD's upcoming Helios rack-scale solution, some analysts believe it can become a competitive threat to Nvidia's upcoming Vera Rubin platform. Nvidia's CUDA software, which has previously cemented its dominance, also faces increased competitive threats. Such conditions could mean AMD is on the way to becoming a full-fledged competitor in the AI market.

Additionally, Grand View Research forecasts a compound annual growth rate (CAGR) of 29% for the AI accelerator market through 2030, taking its size to an estimated $323 billion. If that prediction comes to pass, AMD will almost certainly benefit from that industry growth. Even if data center revenue becomes AMD's dominant revenue source, investors should not forget about the client, embedded, and gaming segments. Fortune Business Insights forecasts a CAGR of 15% for the semiconductor industry through 2032, which seems to affirm Grand View's findings, and a semiconductor market rising above $2 trillion would present AMD with a massive tailwind. Finally, as conditions stand now, Nvidia has reached a market cap of just under $4.5 trillion, making AMD approximately 6% of its size. Hence, even if AMD grew to slightly less than one-fourth of Nvidia's size, its market cap would reach $1 trillion or higher.
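As a quick back-of-the-envelope check of the doubling math above, here is a minimal sketch using the article's approximate figures (roughly $280 billion for AMD and just under $4.5 trillion for Nvidia), not live market data:

```python
import math

# Approximate figures cited in the article (not live market data).
amd_market_cap = 280e9      # AMD market cap, ~$280 billion
nvidia_market_cap = 4.5e12  # Nvidia market cap, just under $4.5 trillion
target = 1e12               # the $1 trillion milestone

# How many doublings does AMD need to reach $1 trillion?
doublings_needed = math.log2(target / amd_market_cap)
print(f"Doublings needed: {doublings_needed:.2f}")            # ~1.84, i.e. less than two

# What share of the milestone is AMD's current valuation?
print(f"Progress toward $1T: {amd_market_cap / target:.0%}")  # ~28%

# What fraction of Nvidia's size would a $1 trillion AMD be?
print(f"Required fraction of Nvidia: {target / nvidia_market_cap:.0%}")  # ~22%, under one-fourth
```

The numbers confirm the article's framing: fewer than two doublings, and a $1 trillion AMD would still be less than a quarter of Nvidia's current size.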
AMD at $1 trillion (and beyond)

Ultimately, AMD is on track to benefit from numerous catalysts that will likely take its market cap to $1 trillion and beyond. The company is less than two doubles away from that milestone, meaning hype alone could take it there. The growth of the semiconductor industry in general also positions it for massive gains. Additionally, even though all four of AMD's segments will probably contribute to the company's growth, the path to $1 trillion will most likely hinge on the AI accelerator market, particularly the upcoming release of the MI400. Even if that chip falls somewhat short of expectations, investors should remember that AMD can reach $1 trillion while growing to less than one-fourth of Nvidia's size. Such conditions make reaching a $1 trillion market cap easier than most investors likely assume.

Should you invest $1,000 in Advanced Micro Devices right now? Before you buy stock in Advanced Micro Devices, consider this: The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Advanced Micro Devices wasn't one of them. The 10 stocks that made the cut could produce monster returns in the coming years. Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you'd have $660,783!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you'd have $1,122,682!* Now, it's worth noting Stock Advisor's total average return is 1,069% — a market-crushing outperformance compared to 184% for the S&P 500. Don't miss out on the latest top 10 list, available when you join Stock Advisor. See the 10 stocks »
*Stock Advisor returns as of August 13, 2025

Will Healy has positions in Advanced Micro Devices and Intel. The Motley Fool has positions in and recommends Advanced Micro Devices, Intel, Nvidia, and Palantir Technologies. The Motley Fool recommends the following options: short August 2025 $24 calls on Intel. The Motley Fool has a disclosure policy.
Yahoo
a day ago
- Business
- Yahoo
Nvidia and AMD reportedly sharing 15% of their China GPU revenue in exchange for export licenses — 'unprecedented' export revenue sharing deal may have been struck
Both Nvidia and AMD have reportedly agreed to pay the U.S. government 15% of their China-sourced revenues to unlock export licenses for previously restricted chips. Under the reported terms of the new deal, Nvidia will be able to resume H20 chip sales, and AMD will be able to sell its MI308 accelerators into China. The deal, not officially announced, has been reported independently by a multitude of outlets, including the BBC, FT, and Reuters. We also followed up with Nvidia for more details.

The restrictions on exports of such potent AI accelerators were originally put in place due to national security concerns, but the 15% levy marks a significant shift in that strategy. The revenue-share deal would represent a landmark foray into uncharted territory. Several news agencies, such as the BBC and FT, point out that a deal where commercial entities pay a revenue share in exchange for government export license approval is unprecedented. Furthermore, the FT asserts that the U.S. Commerce Department already started issuing H20 licenses last Friday. That would be just two days after Jensen Huang last met in person with President Trump. It is understood that AMD's licenses were also being inked ahead of the weekend. At the time of writing, we do not have any statement from AMD about the new 15% revenue-sharing deal.

Nvidia statement to Tom's Hardware

Nvidia was quick to issue an official statement. This morning, an Nvidia spokesperson told Tom's Hardware, 'We follow rules the U.S. government sets for our participation in worldwide markets.' Providing some context to the newly announced export license approval process, they added, 'While we haven't shipped H20 to China for months, we hope export control rules will let America compete in China and worldwide.' The statement also repeated arguments in favor of relaxed cutting-edge technology export controls we have seen previously: 'America cannot repeat 5G and lose telecommunication leadership. America's AI tech stack can be the world's standard if we race,' the short statement concluded.

For Nvidia and AMD, this policy shift has perhaps come at a pivotal time. There have been signs of China-based competitors raising their competitiveness on several fronts. Only yesterday, we reported on Chinese state media molding public opinion by characterizing Nvidia H20 GPUs as 'neither environmentally friendly, nor advanced, nor safe.' The safety concerns seem to be a tit-for-tat response to Western powers, who cite similar concerns regarding Chinese-sourced semiconductors and electronics. Nvidia has strenuously denied the existence of any kill switches, back doors, or spyware in its GPUs.

How much extra money will go to the U.S. Treasury?

Reuters provided some interesting context to the new 15% deal between Nvidia, AMD, and the U.S. government. According to its latest financials, Nvidia raked in $17 billion in revenue from China during its latest financial year. AMD's China business scored $6.2 billion in revenue in 2024. With the Green and Red teams' expensive AI chips now available for sale in China, both yearly revenue figures could be expected to increase impressively. However, 15% of the combined $23.2 billion (latest figures) is less than $3.5 billion for the U.S. Treasury. Also, remember that the 15% deal only covers the advanced AI chips that require these export licenses.
In other words, the 15% deal is 'small potatoes' to a country like the U.S., when judged purely in financial terms.
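For a rough sense of scale, here is a minimal sketch of the arithmetic behind that 'less than $3.5 billion' figure, using only the revenue numbers quoted above; the actual levy base would depend on future license-covered sales, which are unknown:

```python
# Rough estimate of the annual payment implied by the reported 15% deal,
# using the figures cited above (Reuters/article numbers, not official terms).
nvidia_china_revenue = 17.0   # $ billions, Nvidia's latest financial year
amd_china_revenue = 6.2       # $ billions, AMD's China revenue in 2024
levy_rate = 0.15              # reported revenue share paid for export licenses

combined = nvidia_china_revenue + amd_china_revenue      # 23.2
implied_payment = levy_rate * combined                   # ~3.48
print(f"Combined China revenue: ${combined:.1f}B")
print(f"Implied 15% payment: ${implied_payment:.2f}B")   # just under $3.5B
```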


Forbes
24-07-2025
- Forbes
Cracking The Code: Navigating The Edge AI Development Life Cycle
Rajesh Subramaniam is Founder and CEO of embedUR systems.

How many intelligent devices are running in your home right now? I bet it's more than you think. The current average is 25 devices per household, and the number is only going up every year. What's more, many of these devices, from fridges to fans, now come equipped with AI accelerators tucked into their chipsets. Whether or not you're aware of it, your thermostat may be learning your habits, and your washing machine may be whispering to the cloud. This quiet evolution marks a new frontier in technology: edge AI. It's the convergence of embedded systems and AI, designed to run efficiently right where the data is generated: on the edge. But getting from an idea to a working AI-enabled product is anything but straightforward. The development process is fragmented, the talent pool is bifurcated and the hodgepodge of available tools was designed for AI development in the cloud, not the edge. I've spent the last two years focused on one central question: How do we make edge AI easier?

Edge AI Development Pain Points

Let's start with the development workflow itself. Building an AI solution for an edge device is a series of deeply interdependent challenges. You start with model discovery: finding a neural network architecture that might solve the problem you're working on. Then comes sourcing and annotating relevant data, fine-tuning the model, validating its accuracy, testing it on real devices, optimizing it for specific chipsets and finally deploying it into production. That's a lot of moving pieces, and that's where engineers get stuck: using the output from one step as the input to the next, hoping the two are compatible, and discovering they mostly are not. A lot of jerry-rigging is needed to string dev pipelines together, because until now there has not been a unified dev environment for edge AI. The challenge is that most developers are forced to stitch this pipeline together from scattered tools. You might use one platform to find a model, a separate one to label data and something entirely different to benchmark your results. There are constant handoffs, and each transition brings the risk of versioning problems, performance degradation or flat-out failure when trying to get a model to run on resource-constrained hardware. On top of that, most embedded engineers aren't AI experts, and most AI experts don't come from embedded systems. Bridging this language and tooling divide is one of the core problems we're trying to solve.

A New Mindset And A New Toolchain

Traditionally, embedded software followed a familiar pattern: Write the code, compile it, test it and ship it. Now, though, you have to fit an AI model into that life cycle. But AI doesn't behave like conventional software. You need to train AI models with a large amount of high-quality data. You also need to make sure they're accurate, secure, upgradeable and able to run efficiently on limited hardware—and they still need to integrate cleanly with the rest of the software stack. What's really needed is a toolset that allows embedded developers to stay in their comfort zone while unlocking the power of AI. Think of it like a sandbox: You identify the type of application you're building and get model recommendations from a curated library. Then the system walks you through fine-tuning, validating and benchmarking the model. It should also help with things like security and upgrade paths.
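To make that workflow concrete, here is a minimal, hypothetical sketch of what a unified edge AI pipeline could look like in code. The `EdgePipeline` class, stage names and values are illustrative assumptions only, not an existing tool or an embedUR product; the point is simply that one artifact flows through every stage described above instead of being handed off between scattered tools.

```python
from dataclasses import dataclass, field
from typing import Callable, List

# Hypothetical, illustrative pipeline: each stage mirrors one step from the
# article (model discovery, data annotation, fine-tuning, validation,
# chipset optimization, deployment). No names refer to a real product.

@dataclass
class EdgePipeline:
    stages: List[Callable[[dict], dict]] = field(default_factory=list)

    def add(self, stage: Callable[[dict], dict]) -> "EdgePipeline":
        self.stages.append(stage)
        return self

    def run(self, artifact: dict) -> dict:
        # Each stage consumes the previous stage's output, so compatibility
        # is checked in one place instead of across disconnected tools.
        for stage in self.stages:
            artifact = stage(artifact)
            print(f"{stage.__name__}: {artifact}")
        return artifact

def discover_model(a: dict) -> dict:
    return {**a, "model": "tiny-cnn"}          # pick a candidate architecture

def annotate_data(a: dict) -> dict:
    return {**a, "dataset": "labeled-v1"}      # source and label training data

def fine_tune(a: dict) -> dict:
    return {**a, "accuracy": 0.91}             # adapt the model to the dataset

def validate(a: dict) -> dict:
    return {**a, "validated": a["accuracy"] > 0.9}

def optimize_for_chipset(a: dict) -> dict:
    return {**a, "format": "int8-quantized"}   # shrink for constrained hardware

def deploy(a: dict) -> dict:
    return {**a, "deployed": a["validated"]}

pipeline = (EdgePipeline()
            .add(discover_model).add(annotate_data).add(fine_tune)
            .add(validate).add(optimize_for_chipset).add(deploy))
pipeline.run({"application": "keyword-spotting"})
```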
This is where I see us heading: tools that abstract the complexity of AI while integrating seamlessly with existing embedded workflows. That means packaging up best-in-class models, simplifying the training process and making on-device validation dead simple.

Standardization And The Path Forward

Our goal is to bring some structure to the edge AI development life cycle. Right now, there are too many tools and frameworks and no common standards for building, testing or deploying AI models in an embedded context. By pushing for standardization, we're trying to make it easier for traditional developers to adopt AI. Once the life cycle is defined and toolchains are aligned, more engineers will feel confident jumping in. Consistency will help build trust and reduce friction in the process. It's hard to overstate the implications of this shift to embedded edge AI. Think about the early days of the internet or the rise of smartphones—we're at that kind of inflection point. The number of embedded clients per household is only going to continue to soar, from smart doorbell cameras that recognize family and friends to voice assistants that control everything from lighting to entertainment with natural commands. That means it's essential to solve the issue of integration. The sheer scale and reach of edge AI applications are staggering, maybe even a little scary, but mostly it's exciting. Because what we're really talking about is democratization. AI was once limited to massive data centers and elite development teams. Now it's finding its way into everyday devices at a price point that's accessible to everyone.

Forbes Technology Council is an invitation-only community for world-class CIOs, CTOs and technology executives. Do I qualify?


Globe and Mail
12-07-2025
- Business
- Globe and Mail
10 Reasons to Buy and Hold This Tech Stock Forever
Key Points
- Taiwan Semiconductor Manufacturing (TSMC) is by far the largest provider of ultra-advanced semiconductors.
- Management estimates revenue from artificial intelligence (AI) accelerators will grow in the mid-40% range until 2029.
- The company has steadily increased its capital investments to account for growing demand.
10 stocks we like better than Taiwan Semiconductor Manufacturing ›

Semiconductor (chip) manufacturing giant Taiwan Semiconductor Manufacturing (NYSE: TSM) recently joined the elite trillion-dollar club, becoming one of only 10 companies with a market cap of over $1 trillion (as of July 8). The company, also known as TSMC, has experienced a lot of growth in recent years, and its momentum is still going strong. In fact, it's one of my favorite stocks right now, and I plan to hold it for the long haul. Here are 10 reasons why.

1. TSMC is the undisputed market leader

When it comes to semiconductor manufacturing, there's TSMC, and there's everyone else. TSMC has around a 70% share of the semiconductor foundry market, far exceeding its closest competitors. There's no clear path for any competitor to get close to TSMC's market share anytime in the foreseeable future.

2. The tech world relies heavily on TSMC

TSMC doesn't sell products directly to consumers, but its chips are found in many of the electronics they use daily. TSMC's customers include Apple (smartphones, tablets, etc.), Nvidia (GPUs), Tesla (self-driving technology), AMD (CPUs), and dozens of other tech heavyweights.

3. TSMC has shown strong financial performance

In the first quarter (Q1), TSMC's revenue was $25.5 billion, up 35% year over year (YOY). Its net income increased 60% YOY (in local currency), continuing its impressive financial performance over the past five years. TSMC's customers typically sign long-term contracts, helping to keep its revenue predictable as well.

4. Artificial intelligence (AI) chip demand is skyrocketing

TSMC makes most of the high-powered chips essential to the AI ecosystem. Smartphones used to be the largest segment of TSMC's business, but the new AI demand has shifted the landscape. Management estimates revenue from AI accelerators will grow at a compound annual growth rate (CAGR) in the mid-40% range until 2029.

5. The semiconductor industry has a high barrier to entry

Developing a semiconductor manufacturing plant is far from easy, which is why some of the world's richest, most technologically advanced companies have yet to build their own and continue to rely on TSMC. It takes a lot of invested capital, complex technology, and years of process improvements to get to a point where it works efficiently. This helps TSMC keep its competitors at a distance.

6. TSMC is expanding its business beyond Taiwan

One concern with TSMC's business has been the geopolitical tension between Taiwan and China. In light of this risk, the company has begun expanding its operations outside Taiwan. TSMC currently has, or will soon have, manufacturing plants in Taiwan, the U.S., Germany, and Japan.

7. TSMC has a dividend that complements its stock price growth

I wouldn't consider TSMC a dividend stock, but it does offer a dividend that complements its recent share price growth. Its dividend yield is around 1.17% (as of July 8), which is lower than the S&P 500 average. However, its average dividend yield over the past three years is higher than the S&P 500's. The modest dividend can still contribute to your total returns over the long haul.
8. Companies rely almost exclusively on TSMC for advanced chips

Semiconductors are categorized by the manufacturing process node, measured in nanometers (nm) -- such as 7nm, 5nm, 3nm, and the upcoming 2nm. The smaller the node, the more powerful and advanced the semiconductor. TSMC effectively has a monopoly on manufacturing and selling the world's most advanced semiconductors. Other companies cannot match TSMC's efficiency or the scale at which it can build them.

9. TSMC is committed to investing in continued growth

TSMC has consistently made investments to grow its business, but it has stepped this spending up a notch with the rising demand for AI chips. In 2024, TSMC's capital expenditures totaled just over $30 billion. This year, it expects that number to increase to between $38 billion and $42 billion. TSMC noted that the higher capital spending is directly correlated to its growth opportunities, which should be music to investors' ears.

10. TSMC has stood the test of time

When you're investing in a company for the long haul, you want one that has shown it can stand the test of time. Since 1987, TSMC has navigated various economic cycles, the introduction of new technologies, and geopolitical tensions. At each step, it has adjusted and positioned itself for long-term growth, and there's little reason to believe it won't continue to do so.

Should you invest $1,000 in Taiwan Semiconductor Manufacturing right now? Before you buy stock in Taiwan Semiconductor Manufacturing, consider this: The Motley Fool Stock Advisor analyst team just identified what they believe are the 10 best stocks for investors to buy now… and Taiwan Semiconductor Manufacturing wasn't one of them. The 10 stocks that made the cut could produce monster returns in the coming years. Consider when Netflix made this list on December 17, 2004... if you invested $1,000 at the time of our recommendation, you'd have $671,477!* Or when Nvidia made this list on April 15, 2005... if you invested $1,000 at the time of our recommendation, you'd have $1,010,880!* Now, it's worth noting Stock Advisor's total average return is 1,047% — a market-crushing outperformance compared to 180% for the S&P 500. Don't miss out on the latest top 10 list, available when you join Stock Advisor. See the 10 stocks »
*Stock Advisor returns as of July 7, 2025
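For context on what "a mid-40% CAGR until 2029" would imply, here is a minimal compounding sketch; the 45% rate, the 2025 baseline year, and the normalized starting value of 1.0 are assumptions chosen to illustrate the middle of management's stated range, not reported TSMC figures:

```python
# Illustrative compounding of a "mid-40% CAGR until 2029" claim.
# The 45% rate and the normalized 2025 baseline are assumptions for
# illustration only, not reported TSMC numbers.
cagr = 0.45
base_year, end_year = 2025, 2029

revenue_index = 1.0
for year in range(base_year, end_year + 1):
    print(f"{year}: {revenue_index:.2f}x the {base_year} baseline")
    revenue_index *= 1 + cagr

# Roughly (1.45)**4 ≈ 4.4x growth from the 2025 baseline to 2029 at that rate.
```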