
AWS just crushed Vertiv? Stock tanks over 6% after Amazon's bold cooling move
Vertiv Holdings stock dropped more than 6% on Thursday, reeling from a surprise announcement by Amazon Web Services that could directly threaten one of Vertiv's key revenue streams. The drop followed AWS's unveiling of its own custom cooling hardware designed to support Nvidia's next-generation AI chips, technology that overlaps significantly with the liquid cooling solutions Vertiv supplies to data centers, according to an Investing.com report.

Stock Target Still Optimistic For Now
GuruFocus reported that, based on one-year price targets from 20 analysts, the average target price for Vertiv stock is $120.56, with a high estimate of $150.00 and a low estimate of $82.00.

Amazon's Game-Changing Cooling System
The cause of the shakeup is AWS's new In-Row Heat Exchanger (IRHX), a homegrown cooling system built to handle the extreme heat output of Nvidia's Blackwell GPUs. These cutting-edge chips are driving a new wave of AI computing, but their power demands have created a significant thermal challenge for data center operators, according to Investing.com.

Until now, companies like Vertiv have been key players in addressing those needs through advanced liquid cooling systems. But AWS, one of the biggest cloud providers and potentially one of Vertiv's top customers, just announced it is taking matters into its own hands.

Developed in collaboration with Nvidia, the IRHX system went from design to production in just 11 months. It uses a hybrid approach, combining liquid cooling through cold plates mounted directly on GPU chips with air-based components that remove heat through fan-coil arrays, and is expected to deliver high-performance cooling without the floor space or water usage of traditional methods, as reported by Investing.com. AWS vice president Dave Brown noted that standard cooling techniques "would take up too much data center floor space or increase water usage substantially."

Analyst Flags Growth Concerns
Bloomberg Intelligence analyst Mustafa Okur highlighted the potential impact on Vertiv, saying, "Amazon Web Services rolling out its own server liquid-cooling system could weigh on Vertiv's future growth prospects. Around 10% of overall sales come from liquid cooling, we calculate, and AWS may be one of the largest customers," as quoted by the Investing.com report.

A Broader AI Push by AWS
The announcement coincides with AWS rolling out new AI computing instances featuring Nvidia's latest chips, all supported by its custom-built Nitro infrastructure for performance monitoring and networking, according to the Investing.com report.

FAQs
Why did Vertiv stock fall?
Because AWS revealed it built its own cooling tech for AI servers, potentially cutting into Vertiv's business.
What is AWS's new cooling system?
It's called the In-Row Heat Exchanger (IRHX), and it cools high-performance AI chips like Nvidia's Blackwell GPUs, as per the Investing.com report.

Related Articles


Hans India
an hour ago
How to Find the Best GPU for AI?
New Delhi [India], July 16: As artificial intelligence continues to reshape industries, the hunger for high-performance computing resources just keeps growing. And when it comes to powering AI innovation, one of the unsung heroes is the GPU VPS. From training those massive neural networks to running real-time inference that blows your mind, the GPU you choose literally shapes your entire AI pipeline. But let's be real, with so many models, specs, and VPS providers out there, figuring out the "best" GPU for AI can feel a bit tough. So, your first big step? Getting a handle on the technical metrics and architectural advantages of what's on offer.

GPU Architecture
When you're sifting through GPUs for those demanding AI workloads, there are three critical elements you absolutely have to zero in on: tensor cores, CUDA cores, and memory bandwidth. These are the real muscle. Tensor cores, first popping up with NVIDIA's Volta architecture and continuously refined through the Ampere and Hopper generations, are specialized wizards at mixed-precision calculations (think FP16, BF16, INT8). They can dramatically slash your training times, which is a huge win. Then you've got CUDA cores, the general-purpose workhorses that determine how versatile your GPU will be across different frameworks. Memory bandwidth is often overlooked, but it can quickly become a bottleneck when you're training large models, especially with those hungry transformer architectures. For instance, the NVIDIA A100 boasts a whopping 2 TB/s of memory bandwidth.

Here's a quick rundown of some leading GPUs:

GPU Model    | VRAM     | CUDA Cores | Tensor Cores | Memory Bandwidth | Ideal Use Case
NVIDIA A100  | 40-80 GB | 6912       | 432          | 1555 GB/s        | LLM training, multi-GPU setups
RTX 4090     | 24 GB    | 16384      | 512          | 1008 GB/s        | Deep learning, generative AI
RTX 3080     | 10-12 GB | 8704       | 272          | 760 GB/s         | Model prototyping, DL training
Tesla T4     | 16 GB    | 2560       | 320          | 320 GB/s         | Inference, low-power tasks
RTX 3060     | 12 GB    | 3584       | 112          | 360 GB/s         | Entry-level experimentation

Performance Benchmarks and Profiling Your AI Workload
Before committing to a GPU VPS, it's crucial to test models with your specific AI workload. Real-world performance varies wildly based on model complexity and optimization. For example, CNNs for image classification behave differently than transformer-based architectures for natural language processing, so it's like comparing apples and oranges. Forget raw core counts; FLOPS, memory latency, and inference throughput tell the real story. An RTX 4090 might have more CUDA cores than an A100, but its lower FP64 performance makes it less ideal for scientific AI, though it's a beast for generative tasks like GANs. See the difference? Profiling your workload with tools like NVIDIA Nsight or PyTorch's built-in profiler isn't just an option; it's a must-do. It'll pinpoint GPU utilization, highlight bottlenecks, and show how your model scales (a short profiling sketch follows this section).

Deployment Models
Picking the best GPU for AI isn't just about raw power, but also how you deploy it. A GPU VPS offers sweet advantages: remote accessibility, elastic scaling, and less infrastructure overhead. But be smart and evaluate your provider's latency and virtualization overhead. Some GPUs shine in bare-metal configurations, while others excel in virtual environments using NVIDIA GRID and vGPU. For latency-sensitive apps, even slight virtualization overhead can impact performance. Look for PCIe Gen4 support and low I/O contention. Cost-wise, pricing scales with VRAM and GPU generation.
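As a concrete illustration of the profiling step mentioned above, here is a minimal sketch using PyTorch's torch.profiler to time one training-style step and surface the dominant GPU kernels. The toy model, batch size, and names here are placeholders chosen for illustration, not benchmarks or code from the article.

```python
# Minimal sketch: profile one forward/backward pass with PyTorch's built-in
# profiler to see which operations dominate GPU time and memory.
import torch
import torch.nn as nn
from torch.profiler import profile, record_function, ProfilerActivity

device = "cuda" if torch.cuda.is_available() else "cpu"
model = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU(), nn.Linear(4096, 10)).to(device)
data = torch.randn(64, 1024, device=device)
target = torch.randint(0, 10, (64,), device=device)
loss_fn = nn.CrossEntropyLoss()

activities = [ProfilerActivity.CPU]
if device == "cuda":
    activities.append(ProfilerActivity.CUDA)

with profile(activities=activities, record_shapes=True, profile_memory=True) as prof:
    with record_function("train_step"):  # label the whole step in the trace
        loss = loss_fn(model(data), target)
        loss.backward()

# Sort by accumulated GPU (or CPU) time to spot the bottleneck kernels.
sort_key = "cuda_time_total" if device == "cuda" else "cpu_time_total"
print(prof.key_averages().table(sort_by=sort_key, row_limit=10))
```

Running this on two candidate GPUs with your real model, instead of the toy one above, gives a like-for-like view of utilization and memory pressure before you commit to a VPS plan.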
A smart approach is to start with mid-range GPUs like the RTX 3080 for inference, then step up to A100s or H100s for larger model training. It's all about playing it smart!

Fresh GPU Insights
A fascinating Cloudzy blog deep-dive recently showed how developers fine-tune AI by matching project scale with GPU architecture. It highlighted that memory bandwidth and tensor core utilization are often under-optimized due to poor GPU choices. For instance, an AI team saw the inference latency of their language-translation model slashed by 35% by upgrading from a 3060 to a 3080 Ti, with minimal cost increase. This confirms that understanding workload demands beats just grabbing the most expensive GPU. Plus, Cloudzy's infrastructure offers pre-configured environments for TensorFlow, PyTorch, and JAX, meaning faster experimentation and iteration while keeping full control. Pretty neat, right?

Wrapping Up
To truly nail down the best GPU for your AI journey, look past brand names. Dive into architecture, workload requirements, and deployment contexts. Tensor core efficiency, memory bandwidth, and a scalable VPS infrastructure are your secret weapons for accelerating AI innovation without unnecessary costs. By dissecting your workload, benchmarking performance (a simple latency-timing sketch follows below), and picking a GPU VPS that aligns with your strategy, you'll be in the best position to train, deploy, and optimize your AI models in today's competitive landscape. It's a bit of work, but trust me, it pays off big time!
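To make the "benchmark before you buy" advice concrete, here is a minimal latency-measurement sketch in PyTorch. The transformer encoder is a hypothetical stand-in, not the translation model from the Cloudzy example; the point is the warm-up and synchronization pattern, which lets you compare two GPUs (say, a 3060 and a 3080 Ti) on identical code.

```python
# Minimal sketch: measure steady-state inference latency on a GPU.
# Warm-up iterations and explicit synchronization avoid timing kernel
# launches instead of actual execution.
import time
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"
layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)
model = nn.TransformerEncoder(layer, num_layers=6).to(device).eval()
batch = torch.randn(16, 128, 512, device=device)  # (batch, seq_len, d_model)

def measure_latency(n_warmup: int = 10, n_iters: int = 50) -> float:
    """Return average latency in milliseconds per batch."""
    with torch.inference_mode():
        for _ in range(n_warmup):      # warm up kernels and caches
            model(batch)
        if device == "cuda":
            torch.cuda.synchronize()   # don't start timing queued async work
        start = time.perf_counter()
        for _ in range(n_iters):
            model(batch)
        if device == "cuda":
            torch.cuda.synchronize()
        return (time.perf_counter() - start) / n_iters * 1000

print(f"Average inference latency: {measure_latency():.2f} ms per batch")
```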


Economic Times
2 hours ago
Nvidia stock surges 4.47% in pre-market as U.S. clears H20 AI chip sales to China—AMD up 3.18% too as AI trade door reopens
Nvidia stock spikes 4.47% pre-market after the U.S. clears H20 AI chip exports to China, and AMD rises 3.18% too. Nvidia also launches an RTX Pro chip aimed at industrial use. Could this revive billions in revenue and push Nvidia toward $200? Here's what this means for the AI chip market.

Nvidia is back in action with its China sales — and the market is cheering. The chip giant has received a green light from the U.S. government to resume exports of its advanced H20 AI chips to China, according to multiple sources including Reuters and the Financial Times. The announcement triggered a strong market response, with Nvidia's stock price jumping $7.33 (4.47%) in pre-market trading to $171.40. AMD also rode the wave, gaining $4.65 (3.18%) to hit $150.89 pre-market. The news, which signals a potential easing in U.S.–China chip tensions, is drawing serious attention from investors, analysts, and AI industry leaders. TSMC and SMCI also ticked up; TSMC ahead of Q2 earnings, and SMCI thanks to its Nvidia server partnerships.

Nvidia's comeback in China might be just the catalyst it needs to push its stock toward the $200 mark, and here's why the market is reacting so strongly:

- Pre-market jump: Nvidia stock surged 4.47% to $171.40, gaining $7.33 after the U.S. government confirmed it will grant export licenses for the H20 AI chip to Chinese customers.
- Huge revenue rebound: Nvidia had taken a $4.5 billion inventory charge in Q2 after being blocked from selling the H20 in China. Resuming shipments could restore a big portion of those lost sales.
- China's tech giants buying fast: Firms like ByteDance, Tencent, and Alibaba are already lining up with orders on Nvidia's whitelist.
- New RTX Pro GPU release: Alongside the H20 comeback, Nvidia launched a new RTX Pro chip, built on the Blackwell architecture, specifically to meet U.S. export rules for industrial use.
- Broader chip market rally: AMD stock rose 3.18% to $150.89 in pre-market trading, while TSMC and Super Micro Computer also saw gains as investor confidence returned to the AI chip trade.
- Big picture outlook: With geopolitical risk still looming but signs of a thaw in U.S.–China chip relations, Nvidia's reopening of its China pipeline could drive stronger Q3 sales and give bulls a reason to eye $200 as the next target.

This export resumption could mark a major financial turnaround for Nvidia. Earlier this year, the company had to write off $4.5 billion in inventory after the U.S. placed tight restrictions on the sale of high-end AI chips to China. The H20, originally designed to comply with previous export rules, had been held back after additional guidance tightened the rules in late 2024. Now, the Trump administration has reportedly assured Nvidia that it will grant the required licenses to export the H20 chip.
Nvidia is currently filing the paperwork, and shipments could begin in the coming weeks. If shipments resume smoothly, Nvidia could recover a significant portion of the $4.5 billion in lost sales, much of which was tied to demand from Chinese tech firms like ByteDance, Tencent, and Alibaba, who are already lining up orders, according to Reuters.

The reaction in China has been immediate. Major tech companies are scrambling to get on Nvidia's so-called 'whitelist', a vetted list of approved buyers eligible for H20 shipments. According to The Economic Times and ABC News, Nvidia's CEO Jensen Huang is currently in Beijing, reportedly meeting with tech influencers and business leaders to fast-track the process. Chinese firms see the H20 chip as critical for powering AI workloads, including natural language processing, image recognition, and large-scale data modeling. With China's domestic chip alternatives still struggling to catch up, Nvidia remains the top-tier provider, even under restricted sales conditions.

Meanwhile, Nvidia isn't just betting on the H20. It's also introducing a new "RTX Pro" GPU, designed specifically for export-compliant AI tasks in China. Built on its Blackwell architecture, the RTX Pro will target industrial use cases like smart factories and automated logistics, as confirmed by Tech Xplore and El País.

The broader AI chip ecosystem is already reacting. Nvidia's export approval appears to be a rising tide that's lifting multiple boats. AMD, a key competitor in the AI GPU space, saw its shares rise over 3% in pre-market trading. Other beneficiaries include:

- Broadcom
- Taiwan Semiconductor Manufacturing Co. (TSMC)
- Super Micro Computer (SMCI)

TSMC, in particular, is gaining attention ahead of its Q2 earnings report, while SMCI is benefiting from its ongoing Nvidia server partnerships. According to MarketWatch, the export license news is also boosting investor sentiment across the tech and semiconductor sectors, especially as it points to renewed demand from the world's second-largest economy.

Short-Term Projection (Q3 2025): Nvidia's current pre-market price of $171.40, up 4.47%, reflects strong investor optimism following the H20 export greenlight. If shipment licenses are approved within July as expected, analysts foresee Nvidia retesting its previous record highs around $174–$177, possibly even pushing toward the $180 mark before Q3 ends. The chipmaker could also benefit from an influx of backlogged orders from Chinese giants like ByteDance, Alibaba, and Tencent. If early deliveries begin by August, Nvidia may exceed revenue estimates in its next earnings report, pushing the stock further.

Short-Term Risk Factors:
- Any delay in license approvals
- U.S.–China tensions re-escalating
- Shipment volume caps or added compliance checks

Long-Term Projection (2025–2026): Over the next 6 to 12 months, Nvidia's stock has the potential to cross $200, especially if:
- H20 and RTX Pro sales scale in China
- Global AI demand continues rising
- New GPU lines under the Blackwell architecture drive growth

Several Wall Street firms, including those cited by Reuters and the Financial Times, already project Nvidia's FY2025 and FY2026 revenues to grow 20–30% year-over-year, assuming China remains a viable market. The company's $4 trillion market cap could stretch even higher if export channels remain open.
In the broader picture, Nvidia is positioned as the backbone of global AI infrastructure, and with supply chains stabilizing and geopolitical doors slightly ajar, its long-term trajectory looks increasingly bullish. While today's headlines are bullish, there are several uncertainties investors will want to monitor:

- License Timelines: How fast the U.S. Commerce Department processes Nvidia's applications.
- Shipment Caps: Whether the U.S. imposes any volume limits or tighter oversight on exports.
- Demand Surge: With pent-up demand in China, Q3 orders could surge, affecting pricing and the supply chain.
- Geopolitical Risks: Any new tensions between the U.S. and China could put future shipments at risk.

This development might also play a role in 2025's broader chip investment strategies, especially with growing global competition from firms in South Korea, Japan, and the EU. The decision to allow H20 shipments isn't just a win for Nvidia — it could reshape the entire AI chip supply chain in 2025 and beyond.

For now, Nvidia appears poised to regain its footing in China, a crucial market it can't afford to lose. As Business Insider and the Wall Street Journal noted, this moment may signal a strategic thaw in U.S.–China AI cooperation, even if only temporary. At the same time, it shows how critical Nvidia's hardware has become to the global AI race. From a revenue standpoint, recovering even half of the blocked $4.5 billion would significantly boost Nvidia's FY2025 performance. Investors will also be watching how RTX Pro sales unfold in China's industrial automation sector, another emerging growth area. If all goes as planned, Nvidia's next earnings report could show a marked improvement in both top-line revenue and investor confidence, restoring momentum that had been dampened by geopolitical roadblocks.

Q1: Why is Nvidia stock up in pre-market trading?
Nvidia shares rose after the U.S. cleared H20 AI chip sales to China.

Q2: How did AMD stock react to Nvidia's China update?
AMD stock gained 3.18% pre-market due to improved AI chip trade outlook.


Mint
3 hours ago
TSMC quarterly profit seen hitting record but Trump tariffs, forex a concern
- Analysts expect a 52% surge in second-quarter profit
- TSMC benefiting from surge towards AI
- Earnings call at 0600 GMT

TAIPEI - TSMC, the world's main producer of advanced AI chips, is expected to post a 52% jump in second-quarter profit to record levels on Thursday, though U.S. tariffs and a strong Taiwan dollar could weigh on its outlook.

Taiwan Semiconductor Manufacturing Co, the world's largest contract chipmaker and a key supplier to Nvidia and Apple, is forecast to report net profit of T$377.4 billion for the three months through June 30, according to an LSEG SmartEstimate compiled from 21 analysts. SmartEstimates place greater weight on forecasts from analysts who are more consistently accurate.

The company will report the headline profit figure at 0530 GMT, which will be followed by an earnings call from 0600 GMT that will include third-quarter guidance. TSMC has already flagged a rise in second-quarter revenue of 38.6%. Any profit result above T$374.68 billion would mark the company's highest-ever quarterly net income and its sixth consecutive quarter of profit growth.

It remains unclear just how much U.S. President Donald Trump's tariffs will affect TSMC. Taiwan was threatened with a 32% reciprocal tariff rate in April but has yet to be notified of an updated figure that some countries have received. Trump also said this month that tariffs on semiconductors are likely to come soon. The company said in June that U.S. tariffs were having some indirect impact, noting they can lead to slightly higher prices, which may in turn weigh on demand. In March, TSMC announced a $100 billion investment in the U.S. alongside Trump at the White House, on top of $65 billion pledged for three Arizona plants, two of which have been built.

Another key issue is the Taiwan dollar's 12% appreciation against the greenback so far this year. TSMC has said a 1% appreciation in the Taiwan dollar typically reduces its gross margin by 0.4 percentage points. In June, the company said that strengthening in the Taiwan dollar had shaved more than 3 percentage points off its gross margin.

Shares in TSMC surged some 80% last year but have climbed just 5% for the year to date on worries about tariffs and unfavourable currency exchange rates.

This article was generated from an automated news agency feed without modifications to text.
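As a rough illustration of the currency sensitivity described above, here is a back-of-envelope sketch in Python. It assumes TSMC's stated rule of thumb (about 0.4 percentage points of gross margin per 1% of Taiwan dollar appreciation) scales roughly linearly, which is an assumption for illustration rather than company guidance.

```python
# Back-of-envelope sketch: estimate the gross-margin impact of Taiwan dollar
# appreciation using TSMC's stated sensitivity of ~0.4 percentage points per
# 1% currency move. Illustrative only; assumes the relationship is linear.
def gross_margin_hit(appreciation_pct: float, sensitivity_pp_per_pct: float = 0.4) -> float:
    """Return the estimated gross-margin reduction in percentage points."""
    return appreciation_pct * sensitivity_pp_per_pct

# The TWD is up ~12% year to date; the smaller in-quarter move was what TSMC
# said shaved "more than 3 percentage points" off its gross margin.
for move in (3.0, 8.0, 12.0):
    print(f"{move:.0f}% TWD appreciation -> ~{gross_margin_hit(move):.1f} pp off gross margin")
```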