Intel's next batch of CPUs might still be called Core Ultra 200S, possibly because the Arrow Lake refresh won't be much of a boost

Yahoo | 21 May 2025

When you buy through links on our articles, Future and its syndication partners may earn a commission.
Intel has a long history of refreshing its current CPU architecture rather than wholesale replacing it. The updates typically offer higher clock speeds and occasionally a new feature or two, but they've always come with a new name for the processors. For the Arrow Lake refresh, however, it's being claimed that the tweaked chips will still be called Core Ultra 200S.
Admittedly, it's just the one person making the claim, X-user Jaykin (via Videocardz), but they have a good track record when it comes to tech leaks and rumours. Jaykin also suggests that the Arrow Lake refresh will involve an updated NPU (neural processing unit), courtesy of a larger SoC tile.
Both desktop and mobile variants of Arrow Lake (ARL-S and ARL-HX, respectively) will still have the same package size and, in the case of the former, will still use the LGA1851 socket. While that doesn't automatically mean your current Z890 or B860 motherboard will support the refreshed chips, there's a good chance it will.
In previous years, whenever Intel launched a refreshed CPU line-up, it always renamed the chips. So why would Team Blue stick with Core Ultra 200S? I suspect it's because there will be no changes that affect fundamental performance, such as higher clock speeds, more cores, or a faster memory system.
If the only thing that's being improved is the NPU (possibly to ensure that Intel's chips now meet Microsoft's requirements for its Copilot AI PC ecosystem), then the new processors would perform no better than the current ones do in games, content creation, and general use.
Then again, the recently launched Core 200S Boost feature showed there is some scope for improving the performance of Arrow Lake without having to change various internal components. In other words, Intel could tweak all of the internal clocks and timings, plus add a larger NPU, and have a big enough difference in performance to warrant calling them Core Ultra 300S.
After all, it was happy to refresh its Raptor Lake-powered Core 13th Gen series, call it Core 14th Gen, and barely change anything other than the boost clocks and power consumption.
I suspect that Intel wants to keep Core Ultra 300S for its next generation of CPU architectures, namely Panther Lake for mobile platforms and Nova Lake for desktops, both targeted for release in 2026.
Your next upgrade
Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.
These are expected to be significantly better in performance than the current Lunar Lake and Arrow Lake processors, and given the lacklustre reception the latter received, it makes sense to keep the new name for something that's (hopefully) a lot better.
One concern I have is whether the Arrow Lake refresh will be the last processor line to support Intel's LGA1851 socket. If it is, and the refreshed chips aren't any better in gaming, for example, that would be really disappointing.
AMD was still releasing CPUs for its AM4 socket last year, a good eight years after it first appeared. While Intel is well known for changing its sockets every couple of processor generations, you'd think it would take a leaf from Team Red's book and make something that lasts. Of course, changing the socket forces OEMs and system builders to push out entirely new platforms, which in turn helps Intel shift a whole heap of processors.
Hopefully, the Arrow Lake refresh does offer more than just a larger NPU because, as things currently stand, I wouldn't recommend that any PC gamer buy a Core Ultra 200S chip. It's not a bad chip, but regardless of your budget or use case, there are far better CPUs out there to choose from.


Related Articles

Data centers are at the heart of the AI revolution and here's how they are changing

Yahoo | 8 hours ago

When you buy through links on our articles, Future and its syndication partners may earn a commission.

As demand for AI and cloud computing soars, pundits are suggesting that the world is teetering on the edge of a potential data center crunch, where capacity can't keep up with the digital load. Concerns and hype have led to plummeting vacancy rates: in Northern Virginia, the world's largest data center market, for example, vacancy rates have fallen below 1%. Echoing past fears of "peak oil" and "peak food," the spotlight now turns to "peak data." But rather than stall, the industry is evolving, adopting modular builds, renewable energy, and AI-optimized systems to redefine how tomorrow's data centers will power an increasingly digital world.

Future data centers will increasingly move away from massive centralized facilities alone, embracing smaller, modular, and edge-based designs. The sector is already splitting into hyperscale data centers at one end and smaller, edge-oriented facilities at the other. These smaller, modular, and edge data centers can be built in a few months and tend to be located closer to end users to reduce latency. Unlike the huge hyperscale campuses, with facilities often covering millions of square feet, they are sometimes built into repurposed buildings such as abandoned shopping malls, empty office towers, and disused factories, helping to regenerate ex-industrial brownfield areas. Leaner centers can be rapidly deployed, located closer to end users for reduced latency, and tailored to specific workloads such as autonomous vehicles and AR.

To address energy demands and grid constraints, future data centers will increasingly be co-located with power generation facilities, such as nuclear or renewable plants. This reduces reliance on strained grid infrastructure and improves energy stability. Some companies are investing in nuclear power, which provides massive, always-on capacity that is also free of carbon emissions; modular reactors are being considered to overcome grid bottlenecks, long wait times for power delivery, and local utility limits. Similarly, data centers will increasingly be built in areas where the climate reduces operational strain. Lower cooling costs and access to water enable the use of energy-efficient liquid-cooling systems instead of air-cooling, so we will be seeing more data centers pop up in places like Scandinavia and the Pacific Northwest.

Artificial intelligence will play a major role in managing and optimizing data center operations, particularly for cooling and energy use. For instance, reinforcement learning algorithms are being used to cut energy use by optimizing cooling systems, achieving up to 21% energy savings. Similarly, fixes like replacing legacy servers with more energy-efficient machines, with newer chips or better thermal design, can significantly expand compute capacity without requiring new premises. Instead of only building new facilities, future capacity will be expanded by refreshing hardware with newer, denser, and more energy-efficient servers. This allows for more compute power in the same footprint, enabling quick scaling to meet surges in demand, particularly for AI workloads.

These power-hungry centers are also putting a strain on electricity grids, so future data centers will leverage new solutions such as load shifting to optimize energy efficiency. Google is already partnering with PJM Interconnection, the largest electrical grid operator in North America, to use AI to automate tasks such as viability assessments of connection applications, thus enhancing grid efficiency. Issues are typically due not to a lack of energy but to insufficient transmission capacity. Fortunately, data centers usually run well below full capacity specifically to accommodate future growth, and that headroom will prove useful as facilities absorb unexpected traffic spikes and rapid scaling needs without requiring new construction.

Future data center locations will be chosen based on climate efficiency, grid access, and political and zoning considerations, but also on the availability of an AI-skilled workforce. Data centers aren't server rooms: they're among the most complex IT infrastructure projects in existence, requiring seamless power, cooling, high-speed networking, and top-tier security. Building them involves a wide range of experts, from engineers to logistics teams, coordinating everything from semiconductors to industrial HVAC systems. Data centers will thus drive up demand for engineers in high-performance networking, thermal management, power redundancy, and advanced cooling.

It's clear that the recent surge in infrastructure demand, to power GPUs and high-performance computing, is being driven primarily by AI. Training massive models like OpenAI's GPT-4 or Google's Gemini requires immense computational resources, consuming GPU cycles at an astonishing rate. These training runs often last weeks and involve thousands of specialized chips, drawing heavily on power and cooling infrastructure. But the story doesn't end there: even once a model is trained, running it in real time to generate responses, make predictions, or process user inputs (so-called AI inference) adds a new layer of energy demand. While not as intense as training, inference must happen at scale and with low latency, which means it places a steady, ongoing load on cloud infrastructure.

However, here's a nuance that's frequently glossed over in much of the hype: AI workloads don't scale in a straightforward, linear fashion. Doubling the number of GPUs or increasing the size of a model will not always lead to proportionally better results. Experience has shown that as models grow in size, the performance gains may taper off or introduce new challenges, such as brittleness, hallucination, or the need for more careful fine-tuning. In short, the current AI boom is real, but it may not be boundless. Understanding the limitations of scale and the nonlinear nature of progress is crucial for policymakers, investors, and businesses alike as they plan for data center demand shaped by AI's growth.

The data center industry therefore stands at a pivotal crossroads. Far from buckling under the weight of AI tools and cloud-driven demand, it's adapting at speed through smarter design, greener power, and more efficient hardware. From modular builds in repurposed buildings to AI-optimized cooling systems and co-location with power plants, the future of data infrastructure will be leaner, more distributed, and strategically sited. As data becomes the world's most valuable resource, the facilities that store, process, and protect it are becoming smarter, greener, and more essential than ever.

We list the best colocation providers.
This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here.


This HP 17″ Laptop (64GB RAM, 2TB SSD) Is Nearly Free, Amazon Slashes 75% Off and Adds $400 in Free Bonuses

Gizmodo | 11 hours ago

Amazon lets you save $3,500 on a super powerful HP laptop, and it's hard to believe. When it comes to finding the best value for your money on a premium laptop, deals like the one currently available on Amazon for the HP touchscreen laptop (17 inches, 64GB RAM, 2TB SSD, and Intel Core i7) are rare and truly game-changing. This is maybe the deal of the century for anyone in the market for a powerhouse portable, as it lets you save more than $3,500 on one of the most capable laptops available today. Not only does this laptop boast a massive 75% discount, dropping its price from a typical $4,699 to just $1,159, but it also comes packed with valuable bonuses that, when factored in, make the effective price even lower than $1,000. With extras like a lifetime Microsoft Office Pro license, Windows 11 Pro, and even a mouse included, this package is designed to supercharge your productivity.

See at Amazon

Perfect Laptop For All

This 2025 HP laptop is designed to power through anything you throw at it in office tasks, multitasking, remote learning, and light creative work. The machine has an Intel Core i7-1355U processor with 10 cores that can turbo boost to 5.0GHz. It will run exceedingly well across various applications while giving superior battery life. You can use this laptop for either work or play, with Intel Iris Xe Graphics and useful power management.

One of the great features of this laptop is the 64GB of DDR4 RAM: it provides high-bandwidth memory (even for the most demanding tasks) so you can open dozens of browser tabs, large spreadsheets, and graphics and creative apps all on one system without slowdowns or hiccups. The storage is equally impressive, with a 2TB PCIe SSD providing ultra-fast boot speeds.

The laptop's HD+ 17.3-inch touchscreen display makes colors pop and images look sharp and crisp for presentations and media consumption, but it's also a fun way to watch your favorite shows and movies after a busy week of working hard. The touchscreen provides an efficient way to interact with your content in an engaging and intuitive manner. The display has a resolution of 1600 x 900 and is perfectly suited for both productivity and relaxation.

This laptop also includes several ports: HDMI 2.1, two USB 3.2 Type-A ports, one USB 3.2 Type-C port, and a headphone/microphone combo jack. It also has a dedicated numeric keypad to speed up data entry. There's a built-in fingerprint reader so you can feel secure when signing in or handling sensitive information, and Wi-Fi 6 ensures robust and fast internet connectivity anywhere you go.

The software package that comes with this deal might be the most remarkable part of all. The laptop comes pre-loaded with Windows 11 Pro, which adds productivity and security features to make your data more secure and your workflow more efficient. You also get a lifetime license for Microsoft Office Professional: Word, Excel, PowerPoint, and more are installed on your computer, forever, at no additional cost. And a mouse is included as an added bonus. We suspect that this deal won't last on Amazon for long.

See at Amazon
