Intel has announced a load of new bugs afflicting its chips and this time it's not just CPUs but also GPUs that are involved, hooray!


Yahoo · 16-05-2025

When you buy through links on our articles, Future and its syndication partners may earn a commission.
Intel is notching up an impressive collection of bugs and flaws of late, what with crashing 13th and 14th Gen CPUs, the latest Arrow Lake chips needing a fair bit of post-launch work to get them running (not all that) optimally, all that good stuff. Now the company has announced some exciting new bugs that afflict not just its CPUs, but also Arc GPUs.
Let's start with some familiar sounding bugs involving Core Ultra processors. "Description: Incorrect initialization of resource in the branch prediction unit for some Intel Core Ultra Processors may allow an authenticated user to potentially enable information disclosure via local access," says the official security advisory.
In other words, an attacker with local access to your rig could potentially read information they shouldn't see. Intel says it is "releasing microcode updates to mitigate these potential vulnerabilities," which impact Core Ultra 5, 7, and 9 CPU models across both desktop and laptop.
The second CPU bug afflicting those same chips is described as "a potential security ['flaw'? Even Intel's bug reports have bugs, it seems...] in the Intel Integrated Connectivity I/O interface (CNVi) for some Intel Core Ultra Processors may allow escalation of privilege."
Again, a microcode fix is in the oven. In both cases, Intel advises that PC owners should contact their system provider for an update to fix the problem, but it's not clear if the fix has already been released by Intel.
Next up, some novel GPU flaws, one of which excitingly has been given a "HIGH" severity by Intel. "Potential security vulnerabilities in some Intel Graphics Driver software may allow escalation of privilege, denial of service, or information disclosure," Intel says.
The bug actually applies to all Intel iGPUs from 7th Gen CPUs onwards and also includes the latest Arc GPUs, like the Arc B580. Happily, the solution is a driver update that's already available, links to which you can find here. The fix specifically for discrete Arc GPUs is here.
Finally, we have this advisory pertaining to Intel's Endurance Gaming Mode software. "A potential security vulnerability for some Endurance Gaming Mode software may allow escalation of privilege," Intel says.
Intel Endurance Gaming Mode is an app for laptops that monitors real-time frame rates. The idea is to keep power consumption in check by setting a target frame rate and then reducing GPU frequency and power to match, resulting in improved battery life.
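The core trick here, capping the frame rate so the hardware can idle out the rest of each frame's time budget, can be sketched in a few lines. This is a minimal illustration of frame pacing in general, not Intel's actual implementation, which also drops GPU frequency and voltage at the driver level; the function name and parameters are our own for illustration.

```python
import time

def frame_pacer(render_frame, target_fps, num_frames):
    """Run num_frames frames, sleeping out the remainder of each frame's
    time budget so the average frame rate never exceeds target_fps."""
    budget = 1.0 / target_fps  # seconds allotted per frame
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # do the actual rendering work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            # Idle time: on a laptop, this is when the GPU and CPU
            # can drop to lower power states, stretching battery life.
            time.sleep(budget - elapsed)
```

If a frame renders in 5 ms but the target is 60 fps (a 16.7 ms budget), the hardware spends roughly 70% of each frame idle instead of racing ahead to render frames the display never needed.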
The fix for this is both an update to Endurance Gaming Mode version 1.5.651.0, available here, and that latest GPU driver we mentioned before, available here.
All of which means if you have an Intel laptop, say, with both the relevant CPU and GPU and you happen to use Endurance Gaming Mode, perhaps you'd better get patched up. And fast.
Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.


Related Articles

Data centers are at the heart of the AI revolution and here's how they are changing

Yahoo · 8 hours ago

As demand for AI and cloud computing soars, pundits are suggesting that the world is teetering on the edge of a potential data center crunch, where capacity can't keep up with the digital load. Concerns and hype have sent vacancy rates plummeting: in Northern Virginia, the world's largest data center market, vacancy rates have fallen below 1%. Echoing past fears of "peak oil" and "peak food," the spotlight now turns to "peak data." But rather than stall, the industry is evolving, adopting modular builds, renewable energy, and AI-optimized systems to redefine how tomorrow's data centers will power an increasingly digital world.

Future data centers will increasingly move away from massive centralized facilities alone, embracing smaller, modular, and edge-based designs. The sector is already splitting into hyperscale data centers on one end and smaller, edge-oriented facilities on the other. Smaller, modular, and edge data centers can be built in a few months and tend to be located closer to end users to reduce latency. Unlike the huge hyperscale campuses, whose facilities often cover millions of square feet, these smaller data centers are sometimes built into repurposed buildings such as abandoned shopping malls, empty office towers, and disused factories, helping to redevelop ex-industrial brownfield sites. Leaner centers can be rapidly deployed and tailored to specific workloads such as autonomous vehicles and AR.

To address energy demands and grid constraints, future data centers will increasingly be co-located with power generation facilities, such as nuclear or renewable plants. This reduces reliance on strained grid infrastructure and improves energy stability. Some companies are investing in nuclear power, which provides massive, always-on capacity that is also free of carbon emissions. Modular reactors are being considered to overcome grid bottlenecks, long wait times for power delivery, and local utility limits. Data centers will also increasingly be built in areas where the climate reduces operational strain: lower cooling costs and access to water enable energy-efficient liquid cooling instead of air cooling, so expect more data centers to pop up in places like Scandinavia and the Pacific Northwest.

Artificial intelligence will play a major role in managing and optimizing data center operations, particularly cooling and energy use. For instance, reinforcement learning algorithms are being used to optimize cooling systems, achieving up to 21% energy savings. Similarly, replacing legacy servers with more energy-efficient machines, with newer chips or thermal designs, can significantly expand compute capacity without requiring new premises. Instead of only building new facilities, future capacity will be expanded by refreshing hardware with newer, denser, and more energy-efficient servers, allowing more compute power in the same footprint and quick scaling to meet surges in demand, particularly for AI workloads.

These power-hungry centers are also putting a strain on electricity grids, so future data centers will leverage solutions such as load shifting to improve energy efficiency. Google is already partnering with PJM Interconnection, the largest electrical grid operator in North America, to use AI to automate tasks such as viability assessments of connection applications, enhancing grid efficiency. Issues are typically due not to a lack of energy but to insufficient transmission capacity. Fortunately, data centers usually run well below full capacity specifically to accommodate future growth; this headroom lets facilities absorb unexpected traffic spikes and scale rapidly without new construction.

Future data center locations will be chosen based on climate efficiency, grid access, and zoning policies, but also on the availability of an AI-skilled workforce. Data centers aren't mere server rooms; they're among the most complex IT infrastructure projects in existence, requiring seamless power, cooling, high-speed networking, and top-tier security. Building them involves a wide range of experts, from engineers to logistics teams, coordinating everything from semiconductors to industrial HVAC systems. Data centers will thus drive up demand for engineers specializing in high-performance networking, thermal management, power redundancy, and advanced cooling.

The recent surge in infrastructure demand for GPUs and high-performance computing is being driven primarily by AI. Training massive models like OpenAI's GPT-4 or Google's Gemini requires immense computational resources, consuming GPU cycles at an astonishing rate. These training runs often last weeks and involve thousands of specialized chips, drawing heavily on power and cooling infrastructure. But the story doesn't end there: even once a model is trained, running it in real time to generate responses, make predictions, or process user inputs (so-called AI inference) adds a further layer of energy demand. While not as intense as training, inference must happen at scale and with low latency, placing a steady, ongoing load on cloud infrastructure.

However, here's a nuance that's frequently glossed over in much of the hype: AI workloads don't scale in a straightforward, linear fashion. Doubling the number of GPUs or increasing the size of a model will not always lead to proportionally better results. Experience has shown that as models grow in size, performance gains may taper off or introduce new challenges, such as brittleness, hallucination, or the need for more careful fine-tuning. In short, the current AI boom is real, but it may not be boundless. Understanding the limitations of scale and the nonlinear nature of progress is crucial for policymakers, investors, and businesses alike as they plan for data center demand shaped by AI's growth.

The data center industry therefore stands at a pivotal crossroads. Far from buckling under the weight of AI tools and cloud-driven demand, it's adapting at speed through smarter design, greener power, and more efficient hardware. From modular builds in repurposed buildings to AI-optimized cooling systems and co-location with power plants, the future of data infrastructure will be leaner, more distributed, and strategically sited. As data becomes the world's most valuable resource, the facilities that store, process, and protect it are becoming smarter, greener, and more essential than ever.

This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc.
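That nonlinear-scaling point has a classic back-of-the-envelope model: Amdahl's law, which caps the speedup from adding processors by the fraction of work that cannot be parallelized. This is a general illustration we're adding here, not something the article derives, and real AI training dynamics are more complicated, but it shows why doubling GPU counts doesn't double results.

```python
def amdahl_speedup(parallel_fraction, n_units):
    """Ideal speedup from n_units parallel processors when only
    parallel_fraction of the workload can be parallelized (Amdahl's law)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_units)

# Even with 95% of the work parallelizable, speedup is capped at
# 1 / 0.05 = 20x, no matter how many GPUs are thrown at the job.
```

Going from 100 to 1,000 GPUs on a 95%-parallel workload, for example, buys far less than a 10x gain, which is exactly the kind of diminishing return the paragraph above describes.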


This HP 17″ Laptop (64GB RAM, 2TB SSD) Is Nearly Free, Amazon Slashes 75% Off and Adds $400 in Free Bonuses

Gizmodo · 10 hours ago

Amazon lets you save $3,500 on a seriously powerful HP laptop, and it's hard to believe. When it comes to finding the best value for your money on a premium laptop, deals like the one currently available on Amazon for the HP touchscreen laptop (17 inches, 64GB RAM, 2TB SSD, and Intel Core i7) are rare and genuinely game-changing. This may be the deal of the century for anyone in the market for a powerhouse portable, as it lets you save more than $3,500 on one of the most capable laptops available today. Not only does this laptop boast a massive 75% discount, dropping its price from a typical $4,699 to just $1,159, but it also comes packed with valuable bonuses that, when factored in, make the effective price even lower than $1,000. With extras like a lifetime Microsoft Office Pro license, Windows 11 Pro, and even a mouse included, this package is designed to supercharge your productivity.

This 2025 HP laptop is designed to power through anything you throw at it: office tasks, multitasking, remote learning, and light creative work. The machine has an Intel Core i7-1355U processor with 10 cores that can turbo boost to 5.0GHz, running a wide range of applications extremely well while delivering strong battery life. You can use this laptop for either work or play, with Intel Iris Xe Graphics and sensible power management. One of its standout features is the 64GB of DDR4 RAM: it provides high-bandwidth memory even for the most demanding tasks, so you can keep dozens of browser tabs, large spreadsheets, and creative apps open on one system without slowdowns or hiccups. The storage is equally impressive, with a 2TB PCIe SSD providing ultra-fast boot speeds. The laptop's HD+ 17.3-inch touchscreen display makes colors pop and images sharp and crisp for presentations and media consumption, and it's also a fun way to watch your favorite shows and movies after a busy week of working hard.

The touchscreen provides an efficient, intuitive way to interact with your content. The display has a resolution of 1600 x 900 and is well suited to both productivity and relaxation. The laptop also includes several ports: HDMI 2.1, two USB 3.2 Type-A ports, one USB 3.2 Type-C port, and a headphone/microphone combo jack. It has a dedicated numeric keypad to speed up data entry, plus a built-in fingerprint reader so you can feel secure when signing in or handling sensitive information. Wi-Fi 6 ensures robust, fast internet connectivity wherever you go.

The software package that comes with this deal might be the most remarkable part of all. The laptop comes pre-loaded with Windows 11 Pro, which adds productivity and security features to make your data more secure and your workflow more efficient. You also get a lifetime license for Microsoft Office Professional: Word, Excel, PowerPoint, and more are installed on your computer, forever, at no additional cost. And a mouse is included as an added bonus. We suspect this deal won't last on Amazon for long.
