
Latest news with #NvidiaRTX

The RTX 5090 is the best graphics card I've ever owned — but there's a catch for living room PC gamers

Tom's Guide

14-07-2025


Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50-series GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090, the previous graphics card champ, it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 fps or one of the best gaming monitors at 240 fps and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers.

And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category. My PC is similar to the build our lab tester Matt Murray constructed (he even posted a handy how-to on building a PC): an RTX 5090, an AMD Ryzen 7 9800X3D, and 64GB of DDR5 RAM on a Gigabyte X870 Aorus motherboard. As for screens, I have two. For the desk, I've got a Samsung Odyssey G9 OLED with a max 240Hz refresh rate, but most of the time I'll be in living room mode with my LG G3 OLED's max 120Hz refresh rate.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute-force hardware horsepower. Multi Frame Generation (and I promise that's the last time I capitalize Team Green's latest buzz phrase) feels like the biggest, and most contentious, development to hit the PC gaming scene in ages. The tech has only been out for a few months, and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional one to three AI-driven frames of gameplay are rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Cyberpunk 2077, Indiana Jones and the Great Circle, and Half-Life 2 RTX: some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged, mainly thanks to multi frame generation.

So I got stuck into those games (turn off Vsync for the best results). For more specific context, these figures were taken from Doom's Forsaken Plain level, Indy's Marshall College section during a particularly challenging path-traced scene, driving around downtown Night City in Cyberpunk, and Gordon's mesmerizing new take on Ravenholm.
All games tested at 4K, max settings, DLSS Balanced (average frame rate / latency):

Game                                 Frame gen off       Frame gen x2         Frame gen x3         Frame gen x4
Cyberpunk 2077                       58 FPS / 36-47 ms   130 FPS / 29-42 ms   195 FPS / 37-52 ms   240 FPS / 41-60 ms
Doom: The Dark Ages                  95 FPS / 37-48 ms   160 FPS / 51-58 ms   225 FPS / 54-78 ms   270 FPS / 56-92 ms
Indiana Jones and the Great Circle   85 FPS / 33-40 ms   140 FPS / 35-46 ms   197 FPS / 43-53 ms   243 FPS / 44-57 ms
Half-Life 2 RTX demo                 75 FPS / 26-3 ms    130 FPS / 29-42 ms   195 FPS / 37-52 ms   240 FPS / 41-60 ms

These are ludicrous frame rates, limited only by my LG G3 OLED's max 120Hz refresh rate or, in a couple of circumstances, even the sky-high 240Hz of my Samsung Odyssey G9 OLED. There is a catch, though, and it goes back to the way I play. Despite my frame rate counter showing seriously impressive numbers, the in-game experience often doesn't feel as smooth as I expected.

As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app, and more specifically its statistics overlay, while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency. Also known as input lag, latency measures the time in milliseconds it takes a game to register the press of a button on one of the best PC game controllers or the click of a key or mouse. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is.

And that situation is compounded on my TV. The high frame rate is glorious on my monitor, but when locked to 120Hz, you don't get the perceived smoother motion of those additional frames, creating a disconnect that makes the latency a bit more noticeable.

If you own one of the best gaming PCs and want to enjoy a rich ray-traced experience with acceptable input lag at responsive frame rates on your TV, my advice would be to aim for the frame gen level that gets you as close to your maximum refresh rate as possible. For all the games I tested, that would be x2. At this level, I find latency hovers around the mid-30s and never exceeds 60 ms, which feels snappy enough in that kind of living room gaming setup. Crank multi frame gen up to the x3 or x4 setting, though, and the returns diminish: at the restricted refresh rate, the extra latency becomes much more noticeable using one of the best gaming mice. Flip to a 240Hz monitor, however, and the difference is night and day, as the latency remains at a responsive level alongside those AI-injected frames for a buttery smooth experience.

And now we've got to talk about path tracing. It's already blowing minds in Doom: The Dark Ages, and it's prevalent in the likes of Cyberpunk and Doctor Jones' enjoyable romp. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. Given the demands this tech places on your GPU, the most graphically exciting development in PC gaming for years will most likely require DLSS 4's x3 or x4 AI frame-generating settings to maintain high frame rates in future implementations. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open world; I was messing with its path-traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.
That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path-traced equivalent looked. So while the technology matures, I hope Nvidia keeps working to reduce latency at these middle-of-the-road frame rates too, so that this AI trickery really hits the spot when maxed out.

To be clear to those on the fence about buying an RTX 5090: just as we've said in our reviews of the RTX 5060 Ti, 5070 and 5070 Ti, if you own a 40-series equivalent GPU, you should stick with your current card. You may not get that multi frame gen goodness, but with DLSS 4 running through its veins, you still get the benefits of Nvidia's latest form of supersampling and its new Transformer model, which delivers considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.

I don't want to end on a total downer, though, so I'll give credit where it's due. If you're on a monitor with a blisteringly fast refresh rate, I'll admit multi frame generation might be a good fit for your setup. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. But for those who hot-switch between the desk and the couch like I do, make sure you tweak those settings to reflect your refresh rate.
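For anyone who wants to apply that refresh-rate rule of thumb to their own setup, here's a minimal sketch in Python. It only encodes the reasoning above (pick the lowest multi frame gen level that saturates your panel) using the averages from my benchmark table; the helper and its names are purely illustrative, and nothing here comes from Nvidia's app or drivers.

```python
# A rough sketch of the rule of thumb above: pick the lowest multi frame gen
# setting whose average frame rate meets (or just exceeds) your display's max
# refresh rate, so you aren't paying extra latency for frames the panel can't
# show. The figures are the averages from my benchmark table (4K, max settings,
# DLSS Balanced); the function itself is illustrative arithmetic only.

SETTINGS = ["off", "x2", "x3", "x4"]  # ordered from lowest to highest latency cost

RESULTS = {
    "Cyberpunk 2077":                     {"off": 58, "x2": 130, "x3": 195, "x4": 240},
    "Doom: The Dark Ages":                {"off": 95, "x2": 160, "x3": 225, "x4": 270},
    "Indiana Jones and the Great Circle": {"off": 85, "x2": 140, "x3": 197, "x4": 243},
    "Half-Life 2 RTX demo":               {"off": 75, "x2": 130, "x3": 195, "x4": 240},
}

def pick_frame_gen(avg_fps: dict, refresh_hz: int) -> str:
    """Return the lowest setting that saturates the panel; fall back to the fastest if none does."""
    for setting in SETTINGS:
        if avg_fps[setting] >= refresh_hz:
            return setting
    return SETTINGS[-1]

for game, fps in RESULTS.items():
    print(f"{game}: 120Hz TV -> {pick_frame_gen(fps, 120)}, 240Hz monitor -> {pick_frame_gen(fps, 240)}")
```

With my numbers, that lands on x2 for the 120Hz TV and x4 for the 240Hz monitor, which lines up with what actually felt best in practice.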

Razer Thins Out the Blade 14 and Fattens the Price Tag

Gizmodo

19-05-2025


The newly announced Razer Blade 14 isn't quite stiletto-thin, but it's becoming far more knife-like over time. Compared to past iterations, the shell is now as tall as 10 pennies stacked on top of each other, which means the new Blade might be as thin as your wallet after buying one. On top of being the thinnest Blade 14 Razer has ever made, it's also the most expensive, starting at $2,300 for a version with Nvidia's new GeForce RTX 5060 laptop GPU. Razer has always tried to offer quality for its high price, but with tariffs in effect, the new Blade 14 is pushing the limits of what consumers can be expected to pay for a gaming laptop.

The Blade 14's base price is $100 more than you would have paid for the 2024 Blade 14 with an RTX 4060. If you upgrade the new Blade to the version with the RTX 5070, Razer told us you could spend $2,700 for the sake of a laptop that's 'not a big brick,' as the company put it. Razer is always an enticing buy because of its generally strong build quality, but even a frisbee-light frame doesn't take the sting out of today's tariff-inflated product prices.

Razer's new ultra-thin design houses an AMD Ryzen AI 9 365 and up to 64GB of 8,000MHz LPDDR5X RAM. It's the first time Razer is pairing a Copilot+ CPU made for lightweight laptops with an Nvidia GPU in a thoroughbred gaming machine, which means it's compatible with Windows 11 AI features like Recall (which you should probably remember to turn off during setup). The laptop sports a bevy of ports, including HDMI and a microSD card slot. As per usual, Razer promotes its hardy aluminum with an anodized black finish that should stave off bumps and blemishes. We don't doubt that all of that combined will offer enough juice to showcase the Blade 14's 3K, 120Hz OLED display. We do wonder what kind of CPU performance it might provide compared to a similarly sized laptop like the Asus TUF Gaming A14 and its top-end AMD Ryzen AI 9 HX 370. Asus' older model also costs hundreds of dollars less than Razer's latest.

Razer promised you'll get 2 to 3 hours of gaming on the 72WHr battery. That doesn't sound like much, but it's technically better than what you already get on this year's Razer Blade 16. We've yet to find a gaming laptop with a battery that will keep up for extended periods. The Blade 14 powers the RTX 5070 up to a maximum 115W TGP, which may give you enough juice for most modern games at the max 2,880 x 1,800 resolution. The RTX 5060 laptop GPU is still so new that we don't yet know how it performs compared to Nvidia's mid-range graphics options. No matter which GPU you choose, the machine supports a six-speaker system with upward-firing drivers, which may offer better sound quality than you might be used to on such small systems, especially with support for THX Spatial Audio.

There has been a rash of relatively light gaming laptops from 2024 stretching into this year, and Razer seemingly knew it needed to step up its game with the 2025 edition of the Blade 14. At 0.62 inches thick and 3.59 pounds, it's 11% thinner and lighter than the 2024 edition. It takes the same thermal hood design from this year's rendition of the Blade 16. That laptop also went on a diet for the sake of customers who want a less hefty device that fits a little better in their backpacks. We found it also tended to get rather hot when playing intensive games, so we hope that's less of a problem here with a smaller battery and a less demanding GPU. The one thing that hasn't been brought over from the Blade 16 is the improved keyboard.
It's a Razer device, so of course it's packed to the gills with gamer lights, including per-key RGB. Those keys still only have 1 mm of key travel, compared to the deeper, more impactful 1.5 mm on the redesigned 16-incher. There are no color options save for black and white, as much as we might beg Razer to bring back the 'coral' pink of the 2019 Razer Blade Stealth. There's nothing wrong with a thin system, but perhaps a pink Blade would help take away the sting of the price hikes.

UPS Explores Humanoid Robots As Figure AI Secures $675M From Jeff Bezos, OpenAI, And Intel, Targeting $9B Revenue By 2029

Yahoo

04-05-2025


Figure AI, the robotics company aiming to build the first commercially viable humanoid worker, recently announced it secured a staggering $675 million in funding from some of the biggest names in tech and venture capital, including Jeff Bezos, Microsoft (NASDAQ:MSFT), Nvidia (NASDAQ:NVDA), OpenAI, and Intel (NASDAQ:INTC). Now valued at $2.6 billion, the San Jose, California-based startup is in talks with United Parcel Service (NYSE:UPS) to integrate its humanoid robots into the global shipping giant's logistics infrastructure, according to Bloomberg.

Figure AI was founded in 2022 by serial entrepreneur Brett Adcock, who previously launched Vettery and Archer Aviation (NYSE:ACHR). Adcock's bio says his goal is to create a fully autonomous humanoid robot that can safely, efficiently, and intelligently perform a wide range of physical tasks that are currently repetitive, physically draining, or simply undesirable for human workers. According to Bloomberg, the company's latest model, Figure 02, features integrated cabling, a torso battery, and real-time computing power using Nvidia RTX GPU-based modules. It's outfitted with six RGB cameras, directional microphones, speakers, and a custom-built conversational AI developed in collaboration with OpenAI. Figure 02 also boasts 16 degrees of freedom in its redesigned hands, which allow it to lift and maneuver up to 55 pounds. These capabilities are being actively tested in real-world settings, including on the factory floor at BMW.

While both UPS and rival FedEx (NYSE:FDX) have incorporated autonomous systems like sorting and ground robots into their operations, neither has implemented humanoid robots until now. Bloomberg reported last year that UPS and Figure AI began discussions in 2023 and resumed them in early 2025 as Figure's latest model reached new performance milestones. If an agreement is finalized, UPS will become one of the first global logistics companies to pilot humanoid labor at an industrial scale.

This move comes as UPS is undergoing a significant restructuring in response to declining package volumes, particularly from major clients like Amazon (NASDAQ:AMZN), and economic pressures stemming from international tariffs. According to Reuters, UPS plans to cut 20,000 jobs and close 73 facilities by June, aiming to save $3.5 billion through workforce reductions, automation, and asset sales. These measures are part of UPS's strategy to adapt to shifting market dynamics and enhance operational efficiency.

Adcock believes humanoid robots are the next frontier in workforce transformation. He said in a statement that Figure's vision is "to bring humanoid robots into commercial operations as soon as possible. AI and robotics are the future, and I am grateful to have the support of investors and partners who believe in being at the forefront." Figure's mission has always been about filling the gap where human labor is in short supply or at risk. The startup says it is building toward a future where robots can work side by side with humans, or even independently, across warehouses, manufacturing floors, and homes.
Figure AI is not alone in the humanoid race: competitors like Tesla's (NASDAQ:TSLA) Optimus and Agility Robotics' Digit are also vying to dominate this emerging sector. However, with backing from the world's top AI labs, chipmakers, and billionaires, Figure AI is showing that humanoid labor is no longer a sci-fi dream but a near-future solution to real-world industrial problems.

Chipmaker TSMC's new A14 process will apparently offer a '15% speed improvement' but our GPUs won't be made on it for a while

Yahoo

26-04-2025


TSMC, the world's biggest chipmaker, has just announced another process node which, amongst an undoubted slew of AI chips, will almost certainly be used to make some of our gaming GPUs and CPUs in the future. This next-gen process is 'A14', meaning 14 angstroms, or 1.4 nanometres, or really, really small. It was announced yesterday at TSMC's North America Technology Symposium, and the company says the process "is designed to drive AI transformation forward by delivering faster computing and greater power efficiency."

TSMC was already the world's biggest chipmaker even before all this AI business started to really kick off, but ever since then it's been a company name, alongside Nvidia, that's on a ton of people's lips. Naturally, then, talk of its upcoming process nodes will be of interest to many people, and we PC gamers can throw our hats into that pool of interested onlookers, too. That's because TSMC makes many of the chips that end up in some of the best gaming CPUs and best graphics cards, whether from AMD, Nvidia, or even Intel. Currently, for instance, Nvidia RTX 50-series GPUs are made predominantly on TSMC's 4 nm node, and the same goes for AMD's Ryzen 9000-series processors. Intel's Arrow Lake chips look to now be made exclusively by TSMC, too, ever since Intel killed its 20A process last year.

The newly announced A14 node is planned for 2028. Compared with its upcoming N2 process (set for later this year), TSMC says "A14 will offer up to 15% speed improvement at the same power, or up to 30% power reduction at the same speed, along with more than 20% increase in logic density."

Although it's a few years away, I can't help but get a little excited about new processes. That's primarily because we've seen with the RTX 50-series GPUs just how unexciting a new GPU generation can be if it doesn't come off the back of a new process node (the RTX 50 series is on the same process as the RTX 40 series). That being said, Nvidia doesn't usually use bleeding-edge nodes for its GPUs, and we'll be more likely to see AMD and Apple chips made using A14 to begin with.

A14 won't have backside power delivery until 2029, either, according to our colleagues at Tom's Hardware. Backside power delivery essentially moves power interconnects to the underside of the chip, reducing interference and the distance that power has to travel, thereby increasing efficiency and performance. We expect to see backside power delivery (AKA 'Super Power Rail') from TSMC first with A16 in 2026. Intel is ahead in this game, however, as its 18A process already has backside power delivery and has been ready to go for two months.

As for whether A14 production will also make an appearance in the US, as well as at TSMC's Taiwan fabs, it seems like it might. I'm basing this on TSMC's recent earnings call, in which the company claimed that six fabs are planned in Arizona: "In that six fab, the 2-nanometer will be a major node, and that's what I say, 30% will be there. As time goes by, after the 2-nanometer will be 1.4 and 1.0, that has not been discussed yet."
This was in response to a question about what percentage of future leading nodes will come from the US vs from Taiwan, and to my ears it seems like TSMC is saying 1.4 and 1.0 will come from the US, but the percentage hasn't been discussed yet. Whatever the case, here's to some healthy progress in process nodes across the board, whether from TSMC, Intel, or anyone else. Architectural and AI changes aside, raw performance increases are a direct result of transistor density, and we can all get behind that.
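To put TSMC's quoted A14-versus-N2 figures into rough, concrete terms, here's a back-of-the-envelope sketch in Python. The percentages are the ones from the announcement; the baseline chip numbers are invented purely for illustration and don't describe any real product.

```python
# Back-of-the-envelope arithmetic using TSMC's quoted A14-vs-N2 claims:
# up to 15% more speed at the same power, up to 30% less power at the same
# speed, and more than 20% higher logic density. The baseline figures below
# are hypothetical, chosen only to make the percentages tangible.

SPEED_GAIN = 0.15     # at the same power
POWER_CUT = 0.30      # at the same speed
DENSITY_GAIN = 0.20   # logic density in the same area

baseline_clock_ghz = 3.0         # hypothetical N2-class design
baseline_power_w = 100.0
baseline_transistors_bn = 50.0   # in a fixed die area

print(f"Same power, faster clock: {baseline_clock_ghz * (1 + SPEED_GAIN):.2f} GHz")
print(f"Same speed, lower power:  {baseline_power_w * (1 - POWER_CUT):.0f} W")
print(f"Same area, more logic:    {baseline_transistors_bn * (1 + DENSITY_GAIN):.0f} bn transistors")
```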

Samsung has a crack at ye olde glasses-free 3D monitor thing but its new cheaper 49-inch ultrawide OLED is far more interesting

Yahoo

02-04-2025


Samsung has announced a slew of new gaming monitors and, on paper, the big news is a glasses-free 3D model. However, it's a new lower-cost version of Samsung's 49-inch ultrawide OLED that could be the most interesting in the real world.

The Odyssey 3D G90XF is a 27-inch 4K model in which, according to Samsung, "advanced eye-tracking technology and a proprietary lenticular lens deliver a natural-looking high-definition 3D image." The combination of eye tracking and lenticular lenses to enable glasses-free 3D is not entirely new; Lenovo announced something essentially identical back in 2023. Indeed, our own Jacob experienced a Samsung prototype over a year ago and came away impressed. And yet, 3D displays never seem to take off. Admittedly, most attempts in the past, including Nvidia's 3D Vision, involved glasses, which typically prove quite the impediment to adoption. So, could this glasses-free version take off? Jacob only had five minutes with the concept display, but said, "It delivers a genuinely decent 3D image." One catch is that it requires an Nvidia RTX GPU, with Samsung recommending an RTX 3080 at minimum. But then this is not a cheap monitor. For the record, this is an LCD monitor, not OLED, with a claimed 1 ms response time and a 165 Hz refresh rate. Samsung hasn't released an official price, but it is listed on Samsung's South Korean pre-order page at a price that converts to around $1,575. Ouch.

With that in mind, it could be the new Odyssey G9 G91F that's more interesting. In most regards it's familiar and similar to the Samsung Odyssey OLED G9 G93SC we reviewed way back in 2023. So, it's a huge 49-inch, 5,120 by 1,440 OLED. Except instead of 240 Hz refresh, it's 144 Hz and designed to allow "more gamers to experience curved ultrawide gaming." In other words, it's cheaper. Again, Samsung hasn't listed a price, but it's on that South Korean pre-order website for a price that converts to $850. That's about half the original launch price of the G93SC. What's more, Samsung monitors tend to slip under their MSRPs pretty quickly, so the G91F could dip below $800 in fairly short order. Fingers crossed.

Samsung also announced refreshed versions of its 4K OLEDs in 27-inch and 32-inch variants, rather oddly claiming an "industry first" implementation of 240 Hz 4K technology, something that's been widely available for years. The 32-inch version seems to be priced at around $1,100, with the 27-inch option not yet listed. We'll keep an eye out for US and UK availability of all of these new monitors.
