Latest news with #Terminator2:JudgmentDay


Hindustan Times
5 hours ago
- Entertainment
- Hindustan Times
James Cameron warns about the consequences of AI: ‘There's still a danger of a Terminator-style apocalypse'
Filmmaker James Cameron, known for his futuristic visions on screen, is once again warning the world—but this time, it's not through fiction. While promoting Ghosts of Hiroshima, a book he plans to adapt into a motion picture, James expressed growing concern about the dangers of Artificial Intelligence (AI), especially when paired with weapons systems. The director highlighted what he sees as the existential threats posed by climate change, nuclear weapons, and super-intelligent AI while also promoting his upcoming film, Avatar: Fire and Ash, set for release on December 19.

What James Cameron said about AI
Speaking to Rolling Stone, the Avatar: Fire and Ash director stressed that combining AI with military systems—particularly nuclear defense—could set the stage for catastrophic consequences. 'I do think there's still a danger of a Terminator-style apocalypse,' he said, referencing his own iconic film franchise that explored AI-driven doomsday scenarios. James' 1984 film The Terminator, starring Arnold Schwarzenegger as a killer AI-powered cyborg, became a cultural touchstone for debates around machine autonomy. In its 1991 sequel, Terminator 2: Judgment Day, the same character returns as a protector—highlighting the dual nature of technological advancement.

James argued that modern warfare is moving too quickly for human judgment to keep up, potentially requiring super-intelligent systems to manage rapid-response decisions. He acknowledged that, ideally, a human would still be involved in those decision loops. However, he noted the reality: human error has already brought the world dangerously close to nuclear incidents in the past, raising questions about whether machines—or humans—are better suited for such critical choices. He also emphasised the convergence of what he sees as three major existential threats: climate degradation, nuclear weapons, and the rise of super-intelligent AI. 'They're all sort of manifesting and peaking at the same time,' he said, suggesting that while super-intelligence could offer solutions, it's a risky gamble. The director's recent decision to join the board of Stability AI drew mixed reactions online: while some praised the move as visionary, others questioned whether it aligned with his cautionary views on artificial intelligence.

James Cameron's latest
Looking ahead, James' next project, Avatar: Fire and Ash, continues his exploration of pressing global issues, including climate change and identity. The film is slated for release on December 19.


Tom's Guide
14-07-2025
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but there's a catch for living room PC gamers
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50-series GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 FPS or one of the best gaming monitors at 240 FPS and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

So I have a PC similar to the build our lab tester Matt Murray constructed (he even posted a handy how-to on building a PC) — packing the 5090, AMD Ryzen 7 9800X3D, and 64GB DDR5 RAM on a Gigabyte X870 Aorus motherboard. In terms of the screens I play on, I have two. For the desk, I've got a Samsung Odyssey G9 OLED with a max 240Hz refresh rate, but most of the time, I'll be in living room mode with my LG G3 OLED's max 120Hz refresh rate.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute-force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Cyberpunk 2077, Indiana Jones and the Great Circle and Half-Life 2 RTX — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

So I got into the games (turn off Vsync for the best results). For more specific context, these figures were taken from Doom's Forsaken Plain level, Indy's Marshall College section during a particularly challenging path-traced scene, driving around downtown Night City in Cyberpunk, and Gordon's mesmerizing new take on Ravenholm.
All games tested at 4K (Max settings, DLSS Balanced); each entry is average frame rate / latency.

Game | Frame gen off | Frame gen x2 | Frame gen x3 | Frame gen x4
Cyberpunk 2077 | 58 FPS / 36-47 ms | 130 FPS / 29-42 ms | 195 FPS / 37-52 ms | 240 FPS / 41-60 ms
Doom: The Dark Ages | 95 FPS / 37-48 ms | 160 FPS / 51-58 ms | 225 FPS / 54-78 ms | 270 FPS / 56-92 ms
Indiana Jones and the Great Circle | 85 FPS / 33-40 ms | 140 FPS / 35-46 ms | 197 FPS / 43-53 ms | 243 FPS / 44-57 ms
Half-Life 2 RTX demo | 75 FPS / 26-3 ms | 130 FPS / 29-42 ms | 195 FPS / 37-52 ms | 240 FPS / 41-60 ms

These are ludicrous frame rates — limited only by my LG G3 OLED's max 120Hz refresh rate or, in a couple of circumstances, even the sky-high 240Hz of my Samsung Odyssey G9 OLED. There is a catch, though, which goes back to the ways that I play. Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as I expected.

As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and, more specifically, its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency. Also known as input lag, latency measures the time it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse, in milliseconds. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is.

And that situation is compounded on my TV. The high frame rate is glorious on my monitor, but when locked to 120Hz, you don't get the perceived smoother motion of those additional frames — creating a disconnect that makes that latency a bit more noticeable. If you own one of the best gaming PCs and want to enjoy a rich ray traced experience with acceptable input lag at responsive frame rates on your TV, my advice would be to aim for the frame gen level that is as close to your maximum refresh rate as possible. For all the games I tested, that would be x2. At this level, I find latency hovers around the mid 30s and never exceeds 60 ms, which feels snappy enough in that kind of living room gaming setup. Crank multi frame gen up to the x3 or x4 setting, and there's a diminishing return, as the latency becomes more noticeable at the restricted refresh rate, even using one of the best gaming mice. Flip to a 240Hz monitor, however, and the difference is night and day, as the latency remains at a responsive level alongside those AI-injected frames for a buttery smooth experience.

And now, we've got to talk about path tracing — it's already blowing minds in Doom: The Dark Ages, and it's prevalent in the likes of Cyberpunk and Doctor Jones' enjoyable romp. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. Given the demands of this tech on your GPU, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x3 or x4 AI frame-generating settings to maintain high frame rates in future implementations. I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open world — I was messing with its path traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though.
That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path traced equivalent looked. So while the technology matures, I hope Nvidia continues working to reduce latency at these middle-of-the-road frame rates too, so that this AI trickery really hits the spot when maxed out.

To be clear to those on the fence about buying an RTX 5090 — just as we've said in our reviews of the RTX 5060 Ti, 5070 and 5070 Ti, if you own a 40 series-equivalent GPU, you should stick with your current card. You may not get that multi-frame gen goodness, but with DLSS 4 running through its veins, you still get the benefits of Nvidia's latest form of supersampling and its new Transformer model — delivering considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.

I don't want to end on a total downer, though, so I'll give credit where it's due. If you're on a monitor with a blisteringly fast refresh rate, I'll admit multi frame generation might be a good fit for your setup. My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. But for those who hot switch between the desk and the couch like I do, make sure you tweak those settings to reflect your refresh rate.
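That advice about matching frame gen to your refresh rate maps neatly onto the benchmark table above. As a rough, illustrative sketch only (the helper below is my own shorthand, not anything Nvidia or the games expose), it boils down to picking the lowest frame gen setting whose measured average frame rate already fills your display:

```python
# Illustrative sketch: choose the lowest multi frame gen setting whose measured
# average frame rate already fills the display's refresh rate. The numbers are
# the averages from the table above; the helper itself is just shorthand,
# not a real Nvidia or in-game API.

MEASURED_AVG_FPS = {
    "Cyberpunk 2077":                     {1: 58, 2: 130, 3: 195, 4: 240},
    "Doom: The Dark Ages":                {1: 95, 2: 160, 3: 225, 4: 270},
    "Indiana Jones and the Great Circle": {1: 85, 2: 140, 3: 197, 4: 243},
    "Half-Life 2 RTX demo":               {1: 75, 2: 130, 3: 195, 4: 240},
}

def lowest_sufficient_setting(game: str, refresh_hz: int) -> int:
    """Return the lowest multiplier (1 = frame gen off, 2-4 = frame gen)
    whose average FPS meets or exceeds the display's refresh rate."""
    for mult in (1, 2, 3, 4):
        if MEASURED_AVG_FPS[game][mult] >= refresh_hz:
            return mult
    return 4  # nothing reaches the refresh rate, so max out frame gen

for game in MEASURED_AVG_FPS:
    tv, monitor = lowest_sufficient_setting(game, 120), lowest_sufficient_setting(game, 240)
    print(f"{game}: x{tv} on a 120Hz TV, x{monitor} on a 240Hz monitor")
```

Run against the table's averages, that spits out x2 for every game on a 120Hz TV and x4 across the board at 240Hz, which broadly lines up with how those settings felt on each screen.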


Boston Globe
11-07-2025
- Entertainment
- Boston Globe
From ‘The Net' to ‘M3GAN,' real-life technological fears rule the movies
Several familiar movies also wouldn't exist without technology-based terror creeping into our daily lives. Just this year, we've had The Entity, the evil AI program bent on world domination, in the latest 'Mission: Impossible.' The same technology forced the return of M3GAN, the killer robot, who was rebooted in 'M3GAN 2.0.'

M3gan and Cady (Violet McGraw) in "M3GAN 2.0," directed by Gerard Johnstone. Universal Pictures

Consider how these new movies are commenting on the unwanted infiltration of programs like ChatGPT and Google's AI search into our lives. It's like a plague we can't escape, a rise of the machines prophesied by Mr. 'King of the World' himself, James Cameron, back in 1984's 'The Terminator' and its apocalyptic 1991 sequel, 'Terminator 2: Judgment Day.'

Arnold Schwarzenegger stars as The Terminator in "Terminator 2: Judgment Day," director James Cameron's 1991 sequel to his 1984 film "The Terminator." Artisan Home Entertainment

Hell, Google's AI overview search results will tell you I gave three stars to 'A Minecraft Movie,' a film I did not review. It has also been inaccurate about movies I did review. That scares the hell out of me — you can't even get the right information to yell at me about — but I suppose I deserve it for my contributions to the tech world.

That fear of online misrepresentation is not new, and it was the basis of a beloved film that turns 30 this year. Back in July 1995, Sandra Bullock scored a big hit with 'The Net,' the computer-based thriller that was her third success in a row. Hot off of 'Speed' and 'While You Were Sleeping,' Bullock was cast as virus expert/hacker Angela Bennett. Bennett discovers a dangerous plot to infiltrate the systems of governments and banks to ensure maximum chaos. This information forces her to go on the run after an assassination attempt. Directed by 'Rocky' producer Irwin Winkler, 'The Net' earned over $110 million worldwide on a $22 million budget.

However, I was not one of the movie's bigger fans. I'd been in tech for exactly eight years by this point (I started in July 1987), and I found one particular plot point so dopey that it sank the entire movie for me. Still, Bennett was a believable programmer — we're all somewhat neurotic, potentially compulsive, and always paranoid about what technology can do because we understand the danger. The HBO show 'Silicon Valley' and David Fincher's Mark Zuckerberg movie, 'The Social Network' (which turns 15 this year), are two of the best examples of what living and working with programmers is like. I became a social creature as a defense mechanism, but if you want to see my true, misanthropic I.T. personality, look at

Sandra Bullock stars as computer systems analyst Angela Bennett in "The Net." Sony Pictures

You wouldn't want to follow any of the characters I just mentioned, but who doesn't love '90s era Sandy Bullock? 'The Net' puts her in danger courtesy of a virus-filled 3½-inch floppy disk. (Remember when your potential destruction was, at max, 1.44 megabytes?) Very powerful men want this disk, and Jack Devlin, a dangerous man played by Jeremy Northam, will kill for it. Angela's sexual dalliance with Devlin, which the film should have avoided, is the only reason why her execution gets botched. But it sets the stage for her real identity to be stolen and erased from existence.
Through plot points too detailed to explain, she becomes Ruth Marx, a criminal targeted by the LAPD. It's up to Angela to clear her name and figure out who's behind the dastardly plot to control the world. The only person who believes her is played by Dennis Miller, yet another reminder of why the 1990s was a bad decade. At least 'The Net' stokes your nostalgia for AOL-like screens, ICQ-style chat rooms, and garish HTML-based graphics. TELNET and WHOIS programs are also employed onscreen. The film asks questions about how safe your computer's security programs are, whether your identity can be stolen, and how easy it is for people to believe everything they see on a computer screen without question. Though these real-world concerns are still prevalent today, they were much newer in 1995, making 'The Net' a paranoid thriller for its era. They could have easily called this 'Three Days of the Cursor.'

David Lightman (Matthew Broderick), a Seattle high school student, demonstrates his home computer's ability to alter Jennifer's (Ally Sheedy) school grades in the 1983 film "WarGames." MGM/United Artists

'The Net' is far from the only tech-based movie to reflect the concerns of its time. The 1980s were full of films that cast a wary eye on computers for a variety of reasons that, to this day, still exist. Take 1983's Matthew Broderick classic, 'WarGames,' a film that, like many other films of the decade, was steeped in worry about a nuclear war between the United States and the Soviet Union. The bigger issue in John Badham's film was how easy it was for Broderick's character, David, to dial into the government's computer (remember modems, folks?) and engage with its primitive AI-based military system. David thinks he's playing a game called 'Global Thermonuclear War.' The system thinks otherwise.

A scene from 1982's "Tron." Walt Disney Productions

The year before, there was Disney's cult classic 'Tron,' which is about the parental fear of kids getting hooked on arcade games. It's also about getting sucked into a video game to battle — you guessed it — an artificial intelligence in a virtual world. This AI loads up government and business programs to make itself more powerful. I bet it would say I gave 'Megalopolis' four stars, too.

For the romantics, there's 1984's 'Electric Dreams,' where an architect uses a primitive form of AI to help him design bricks. The program not only becomes sentient, it falls in love with the architect's love interest, Virginia Madsen, and tries to wreck their relationship. Nowadays, as in Spike Jonze's 'Her,' and many real-life stories, it's the guy falling in love with the fake paramour he created inside the computer.

Lest I forget, there's the HAL 9000. I'm not sure what he represented back in 1968, but I have an idea. I'll bet HAL was a warning that computers were going to take over and do some very nasty things because their logic doesn't allow for the moral complexities of the human brain. But leave it to Stanley Kubrick to be the only director of a movie in this piece to give his artificial intelligence character a soul.

Soul or not, computers are still evil. So we're doomed! See you in the Matrix!

Odie Henderson is the Boston Globe's film critic.


Tom's Guide
05-07-2025
- Tom's Guide
The RTX 5090 is the best graphics card I've ever owned — but its big new feature disappoints
Just as Terminator 2: Judgment Day predicted back in ye olden days of 1991, the future belongs to AI. That could be a problem for Nvidia RTX 50 GPUs, even when it comes to the best consumer graphics card money can buy. I was 'fortunate' enough to pick up an Nvidia GeForce RTX 5090 a couple of months ago. I use those semi-joking apostrophes because I merely had to pay $650 over MSRP for the new overlord of GPUs. Lucky me.

Before you factor in the 5090's frame-generating AI voodoo (which I'll get to), it's important to give credit to Team Green for assembling an utterly beastly piece of silicon. Around 30% more powerful than the RTX 4090 — the previous graphics card champ — there's no denying it's an astonishing piece of kit. Whether you're gaming on one of the best TVs at 120 FPS or one of the best gaming monitors at 240 FPS and above, the RTX 5090 has been designed for the most ludicrously committed hardcore gamers. And wouldn't you know it? I just happen to fall into this aforementioned, horribly clichéd category.

The main selling point of Nvidia's latest flagship product is DLSS 4's Multi Frame Generation tech. Taking advantage of sophisticated AI features, Nvidia's RTX 50 cards are capable of serving up blistering frame rates that simply can't be achieved through brute-force hardware horsepower. Multi Frame Generation — and I promise that's the last time I capitalize Team Green's latest buzz phrase — feels like the biggest (and most contentious) development to hit the PC gaming scene in ages. The tech has only been out for a few months and there are already over 100 titles that support Nvidia's ambitious AI wizardry.

How does it work? Depending on the setting you choose, an additional 1-3 AI-driven frames of gameplay will be rendered for every native frame your GPU draws. This can lead to colossal onscreen FPS counts, even in the most demanding games. Doom: The Dark Ages, Hogwarts Legacy, Microsoft Flight Simulator 2024, Cyberpunk 2077 — some of the most graphically intense titles around can now be played at incredibly high frame rates with full ray tracing engaged. That's mainly thanks to multi frame generation.

Just how high are we talking? On my RTX 5090, I can comfortably hit a locked 120 FPS at 4K with max settings, providing Nvidia DLSS is enabled. That figure is limited by my LG G3 OLED's max 120Hz refresh rate. When I hook my rig up to my 240Hz Samsung Odyssey G9 OLED super ultrawide monitor, some of the games above can be played at over 200 FPS.

There is a catch, though. And said stumbling block is as sizable as a certain silver screen ape that clambered to the top of the Empire State Building. That ended well, right? Yes, the scarcely believable frame rates my third-party RTX 5090 is able to achieve are a lot cheerier than the finale of King Kong. Yet that doesn't mean the best graphics card in the world doesn't have to face its own version of pesky biplanes.

Despite my frame rate counter showing seriously impressive numbers, the in-game experiences often don't feel as smooth as you'd expect. It's not unfair to expect 120 FPS gameplay to be super slick, and when all your frames are being rendered natively by your GPU, it normally is. Sadly, that's not quite the case with multi frame generation. As much as I've tried to resist, I've become increasingly obsessed with the excellent Nvidia app (and, more specifically, its statistics overlay) while messing around with multi frame gen of late. These stats let you monitor FPS, GPU and CPU usage, and, most crucially for me, latency.
Also known as input lag, latency measures the time it takes a game to register the press of a button on one of the best PC game controllers or the click of a key/mouse, in milliseconds. If your latency is high, movement is going to feel sluggish, regardless of how lofty your frame rate is. Generally speaking, I find input lag of 70 ms and above pretty hard to stomach.

I've mostly been playing around with Team Green's multi frame gen features in Doom: The Dark Ages, Indiana Jones and the Great Circle, Cyberpunk 2077: Phantom Liberty and the recent, extra demanding Half-Life 2 RTX demo. To say the results have been mixed would be akin to describing Godzilla as 'above average height'.

Cyberpunk 2077 actually fares pretty well when it comes to balancing input lag and big frame rate numbers. At the maximum x4 multi frame gen setting, I generally float around 64-78 ms of latency in 4K (3840 x 2160) at 120 FPS with all settings maxed out and full path tracing enabled — more on that shortly. For a game that hardly requires lightning reactions, those latency measurements feel just about acceptable to me. Knock multi frame generation down to x3 and input lag drops to around 55-65 ms cruising around Night City while still hitting a locked 120 FPS, which feels reasonably responsive. At x2 frame gen, latency of around 50 ms feels even better, albeit with the big caveat that I drop down to 90 FPS. And with frame generation turned off completely? You're looking at 40 ms of lag with a nosedive to 50 FPS. In the case of Cyberpunk, I'd say x3 frame gen hits the sweet spot between responsiveness and in-game smoothness. It's not a fast-paced shooter, so a little added latency is a worthwhile sacrifice for a locked 4K/120 FPS experience.

Speaking of games that do require more nimble reactions, Doom: The Dark Ages can produce multi frame generation results that feel downright awful. Despite the game being well optimized overall, and even with Nvidia Reflex low latency mode turned on, controlling the Doom Slayer during his medieval murder quest can feel like wading through a sea of space soup. At the x3 and x4 multi frame gen settings, the action is outright ghastly. With Nvidia's AI tech maxed out, latency never once measures in below an unplayable 110 ms on my rig. Turn frame gen off, though, and a card like the 5090 can still hand in 4K/120 FPS, with latency dropping to a slick and responsive 20 ms. The higher frame generation presets may look smooth in motion, yet they feel massively heavy with a controller in your hands.

Next up is Indy's latest adventure. The Great Circle might be a breezy, enjoyable action-adventure, but it's definitely not the best poster boy for multi frame generation. At the amusingly stupid 'Very Ultra' preset in 4K with everything maxed out and path tracing cranked up, latency lands at a super sluggish 100 ms and above with x4 frame gen enabled. If you own one of the best gaming PCs and want to enjoy a rich ray traced experience with acceptable input lag at responsive frame rates, I suggest going for the x2 frame gen setting. At this level, I find latency hovers between the mid-30s and low-40s of milliseconds, which feels as snappy as one of the explorer's legendary whip lashes.

Even though it's over 20 years old, it's Gordon Freeman's path traced Half-Life 2 RTX demo that produces the worst results on my gaming PC. Movement feels utterly shocking with multi frame gen set to either the x4 or x3 setting. I'm talking '150 ms of latency' levels of shocking.
Even cutting through Headcrabs in the shooter's legendary Ravenholm level at 120 FPS using one of the best gaming mice is horribly sluggish. It's only by turning Nvidia's latest tech off entirely that torpedoing zombies with buzzsaws fired from Gordon's gravity gun feels playable again. With frame gen disabled, my 5090-powered PC was able to achieve just 30 ms of latency consistently as frame rates fluctuated between 60-75 FPS. And if all of the inconsistent frame rates above are making you queasy, I can assure you they never bothered me, thanks to the combination of my display's FPS-smoothing G-Sync and VRR (Variable Refresh Rate) features.

You'd probably think the big takeaway from my multi frame generation experiments would be 'disable multi frame gen' at this point, am I right? In the here and now, most definitely. Yet in the future, the most graphically exciting development in PC gaming for years will most likely demand you use DLSS 4's x3 or x4 AI frame-generating settings to maintain high frame rates. That feature is the aforementioned path tracing. Essentially the 'pro level' form of ray tracing, this lighting algorithm can produce in-game scenes that look staggeringly authentic. The two best current examples of the technology being deployed to eye-popping effect I've come across are Cyberpunk and Doctor Jones' enjoyable romp.

I wasn't surprised that path tracing floored me in CD Projekt Red's seedy yet sensational open world — I was messing with its path traced photo mode long before DLSS 4 arrived. The quality of the effect cranked to the max in The Great Circle knocked my socks off, though. That stunning screenshot a few paragraphs above is from the game's second level, set in Indy's Marshall College. During a segment where Jones and his vexed buddy Marcus search for clues following a robbery, path tracing gets to really flex its muscles in a sun-dappled room full of antiquities. Dropping down to the highest form of more traditional ray tracing, I was genuinely shocked at just how much more convincing the path traced equivalent looked.

It kinda pains me to think I'm probably going to have to lean on multi frame generation going forward if I'm to maintain 4K, high frame rate experiences in games that support path tracing. As the technology matures, I really hope Nvidia finds ways to reduce latency without massively compromising on the speedy FPS performance its latest AI experiment targets.

Seeing as the launch of the RTX 50 range has gone as smoothly as a dinner party for chickens organised by The Fantastic Mr Fox, I have no problem stating that if you own a 40 series GPU (especially an RTX 4080 or 4090), you should stick with your current card. Even if you've been hyped for multi frame generation, know that it's nowhere near effective enough at the moment to be worth upgrading your GPU for. The most damning aspect of DLSS 4's multi frame gen performance is that it's actually producing worse in-game experiences than you get with DLSS 3's x2 frame gen setting. Based on my time with the titles I've mentioned, the lowest level of this frame-boosting tech hits the best balance between reasonable latency and stutter-free gameplay. Considering Nvidia first launched the DLSS 3 version back in October 2022, and you can enjoy it on last-gen GPUs, it's not a great advert for DLSS 4 and its latest AI ace in the hole.
The iconic computing company's new artificial intelligence model might be 40% faster than the previous iteration, but that doesn't mean multi frame generation feels satisfying in motion in its current state.

I don't want to end on a total downer, though, so I'll give DLSS 4 credit where it's due. Multi frame gen undeniably reeks of the Emperor's New Clothes at present, and that's disappointing. However, Nvidia's latest form of supersampling and its new Transformer model deliver considerably better anti-aliasing while being less power-hungry than the existing Legacy edition.

My fondness for the RTX 5090 is only matched by Hannibal Lecter's delight in chowing down on human livers. Probably. If you're on the fence about the latest wave of Nvidia GPUs though, don't let multi frame generation sway a potential purchasing decision.
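One bit of arithmetic underlines why those inflated FPS counters can still feel heavy: with interpolation-based frame generation, new input only shows up on natively rendered frames, and the generated ones are slotted in between. The snippet below is a back-of-the-envelope illustration of that latency floor, a deliberate simplification rather than a description of Nvidia's actual pipeline:

```python
# Back-of-the-envelope only: with interpolated frame generation, input is
# reflected at the *native* rendering cadence, so the gap between "real"
# frames is a rough floor on responsiveness regardless of on-screen FPS.
# This ignores Reflex, render queues and display latency.

def native_frame_time_ms(displayed_fps: float, fg_multiplier: int) -> float:
    """Approximate milliseconds between natively rendered frames."""
    native_fps = displayed_fps / fg_multiplier
    return 1000.0 / native_fps

# 120 FPS on a 120Hz TV via x4 frame gen is only ~30 native FPS under the hood...
print(round(native_frame_time_ms(120, 4), 1))  # ~33.3 ms between real frames
# ...while the same 120 FPS with frame gen off samples input four times as often.
print(round(native_frame_time_ms(120, 1), 1))  # ~8.3 ms between real frames
```

Which is broadly the pattern in the measurements above: the x3 and x4 presets look silky on the FPS counter but carry the heaviest input lag, while x2 or frame gen off keeps things feeling responsive.
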
Yahoo
04-07-2025
- Entertainment
- Yahoo
‘M3GAN 2.0' Review: Allison Williams in an Occasionally Fun but Overloaded AI Sequel That Botches Its Factory Reset
The campy sense of mischief that made Gerard Johnstone's 2023 hit M3GAN so enjoyable asserts itself intermittently in M3GAN 2.0, a logical title for a follow-up to the thriller about a murderous robot. But the humor is forced to compete with seriously overcomplicated plotting in a sequel that entangles its horror comedy roots with uninspired espionage elements, becoming a convoluted mishmash with shades of Terminator 2: Judgment Day, Mission: Impossible and the Austin Powers franchise. There are amusing moments reminiscent of the original, but in terms of tone and coherence, the movie loses its way.

The sequel works best when its focus remains on the central family unit — robotics scientist Gemma (Allison Williams), her orphaned niece Cady (Violet McGraw) and M3GAN (played physically by dancer Amie Donald in a mask and voiced by Jenna Davis), the android intended as Cady's companion and protector, who went rogue in the first movie and had to be destroyed.

Johnstone takes on solo script duties from a story he developed with M3GAN screenwriter Akela Cooper, based on characters she created with James Wan. The director makes it clear from the opening that this will be a very different film — less interested in the domestic dysfunction and corporate mayhem of its predecessor and more concerned with arms dealers, duplicitous techies and a military-industrial complex with a shiny new toy. None of which, sad to say, is terribly fresh or exciting.

Much has changed on the artificial intelligence front in the two and a half years since M3GAN was released, as AI has rapidly become more prevalent in contemporary life, both online and off. The new movie states the obvious when it talks up the need for humans to coexist with robotics technology, albeit with legal safeguards in place. But it's too silly to have much bearing on the real world.

The tagline for the sequel is 'I'm Still That B.' But M3GAN 2.0 is too infrequently allowed to be that B. Instead, she starts acquiring empathy and morality, which we all know are no fun. That's not to say she has lost her snarky delivery, her mean-girl death stare or her passive-aggressive manipulation skills. 'You killed four people and a dog!' Gemma reminds her. 'I was a kid when it happened, doing what I thought was right,' replies M3GAN with dubious contrition. She then gives Gemma a comforting pep talk about the challenges of being a mom before launching into a truly hilarious Kate Bush homage.

While M3GAN's humanoid casing was destroyed when she got out of control last time around, her code survived in not-quite-sleep mode. She's been an unseen but all-seeing presence in Gemma and Cady's home, which also serves as the lab where Gemma and her colleagues Cole (Brian Jordan Alvarez) and Tess (Jen Van Epps) continue their robotics work. M3GAN has way too much intimate knowledge of her inventor for Gemma's comfort, but when their lives are endangered, the robot makes a convincing case that only she can help them take down a new robo-threat. All she needs is a new body and a few upgrades.
That threat goes by the name Amelia (Ivanna Sakhno), the T-1000 to M3GAN's Model 101. Developed from the M3GAN template by the U.S. Army's Defense Innovation Unit in Palo Alto and overseen by Colonel Sattler (Timm Sharp), Amelia is introduced on a test mission near the Turkish-Iranian border, where she ignores her orders to rescue a kidnapped scientist, instead killing him and wiping out an entire research facility. Once Amelia has eliminated almost everyone involved in her creation, Gemma and Cady seem likely to be next on her list.

But there's an awful lot of plot to trudge through before Amelia's inevitable encounter with the rebirthed M3GAN. Some of that involves Gemma's advocacy for stricter AI control measures; her quasi-romance with fellow cautionary tech activist Christian (Aristotle Athani); her secret development with Cole and Tess of an AI-free mecha-suit that will equip humans with robot strength and stamina; the industrial espionage of tech billionaire Alton Appleton (Jemaine Clement), who believes that Gemma's new exosuit could be a game-changer with the addition of his neuro-chips; and the discovery of a killer robot dating back to 1984, dubbed Project Black Box, which has been locked in a vault, continuing to develop for decades. The ultimate fear is that Amelia will harness that mother-bot's power and unleash global chaos.

Naturally, there's also friction between rebellious Cady and her aunt, whose alarmism after the renegade M3GAN disaster in the first movie means computer science enthusiast Cady has to keep her own robotics projects hidden. Not that this thread is given the space to acquire much weight.

It's delightful to see M3GAN 2.0 sashay back to life and reappear in her customary retro-preppy look, just as it is to watch her bust her signature dance moves at an AI convention, wearing a cyber-babe disguise. But too often, the star attraction takes a back seat to the much less entertaining Amelia, an icy blonde killing machine like so many icy blonde killing machines before her, with none of M3GAN's sardonic wit. I got more laughs out of Gemma's smart-home system outmaneuvering a team of FBI agents. Sure, Amelia gets to do some cool stuff like scamper on all fours toward a target, scramble down a wall like a spider, rip the head off one poor unfortunate and neutralize entire tactical units with her dazzling fight skills. But the action mostly feels rote and lacking in suspense.

While it's unfair to criticize Johnstone for wanting to change things up, it's disappointing that he's made a Blumhouse-Atomic Monster production that has almost no connection to horror. The creepiness that offset the camp in the first movie is undetectable. McGraw and Williams (who's also a producer here) are no less appealing than they were in the original, and Gemma gets to step into the fray with gusto once M3GAN slips inside her head via a neuro-chip. Clement is a droll presence who seems to have wandered in from the set of a James Bond spoof ('Ooh, you're a naughty one,' Alton tells Amelia, his interest further aroused when she wallops him across the face). But he doesn't stick around long enough to help get through the messy patches. And Athani signals Christian's shadiness almost from his first appearance, which removes any surprise from the busy narrative contortions of the protracted climax.

The movie looks polished, thanks to Get Out cinematographer Toby Oliver's sleek widescreen visuals. But it becomes a drag as confusion spirals around who's controlling Amelia and how to stop her.
M3GAN herself remains a fabulous creation with a wicked sense of humor ('Hold onto your vaginas,' she warns Gemma and Cole as she takes control of a sports car), and the character's canny mix of sweetness and menace is by no means tapped out. But if the franchise is to continue, she needs to go back to the lab for reprogramming.