Game of Thrones star Rory McCann speaks out on joining Star Wars as Baylan Skoll in Ahsoka: "I think it's the right decision to carry on his storyline"
Game of Thrones star Rory McCann has spoken out for the first time on taking over as Baylan Skoll in Ahsoka season 2.
McCann is stepping in for Ray Stevenson as the former Jedi, after Stevenson sadly died in 2023. The casting was confirmed at Star Wars Celebration 2025.
"I think it's the right decision to carry on his storyline, not just cut it off," McCann told ComicBook.com. "We've done it before with other things. I hope the fans embrace it and I'll do my best. I'm just starting now, so I'm just trying to be in the right zone for doing it. It's pretty bloody exciting. I mean, I remember being a kid with my dad, going to the first one probably in the late '70s and '80s. Now I'm training with a lightsaber at night, so it's pretty exciting."
"It was a challenge for me to even consider continuing for a while," Ahsoka creator Dave Filoni said at this year's Celebration of Stevenson's passing. "But I have a great support group with Jon [Favreau] and Rosario [Dawson]. And I found a way, and I had Ray in my head. And [I'm] grateful for all those conversations with him about Baylan, so I understood what to do. It just took a while to get there. But I'm very confident now that Ray would be happy with the direction of the character that we've chosen."
Also at Celebration, it was confirmed that Hayden Christensen will return as Anakin Skywalker in season 2, while another major Star Wars character, Admiral Ackbar, will face off against Grand Admiral Thrawn.
A first look of sorts was also played for the crowd, featuring concept art spliced together with a voiceover that teased giant evil droids and battles in two galaxies.
Ahsoka season 2 doesn't yet have a release date. While you wait, check out our guide to all the upcoming Star Wars movies and shows for everything else the galaxy far, far away has in store.
Related Articles
I'm expecting the Wake Up Dead Man trailer during Netflix Tudum event, but here's what I really want to know
Netflix is ready to turn on the hype machine with its Netflix Tudum event, streaming live on Netflix at 8 pm ET/5 pm PT on Saturday, May 31. While live performances and star appearances are part of the proceedings, the big draw for Netflix Tudum is that it's where the streamer is expected to show exclusive looks at some of its biggest upcoming TV shows and movies. Right near the top of that list of anticipated titles is Wake Up Dead Man: A Knives Out Mystery, the third Benoit Blanc movie starring Daniel Craig and pegged for a 2025 release.

While Netflix has officially included Wake Up Dead Man in the Netflix Tudum lineup, it has not said exactly what it will be sharing from the movie. If I had to bet, I would say the evening will likely bring our first look at the Wake Up Dead Man trailer. As excited as I am to get a first look at the third movie in Rian Johnson's Knives Out franchise, that is not the bit of news I am most interested in hearing. Instead, I want to know what is going on with the Wake Up Dead Man release: not just its release date, but if, when, where and for how long Wake Up Dead Man will play in movie theaters.

Netflix, famously, is not all that interested in the movie theater business. The majority of its movies never release in theaters, instead premiering on the streaming platform right away for subscribers. The lone exception is when Netflix wants to position a movie for potential Oscars. The Academy Awards require that eligible movies have an exclusive release in movie theaters for at least seven days in at least one qualifying market (Los Angeles, New York and a handful of other major cities qualify). Netflix movies that have earned nominations have typically done the bare minimum required to become Oscar eligible, then been pulled from theaters and made available to stream, either right away or after a slight delay.

That was the case for Glass Onion, the previous Knives Out movie. Netflix gave Glass Onion a limited release in 600 movie theaters for one week in late November 2022, pulled it, never reported box office grosses for the movie and then put it on the streaming platform around Christmas. In terms of awards, Glass Onion earned a single Oscar nomination, for Best Adapted Screenplay.

I fully expect that to be the minimum of what Netflix will do with Wake Up Dead Man, but I'm holding out hope that the streamer may finally realize it can have its cake and eat it too: a box office smash and then a streaming hit. Multiple studies and analyses from the last five years have shown evidence that movies that first get a theatrical release perform better on streaming services than those that simply premiere on streamers.

To examine that claim, look at Netflix itself. After One of Them Days, one of the best-reviewed movies of 2025 and a solid box office performer, premiered on Netflix in late March, it spent three straight weeks in the top 10 of Netflix's most-watched movies. Conversely, one of the big Netflix original movies released so far this year, Havoc, spent only two weeks in the top 10. In fairness, Vince Vaughn's Nonnas has matched One of Them Days' three weeks in the top 10 and could potentially top it (that will be determined next week).

But whether or not that happens, the idea that a theatrical release gives a title added cachet when it hits streaming appears to have merit. That's likely why many other streamers and studios, including Apple TV Plus, Prime Video and Warner Bros., have opted to put many of their biggest movies in theaters before debuting them on streaming. Netflix has shown signs that it is starting to budge a little, including announcing earlier this year that Greta Gerwig's Chronicles of Narnia movie will be released in IMAX movie theaters around the world for two weeks in 2026. However, Netflix CEO Ted Sarandos said at the time that there was 'no change at all' to Netflix's theatrical strategy.

I hope Sarandos and Netflix have had a change of heart and realize the boon that could come from releasing their movies exclusively in theaters for an extended run, especially a star-studded one like Wake Up Dead Man, which in addition to Craig features Jeremy Renner, Josh O'Connor, Cailee Spaeny, Josh Brolin, Mila Kunis, Andrew Scott, Kerry Washington, Thomas Haden Church and Glenn Close as potential suspects. Maybe not a run like Sinners or A Minecraft Movie, but a wide release so everyone who wants to see it on the big screen can easily do so. That's what I'm hoping will be the big takeaway from the Netflix Tudum event, though I'm also just excited to see the Wake Up Dead Man trailer.
Data centers are at the heart of the AI revolution and here's how they are changing
As demand for AI and cloud computing soars, pundits are suggesting that the world is teetering on the edge of a potential data center crunch, where capacity can't keep up with the digital load. Concern and hype have sent vacancy rates plummeting: in Northern Virginia, the world's largest data center market, they have fallen below 1%. Echoing past fears of "peak oil" and "peak food," the spotlight now turns to "peak data." But rather than stall, the industry is evolving, adopting modular builds, renewable energy and AI-optimized systems to redefine how tomorrow's data centers will power an increasingly digital world.

Future data centers will increasingly move away from massive centralized facilities alone, embracing smaller, modular and edge-based designs. The sector is already splitting into hyperscale data centers on one end and smaller, edge-oriented facilities on the other. Modular and edge data centers can be built in a few months and tend to sit closer to end users to reduce latency. Unlike hyperscale campuses, whose facilities often cover millions of square feet, these smaller data centers are sometimes built into repurposed buildings such as abandoned shopping malls, empty office towers and disused factories, helping to regenerate former industrial brownfield sites. Leaner centers can be rapidly deployed and tailored to specific workloads such as autonomous vehicles and AR.

To address energy demands and grid constraints, future data centers will increasingly be co-located with power generation facilities, such as nuclear or renewable plants. This reduces reliance on strained grid infrastructure and improves energy stability. Some companies are investing in nuclear power, which provides massive, always-on capacity that is also free of carbon emissions. Modular reactors are being considered to overcome grid bottlenecks, long wait times for power delivery and local utility limits.

Similarly, data centers will increasingly be built in areas where the climate reduces operational strain. Lower cooling costs and access to water enable the use of energy-efficient liquid-cooling systems instead of air cooling, so expect more data centers to pop up in places like Scandinavia and the Pacific Northwest.

Artificial intelligence will play a major role in managing and optimizing data center operations, particularly for cooling and energy use. For instance, reinforcement learning algorithms are being used to cut energy consumption by optimizing cooling systems, achieving up to 21% energy savings (a toy sketch of the idea appears below). Similarly, fixes like replacing legacy servers with more energy-efficient machines, built around newer chips or thermal designs, can significantly expand compute capacity without requiring new premises.

Instead of only building new facilities, future capacity will be expanded by refreshing hardware with newer, denser and more energy-efficient servers. This allows for more compute power in the same footprint, enabling quick scaling to meet surges in demand, particularly for AI workloads. These power-hungry centers are also putting a strain on electricity grids, so future data centers will lean on new solutions such as load shifting to optimize energy efficiency.
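To make the reinforcement-learning idea above a little more concrete, here is a minimal, illustrative sketch in Python. It is not drawn from any real facility or vendor tooling: the energy model, the setpoints and every number in it are assumptions. It only shows the shape of the approach, an agent that tries cooling setpoints, observes simulated energy use and converges on the cheapest safe option.

```python
import random

# Toy simulator: energy draw (kW) as a function of cooling setpoint (deg C).
# Purely illustrative numbers, not taken from the article or any real site.
def simulated_energy_kw(setpoint_c: float) -> float:
    base = 500.0                                             # IT load and overheads
    cooling_cost = 40.0 * max(0.0, 27.0 - setpoint_c)        # colder = more chiller work
    overheat_penalty = 300.0 if setpoint_c > 27.0 else 0.0   # throttling risk when too warm
    return base + cooling_cost + overheat_penalty + random.gauss(0.0, 10.0)

# Epsilon-greedy bandit: learn which setpoint minimizes average energy use.
setpoints = [22.0, 23.0, 24.0, 25.0, 26.0, 27.0]
estimates = {s: 0.0 for s in setpoints}   # 0 kW start is optimistic, so every arm gets tried
counts = {s: 0 for s in setpoints}
epsilon = 0.1

for _ in range(5000):
    if random.random() < epsilon:
        choice = random.choice(setpoints)                      # explore
    else:
        choice = min(setpoints, key=lambda s: estimates[s])    # exploit current best guess
    energy = simulated_energy_kw(choice)
    counts[choice] += 1
    # Incremental mean update of the estimated energy for this setpoint.
    estimates[choice] += (energy - estimates[choice]) / counts[choice]

best = min(setpoints, key=lambda s: estimates[s])
print(f"Learned setpoint: {best:.0f} C, estimated load about {estimates[best]:.0f} kW")
```

Real deployments reportedly use far richer controllers with many sensors, safety constraints and supervised fallbacks; the point here is only that the control loop can be learned from feedback rather than hand-tuned.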
Google is already partnering with PJM Interconnection, the largest electrical grid operator in North America, to use AI to automate tasks such as viability assessments of connection applications, improving grid efficiency. The issue is typically not a lack of energy but insufficient transmission capacity. Fortunately, data centers usually run well below full capacity precisely to accommodate future growth, and that headroom helps facilities absorb unexpected traffic spikes and rapid scaling needs without new construction.

Future data center locations will be chosen based on climate efficiency, grid access and zoning policies, but also on the availability of an AI-skilled workforce. Data centers aren't just server rooms: they're among the most complex IT infrastructure projects in existence, requiring seamless power, cooling, high-speed networking and top-tier security. Building them involves a wide range of experts, from engineers to logistics teams, coordinating everything from semiconductors to industrial HVAC systems. Data centers will thus drive up demand for engineers specializing in high-performance networking, thermal management, power redundancy and advanced cooling.

It's clear that the recent surge in infrastructure demand, to power GPUs and high-performance computing, is being driven primarily by AI. Training massive models like OpenAI's GPT-4 or Google's Gemini requires immense computational resources, consuming GPU cycles at an astonishing rate. These training runs often last weeks and involve thousands of specialized chips, drawing heavily on power and cooling infrastructure. But the story doesn't end there: even once a model is trained, running it in real time to generate responses, make predictions or process user inputs (so-called AI inference) adds a new layer of energy demand. While not as intense as training, inference must happen at scale and with low latency, placing a steady, ongoing load on cloud infrastructure.

However, here's a nuance that's frequently glossed over in the hype: AI workloads don't scale in a straightforward, linear fashion. Doubling the number of GPUs or the size of a model will not always lead to proportionally better results (a rough numerical illustration appears at the end of this article). Experience has shown that as models grow, performance gains may taper off or introduce new challenges, such as brittleness, hallucination or the need for more careful fine-tuning. In short, the current AI boom is real, but it may not be boundless. Understanding the limits of scale and the nonlinear nature of progress is crucial for policymakers, investors and businesses alike as they plan for data center demand shaped by AI's growth.

The data center industry therefore stands at a pivotal crossroads. Far from buckling under the weight of AI tools and cloud-driven demand, it's adapting at speed through smarter design, greener power and more efficient hardware. From modular builds in repurposed buildings to AI-optimized cooling systems and co-location with power plants, the future of data infrastructure will be leaner, more distributed and strategically sited. As data becomes the world's most valuable resource, the facilities that store, process and protect it are becoming smarter, greener and more essential than ever.

We list the best colocation providers.
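To put the nonlinear-scaling point above into numbers, here is a small, purely illustrative Python calculation. It uses a power-law loss curve of the kind reported in scaling-law studies, with made-up constants (E, A and ALPHA are assumptions, not measured values), to show how each doubling of model size buys a smaller improvement.

```python
# Illustrative only: assumed irreducible loss, scale factor and exponent.
E, A, ALPHA = 1.7, 400.0, 0.34

def loss(n_params: float) -> float:
    """Modelled loss for a model with n_params parameters."""
    return E + A * n_params ** (-ALPHA)

prev = None
for n in [1e9, 2e9, 4e9, 8e9, 16e9]:
    current = loss(n)
    delta = "" if prev is None else f"  (improvement {prev - current:.3f})"
    print(f"{n / 1e9:>4.0f}B params -> loss {current:.3f}{delta}")
    prev = current
# Each doubling of parameters (and, roughly, of GPUs) yields a smaller drop
# in loss, so twice the hardware does not deliver twice the quality.
```

Real scaling behavior also depends on training data, compute budget and architecture; the point here is only the shape of the curve, not the specific numbers.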
This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here.