
How Google is working with Hollywood to bring AI to filmmaking
Google, having developed cutting-edge AI tools spanning everything from script development to text-to-video generation, is positioned as a key player in AI-assisted filmmaking. At the center of Google's cinema ambitions is Mira Lane, the company's vice president of tech and society and its point person on Hollywood studio partnerships. I spoke with Lane about Google's role as a creative partner to the film industry, current Hollywood collaborations, and how artists are embracing tools like Google's generative video editing suite Flow for preproduction, previsualization, and prototyping. This interview has been edited for length and clarity.
Can you tell me about the team you're running and your approach to AI in film?
I run a team called the Envisioning Studio. It sits within this group called Technology and Society. The whole ambition around the team is to showcase possibilities. . . . We take the latest technologies, models, and products, and we co-create with society, because there's an ethos here that if you're going to disrupt society, you need to co-create with people, collaborate with them, and give them a real say in how that technology unfolds. I think too often a lot of technology companies will make something in isolation and then toss it over the fence, and then various parts of society are the recipients of it and they're reacting to it. I think we saw that with the language models that came out three years ago or so, where things just kind of went into the industry and into society and people struggled to engage with them in a meaningful way.
My team is very multidisciplinary. There are philosophers on the team, researchers, developers, product thinkers, designers, and strategists. With the creative industry—mostly film this year; last year we worked on music as well—we've been doing fairly large collaborations. We bring filmmakers in, we show them what's possible, we make things with them, we embed with them sometimes, we hear their feedback. Then they get to shape things like Flow and Veo that have been launched. I think we're learning a tremendous amount in that space, because anything in the creative and art space right now has a lot of tension, and we want to be active collaborators there.
Have you been able to engage directly with the writers' and actors' unions?
We kind of work through the filmmakers on some of those. Darren Aronofsky, when we brought him in, actually engaged with the writers' and actors' unions to talk about how he was going to approach filmmaking with Google—the number of staff and actors, the way those folks would be embedded in the teams, and the types of projects the AI tools would be focused on. We do that through the filmmakers, and we think it's important to do it in partnership with them, because it's in the context of what we're doing versus in some abstract way. That's a very important relationship to nurture.
Tell me about one of the films you've helped create.
Four weeks ago at Tribeca we launched a short film called Ancestra, created in partnership with Darren's production company, Primordial Soup. It's a hybrid type of model where there were live-action shots and AI shots. It's a story about a mother and a baby who's about to be born with a hole in its heart. It's a short about the universe coming together to help birth that baby and to make sure that it survives. It was based on a true story of the director being born with a hole in her heart.
There are some scenes that are just really hard to shoot, and babies—you can't have infants younger than 6 months on set. So how do you show an accurate depiction of a baby? We took photos from when the director was born and constructed an AI version of that baby, and then generated it being held in the arms of a live actress. When you watch the film, you can't tell the baby is AI-generated, but the scene is actually composed of half live action and half AI-generated footage.
We had 150 people, maybe close to 200, working on that short film—the same number of people you would typically have working on a [feature-length] film. We saw some shifts in roles and new types of roles being created. There's usually a CGI unit on these films, and we think there's probably going to be an AI unit as well.
It sounds like you're trying to play a responsible role in how this impacts creators. What are the fruits of that approach?
We want to listen and learn. It's very rare for a technology company to develop the right thing from the very beginning. We want to co-create these tools, because if they're co-created they're useful, they're additive, and they're an extension and augmentation, especially in the creative space. We don't want people to have to contort around the technology. We want the technology to be situated relative to what they need and what people are trying to do.
There's a huge aspect of advancing the science, advancing the latest and greatest model development, advancing tooling. We learn a lot from engaging with . . . filmmakers. For example, as we were launching and developing Flow [a generative video editing suite], a lot of the feedback from our filmmakers was, 'Hey, this tool is really helpful, but we work in teams. So how can you extend this to be a team-based tool instead of a tool for a single individual?' We get a lot of really great feedback in terms of just core research and development, and then it becomes something that's actually useful.
That's what we want to do. We want something that is helpful and useful and additive. We're having the conversations around roles and jobs at the same time.
How is this technology empowering filmmakers to tell stories they couldn't before?
In the film industry, they're struggling right now to get really innovative films out because a lot of the production studios want things that are guaranteed hits, and so you're starting to see certain patterns of movies coming out. But filmmakers want to tell richer stories. With the one that we launched at Tribeca, the director was like, 'I would never have been able to tell this story. No one would have funded it, and it would have been incredibly hard to do. But now with these tools I can get that story out there.' We're seeing a lot of that—people generating and developing things they would not have been funded for in the past—and that gets great storytelling out the door as well. It's incredibly empowering.
These tools are incredibly powerful because they reduce the costs of some of the things that are really hard to do. Certain scenes are very expensive. You want to do a car chase, for example—that's a really expensive scene. We've seen some people take these tools and create pitches that they can then take to a studio and say, 'Hey, would you fund this? Here's my concept.' They're really good at the previsualization stage, and they can kind of get you in the door. Whereas in the past, maybe you brought storyboards in or it was more expensive to create that pitch, now you can do that pretty quickly.
Are we at the point where you can write a prompt and generate an entire film?
I don't think the technology is there where you can write a prompt, generate an entire film, and have it land in the right way. There is so much involved in filmmaking that is beyond writing a prompt. There's character development and the right cinematography. . . . There's a lot of nuance in filmmaking. We're pretty far from that. If somebody's selling that, I would be really skeptical.
What I would say is you can generate segments of that film that are really helpful, and [AI] is great for certain things. For short films it's really good. For feature films, there's still a lot of work in the process. I don't think we're at the stage where you're going to automate out the artist in any way. Nobody wants that, necessarily. Filmmaking and storytelling are actually pretty complex. You need good taste as well; there's an art to storytelling that you can't really automate.
Is there a disconnect between what Silicon Valley thinks is possible and what Hollywood actually wants?
I think everybody thinks the technology is further along than it is. There's a perception that the technology is much more capable. I think that's actually where some of the fear comes from—people are imagining what it can do based on the stories that have been told about these technologies. We just put it in the hands of people and they see the contours of it, the edges, and what it's good and bad at, and then they're a little less worried. They're like, 'Oh, I understand this now.'
That said, I look at where the technology was two years ago for film and where it is now. The improvements have been remarkable. Two years ago every [generated] film had six fingers and everything was morphed and really not there—there was no photorealism. You couldn't do live-action shots. And in two years we've made incredible progress. I think in another two years, we're going to have another big step change. We have to recognize we're not as advanced as we think we are, but also that the technology is moving really fast. These partnerships are important because if we're going to have this sort of accelerated technology development, we need these parts of our society that are affected to be deeply involved and actively shaping it so that the thing we have in two years is what is actually useful and valuable in that industry.
What kinds of scenes or elements are becoming easier to create with AI?
Anything that is complex but that you tend to see a lot of starts to get easier, because we have a lot of training data around it. Car chases, for example—you've seen lots of movies with car chases in them. There are scenes of the universe—we've got amazing photography from the Hubble telescope. We've got great microscopic photography. All of those things that are complicated and hard to do in real life you can generate a lot more easily, because there are lots of examples and it's been done in the past.
The ones that are hard are ones where you want really strong eye contact between characters, and where the characters are showing a more complex range of emotions.
How would you describe where we're at with the uptake of these tools in the industry?
I think we're in a state where there's a lot of experimentation. It's that stage where something new has been developed, and what you tend to do with something new is try to re-create the past—what you used to do with [older] tools. We're in that stage where people are trying to use these new tools to re-create the same kinds of stories they used to tell, but the real gem is when you jump past that and do new types of things and new types of stories.
I'll give you one example. Brian Eno did a set of generative films; every time you went to the theater you saw a different version of the film. It was generated, it was different, it was unique. It still had the same backbone, but it was a different story every time you saw it. That's a new type of storytelling. I think we're going to see more things like that. But first we have to get through this phase of experimentation and understanding the tools, and then we'll get to all the new things we can do with them.