Read my lips: AI-dubbed films are debuting in cinemas
Dubbing critics have long grumbled about the pitfalls of mismatched audio and awkward lip-syncing, but new technology is quietly changing the face (and mouths) of international cinema.
Last week, Swedish sci-fi film Watch The Skies opened in US theatres – marketed as the world's first full-length theatrical feature to use AI for immersive dubbing, a process that makes the characters look as though they are speaking English.
XYZ Films partnered with AI start-up Flawless, whose TrueSync visual tool alters the characters' mouth movements so that their speech appears perfectly synced for an English-speaking audience.
'For the movie industry, this is a game changer,' producer Albin Pettersson declared in a behind-the-scenes trailer for the film.
'The Swedish language is a barrier when you want to reach out around the world.'
It's important to note that the AI tool has not replaced the actors: the original cast of Watch The Skies shot the film in Swedish and then recorded their English lines in a studio, keeping the production compliant with SAG-AFTRA guidelines.
'I think a lot of filmmakers and a lot of actors will be afraid of this new technology at first,' added writer and director Victor Danell. 'But we have creative control and to act out the film in English was a real exciting experience.'
Watch The Skies is the start of a long list of AI-dubbed international film collaborations between XYZ Films and Flawless set to be released in the US. They include French film The Book of Solutions, Korean flick Smugglers, Persian-language film Tatami, and German film The Light.

Related Articles

ABC News
Artificial intelligence is revolutionising classroom learning, but will it help or hinder students?
Adrian Camm's "digital twin" is uncanny, down to the voice and twitching eyebrows. The website chatbot means the Melbourne principal can be everywhere all at once, answering parents' questions and signing up new enrolments to his school, Westbourne Grammar. Mr Camm recorded just 15 minutes of speech for the AI, which cloned his voice and perfected his Australian accent. It can now converse in 100 different languages he freely admits he can't speak, from Korean to Ukrainian. "We're open and transparent about our uses of AI."

When ChatGPT launched in late 2022, independent school Westbourne Grammar embraced it, just as some state education departments around Australia began moving to ban its application in public schools. Programs like ChatGPT can write essays with little instruction from users, which has educators around the world concerned about the risk of cheating. "I don't think banning is ever the solution," Mr Camm said.

Westbourne Grammar has a long list of AI programs it uses, including ChatGPT, Google Gemini and Canva AI. With a few commands, students from Year 5 up can produce video games, interact with AI avatars, and generate AI art. The school told 7.30 it vets such programs to ensure they are appropriate under its own guidelines.

Year 7 student Ishana has been asked to use AI to create something that doesn't exist in real life. She types in "create a shark wearing a pink tutu riding a surfboard". It takes mere seconds before numerous different images appear on her screen, and she is able to print off a copy of the one she likes. But why not draw or paint it herself? "Not everyone has the ability or skills to do something like this," she told 7.30. "But having access to a laptop can allow you to be able to do all of this regardless of your age or how much you know."

AI is raising profound questions about how it is affecting the way we think. A recent MIT study compared brain activity between essay writers who used AI, search engines or just their brains, finding that "brain-only participants exhibited the strongest, most distributed networks" while those who used AI "displayed the weakest connectivity". So, will AI foster a generation of lazy thinkers?

The question has Jake Renzella, a computer science lecturer at UNSW, worried. "The real concern here is, are we outsourcing the learning when we ask students to do work with these tools?" Mr Renzella told 7.30. "The way the brain actually works when we are using these tools is changing. We can bundle this all up in a term called over-reliance. Are students relying too much on these technologies?"

For more than 40 years, computers have been classroom disruptors, but now the AI revolution is well advanced, with some state public schools trialling their own approved versions of AI software. In New South Wales, 50 public schools have been using a program called EduChat since early 2024. EduChat is based on models from the creator of ChatGPT, OpenAI, but draws on the NSW syllabus and, most importantly for schools, does not provide answers or write essays for students like its commercial counterpart. The program pushes back enthusiastically if it is asked to do the work for students: "I can't write an essay for you, but I can help you get started on your own!"

Year 11 English students at Plumpton High School in Western Sydney have been learning how to prompt the AI with their own questions about Shakespeare's Othello. They write their essays by hand and then ask the AI how they can make their writing better.
According to their teacher, Katherine Gonzaga, they've gone from writing 800 words in 40 minutes to up to 1,500 words in the same amount of time. "[Their vocabulary] has become more sophisticated and more critical and more evaluative," Ms Gonzaga told 7.30.

Plumpton High students told 7.30 the AI helped their learning. "It's in no way taking over our thinking, but rather pushing us to improve literacy skills," one student, Annacemone, said. Another student, Roma, insisted students were still using their brains. "I'm just scared that if AI does take over and we're not able to adapt that we could lose our critical thinking; that's where EduChat helps, it builds our critical thinking."

All Australian schools are expected to follow national guidelines, known as the Australian Framework for Generative Artificial Intelligence in Schools, which allow them to use AI provided it is used in an ethical and appropriate way. "We call it safe AI because if you're using EduChat the data stays within the department's system," deputy secretary of the NSW Education Department, Martin Graham, told 7.30. But like any AI, he conceded, it 'hallucinates', a technical term for confidently generating false or nonsensical output. Teachers and students in the trial have been told to expect errors in the AI and to interrogate them. "We've had almost a billion words through the product and you can absolutely say that some of them will not have been completely accurate in the same way that any AI is not completely accurate," Mr Graham said. "We do everything we can to minimise that."

AI programs like EduChat can provide instant feedback to individual students in a large class while a human teacher cannot. But if AI is smarter, faster, and more productive, then where does that leave the profession? Teachers aren't going anywhere, according to assistant principal at Chatswood Public School, Isobel McLoughlin, who has been teaching her Year 5s how to use EduChat. "We can ask AI to do all sorts of things, but it will never know if they've missed breakfast, it will never understand if they really struggled with the curriculum, or if they just need a bit of extra time to catch up," she told 7.30. Mr Graham said the department had no plans to replace teachers with AI.

ABC News
Australian authors challenge Productivity Commission's proposed copyright law exemption for AI
Australian authors are furious over a recent Productivity Commission (PC) interim report that says AI could deliver a $116 billion boost to Australia's economy over the next 10 years. The centrepiece of the report was a proposal to implement a text and data mining (TDM) exception to the Copyright Act, which would permit tech companies to use copyrighted work to train AI. In July, former Atlassian CEO Scott Farquhar made a similar suggestion in his address to the National Press Club, arguing that a TDM exception could "unlock billions of dollars of investment in Australia".

However, the PC report, Harnessing data and digital technology, has attracted strong criticism from the writing industry. Lucy Hayward, CEO of the Australian Society of Authors (ASA), said the proposal gave "a free pass" to multinational tech companies, such as Google, Meta and OpenAI, to continue using unauthorised copyrighted material to train their AI models. "Why should we create a situation where billion-dollar tech companies can profit off authors' work, but not the creators who made the work? It's an entirely absurd proposition," Hayward told ABC Arts.

While the government has yet to deliver a formal response to the interim report, Arts Minister Tony Burke stated that the unauthorised use of copyrighted material for commercial purposes constituted theft. "We have copyright laws," the minister said in a speech at the 2025 BookUp conference in Sydney.

Tech companies have already used unauthorised copyrighted material to train AI platforms. In March, The Atlantic published a tool that made it possible to search the LibGen database, an online trove of pirated books and academic papers that Meta used to train its generative AI language model. It followed similar revelations in 2023 that a database of pirated material known as Books3 had been used to train Meta's AI model Llama, Bloomberg's BloombergGPT and EleutherAI's GPT-J. The work of countless Australian authors appeared in the pirated databases, including Charlotte Wood, Tim Winton, Helen Garner and Richard Flanagan. A TDM exception would allow this type of use of copyrighted material without compensating the author or seeking their consent.

Commissioner Stephen King, one of the report's authors, told ABC RN Breakfast that "not everyone will be a winner". "There will be people who will lose their jobs because of this technology and those people need to be looked after."

Danielle Clode, the author of non-fiction titles including Koala (2022) and Killers in Eden (2011), fears writers are among those set to lose the most under the proposal. She said the report demonstrated the Productivity Commission's lack of understanding about how the arts sector operated. "The economic framework they're working with is completely unsuitable for creative industries," she said. Copyright fees are a valuable source of income for authors, who are among the lowest-paid arts workers in Australia, earning just $18,200 on average each year. "In Australia, we have a very fair and well-regulated copyright system which gives clarity to everybody who uses it," said Clode, who is also a board member of the ASA and the Copyright Agency.

Wenona Byrne, the inaugural director of Writing Australia, said any watering down of copyright laws was a concern for the sector. "We think the copyright law is fit for purpose. It has sustained the industry since 1968 and has accommodated a lot of technological change in that time," she said.
"We already know that writers earn very little from their creative work … We believe it's fair for writers who have spent the time creating these works, and in some cases, that's decades, to earn an income from their use." Byrne believes there will always be a readership for Australian stories, but acknowledges writing is not a sustainable career for many authors who hold down multiple jobs to pay the bills. A TDM exception would reduce authors' ability to earn income from their work and expose them to further economic precarity, she warned. "It would also disincentivise them to create the work in the first place," she said. "We need a rich culture; we need our contemporary Australian society to be reflected in a variety of works for the page and the stage. Anything that comes to disincentivise that creation is a problem for society as a whole." Large language models (LLMs) such as ChatGPT pose another, more existential, threat to authors. "If it's the case that you're able to train these large language models on Australian content, does it mean that it's possible to produce content drawing on all of that material that then replaces work by Australian authors and then they're no longer able to produce work that earns them money?" asked Alice Grundy, a visiting fellow at ANU and the managing editor at Australia Institute Press. Grundy believes the forecast $116 billion over 10 years is not worth the harm the proposal would cause the arts industry. "That's not that much money across the whole economy," she said. She also queries the benefit of a policy that delivers benefits offshore to multinational companies over Australian artists and authors. "At what point do you say productivity is not worth as much to us as maintaining our culture, as continuing to foster our writers, our artists, our other creators?" Grundy said. "At what point do … we say that work is less important to us than some extra dollars in the economy?" Kate Kruimink, whose novel Heartsease won the 2025 Tasmanian Literary Awards Premier's Prize for Fiction, found two of her three books in the LibGen database. She said the unauthorised use of her work had not hurt her economically — yet. She suspects that will change. "The end result … is going to be that there will be a glut of AI-generated creative works on the market. "There are always going to be people who care about the human connection and who don't want the AI-created work, especially if it's being created and trained unethically. But at the same time, I think it's going to be so much harder to survive as a creative worker." Kruimink also questions the Productivity Commission's understanding of the concept of productivity. "It's not about the productivity of creative workers," she said. Geelong writer Rhett Davis, who recently published his second novel, Arborescence, said he would be reluctant to allow his work to be used to train AI when the outcome was a tool that could simulate his writing. "It seems like a strange kind of deal for me, regardless of how much they pay." He believes compensation for authors should be a basic requirement for the use of their work. "It shouldn't just be taken for free," he said. "If you're going to use something, there needs to be an agreement to pay for it. It's a pretty basic copyright principle. That is how we continue to make a living as artists." He is not alone in his view. According to a 2025 study by Macquarie University, 79 per cent of authors would refuse permission for their work to be used to train AI. 
Kruimink is another author who, given the choice, would "opt out" from allowing her work to train generative AI, a technology she considers "unethical". She believes generative AI undermines the meaning of creative work. "What is creative work for? It's a deeply human endeavour, and to me it's based on the principle of human exchange. The meaning of the work — my writing, for example — is not only in its consumption, it's also in its creation. If you try to cut that exchange in half, I think you remove the soul of what it is."

ASA CEO Lucy Hayward pushed back against claims that copyright was a barrier to investment and innovation. "That's absolutely not the case. We know that tech and AI are booming in Australia," she said. "We're in the top five global destinations for data centres; we're a world leader in quantum computing, Amazon has just invested $20 billion in data centre infrastructure in Australia, so it's not the case that Australian copyright law, which is robust and protects creators, is hindering any kind of innovation and investment."

Hayward believes the PC report overlooks the economic opportunities licensing arrangements could offer. "Instead of considering ways to legitimise this theft, why aren't we exploring ways to protect [authors'] rights and ensure that generative AI brings an opportunity to Australian creators and is not simply extracting the value of their content?" Local creatives — who contribute to the $60 billion arts industry — should not be cut out of the picture, she argues.

"How can we ensure that Australian authors and illustrators who are producing Australian content for Australian audiences can enjoy the financial benefits of the AI boom when their work is vital to the development of this technology?

"Do we really want to give away their intellectual property to multinational tech giants for free so they can continue to enrich themselves and continue extracting value from the Australian economy?

"Or do we want to find a sensible kind of middle ground where we have generative AI tools — they're not going away; there's going to be adoption of AI tools in the workplace — but authors can be reasonably compensated for their vital contribution to the development of the tools?"

Kimberlee Weatherall, a law professor at the University of Sydney and co-director of the Centre for AI, Trust and Governance, told RN Breakfast that AI developers in Australia were constrained by an uncertain regulatory environment. "If you want to do it legitimately and responsibly, there are very real challenges trying to identify who to license from," she said. "It may be simply impossible to identify all the copyright owners and get licences from them all; there's no central system for doing so, and if you want to develop AI here in Australia, even if you want to do it purely for research purposes, it's not entirely clear you could do that legitimately under copyright law." It was a situation in which "everyone loses", she said. "In an ideal world, we would try to do something very different; we would try to find a way to compromise between these different interests so that you could have local AI development responsibly with some kind of way to recognise the interests of creators."

The ASA is calling for the government to reject the PC's proposal and instead implement a licensing system to compensate copyright holders for the use of their work. The organisation also wants to see the introduction of new legislation to regulate the use of AI.
Writing Australia's Wenona Byrne said authors should be included in any consultation about AI regulation. "We'd like to see the tech companies working with Australian creatives — that's fundamental," she said. "We know that the work of Australian writers is of very high value and using it without their consent or remuneration is akin to theft of their copyright material.

"We want to look at different ways that generative AI can compensate original creators for their work, whether that's through licensing models or royalty schemes that would see the fair and equitable treatment of the creators."

The issue of AI — including the proposed TDM exemption — will be on the agenda at the first Writing Australia council meeting, scheduled for late August. "The writing industry is a $2 billion industry in Australia; it's one that we are rightly proud of, and anything that would diminish the potential for that industry to thrive is something [we] would be very concerned about."

Herald Sun
Imagion teams up with Wayne State University for AI push into cancer detection
- Imagion to collaborate with MRI experts at Wayne State University to strengthen its push into AI-enabled cancer diagnostics
- Collaboration will establish optimised imaging protocols for Imagion's MagSense molecular-imaging-agent tech
- Bolsters the relationship Imagion has with Siemens Healthineers, the leading manufacturer of MRI equipment

Special Report: Imagion Biosystems has strengthened its push into AI-enabled cancer diagnostics through a new collaboration with leading MRI experts at Wayne State University School of Medicine in the US.

Imagion Biosystems (ASX:IBX) has signed a collaborative service agreement with Dr Mark Haacke and Dr Sagar Bush to establish optimised imaging protocols for its proprietary MagSense molecular imaging agent technology. The collaboration further bolsters the existing relationship Imagion has with Siemens Healthineers, the world's leading manufacturer of MRI equipment. Both researchers have had a long-standing collaboration with Siemens and are equipped with its top-of-the-line scanners.

The research will focus on quantitative MRI sequences compatible with Siemens and other commercially available MRI scanners. By combining advanced quantitative imaging with the MagSense agents, Imagion aims to enable AI-based interpretation and deliver more accurate and precise data for cancer detection and patient care. With Siemens' backing, Imagion plans to incorporate the optimised MRI sequences developed by the researchers into its upcoming phase II clinical trial for its HER2 breast cancer imaging agent.

Other key objectives of the collaborative service agreement include:

- Determining the lowest dose of the MagSense imaging agent needed to achieve detection
- Establishing MRI sequences and protocols optimised for MagSense
- Transferring the optimised protocols to clinical sites for use in the planned MagSense HER2 Phase 2 study
- Using quantitative imaging techniques that could yield AI-compatible image data to improve diagnostic accuracy

'Exciting development for the medical imaging field'

Haacke has been a pioneer in the field of quantitative MRI for decades. Collaboration with his team is set to lay the foundation for future automated analysis of MagSense images using AI. By implementing quantitative MR imaging techniques, Imagion said the specific signature of MagSense imaging agents will be uniquely detectable in affected tissue. By applying advanced post-processing techniques and AI interpretation to these images, MagSense has the potential to enable not only automatic detection and differentiation of normal versus cancerous tissue, but also improved staging, tracking and treatment monitoring.

'Imagion's MagSense imaging agents are a very exciting development for the medical imaging field as it finally brings molecular specificity to MRI,' said Haacke. 'I have spent decades developing quantitative, high-resolution imaging to identify new biomarkers and explore disease etiology. Working with Imagion is a natural extension of that work, adding increased specificity to the already high resolution and sensitivity of MRI and powering the future of AI-based diagnostics.'

Solving a key barrier holding back AI diagnostics

Imagion said conventional MRI, while producing excellent images of soft tissue, is qualitative by nature. This means it relies on subjective interpretation by radiologists to make a diagnostic determination based on differences in contrast of the various tissues.
As a result, imaging findings still require confirmatory biopsies to achieve diagnostic certainty, creating challenges in analysing small lesions or early-stage disease. The lack of specificity is considered a fundamental hurdle for radiologic AI models, which are limited in their diagnostic capability by the subjectivity of the training data and the accuracy of readers' interpretations.

Quantitative imaging, on the other hand, provides specific information about tissue characteristics on a pixel-by-pixel basis. These imaging techniques can measure precise amounts of substances such as iron, water, calcium, or fat in each region of interest.

Imagion said that combining quantitative MRI sequences with its MagSense imaging agent could overcome one of the biggest obstacles to AI in medical imaging. MagSense particles attach to cancer cells, creating a unique, measurable signal that can be distinguished from healthy tissue automatically. This could lead to earlier and more accurate cancer diagnoses, reduce differences in how individual radiologists interpret scans, and make advanced imaging available beyond specialised medical centres. The company said including these quantitative sequences in its phase II trial for HER2 breast cancer will accelerate the development of AI diagnostics by providing early data to train and refine AI models.

Exciting time for Imagion

Imagion chief business officer Ward Detwiler said the company was very excited to have the collaboration and support of Haacke's research team. 'Dr Haacke literally wrote the book on MRI, which anyone in the MRI space will recognise as required reading,' he said. 'Combining their knowledge and expertise in quantitative MRI with the specificity of our targeted MagSense imaging agents, we believe we can significantly improve the diagnostic utility of the images by introducing quantitative data to enable precise, AI-based detection.'

This article was developed in collaboration with Imagion Biosystems, a Stockhead advertiser at the time of publishing. This article does not constitute financial product advice. You should consider obtaining independent advice before making any financial decisions.