‘You can make really good stuff – fast': new AI tools a gamechanger for film-makers


The Guardian · 3 days ago
A US stealth bomber flies across a darkening sky towards Iran. Meanwhile, in Tehran a solitary woman feeds stray cats amid rubble from recent Israeli airstrikes.
To the uninitiated viewer, this could be a cinematic retelling of a geopolitical crisis that unfolded barely weeks ago – hastily shot on location, somewhere in the Middle East.
However, despite its polished production look, it wasn't shot anywhere, there is no location, and the woman feeding stray cats is no actor – she doesn't exist.
The engrossing footage is the 'rough cut' of a 12-minute short film about last month's US attack on Iranian nuclear sites, made by the directors Samir Mallal and Bouha Kazmi – and made entirely with artificial intelligence.
The clip is based on a detail the film-makers read in news coverage of the US bombings – a woman who walked the empty streets of Tehran feeding stray cats. Armed with the information, they have been able to make a sequence that looks as if it could have been created by a Hollywood director.
The impressive speed and, for some, worrying ease with which films of this kind can be made has not been lost on broadcasting experts.
Last week Richard Osman, the TV producer and bestselling author, said that an era of entertainment industry history had ended and a new one had begun – all because Google has released a new AI video-making tool used by Mallal and others.
'So I saw this thing and I thought, "well, OK, that's the end of one part of entertainment history and the beginning of another",' he said on The Rest is Entertainment podcast.
Osman added: 'TikTok, ads, trailers – anything like that – I will say will be majority AI-assisted by 2027.'
For Mallal, an award-winning London-based documentary maker who has made adverts for Samsung and Coca-Cola, AI has provided him with a new format – 'cinematic news'.
The Tehran film, called Midnight Drop, is a follow-up to Spiders in the Sky, a recreation of a Ukrainian drone attack on Russian bombers in June.
Mallal, who directed Spiders in the Sky on his own, was able to make the film about the Ukraine attack within two weeks – a project that would have cost millions, and taken at least two years including development, to make pre-AI.
'Using AI, it should be possible to make things that we've never seen before,' he said. 'We've never seen a cinematic news piece before turned around in two weeks. We've never seen a thriller based on the news made in two weeks.'
Spiders in the Sky was largely made with Veo3, an AI video generation model developed by Google, and other AI tools. The voiceover, script and music were not created by AI, although ChatGPT helped Mallal edit a lengthy interview with a drone operator that formed the film's narrative spine.
Google's film-making tool, Flow, is powered by Veo3. It also creates speech, sound effects and background noise. Since its release in May, the impact of the tool on YouTube – also owned by Google – and social media in general has been marked. As Marina Hyde, Osman's podcast partner, said last week: 'The proliferation is extraordinary.'
Quite a lot of it is 'slop' – the term for AI-generated nonsense – although the Olympic diving dogs have a compelling quality.
Mallal and Kazmi aim to complete the film, which will intercut the Iranian woman's story with the stealth bomber mission and will run six times the length of Spiders in the Sky's two minutes, in August. It is being made with a mix of models including Veo3, OpenAI's Sora and Midjourney.
'I'm trying to prove a point,' says Mallal. 'Which is that you can make really good stuff at a high level – but fast, at the speed of culture. Hollywood, especially, moves incredibly slowly.'
He adds: 'The creative process is all about making bad stuff to get to the good stuff. But the process is accelerated with AI. We have the best bad ideas faster.'
Mallal and Kazmi also recently made Atlas, Interrupted, a short film about the 3I/Atlas comet, another recent news event, which has appeared on the BBC.
David Jones, the chief executive of Brandtech Group, an advertising startup using generative AI – the term for tools such as chatbots and video generators – to create marketing campaigns, says the advertising world is about to undergo a revolution due to models such as Veo3.
'Today, less than 1% of all brand content is created using gen AI. It will be 100% that is fully or partly created using gen AI,' he says.
Netflix also revealed last week that it used AI in one of its TV shows for the first time.
However, in the background of this latest surge in AI-spurred creativity lies the issue of copyright. In the UK, the creative industries are furious about government proposals to let models be trained on copyright-protected work without seeking the owner's permission – unless the owner opts out of the process.
Mallal says he wants to see a 'broadly accessible and easy-to-use programme where artists are compensated for their work'.
Beeban Kidron, a cross-bench peer and leading campaigner against the government proposals, says AI film-making tools are 'fantastic' but 'at what point are they going to realise that these tools are literally built on the work of creators?' She adds: 'Creators need equity in the new system or we lose something precious.'
YouTube says its terms and conditions allow Google to use creators' work for making AI models – and denies that all of YouTube's inventory has been used to train its models.
Mallal calls his use of AI to make films 'prompt craft', playing on the term for giving instructions to AI systems. When making the Ukraine film, he says he was amazed at how quickly a camera angle or lighting tone could be adjusted with a few taps on a keyboard.
'I'm deep into AI. I've learned how to prompt engineer. I've learned how to translate my skills as a director into prompting. But I've never produced anything creative from that. Then Veo3 comes out, and I said, 'OK, finally, we're here.''

Related Articles

Microsoft servers hacked by Chinese state-backed groups, firm says

BBC News

an hour ago



Chinese "threat actors" have hacked Microsoft's SharePoint document software servers and targeted the data of the businesses using it, the firm has said. State-backed Linen Typhoon and Violet Typhoon, as well as China-based Storm-2603, were said to have "exploited vulnerabilities" in on-premises SharePoint servers, the kind used by firms, but not in its cloud-based service. The US tech giant has released security updates in response and has advised all on-premises SharePoint server customers to install them.
"Investigations into other actors also using these exploits are still ongoing," Microsoft said in a statement. The firm said it had "high confidence" the hackers would continue to target systems which have not installed its security updates. It added that it would update its website blog with more information as its investigation continued.
Microsoft said it had observed attacks in which hackers had sent a request to a SharePoint server "enabling the theft of the key material by threat actors".
Charles Carmakal, chief technology officer at Mandiant Consulting, a division of Google Cloud, told the BBC it was "aware of several victims in several different sectors across a number of global geographies". Carmakal said it appeared that governments and businesses that use SharePoint on their sites were the primary targets. A number of adversaries who stole material encoded by cryptography were then able to regain ongoing access to the victims' SharePoint data, he said.
"This was exploited in a very broad way, very opportunistically before a patch was made available. That's why this is significant," Carmakal said. He said the "China-nexus actor" was deploying techniques similar to previous campaigns associated with China.
Microsoft said Linen Typhoon had "focused on stealing intellectual property, primarily targeting organizations related to government, defence, strategic planning, and human rights" for 13 years. It added that Violet Typhoon had been "dedicated to espionage", primarily targeting former government and military staff, non-governmental organizations, think tanks, higher education, the media, the financial sector and the health sector in the US, Europe, and East Asia. Storm-2603 was "assessed with medium confidence to be a China-based threat actor".

Babydoll Archi: Indian woman's identity stolen for erotic AI content in deepfake deception

BBC News

an hour ago



It took just a few days for Indian Instagram sensation Babydoll Archi to double her following to 1.4 million, thanks to a couple of viral social media posts. One was a video that showed her in a red sari, dancing seductively to Dame Un Grr – a Romanian song. And a photo posted on the platform showed her posing with American adult film star Kendra Lust. Suddenly everyone wanted to know about her – and the name Babydoll Archi trended in Google search and spawned countless memes and fan pages. But there was one issue about to emerge – there was no real woman behind the online persona.
The Instagram account was fake, although the face it used had an uncanny likeness to a real woman – a homemaker in Dibrugarh city in Assam, whom we'll call Sanchi. The truth unravelled after her brother lodged a police complaint, and Pratim Bora, Sanchi's ex-boyfriend, was arrested. Police officer Sizal Agarwal, who is heading the investigation, told the BBC that Sanchi and Bora had a falling out and the AI likeness he created was to exact "pure revenge" on her. Bora – a mechanical engineer and a self-taught artificial intelligence (AI) enthusiast – used private photos of Sanchi to create a fake profile, Ms Agarwal said. He is in custody and has not made any statements yet. The BBC has reached out to his family and will update the article when they respond.
Babydoll Archi was created in 2020 and the first uploads were made in May 2021. The initial photos were her real pictures that had been morphed, Ms Agarwal said. "As time passed, Bora used tools such as ChatGPT and Dzine to create an AI version. He then populated the handle with deepfake photos and videos." The account started picking up likes from last year but it started gaining traction from April this year, she said.
The short two-paragraph complaint to the police submitted by Sanchi's family on the night of 11 July came with printouts of some photos and videos as evidence. Ms Agarwal says it did not name anyone because they had no idea who could be behind it. Babydoll Archi was not an unfamiliar name for the police. Ms Agarwal says they had also seen media reports and comments speculating that she was AI-generated, but there had been no suggestion that it was based on a real person.
Once they received the complaint, police wrote to Instagram asking for the details of the account's creator. "Once we received information from Instagram, we asked Sanchi if she knew any Pratim Bora. Once she confirmed, we traced his address in the neighbouring district of Tinsukia. We arrested him on the evening of 12 July." Ms Agarwal says the police have "seized his laptop, mobile phones and hard drives and his bank documents since he had monetised the account". "The account had 3,000 subscriptions on Linktree and we believe he had earned 1m rupees from it. We believe he made 300,000 rupees in just five days before his arrest," she added.
Ms Agarwal says Sanchi is "extremely distraught – but now she and her family are receiving counselling and they are doing better". There really is no way to prevent something like this from happening, "but had we acted earlier, we could have prevented it from gaining so much traction", Ms Agarwal said. "But Sanchi had no idea because she has no social media presence. Her family too had been blocked out from this account. They became aware only once it went viral," she added.
Instagram has not responded to the BBC's queries on the case, but it generally does not allow posting of nudity or sexual content. And last month, CBS reported that it removed a number of ads promoting AI tools used to create sexually explicit deepfakes using images of real people. The Instagram account of Babydoll Archi, which had 282 posts, is no longer available to the public – although social media is replete with her photos and videos, and one Instagram account seems to have all of them. The BBC has asked Meta what they are planning to do about it.
Ms Bal, an AI expert and lawyer, says what happened to Sanchi "is horrible but almost impossible to prevent". She can go to the court and seek the right to be forgotten, and a court can order the press reports that named her to be taken down, but it is hard to scrub all the traces from the internet. What happened to Sanchi, she says, is what has always been happening to women, where their photos and videos are circulated as revenge. "It's now a lot easier to do because of AI, but such incidents are still not as common as we expect – or they could be under-reported because of stigma, or people being targeted may not even know about it, as in the present case," Ms Bal says. And people watching it had no incentive to report it to the social media platform or cybercrime portal, she adds.
In their complaint against Mr Bora, police have invoked sections of law that deal with sexual harassment, distribution of obscene material, defamation, forgery to harm reputation, cheating by personification and cybercrime. If found guilty, Mr Bora could get up to 10 years in jail. The case, which has also led to outrage on social media in recent days, has seen some seeking tougher laws to deal with such crimes. Ms Bal believes there are enough laws to take care of such cases, but whether there is scope for new laws to deal with generative AI companies has to be looked at. "But we also have to remember that deepfakes are not necessarily bad and laws have to be carefully crafted because they can be weaponised to chill free speech."

‘I went vegan on the second day of filming': James Cromwell on making Babe, the talking pig classic

The Guardian

2 hours ago



Chris Noonan, the director, had been in a battle with producer George Miller, who wanted an all-Australian cast for Babe. Thankfully, a wonderful casting director believed I was right for farmer Hoggett and pushed for me to get a meeting. George had found the book that the film is based on – The Sheep-Pig by British author Dick King-Smith – while on a trip to Europe with his daughter.
I thought farmer Hoggett was from Yorkshire, but the studio said: 'No. Movies with accents don't make money.' Of course, Schindler's List won the best picture Oscar that year and it was filled with accents. They wanted me to keep my American accent so I thought I'd blow smoke up their ass and spent a whole day using this Texas shit-kicker accent. In the end I had to re-record all of those lines using the British accent I ultimately went with.
During my makeup test, George was standing nearby. As he walked past, he said: 'Lose the sideburns.' I don't know what got into me. I just said: 'No. I like them.' George went, 'Who the fuck is he?' and walked off. I was very pleased with myself.
We had an animatronic sheep in the middle of real sheep – which doesn't stick out. The crew used to bet on which one of the flock was fake. At the end of a take, you'd see the real sheep continue to look around and the animatronic one power down. You'd then hear a crew member say, 'I got it!'
On the second day of filming, I broke for lunch before everybody else. All the animals I'd worked with that morning were on the table, cut up, fricasseed, roasted and seared. That was when I decided to become a vegan.
The final scene, where the sheep follow Babe, was a miracle. The woman who worked with the sheep spent five months trying to get them to walk three abreast in rows and follow the pig around the circuit. She was working with them right until we shot. I said, 'Away to me, pig' and those sheep moved through the circuit without a pause. When the gate closed behind them, the crowd – 200 extras we'd gathered from the local town – went berserk.
I asked Chris how he wanted me to deliver my final line and he said: 'Right down the lens.' I didn't expect what happened: reflected back at me in the camera lens I saw not me, but my father. On that thought I laid the line: 'That'll do, pig. That'll do.' At the time I hadn't forgiven my father, who was a director and very critical of my work, which stung. I didn't know I had to forgive him. But at that moment, I looked at myself and saw I am my father's son and I love him. Without a doubt, it brought closure.
The only negative thing I ever heard about Babe was from a woman who said it ruined her relationship with her daughter. They used to enjoy Big Macs together and now her daughter wouldn't eat animals. I thought: 'If that's what you based your relationship on, it sucks anyway!'
What set Babe apart was that it featured realistic animals and not fantasy characters. The goal was to intercut puppets with real animals. To have a convincing animatronic Babe, we had to fit a prosthetic band around the puppet's neck every day and punch in the hairs one at a time with a needle. We'd start early in the morning. We couldn't afford more than one prosthetic head, so to go from a standing pig to a sitting pig, we had to take off the head, put it on to the new puppet body, and then punch in all the hairs again to make it a seamless blend. If anything went wrong, we had to start the entire process all over again. It was terrifying.
Up until that point, we had always used foam latex for puppets. It's wonderfully elastic but has no skin-like qualities. We had a chemist working with us who took on the challenge of making skin-like silicone for Babe. He added lots of oil and extra hardener. It set solid but remained flexible. When we'd lay it on paper, it would leach oil. It was a kind of mad chemistry. People who supply silicone skins to the industry use it to this day.
Babe's eyes were plastic spheres with a plunger inside that moved back and forth. They had a round silicone ball in front of them. We painted an iris on the ball and filled the spheres with clear silicone gel. By pushing the plunger, we could make the pupils bigger or smaller to create her big brown eyes.
Ferdinand the duck was a combination of fur and feathers. We had tried using only feathers but we couldn't lay them individually and make them move. When I watch the scenes with Ferdy and Babe in the shed, I struggle to know what's animatronic and what is real.
We shot in Australia and the heat was phenomenal. Silicone is a great insulator so it was like an oven inside animatronic Babe. After we rehearsed, we'd cover Babe in a foil blanket and keep our fingers crossed that nothing overheated.
Closeups of animatronic dogs were a degree beyond what we were capable of. The work everybody did was outstanding but there's just so much going on under that fur. We failed – they're impossible. In the end, all dog closeups were digitally enhanced.
James had moments of holding Babe in his lap, and there was never a feeling of ridicule. He engaged with the puppets. It was remarkably rewarding to see him reach that level of interaction with something so dependent on your contribution. We did as much as was humanly possible. I'm very proud to have been involved with it.
