'Weapons' tops North American box office for 2nd week
The Warner Bros. movie starring Julia Garner ("Ozark") and Josh Brolin ("Avengers: Infinity War") tells the story of the mysterious disappearance of a group of children from the same school class.
Analyst David A. Gross of Franchise Entertainment Research called it a "strong" week-two performance, especially in a quiet summer weekend at the movies in the United States and Canada.
Holding in second place was Disney's "Freakier Friday" starring Lindsay Lohan and Jamie Lee Curtis, the much-anticipated sequel to the 2003 body-swapping family film, at $14.5 million, Exhibitor Relations said.
Debuting in third place was Universal action sequel "Nobody 2," starring Bob Odenkirk of "Better Call Saul" fame, at $9.3 million.
"Critics like this story about a workaholic assassin trying to take a vacation with his family while getting caught up in trouble. Reviews and audience scores are both very good," Gross said.
"The Fantastic Four: First Steps," Disney's reboot of the Marvel Comics franchise, dropped to fourth place at $8.8 million.
Pedro Pascal, Vanessa Kirby, Joseph Quinn and Emmy winner Ebon Moss-Bachrach star as the titular team of superheroes, who must save a retro-futuristic world from the evil Galactus.
Universal's family-friendly animation sequel "The Bad Guys 2," about a squad of goofy animal criminals actually doing good in their rebranded lives, dropped to fifth, earning $7.5 million.
Rounding out the top 10 were:
"Superman" ($5.3 million)
"The Naked Gun" ($4.8 million)
"Jurassic World: Rebirth" ($2.9 million)
"F1: The Movie" ($2.7 million)
"Coolie" ($2.4 million)
Related Articles


Washington Post
Making cash off 'AI slop': The surreal video business taking over the web
Luis Talavera, a 31-year-old loan officer in eastern Idaho, first went viral in June with an AI-generated video on TikTok in which a fake but lifelike old man talked about soiling himself. Within two weeks, he had used AI to pump out 91 more, mostly showing fake street interviews and jokes about fat people to an audience that has surged past 180,000 followers, some of whom comment to ask if the scenes are real.
Yahoo
'Of course': Bob Odenkirk confirms he'd be on board for more Better Call Saul
Bob Odenkirk would "of course" be on board for a new series of Better Call Saul.

The 62-year-old actor first played crooked lawyer Saul Goodman in Breaking Bad and then reprised the role in his own spin-off series, which traced the character's earlier life as Jimmy McGill and ran from 2015 to 2022. He admitted that if showrunners Vince Gilligan and Peter Gould had a new idea for his character, who was jailed at the end of Better Call Saul, he would be on board, though he thinks it is unlikely.

He said: "Vince Gilligan and Peter Gould created that show. There are some of the best writers who ever worked in TV. So if they were to think of something in that world, of course I would do it. But I don't think so. I think they've all moved on to some more amazing projects that you'll soon see."

Of a potential plotline, he noted: "He's not getting out [of prison]. If there's another Saul show, it takes place inside prison."

The Nobody actor will always be grateful for the opportunity to have played Saul. He said: "That part turned my life around, and I've given more to that part than anything I've done."

Creator Vince Gilligan has been working on the new Apple TV+ sci-fi drama Pluribus, which stars Bob's Better Call Saul co-star Rhea Seehorn, and the actor teased that the programme, which will be released in November, is "going to be a great one". Bob said: "Look forward to the best written show on TV for years to come."

The veteran actor previously admitted he is "fine with moving on" from playing both Saul and Hutch, his assassin character from the Nobody movies, because they are both tough roles to play. He told The Hollywood Reporter: "They're guys who, for different reasons, have pretty big chips on their shoulders, and that's hard to play after a while. You can't just carry that guy around all the time."

Bob suffered a heart attack during filming of the final season of Better Call Saul, and he recently recalled how lucky he was that co-stars Ray Campbell and Patrick Fabian were nearby, as he likely would have died had the incident happened in his trailer. He told Conan O'Brien Needs a Friend: "It was during Covid shooting, so we were separate from the crew. And luckily, I didn't go to my trailer. If I'd gone to my trailer, I wouldn't be here, because they don't bother you (in the trailer)."

But his co-stars' screams for help were initially mistaken for laughter because of the social distancing provisions in place on the set. He said: "It took a few seconds to realise people were screaming."

Bob was grateful to the show's medical officer Rosa Estrada, who "immediately" began CPR after learning that no defibrillator would be available for 15 minutes.


WIRED
Teachers Are Trying to Make AI Work for Them
Since the start of the AI boom, teachers have been tasked with figuring out if LLMs are helpful tools or a cheat code. This is how they're bringing AI to their curricula.

One day last spring, in a high school classroom in Texas, students were arguing about who to kill off first. It was a thought experiment with a sci-fi premise: A global zombie outbreak has decimated major cities. One hundred frozen embryos meant to reboot humanity are safe in a bomb shelter, but the intended adult caretakers never made it. Instead, 12 random civilians stumbled in. There's only enough food and oxygen for seven. The students had to decide who would die and who would live to raise the future of the human race.

It wasn't an easy choice. There was Amina, a 26-year-old actress, and Bubak, her husband. Also, a nurse named Marisa, a farmer named Roy, and others. Bubak, who had a criminal record, was a hard sell. So were the useless-yet-likable extras.

For years, English teacher Cody Chamberlain had let students debate the ethics and logistics of saving humanity on their own—until he decided to throw AI into the mix. Chamberlain fed the scenario to ChatGPT. It killed Bubak and saved his wife—not because she was useful in other ways but because she could bear children.

'That's so cold,' the students gasped. It was. But for Chamberlain, it offered something new: a dispassionate, algorithmic judgment his students could think about critically. 'ChatGPT said we needed her, like Handmaid's Tale-style,' he says. 'And the kids were like, "That's ridiculous." It was weird for ChatGPT to finally not have an answer key but something the kids could push back on.'

Teachers have long used technology to personalize lessons, manage workloads, or liven up slideshows. But something shifted after ChatGPT's public launch in 2022. Suddenly, teachers weren't just being tasked with figuring out how to incorporate iPads or interactive whiteboards into their lessons. They had to decipher how to deal with a technology that was already crash-landing into their students' lives, one that could help them study or help them cheat. A quarter of teachers surveyed by Pew in the fall of 2023 said they thought AI did more harm than good in education; 32 percent thought the tech was a mix of good and bad. Educators faced a choice: Try to fight off AI, or find a way to work with it.

This fall, AI will be more embedded in US classrooms than ever. Teachers are deploying large language models to write quizzes, adapt texts to reading levels, generate feedback, and design differentiated instruction. Some districts have issued guidance. Others have thrown up their hands. In the absence of clear policy, teachers are setting the boundaries themselves—one prompt at a time.

'It's just too easy and too alluring,' says Jeff Johnson, an English teacher in California who instructs other teachers on AI incorporation in his district. 'This is going to change everything. But we have to decide what that actually means.'

Teaching has long relied on unpaid labor—nights spent googling, planning, adjusting for special education or multilingual learners. For Johnson, AI can provide the kind of assistance that can curb burnout. He uses Brisk to generate short quizzes, Magic School to streamline lesson planning, and Diffit to create worksheets tailored to different skill levels. He doesn't use AI to grade papers or answer student questions. He uses them to prep faster. 'That alone saves me days and weeks,' Johnson says. 'Time that can be better spent interacting with students.'

Jennifer Goodnow, who teaches English as a second language in New York, feels similarly. She now plugs complex readings, like essays or book excerpts, into ChatGPT and asks it to create separate versions for advanced and beginner students, with corresponding depth-of-knowledge questions.

Amanda Bickerstaff, a former teacher and CEO of AI for Education, an organization that offers training and resources to help educators integrate AI into their classrooms, puts it bluntly: 'Teachers are incorporating AI because they've always needed better planning tools. Now they finally have them.'

The same goes for students with individualized education plans, commonly called IEPs—especially those with reading or processing disabilities. If a student struggles with comprehending text, for instance, a teacher might use generative AI to simplify sentence structures, highlight key vocabulary, or break down dense passages into more digestible chunks. Some tools can even reformat materials to include visuals or audio, helping students access the same content in a different way.

Chamberlain, Johnson, and Goodnow all teach language arts, subjects where AI can offer benefits—and setbacks—in the classroom. Math teachers, though, tend to be more skeptical. 'Large language models are really bad at computation,' Bickerstaff says. Her team explicitly advises against using tools like ChatGPT to teach math. Instead, some teachers use AI for adjacent tasks—generating slides, reinforcing math vocabulary, or walking students through steps without solving problems outright.

But there's something else teachers can use AI for: staying ahead of AI. Nearly three years after ChatGPT became available to the public, teachers can no longer ignore that their kids use it. Johnson recalls one student who was asked to analyze the song 'America' from West Side Story only to turn in a thesis on Simon & Garfunkel's song of the same name. 'I was like, "Dude, did you even read the response?"' he says.

Rather than ban the tools, many teachers are designing around them. Johnson has students draft essays step-by-step in a Google Doc with version history enabled, which allows him to track students' writing progress as it appears on the page. Chamberlain requires students to submit their planning documents alongside final work. Goodnow is toying with the idea of having students plug AI-generated essays into assignments and then critique the results.

'Three years ago, I would've thrown the book at them,' Chamberlain says. 'Now it's more like, "Show me your process. Where were you an agent in this?"'

Even so, detecting AI use remains a game of vibes. Plagiarism checkers are notoriously unreliable. Districts have been reluctant to draw hard lines, in part because the tools are moving faster than the rules. But if there's one thing almost everyone agrees on, it's this: Students need AI literacy, and they're not getting it.

'We need to create courses for high school students on AI use, and I don't know that anybody knows the answer to this,' Goodnow says. 'Some sort of ongoing dialog between students and teachers on how to ethically, question mark, use these tools.'

Organizations like AI for Education aim to provide that literacy. Founded in 2023, it works with school districts across the US to create AI guidance and training. But even in the most proactive schools, the focus is still on tool use—not critical understanding. Students know how to generate answers. They don't know how to tell whether those answers are inaccurate, biased, or made up.

Johnson has begun building lessons around AI hallucinations—like asking ChatGPT how many R's are in the word 'strawberry.' (Spoiler: It often gets it wrong.) 'They need to see that you can't always trust it,' he says.

As the tools improve, they're also reaching younger students, raising new concerns about how kids interact with LLMs. Bickerstaff warns that younger children, still learning to distinguish fact from fiction, may be especially vulnerable to over-trusting generative tools. That trust, she says, could have real consequences for their development and sense of reality. Already, some students are using AI not just to complete tasks but to think through them—blurring the line between tool and tutor.

Across the board, educators say this fall feels like a turning point. Districts are rolling out new products, students are getting savvier, and teachers are racing to set the norms before the tech sets them itself. 'If we know we're preparing students for the future workforce—and we're hearing from leaders across many different companies that AI is going to be super important—then we need to start now,' Bickerstaff says.

That's what teachers like Johnson and Goodnow are doing, one prompt, one student, one weird apocalypse scenario at a time.
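The "strawberry" exercise Johnson describes is easy to reproduce in a classroom with API access. Below is a minimal sketch in Python, assuming the official OpenAI client library and a placeholder model name (both are assumptions, not details from the article); it simply compares the model's answer against a plain string count so students can see when the two disagree.

# Minimal sketch of the "count the R's" hallucination check described above.
# Assumes the OpenAI Python client (pip install openai) and an illustrative
# model name; any chat-capable LLM endpoint could be swapped in.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

word = "strawberry"
ground_truth = word.count("r")  # 3, counted directly from the string

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name, not specified in the article
    messages=[{
        "role": "user",
        "content": f"How many times does the letter 'r' appear in the word '{word}'? Reply with a number only.",
    }],
)
model_answer = response.choices[0].message.content.strip()

print(f"Model says: {model_answer}")
print(f"String count says: {ground_truth}")
# A mismatch is the teachable moment: a confident answer is not the same
# thing as a correct one.

If the model happens to answer correctly, rerunning the prompt or swapping in another word makes the same point; the goal is to show that the output has to be checked, not trusted.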