
YouTube launches new Shopping product stickers for Shorts
YouTube is updating the Shopping experience for Shorts, the company announced on Wednesday. The platform is launching new Shopping product stickers to make it easy for viewers to know that the products being used in the Short are shoppable.
The new stickers could be a way for creators to increase their earnings. During its testing last month, YouTube found that Shorts with Shopping product stickers saw over 40% more clicks on products than Shorts with the Shopping button, it said.
Previously, tagged products were displayed in a banner in the bottom-left corner of Shorts with a Shopping button. Clicking the banner opened a list of products available for viewers to browse and shop.
Now, creators can tag products in their Short to automatically generate a sticker based on the first product they tagged. They can then adjust the size and placement of the sticker.
When multiple products are tagged in a Short, viewers can tap on the downward arrow on the sticker to see the full list of products. Once they click on a product they're interested in, they'll be redirected to the retailer's website for that product.
Shopping product stickers are rolling out over the next week and will be available globally, except in South Korea, where YouTube plans to launch the feature soon.
YouTube CEO Neal Mohan revealed at Cannes Lions 2025 that Shorts are now averaging more than 200 billion daily views. Mohan also shared that Google's Veo 3 video generator, which can generate both video and audio, will be coming to Shorts later this summer.
Related Articles

Business Insider
Meta's Scale AI deal has clients like Google halting projects, contractors scrambling, and one investor bailing out
Google, xAI, and other tech companies aren't wasting any time distancing themselves from Scale AI. Just hours after Meta announced a massive $14.3 billion investment in the data labeling startup, Google halted multiple projects with the company, according to internal correspondence obtained by Business Insider. Google had already been winding down its work with Scale AI, two contractors working on its projects told BI. OpenAI and Elon Musk's xAI have paused some Scale AI projects, too, Scale AI contractors working on them told BI. A smaller Scale AI investor told BI they are selling their remaining stake in the startup because they don't think Meta's investment can make up for the loss of Big Tech clients.
These developments are the latest evidence that Scale AI could face challenges to its business, sparked by Meta's blockbuster deal to buy 49% of the startup and hire its founder Alexandr Wang. Scale AI helps Big Tech clients train and improve their latest AI models. An investment from a major AI competitor like Meta throws that relationship into doubt. BI spoke to 10 Scale AI contractors, who said several projects from Meta's AI competitors have been put on ice. They asked to remain anonymous due to the sensitivity of the matter.
"As a whole, the nature of this business naturally fluctuates and dips do not imply a change in the relationship," a Scale AI spokesperson said. In a blog post, Scale AI's interim CEO Jason Droege said Scale remains "unequivocally" independent and has "multiple promising lines of business." In another note, Scale said it won't provide Meta with confidential information about its customers. Google declined to comment. An OpenAI spokesperson confirmed the company was winding down its work with Scale AI. xAI and Meta didn't respond to requests for comment from BI. Reuters reported last week that Google is planning to cut ties with Scale AI amid concerns about Meta accessing information about the search giant's latest AI developments.
Google work 'may have dried up'
Less than 24 hours after Scale AI announced its investment deal with Meta on June 12, contractors working on multiple Google projects, with codenames like "Genesis" and "Beetle Crown," received emails seen by BI notifying them that their work was being "paused." One US-based contractor told BI they'd spent the last few months working on a Google project designed to help Gemini answer tough biology prompts. The contractor was earning $50 an hour until their work was suddenly stopped the same day the Meta deal was announced, despite having been assured weeks before that the project would be ongoing.
"The Beetle Crown group chat just suddenly disappeared, and people were wondering what happened. It wasn't communicated well, and that only adds to the sense of like, 'I have no project now. I'm totally lost,'" the contractor told BI.
Google had previously "paused" and not resumed a separate large-scale AI project with Scale in the spring of this year, leaving workers scrambling as Google work became increasingly hard to find, two contractors told BI. Scale AI told BI the spring pause was "not a notable event from our perspective" and that this kind of closure happens regularly. Reuters previously reported that Google was Scale AI's largest customer last year, spending $150 million on its services alone, almost 20% of Scale AI's revenue.
Contractors who spoke with BI said work with Scale AI is now hard to come by, with dashboards showing far fewer available projects, or in some cases none at all, compared to before the Meta deal. "Work has been extremely scarce for most of us, and now it may have dried up almost completely," another Scale AI contractor who had been working on Google-related projects told BI. A different contractor, whose dashboard showed several paused or inactive projects on Wednesday, said he lost access to all those projects overnight.
An internal dashboard of Scale's generative AI projects and clients from April, reviewed by BI, showed Scale AI was running at least 38 active projects for Google at the time, more than a third of Scale AI's 107 generative AI projects on that list. The dashboard also listed active projects for other major clients, including Apple, xAI, Meta, Microsoft, and Amazon. Amazon, Microsoft, and Apple didn't respond to requests for comment from BI.
xAI and OpenAI also paused some Scale AI projects
As BI previously reported, Scale AI's work for xAI includes a major project called Xylophone, which involves training xAI's chatbot to improve its conversations on a wide range of topics, from the zombie apocalypse to life on Mars. Three contractors working on several Xylophone projects told BI their dashboards showed several of them on pause since the Meta investment. One said they had spent most of their time working on Xylophone. "Long story short, I am now in a state where I have been told that there are no projects for my specialty or location," they said. "The fact that there's nothing else to work on right now just sucks," another contractor said.
Another contractor working on an OpenAI project said their work was also paused last week. The team was told they would no longer be working on the project, but was not given a specific reason. "You can put two and two together," the contractor told BI. Confirming that OpenAI's work with Scale AI was winding down, OpenAI's spokesperson said Scale AI accounted for a small fraction of its overall data work and that it had increased its need for expertise that went beyond what Scale AI could offer.
One investor is getting out of Scale AI
Scale AI's announcement promised that its business with Meta would "substantially expand" following the investment. It's unclear whether that expanded relationship can offset the revenue hit from losing major clients like Google. One CEO at a rival firm told BI they've seen a spike in inbound requests from former Scale clients since the Meta deal closed. The investor selling off their remaining stake in Scale AI said they didn't understand how Scale AI could be valued at $29 billion by Meta now that top customers, especially Google, are leaving.


The Verge
Ancestra actually says a lot about the current state of AI-generated videos
After watching writer-director Eliza McNitt's new short film Ancestra, I can see why a number of Hollywood studios are interested in generative AI. A number of the shots were made and refined solely with prompts, in collaboration with Google's DeepMind team. It's obvious what Darren Aronofsky's AI-focused Primordial Soup production house and Google stand to gain from the normalization of this kind of creative workflow. But when you sit down to listen to McNitt and Aronofsky talk about how the short came together, it is hard not to think about generative AI's potential to usher in a new era of 'content' that feels like it was cooked up in a lab, and to put scores of filmmakers out of work in the process.
Inspired by the story of McNitt's own complicated birth, Ancestra zooms in on the life of an expectant mother (Audrey Corsa) as she prays for her soon-to-be-born baby's heart defect to miraculously heal. Though the short features a number of real actors performing on practical sets, Google's Gemini, Imagen, and Veo models were used to develop Ancestra's shots of what's racing through the mother's mind and the tiny, dangerous hole inside the baby's heart. Inside the mother's womb, we're shown Blonde-esque close-ups of the baby, whose heartbeat gradually becomes part of the film's soundtrack. And the woman's ruminations on what it means to be a mother are visualized as a series of very short clips of other women with children, volcanic explosions, and stars being born after the Big Bang, all of which have a very stock-footage-by-way-of-gen-AI feel to them.
It's all very sentimental, but the message being conveyed about the power of a mother's love is cliched, particularly when it's juxtaposed with what is essentially a montage of computer-generated nature footage. Visually, Ancestra feels like a project that is trying to prove that all of the AI slop videos flooding the internet are actually something to be excited about. The film is so lacking in fascinating narrative substance, though, that it feels like a rather weak argument in favor of Hollywood's rush to get to the slop trough while it's hot.
As McNitt smash cuts to quick shots of different kinds of animals nurturing their young and close-ups of holes being filled in by microscopic organisms, you can tell that those visuals account for a large chunk of the film's AI underpinnings. They each feel like another example of text-to-video models' ability to churn out uncanny-looking, decontextualized footage that would be difficult to incorporate into a fully produced film. But in the behind-the-scenes making-of video that Google shared in its announcement last week, McNitt speaks at length about how, when faced with the difficult prospect of having to cast a real baby, it made much more sense to her to create a fake one with Google's models.
'There's just nothing like a human performance and the kind of emotion that an actor can evoke,' McNitt explains. 'But when I wrote that there would be a newborn baby, I did not know the solution of how we would [shoot] that because you can't get a baby to act.' Filmmaking with infants poses all kinds of production challenges that simply aren't an issue with CGI babies and doll props. But going the gen AI route also presented McNitt with the opportunity to make her film even more personal by using old photos of herself as a newborn as the basis for the fake baby's face.
With a bit of fine-tuning, Ancestra's production team was able to combine shots of Corsa and the fake baby to create scenes in which they almost, but not quite, appear to be interacting as if both were real actors. If you look closely in wider shots, you can see that the mother's hand seems to be hovering just above her child because the baby isn't really there. But the scene moves by so quickly that it doesn't immediately stand out, and it's far less 'AI-looking' than the film's more fantastical shots meant to represent the hole in the baby's heart being healed by the mother's will.
Though McNitt notes how 'hundreds of people' were involved in the process of creating Ancestra, one of the behind-the-scenes video's biggest takeaways is how relatively small the project's production team was compared to what you might see on a more traditional short film telling the same story. Hiring more artists to conceptualize and then craft Ancestra's visuals would have undoubtedly made the film more expensive and time-consuming to finish. Especially for indie filmmakers and up-and-coming creatives who don't have unlimited resources at their disposal, those are the sorts of challenges that can be exceedingly difficult to overcome.
But Ancestra also feels like a case study in how generative AI stands to eliminate jobs that once would have gone to people. The argument is often that AI is a tool, and that jobs will shift rather than be replaced. Yet it's hard to imagine studio executives genuinely believing in a future where today's VFX specialists, concept artists, and storyboarders have transitioned into jobs as prompt writers who are compensated well enough to sustain their livelihoods. This was a huge part of what drove Hollywood's film and TV actors and writers to strike in 2023. It's also why video game performers have been on strike for the better part of the past year, and it feels irresponsible to dismiss these concerns as people simply being afraid of innovation or resistant to change.
In the making-of video, Aronofsky points out that cutting-edge technology has always played an integral role in the filmmaking business. You would be hard-pressed today to find a modern film or series that wasn't produced with the use of powerful digital tools that didn't exist a few decades ago. There are things about Ancestra's use of generative AI that definitely make it seem like a demonstration of how Google's models could, theoretically and with enough high-quality training data, become sophisticated enough to create footage that people would actually want to watch in a theater. But the way Aronofsky goes stony-faced and responds 'not good' when one of Google's DeepMind researchers explains that Veo can only generate eight-second-long clips says a lot about where generative AI is right now and about Ancestra as a creative endeavor.
It feels like McNitt is telling on herself a bit when she talks about how the generative models' output influenced the way she wrote Ancestra. She says 'both things really informed each other,' but that sounds like a very positive way of spinning the fact that Veo's technical limitations required her to write dialogue that could be matched to a series of clips vaguely tied to the concepts of motherhood and childbirth. This all makes it seem like McNitt's core authorial intent, at times, had to be deprioritized in favor of working with whatever the AI models spat out.
Had it been the other way around, Ancestra might have wound up telling a much more interesting story. As it stands, there's very little about Ancestra's narrative or, to be honest, its visuals that is so groundbreaking that it feels like an example of why Hollywood should be rushing to embrace this technology wholesale. Films produced with more generative AI might be cheaper and faster to make, but the technology as it exists now doesn't really seem capable of producing art that would put butts in movie theaters or push people to sign up for another streaming service. And it's important to bear in mind that, at the end of the day, Ancestra is really just an ad meant to drum up hype for Google, which is something none of us should be rushing to do.


Forbes
A Promising Update On ‘Mindhunter' Season 3 On Netflix, Of All Things
Six years on, I did not imagine I would be talking about the potential return of Mindhunter on Netflix, an idea that fans have long given up hope on. But here we are, and Mindhunter star Holt McCallany just gave an interview where he talks about how the show might return and the status of director David Fincher's involvement. The interview is with CBR, promoting McCallany's other new show, The Waterfront, and he addressed the season 3 issue:
"So look, you know, I had a meeting with David Fincher in his office a few months ago, and he said to me that there is a chance that it may come back as three two-hour movies, but I think it's just a chance. I know there are writers that are working, but you know, David has to be happy with scripts."
"And I felt very fortunate and privileged to have gotten to do that show at all. I would love it if it were to return. I think, like I said, you know, he gave me a little bit of hope when I had that meeting with him, but the sun, the moon, and the stars would all have to align. The good news is that we're at Netflix with The Waterfront, and those movies would also be for Netflix. So I think that in terms of dates and logistics, it could all be worked out, but it has to do, you know, with David really having the time and the inclination and being happy, you know, with the material. And, you know, that's a big question mark."
It has been beyond frustrating that Mindhunter, arguably one of the best shows Netflix has ever made, disappeared after its second season, which in no way needed to be its end. There have been glimpses over the years of what happened, with conflicting reports that Fincher wasn't interested and that he would have done it but Netflix said it cost too much. Given that the streamer now spends billions on content a year, that no longer seems like a valid excuse.
The idea that this would be three two-hour movies instead of a show is, well, I mean, that's pretty much a TV show now, just six hour-long episodes paired together. Yes, back when Mindhunter last aired in 2019, its seasons ran 10 and nine episodes, respectively, but that's not the norm for crime series these days, and I think fans would be willing to make the compromise so long as the show came back in some capacity.
The main key here is Fincher (co-star Jonathan Groff is busy, but it could be possible to wrangle him). Fincher has been busy non-stop since Mindhunter ended six years ago, and has worked on other projects for Netflix like his movies Mank and The Killer. Now, he's doing an extremely high-profile Once Upon a Time in Hollywood sequel, taking over for director Quentin Tarantino. This still seems like something of a moonshot, but it's more information than we've had previously. Hopefully, we learn more soon.