Latest news with #Lenses


Time of India
2 hours ago
- Business
- Time of India
Snap to launch lightweight, immersive spectacles in 2026 as smartglass race heats up
Snap Inc., the parent company of the ephemeral messaging app Snapchat, announced on Tuesday that it will launch immersive smart glasses called Specs in 2026. This is in line with other Big Tech companies targeting the lucrative smart glasses market, each looking for the next breakthrough in tech hardware. Snap's Specs are wearable computers integrated into a lightweight pair of glasses featuring see-through lenses that augment the physical world. In 2024, the company released the fifth generation of its Spectacles specifically for developers, paving the way for the public launch of Specs in 2026.

'We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world, and we will publicly launch our new Specs next year,' said CEO and cofounder Evan Spiegel.

The company said its AR Lenses in the Snapchat camera are used eight billion times per day, and over 400,000 developers have built more than four million Lenses with Snap's AR tools. The company said it is working with Niantic to integrate its Visual Positioning System (VPS) into its Lens Studio development tools and smart glasses. Additionally, it will integrate WebXR support directly into the browser, allowing developers to build, test, and access augmented reality (AR) and virtual reality (VR) experiences.

New for Snap OS

Snap also announced major updates to Snap OS, including integration with OpenAI and Gemini on Google Cloud. The company now enables developers to build multimodal AI-powered Lenses and publish them for the Spectacles community. It also announced the Automated Speech Recognition API, which allows real-time transcription for over 40 languages, including non-native accents.
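The WebXR support mentioned above refers to the W3C WebXR Device API, the standard entry point browsers expose for AR and VR sessions. How Snap's browser will surface it isn't public, so the sketch below only shows the standard feature-detection step any WebXR page performs before offering an immersive experience; the helper name is my own.

```javascript
// Minimal sketch of WebXR feature detection, assuming only the standard
// WebXR Device API. The helper takes the API object (navigator.xr in a
// real browser) as a parameter so it is easy to test outside a browser.
async function supportsImmersiveAR(xr) {
  if (!xr) return false; // browser exposes no WebXR Device API at all
  try {
    // Standard WebXR call: resolves true if "immersive-ar" sessions
    // can be created in this browser/device combination.
    return await xr.isSessionSupported("immersive-ar");
  } catch {
    return false; // e.g. blocked by a permissions policy
  }
}

// In a real page you would pass the global entry point:
// supportsImmersiveAR(navigator.xr).then((ok) => { /* show AR button */ });
```

A page would typically run this check before rendering an "Enter AR" button, falling back to a flat 2D view when it resolves false.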
Big Tech eyes new market

Big Tech companies are racing to one-up each other in the smart glasses market, which they see as the next arena for human-computer interaction. Here are some of the latest developments in the space:
- Meta currently leads in smart glasses sales with its Ray-Ban Meta smart glasses, with over one million units sold in 2024. The glasses offer features like photo and video capture, music, calls, and voice commands powered by Meta AI.
- Google marked a return to this market after its initial failure to crack it with Google Glass. At Google I/O 2025, the tech giant unveiled Project Aura, a lightweight pair of immersive smart glasses powered by Android XR. It is an "optical see-through XR" device, overlaying virtual content onto the real world through transparent lenses, and Google's Gemini AI capabilities will be integrated into it.
- Apple is reportedly planning a 2026 launch for its smart glasses as well, as it faces pressure in the AI-enhanced gadgets segment. Apple is developing dedicated chips for the glasses and aims to start mass production next year.
Yahoo
6 days ago
- Yahoo
Snapchat Launches Standalone Lens Studio Mobile App
This story was originally published on Social Media Today. To receive daily news and insights, subscribe to our free daily Social Media Today newsletter.

As it continues to advance its AR projects, including AR-enabled Spectacles, Snapchat has launched a new standalone Lens Studio mobile app, which aims to make it easier for people to build their own AR experiences. Snap's Lens Studio app provides a simplified effects-creation flow, including templates and tools that will enable more people, even those without developer experience, to build custom in-app experiences.

As explained by Snap: 'We're excited to introduce the Lens Studio iOS app and web tool. These are experimental new tools that make it easier than ever to create, publish, and play with Snapchat Lenses made by you. Now, you can generate your own AI effects, add your dancing Bitmoji to the fun, and express yourself with Lenses that reflect your mood or an inside joke – whether you're on the go or near your computer.'

So it's less about professional AR creation, and more about democratizing the AR creation experience. Indeed, the App Store description outlines exactly that: 'Make your own Snapchat Lenses from your phone and in minutes. No code. No installs. Create AI effects, add your Bitmoji, design natural makeup looks, and more. Create something real and immediately share with your friends.'

It's another step towards enabling broader creative expression with advanced digital tools, a shift also accelerated by generative AI apps and functions that can generate code from plain-language text prompts. Meta's even using conversational AI prompts to generate VR worlds, and with the complexity gap closing, you can see how this will help usher in an era of expanded digital creativity.

Snap has seen big success with user-generated Lenses, with these outside contributions often leading to new engagement trends in the app.
Rather than limiting its creative output to its own internal team, enabling anyone to create their own AR experience has expanded Snap's opportunity, and more than 400,000 professional developers and teams now use Lens Studio for their creative visions. And now, Snap's setting its sights on non-technical folk as well, opening up even more opportunity for original AR experiences. It could be a fun add-on that drives more creation, and in turn more engagement, in the main Snapchat app. You can download the Lens Studio mobile app from the App Store.


TechCrunch
6 days ago
- Business
- TechCrunch
Snap launches Lens Studio iOS and web apps for creating AR Lenses with AI and simple tools
Snap has launched a standalone Lens Studio iOS app and web tool, the company announced on Wednesday. The new tools are designed to make it easier for anyone to create AR Lenses through text prompts and simple editing tools. With the Lens Studio app, users will be able to do things like generate their own AI effects, add their Bitmoji, and browse trending templates to create customized Lenses.

Up until now, Lens Studio has only been accessible via a desktop application for professional developers. While the desktop application will remain the primary tool for professionals, Snap says that the new iOS app and web tool are designed to allow people at all skill levels to create Lenses.

'These are experimental new tools that make it easier than ever to create, publish, and play with Snapchat Lenses made by you,' Snap wrote in a blog post. 'Now, you can generate your own AI effects, add your dancing Bitmoji to the fun, and express yourself with Lenses that reflect your mood or an inside joke – whether you're on the go or near your computer.'

While Snap currently has an ecosystem of over 400,000 professional AR developers, the company is looking to attract more people who are interested in creating Lenses with the launch of these simpler tools. Snap is considered a leader in AR thanks to its early adoption of the technology through its AR filters and Lenses, and it's clear the company is committed to investing in the space, even as others may be retreating from it. Last year, Meta angered creators when it decided to shut down its Spark AR platform, which allowed third parties to build augmented reality (AR) effects. By opening up access to AR creation, Snap is doubling down on its vision for the technology.

As Snap brings AR creation into the hands of more people, the company is also rolling out advanced tools for professionals. Yesterday, Snap released new Lens Studio tools that AR creators and developers can use to build Bitmoji games.
The tools include a turn-based system to enable back-and-forth gameplay, a new customizable Character Controller that supports different gameplay styles, and more.

Engadget
6 days ago
- Business
- Engadget
Snapchat now has a standalone app for making gen AI augmented reality effects
Snapchat has been experimenting with generative AI-powered augmented reality lenses in its app for the past couple of years. Now, the company is letting users make their own with a new standalone app for building AR effects. Snap is introducing a new version of its Lens Studio software that allows anyone to create AR lenses through text prompts and other simple editing tools, and publish them directly to Snapchat.

Up to now, Lens Studio has only been available as a desktop app meant for developers and AR professionals. And while the new iOS app and web version aren't nearly as powerful, they offer a wide range of face-altering and body-morphing effects thanks to generative AI. "These are experimental new tools that make it easier than ever to create, publish, and play with Snapchat Lenses made by you," the company explains in a blog post. "Now, you can generate your own AI effects, add your dancing Bitmoji to the fun, and express yourself with Lenses that reflect your mood or an inside joke – whether you're on the go or near your computer."

Snap gave me an early look at the Lens Studio iOS app, and I was pleasantly surprised by how much flexibility it offered. There are AI-powered tools for transforming your face, body, and background via detailed text prompts (the app also offers suggestions for the kinds of prompts that work well, like "detailed zombie head with big eyes and nose, lots of details"). There's a bit of a learning curve to figuring out what works well for each type of effect, and some of the generative AI prompts can take up to 20 minutes to render. But the app also offers dozens of templates that you can use as a starting point and remix with your own ideas. You can also make simpler face-altering filters that don't rely as heavily on AI but take advantage of popular Snapchat effects like face cutouts or Bitmoji animations. (A few examples of my creations both used AI to create a background I overlaid other effects onto.)
Snap already has hundreds of thousands of lens creators, some of whom have been making effects for the app for years. But I can easily see this new, simpler version of Lens Studio opening the door for many more. There could also be some upside for creators hoping to take advantage of Snapchat's monetization programs: the company confirmed that users who publish lenses from the new app will be eligible to participate in its Lens Creator Rewards program, which pays creators who make popular AR effects. A more accessible version of Lens Studio could also help Snap compete with Meta for AR talent. (Meta shut down Spark AR, its platform that allowed creators to make AR effects for Instagram, last year.)

In addition to Snapchat's in-app effects, the company is now on its second generation of standalone AR glasses. More recently, Snap has focused on big-name developers to make glasses-ready effects, but the company has previously leaned on Lens Creators to come up with interesting use cases for AR glasses. Those types of integrations will likely require much more than what's currently available in the new pared-down version of Lens Studio, but making AR creation more accessible (with the help of AI) raises some interesting possibilities for what might one day be possible for the company.

Jim Lanzone, the CEO of Engadget's parent company Yahoo, joined the board of directors at Snap on September 12, 2024. No one outside of Engadget's editorial team has any say in our coverage of the company.