Snap introduces next-gen 'Specs' AR Glasses, Snap OS platform: What's new


Snap, the parent company of Snapchat, has announced its next-generation consumer-focused augmented reality (AR) glasses at the Augmented World Expo (AWE) 2025. Named Specs, the new wearable device is scheduled for launch in 2026 and marks Snap's most ambitious attempt yet at integrating AR into everyday life. Alongside the hardware reveal, Snap also introduced major upgrades to its AR operating system, Snap OS.
The new Specs aim to understand users' surroundings, enable shared AR experiences such as multiplayer games, and support tasks like browsing, streaming, and productivity—all through a sleek, self-contained design.
Snap Specs: Details
Unlike previous Snap Spectacles—which were available only to developers—the upcoming Specs will be publicly released and are designed as an 'ultra-powerful wearable computer' with see-through lenses that overlay digital content onto the real world.
Snap described the new Specs as a device built to 'seamlessly integrate digital content into everyday life,' positioning the glasses as part of a broader shift in computing where physical and digital environments converge. The company said it believes 'the time is right for a revolution in computing.'
Snap OS: What's new
Snap is also rolling out key updates to Snap OS, the platform powering its AR glasses. These upgrades are designed to support multimodal AI, spatial awareness, and real-time content generation. Highlights include:
Deep integration with OpenAI and Gemini (Google Cloud): Developers can now create multimodal, AI-powered Lenses and publish them for the Specs user base.
Depth Module API: Allows Snap OS to anchor AR visuals in 3D space using translated 2D data from language models—enhancing spatial intelligence.
Automated Speech Recognition API: Supports real-time transcription in over 40 languages, including non-native accents, with high accuracy.
Snap3D API: Enables on-the-fly generation of 3D objects directly within Lenses.
Tools for developers
Snap is also introducing fleet management tools and features focused on location-based and guided experiences, designed for venues such as museums, parks, and public exhibitions:
Fleet Management App: Allows institutions or developers to monitor and manage multiple Specs units remotely.
Guided Mode: Lets developers pre-configure Specs to launch directly into a multiplayer or single-player Lens for instant interaction.
Guided Navigation: Designed for AR-based tours, this feature provides turn-by-turn guidance through points of interest like landmarks or exhibits.


Related Articles

Google revives Snapseed on iPhone with major update and new editing tools

Hindustan Times

20 hours ago



Google has rolled out a major update to Snapseed, its photo editing app for iOS devices. The new version 3.0 brings a redesigned interface for both iPhone and iPad users. This update introduces a grid view displaying all edited images, making it easier to browse through past work. Navigation now relies on three distinct tabs: Looks, Faves, and Tools. The Faves tab is new and allows users to save frequently used editing tools for quick access.

Snapseed 3.0: Redesigned interface and new features

Snapseed offers over 25 editing tools and filters, including recently added film-style filters. Google also updated the app's icon to a simpler design. Snapseed has been part of Google since 2012, but it has seen little development over recent years. The last significant update came in 2021, followed by minor changes in 2023 and 2024. Because the app processes images locally on the device and does not depend on cloud services, Google appeared to have deprioritised its development. The sudden release of version 3.0 signals renewed attention to the app.

The updated interface focuses on ease of use. Users begin editing by tapping a circular plus button at the bottom of the screen. The new tab system separates editing functions clearly: Looks provides preset styles, Faves stores user-selected tools, and Tools offers the full range of editing features. The export option has moved to the top-right corner for easier access.

Editing tools include options to adjust image details, correct tonality and white balance, and apply effects like lens blur and vignette. Retouch features allow selective editing, brushing, healing, cropping, and perspective changes. The Style tab includes film filters along with options such as black and white, HDR, and drama effects. Creative tools cover double exposure, frames, and text additions.

In addition to the interface overhaul, Snapseed now features a simplified app icon and a 'More to come, stay tuned' message, which indicates further developments may follow. However, Google has not confirmed whether the 3.0 update will be available on Android.

Neurotech and brain data: New frontier of privacy concerns

Hindustan Times

a day ago



Consumer neurotechnology is no longer confined to sci-fi or academic labs. Thanks to AI advancements and shrinking chip sizes, devices that read brain activity, such as EEG headsets, mood-tracking earbuds, and brain-controlled gaming accessories, are entering the mainstream. Since 2011, over 130 startups have jumped into the consumer neurotech space. These tools, often embedded in wearables, promise productivity boosts, mental health insights, and immersive control over AR/VR environments. Tech giants like Apple and Snap are already exploring brain-computer interfaces (BCIs) for future headsets that could respond to mental states in real time.

How neurotech works, and why it's risky

EEG-based devices dominate this landscape, powering nearly 65% of consumer neurotech products. They track brainwave patterns linked to emotions, focus, and engagement levels. That may sound harmless until you realise this data could be mined to predict behaviours, preferences, or even political leanings. Imagine hyper-targeted ads based not on clicks, but on neural spikes. Or worse, cognitive surveillance, where employers or governments monitor attention levels, emotional stress, or signs of dissent. Cyberattacks targeting BCIs could introduce 'mental hacks', altering thought patterns or inducing confusion and distress. As one expert puts it, 'Brain data reveals thoughts before they're consciously expressed.'

Regulatory gaps and urgent challenges

The legal protections around all this? Alarmingly thin. While medical neurotech, such as MRIs or brain implants, is regulated, consumer-grade EEG headsets fall into a grey zone. In the U.S., the FDA only monitors medical devices. State laws in places like California and Colorado require user consent for neural data use, but there is little enforcement. Internationally, concerns are mounting: China has tested neurotech in workplaces to track employee fatigue, while neuromarketing firms tap EEG feedback to fine-tune advertisements. 'Neural data could be weaponized for psychological warfare or blackmail.'

Path forward

So what now? We need clear federal laws that define how brain data can be collected, stored, and shared. Users should know exactly what's being tracked and who has access to it. Neural data must be encrypted, just like financial or medical records. Most importantly, the public must be made aware of what 'brain transparency' really means. Because the future of privacy may no longer be in your hands, but in your head.

First Published Date: 12 Jun, 21:23 IST

Snap to launch smart glasses for users in 2026 in challenge to Meta

Time of India

2 days ago



Highlights

  • Snap Inc. will launch its first-ever smart glasses for consumers, called Specs, next year, intensifying competition with Meta Platforms, Inc. in the wearable technology market.
  • Snap Inc. has invested over $3 billion in developing augmented reality glasses over the past 11 years, highlighting its commitment to integrating technology into wearable products.
  • Snap Inc. plans to collaborate with Niantic, Inc. to enhance the Lens Studio application, which enables creators to design and publish augmented reality lenses for Snapchat.

Snap will launch its first-ever smart glasses for all consumers next year, ratcheting up competition with bigger rival Meta in the wearable technology market. The augmented reality smart glasses, called Specs, will be lightweight, the social media company said on Tuesday. Long known for its messaging app Snapchat and animated filters, Snap has been doubling down on AR, which can overlay digital effects onto photos or videos of real-life surroundings through a camera or lens. Integrating technology into wearable products can open up new lucrative markets and diversify revenue streams for Snap amid an uncertain digital ad market due to changing U.S. trade policies.

The company had launched its 5th generation of Spectacles glasses in September, but these were only available to developers. The company has invested more than $3 billion over 11 years developing its augmented reality glasses, Snap co-founder and CEO Evan Spiegel said at the Augmented World Expo 2025 on Tuesday. "Before Snapchat had chat, we were building glasses."

The popularity of Meta's Ray-Ban Meta smart glasses, developed in partnership with EssilorLuxottica, has prompted companies such as Google to explore similar investments. Meta continues to add AI features to its glasses to attract more consumers.
Snap said it would partner with augmented reality and geospatial technology platform Niantic Spatial to enhance Lens Studio, an application used by creators to design, animate, and publish AR lenses for the Snapchat camera and Specs.
