
Latest news with #ARglasses

Snap will start selling AR glasses next year

The Verge · 3 days ago · Business

Snap plans to start selling its first pair of augmented reality glasses to the public in 2026. The coming release is part of CEO Evan Spiegel's decade-plus bet on what comes after the smartphone. He teased it Tuesday onstage at the Augmented World Expo, an augmented and virtual reality developer conference in Long Beach, California.

'Ever since we launched the developer Spectacles nine months ago, folks have been asking, "Hey, when's the public release coming?"' Spiegel tells me ahead of his keynote. Announcing that they're coming next year gives developers ample time to 'think about their timeline for building and polishing the experiences they have,' he says. 'And obviously, that's really important.'

While he's reluctant to offer more details about the hardware, people who have seen prototypes of next year's glasses tell me they're noticeably thinner and lighter than last year's version, which was only available to developers who applied to rent them. They also boast a wider field of view, allowing virtual graphics to fill more of the lenses. Spiegel won't tell me how much the glasses will cost, though he does let slip that they'll be priced less than Apple's $3,499 Vision Pro. (I expect them to cost much less than that, but still considerably more than the roughly $300 Meta Ray-Ban glasses.) According to Spiegel, next year's glasses will just be called Specs, rather than Spectacles, since 'that's what everybody calls them already.'

The market for AR glasses is nascent now but shaping up to become crowded. Meta is planning to announce a pair of glasses with a heads-up display later this year. Google just rebooted its smart glasses program with a Gemini-powered version of Android it's putting in frames from Warby Parker and other eyewear companies. Apple is also still working on AR glasses.

Spiegel says Snap has spent $3 billion to date on Spectacles, a small amount compared to the efforts of much larger players like Meta. Still, Spiegel thinks he has an edge: there are 400,000 developers already building AR effects, or lenses, for Snapchat, which is closing in on 1 billion monthly users. He's betting on these lenses — which have become increasingly complex with multiplayer and AI features — as what differentiates Specs from other AR glasses.

Snap has partnered with OpenAI and Google to let developers use their models to generate lenses. You'll also be able to talk to the models in Specs via MyAI, Snap's chatbot that's already available in Snapchat. Developers can use AI to understand what someone is seeing without footage from the glasses being stored on a server, which Spiegel is positioning as a win for privacy. Meanwhile, Snap's 'spatial intelligence' system uses the models to understand what you're seeing through the cameras and do things like coach you on how to play pool. Snap is also partnering with Niantic Spatial, the recent spinoff of the developer behind Pokémon Go, to 'build a next generation AI map that will serve as the foundation for AR glasses and AI agents to understand, navigate and interact with the real world,' according to Niantic Spatial spokesperson Jonny Thaw.

Meta has seen early success with its Ray-Ban smart glasses, but the market for AR glasses with displays is unproven. Spiegel knows the bar is high. He has to prove he has a device that people will buy, even as he admits that AR glasses won't fully replace what phones can do anytime soon. Still, he thinks the market opportunity is bigger than that for smart glasses without displays.
'We've seen people experiment with the smart glasses category,' he says. 'It has just been hard to imagine a world where that becomes 10 times better than the phone. Unless you can deliver a product experience that's 10 times better than the phone, ultimately, the TAM [total addressable market] is just limited, right? Do people want more from their computers? I think the answer is yes.'

He says that the developer version of Spectacles is 'already 10 times better at computing in the real world together with your friends,' and that he thinks AR glasses are 'going to be 10 times better at AI' because it will 'evolve to become spatially aware.'

Snap Says Its 'Lightweight' AR Glasses Are Probably, Definitely, For Sure, Arriving Next Year

Gizmodo · 3 days ago

AR glasses are all the rage, and Snap doesn't want to be left out of the party. Between Meta and its Ray-Ban glasses, Xreal and its partnership with Google, and a rumored Samsung entrant that could arrive any day now, the smart glasses field is hot right now. No matter how crowded the field is getting, though, there's still one thing no one has offered yet: augmented reality glasses with screens, all in a lightweight form factor similar to regular glasses. According to Snap, however, that AR unicorn is on its way.

At Augmented World Expo (AWE) 2025, an annual expo for all things AR/XR, Snap said its next-gen Spectacles, which are both 'lightweight' and 'immersive,' will be launching next year. The details are still scant on pretty much everything, but based on its description of next year's Specs, the company behind Snapchat seems to think it's cracked the code.

'We believe the time is right for a revolution in computing that naturally integrates our digital experiences with the physical world, and we can't wait to publicly launch our new Specs next year,' said Snap CEO Evan Spiegel in a statement. 'We couldn't be more excited about the extraordinary progress in artificial intelligence and augmented reality that is enabling new, human-centered computing experiences. We believe Specs are the most advanced personal computer in the world, and we can't wait for you to see for yourself.'

There are some other equally vague hints at what we can expect out of next year's Specs, including mentions of AI, to no one's surprise, and applications for gaming, streaming, web browsing, and *sigh* work. That tells us a little bit about what we can expect, but also kind of nothing. One thing is for sure, though: Snap and Spiegel have been working toward making the definitive, groundbreaking pair of AR glasses for quite a while now. In 2019, Spiegel said smart glasses were still 10 years from mass adoption, and if my math is correct, it's 2025 and not 2029, so these upcoming Specs may just be a precursor still. Snap definitely has its work cut out for it before it can call its Specs 'lightweight.' Current versions look more like a Halloween costume than something you'd want to wear to the park.

Whatever Snap ends up releasing, it's going to have some competitors. Google just recently unveiled a partnership with AR glasses company Xreal that combines Google's Android XR operating system with Xreal's hardware. That's still in development at the moment, but it's clear that Google is actively taking strides toward throwing its hat into the AR glasses world in earnest. Then there's Meta. While the Ray-Ban smart glasses are just a glimmer of what smart glasses can be, Meta reportedly has plans to inch toward a pair of Ray-Bans that feels more futuristic, including the use of an actual display inside the lens for navigation, notifications, and more. That's not even counting Meta's Project Orion glasses, which combine elements of VR headsets, like real compute power, with the smaller form factor of glasses. Neither of those is a consumer product, but a lot can happen in a few years.

For now, Snap will take a similar route to what we've seen already on the smart glasses front. It announced at AWE that Snap OS, the operating system that powers its Specs, will integrate both Google's Gemini AI and OpenAI's ChatGPT to offer 'multimodal AI-powered' experiences.
That should look similar to what we've seen with Meta's AI integration, which lets you use on-device cameras to take in your surroundings and get feedback. For example, you might look at a pair of shoes, say, 'WHAT ARE THOSE?' and have your glasses (hopefully) tell you what you're looking at. I hope for everyone's sake that whatever the integration is, it works better than Meta AI, an AI experience that I've found to be finicky at best.

Either way, in Spiegel's own words, Snap is still a player in the AR glasses game, and next year may be the year we find out how much of a competitor it really is. As someone who's pretty excited about the AR glasses space, I'm hoping that Snap actually delivers on its vision before it gets swallowed by the competition.

The 5 coolest things I tried at Google I/O 2025

Android Authority · 21 May 2025

Google I/O is the company's biggest event of the year. Even the launch of the Pixel 10 series later in 2025 won't compare in size and scope to I/O. This week, I was lucky enough to attend I/O on the steps of Google's Mountain View headquarters, and it was a total blast. Unfortunately, people who watch the keynote livestream at home don't get to experience what happens after Sundar Pichai exits the stage. That's when I/O gets incredibly fun, because you can wander around the grounds and actually try out many of the upcoming products and services Googlers just gushed about during the main event. Here, I want to tell you about the five coolest things I was lucky enough to try! It's not the same as being there, but it will have to do for now.

Android XR prototype glasses

At Google I/O 2024, we first saw hints of a new set of AR glasses. Based around something called Project Astra, the glasses seemed like the modern evolution of Google Glass, the doomed smart glasses the company tried to launch over 10 years ago. All we saw during the 2024 demo, though, was a pre-recorded first-person-view clip of a prototype of these glasses doing cool things around an office, such as identifying objects, remembering where an object once was, and extrapolating complex ideas from simple drawings. Later that year, we found out that these glasses run on something called Android XR, and we actually got to see the wearable prototype in promotional videos supplied by Google. This week, Google not only showed off the glasses in real-world on-stage demos, but even gave I/O attendees the chance to try them on. I was able to demo them, and I gotta say: I'm impressed.

First, nearly everything Google showed on stage this year worked for me during my demo. That's a rarity! I was able to look at objects and converse with Gemini about them, both seeing its responses in text form on the tiny display and hearing the responses blasted into my ears from the speakers in the glasses' stems. I was also able to see turn-by-turn navigation instructions and smartphone notifications. I could even take photos with a quick tap on the right stem (although I'm sure they are not great, considering how tiny the camera sensor is).

Although the glasses support a live translation feature, I didn't get to try that out. This is likely due to translation not working quite as expected during the keynote. But hey, they're prototypes — that's just how it goes sometimes.

The one disappointing thing was that the imagery only appeared in my right eye. If I closed my right eye, the imagery vanished, meaning these glasses don't project onto both eyes simultaneously. A Google rep explained to me that this is by design: Android XR can support devices with two screens, one screen, or even no screens, and it just so happened that these prototype glasses were single-screen units. In other words, models that actually hit retail might have dual-screen support or might not, so keep that in mind for the future.
Regardless, the glasses worked well, felt good on my face, and have the pedigree of Google, Samsung, and Qualcomm behind them (Google developing software, Samsung developing hardware, and Qualcomm providing silicon). Honestly, it was so exciting to use them and immediately see that these are not Glass 2.0 but a fully realized wearable product. Hopefully, we'll learn more about when these glasses (or those offered by XREAL, Warby Parker, and Gentle Monster) will actually launch, how much they'll cost, and in what areas they'll be available. I just hope we won't need to wait until Google I/O 2026 for that information.

Project Moohan

The prototype AR glasses weren't the only Android XR wearables available at Google I/O. Although we've known about Project Moohan for a while now, very few people have actually been able to test out Samsung's premium VR headset. Now, I am in that small group of folks.

The first thing I noticed about Project Moohan when I put it on my head was how premium and polished the headset is. This is not some first-run demo with a lot of rough edges to smooth out. If I didn't know better, I would have assumed this headset was retail-ready — it's that good. If it hit store shelves tomorrow, I would buy it. The headset fit well and had a good weight balance, so the front didn't feel heavier than the back. The battery pack not being in the headset itself played a big part in this, and having a cable running down my torso and a battery pack in my pocket was less annoying than I thought it would be. Most importantly, I felt I could wear this headset for hours and still be comfortable, which you cannot say about every headset out there.

Once I had Project Moohan on, it was a stunning experience. The displays inside automatically adjusted themselves for my pupillary distance, which was very convenient. And the visual fidelity was incredible: I had a full-color, low-latency view of the real world, with none of the blurriness I've experienced with other VR systems. It was some of the best display fidelity I've ever experienced with similar headsets.

It was also exceptionally easy to control the headset using my hands. With Project Moohan, no controllers are needed. You can control everything using palm gestures, finger pinches, and hand swipes. It was super intuitive, and I found myself comfortable with the operating system in mere minutes.

Of course, Gemini is the star here. There's a button on the top right of Project Moohan that launches Gemini from any place within Android XR. Once launched, you can give commands, ask questions, or just have a relaxed conversation. Gemini understands what's on your screen, too, so you can chat with it about whatever it is you're doing. A Google rep told me how they use this for gaming: if they come across a difficult part of a game, they ask Gemini for help, and it will pull up tutorials or guides without ever needing to leave the game.

Speaking of gaming, Project Moohan supports all manner of controllers. You'll be able to use traditional VR controllers or even something simpler like an Xbox controller. I wasn't able to try this out during my short demo, but Google assured me that Project Moohan will support most gaming controllers, making me very excited about this becoming a true gaming powerhouse.
The fatal flaw here, though, is the same one we have with the Android XR glasses: we have no idea when Project Moohan is actually coming out. We don't even know its true commercial name! Google and Samsung say it is coming this year, but I'm skeptical considering how long it's been since we first saw the project announced and how little headway we've made since (the United States' tariff situation doesn't help, either). Still, whenever these do land, I will be first in line to get them.

AI Mode in Chrome and Search Live

Moving away from hardware, AI Mode was another star of Google I/O 2025. Think of it as Google Search on AI steroids. Instead of typing in one query and getting back a list of results, you can give much more complex queries, with prompts as long as you need them to be, and get a unique page of results based on multiple searches that provides an easy-to-understand overview of whatever you're trying to investigate.

For example, I used it to hunt for a new smartphone. I typed in a pretty long query about how the phone needed to run Android (obviously), cost less than $800, have a good camera, and be available in the United States. Normally, a simple Google search wouldn't work for this, but AI Mode got right down to it. It returned a page filled with good suggestions for phones, including the Pixel 9a, the Galaxy S25, and the OnePlus 13R — all incredibly solid choices. It even included buy links, YouTube reviews, articles, and more.

The cool thing about AI Mode is that you don't need to wait to try it for yourself. If you live in the US and have Search Labs enabled, you should already have access to AI Mode (if you don't, you'll have it soon).

One thing you can't try out in AI Mode yet, though, is Search Live. This was announced during the keynote and is coming later this summer. Essentially, Search Live allows you to share with Gemini what's going on in the real world around you through your smartphone camera. If this sounds familiar, that's because Gemini Live on Android already supports this. With Search Live, though, it will work on any mobile device through the Google app, allowing iPhone users to get in on the fun, too.

I tried out an early version of Search Live, and it worked just as well as Gemini Live. It will be great for this to be available to everyone everywhere, as it is a very useful tool. However, Google is in dangerous 'feature creep' territory now, so hopefully it doesn't let things get too confusing for consumers about where they need to go to get this service.

Flow

Of everything I saw at Google I/O this year, Flow was the one that left me the most conflicted. Obviously, I think it's super cool (otherwise it wouldn't be on this list), but I also think it's kind of frightening. Flow is a new filmmaking tool that allows you to easily generate video clips in succession and then edit those clips on a traditional timeline. For example, this could allow you to create an entire film scene by scene using nothing but text prompts.
When you generate a clip, you can tweak it with additional prompts or even extend it to get more out of your original creation. What's more, Flow will also incorporate Veo 3, Google's latest iteration of its video generation system. Veo 3 allows you to create moving images along with music, sound effects, and even spoken dialogue. This makes Flow a tool that could allow you to create a full film out of thin air. Flow could be a terrific new filmmaking tool, or the death of film as we know it.

Using Flow during my demo was so easy. I created some clips of a bugdroid having spaghetti with a cat, and it came out hilarious and cute. I was able to edit the clip, add more clips to it, and extend clips with a few mouse clicks and some more text prompts. Flow was easy to use and understand, and it worked well enough, but I couldn't help but wonder why it needs to exist at all.

I didn't get to try out Veo 3 during my demo, unfortunately. This wasn't because of a limitation of the system but of time: it takes up to 10 minutes for Veo 3 to create clips, and even making just two clips would push my demo time beyond what's reasonable for an event the size of Google I/O.

When I exited the Flow demo, I couldn't help but think about Google's The Wizard of Oz remake for Sphere in Las Vegas. Obviously, Flow isn't going to have people recreating classic films using AI, but it does have the same problematic air to it. Flow left me feeling elated at how cool it is and dismayed by how unnecessary it is.

Aloha robotics

Everything I've talked about here so far is something you will actually be able to use one day. This last one, though, likely won't be something you'll have in your home any time soon. At Google I/O, I got to control two robot arms using only voice commands to Gemini. The two arms towered over various objects on a table. By speaking into a microphone, I could tell Gemini to pick up an object, move objects, place objects into other objects, and so on, using nothing but natural language. It's not every day I get to tell robots what to do!

The robot arms weren't perfect. If I had them pick up an object and put it into a container, they would do it — but then they wouldn't stop. The arms would continue to pick up more objects and try to dump them into the container. Also, the arms couldn't do multiple tasks in one command. For example, I couldn't have them put an object into a container and then pick up the container and dump out the object. That's two actions, and it would require two commands. Still, this is the first time in my life that I've been able to control robots using nothing but my voice. That is basically the very definition of 'cool.'

Those were the coolest things I tried at I/O this year. Which one was your favorite? Let me know in the comments!

I Waited One Hour to Try Google's Android XR Smart Glasses and Only Had 90 Seconds With Them

Gizmodo · 20 May 2025

I was promised five minutes with Google's AR glasses prototype, but only got 90 seconds to use them. Well, that didn't go well.

After enduring two hours of nonstop Gemini AI announcement after announcement at Google I/O, I waited one hour in the press lounge for a chance to try out either a pair of Android XR smart glasses or Samsung's Project Moohan mixed reality headset. Obviously, I went for the Android XR smart glasses to see how they compare to Meta's $10,000 Orion concept and Google Glass before it. Is Android XR the holy grail of smart glasses we've been waiting over a decade for? Unfortunately, Google only let me try the glasses out for 90 seconds. I was promised five minutes with the prototype and only had three minutes total, half of which a product rep spent explaining how the smart glasses worked.

Ninety seconds in, I was told to tap on the right side of the glasses to invoke Gemini. The AI's star-shaped icon appeared in the right lens (this pair of Android XR glasses only had one tiny transparent display) slightly below its center point. I was instructed to just talk to Gemini. I turned around, looked at a painting hung on the wall, and asked what I was looking at, who painted it, and what the art style was. Gemini answered confidently; I have no idea if the answers were correct. I looked over at a bookshelf and asked Gemini to tell me the names of the books — and it did. Then the rep used a phone that was paired to the glasses to load up Google Maps. He told me to look down at my feet, and I saw a small portion of a map; I looked back up, and Gemini pulled up turn-by-turn navigation. Then the door of the 10 x 10-foot wooden box I was in slid open, and I was told I was done.

The whole thing was incredibly rushed, and honestly, I barely got a sense of how well Gemini worked. The AI constantly spoke over the rep while he was explaining the Android XR demo to me; I'm not sure if it was a false activation or a bug or what. When I asked about the painting and books, I didn't need to keep tapping on the side of the glasses — Gemini kept listening and just switched gears. That part was neat.

Next to Meta's Orion smart glasses — which are also a prototype concept at this stage — the Android XR glasses don't even compare. You can see more and do more with Orion through its silicon carbide waveguide lenses. Orion runs multiple app windows like Instagram and Facebook Messenger, and even has 'holographic' games like a Pong knockoff that you can play against another person wearing their own pair of the AR glasses. Versus Snapchat's latest AR 'Spectacles' and their super narrow field of view, though, I'd say the Android XR prototype and its singular display might actually be better. If you're going to have less powerful hardware, lean into its strengths.

As for the smart glasses themselves, they felt like any pair of thick sunglasses, and they felt relatively light. They did slide off my nose a bit, but that's only because I have a flat and wider Asian nose. They didn't appear to slip off the nose of my friend and Engadget arch-nemesis, Karissa Bell. (I'm just kidding; I love Karissa.) There was no way for me to check battery life in 90 seconds.

So that's my first impression of the first pair of Android XR smart glasses. It's not a lot, but it's also not nothing. Part of me wonders why the hell Google chose to limit demo time like this. My spidey sense tells me the glasses may not be as far along as they appeared in the I/O keynote demo.
What I saw feels like a better version of Google Glass, for sure, with the screen resembling a super-tiny heads-up display located in the center of the right lens instead of above the right eye, as it was on Glass. But with only 90 seconds, there's no way for me to form a firm opinion. I need to see a lot more, and what I saw was not even a sliver of a sliver. Google, you got my number — call me!
