We tried on Google's prototype AI smart glasses


The Verge · 20 May 2025

Here in sunny Mountain View, California, I am sequestered in a teeny tiny box. Outside, there's a long line of tech journalists, and we are all here for one thing: to try out Project Moohan and Google's Android XR smart glasses prototypes. (The Project Mariner booth is maybe 10 feet away and remarkably empty.)
While nothing was going to steal AI's spotlight at this year's keynote — 95 mentions! — Android XR has been generating a lot of buzz on the ground. But the demos we got to see here were notably shorter, with more guardrails than what I got to see back in December. Probably because, unlike a few months ago, there are cameras everywhere and these are 'risky' demos.
First up is Project Moohan. Not much has changed since I first slipped on the headset. It's still an Android-flavored Apple Vision Pro, albeit much lighter and more comfortable to wear. Like Oculus headsets, there's a dial in the back that lets you adjust the fit. If you press the top button, it brings up Gemini. You can ask Gemini to do things, because that is what AI assistants are here for. Specifically, I ask it to take me to my old college stomping grounds in Tokyo in Google Maps without having to open the Google Maps app. Natural language and context, baby.
But that's a demo I've gotten before. The 'new' thing Google has to show me today is spatialized video. As in, you can now get 3D depth in a regular old video you've filmed without any special equipment. (Never mind that the example video I'm shown is most certainly filmed by someone with an eye for enhancing dramatic perspectives.)
Because of the clamoring crowd outside, I'm then given a quick run-through of Google's prototype Android XR glasses. Emphasis on prototype. They're simple; it's actually hard to spot the camera in the frame and the discreet display in the right lens. When I slip them on, I can see a tiny translucent screen showing the time and weather. If I press the temple, it brings up — you guessed it — Gemini. I'm prompted to ask Gemini to identify one of two paintings in front of me. At first, it fails because I'm too far away. (Remember, these demos are risky.) I ask it to compare the two paintings, and it tells me some obvious conclusions. The one on the right uses brighter colors, and the one on the left is more muted and subdued.
On a nearby shelf, there are a few travel guidebooks. I tell Gemini a lie — that I'm not an outdoorsy type, so which book would be the best for planning a trip to Japan? It picks one. I'm then prompted to take a photo with the glasses. I do, and a little preview pops up on the display. Now that's something the Ray-Ban Meta smart glasses can't do — and arguably, one of the Meta glasses' biggest weaknesses for the content creators that make up a huge chunk of its audience. The addition of the display lets you frame your images. It's less likely that you'll tilt your head for an accidental Dutch angle or have the perfect shot ruined by your ill-fated late-night decision to get curtain bangs.
These are the safest demos Google can do. Though I don't have video or photo evidence, the things I saw behind closed doors in December were a more convincing example of why someone might want this tech. There were prototypes with not one but two built-in displays, so you could have a more expansive view. I got to try the live AI translation. The whole 'Gemini can identify things in your surroundings and remember things for you' demo felt personalized, proactive, powerful, and pretty dang creepy. But those demos ran within tightly controlled guardrails — and at this point in Google's story of smart glasses redemption, it can't afford a throng of tech journalists all saying, 'Hey, this stuff? It doesn't work.'
Meta is the name that Google hasn't said aloud with Android XR, but you can feel its presence loom here at the Shoreline. You can see it in the way Google announced stylish eyewear brands like Gentle Monster and Warby Parker as partners in the consumer glasses that will launch… sometime, later. This is Google's answer to Meta's partnership with EssilorLuxottica and Ray-Ban. You can also see it in the way Google is positioning AI as the killer app for headsets and smart glasses. Meta, for its part, has been preaching the same for months — and why shouldn't it? It's already sold 2 million units of the Ray-Ban Meta glasses.
The problem is that even with video, even with photos this time, it is so freakin' hard to convey why Silicon Valley is so gung-ho on smart glasses. I've said it time and time again: you have to see it to believe it. Renders and video capture don't cut it. And even if, in the limited time we have, we could frame the camera just so and give you a glimpse into what I see when I'm wearing these things, it just wouldn't be the same.


Related Articles

Which USB Port Is This? Microsoft Vows To End The Lottery

Forbes

19 minutes ago



Consumers are currently left confused by USB-C ports. Is that a USB 3.2 Gen 2×2 port on your PC, or a USB 40Gbps one? Nobody without a degree in computer science knows, which is why Microsoft is vowing to end the 'which USB port is this?' confusion.

The USB-C connector was meant to make computing life simpler, with one reversible connection that could handle charging, displays, peripherals and data transfer. Instead, it's turned into a horror show, with various standards leaving consumers confused as to what the ports on their computer are actually capable of.

Microsoft claims that its Windows diagnostics data shows that just over a quarter of users have been shown a Windows error message when plugging in a USB-C device, only to find the port doesn't support the feature they wanted: plugging a monitor into a USB-C port that doesn't support display output, for example. 'Not all USB-C ports are created equal,' the company writes in a blog post. 'You can't tell which ones deliver the full experience just by looking at them.'

Finally, Microsoft plans to do something about this. The company plans to 'establish a minimum bar for USB-C port capabilities on PCs.' This will be part of the Windows Hardware Compatibility Program, 'turning optional features into mandatory ones, and ensure a consistent level of performance you can count on,' Microsoft claims. That means that when a USB-C port appears on a PC in future, it will be guaranteed to meet a minimum set of criteria.

There's still potential for some confusion, with different USB-C ports operating at different speeds. Currently, for example, USB4 is available in both 40Gbps and 80Gbps speed variants via a USB-C connector, although only those seeking the highest performance from external storage would ever really notice the difference in the real world.

Perhaps the biggest downside of Microsoft's plan is that it's taken so long to get here, and the benefits are unlikely to be felt for many years yet.
Obviously, the standard only applies to new PCs, not devices that are already in homes and businesses, or in supply chains around the world. That means it's likely to be several years before you can plug a device into a Windows laptop and be confident that it will meet the above criteria – by which time the PC industry might have moved on to another, different type of connector altogether. (Hopefully not. The industry does appear to have coalesced around USB-C and it's in nobody's interests to create another standard. But it's not out of the question, either.) So, brace yourself for a few more years of the 'which USB port is this?' confusion, until Microsoft's new certification scheme has become the industry standard.

Now Playing could get this feature that it should have had from the start (APK teardown)

Android Authority

23 minutes ago



Ryan Haines / Android Authority

TL;DR
  • An Android Authority teardown has revealed that Google is working on an easy way to manually trigger Now Playing.
  • Now Playing runs in the background and uses on-device machine learning to automatically identify music.
  • The company is working on a Now Playing Quick Settings tile that would let users manually identify a track.

Google has offered Now Playing functionality on Pixel phones for years now, using on-device machine learning to passively identify music playing around you. However, it looks like Google could soon let you manually activate the feature.

You're reading an Authority Insights story on Android Authority. Discover Authority Insights for more exclusive reports, app teardowns, leaks, and in-depth tech coverage you won't find anywhere else.

An APK teardown helps predict features that may arrive on a service in the future based on work-in-progress code. However, it is possible that such predicted features may not make it to a public release.

Google currently offers Now Playing as a passive feature on Pixel phones, automatically displaying the currently playing track on your lock screen and in the Now Playing history page in the settings menu. There's no proper way to manually identify a track, though. One workaround is to use the song search functionality in the Google app, which can be accessed in several ways (e.g. via a Quick Settings tile or after tapping the microphone icon in the app). Unlike Now Playing, though, this solution requires an internet connection. Furthermore, tracks identified by the Google app aren't added to your device's Now Playing history, although they are stored in your Google account.

Fortunately, an Android Authority teardown of the Android System Intelligence suite has revealed that Google is working on a Now Playing tile in the Quick Settings panel. We partially enabled this feature, and you can view it in the screenshots below.
As the screenshots show, tapping the Now Playing tile allows you to manually identify a track while also producing a 'searching for song…' notification. We weren't able to get this feature working properly, but we're guessing the track info will eventually be displayed in a notification too.

In any event, this would be a long-overdue addition to Now Playing on Pixel phones. We previously reported that many Pixel owners were having trouble getting Now Playing to work, so manually invoking the feature would be a welcome alternative to automatic detection. Our fingers are crossed that this Now Playing Quick Settings tile is implemented sooner rather than later.

Got a tip? Talk to us! Email our staff at news@ . You can stay anonymous or get credit for the info, it's your choice.
