Latest news with #smartGlasses


The Verge
6 days ago
- Business
Here's what's inside Meta's experimental new smart glasses
Meta has revealed more information about Aria Gen 2, its experimental smart glasses designed to serve as a test platform for research on augmented reality, AI, and robotics. The glasses pack several improvements into their lightweight frame that could one day translate into consumer products, including an improved eye-tracking system that can track gaze per eye, detect blinks, and estimate the center of the pupils. 'These advanced signals enable a deeper understanding of the wearer's visual attention and intentions, unlocking new possibilities for human-computer interaction,' Meta writes.

Meta initially announced Aria Gen 2 in February, saying the glasses will 'pave the way for future innovations that will shape the next computing platform.' They build on Meta's first iteration of the glasses from 2020, which was similarly available to researchers only.

Along with the improved eye-tracking system, Aria Gen 2 comes with four computer vision cameras that Meta says enable 3D hand and object tracking. Meta says researchers can use this information for highly precise tasks like 'dexterous robot hand manipulation.' The glasses also have a photoplethysmography sensor built into the nosepad, which lets the device estimate a wearer's heart rate, along with a contact microphone that Meta says provides better audio in loud environments. There's a new ambient light sensor as well, allowing the glasses to differentiate between indoor and outdoor lighting.

The Aria Gen 2 glasses include folding arms for the first time, weigh around 75 grams, and come in eight different sizes. Meta plans to open applications for researchers to work with Aria Gen 2 later this year. The initiative builds on the successful development of Meta's Ray-Ban smart glasses, a form factor it aims to expand with its Orion augmented-reality glasses, a rumored partnership with Oakley, and a high-end pair of 'Hypernova' glasses with a built-in screen.


Bloomberg
20-05-2025
- Business
Xreal and Google Debut Aura AR Glasses to Rival Meta's Orion Plan
Alphabet Inc.'s Google has entered the glasses race by partnering with Xreal Inc. on the first spectacles to run an augmented-reality version of its operating system. At the Google I/O conference on Tuesday, the search giant and Chinese smart glasses maker showed developers what they call Project Aura, the first eyewear that will run Android XR. The new operating system was announced in December with the goal of mimicking the Android experience on a range of devices. That includes enclosed mixed-reality headsets; glasses with augmented reality, or AR; and spectacles with cameras but no ability to view content. That month, Google and Samsung Electronics Co. teased a headset known as Project Moohan.


Gizmodo
20-05-2025
Google Taps Xreal for 'Optical See-Through' Smart Glasses That Could Beat Meta's Orion AR Glasses
We told you Google would be gunning for Meta's Ray-Ban smart glasses with its Android XR 'spatial computing' platform, which was announced last year. At its I/O developer conference today, the tech giant said it's teaming up with Xreal for a pair of next-gen augmented reality smart glasses called Project Aura. Details are light right now, and Xreal told Gizmodo it won't be demoing Project Aura at the show, but we do know a few things that should get your gears turning on where things are headed.

According to Xreal, Project Aura is an 'optical see-through (OST) device' that's 'lightweight and tethered, cinematic, and Gemini-AI-powered.' It has a 'large field-of-view experience.' How wide? Xreal hasn't said. The company's current top-of-the-line One Pro smart glasses have a 57-degree FOV, and each new iteration has brought a wider viewing experience: the One has a 50-degree FOV and the Air 2 Pro has a 46-degree FOV. A wider FOV means looking at the built-in virtual displays feels less like peering through a floating window or binoculars, where only the center is clear.

The one image we have to go on shows a design not too dissimilar to the One Pro, but with three cameras: one on each side and one in the nose bridge. There's also a tiny red button near the right ear. Xreal says Project Aura is tethered; the glasses will likely connect to phones, tablets, and laptops just like the company's current smart glasses. We also know the smart glasses will be powered by a Qualcomm chip of some kind. That's an interesting nugget, because Xreal had previously boasted about the X1, the company's first custom-designed chipset, in the One and One Pro. With the X1 chip, Xreal told me last year, it could build smart glasses that are more compact and power-efficient than the competition's.
By using a Qualcomm chip, Google seems to be recycling its Android and Wear OS wearables playbook: letting many different companies design smart glasses that run Android XR while leaving the chipset to its longtime silicon partner.

From the sound of things, the 'optical see-through' nature of Project Aura suggests the smart glasses will offer an experience more akin to what you'd expect from a pair of AR frames. App windows might float in front of your vision and stay 'locked' in place even while you move your head or body, or perhaps the windows will simply follow you wherever you go. Without more information, it's hard to know what using Project Aura is like. But my gut tells me it's something similar to Meta's Orion concept AR glasses, which blew me away when I demoed them last year. The reality check with Orion is that they're nowhere near ready for consumer release (each prototype pair reportedly costs around $10,000 to make), and Meta is allegedly planning to downgrade many of its capabilities, such as its silicon carbide waveguide lenses, to bring the price down significantly.

We won't have long to wait to learn more about Project Aura. Xreal says it plans to share more next month at Augmented World Expo, as well as later in the year. Who knows: if you're reading this on your phone, maybe in the not-too-distant future you'll be reading Gizmodo articles through smart glasses instead. More likely, though, you'll still need to plug a pair into your phone.


CNET
08-05-2025
- Business
Meta Reportedly Eyeing 'Super Sensing' Tech for Smart Glasses
Meta is reportedly developing what it calls 'super sensing' facial recognition technology for its smart glasses lineup. A new report from The Information says Meta is building software for the glasses that can recognize people by name and better keep track of what users are doing throughout the day. The company originally planned to include similar technology in its first wave of smart glasses but abandoned the effort over privacy concerns. Now, however, it appears to be back on the table.

Meta is reportedly working on two new pairs of smart glasses, internally known as Aperol and Bellini, and is also re-evaluating its privacy policies and the potential safety risks associated with the technology, the report said. The feature would reportedly be opt-in only. It's part of Meta's larger strategy to expand its smart glasses lineup and integrate AI more deeply into its products, and it follows news that rivals like Google, now developing its first smart glasses since Google Glass, are boosting their push into the category. Meta is also said to be integrating the same 'sensing' recognition capability into AI-powered earphones, which reportedly include embedded cameras and sensors. A representative for Meta did not immediately respond to a request for comment.