Latest news with #OrionAR

Business Standard
2 days ago
- Science
- Business Standard
Meta builds gesture-based wristband for hands-free device control: Report
Meta is developing a gesture-controlled wristband that lets users operate computers and digital devices with subtle hand movements. According to a post on the Meta blog, the device uses muscle signals to control actions like moving a cursor, launching apps, and sending messages, all without touching a screen or keyboard. The wristband is designed to function across a wide range of physical abilities. Users can 'write' in the air or on surfaces such as a table or their leg, mimicking pencil-like motion, to compose messages or interact with menus.

Surface electromyography (sEMG): How it works
The core of the technology is surface electromyography (sEMG), a non-invasive method that detects electrical signals generated by muscle activity. These signals, captured by sensors embedded in the wristband, can be interpreted even before a physical gesture is made. Meta is collaborating with Carnegie Mellon University to test the wristband's usability for individuals with spinal cord injuries. According to Douglas Weber, professor at Carnegie Mellon's Department of Mechanical Engineering and the Neuroscience Institute, even people with complete hand paralysis exhibit some residual muscle activity, which the device can detect and interpret.

Integration with AR and accessibility
Meta has developed an sEMG wristband prototype that integrates with its Orion AR glasses. This enables users to perform actions like typing, navigating menus, and sending messages hands-free, using only neuromotor signals at the wrist. The wristband supports gesture recognition such as tapping, swiping, and pinching even when a user's hand is resting. It also features handwriting recognition, allowing users to write messages by mimicking pen movements on any surface.
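To make the sEMG idea concrete, here is a purely illustrative sketch of the kind of pipeline such a system might use: windowed multi-channel muscle signals are reduced to simple amplitude features, then matched against per-gesture calibration profiles. Everything here (the feature choice, the gesture labels, the centroid values) is a hypothetical toy, not Meta's actual algorithm.

```python
import math

def rms_features(window):
    """Per-channel root-mean-square amplitude for one window of
    multi-channel sEMG samples (list of channels, each a list of samples)."""
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def nearest_centroid(features, centroids):
    """Return the gesture label whose calibration centroid is closest
    (Euclidean distance) to the observed feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(features, centroids[label]))

# Toy calibration profiles: average feature vector per gesture (made-up values).
centroids = {
    "pinch": [0.8, 0.1],    # strong activity on channel 0
    "swipe": [0.1, 0.9],    # strong activity on channel 1
    "rest":  [0.05, 0.05],  # little activity on either channel
}

window = [[0.7, 0.9, 0.8], [0.1, 0.0, 0.2]]  # 2 channels, 3 samples each
print(nearest_centroid(rms_features(window), centroids))  # → pinch
```

A real system would use far richer features and a learned model, but this captures why sEMG can respond even to faint residual muscle activity: classification needs only a distinguishable signal pattern, not a visible movement.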

Engadget
14-07-2025
- Engadget
TikTok owner ByteDance is reportedly building its own mixed reality goggles
ByteDance, the parent company of TikTok, is reportedly working on mixed reality goggles, The Information reports. The in-development device is designed to layer digital objects over your view of the real world, and is supposed to compete directly with Meta's upcoming mixed reality products. The goggles are being built by ByteDance's virtual reality startup Pico, the creators of the Pico 4 VR headset. Pico's past products have attempted to match Meta's Quest headsets in terms of features, but these new goggles apparently represent a different approach (albeit one still positioned as an alternative to Meta).

Rather than a bulky headset, the goggles are supposed to be small and light, about the size of the Bigscreen Beyond VR headset, which weighs 0.28 pounds. Pico is keeping the device lightweight by offloading most of the computing work to a puck connected to the goggles over a wire. Meta's prototype Orion AR glasses used a wireless puck for a similar weight-saving purpose when the company demoed them in November 2024. Pico is also reportedly working on building "specialized chips for the device that will process data from its sensors to minimize the lag or latency between what a user sees in AR and their physical movements," The Information writes.

Plenty of the details are still up in the air, but the report notes that the ByteDance / Pico goggles should be very similar to Meta's next mixed reality device. Following the release of the Quest 3S, Meta reportedly postponed work on the Quest 4 in favor of developing lightweight mixed reality goggles, according to UploadVR. The company has been publicly pushing AI wearables like the newly introduced Oakley Meta HSTN glasses, and it seems like its next Quest device will be closer to smart glasses than a VR headset with controllers.

It's not known when ByteDance's goggles will actually be released or where they'll be sold. Current Pico headsets aren't sold in the US, and given the concern over ByteDance's ownership of TikTok, it seems unlikely the company would be able to sell a mixed reality device without pushback.


Android Authority
01-07-2025
- Android Authority
Leaked render reveals Meta's Hypernova smart glasses and wristband controller
TL;DR
- A newly leaked render shows off Meta's upcoming 'Hypernova' smart glasses alongside its wrist controller accessory.
- Hypernova is expected to feature a screen in the right lens, an upgraded camera, and the ability to launch apps for taking photos and accessing maps. Users will also be able to see notifications.
- The 'Ceres' neural wristband will reportedly power gesture-based controls for the device.
- Hypernova is expected to cost $1,000–$1,400 and is said to be coming by the end of the year.

Meta's smart glasses with Ray-Ban have been a runaway hit, and the company is doubling down with more style variants. Beyond these, the company is also rumored to be working on a higher-end version of the smart glasses with a built-in screen, which is expected to launch by the end of the year. These glasses, codenamed Hypernova, have now leaked, giving us our first look at the next generation of Meta's AI smart glasses.

Leaker Arsène Lupin has shared a render of a pair of smart glasses and a wrist accessory, claiming the pair to be the upcoming Meta Hypernova glasses. Granted, this is a rather low-resolution render, but it at least gives us a fair look at the wrist accessory. What is this wrist accessory, you ask? A previous report claims that the Hypernova glasses will come with a 'neural' wristband controller, codenamed Ceres. These controllers were in the works for Meta's Orion AR glasses, but they could also seemingly be used by Hypernova to recognize hand gestures, such as rotating the hand to scroll through apps or pinching fingers and thumb to select items. It's unclear whether the pictured wristband controller is the Ceres controller, but it likely is. As for the Hypernova glasses themselves, they don't look all that different from the current-generation Meta Ray-Bans, but previous leaks suggest these include a built-in screen.
The monocular screen will reportedly be visible only in the lower-right quadrant of the right lens, though we can't make out the screen in this low-resolution render. When turned on, the display will show a home screen comprised of circular icons laid out horizontally, similar to the app dock we see on a lot of devices. Users will be able to use dedicated apps to take pictures, view photos, and even access maps. Other expected functions are said to include notification support for phone apps, including WhatsApp and Messenger. The glasses are said to be controlled via hand gestures recognized by Ceres and via capacitive touch on the sides of the frame. Meta is also said to be working on upgrading the camera on board the glasses.

Meta's Hypernova smart glasses are expected to be priced between $1,000 and $1,400. The final price (and potentially the final marketing name) will be decided closer to the announcement. Meta is also said to be already working on a second-generation version of the product, codenamed Hypernova 2, which will have two screens but won't come out until at least 2027.
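The interaction model described here (rotate to scroll the circular app dock, pinch to select, touch the frame) can be sketched as a small event dispatcher. This is a hypothetical illustration only: the gesture names, actions, and UI state are invented, not Meta's actual software.

```python
# Hypothetical mapping of wristband/frame gesture events to glasses UI actions.
ACTIONS = {
    "rotate_cw": "scroll_next_app",
    "rotate_ccw": "scroll_prev_app",
    "pinch": "select_item",
    "tap_frame": "toggle_display",  # capacitive touch on the frame side
}

def dispatch(event, state):
    """Apply one gesture event to a minimal UI state (a circular app dock)."""
    action = ACTIONS.get(event)
    if action == "scroll_next_app":
        state["index"] = (state["index"] + 1) % len(state["apps"])
    elif action == "scroll_prev_app":
        state["index"] = (state["index"] - 1) % len(state["apps"])
    elif action == "select_item":
        state["selected"] = state["apps"][state["index"]]
    elif action == "toggle_display":
        state["display_on"] = not state.get("display_on", False)
    return state

state = {"apps": ["camera", "photos", "maps"], "index": 0, "selected": None}
for event in ["rotate_cw", "rotate_cw", "pinch"]:
    state = dispatch(event, state)
print(state["selected"])  # → maps
```

The modulo arithmetic is what makes the dock "circular": rotating past the last icon wraps back to the first, which matches how a horizontally looping icon row would behave.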

Business Standard
09-05-2025
- Business
- Business Standard
Meta's next-gen smart glasses could get 'super-sensing' vision: What is it
Meta is reportedly developing a 'super-sensing' vision technology for smart glasses that could enable advanced real-time recognition of people, objects, and environments. According to a report by The Verge, citing The Information, the new Meta smart glasses, currently under development under the codenames 'Aperol' and 'Bellini', are expected to feature this AI-powered visual ability.

Meta's super-sensing vision
The new technology would allow Meta's smart glasses to identify people by name through facial recognition and trigger context-aware reminders. For instance, the AI could remind users to grab their keys if it notices they have not, or prompt them to pick up groceries when passing a store. To support these features, the glasses would need to keep their onboard cameras and sensors continuously active, something Meta has reportedly started testing on its current-generation Ray-Ban smart glasses. However, battery drain has reportedly proven to be a significant hurdle, which is why Meta is likely targeting the feature for its next-generation models with improved power efficiency.

New Meta smart glasses
While specific details on the upcoming models remain limited, previous reports indicate that the next-generation Meta Ray-Ban smart glasses will include a small display integrated into the lower section of the right lens. This would enable a heads-up, AR-like experience, allowing users to view notifications, run lightweight apps, and see images directly within their line of sight. Regarding input, these glasses are expected to support touch controls along the temple, and Meta is also developing a wrist-worn controller to offer additional interaction options. This wrist device is said to be similar to the one shown off with Meta's Orion AR prototype glasses last year.
In addition to the Ray-Ban partnership, Meta is also reportedly working on a new line of smart glasses in collaboration with Oakley, another eyewear brand under the EssilorLuxottica umbrella, which also owns Ray-Ban.

Competition
Apple is said to be working on a new pair of smart glasses similar in concept to the Meta Ray-Ban glasses. These are expected to use a custom chip based on the Apple Watch processor and could feature multiple built-in cameras. Apple plans to integrate Siri and its upcoming Visual Intelligence AI system to enable real-time scene recognition and contextual assistance. Samsung is also reportedly working on a pair of smart glasses under the project name "Haean." Like its upcoming Project Moohan XR headset, the smart glasses are expected to be based on Google's Android XR platform.
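The "context-aware reminder" behavior described above (remind you about keys when leaving, groceries when passing a store) boils down to simple rule matching over what the glasses' sensors have recently observed. Here is a minimal, purely hypothetical sketch of that logic; the rule format, contexts, and object names are all invented for illustration.

```python
def check_reminders(seen_objects, context, rules):
    """Fire every reminder whose trigger context matches the current one
    and whose required object was NOT observed recently by the cameras."""
    fired = []
    for rule in rules:
        if rule["context"] == context and rule["object"] not in seen_objects:
            fired.append(rule["reminder"])
    return fired

# Hypothetical rules a user (or assistant) might configure.
rules = [
    {"context": "leaving_home", "object": "keys",
     "reminder": "Don't forget your keys"},
    {"context": "passing_store", "object": "groceries_bag",
     "reminder": "Pick up groceries"},
]

# Cameras recently saw a phone and a wallet, but no keys, while leaving home.
print(check_reminders({"phone", "wallet"}, "leaving_home", rules))
# → ["Don't forget your keys"]
```

The hard part in practice is not this matching step but keeping the perception side (continuous object and scene recognition) running within a glasses-sized power budget, which is exactly the battery-drain hurdle the report describes.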
Yahoo
14-02-2025
- Business
- Yahoo
Meta is reportedly working on humanoid robots that help with chores
If you look at your Roomba with disgust, thinking about what a far cry it is from the Jetsons' Rosey the Robot, help is on the way. Bloomberg reported on Friday that Meta plans to leverage its advances in AI and augmented reality to build a platform for futuristic humanoid robots that can help with household chores like folding laundry. Meta is reportedly creating a new team within its Reality Labs hardware division, which handles Quest VR headsets and the long-term Orion AR glasses project.

Although it will build robot hardware during development, Meta's long-term goal is more like Android, where Google makes the software platform that almost all of the industry (outside of Apple) uses. Meta would make the underlying sensors, AI and software for other companies to put inside their hardware. In other words, it wants to be the Android of androids.

At least initially, Meta plans to make household chores the project's central focus. Bloomberg lists folding laundry, carrying glasses of water, putting dinnerware in the dishwasher and other home chores as examples to build excitement around what could be an unsettling product category for many people. (For examples of why those concerns may be warranted, look no further than the Unitree G1 robot that ran full-speed at Engadget's Karissa Bell at CES, momentarily pinning her against the onlooking crowd.)

Speaking of Unitree, Meta has reportedly held early discussions with the Chinese robotics company, which also makes a quadruped "robot dog" that can run around, climb stairs and sit on its hind legs like a good girl. Meta is also said to have discussed its plans with California-based humanoid robot maker Figure AI, which counts OpenAI, Nvidia, Microsoft, Intel and Jeff Bezos among its investors.
Today's humanoid robots aren't advanced enough to pitch in around the house like Rosey, but Meta believes all the resources it's sinking into AI and XR are paving a road to that destination. Although the company thinks it will be a few years before useful humanoid robots are widely available, Meta Chief Technology Officer Andrew Bosworth reportedly views the company's progress in hand tracking, low-bandwidth computing and always-on sensors as advantages. "The core technologies we've already invested in and built across Reality Labs and AI are complementary to developing the advancements needed for robotics," Bosworth reportedly wrote in a memo. "We believe that expanding our portfolio to invest in this field will only accrue value to Meta AI and our mixed and augmented reality programs."

Meta isn't alone in raising its eyebrows at the prospect of home robots for (likely rich) consumers. Last year, news broke that Apple was working on robotics. Ditto for Google. Both companies have published research papers on their robotics work. Flying cars may have to wait, but Rosey is looking a lot less like a pipe dream.