Latest news with #MetaRay-Ban


Indian Express
23-05-2025
- Business
- Indian Express
Apple may launch AI powered smart glasses by the end of 2026
After experimenting with mixed-reality headsets, Apple now seems interested in making Meta Ray-Ban-like smart glasses. The Cupertino-based tech giant is reportedly 'ramping up work on the glasses' and may produce working prototypes by the end of the year. The news comes from Bloomberg's Mark Gurman, who, citing people with knowledge of the matter, said that Apple is planning to launch the Meta Ray-Ban competitor 'at the end of next year as part of a push into AI-enhanced gadgets.' Like Meta's much-rumoured smart glasses codenamed Hypernova, Apple's glasses are said to pack a camera along with a set of microphones and speakers, allowing the gadget to analyse the world around the wearer and process voice inputs using Siri. The smart glasses could also let users make phone calls, control music playback, get turn-by-turn directions and translate conversations in real time.

The report states that the glasses, internally dubbed N50, are now clubbed under the descriptor N401, which may be a broader project exploring similar products. But given Apple's long history of cancelling products, the company could still end up scrapping the idea. Earlier this month, Gurman had claimed that Apple was internally working on a dedicated chip for smart glasses, with plans to start manufacturing as early as next year. The report also claims that Apple was actively working on versions of the Apple Watch and Apple Watch Ultra with cameras for 2027, but that project was shut down earlier this week. And, as strange as it may sound, Gurman says the tech giant is still working on AirPods with cameras. With the Apple Vision Pro seeing poor sales figures and smartphone upgrades becoming increasingly incremental, Apple appears to be shifting its focus to other product categories, particularly wearables.
However, people working on Apple's smart glasses are worried that the company's struggle with AI may make the product less appealing compared to Meta's upcoming Ray-Bans and Samsung's Project Moohan, which will benefit from Meta's Llama and Google's Gemini AI models respectively. Recently, Google announced that Gemini Live with vision will be free for all to use, while Apple is still relying on Google Lens and OpenAI's technology for its Visual Intelligence feature. As for the Apple smart glasses, the majority of the work is reportedly undertaken by the Vision Products Group, the same team that developed the Vision Pro. The team is also said to be working on a much cheaper and lighter model, in addition to one that reduces lag by connecting to a Mac for lower-latency processing.


Tom's Guide
22-05-2025
- Business
- Tom's Guide
Apple's ‘AI push' could mean smart glasses arrive as soon as 2026
According to a new report from Bloomberg's Mark Gurman, Apple is seeking to release a set of smart glasses by the end of 2026 as part of a "push into AI-enhanced gadgets." The Apple Glasses, meant to take on the Meta Ray-Ban glasses and any upcoming products built on the Android XR platform that Google showed off this week, have entered ramped-up development to meet the target date. Prototypes should be produced by the end of this year, the Bloomberg report claims. With OpenAI buying former Apple Chief Design Officer Jony Ive's company (which he co-founded with OpenAI's Sam Altman) to build the 'iPhone of AI', it seems the Cupertino giant is feeling the pressure. In April, it was reported that Apple CEO Tim Cook is "obsessed" with launching a pair of Apple Glasses.

Like other smart glasses, the Apple version is supposed to feature cameras, microphones and speakers. Coupled with Apple Intelligence and Siri, they could potentially analyze the external world and take on tasks like music playback, live translations, and phone calls. Gurman claims that Apple eventually wants its glasses to use augmented reality (AR), with displays and other tech showing digital content on the lenses, but that feature might not come any time soon. Allegedly, Apple's Vision Products Group, makers of the Vision Pro headset, will develop this product. And while the group is working on a new version of Apple's spatial computing headset, the Glasses are apparently getting the bulk of its focus. The group is also supposed to be helping design a chip meant for smart glasses, which might launch next year.

Much of Apple's future plans depend on the company bolstering Apple Intelligence, something it has struggled with since its take on AI was announced in June 2024. A number of reports have come out in the last few months claiming that Apple couldn't get its priorities in order, especially when it comes to Siri.
Recently, Apple has started to open up its walled garden by allowing third-party LLMs to bolster Apple Intelligence alongside ChatGPT, which is already integrated with Siri. For the rumored glasses to succeed in the way Apple wants, the company will have to offer a more robust version of its AI tools, including a smarter version of its personal assistant. That could happen with the iOS 19 update likely to arrive later this year: not only will iOS 19 offer a redesign of Apple's iPhone software, it's supposed to give Apple Intelligence a boost. There have been rumors that Apple was working on an Apple Watch or Apple Watch Ultra that would feature a camera; however, Gurman claims those plans have been scrapped. A rumored AirPods update featuring built-in cameras is still in the works; reportedly, those earbuds would launch next year as well. Next year could be big for Apple, with new products and overhauled classics - the company's first foldable phone should also launch late next year.


The Irish Sun
22-05-2025
- The Irish Sun
I was among the first to try Google's ‘all-seeing' smart glasses – three features make it worth DITCHING your mobile
WHEN you've seen as much tech as I have in over a decade of reporting, there are few things that leave you truly excited. But Google's latest mind-blowing innovation may well fill that void - and much sooner than you may think.

[Pictured: From some angles you can just about make out the hidden screen projection. Credit: AP]
[Pictured: The glasses are a prototype running Google's Android XR operating system, coming later this year. Credit: AFP]
[Pictured: I was among the first to try them out. Credit: Jamie Harris / The Sun]

The tech big beast hosts a huge show every year in California to show off a load of major gizmos and projects. But the one that everyone is talking about this year is meaningful tech right up to our eyes. Meaningful is key here - as it's not the first time we've seen wearables from Google. The firm boldly tried it with Glass. Even Google co-founder Sergey Brin went on stage and admitted he "made a lot of mistakes" on Glass. Now the tech giant is on the cusp of taking another stab at eyewear in the form of Android XR, working with partners like Samsung.

I attended the Google I/O conference for the second time, where I got the chance to try out a prototype pair for a few minutes. While there is still some work to be done, I got a clear idea of where the device is going and several super handy use cases that will be truly life-changing for anyone.

First things first, these glasses look and feel like any ordinary pair of glasses, which is essential for this category to succeed. We've seen how well that has worked with the Meta Ray-Bans. The pair I'm trying show visuals over one lens which don't distract from your environment. You can also press a button on the side to take a photo or video.

Information about items
Much like the Meta Ray-Ban glasses, there is a lot of AI that can intelligently provide you with information about objects around you.
Of course, I only get a few short minutes to try Google's Android XR prototype in a controlled environment, so a real-world test to push the device to its limits is needed. But for now, there's a book and a plant in front of me. I ask Gemini what the book is about, and it gives me a brief summary of the series it's from and a flavour of the contents. Next, I ask about the plant and how to take care of it. It responds with the name of the plant and tells me the best way to look after it too. Digital voice-powered assistants up until now have been notoriously finicky, only responding to very clear - and often rigid - commands or questions. Here, I was able to mumble, pause and word my questioning loosely without confusing Gemini in the slightest.

Getting around
Mapping could be a real game-changer for everyone on Android XR. When asking for directions, I can see an arrow floating in front of me with the name of the road I should turn down. And when I glance down, I can see a small circular projection on the ground showing the route. As someone who has spent far too long glancing from phone to path and back, I can see this feature being hugely useful.

And more to come
There are some features we've heard about but didn't get to test just yet which could also make life remarkably easier. We've seen in demo videos how Android XR is all-seeing, so it takes in and remembers what it sees proactively. For example, when wearing them you may have put your keys down on the sofa. Ask Gemini where you left your keys and it will remember seeing them and tell you where. Similarly, we saw from one Google exec on stage during a demo how she had a coffee, put it down and minutes later asked Gemini to recall which coffee shop it was from - and the tech remembered.
So watch this space, as Android XR is due to be released at the end of the year, and we've already heard about Samsung and other partners building hardware for it. Oh, and there is one huge caveat too - for as much as it means less screen time on your phone, you'll still need your handset close by to run Android XR devices, so you won't be able to ditch your mobile entirely.
Yahoo
13-05-2025
- Automotive
- Yahoo
Detroit driving instructor catches dangerous stunt on camera on I-94
The Brief: A dangerous stunt allegedly involving alcohol was caught on camera on I-94 by Detroit driver's ed instructor Korey Batey, who records with Meta Ray-Ban glasses and, with his students' permission, creates humorous social media content.

DETROIT (FOX 2) - Driving around Metro Detroit can have its moments, with construction, excessive speeders, and even a couple of guys trying to share a cocktail in the middle of the highway.

What they're saying: If roads could talk, these Detroit highways would have some stories to tell, but this story, FOX 2 has learned, was caught on camera by a content creator and driving instructor. In the video, viewers can see two vehicles driving next to each other ahead of the instructor. In the one on the left, a passenger tries to pour liquor into the other driver's cup in the middle of I-94, on the east side of Detroit. "I could have been easily looking at the cell phone and looked up, and hit them," Batey said. "I just say, 'hey Meta, record.'"

Dig deeper: Batey uses the glasses to film his lessons. "I really try to take my time with them, show some grace, but I'm also very stern, because this isn't GTA, this is real life," he said. Having a little fun and hopefully sharing some driving wisdom is what Batey is all about. Meanwhile, it is something the two he caught on camera may need more of. "What I'd like to tell them is that it only takes one moment like that to change your life forever," he said. Batey thinks maybe they were trying to make some kind of content of their own with the stunt; regardless, it's dangerous from top to bottom.

Business Standard
09-05-2025
- Business
- Business Standard
Meta's next-gen smart glasses could get 'super-sensing' vision: What is it
Meta is reportedly developing a 'super-sensing' vision technology for smart glasses that could enable advanced real-time recognition of people, objects, and environments. According to a report by The Verge, citing The Information, the new Meta smart glasses - currently under development under the codenames 'Aperol' and 'Bellini' - are expected to feature this AI-powered visual ability.

Meta's super-sensing vision
The new technology would allow Meta's smart glasses to identify people by name through facial recognition and trigger context-aware reminders. For instance, the AI could remind users to grab their keys if it notices they have not, or prompt them to pick up groceries when passing a store. To support these features, the glasses would need to keep their onboard cameras and sensors continuously active - something Meta has reportedly started testing on its current-generation Ray-Ban smart glasses. However, battery drain is said to have proven a significant hurdle, which is why Meta is likely targeting the feature for its next-generation models with improved power efficiency.

New Meta smart glasses
While specific details on the upcoming models remain limited, previous reports indicate that the next-generation Meta Ray-Ban smart glasses will include a small display integrated into the lower section of the right lens. This would enable a heads-up, AR-like experience - allowing users to view notifications, run lightweight apps, and see images directly within their line of sight. Regarding input, these glasses are expected to support touch controls along the temple, and Meta is also developing a wrist-worn controller to offer additional interaction options. This wrist device is said to be similar to the one shown off with Meta's Orion AR prototype glasses last year.
In addition to the Ray-Ban partnership, Meta is also reportedly working on a new line of smart glasses in collaboration with Oakley - another eyewear brand under the EssilorLuxottica umbrella, which also owns Ray-Ban.

Competition
Apple is said to be working on a new pair of smart glasses similar in concept to the Meta Ray-Ban glasses. These are expected to use a custom chip based on the Apple Watch processor and could feature multiple built-in cameras. Apple plans to integrate Siri and its upcoming Visual Intelligence AI system to enable real-time scene recognition and contextual assistance. Samsung is also reportedly working on a pair of smart glasses under the project name "Haean." Similar to its upcoming Project Moohan XR headset, the smart glasses are expected to be based on Google's Android XR platform.