Latest news with #MetaView

Ray-Ban Meta Smart Glasses Debut in India with AI Features, Priced from Rs 29,900

Hans India

13-05-2025

  • Business

Meta and Ray-Ban have officially launched their AI-powered smart glasses in India, marking the debut of intelligent eyewear that merges classic design with next-gen features. Priced from ₹29,900, the Ray-Ban Meta smart glasses are now up for pre-order and will be available in select retail outlets starting May 19.

These innovative glasses integrate Meta's AI assistant, which can be triggered using the command 'Hey Meta.' This voice interaction enables users to perform everyday tasks like asking general queries, receiving real-time cooking tips, or identifying objects and landmarks, entirely hands-free.

One standout feature is real-time language translation, which supports English, Spanish, French, and Italian. Meta claims that even when the glasses are in airplane mode, downloaded language packs allow live audio translation: the user hears the translation through the glasses, while the conversation partner can access the translated text or audio on their smartphone.

Design-wise, the smart glasses come in the iconic Wayfarer frame (available in standard and large sizes) and the newly introduced Skyler style, crafted for a more universal fit. Buyers can customise lenses from options such as clear, sun, polarised, or Transitions. Prescription lenses are also supported.

The glasses sync seamlessly with the Meta View app, which allows users to review photos taken with the glasses and use AI to edit them. For instance, one can command the app to remove objects from an image or enhance it with new elements using voice instructions.

Beyond photography and AI assistance, the smart glasses also support calling and messaging via WhatsApp, Messenger, and the phone's native apps. Features like Instagram DMs and voice/video calling are set to roll out soon. Music streaming from platforms like Spotify, Apple Music, and Amazon Music is also supported, controlled entirely through voice.

However, while the glasses offer futuristic capabilities, some features remain restricted to users with English as their default language. Concerns around regional language support, data privacy, and long-term usability may influence adoption in India.

With AI tech becoming increasingly wearable, the Ray-Ban Meta smart glasses are a bold step into the future, but their real-world impact will unfold over time.

Ray-Ban Meta Smart Glasses launched in India with 12MP camera and Meta AI: Price, features and more

Mint

13-05-2025

Meta has officially introduced its AI-driven smart glasses in India, developed in partnership with EssilorLuxottica under the iconic Ray-Ban brand. Dubbed Ray-Ban Meta Glasses, the wearable device blends classic eyewear aesthetics with modern smart features, allowing users to capture photos and videos, enjoy music, communicate hands-free, and even interact with their surroundings using artificial intelligence.

The glasses come equipped with a 12MP ultra-wide camera, open-ear speakers, and a five-microphone array, enabling users to take high-resolution snapshots, record 1080p videos up to 60 seconds long, and livestream from a first-person perspective directly to Facebook or Instagram. The companion Meta View app also facilitates sharing content across other platforms.

The Ray-Ban Meta Glasses start at ₹29,900 for the Skyler and Wayfarer styles in Shiny Black. A Matte Black Wayfarer is available at ₹32,100, while the Chalk Grey Skyler and Matte Black Wayfarer variants are priced at ₹35,700. Pre-orders are now open, with sales commencing from 19 May via Ray-Ban's official website and leading eyewear retailers across the country.

The smart glasses are powered by Qualcomm's Snapdragon AR1 Gen 1 chip and offer 32GB of internal storage. According to Meta, users can expect up to four hours of use on a single charge, with an additional 32 hours provided by the charging case. The glasses also carry an IPX4 water-resistance rating, making them suitable for everyday use.

At the heart of the Ray-Ban Meta Glasses is Meta AI, the company's proprietary virtual assistant. Activated by the voice prompt 'Hey Meta,' the assistant supports hands-free tasks like music identification, live translation, and contextual queries based on the user's surroundings. Notably, the glasses can identify music tracks playing in the background, ideal for public settings like cafés or stores. The real-time translation feature supports English to Spanish, French, or Italian, with the translated speech played through the open-ear speakers. A transcription option is also available.

The AI capabilities go further with the 'Live AI' feature, which monitors the live camera feed to answer location-based queries without requiring a wake command. Users can also send direct messages, place audio and video calls, and receive messages through apps such as Instagram, using simple voice commands. In addition, music playback via Spotify, Apple Music, and Amazon Music is supported, with voice-enabled search options that allow users to request specific tracks, artists, or playlists.

Meta has a new stand-alone AI app. It lets you see what other people are asking. I'm confused.

Business Insider

01-05-2025

It's 10 p.m., and I'm in bed on my phone, listening to an audio clip of a woman asking an AI chatbot for medical advice about her sick pet turtle. As someone who loves to lurk in other people's business, I'm in heaven. But how did we get here? Let's back up.

This week, Meta launched the Meta AI app. The app has two functions: The first is that it replaces "Meta View," which was the app that went with Meta Ray-Ban glasses. The second (which doesn't require the glasses) is to be a stand-alone Meta AI assistant that you may have already encountered in Instagram and Facebook search. Basically, it's a chatbot app that (I guess?) is meant to compete with ChatGPT. This part of the app is pretty familiar. Meta AI can answer questions, chat with you, and make funny pictures for you, like one I had made of a dog reading BI.

Here's where it gets weird. There's also a public feed of other people's AI chats that you can scroll through. Most of this feed is people making silly images — Darth Vader eating ice cream, that sort of thing. Some of these came from suggested prompts when you first open the app.

To be clear, your AI chats are not public by default — you have to choose to share them individually by tapping a share button. But even so, I get the sense that some people don't really understand what they're sharing, or what's going on. Like the woman with the sick pet turtle. Or another person who was asking for advice about what possible legal measures he could take against his former employer after getting laid off. Or a woman asking about the effects of folic acid for a woman in her 60s who has already gone through menopause. Or someone asking for help with their Blue Cross health insurance bill. I found all those examples mixed in with funny cartoon images in my public feed. Perhaps these people knew they were sharing on a public feed and wanted to do so. Perhaps not.

This leaves us with an obvious question: What's the point of this, anyway? Even if you put aside the potential accidental oversharing, what's the point of seeing a feed of people's AI prompts at all? Meta's blog post announcing the AI app talked about the social aspect: "And just like all our platforms, we built Meta AI to connect you with the people and things you care about. The Meta AI app includes a Discover feed, a place to share and explore how others are using AI. You can see the best prompts people are sharing, or remix them to make them your own." (I asked Meta for comment.)

Is seeing other people's AI chats even interesting at all? Would it be interesting to see the AI chats of people I know? Yes, for snooping reasons. Is it interesting to see them for randos? Eh. I barely want to see real photos of people I don't know unless they're incredibly hot; I am bored pretty quickly by seeing AI slop from a stranger. Is a social AI feed the social feed of the future? Even trying to be as open-minded as possible about this, I am straining to see the appeal. I just don't get it.

Zuckerberg launches standalone Meta AI app

Express Tribune

01-05-2025

  • Business

Social media behemoth Meta unveiled its first standalone AI assistant app on Tuesday, challenging ChatGPT by giving users a direct path to its generative artificial intelligence models. "A billion people are using Meta AI across our apps now, so we made a new standalone Meta AI app for you to check out," the company's CEO and founder Mark Zuckerberg said in a video posted on Instagram.

Zuckerberg said the app "is designed to be your personal AI" and could be primarily accessed through voice conversations, with the interactions personalised to the individual user. "We're starting off really basic, with just a little bit of context about your interests," the CEO said. "But over time, you're going to be able to let Meta AI know a whole lot about you and the people you care about from across our apps, if you want."

Embracing the company's social media DNA, the app features a social feed allowing users to see AI-made posts by other users. "We learn from seeing each other do it, so we put this right in the app," Meta chief product officer Chris Cox said Tuesday as he opened the tech titan's LlamaCon developers gathering devoted to its open-source AI model. "You can share your prompts. You can share your art. It's super fun."

The new application also replaces Meta View as the companion app for Ray-Ban Meta smart glasses, allowing conversations to flow between glasses, mobile app and desktop interfaces, the company said. "We were very focused on the voice experience; the most natural possible interface," Cox said.

Meta also added an experimental mode designed to let the AI app engage in human-style conversations with users. "You can hear interruptions and laughter and an actual dialogue - just like a phone call," Cox said. The executive explained that the feature isn't able to search the web, so asking about topics such as sports teams or the Papal conclave was off the table for now.

Users will have the option of letting Meta AI learn about them by looking at their activity on their Instagram or Facebook accounts. "It will also remember things you tell it like your kids' names, your wife's birthday, and other things you want to make sure your assistant doesn't forget," Cox said.

Meta touted the advantages of Llama at the one-day event aimed at getting developers to embrace the AI model, which it describes as open-source. Open source means developers are free to customise key parts of the software as suits their needs. "You have the ability to take the best parts of the intelligence from the different models and produce exactly what you need, which I think is going to be very powerful," Zuckerberg told developers tuned into LlamaCon.

AFP

Meta releases standalone AI app, competing with ChatGPT

The Sun

30-04-2025

  • Business

SAN FRANCISCO: Social media behemoth Meta unveiled its first standalone AI assistant app on Tuesday, challenging ChatGPT by giving users a direct path to its generative artificial intelligence models. 'A billion people are using Meta AI across our apps now, so we made a new standalone Meta AI app for you to check out,' the company's CEO and founder Mark Zuckerberg said in a video posted on Instagram.

Zuckerberg said the app 'is designed to be your personal AI' and could be primarily accessed through voice conversations, with the interactions personalized to the individual user. 'We're starting off really basic, with just a little bit of context about your interests,' the CEO said. 'But over time, you're going to be able to let Meta AI know a whole lot about you and the people you care about from across our apps, if you want.'

Embracing the company's social media DNA, the app features a social feed allowing users to see AI-made posts by other users. 'We learn from seeing each other do it, so we put this right in the app,' Meta chief product officer Chris Cox said Tuesday as he opened the tech titan's LlamaCon developers gathering devoted to its open-source AI model. 'You can share your prompts. You can share your art. It's super fun.'

The new application also replaces Meta View as the companion app for Ray-Ban Meta smart glasses, allowing conversations to flow between glasses, mobile app and desktop interfaces, the company said. 'We were very focused on the voice experience; the most natural possible interface,' Cox said.

'Like a phone call'

Meta also added an experimental mode designed to let the AI app engage in human-style conversations with users. 'You can hear interruptions and laughter and an actual dialog - just like a phone call,' Cox said. The executive explained that the feature isn't able to search the web, so asking about topics such as sports teams or the Papal conclave was off the table for now.

Users will have the option of letting Meta AI learn about them by looking at their activity on their Instagram or Facebook accounts. 'It will also remember things you tell it like your kids' names, your wife's birthday, and other things you want to make sure your assistant doesn't forget,' Cox said.

The release comes as OpenAI stands as a leader of straight-to-user AI through its ChatGPT assistant, which is regularly updated with new capabilities. Meta touted the advantages of Llama at the one-day event aimed at getting developers to embrace the AI model, which it describes as open-source. Open source means developers are free to customize key parts of the software as suits their needs. OpenAI's closed model keeps its inner workings private.

'Part of the value around open source is that you can mix and match,' Zuckerberg told developers tuned into LlamaCon. 'You have the ability to take the best parts of the intelligence from the different models and produce exactly what you need, which I think is going to be very powerful.'
