
Samsung bets big on AI: Gemini Assistant rolls out to Watch6 and Buds3
Samsung's Galaxy ecosystem is set for a major intelligence boost this winter, as the tech giant announces the arrival of the Gemini AI assistant on the Galaxy Watch6 series and subsequent models. Marking Gemini's first integration into Samsung's wearable line-up, the app promises to make everyday tasks smarter and more effortless for users.
The new update, which extends Gemini's AI features beyond smartphones, aims to offer a more unified and intuitive experience across the Galaxy platform. Users will soon be able to harness powerful AI functionalities directly from their wrists and ears, transforming the way they interact with their devices.
With Gemini on the Galaxy Watch, tasks that typically require your smartphone can now be handled with simple voice commands. Whether you are mid-workout or on the move, you will be able to locate important emails or send summaries via text without pausing what you are doing. For instance, users can ask, 'Where's the latest email about yesterday's board meeting?' and promptly instruct Gemini to forward a brief summary to their team.
The hands-free capability is particularly useful in situations where using a phone is inconvenient. Out running errands with your hands full? Simply activate Gemini on your watch and ask for directions to the nearest café—no tapping or scrolling required.
The update also enhances the role of the upcoming Galaxy Buds3 series in Samsung's AI ecosystem. Gemini can be launched via voice commands or touch gestures like pinch-and-hold, enabling users to interact fluidly with their Galaxy smartphone through the Buds.
In a similar vein, Apple might surprise fans with a new technology that blends wearable convenience with advanced visual intelligence. According to a Bloomberg report published earlier this week, the tech giant is developing updated versions of the AirPods and Apple Watch that feature built-in cameras, with a potential launch expected around 2027.

Related Articles


India Today
Apple Liquid Glass may look like a mess but there is likely a method behind this messiness
A day ago, Apple announced a major redesign of its software. Across devices, including the iPhone and Mac, the company is bringing a new design language that it calls Liquid Glass. For a tech company whose products and services are used by millions of people, any software redesign is a major endeavour. For a company like Apple, whose devices are used by over two billion people, it is almost a monumental task. Most significantly, it is arguably the single biggest product decision Apple can take, given its reputation for design, and I am sure Apple must have considered all of it. And then it must have decided that Liquid Glass is the way to go.

So how is Apple's first step faring? Well, not so well, as you will notice if you go to social media and search for iOS 26. Rather, that is putting it mildly. It is a proper mess at the moment. The Liquid Glass interface, as Apple envisions it, looks great in concept. But in practice, as people are pointing out after downloading the developer beta of iOS 26 and macOS 26, it is prone to chaos. Transparency in an interface is always tricky, and glass effects, complete with glinting light, make an interface way too performative at the cost of usability. This is what early users are discovering. The transparency is leading to instances where text gets jumbled across different layers and is not very readable. This is particularly apparent with the iOS 26 control centre, something that social media users are gleefully sharing as a failure of imagination and aesthetic sense. The instances of 'failures' are many.

Then there are a few more complications that Liquid Glass brings to the user interface. The contrast between colours and layers seems off in certain use cases because of transparency. Similarly, different layers or busy user interfaces, where you have lots of buttons, look like a scrambled mess.
The chatter on social media, from those who love Apple as well as those who don't, seems to be unequivocal: this Liquid Glass is not gonna flow. In some instances, the Liquid Glass design of iOS 26 can indeed be a mess. But I feel most people are missing some nuances in this conversation. They are also focussed more on Liquid Glass 'fails' instead of looking at the whole thing with a bit more attention. It is true that parts of the interface, as evident from the messy control centre in iOS 26, are problem areas. But I feel these are also niggles that will be fixed in the next few months as Apple prepares the software for public release around September-October. Liquid Glass is something new, something extremely ambitious and tricky. It will take time before Apple can iron out the kinks, and I feel Apple knows this.

As a tech journalist, whenever I have felt like criticising a tech product, I have always first taken a step back to think about it with a bit more thoroughness. This is because companies like Apple and Google don't make stupid mistakes. They have way too many smart people, way too many resources, way too many methods and tools to make silly and stupid mistakes. Instead, they make product calls, and when they do, they do so knowing full well the problems and potential of the calls they are making.

This is not to say that tech giants don't make mistakes. But they make strategic mistakes. However thorough you are in the lab, the real world can undo all your hard work with its unpredictability. Users have their own ways of using products, and while Apple and Google might hope they can bend their users to their will and taste, it doesn't always work out. Still, silly and stupid mistakes are rare. And I don't think Liquid Glass is a mistake, not in the sense people on social media are saying it is. I believe Apple has some good reasons to go with Liquid Glass. The company knows that it is going to be challenging, but it also knows that the effort is going to be worth it.
It is a call made with an eye on the future. The more interesting question is: why has Apple chosen the Liquid Glass interface? Now, obviously I am not privy to the company's product decisions, but I feel it might have settled on this interface language because of the future it imagines for its products, since nothing else makes as much sense. Chances are that Apple sees a future where screens are going to be translucent or transparent. It also probably believes that the interface is going to move away from buttons to voice, where the feedback to user input on the screen would come through light and glow effects, somewhat similar to how Siri works now. In such a world, Liquid Glass could be the perfect user interface for devices.

I mean, imagine an iPhone with a translucent display, or Apple AR glasses, which are set to arrive in a year or two, and then imagine the current software design on them. It would be ugly, and would even be non-functional in certain ways. Now imagine the same devices with the Liquid Glass interface, and suddenly you will see how much more sense it makes. If you try to imagine the current iOS and macOS design on the futuristic iPhone and Mac, the whole thing looks very ugly. (AI generated image)

Apple is a company that tends to play the long game. It will start something today with an aim to iterate on it and use it for the next 10 years. The design language it started with iOS 7 has continued until now. That is a long time, and it also indicates that what the company is now starting with Liquid Glass is going to be here for a long time. The fact that Liquid Glass has been created for future devices is amply highlighted by Apple's Alan Dye, who notes that 'it lays the foundation for new experiences in the future and, ultimately, it makes even the simplest of interactions more fun and magical.' When looked at in this context, Liquid Glass makes a fair amount of sense.
It is true that currently it is seemingly a mess and will require some serious tweaks from Apple to make it work on current iPhones and Macs. But even there, I believe, the playfulness and the technical wizardry behind the 'liquid glassy' animations and interactions will woo most users by the end of this year, once Apple has fixed some of the teething issues.

Personally, I am not a fan of glassy or skeuomorphic designs. I find the leather look or glass look boring and unnecessarily complicated. Instead, I have always been a fan of flat 2D designs, which use layers and colours for all the functionality they can offer. They are simple and effective. I also prefer clean and straight lines on screen. In that sense, I have always loved what Google has done with its Material Design on Android, or what Microsoft tried to do, rather heavy-handedly, with Windows.

So yes, I find Liquid Glass not all that appealing at the moment. But my opinion is also shaped by the devices currently available to us. And similar is the case for almost everyone who is slamming Apple on social media for the new iOS 26 or macOS 26. We are not (yet) thinking of the future in the way Apple probably is, because in that future Liquid Glass will probably be a perfect fit for the iPhone and the Mac of 2030.


India Today
Apple almost confirms AirPods Pro 3, iOS 26 beta source code spills the beans
If the latest development is to be believed, Apple might be planning the launch of its next-generation AirPods Pro, likely to be called the AirPods Pro 3. There has been quite a bit of confusion around its release, especially with rumours suggesting the device could feature tech like infrared cameras and more. While Bloomberg's Mark Gurman initially expected the new AirPods Pro to arrive in 2025, noted analyst Ming-Chi Kuo claimed that Apple would instead launch the earbuds with infrared camera tech in 2026, hinting that no new Pro model would arrive this year. Gurman, on the other hand, believes the camera tech isn't ready yet and won't arrive anytime soon.

That said, we're already halfway through the year, and there's been no official sign of an AirPods Pro launch. However, a recent report from MacRumors might offer the first real clue. According to the report, contributor Steve Moser spotted a reference to 'AirPods Pro 3' buried inside the iOS 26 source code. The mention was found in the developer beta of iOS 26, within the headphone UI framework. While this isn't an official confirmation, Apple has used similar references in the past to quietly signal the existence of upcoming products.

Adding to this, MacRumors analyst Aaron also found another interesting reference. In a particular system prompt, Apple reportedly uses the line, 'This task requires Apple AirPods Pro 2 or later.' At the moment, there is no model 'later' than the AirPods Pro 2, which is the only Pro model currently on sale. This has naturally sparked more speculation about whether a third-gen model is on the way.

In terms of design and hardware expectations, the recently released AirPods 4 might offer some clues about what to expect from the AirPods Pro 3.
The next-gen earbuds could feature a refreshed design for both the case and the buds, possibly slimmer and sleeker, along with a concealed LED indicator and a new capacitive button for pairing.

When it comes to features, Apple is said to be testing a faster audio chip that could offer significantly better Active Noise Cancellation than the already impressive AirPods Pro 2. Mark Gurman also reports that Apple is working on health features, including in-ear heart rate tracking for the AirPods Pro 3. While the Powerbeats Pro 2 already collect heart data, they can't stream music to gym equipment at the same time. Apple may be aiming to offer both biometric tracking and uninterrupted audio playback with the next Pro earbuds, something that would appeal to fitness-focused users.

Additionally, Apple is researching ear-canal temperature sensing. If included, this could provide more accurate body temperature readings than the current Apple Watch, which relies on skin temperature. However, it's unclear whether this feature will be ready in time for a 2025 launch.

Another interesting feature being discussed is real-time translation. Rumour has it that the AirPods Pro 3 may work with the iPhone's Translate app to enable live conversation translation directly through the earbuds. In theory, this could allow someone to hear instant English translations of another person speaking in another language (and vice versa), without needing to use a phone.

As for the possible release date, while there's still uncertainty around when the AirPods Pro 3 might arrive, and whether it will be this year or next, one thing's for sure: if Apple does decide to launch the earbuds in 2025, the most likely time would be alongside the iPhone 17. That would follow Apple's past pattern, with the first-gen AirPods Pro arriving alongside the iPhone 11 series in 2019 and the second-gen model launching with the iPhone 14 series in 2022. If the iOS 26 code reference is anything to go by, we might just see the new AirPods Pro this September.


Times of India
AI Becoming the New Gordon Ramsay? How AI is Sneaking into Your Kitchen
AI is silently revolutionizing a domain that nobody saw coming: the kitchen. It made its mark with sneak attacks in e-grocery apps like Swiggy Instamart, BigBasket, Blinkit, and Zepto via the quick little 'suggested for you' and 'you may also like' sections. Today, AI is 'hard launching' itself as a private chef, creating a niche in food management, cooking, and diet planning by suggesting recipes based on leftovers in the fridge. This development is particularly beneficial for single professionals who live alone in today's fast-paced world.

A relationship beyond just personal assistants: AI integration in household appliances is no longer limited to personal assistants like Alexa or Google Home. Industry leaders such as Samsung and LG are etching AI into kitchen appliances to deliver personalized food experiences. Samsung's Bespoke AI fridge, for instance, not only keeps track of the contents stored inside but also integrates with fitness applications and dietary preferences to suggest meals tailored to the user's needs. In the same way, LG's ThinQ products communicate with one another, optimizing cooking time and energy use in accordance with the user's routine. AI's integration into kitchen appliances aims to simplify meal prep, reduce food waste, and promote healthy eating habits when users aren't able to on their own, like the urban dweller balancing work and wellness on the go.

Startups' involvement in the AI food revolution: A startup segment of the industry is buzzing with innovative AI-driven solutions targeting meal prep and reduced food wastage.
Companies such as DishGen use AI to curate recipe suggestions from leftovers readily available in the refrigerator, helping single professionals make the most of what they have in a cost-effective and time-efficient manner. There also exist other AI tools, such as FridgeAI and LeftoverAI, that focus on food waste management by analyzing leftover ingredients and suggesting creative recipes to satiate the user's cravings in a cost-efficient manner. These AI platforms dive into extensive recipe databases and nutrition science, making them valuable tools for diet- and budget-conscious users looking for convenience without compromising on either.

It is fascinating to realize that this is just one part of AI's integration into our daily lives. Beyond fridges and ovens, startups have begun to explore AI's incorporation into blenders, rice cookers, spice organizers, and even grocery lists that sync conveniently with smart kitchens. This AI-everywhere approach re-writes home cooking as an effortless and customized experience.

By promoting food waste management and healthy eating habits, and by raising awareness about the significance of nutritional knowledge, AI-driven kitchen appliances and meal-prep startups aren't just extending convenience. Rather, they contribute to sustainability goals and improve public health. Yet that doesn't exempt them from raising serious concerns about data privacy and user trust. Should AI really be entrusted with sensitive information like a user's diet and lifestyle preferences? As advancements in such technologies accelerate, it's crucial to strike a balance between innovation, transparency, and security.