Google Gemini AI gets major upgrades for foldables, Wear OS and mobile gaming


Hindustan Times · 10-07-2025
Google has just dropped a bombshell for Android users, and it's one that could change the way you use your phone, smartwatch and even your games. Gemini AI now sees through your camera, talks to your apps, and lives on your wrist.
Unveiled at Samsung's Galaxy Unpacked event, five new Gemini features are coming to Android 16 and Wear OS 6, signalling a significant shift in how AI will operate across devices. From foldables and smartwatches to mobile games and system apps, Gemini is being embedded into everyday interactions, not just in the background.
Foldable phones just got smarter with Gemini Live
Let's start with the Galaxy Z Flip 7. Gemini Live, Google's hands-free, real-time AI companion, will now work directly from the cover screen of the upcoming device. That means users won't even need to unfold the phone to access Gemini for quick tasks, instructions or conversational assistance.
More impressively, Gemini is gaining camera awareness via Flex Mode. Open the phone halfway, activate the camera and Gemini will use visual input to understand what you're doing, offering context-aware help while cooking, assembling furniture or reviewing an outfit.
Circle to Search becomes conversational and contextual
Circle to Search, originally designed to let users draw a circle around on-screen content and trigger a search, is getting a major upgrade. Gemini now powers the results through a conversational interface. Instead of just returning links, it will offer summaries, allow follow-up questions and help explore topics without leaving the current screen.
In a surprise move, Google is extending this feature to mobile gaming. Players can now circle in-game elements like enemies, tools or puzzles, and Gemini will offer real-time tips, strategy suggestions, or walkthrough guidance tailored to their progress.
Gemini starts talking to your native apps
Google is also embedding Gemini into Samsung's default Calendar, Notes and Reminders apps. You can now ask Gemini to summarise your schedule, add reminders or extract key points from your notes, without needing to open or switch between apps. More third-party integrations are expected in future updates.
Wear OS gets a smarter assistant
Gemini is replacing Google Assistant on the Galaxy Watch8 series and other upcoming Wear OS 6 devices. It will offer more natural conversations, smarter notifications and better contextual responses, finally addressing one of Wear OS's long-standing weak points.
Final word
This rollout by Google isn't just an AI feature drop but a platform-level shift. With Gemini becoming visually aware, contextually embedded and cross-device fluent, Google is clearly repositioning Android to not only be smarter, but also meaningfully assistive.

Related Articles

Oppo's AI Vision: Smartphones as Empathetic Partners, Not Replacements

Hans India

Oppo is redefining the future of smartphones by positioning artificial intelligence (AI) as a collaborative tool that amplifies human potential, not one that competes with it. The company is actively investing in AI to transform the smartphone into an intelligent, empathetic assistant that complements daily life through intuitive, useful features.

At the Mobile World Congress (MWC) 2025, Oppo introduced its enhanced AI strategy, with an ambitious goal to bring generative AI to 100 million users globally by the end of 2025. Peter Dohyung Lee, Head of Product Strategy at Oppo, emphasized India's importance in this vision. 'India is central to our goal of bringing GenAI to 100 million global users by 2025,' Lee told India Today Tech, highlighting the country's rapid AI adoption and tech-savvy consumer base.

Since 2020, Oppo has been building its own large language models (LLMs), becoming the first smartphone brand to deploy a 7-billion-parameter LLM directly on a device. These efforts have led to the rollout of over 100 generative AI features across Oppo smartphones in 2024 alone.

AI is deeply woven into Oppo's internal and product ecosystem. From features like the HyperTone Image Engine for improved photography to intelligent battery optimization via SuperVOOC charging, AI enables a smarter, more personalized user experience. Internally, the company uses AI to enhance R&D, automate testing, and streamline development.

'Our goal is simple – AI for all,' said Lee. Oppo is democratizing AI by embedding it across all product tiers, not just premium devices. The recently launched Reno 14 series is a testament to this, integrating advanced tools such as AI Eraser 2.0, AI Best Face, and productivity boosters like AI Voice Scribe and AI Translate.

Strategic collaborations with global tech leaders such as Google, Microsoft, MediaTek, and Qualcomm are helping Oppo push the boundaries of mobile AI. For instance, Google's Gemini is now integrated into Oppo's ecosystem, enabling users to perform complex tasks using natural language across apps. Microsoft's Azure AI brings improved transcription services and will soon allow PC users to control connected Oppo smartphones using Copilot.

Oppo also stresses data privacy, investing heavily in encryption, firewalls, and on-device processing to ensure responsible AI implementation. 'AI means more data, but it also means more responsibility,' Lee remarked, underlining the brand's commitment to user trust.

The Indian market, with over 690 million users, plays a critical role in this expansion. Lee observed that Indian consumers expect flagship-grade features even in mid-tier phones, which aligns with Oppo's mission to make GenAI features broadly accessible. 'The future is not about humans vs AI, but it's about humans and AI,' Lee concluded. As smartphones evolve into context-aware, real-time collaborators, Oppo envisions a future where AI enriches creativity, communication, and everyday convenience, always with empathy and user control at the forefront.

Android earthquake alert: Google admits algorithm-driven system limitations

Business Standard

After two years, Google has reportedly admitted that its earthquake early warning system, dubbed the Android Earthquake Alert System, failed to accurately alert people during Turkey's deadly quake of 2023. According to a BBC report, the US technology giant accepted that rather than sending its highest-level alert to ten million people within 98 miles of the epicentre, it sent such alerts to only 469 users during the first, 7.8-magnitude earthquake. A timely alert would have given people around 35 seconds of warning to take precautionary measures, the report added. Some might wonder why Google failed to send these alerts in a timely manner, while others might wonder how Google even detects which people will be affected by an earthquake. Let's find the answers to these questions and more.

What is Google's Android Earthquake Alert System, and how does it work?

Google's Android Earthquake Alert System is a mechanism that detects early signs of seismic activity using the motion sensors built into Android smartphones. When a potential earthquake is identified, the system uses an internet connection to send alerts to nearby Android users who might be affected by strong ground shaking. Additionally, when people search for earthquake-related information on Google, the system provides real-time details about recent tremors in the area, along with safety tips.

For the uninitiated, most Android phones come with accelerometers, sensors that detect movement and orientation, which can also act like small earthquake detectors. A phone's accelerometer can pick up the earliest signs of ground shaking, the initial tremors (P-waves) of an earthquake. If several phones in the same area detect similar shaking at the same time, Google's servers analyse the data to estimate whether an earthquake is happening, where it started, and how strong it might be.
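The crowd-sourced detection step described above can be sketched minimally: the server looks for many distinct phones reporting shaking close together in space and time before declaring a candidate event. Everything below, including the thresholds and the centroid-based epicentre estimate, is an illustrative assumption, not Google's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class Trigger:
    """One phone's report that its accelerometer sensed shaking."""
    phone_id: str
    t: float    # seconds since epoch when shaking was sensed
    lat: float
    lon: float

def detect_event(triggers, min_phones=4, window_s=5.0, radius_deg=0.5):
    """Declare a candidate earthquake if enough distinct phones in roughly
    the same area report shaking within a short time window.
    All thresholds here are illustrative, not Google's real values."""
    triggers = sorted(triggers, key=lambda tr: tr.t)
    for anchor in triggers:
        cluster = [
            tr for tr in triggers
            if abs(tr.t - anchor.t) <= window_s
            and abs(tr.lat - anchor.lat) <= radius_deg
            and abs(tr.lon - anchor.lon) <= radius_deg
        ]
        if len({tr.phone_id for tr in cluster}) >= min_phones:
            # Crude epicentre estimate: centroid of the clustered reports.
            lat = sum(tr.lat for tr in cluster) / len(cluster)
            lon = sum(tr.lon for tr in cluster) / len(cluster)
            return {"lat": lat, "lon": lon, "n": len(cluster)}
    return None    # not enough corroborating phones: could be one user's noise
```

Requiring several independent phones is what filters out a single dropped handset; the hard part, as the article goes on to explain, is turning such a cluster into an accurate magnitude estimate within seconds.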
Alerts are then sent to nearby phones over the internet. Since internet signals travel faster than earthquake waves, people can often get a warning a few seconds before strong shaking begins.

Types of alert notifications and what they mean

Google's Earthquake Alert System issues two types of notifications when a quake of magnitude 4.5 or higher is detected: 'Be Aware' and 'Take Action' alerts. The 'Be Aware' alert is intended for areas expected to feel mild tremors. It follows the phone's standard notification settings, including volume, and offers more information once tapped. The 'Take Action' alert, on the other hand, is triggered in regions at risk of moderate to severe shaking. It overrides the user's notification settings, playing a loud alarm and waking the screen to immediately draw attention. Both alert types direct users to safety guidelines and show a map with the estimated epicentre and magnitude of the quake.

Where did it go wrong with the Turkey earthquake?

According to the BBC, Google researchers have detailed in the journal Science what caused the system's failures, pointing to flaws in the detection algorithms as the main issue. During the first earthquake, the system mistakenly estimated the magnitude at between 4.5 and 4.9 on the moment magnitude scale (MMS), when the actual quake measured 7.8. Later that same day, another powerful earthquake occurred, and the system again misjudged its severity. As a result, only 8,158 people received critical 'Take Action' alerts, while nearly four million received lower-level 'Be Aware' notifications. Following these events, Google updated the algorithm and ran a simulation of the initial quake. The revised system would have issued 10 million 'Take Action' alerts to people in immediate danger and sent 67 million 'Be Aware' alerts to those farther from the epicentre.
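The two-tier scheme described above can be expressed as a small decision rule. The 4.5-magnitude floor comes from the article; the shaking-intensity cutoff separating the two tiers is a hypothetical placeholder, since Google does not publish its exact thresholds.

```python
def choose_alert(magnitude, expected_intensity):
    """Map an event estimate to an alert tier, following the two-tier
    scheme described in the article. `expected_intensity` is local
    predicted shaking on an MMI-style 1-10 scale; the cutoff of 5 is
    an illustrative assumption, not Google's published value."""
    if magnitude < 4.5:
        return None                  # below the alerting floor: no notification
    if expected_intensity >= 5:      # moderate to severe shaking expected
        return "Take Action"         # loud alarm, wakes the screen
    return "Be Aware"                # respects the phone's notification settings
```

Framed this way, the Turkey failure is easy to see: with the magnitude mis-estimated at 4.5-4.9, most users' predicted local shaking fell below the 'Take Action' cutoff, so they received the quieter tier, or nothing.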
The BBC quoted Google researchers as saying: 'Every earthquake early warning system grapples with the same challenge – tuning algorithms for large magnitude events.'

What makes Google's earthquake detection system unreliable?

While Android phones help create a wide earthquake detection network, their sensors aren't as accurate as professional-grade seismometers. Data from smartphones can be affected by movement or noise, which makes it harder to correctly estimate the strength of an earthquake. One of the main challenges is determining a quake's magnitude early enough to send timely alerts: the first few seconds often don't provide enough data, forcing a tough choice between speed and accuracy. Getting it wrong either way, by underestimating or overestimating, causes problems. The system's failure to correctly assess the size of the 2023 Turkey earthquake raised exactly these concerns about the reliability of early warnings.

Another limitation lies in the system's reach. It depends on having enough Android phones in an area with good internet access. Regions with fewer devices, such as remote areas, rural zones or oceans, may not be covered well, leaving detection gaps. And even when an earthquake is detected and measured accurately, the impact on the ground can differ from place to place: factors such as soil type or building design influence how strongly people feel the shaking, so some users might still receive misleading warnings, or none at all. Google itself has clarified that the system is meant to be supplementary, not a replacement for national warning systems.
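The claim earlier in the article that internet alerts can outrun the shaking follows from simple speed arithmetic: the damaging S-waves travel through the crust at only a few kilometres per second, while an internet-delivered alert arrives in a second or two. The numbers below are rough textbook figures, used only to illustrate the idea.

```python
S_WAVE_KM_S = 3.5    # rough crustal S-wave speed, km/s (assumption)

def warning_seconds(distance_km, alert_latency_s=2.0):
    """Approximate head start a user gets before strong shaking arrives:
    S-wave travel time to the user minus alert delivery latency,
    floored at zero (users very near the epicentre get no warning)."""
    return max(0.0, distance_km / S_WAVE_KM_S - alert_latency_s)
```

Under these assumptions a user 70 km from the epicentre gets about 18 seconds of warning, while someone right on top of it gets none, which is why the reported 35-second window for people farther out was so valuable.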

Samsung and Tesla enter into strategic deal for chip manufacturing

United News of India

New Delhi, July 28 (UNI) Samsung Electronics Co., the prominent South Korean chipmaker, has signed a deal worth USD 16.5 billion with Tesla Inc., a major agreement that has given a wave of hope and momentum to Samsung's lagging chip foundry business. The agreement will reportedly run until 2033. The news first emerged through a regulatory filing by Samsung, and hours later Tesla CEO Elon Musk confirmed it on the social media platform X, saying Samsung had agreed to let Tesla help drive manufacturing efficiency, calling that a critical point. Musk also stated that Samsung's new Texas fab will be dedicated to producing Tesla's next-generation AI6 chips, and stressed the great strategic importance of the decision. 'I will walk the line personally to accelerate the pace of progress, and the fab is located near my house,' Musk added. The news brought an optimistic trend to Samsung's stock, with shares rising about 3.5 per cent in Seoul on Monday, the company's biggest single-day jump. The deal also comes as Samsung's chip foundry business faces heavy competition from rivals like Taiwan's TSMC (Taiwan Semiconductor Manufacturing Company) and SK Hynix. TSMC reportedly dominates the chip market at present due to its focus on research and intense efficiency. UNI SAS PRS
