
Google Revises Android Earthquake Alerts After Major Miss in Turkey
When two massive quakes struck southern Turkey and Syria in February 2023, the alert system failed to send its highest-level "Take Action" notifications to around 10 million people in the region, Google told the BBC. Instead, Android users received lower-level "Be Aware" notifications or nothing at all. Google didn't immediately respond to a request for further comment.
The 2023 earthquakes were among the deadliest in the region's modern history, killing more than 50,000 people and displacing millions. Over 70% of phones in Turkey run Google's Android operating system. Apple's competing iOS software has no comparable built-in earthquake alert and relies instead on government warnings.
Read more: How to Set Up Emergency Alerts on Your Phone Now
How Google is revising its alert system
In a paper published earlier this month in the journal Science, Google said it found "limitations to the detection algorithms" during the event. According to the company, the system underestimated the severity of the earthquakes and failed to trigger the top-tier warnings that tell people to take immediate cover.
Google says it has since improved the detection algorithm, and that resimulating the first Turkey earthquake with the updated system produced better results.
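To see why an underestimate matters so much, consider a minimal sketch of how tiered alerting might work. The thresholds, names and structure below are illustrative assumptions loosely based on Google's public description of its two alert levels, not the company's actual implementation:

```python
# Hypothetical sketch of tiered alert selection. All threshold values and
# names here are illustrative assumptions, not Google's actual code.

from dataclasses import dataclass
from enum import Enum


class AlertTier(Enum):
    NONE = "none"
    BE_AWARE = "be_aware"        # lower-level notification
    TAKE_ACTION = "take_action"  # full-screen warning with a loud alarm


@dataclass
class QuakeEstimate:
    magnitude: float          # estimated moment magnitude
    shaking_intensity: float  # estimated local shaking (MMI scale)


def select_tier(estimate: QuakeEstimate) -> AlertTier:
    """Pick an alert tier from the system's live estimate of the quake.

    If magnitude or intensity is underestimated, as Google reported for
    the 2023 Turkey quakes, this branch silently downgrades a severe
    event to a "Be Aware" notification, or to no alert at all.
    """
    if estimate.magnitude >= 4.5 and estimate.shaking_intensity >= 5.0:
        return AlertTier.TAKE_ACTION
    if estimate.shaking_intensity >= 3.0:
        return AlertTier.BE_AWARE
    return AlertTier.NONE


# A severe quake whose shaking intensity is underestimated crosses tiers:
print(select_tier(QuakeEstimate(magnitude=7.8, shaking_intensity=9.0)).name)  # TAKE_ACTION
print(select_tier(QuakeEstimate(magnitude=7.8, shaking_intensity=4.5)).name)  # BE_AWARE
```

In a scheme like this, underestimating the quake's severity by even one tier quietly turns a life-threatening event into a routine notification, which matches the failure mode Google described.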
How Google's alert system works
Android's Earthquake Alerts System is available in more than 90 countries and uses tiny vibrations picked up by a phone's accelerometer to spot seismic activity faster than traditional monitoring stations alone. When enough phones detect shaking, Google's system estimates the quake's location, magnitude and impact zone, and then pushes alerts directly to people's screens.
The idea is to buy precious seconds before strong shaking starts, hopefully providing enough time for people to drop to the ground, take cover or move to safer locations.
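In rough outline, the detection pipeline resembles the sketch below. Everything here, including the names, the thresholds and the simple clustering rule, is a simplified assumption for illustration; the production system relies on far more sophisticated seismic modeling:

```python
# Simplified, hypothetical sketch of crowdsourced quake detection.
# All names and thresholds are illustrative, not Google's implementation.

import math
from dataclasses import dataclass


@dataclass
class PhoneTrigger:
    lat: float        # phone location when its accelerometer tripped
    lon: float
    timestamp: float  # seconds since epoch


def haversine_km(a: PhoneTrigger, b: PhoneTrigger) -> float:
    """Great-circle distance between two triggers, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(a.lat), math.radians(b.lat)
    dp = p2 - p1
    dl = math.radians(b.lon - a.lon)
    h = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(h))


def detect_quake(triggers: list[PhoneTrigger],
                 min_phones: int = 100,
                 radius_km: float = 30.0,
                 window_s: float = 10.0) -> tuple[float, float] | None:
    """Declare a quake if enough phones trip close together in space and time.

    Returns a crude epicenter estimate (the mean trigger location) or None.
    A single shaking phone (a dropped handset, a passing truck) never
    clears the threshold; only correlated shaking across many devices does.
    """
    for seed in triggers:
        cluster = [t for t in triggers
                   if abs(t.timestamp - seed.timestamp) <= window_s
                   and haversine_km(seed, t) <= radius_km]
        if len(cluster) >= min_phones:
            lat = sum(t.lat for t in cluster) / len(cluster)
            lon = sum(t.lon for t in cluster) / len(cluster)
            return lat, lon
    return None
```

Requiring many phones to report correlated shaking within a tight window is what separates an earthquake from one jostled handset, at the cost of needing dense Android coverage near the epicenter.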
The system has been credited with delivering early warnings during quakes in California, Greece and Japan. But the 2023 Turkey miss highlighted both the challenge of building a global warning system that relies on millions of phones and the high stakes when it gets things wrong. The Turkey earthquakes were unusually complex, involving multiple fault ruptures and powerful aftershocks. That complexity likely made accurate detection harder, but it also underscores why timely alerts are so crucial.
Read more: Tornadoes, Floods, Wildfires, Intruders: 4 Ways Your Phone Can Help in an Emergency
Google says it's continuing to refine its earthquake technology and encourages Android users to keep the feature turned on. Earthquake Alerts is enabled by default on many Android phones, and you can check it under the Safety & Emergency section of your phone's settings.
With climate and seismic risks rising, mobile-based early warning systems can be a way to reach people faster than traditional sirens or broadcasts. However, Google warns the alert system is meant to complement -- not replace -- national earthquake warning systems.