What the Tech: Help in an emergency with Alexa
We all remember those commercials from the 1980s where someone says, 'I've fallen and I can't get up.' While those devices still exist, many people now rely on their smartphones to call for help. But what happens when someone can't reach their phone? For older adults, especially those living alone, this is a genuine concern.
That is where Alexa, Amazon's voice assistant, can help. But it is not a one-step solution. If you want Alexa to be part of your emergency plan, you need to set it up in advance.
So, can Alexa call 911 on its own? Yes and no. Alexa cannot call 911 unless you subscribe to a paid service called Alexa Emergency Assist. This add-on costs about $6 per month and connects the user with trained agents who can call emergency services on their behalf. Once subscribed, a person only needs to say, 'Alexa, call for help,' and they will be connected to a live agent who can assess the situation and arrange assistance. The service can also notify up to 25 emergency contacts that you need help.
If you do not want to pay for the Emergency Assist subscription, there is a useful workaround. You can manually add trusted contacts to the Alexa app. These could include family members, close friends, or neighbors who can assist in an emergency. Once those contacts are saved, the user can say, 'Alexa, call [name],' and Alexa will dial their number.
To add a contact:
Open the Alexa app.
Tap 'Communicate' at the bottom of the screen.
Go to 'Contacts.'
Tap the plus sign to add a new contact.
Enter the name and phone number.
Save the contact.
This allows someone to call for help hands-free, even if they cannot reach a phone. The person on the other end can then call 911 if needed.
Alexa also has a feature called Drop In, which works like an intercom. If your devices are on the same Amazon account, you can say, 'Alexa, Drop In on all devices,' and everyone connected will hear the call. This is especially helpful for families checking in on loved ones or for situations where multiple Echo devices are spread throughout the home.
While Alexa cannot call 911 directly without a paid subscription, Siri on iPhones can. Saying 'Call 911' or 'I need help' will trigger the emergency call function on most iPhones. Just be cautious when testing it, because the call will go through in about 2 seconds.
Google Assistant does not offer a similar option for calling for help.
Alexa is a helpful tool for emergency preparedness, but it should not replace a phone or a medical alert system. Without the Emergency Assist subscription, Alexa cannot contact emergency services directly. By setting up contacts and using features like Drop In, however, it can add an extra layer of safety. Always test your setup and make sure your loved ones know how to use it.
Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
