
Google experiments with ditching the Favorites tab in the Phone app for a simpler layout
TL;DR
- Google is testing the removal of the Favorites tab in the Phone app.
- Favorite contacts appear as a horizontal row at the top of the Recents tab in the new layout.
- The frequent contacts section no longer has a place in this redesign.
Google is testing a significant change to its dialer app, and it could mark the end of a familiar tab. On at least some Pixel 8 Pro devices running version 178.0.765584175-publicbeta of the Google Phone app, the standalone Favorites tab has been removed. Instead, favorite contacts appear in a neat row across the top of the Recents screen, similar to how some other dialers integrate shortcuts for frequent contacts.
Credit to Telegram user @h_muc, who first spotted this change live on their device. The first screenshot below shows the app's current UI, while the other two display the new layout. In the updated design, the Favorites tab is gone, and the top of the Recents tab now displays pinned favorites, followed by your most recent calls. A new Add button appears at the end of the contact row, presumably linking to the contact list so you can add more favorites.
Another casualty of this redesign is the frequent contacts section, which previously appeared underneath favorites in the old tab. That section no longer appears in the new layout, and it's unclear whether it has been removed entirely or simply relocated.
You can get a better idea of how the potential new layout will look in this video:
The new layout is certainly cleaner, but it may frustrate users who liked having a dedicated space for both favorites and frequent contacts. It's unclear how widely this change is rolling out yet, but we'll keep an eye on things and let you know if this layout becomes the new default.
Got a tip? Talk to us! Email our staff at news@androidauthority.com. You can stay anonymous or get credit for the info; it's your choice.

Related Articles


Forbes
34 minutes ago
AWE 2025 Fueled By Android XR, Snap Specs, And AI
The theme of the show was evident from the start. Augmented World Expo 2025, now in its 16th year, wrapped up today in Long Beach, California. The XR industry's largest and longest-running event drew more than 5,000 attendees and 250 exhibitors to the cavernous Long Beach Convention Center from June 10 to 12. For the first time, both the conference and expo floor ran a full three days, with expanded programming that included hackathons, keynotes, investor meetups, and breakout areas for startups, game developers, and enterprise providers.

The week began, as always, with Ori Inbar's annual keynote. AWE's co-founder took the stage with his usual mix of irreverence and conviction. This year's theme was direct: 'XR is going mainstream.' Inbar said the wait was over. 'The hardware is good enough, the tools are mature, and AI has lowered the barrier to entry,' he said, urging developers to stop building for the future and start shipping to the present. He celebrated XR's strange persistence, joking that we've been waiting for a 'mass market moment' for 30 years, and framed AI as both a complement and a catalyst: 'XR is the killer interface for AI,' he said, to sustained applause.

AWE head of programming Sonya Haskins and AWE CEO and co-founder Ori Inbar.

Google and Snap delivered first-day main stage keynotes that energized the crowd. Snap dominated the hallways with demos of Specs and their mirror technology. Niantic Spatial also had a big presence, as they did last year, before they spun off Pokemon Go to Scopely to focus on WebXR and a digital twin of the physical world.

Google's Justin Payne at AWE 2025.

Google's Justin Payne introduced Android XR, the company's new spatial computing operating system. Introduced to some fanfare at Google I/O two weeks ago, this was a direct pitch to the developers in the audience. Android XR is built to unify headset and glasses development across Qualcomm and Samsung hardware and deeply integrate with Gemini.
'This is the first Android platform built for the Gemini era,' Payne said. He described Android XR as the logical evolution of Google's long-term investment in vision-based computing (Glass, ARCore, Lens, Live View), now converging with real-time AI. He emphasized that XR devices shouldn't be siloed or episodic. 'The same person will use multiple XR devices throughout the day,' he said, 'and Gemini can follow them between worlds.'

Snap's Evan Spiegel took the stage next and, as expected, announced that consumer-ready Spectacles are coming in 2026. Snap has spent over $3 billion and 11 years refining its mobile AR platform, which now supports more than 4 million lenses used 8 billion times a day. 'We're obsessed with making computers more human,' Spiegel said. With OpenAI and Gemini onboard, the new Spectacles will support spatial AI interactions, WebXR, and shared gaming overlays. Specs are already in the hands of hundreds of developers, many of whom were demoing real-world applications throughout the Long Beach venue. In the past, CTO Bobby Murphy has keynoted AWE, but this was Spiegel's inaugural appearance, signaling the growing importance of the medium and its largest annual gathering.

Chi Xu, founder and CEO of Xreal.

Both Google and Snap highlighted the growing ecosystem of Android XR tools. XREAL's Chi Xu previewed Project Aura, the company's latest eyewear, built for the Android XR stack and also unveiled two weeks earlier at I/O. Featuring an upgraded Qualcomm X1S spatial chip, Aura has a 70-degree field of view and native support for Gemini-powered voice interfaces. Xu described it as a long-awaited convergence of hardware, AI, and open platforms: 'All the pieces are finally ready,' he said. At Qualcomm's booth, attendees could test its new AR1+ Gen1 chipset, an on-device AI processor designed for smartglasses. Qualcomm SVP Ziad Asghar framed it as a turning point for wearable computing: 'It's time to build AI glasses that can stand alone.'
From L to R: Dylan, Brent, Nolan, Alissa, and Wyatt Bushnell.

In a packed session featuring Atari and Chuck E. Cheese founder Nolan Bushnell and his family of entrepreneurs, daughter Alissa and sons Brent, Wyatt, and Dylan, the family discussed the personal and professional reality of being a Bushnell. The discussion turned to the lessons XR can learn from arcade design. The Bushnells made a persuasive case for intuitive mechanics and social play: less UI, more instinct. 'Nobody wants to play a tutorial,' one of them said. 'If they don't get it in the first ten seconds, they walk.' They also made a passionate case for location-based XR. Brent's Dream Park demo on the show floor's Playground allows players to interact with digital characters in the physical world. 'This isn't VR anymore,' he said. 'You are the game.'

Palmer Luckey at AWE 2025.

Palmer Luckey began by explaining that his hoarse voice was the result of spending a week in Washington, DC, with his main customers. In the news just weeks ago was his surprise reunion with Meta, seven years after being fired. Together they are taking over the IVAS project from Microsoft. IVAS was a $22 billion contract to create AR-equipped infantry that could use heads-up displays for threat detection, drone management, mapping, and targeting, in addition to the thermal imaging (night vision) they use now. 'The best AR hardware isn't coming out of DARPA anymore,' he said. 'It's coming from the consumer sector. Meta, Snap, Google, they've pulled ahead.' His Eagle Eye platform, developed for the U.S. Army, is a high-resolution, multimodal sensor suite that fuses thermal, RF, and spatial data in real time. 'It's not entertainment hardware,' he said. 'It's a tool built for life-and-death decisions, but it will trickle back to consumers.'

Author and entrepreneur Tom Emrich signing copies of his new book, Next Dimension. Emrich announced at the show that he is launching a new spatial/XR news site, Remix Reality.
Vicki Dobbs Beck of ILM and researcher and author Helen Papagiannis approached XR from a cultural and narrative perspective, emphasizing its potential as a medium for identity, expression, and immersive storytelling. Beck framed ILM's evolving mission as a shift from 'storytelling to storyliving.' Drawing from a decade of immersive projects under the Lucasfilm banner, she described the next frontier as emotionally responsive worlds, powered by real-time AI and character memory. Papagiannis, author of Augmented Human, unveiled her new book Reality Modding, which proposes that reality, like software, is now editable, customizable, and increasingly aesthetic. 'This is about identity and presence,' she said. 'We're no longer just users of technology, we're becoming the medium itself.'

Mentra AR glasses will soon be compatible with Android XR.

The tone of the show was celebratory but not naive. Inbar acknowledged the ghosts of past hype cycles. XR has been 'the next big thing' for nearly two decades. But this year, the combination of stable platforms, purpose-built hardware, and AI-native developer tools made the proposition feel more grounded. The term 'ambient computing' came up repeatedly: devices that disappear into daily life, interfaces that respond without friction. On the floor, dozens of demos aimed at enterprise deployment, not just entertainment: spatial planning, logistics, training, and field service. Enterprise now represents 71% of the XR market, and it showed. All 5,000 attendees must have tried the new Snap Spectacles by the end of the show.

The AWE Playground is always a highlight, as it features entertainment experiences for both in-home and out-of-home audiences. Installations ranged from social XR games to large-scale multisensory exhibits. A highlight was an expanded version of Brent Bushnell's Dream Park, a walkable mixed-reality experience that allowed users to embody virtual characters without controllers.
They just raised $1.3M to expand from their Santa Monica pilot. Their 'theme park in a box' can literally be run by a couple of kids in a park.

Auki's robot had a lot of fans.

Auki Labs placed QR codes on the floor of the convention center for indoor navigation. This mobile AR experience helped guide their attention-getting robot. Auki is rolling out its indoor virtual positioning system at retail on a much larger scale. Its decentralized protocol, PoseMesh, uses scannable QR codes and self-hosted data to guide robots and humans through physical spaces. Auki also worked with Zappar on enhanced QR codes, which Unilever is now putting on its packaging. Auki won a coveted Auggie Award for its PoseMesh technology.

Trying out Viture for the first time at CES 2023. Viture's Kickstarter raised $3.2M for its Assisted Reality smartglasses targeting gamers. Founder Marcus Lim has raised over $10M.

Every year there are a handful of suite demos in the nearby Hyatt hotel. Some meetings are better and more relevant than others. This year I got a private, detailed tour from founder David Jiang, whom I first met at CES in 2023, where he showed me his Viture AR screen-reflecting glasses. According to IDC, they account for 52% of AR smartglasses sales worldwide. You plug them into your phone and see a 200-inch screen in a compact form factor. It's favored by gamers but popular for content consumption and productivity as well. They've come a long way in three short years, diversifying into software, including an app that uses AI to transform movies into 3D spatial experiences, much like Leia, which does it with a 3D display in tablet form. It is even more impressive when fully immersed in Viture's lightweight headset. With Google and Apple entering the market, they're hoping their software will give them a way to leverage the competition into even greater success.

Trying out Flow Immersive on an Xreal AR headset.
In the hallways and informal corners of the convention center, old ideas resurfaced in sharper, more polished form. Jason Marsh, founder of Flow Immersive, gave one of his signature roaming demos, an evolving tradition that began seven years ago when he first cornered me outside a session room with a prototype on his tablet. This year, Flow's layered, interactive data visualizations ran smoothly on headsets, phones, and smartglasses. What once felt like an ambitious idea now looked like a viable product, complete with enterprise traction and UX refinements. The evolution of Flow mirrored the tone of the show itself: confident, capable, and finally ready for primetime.

Patrick Johnson and the team from Rock, Paper, Reality with the hideous yet coveted Auggie Award, which they won for their extraordinary work with Google Maps on the history of Paris.

This year's Auggie Awards reflected both breadth and maturity across the XR spectrum. With a record number of nominations and public votes, the 16th annual ceremony honored excellence across 19 categories.

Albert 'Skip' Rizzo, Director for Medical Virtual Reality at the Institute for Creative Technologies.

Ten new XR Hall of Fame inductees were honored on June 11, celebrating pioneers whose work has shaped today's $40 billion industry. Their induction honors the foundational work they've done while helping the next generation of creators. The packed theatre was a reminder that today's XR movement is not new, but finally catching up to its own imagination.
Yahoo
an hour ago
COMPAL Optimizes AI Workloads with AMD Instinct MI355X at AMD Advancing AI 2025 and International Supercomputing Conference 2025
SAN JOSE, Calif., June 12, 2025 /PRNewswire/ -- As AI computing accelerates toward higher density and greater energy efficiency, Compal Electronics (Compal), a global leader in IT and computing solutions, unveiled its latest high-performance server platform, the SG720-2A/OG720-2A, at both AMD Advancing AI 2025 in the U.S. and the International Supercomputing Conference (ISC) 2025 in Europe. It features the AMD Instinct™ MI355X GPU architecture and offers both single-phase and two-phase liquid cooling configurations, showcasing Compal's leadership in thermal innovation and system integration. Tailored for next-generation generative AI and large language model (LLM) training, the SG720-2A/OG720-2A delivers exceptional flexibility and scalability for modern data center operations, drawing significant attention across the industry.

With generative AI and LLMs driving increasingly intensive compute demands, enterprises are placing greater emphasis on infrastructure that offers both performance and adaptability. The SG720-2A/OG720-2A emerges as a robust solution, combining high-density GPU integration and flexible liquid cooling options, positioning itself as an ideal platform for next-generation AI training and inference workloads.

Key Technical Highlights:
- Support for up to eight AMD Instinct MI350 Series GPUs (including MI350X/MI355X): Enables scalable, high-density training for LLMs and generative AI applications.
- Dual cooling architecture (air and liquid cooling): Optimized for high thermal density workloads and diverse deployment scenarios, enhancing thermal efficiency and infrastructure flexibility. The two-phase liquid cooling solution, co-developed with ZutaCore®, leverages the ZutaCore® HyperCool® 2-Phase DLC liquid cooling solution, delivering stable and exceptional thermal performance, even in extreme computing environments.
- Advanced architecture and memory configuration: Built on the CDNA 4 architecture with 288GB HBM3E memory and 8TB/s bandwidth, supporting FP6 and FP4 data formats, optimized for AI and HPC applications.
- High-speed interconnect performance: Equipped with PCIe Gen5 and AMD Infinity Fabric™ for multi-GPU orchestration and high-throughput communication, reducing latency and boosting AI inference efficiency.
- Comprehensive support for mainstream open-source AI stacks: Fully compatible with ROCm™, PyTorch, TensorFlow, and more, enabling developers to streamline AI model integration and accelerate time-to-market.
- Rack compatibility and modular design: Supports EIA 19" and ORv3 21" rack standards with modular architecture for simplified upgrades and maintenance in diverse data center environments.

Compal has maintained a long-standing, strategic collaboration with AMD across multiple server platform generations. From high-density GPU design and liquid cooling deployment to open ecosystem integration, both companies continue to co-develop solutions that drive greater efficiency and sustainability in data center operations.

"The future of AI and HPC is not just about speed, it's about intelligent integration and sustainable deployment. Each server we build aims to address real-world technical and operational challenges, not just push hardware specs. The SG720-2A/OG720-2A is a true collaboration with AMD that empowers customers with a stable, high-performance, and scalable compute foundation," said Alan Chang, Vice President of the Infrastructure Solutions Business Group at Compal.

The series made its debut at Advancing AI 2025 and was concurrently showcased at ISC 2025 in Europe. Through this dual-platform exposure, Compal is further expanding its global visibility and partnership network across the AI and HPC domains, demonstrating a strong commitment to next-generation intelligent computing and international strategic development.
For more information, visit the website. AMD, Instinct, ROCm, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners.

About Compal
Founded in 1984, Compal is a leading manufacturer in the notebook and smart device industry, creating brand value in collaboration with various sectors. Its groundbreaking product designs have received numerous international awards. In 2024, Compal was recognized by CommonWealth Magazine as one of Taiwan's top 6 manufacturers and has consistently ranked among the Forbes Global 2000 and Fortune Global 500 companies. In recent years, Compal has actively developed emerging businesses, including cloud servers, automotive electronics, and smart medical devices, leveraging its integrated hardware and software R&D and manufacturing capabilities to create relevant solutions.

SOURCE COMPAL ELECTRONICS, INC.


San Francisco Chronicle
an hour ago
Meta invests $14.3B in AI firm Scale and recruits its CEO for 'superintelligence' team
Meta is making a $14.3 billion investment in artificial intelligence company Scale and recruiting its CEO Alexandr Wang to join a team developing 'superintelligence' at the tech giant. The deal announced Thursday reflects a push by Meta CEO Mark Zuckerberg to revive AI efforts at the parent company of Facebook and Instagram as it faces tough competition from rivals such as Google and OpenAI.

Meta announced what it called a 'strategic partnership and investment' with Scale late Thursday. Scale said the $14.3 billion investment puts its market value at over $29 billion. Scale said it will remain an independent company but the agreement will 'substantially expand Scale and Meta's commercial relationship.' Meta will hold a 49% stake in the startup. Wang, though leaving for Meta with a small group of other Scale employees, will remain on Scale's board of directors. Replacing him as interim Scale CEO is Jason Droege, who was previously the company's chief strategy officer and had past executive roles at Uber Eats and Axon.

Zuckerberg's increasing focus on the abstract idea of 'superintelligence' (which rival companies call artificial general intelligence, or AGI) is the latest pivot for a tech leader who in 2021 went all-in on the idea of the metaverse, changing the company's name and investing billions into advancing virtual reality and related technology.

It won't be the first time since ChatGPT's 2022 debut sparked an AI arms race that a big tech company has gobbled up talent and products at innovative AI startups without formally acquiring them. Microsoft hired key staff from startup Inflection AI, including co-founder and CEO Mustafa Suleyman, who now runs Microsoft's AI division. Google pulled in the leaders of an AI chatbot company, while Amazon made a deal with San Francisco-based Adept that sent its CEO and key employees to the e-commerce giant. Amazon also got a license to Adept's AI systems and datasets.
Wang was a 19-year-old student at the Massachusetts Institute of Technology when he and co-founder Lucy Guo started Scale in 2016. They won influential backing that summer from the startup incubator Y Combinator, which was led at the time by Sam Altman, now the CEO of OpenAI. Wang dropped out of MIT, following a trajectory similar to that of Zuckerberg, who quit Harvard University to start Facebook more than a decade earlier.

Scale's pitch was to supply the human labor needed to improve AI systems, hiring workers to draw boxes around a pedestrian or a dog in a street photo so that self-driving cars could better predict what's in front of them. General Motors and Toyota have been among Scale's customers. What Scale offered to AI developers was a more tailored version of Amazon's Mechanical Turk, which had long been a go-to service for matching freelance workers with temporary online jobs. More recently, the growing commercialization of AI large language models (the technology behind OpenAI's ChatGPT, Google's Gemini, and Meta's Llama) brought a new market for Scale's annotation teams. The company claims to service 'every leading large language model,' including those from Anthropic, OpenAI, Meta, and Microsoft, by helping to fine-tune their training data and test their performance. It's not clear what the Meta deal will mean for Scale's other customers.

Wang has also sought to build close relationships with the U.S. government, winning military contracts to supply AI tools to the Pentagon and attending President Donald Trump's inauguration. The head of Trump's science and technology office, Michael Kratsios, was an executive at Scale for the four years between Trump's first and second terms. Meta has also begun providing AI services to the federal government.

Meta has taken a different approach to AI than many of its rivals, releasing its flagship Llama system for free as an open-source product that enables people to use and modify some of its key components.
Meta says more than a billion people use its AI products each month, but it's also widely seen as lagging behind competitors such as OpenAI and Google in encouraging consumer use of large language models, also known as LLMs. It hasn't yet released its purportedly most advanced model, Llama 4 Behemoth, despite previewing it in April as 'one of the smartest LLMs in the world and our most powerful yet.'

Meta's chief AI scientist Yann LeCun, who in 2019 was a winner of computer science's top prize for his pioneering AI work, has expressed skepticism about the tech industry's current focus on large language models. 'How do we build AI systems that understand the physical world, that have persistent memory, that can reason and can plan?' LeCun asked at a French tech conference last year. These are all characteristics of intelligent behavior that large language models 'basically cannot do, or they can only do them in a very superficial, approximate way,' LeCun said. Instead, he emphasized Meta's interest in 'tracing a path towards human-level AI systems, or perhaps even superhuman.'

When he returned to France's annual VivaTech conference on Wednesday, LeCun dodged a question about the pending Scale deal but said his AI research team's plan has 'always been to reach human intelligence and go beyond it.' 'It's just that now we have a clearer vision for how to accomplish this,' he said.

LeCun co-founded Meta's AI research division more than a decade ago with Rob Fergus, a fellow professor at New York University. Fergus later left for Google but returned to Meta last month after a five-year absence to run the research lab, replacing longtime director Joelle Pineau.