
Files by Google is also getting a fresh coat of Material 3 Expressive (APK teardown)
Edgar Cervantes / Android Authority
TL;DR
- Google is testing Material 3 Expressive components in a beta version of its Files app.
- Changes include centrally placed, redesigned floating buttons, larger thumbnails, and a new progress bar design.
- These design features are not yet live and may change before public release in the stable branch of the app.
Material 3 Expressive is going to be the flavor of the season, and we'll see plenty of Google and third-party apps update to incorporate Android's new UX design language. The broader rollout of Material 3 Expressive will happen later in the year, but the design is expected to debut with Android 16 QPR1 beta. Ahead of these releases, several Google apps have started adopting Material 3 Expressive components, and the Files by Google app is the latest one to jump on the bandwagon.
You're reading an Authority Insights story on Android Authority. Discover Authority Insights for more exclusive reports, app teardowns, leaks, and in-depth tech coverage you won't find anywhere else.
An APK teardown helps predict features that may arrive on a service in the future based on work-in-progress code. However, it is possible that such predicted features may not make it to a public release.
Files by Google v1.7528 beta includes code for Material 3 Expressive components, which we managed to activate ahead of their public release.
Starting with the app's landing page, we see that the Quick Share and File Scanner FABs, which were previously right-aligned at the bottom, now sit side by side in the middle. The button design has also changed, giving the pair a clean and uniform look. Thumbnails for Recents are now larger, though, which is a tad unsightly.
Current UI
Upcoming UI
The same treatment is given to the Edit and Circle to Search FABs in the image viewer within the app:
Current UI
Upcoming UI
We've also spotted some padding and sizing changes in the sidebar, but it's not immediately clear whether these are intentional.
Current UI
Upcoming UI
Lastly, Files by Google displays a progress bar whenever an APK file is being installed or files are being compressed into a ZIP archive. The current UI shows this progress bar at the bottom of the screen, while the upcoming UI moves it back to the center and gives it a wavy design.
None of these changes is currently live in the app. Google is still working on them, so they may or may not make it to the final release in this form. We'll keep you updated when we learn more.
Got a tip? Talk to us! Email our staff at news@androidauthority.com. You can stay anonymous or get credit for the info; it's your choice.