
Samsung is now bringing One UI 7 to the Galaxy A54 in the US
Galaxy A54 5G
TL;DR
The One UI 7 update is now available to many Galaxy A54 owners in the US.
Users on AT&T, Cricket Wireless, T-Mobile, and Tracfone are reporting that they've received the update.
The Samsung Galaxy A54 is the most recent Galaxy A50 series smartphone in the US, as the Galaxy A55 skipped the region and the Galaxy A56 isn't coming stateside until later this year. Fortunately, there's good news if you've got a Galaxy A54 in the US: it's now getting One UI 7.
SamMobile reports that the update is now available to some Galaxy A54 owners in the US, initially pointing to users on MetroPCS, T-Mobile, and US Cellular. The release was also corroborated by several users on Reddit (1, 2, 3) and Twitter, who added that they were on AT&T, Cricket Wireless, T-Mobile, and Tracfone. We're guessing you might have to wait a while to get the update if you're on another carrier.
In any event, One UI 7 on the Galaxy A54 weighs in at over 3GB and brings an overhauled visual experience. These visual changes include a redesigned camera app and new icons. Samsung's update also offers the Now Bar feature, along with improved widget customization, and live notifications.
News of the update's availability also comes a short while after Samsung kicked off the One UI 8 beta program. This new update is scheduled to arrive in the summer, but we're guessing Galaxy A series owners will have to wait a few months longer.
Got a tip? Talk to us! Email our staff at news@androidauthority.com. You can stay anonymous or get credit for the info, it's your choice.
Related Articles


CNET
30 minutes ago
You Can Now File a Claim for Part of AT&T's $177 Million Data Breach Settlement
AT&T's Snowflake data breach of 2024 affected more than 100 million people. wdstock/Getty Images

AT&T customers who were included in the company's 2019 or 2024 data breaches now have the chance to file a claim for compensation. On Monday, the administrative firm managing the company's massive $177 million settlement announced that the claims process has begun and that eligible customers must file by November 18, 2025 to receive a payout.

The settlement, which received preliminary approval from a federal judge in June, is meant to resolve two class-action lawsuits tied to the breaches. If your personal data was compromised in either incident, it's worth checking your eligibility: submitting a claim could put real money back in your pocket, even if your time with AT&T is long behind you.

The settlement prioritizes larger payments to customers who suffered damages that are "fairly traceable" to the data leaks, and to those affected by the larger of the two leaks, which began in 2019. While the company is working toward a settlement, it has continued to deny that it was "responsible for these criminal acts."

What happened with these AT&T data breaches?

AT&T confirmed the two data breaches last year, announcing an investigation into the first in March before confirming it in May, then confirming the second in July. The first breach began in 2019: about 7.6 million current and 65.4 million former account holders had their data exposed to hackers, including names, Social Security numbers, and dates of birth. The company began investigating last year after customer data appeared on the dark web. The second breach began in April 2024, when a hacker broke into AT&T's cloud storage provider Snowflake and accessed 2022 call and text records for almost all of the company's US customers, about 109 million in all.
The company stressed that no names were attached to the stolen data, and two individuals were arrested in connection with the breach. Both incidents sparked a wave of class-action lawsuits alleging that AT&T was negligent in failing to sufficiently protect its customers.

Who is eligible to file a claim for the AT&T data breach settlement?

The settlement will pay out to any current or former AT&T customer whose data was accessed in one of these breaches, with higher payments reserved for those who can provide documented proof that they suffered damages directly resulting from their data being stolen. If you're eligible, you should receive a notice by email or a physical letter in the mail.

How can I file a claim for my share of the AT&T data breach settlement?

On Aug. 4, 2025, the Kroll Settlement Administration announced that affected AT&T customers can now file claims against the proposed $177 million settlement. Claims can be filed online through the settlement website. After clicking "Submit Claim," you'll be asked for a Class Member ID, which should be in the email or mailed notice you received about the settlement. You'll also be asked to provide an email address and either your AT&T account number or your full name.

If you don't want to file online, you can download and print the PDF claim forms for the first breach (announced March 30, 2024), the second breach (announced July 12, 2024), or the overlap form if you were part of both. Printed and signed forms should be mailed to: AT&T Data Incident Settlement, c/o Kroll Settlement Administration LLC, P.O. Box 5324, New York, NY 10150-5324.

Claims must be filed by Nov. 18, 2025, or you'll be left out of the class action settlement. If you disagree with the settlement and want to opt out so you can sue AT&T yourself, you'll need to do so by October 17, 2025.
The final approval hearing for the proposed settlement is scheduled for Dec. 3, 2025. If the settlement is approved in December, payouts to customers are likely to begin in early 2026.

How much will the AT&T data breach payments be?

You'll have to "reasonably" prove damages caused by these data breaches to be eligible for the highest and most prioritized payouts: up to $5,000 for claimants affected by the 2019 breach, and up to $2,500 for the 2024 Snowflake breach. It's not clear at this time how claims from customers affected by both breaches will be handled. AT&T will make those documented-damages payments first, and whatever's left of the $177 million settlement total will be disbursed to anyone whose data was accessed, even without proof of damages. Because those payouts depend on how many people receive the higher amounts first, there's no way to say definitively how much they will be.
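To make the two-tier payout structure concrete, here is a purely illustrative sketch in Python. The $177 million total and the $5,000/$2,500 caps come from the settlement as reported; every claimant count below is an invented placeholder, not a real figure.

```python
# Illustrative arithmetic for a tiered settlement fund. Dollar caps and the
# $177M total are from the reported settlement; claimant counts are made up.

fund = 177_000_000

# Tier 1: documented-damages claims are paid first, up to the caps.
tier1_2019 = 10_000 * 5_000   # hypothetical: 10k claimants at the $5,000 cap
tier1_2024 = 20_000 * 2_500   # hypothetical: 20k claimants at the $2,500 cap

remainder = fund - tier1_2019 - tier1_2024

# Tier 2: whatever is left is split among everyone else whose data was
# accessed, even without proof of damages.
tier2_claimants = 1_000_000   # hypothetical
per_person = remainder / tier2_claimants

print(f"Remainder after tier 1: ${remainder:,}")
print(f"~${per_person:.2f} per tier-2 claimant")
```

The final per-person figure is entirely a function of how many documented-damages claims are approved first, which is exactly why no one can quote a definitive payout amount yet.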


Bloomberg
an hour ago
AT&T Seeks More Than $2 Billion for Mexico Mobile Unit
AT&T Inc. is working with advisers to sell its Mexico unit, people familiar with the matter said, after struggling for more than a decade to gain ground on billionaire Carlos Slim's dominant carrier in the country. Dallas-based AT&T is seeking more than $2 billion for the business, according to the people, who asked not to be identified discussing confidential information.


Android Authority
3 hours ago
Free, offline ChatGPT on your phone? Technically possible, basically useless
Robert Triggs / Android Authority

Another day, another large language model, but news that OpenAI has released its first open-weight models (gpt-oss) under Apache 2.0 licensing is a bigger deal than most. Finally, you can run a version of ChatGPT offline and for free, giving developers and casual AI enthusiasts alike another powerful tool to try out.

As usual, OpenAI makes some pretty big claims about gpt-oss's capabilities. The model can apparently outperform o4-mini and scores quite close to o3 (OpenAI's cost-efficient and most powerful reasoning models, respectively). However, that gpt-oss model comes in at a colossal 120 billion parameters, requiring some serious computing kit to run. For you and me, though, there's still a highly performant 20 billion parameter model available.

Can you now run ChatGPT offline and for free? Well, it depends. In theory, the 20 billion parameter model will run on a modern laptop or PC, provided you have bountiful RAM and a powerful CPU or GPU to crunch the numbers. Qualcomm even says it's excited about bringing gpt-oss to its compute platforms (think PC rather than mobile). Still, this raises the question: is it possible to run ChatGPT entirely offline and on-device, for free, on a laptop or even your smartphone? It's doable, but I wouldn't recommend it.

What do you need to run gpt-oss?

Edgar Cervantes / Android Authority

Despite shrinking gpt-oss from 120 billion to 20 billion parameters for more general use, the official quantized model still weighs in at a hefty 12.2GB. OpenAI specifies VRAM requirements of 16GB for the 20B model and 80GB for the 120B model. You need a machine capable of holding the entire thing in memory at once to achieve reasonable performance, which puts you firmly into NVIDIA RTX 4080 territory for sufficient dedicated GPU memory, hardly something we all have access to.
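The numbers quoted here (20 billion parameters, a 12.2GB quantized download, 16GB of VRAM) can be sanity-checked with some back-of-envelope arithmetic; the sketch below uses only those reported figures:

```python
# Back-of-envelope memory math for gpt-oss-20b, using the reported figures:
# 20 billion parameters, ~12.2 GB official quantized download.

params = 20e9              # 20 billion parameters
file_size_bytes = 12.2e9   # official quantized download size

# Effective storage per weight implied by those two numbers. A result just
# under 5 bits/weight is consistent with 4-bit-class quantization plus
# per-block scaling overhead.
bits_per_param = file_size_bytes * 8 / params
print(f"~{bits_per_param:.1f} bits per parameter")

# Why 16 GB is "cutting it fine": the weights alone leave only a few GB for
# the OS, other apps, and the growing context of a conversation.
headroom_gb = 16 - 12.2
print(f"~{headroom_gb:.1f} GB of headroom on a 16 GB machine")
```

That thin margin is why the article's 16GB laptop could only just hold on with every other application closed, and why 24GB turns out to be the realistic floor.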
For PCs with less GPU VRAM, you'll want 16GB of system RAM if you can split part of the model into GPU memory, and preferably a GPU capable of crunching FP4 precision data. For everything else, such as typical laptops and smartphones, 16GB is really cutting it fine, as you need room for the OS and apps too. Based on my experience, 24GB of RAM is required: my 7th Gen Surface Laptop, complete with a Snapdragon X processor and 16GB of RAM, ran at an admittedly pretty decent 10 tokens per second, but barely held on even with every other application closed.

Despite its smaller size, gpt-oss 20B still needs plenty of RAM and a powerful GPU to run smoothly.

With 24GB of RAM being the realistic floor, the vast majority of smartphones cannot run it. Even AI leaders like the Pixel 9 Pro XL and Galaxy S25 Ultra top out at 16GB of RAM, and not all of that is accessible. Thankfully, my ROG Phone 9 Pro has a colossal 24GB of RAM, enough to get me started.

How to run gpt-oss on a phone

Robert Triggs / Android Authority

For my first attempt at running gpt-oss on my Android smartphone, I turned to the growing selection of LLM apps that let you run offline models, including PocketPal AI, LLaMA Chat, and LM Playground. However, these apps either didn't have the model available or couldn't successfully load the version I downloaded manually, possibly because they're built on an older version of the underlying inference engine.

Instead, I booted up a Debian partition on the ROG and installed Ollama to handle loading and interacting with gpt-oss. If you want to follow the steps, I did the same with DeepSeek earlier in the year. The drawback is that performance isn't quite native and there's no hardware acceleration, meaning you're reliant on the phone's CPU to do the heavy lifting.

So, how well does gpt-oss run on a top-tier Android smartphone? "Barely" is the generous word I'd use. The ROG's Snapdragon 8 Elite might be powerful, but it's nowhere near my laptop's Snapdragon X, let alone a dedicated GPU for data crunching.
gpt-oss can just about run on a phone, but it's barely usable.

The token rate (the rate at which text is generated on screen) is barely passable and certainly slower than I can read; I'd estimate it's in the region of 2-3 tokens (about a word or so) per second. That's not entirely terrible for short requests, but it's agonising for anything more complex than saying hello. Unfortunately, the token rate only gets worse as the conversation grows, eventually taking several minutes to produce even a couple of paragraphs.

Robert Triggs / Android Authority

Obviously, mobile CPUs really aren't built for this type of work, and certainly not for models approaching this size. The ROG is a nippy performer for my daily workloads, but it was maxed out here: seven of the eight CPU cores ran at 100% almost constantly, leaving the handset uncomfortably hot after just a few minutes of chat. Clock speeds quickly throttled, causing token speeds to fall further. It's not great.

With the model loaded, the phone's 24GB of RAM was stretched as well, with the OS, background apps, and the additional memory required for the prompt and responses all vying for space. I could still flick in and out of apps, but doing so brought the already sluggish token generation to a virtual standstill.

Another impressive model, but not for phones

Calvin Wankhede / Android Authority

Running gpt-oss on your smartphone is pretty much out of the question, even if you have a huge pool of RAM to load it up. External models aimed primarily at the developer community don't support mobile NPUs and GPUs. The only way around that obstacle is for developers to leverage proprietary SDKs like Qualcomm's AI SDK or Apple's Core ML, which won't happen for this sort of use case. Still, I was determined not to give up, so I tried gpt-oss on my aging PC, equipped with a GTX 1070 and 24GB of RAM.
The results were definitely better, at around four to five tokens per second, but still slower than my Snapdragon X laptop running on the CPU alone. Yikes.

In both cases, the 20B parameter version of gpt-oss certainly seems impressive (after waiting a while), thanks to its configurable chain of reasoning that lets the model 'think' for longer to help solve more complex problems. Compared to free options like Google's Gemini 2.5 Flash, gpt-oss is the more capable problem solver thanks to its use of chain-of-thought, much like DeepSeek R1, which is all the more impressive given that it's free. However, it's still not as powerful as the mightier and more expensive cloud-based models, and it certainly doesn't run anywhere near as fast on any consumer gadget I own.

Still, advanced reasoning in the palm of your hand, without the cost, security concerns, or network compromises of today's subscription models, is the AI future I think laptops and smartphones should truly aim for. There's clearly a long way to go, especially when it comes to mainstream hardware acceleration, but as models become both smarter and smaller, that future feels increasingly tangible. A few of my flagship smartphones have proven reasonably adept at running smaller 8 billion parameter models like Qwen 2.5 and Llama 3, with surprisingly quick and powerful results. If we ever see a similarly speedy version of gpt-oss, I'd be much more excited.
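A little arithmetic puts those token rates in perspective. The rates below are the ones reported in the piece; the reply length is an assumption (roughly 300 tokens for a couple of paragraphs, using the common rule of thumb of about 0.75 words per token):

```python
# How long a reply takes at the token rates reported above. The 300-token
# reply length is an assumption, not a figure from the article.

def generation_time_s(tokens: int, tokens_per_second: float) -> float:
    """Seconds to stream a reply at a steady rate (ignores prompt processing)."""
    return tokens / tokens_per_second

reply_tokens = 300  # roughly a couple of paragraphs

rates = {
    "ROG Phone 9 Pro (CPU only)": 2.5,  # reported: 2-3 tokens/s
    "GTX 1070 desktop": 4.5,            # reported: 4-5 tokens/s
    "Snapdragon X laptop": 10.0,        # reported: ~10 tokens/s
}

for device, rate in rates.items():
    t = generation_time_s(reply_tokens, rate)
    print(f"{device}: ~{t:.0f} s")
```

Two minutes per reply on the phone, before any thermal throttling, is exactly the "agonising" experience described, and it only degrades as the conversation's context grows.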