
Everyone hates the new Google Photos editing interface
Change isn't always easy, and while we sometimes resist it for as long as we can, dealing with it is more often than not inevitable. Even when our attitude gravitates more toward acceptance than a full-on embrace, swallowing that pill is a lot easier if we feel reassured that we're moving in a positive direction, and that we're at least going to get some benefit from the change.
Right now, Google Photos is going through some growing pains: a moment of temporary upheaval (and we really do want to stress 'temporary') as the app introduces a rejiggered approach to its editing suite that has a lot of users feeling somewhere between confused, frustrated, and angry.
Why did Google change my Photos editor?
Let's back up for a second. Google Photos first debuted back in 2015, emerging from the rubble of Google+, and as the app celebrated its 10th birthday this year, Google announced a few upcoming changes. Those included an updated QR code scanner as well as a new editor experience that 'provides helpful suggestions and puts all our powerful editing tools in one place.'
While Google did mention a few concrete changes, like being able to tap on part of an image to get editing suggestions, the full scope of the reorganization wasn't immediately clear. And while Google missed its initial June release timetable for distributing the update, that also didn't sound like a bad thing, with Google talking seriously about how much it wanted to get this refresh right:
This is a major redesign for our editor — providing all new helpful suggestions and bringing all our powerful editing tools together in one place — so we are taking our time rolling it out and making sure that it is working well for users before bringing it to everyone.
This week, on the cusp of August, the new editor has finally started rolling out widely. Did that extra time pay off? Based on the reactions we're hearing from users, Google may have wanted to keep testing some of these tweaks just a little bit longer.
Worse, or just different?
Pull up the new Google Photos on your Android device, and it's going to look reasonably similar to what we had just earlier this week. Even when you tap on that 'Edit' button, it's still clear this is very much the Google Photos editor, though the editing options now present themselves in quite a different way.
[Screenshots: old UI vs. new UI]
Maybe the first thing you'll notice is the persistent cropping interface. While you could tap on an edge to immediately begin cropping in the old UI, here you're more or less always in cropping mode, which Google makes practical by parking those crop options up above your image.
[Screenshots: old UI vs. new UI]
That's a big change already, but one that's not too hard to get behind. What's more frustrating is that this shift seems to have cost us the ability to perform perspective correction while cropping; if that option is hiding somewhere in the new UI, we haven't spotted it yet. Hopefully that's not a permanent oversight, as this was quite a useful tool we'd love to have back.
One of the issues that appears to be generating the most frustration is that a lot of editing options are no longer where you'd expect them to be. With this new UI, Google has seriously reorganized where many editing tools live, and while this new approach arguably makes a bit more sense than the old implementation, having to relearn everything is slowing users down.
Before, Google split the Photos editing options into a few main categories:
Suggestions
Crop
Tools
Adjust
Filters
Markup
With the new editor, those categories receive a big overhaul:
Auto
Actions
Markup
Filters
Lighting
Color
Even where the same categories exist across the two interfaces, the options within them have changed. For instance, if you wanted to play with sky options before, you'd find that control grouped under Tools. With that category going away in the new UI (aren't all of these tools, after all?), Google has instead started filing it under Filters:
[Screenshots: old UI vs. new UI]
Let's go through all of these and look at what has, and what hasn't, changed. Markup offers the same selection as before, with pen, highlighter, and text tools. Filters starts by adding that distinction between regular filters and sky styles, but once you tap through, you'll see the same filter choices you had before, now joined by an 'auto' option at the end. You can tap a filter once to select it, and tap again to control the intensity.
[Screenshots: old UI vs. new UI]
While that works the same as it did before, Google now gives us a slightly tweaked look for how those sliders are presented.
Actions combines the pop, sharpen, and denoise features from the old Adjust menu with all of the old Tools (with the exception of sky styles). And if you ever need to get really explicit about your intent to crop a pic (despite already being able to do so at most places across this new editing interface), an option for that lives here, too.
Lighting contains a subset of options that used to live under Adjust, covering brightness, contrast, and the like. The new editor splits the saturation and tint family of options from the old Adjust into a new Color section of their own.
[Screenshots: new UI]
To Google's credit, it seems aware that people are going to stumble a bit (at least initially) as they pick up this new editor, and all the way over on the right you'll find an incredibly handy magnifying glass icon that pulls up a detailed, searchable list explaining all available options. In a pinch, you could even just skip the rest of the interface and work straight from here.
If that's all this update amounted to, we'd offer a little sympathy for the haters, acknowledge that change can be tough, and suggest they keep a stiff upper lip and learn to deal with the new placement of so many editing options. But that's not the complete story, as we alluded to when mentioning the vanishing perspective crop tool earlier.
[Screenshots: old UI vs. new UI]
Magic Eraser is back, but in the new Photos editing interface, the old camouflage option no longer appears. This would recolor objects to help make them less distracting, without outright removing them. Granted, it's not one we used nearly as much as the object-removal tool, but it's still odd to see it unceremoniously disappear like this. We've tried long-pressing buttons and everything else we can think of to find another way to access it, but if it is there, it is very much not proving intuitive.
We spotted these couple of feature omissions while attempting to catalog the overarching editor reorganization, and it's entirely possible that there are even more cuts that affect the tools available to you in Google's new editor.
Keep calm, and edit on
Those missing tools aside, it's hard to honestly categorize this refresh as anything other than positive. For all the complaints we've heard voiced, most don't amount to more than 'I need to make an extra tap' or 'I have to remember where this moved to.'
When we look at how Google is now grouping the editing options in Photos, none of the categorization really seems 'wrong'; it's just 'wrong to us' because we're used to the old sorting. Rest assured, after a few weeks of using this new interface, it's going to feel just as familiar as the old one did.
[Screenshots: new UI]
Get past that mental hump, and you're well on your way to appreciating these editor improvements for what they really offer. The way the initial editing screen now includes engaging previews of its auto-edit suggestions is a nice step in the right direction.
Being able to tap for suggestions on a specific area of an image is also a fun new way to get started, and we especially like the way it helps highlight some of the newer and AI-powered tools that longtime Photos users might have glossed over when they first arrived.
Ultimately, this is still very much the Google Photos we know and love. It's fine to be a little frustrated when you've got to learn a new workflow, but it's definitely going to be worth your effort. Take some time getting comfortable with the new tool placement, and get yourself ready for the next decade of Google Photos.