
Nothing's bare-bones gallery app scores a major upgrade as Phone 3 launch draws closer
TL;DR
- Nothing has rolled out a significant update for its native gallery app.
- The update introduces new editing tools for cropping, color grading, and adding filters to images.
- Nothing Gallery now also features a native video editor that lets you trim clips, adjust volume, and change slow-mo playback speed.
Nothing introduced a native gallery app to its devices with Nothing OS 3.0 last year, offering users an alternative to third-party gallery apps like Google Photos. The app was pretty barebones at launch, featuring a minimal interface and basic photo management tools. Now, Nothing is rolling out a major update that brings some much-needed image and video manipulation tools.
Nothing CEO Carl Pei announced the update on X earlier today, highlighting all the included changes. For starters, the Nothing Gallery update brings tools for basic image adjustments, including six cropping and rotation tools. It also introduces a color grading option with twelve tuning parameters and ten Nothing-designed filters with intensity adjustments.
Nothing has also baked in a basic video editor that lets you trim clips, adjust the volume, and tweak slow-mo video speed. Both the image and video editors have fairly minimalistic interfaces, with tools split across three tabs in the former and two in the latter. Each tool offers subtle haptic feedback for an intuitive user experience.
The Nothing Gallery update even brings some performance improvements, with Nothing claiming that transitioning 'from the camera to Gallery is now over 20% faster, with image processing up to 25% quicker.' Additionally, Nothing has introduced AI-powered photo categorization and a unified visual experience that looks more native to Nothing OS.
The Nothing Gallery update will begin rolling out via the Google Play Store today, and it should reach all eligible devices by tomorrow. It arrives as Nothing is gearing up to launch its first true flagship, the Nothing Phone 3. The company is expected to announce the device early next month alongside a new pair of headphones.