
The Top New Features in MacOS Tahoe—Including One Feature Mac Nerds Will Love
MacOS 26 Tahoe brings a slew of new features, but the massive overhaul to Spotlight will likely become a fan favorite.

The next version of macOS has a whole new design, matching the new look coming to iPhones, iPads, and the rest of Apple's devices. Courtesy of Apple
We haven't been excited about recent MacOS updates, but with MacOS 26 Tahoe, it looks like we'll be getting one of the biggest overhauls in a while.
Between the visual redesign and some exciting pro features, there's something for everyone in MacOS 26 Tahoe, including one that Mac nerds will absolutely love.
Also be sure to check out the full rundown of everything Apple announced at WWDC 2025.

Liquid Glass
It's been a while since Apple has introduced a visual revamp quite this broad. Liquid Glass is what Apple calls the visual motif of its next era in software design.
Marked by soft transparency and subtle gradients, the glass-like design appears in just about every interface, including the Menu Bar, the Lock Screen, app icons, and sidebars in apps. From what I've seen so far, it's a nice change, though some designers have concerns about how it might interfere with basic functionality. The aesthetic changes to macOS haven't been as drastic as what iOS has been through, but this appears to be a well-appreciated fresh coat of paint.
There are a few smaller design changes, such as the ability to change the color of folders or add an emoji to them. Control Center, located up top in the Menu Bar, has also been made completely transparent. More importantly, Apple has added far more customization, so you can add more quick settings to it. You can even add specific settings from commonly used apps such as Zoom.
MacOS Tahoe also brings over a design tweak that came to iPhones last year: icon tinting, which applies a unified look to all your icons at once. Honestly, I'm surprised this is coming to MacOS, because it wasn't well received by reviewers.

Major Spotlight Update
This is the feature I'm most excited about. The Spotlight search tool has always been one of my favorite parts of MacOS, and I've been waiting for Apple to take it to the next level. It's now smarter on just about every front, quickly offering up apps, documents, or even your clipboard history. But this update goes far beyond that.
First off, Spotlight can now perform system actions and even in-app actions, such as playing a podcast or starting a recording. You can even fill out parameters such as who you're sending an email to—again, right in Spotlight!
Spotlight gets greatly enhanced in Tahoe. Courtesy of Apple
Hundreds of actions can now be triggered from Spotlight, like sending an email, creating a note, or playing a podcast. Courtesy of Apple
But wait, there's more. There are even Quick Keys you can use to speed things up further: type 'sm' to send a message or 'ar' to create a reminder. Think of them as next-level key commands. You can set up your own Quick Keys too, greatly expanding the capabilities and customization. For example, you can set up Quick Keys to take actions within the app you're using, letting you quickly set up a task entirely from the keyboard. It's for the Mac nerds out there who already know every other key command, and I can't wait to try it out.

AI-Powered Shortcuts
Shortcuts can be a really powerful way of automating tasks on your Mac. With MacOS Tahoe, they get upgraded by Apple Intelligence, letting you set up shortcuts that summarize text or generate images. You can even tap into ChatGPT (or the on-device Neural Engine) if necessary, setting up chains of actions that could be extremely useful. For example, you might create a Shortcut that compares your typed lecture notes in Notes against an audio transcription, and then summarizes the differences using Apple Intelligence.
The new actions in Shortcuts. Courtesy of Apple
Bonus points—you can now access these AI-powered shortcuts through the aforementioned Spotlight update.

More Continuity Features
This one was a bit of a surprise. As part of its ever-growing suite of Continuity features, the Phone app is now coming to Macs. Why put a Phone app on a device that doesn't have a cellular modem? It doesn't make a lot of sense on the surface, but remember, you can take calls from your iPhone directly on your Mac.
With the app will come all the same newly announced features on iOS 26, such as live translation in calls, new backgrounds for contacts, and automatically screened calls. Not surprisingly, all the changes to group chats will also be coming to the Mac Messages app.
I do think the inclusion of the Phone app could point us in the direction of 5G MacBooks in the future, something Apple has resisted for a long time. While cellular laptops aren't exactly common these days, it certainly feels more possible now that the Phone app is here. So who knows? Maybe the M6 MacBook Pros due out later this year will have a surprise option for cellular connectivity to better make use of the Phone app.
Live Activities from an iPhone will appear in the Mac's menu bar. Courtesy of Apple
Apple is also introducing Live Activities to the Mac, which will hand off an ongoing task from your iPhone, such as an Uber Eats order, and give you updates right in the Menu Bar on your Mac.

Other MacOS Tahoe Features
There are a couple of other features worth mentioning. One is improved gaming, with a dedicated Games app similar to what's coming in iPadOS 26 and iOS 26. It's perhaps most useful here on the Mac, though, since the question of which games are available on Mac often comes up. It also allows Apple to highlight some of the bigger games on the way, such as Cyberpunk 2077.
The new Apple Games app. Courtesy of Apple
The more exciting part is the new Game Overlay, something PC gamers usually have access to. The overlay lets you chat with friends, adjust settings, and more without having to exit the game.
Game Overlay lets players adjust their settings, chat, or invite new players. Courtesy of Apple
Some smaller changes include the ability to capture audio recordings within the Notes app, the Journal app coming to the Mac for the first time, and a new Magnifier feature that zooms in with your connected webcam or camera. As per usual, many of the smaller changes will be discovered later in the release, and some new features may pop up along the way.

When Will MacOS Tahoe Be Available?
The public beta for macOS Tahoe will be available starting in July, with the official release expected in the fall of 2025.
The developer beta, meanwhile, launched the day of the announcement, and you don't have to be a developer to install it. We don't typically recommend installing developer betas, as they can be quite buggy, but for the adventurous, you can find the developer beta under Software Update in System Settings by clicking the 'i' icon next to Beta Updates. So long as you have a compatible Mac, you can install it and play around. Just be sure to back up all of your data first.
If you have a recent Mac, even as old as a 2020 M1 MacBook Air, you'll be able to download and install MacOS Tahoe as a free upgrade when it's released later this year.

Related Articles


Gizmodo
After 18 Years, Apple Is Killing Its 9-Minute Snooze—That Can Only Mean One Thing
For years, it's always been nine more measly minutes. If you don't know what the hell I'm talking about, you've probably never owned an iPhone, or you're one of those freaks who wakes up without a device screaming in your face to do so. If you are in one of those camps, let me explain: for 18 years, Apple has maintained a vice grip on its alarm snooze feature, which grants nine more minutes of sleep. No more, no less. Just nine minutes. And there's no adjusting that in settings. No adjusting it until now, that is.

As noted by MacRumors, iOS 26, which was just introduced at Apple's WWDC 2025, finally lets you manually set your snooze time, which means one thing: it's time to sleep the f**k in, at least for as much as 15 whole minutes. In normal, non-sleep-related time, six minutes more isn't a lot, but when it comes to waking up, if you're anything like me, six minutes is basically a lifetime. Imagine all the horrible stress dreams about your teeth falling out you could have had in that time. Or heck, you might even luck out and get the one where you're driving a car and the brakes go out. The possibilities are really endless, or at least endless within a 15-minute span.

Not only that, but you can even—if you're a total masochist—set your snooze time to be shorter. The developer beta allows you to choose anywhere between one and 15 minutes. The world is now your sleepy little oyster, and you are able to shuck it into the future up to 15 minutes at a time.

On one hand, it's kind of wild that it's taken this long to give people the option to extend or retract their snooze times, but it's also very Apple-like. For many years, Apple was known for its definitive design that locked people in, though that's changed as the years have gone by. In today's iOS, you can change app icons, customize wallpapers, and—soon in iOS 26—choose backgrounds for your threads in Messages, and much more.
Those are all things that iOS users of yore only dreamed about, and now they're a reality. It's a shift for Apple, but in this case, probably one that most people will welcome.

As for the 9-minute default, well, it'll still have its place as the iOS default and also its own place in history. The 9-minute snooze, if you'll allow me a quick reverie, is a vestige of alarm clock history, originating from GE's Model 7H241 from 1956, the first alarm clock with a snooze feature. Why nine minutes exactly? Back in the day, clocks had gears, and that meant you had to work around the physical constraints of said gears. GE wasn't able to hit 10 minutes exactly due to those constraints—it had to choose nine minutes and change or 10 minutes and change, and ultimately it went with nine. Clearly, that decision lasted a lot longer than nine minutes.

If you're ready to break out of the 9-minute prison Apple has kept you in, you'll have to wait a little bit, though. Currently, iOS 26 is only available via a developer beta, and the first public beta launches next month. The non-beta software should launch in full in the fall, along with Apple's newest-generation iPhones, and once that happens, we can all rest easy—at least for 15 more minutes.


Tom's Guide
Copilot Vision just launched on Windows — here's what it actually does
Microsoft just flipped the switch on one of its most ambitious Copilot features yet. Copilot Vision with Highlights is now rolling out to Windows 11 users in the U.S. The new tool allows Copilot to 'see' what's on your screen and provide contextual help — a move that puts it in direct competition with Google's Gemini Live and Apple's upcoming Apple Intelligence. Essentially, it's Microsoft's answer to the next generation of AI assistants: ones that are proactive, ambient and deeply integrated into your device.

At its core, Copilot Vision gives the AI the ability to "see" whatever you're currently doing on your PC. Whether you're browsing, editing a document, watching a video or working in Excel, it can offer help based on that screen content. Copilot can now view the apps and windows on your screen (with permission), making the AI smarter and more responsive in real time.

Highlights is a companion feature that automatically surfaces useful content from your apps, browser and documents. Think of it as an AI assistant that notices what you've been working on and suggests relevant files, reminders or actions — no prompt necessary. Highlights appear in a refreshed Copilot interface, which now docks to the side of your screen for quick access.

These features are now available to U.S. users running Windows 11 version 23H2 on Copilot+ PCs, or select devices that meet the hardware requirements. You'll need to have screen reading enabled in the Copilot settings. Note that Vision only activates when you give it permission. You can try it today by opening Copilot from the taskbar and clicking the new Vision icon in the corner. A pop-up will confirm screen access and let you toggle Highlights on or off.
Microsoft's move signals a major push to stay competitive, giving its AI assistant capabilities similar to those of rivals like Google's Gemini and OpenAI's ChatGPT. With OpenAI powering Copilot, and Meta and Apple launching their own ambient AI tools, we're entering the age of 'AI that sees.' Whether that's helpful or a little creepy may depend on how well it works — and how much you choose to share.


CNET
I Need Apple to Make the iPhone 17 Cameras Amazing. Here's What It Should Do
Apple's WWDC was a letdown for me, with no new hardware announced and few new features beyond a glassy interface for iOS 26. I'm pinning my hopes on the iPhone 17 to get my pulse racing, and the best way it can do that is with the camera.

The iPhone 16 Pro already packs one of the best camera setups found on any phone, capable of taking stunning images in any conditions. Throw in its ProRes video, Log recording and the neat 4K slow-motion mode and it's a potent video shooter too. It even put up a strong fight against the other best camera phones around, including the Galaxy S25 Ultra, the Pixel 9 Pro and the Xiaomi 14 Ultra.

Read more: Camera Champions Face Off: iPhone 16 Pro vs. Galaxy S25 Ultra

Despite that, it's still not the perfect camera. While early reports from industry insiders claim that the phone's video skills will get a boost, there's more the iPhone 17 will need to become an all-round photography powerhouse. As both an experienced phone reviewer and a professional photographer, I have exceptionally high expectations for top-end phone cameras. And, having used the iPhone 16 Pro since its launch, I have some thoughts on what needs to change. Here are the main points I want to see improved on the iPhone 17 when it likely launches in September 2025.

An accessible Pro camera mode

At WWDC, Apple showed off the changes coming in iOS 26, which included a radical overhaul of the interface with Liquid Glass. That simplified style extends to the camera app too, with Apple paring the interface down to the most basic functions of Photo, Video and zoom levels. Presumably, the idea is to make it super easy for even the most novice photographers to open the camera and start taking Instagram-worthy snaps.
The new camera app is incredibly barebones. Apple/Screenshot by Joe Maldonado/CNET

And that's fine, but what about those of us who buy the Pro models to take deeper advantage of features like exposure compensation, Photographic Styles and ProRaw formats? It's not totally clear yet how these features can be accessed within the new camera interface, but they need to not be tucked away. Many photographers -- myself very much included -- want to use these tools as standard, using our powerful iPhones in much the same way we would a mirrorless camera from Canon or Sony. That means relying on advanced settings to take control over the image-taking process and craft shots that go beyond simple snaps. If anything, Apple's camera app has always been too simple, with even basic functions like white balance unavailable. To see Apple take things to an even more simplistic level is disappointing, and I want to see how the company will continue to make these phones usable for enthusiastic photographers.

Larger image sensor

Though the 1/1.28-inch sensor found on the iPhone 16 Pro's main camera is already a good size -- and marginally larger than the S24 Ultra's 1/1.33-inch sensor -- I want to see Apple go bigger. A larger image sensor can capture more light and offer better dynamic range. It's why pro cameras tend to have at least "full frame" image sensors, while really high-end cameras, like the amazing Hasselblad 907X, have enormous "medium format" sensors for pristine image quality.

Even on pro cameras, sensor size is important. Even the full-frame image sensor in the middle is dwarfed by the medium format sensor on the right. Phone camera sensors don't come anywhere near this size. Andrew Lanxon/CNET

Xiaomi understands this, equipping its 15 Ultra and previous 14 Ultra with 1-inch type sensors.
That's larger than the sensors found in almost any other phone, which allowed the 15 Ultra to take stunning photos all over Europe, while the 14 Ultra was heroic in capturing a Taylor Swift concert. I'm keen to see Apple at least match Xiaomi here with a similar 1-inch type sensor. Though if we're talking pie-in-the-sky wishes, maybe the iPhone 17 could be the first smartphone with a full-frame image sensor. I won't hold my breath on that one -- the phone, and the lenses, would need to be immense to accommodate it, so it'd likely be more efficient just to let you make calls with your mirrorless camera.

Variable aperture

Speaking of the Xiaomi 14 Ultra, one of the other reasons that phone rocks so hard for photography is the variable aperture on its main camera. Its widest aperture is f/1.6 -- significantly wider than the f/1.78 of the iPhone 16 Pro. A wider aperture lets in a lot of light in dim conditions and more authentically achieves out-of-focus bokeh around a subject.

The streetlight outside this pub has been turned into an attractive starburst thanks to the variable aperture of the Xiaomi 14 Ultra. Andrew Lanxon/CNET

But Xiaomi's 14 Ultra aperture can also close down to f/4, and with that narrower aperture, it's able to create starbursts around points of light. I love achieving this effect in nighttime imagery with the phone. It makes the resulting images look much more like they've been taken with a professional camera and lens, while the same points of light on the iPhone just look like roundish blobs. Disappointingly, Xiaomi actually removed this feature from the new 15 Ultra, so whether Apple sees value in implementing this kind of technology remains to be seen.

More Photographic Styles

Though Apple has had various styles and effects integrated into the iPhone's cameras, the iPhone 16 range took it further, with more control over the effects and more toning options.
It's enough that CNET Senior Editor Lisa Eadicicco even declared the new Photographic Styles her "favorite new feature on Apple's latest phone." I think they're great too. Or rather, they're a great start. The different color tones, like the ones you get with the Amber and Gold styles, add some lovely warmth to scenes, and the Quiet effect adds a vintage filmic fade, but there's still not a whole lot to choose from and the interface can be a little slow to work through. I'd love to see Apple introduce more Photographic Styles with different color toning options, or even with tones that mimic vintage film stocks from Kodak or Fujifilm.

I like the warmer tones produced by the iPhone's Amber style in this image, but I'd definitely like to see more options for getting creative with color tones. Andrew Lanxon/CNET

And sure, there are plenty of third-party apps like VSCO or Snapseed that let you play around with color filters all you want. But using Apple's styles means you can take your images with the look already applied, and then change it afterward if you don't like it -- nothing is hard-baked into your image. I was recently impressed with Samsung's new tool for creating custom color filters based on the look of other images. I'd love to see Apple bring that level of image customization to the iPhone.

Better ProRaw integration with Photographic Styles

I do think Apple has slightly missed an opportunity with its Photographic Styles, though, in that you can use them only when taking images in HEIF (High Efficiency Image Format). Unfortunately, you can't use them when shooting in ProRaw. I love Apple's use of ProRaw on previous iPhones, as it takes advantage of all of the iPhone's computational photography -- including things like HDR image blending -- but still outputs a DNG raw file for easier editing. The DNG file typically also offers more latitude to brighten dark areas or tone down highlights in an image, making it extremely versatile.
Previously, Apple's color presets could be used when shooting in ProRaw, and I loved it. I frequently shot street-style photos using the high-contrast black-and-white mode and then edited the raw file further.

I do a lot of street photography in black and white, and I'd love more flexibility to take ProRaw shots in monochrome. Andrew Lanxon/CNET

Now using that same black-and-white look means shooting only in HEIF format, eliminating the benefits of Apple's ProRaw. Oddly, while the older-style "Filters" are no longer available in the camera app when taking a raw image, you can still apply those filters to raw photos in the iPhone's gallery app through the editing menu.

LUTs for ProRes video

And while we're on the topic of color presets and filters, Apple needs to bring those to video, too. On the iPhone 15 Pro, Apple introduced the ability to shoot video in ProRes, which results in very low-contrast, almost gray-looking footage. The idea is that video editors will take this raw footage and then apply their edits on top, often applying contrast and color presets known as LUTs (look-up tables) that give footage a particular look -- think dark and blue for horror films or warm, light tones for a romantic drama vibe. But Apple doesn't offer any kind of LUT for editing ProRes video on the iPhone, beyond simply ramping up the contrast, which doesn't really do the job properly. Sure, the point of ProRes is that you would take that footage off the iPhone, put it into software like DaVinci Resolve, and then properly color grade it so it looks sleek and professional.

ProRes footage looks very low contrast and desaturated. Apple needs to introduce ways to help you do more with ProRes files on the iPhone. Andrew Lanxon/CNET

But that still leaves the files on your phone, and I'd love to be able to do more with them. My gallery is littered with ungraded video files that I'll do very little with because they need color grading externally.
I'd love to share them to Instagram, or with my family over WhatsApp, after transforming those files from drab and gray to beautifully colorful. With the iPhone 17, or even the iPhone 16 via a software update, I want to see Apple create a range of its own LUTs that can be applied directly to ProRes video files on the iPhone. While we didn't see this software functionality discussed as part of the company's June WWDC keynote, that doesn't mean it couldn't launch with the iPhone in September. If Apple were able to implement all these changes -- excluding, perhaps, the full-frame sensor, which even I can admit is a touch ambitious -- it would have an absolute beast of a camera on its hands.