Latest news with #VisionOS


Tom's Guide
4 days ago
iOS 26's Liquid Glass design brings big changes to your iPhone — here's everything coming in the public beta
A lot has been said about Apple's new design language for iOS 26, the much-vaunted (or much-maligned) Liquid Glass. Apple's latest design, based in part on VisionOS, offers something very different from what we've seen before. While Apple showed off a lot of the new look at WWDC, there's nothing quite like hands-on experience to tell whether you actually like something. You can install the developer beta of iOS 26, but we wouldn't recommend it, given how it can negatively affect your device and apps. Thankfully, the public beta is rumored for release on July 23, according to Mark Gurman, meaning interested users could soon test it more easily. With that in mind, let's break down all the Liquid Glass changes we've seen, compared to iOS 18, that could be coming to you soon.

When it comes to the home screen, the most noticeable change occurs when you activate the 'All Clear' mode, which makes your icons and widgets appear much more translucent. However, we've heard plenty of reports that, at least in the developer beta, this can hurt readability. You can address this with the Reduce Transparency option in Settings, or by turning All Clear off completely. If you do turn All Clear off, the difference between the iOS 26 home screen and the one in iOS 18 is pretty minimal: for the most part, the only real evidence of Liquid Glass is the dock at the bottom of the screen, which is more transparent.

Overall, the icons on the iOS 26 home screen are slightly bigger than on iOS 18. Some of Apple's app icons have changed, too: some, like Settings, have slightly different shading, while others, like the Camera app, have been fully redesigned.

For the most part, the Control Center remains relatively similar in iOS 18 and iOS 26, aside from the transparency brought about by Liquid Glass.
If you look at the screenshots above, you'll notice that in iOS 26 you can fairly clearly see your iPhone's home or lock screen in the background, whereas the iOS 18 version's transparency has more of a grey tint. Again, this can make the text harder to read in iOS 26 compared to the current release, but you can dial it down a fair amount with the Reduce Transparency setting.

Arguably, the iPhone's lock screen shows the biggest difference between the two operating systems. The new version offers a more stylized clock that adjusts its size dynamically depending on the image you use, as well as on the number of notifications you have. The icons and notifications also feature the most noticeable translucency on the lock screen, alongside new white text. The shortcut buttons for the torch and camera have a slight specular highlight that makes them seem more 3D than in iOS 18. There's also a new unlock effect that makes it seem as though you're moving a pane of glass, rather than the simple slide-over animation of iOS 18's lock screen.

One of the newest additions in iOS 26 is dynamic tab bars in apps. The new bar changes depending on whether you're scrolling through an app or trying to perform a specific action, freeing up the space that a static bar would normally occupy while also aiming to be more intuitive. On top of that, iOS 26 allows on-screen buttons and menus to adapt to the color of the background. A version of this feature exists in iOS 18, but it's very muted, only shifting between grey and white; the iOS 26 version can adapt to whatever is behind it, even as you scroll through your gallery.
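For developers, these behaviors map onto new SwiftUI modifiers in the iOS 26 SDK. A minimal sketch, assuming the `glassEffect` and `tabBarMinimizeBehavior` APIs Apple showed at WWDC25 (beta-era names that could still change before release):

```swift
import SwiftUI

// Floating controls rendered on translucent "glass" that refracts and
// tints itself to match whatever content sits behind it.
struct PlayerControls: View {
    var body: some View {
        HStack(spacing: 20) {
            Button("Play", systemImage: "play.fill") { /* start playback */ }
            Button("Next", systemImage: "forward.fill") { /* skip track */ }
        }
        .padding()
        .glassEffect(.regular, in: .capsule)
    }
}

// A dynamic tab bar: it shrinks away while the user scrolls down and
// reappears when they scroll back up or stop.
struct RootView: View {
    var body: some View {
        TabView {
            Tab("Library", systemImage: "books.vertical") {
                ScrollView { Text("Content goes here") }
            }
            Tab("Search", systemImage: "magnifyingglass") {
                Text("Search")
            }
        }
        .tabBarMinimizeBehavior(.onScrollDown)
    }
}
```

Apps built against the iOS 26 SDK get much of this adaptive styling on standard controls automatically; the modifiers above are for custom surfaces.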
One thing to note: when the public beta eventually arrives, it will likely differ from everything we've seen so far, just as it has across the developer beta releases. For instance, the second iOS 26 beta added new transparency options, moved Safari's new-tab button, and changed some of the apps and widgets, while the third made Liquid Glass more readable and added new wallpapers and several other fixes. As such, we'd expect the first public beta to incorporate the feedback and fixes from the developer betas. The same goes for the full public release, which could look very different again. On that note, if you plan on waiting for the official launch of the update, we're expecting it around mid-September, which would match the release timing of both iOS 18 and iOS 17.


CNET
6 days ago
iOS 18 vs. iOS 26: Here's How Liquid Glass Reworks Your iPhone UI
Apple's new Liquid Glass design language, largely inspired by VisionOS on the Vision Pro headset, will add both shine and translucency to your devices. So far, we like what we see, but the new styling could look different by the time it's available for your iPhone, iPad and other Apple devices later this year. Brave iPhone users can install the developer beta right now, but the public beta could drop any day.

At first glance, Liquid Glass feels dramatic, but it's more subtle than you might be led to believe. (Visually, at least -- Apple is skipping iOS 19 and the other intervening numbers.) Think of it as a touch-up rather than reconstructive surgery, with iOS 26's other, more mundane features tucked underneath. Liquid Glass on the home screen is a fairly minimal change, which is a good thing. We're only at the first developer beta of the new OS and design, though, and Apple will undoubtedly make tweaks until its final release. Below, we'll take a closer look at both iOS 26 and iOS 18 to see what's different between the two. For more, here are the iPhone models that will support iOS 26 when it launches.

Home screen

Apple kept the new Liquid Glass minimal on the home screen, with only minor changes to the default home screen appearance versus iOS 18's. Apple/Screenshot by Jeff Carlson

Looking at the home screens, the primary difference you'll find is that in iOS 26 the background of the dock, and of the search option that sits between the dock and the home screen icons, is more transparent and has a sheen at the edges, whereas in iOS 18 these are slightly darker. Other, smaller changes: the icons in iOS 26 look slightly larger, and some app icons seem to have been more influenced by the redesign than others -- most notably (from the screenshots) Settings, Camera and Mail. For Liquid Glass to really shine on the home screen, you'll want to opt for the "All Clear" mode, which creates the most dramatic change to your icons and widgets.
Going this route could introduce some viewability issues, but the "reduce transparency" setting remedies this quite well.

Control Center

Apple/Screenshot by Jeff Carlson

Things here are largely unchanged. Outside of the new glassy look in iOS 26, the 1x2 and 2x1 controls are more rounded than those in iOS 18.

Lock screen

Apple/Screenshot by Jeff Carlson

It's easy to see the differences that Liquid Glass brings to the iPhone's lock screen. The digital clock in iOS 26 dynamically resizes depending on the wallpaper and the number of notifications you have at any given moment, which is pretty cool. The clock in iOS 18 can be changed, but it won't change size in response to content displayed on the lock screen. The background on notifications is clearly different between the two OS versions, with iOS 18 providing more opacity and black text versus iOS 26's near-transparent background with white text. The controls at the bottom in iOS 26 also appear more like physical buttons, with depth and more of a see-through background. The new unlock effect in iOS 26 makes the motion of unlocking your iPhone appear as though you're lifting a sheet of glass, highlighted by a shiny edge that gives it form as you slide your finger up.

Menus and dynamic tab bars

iOS 26's new dynamic tab bar gives you a cleaner look and more space to view your content. Apple/Screenshot by Jeff Carlson

A new addition in iOS 26 is dynamic tab bars in apps, which change depending on whether you're scrolling or trying to perform a specific action. Apple says this will create a more intuitive experience while freeing up space for your content. If you were to replace the glass effect with heavily saturated colors, no one would blame you for mistaking this new tab bar for what Google is doing in Android 16 in some of its apps -- they look a lot alike.
But compared to iOS 18, this new dynamic tab bar should not only cut down on sifting through multiple menus but also look pretty good in the process.

iOS 26 will dynamically adapt to light and dark backgrounds

In iOS 26, the color of menu icons and icon text will adapt depending on the background. Apple/GIF by CNET

While it's harder to compare Liquid Glass to iOS 18 here, an upcoming change is that buttons and menus will adapt to the background color of your content. For instance, when you're scrolling through an app with a light background, the floating menu options appear with black text for easier viewing, then automatically switch to white when you scroll to a dark background. In iOS 18, some aspects of the user interface in certain apps would appear darker depending on the color of the background, but far less so than how Liquid Glass handles it now.

CNET/Screenshots by Jeff Carlson

iOS has shown this type of feature in a less dramatic fashion before, as you can tell from the Photos app screenshots above. Comparing those to what's on the horizon, it's hard not to get excited about the small tweaks Liquid Glass has in store, too. Those are just a few of our initial findings, and we'll likely add more once we surface them. If you want more about iOS 26, check out three upcoming features that are a bigger deal than Liquid Glass.
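The "reduce transparency" escape hatch mentioned above is a long-standing accessibility setting, and apps can honor it the same way the system does. A minimal UIKit sketch using Apple's existing `UIAccessibility` API (the class and view names here are illustrative):

```swift
import UIKit

// Honors the user's Reduce Transparency preference by swapping a
// see-through backdrop for an opaque one, and re-applies the choice
// whenever the setting changes in Settings > Accessibility.
final class GlassBackdrop {
    private var observer: NSObjectProtocol?

    func attach(to view: UIView) {
        apply(to: view)
        observer = NotificationCenter.default.addObserver(
            forName: UIAccessibility.reduceTransparencyStatusDidChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self, weak view] _ in
            guard let view else { return }
            self?.apply(to: view)
        }
    }

    private func apply(to view: UIView) {
        // Opaque fallback when transparency is reduced; otherwise let
        // the translucent material behind the view show through.
        view.backgroundColor = UIAccessibility.isReduceTransparencyEnabled
            ? .systemBackground
            : .clear
    }
}
```

This is the same signal iOS 26 itself reads when it tones down Liquid Glass for users who opt out of translucency.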


Forbes
07-07-2025
Spatial Computing Takes The Lead For Apple At WWDC 25
Apple is touting the new multi-user and enterprise capabilities for VisionOS 26. Apple

While everyone was focused on the big improvements to iPadOS multitasking at this year's Worldwide Developers Conference, Apple snuck some of the most significant improvements into VisionOS. Not only that, but it also revealed that the new Liquid Glass design language that will span all of its operating systems is inspired by VisionOS. I say this is probably the biggest announcement because I believe it will enable Apple and developers to build apps that are inherently more portable to spatial computing. I would also argue that many of the announcements Apple made at WWDC outside of VisionOS amounted to catching up with the competition rather than breaking new ground. That's why its spatial computing announcements for VisionOS are so important; they potentially set Apple up as a leader in the space going forward.

Spatial widgets don't seem like a big deal until you realize how they change the ground rules of Apple's spatial computing UI. These are persistent widgets that stay anchored where you leave them, which lets you create a truly spatial computing environment that persists until you say otherwise. Besides immediately improving usability, this means that users can customize their spatial computing experiences any way they like, while also enabling developers to build toward very specific user experiences. I think that Apple making spatial widgets their own kind of applet will also encourage experimentation and likely create a new class of persistent spatial computing applications.

Personas are Apple's way of creating virtual avatars for users while they are wearing their headsets; they enable users to be virtually present with a high-quality scan of their faces. Apple's initial launch of this feature was a bit rough around the edges and had some quality issues, but its second go, with VisionOS 2, was a huge improvement.
This next-generation upgrade to Personas has made them nearly photorealistic, and I would say the feature has climbed out of the uncanny valley that has plagued most digital avatars. While I haven't had a chance to redo my Persona and experience this change myself, many of my friends have, and I must say that I'm truly impressed by how accurately their Personas depict them. Some people have also noted that this is the first time we've seen Apple executives like marketing chief Greg Joswiak post their own Personas online, indicating a level of confidence we hadn't yet seen.

While Sony's PlayStation VR2 could be considered a failure for many reasons, it seems Apple has decided to breathe some life into that ecosystem. Apple's new support for PlayStation VR2 controllers enables developers to finally bring games requiring controllers to VisionOS and the Apple Vision Pro headset. Currently, the Vision Pro and VisionOS work only via hand tracking, which is great for lots of things — but not necessarily gaming. While porting a PSVR2 game to Vision Pro may not be the most straightforward thing, there may be opportunities for other games to come to VisionOS thanks to these controllers. And if you're anything like me and feel guilty about your underutilized PSVR2 purchase, this might be a great way to get more use out of the controllers. This is also the first time Sony has ever offered the controllers separately from the headset, so you don't need to buy a whole PSVR2 kit to use them with the Vision Pro.

In addition to the PSVR2 controllers, Apple also announced support for the new Logitech Muse for spatial content creation. The Muse is Logitech's stylus for spatial computing, designed specifically for VisionOS and the Vision Pro.

One of my favorite features in VisionOS 2 was the 2-D-to-3-D conversion tool for literally any image.
It is still one of my most-used features, and I still regard it as borderline magical. In the new VisionOS 26 — note the shift to Apple's new year-based naming scheme — Apple is adding 'spatial scenes,' which take this conversion to an entirely new level. The feature uses Gaussian splatting to create immersive 3-D scenes that incorporate more than one 3-D perspective. I believe that Apple is leaning even harder into the success of the 2-D-to-3-D feature, and that spatial scenes will enable VisionOS to become an even more content-rich platform. This matters because every spatial platform has suffered from a lack of content since the beginning of time.

Speaking of content, VisionOS 26 will also add support for wider-field-of-view content, including 180-degree and 360-degree images from GoPro, Insta360 and Canon. This should make it even easier to bring existing content to VisionOS and encourage more content creators to treat Apple's platform as a default target. These three camera companies have been the defaults for the industry for quite some time, so it's great to see Apple recognizing this and supporting them. I'm excited to see how some of my Insta360 footage shot in 360-degree format over the last few years looks on the Vision Pro. I know it's not quite the 8K content that the Insta360 One X5 can shoot, but it's still pretty good-looking, and the images are still very high-resolution (72MP).

Enterprise Features To Address Privacy And Spatial Sharing

While Apple did announce a slew of enterprise APIs, such as the Protected Content API, there is also an 'eyes only' mode that brings even more privacy protection. The added security and privacy features are welcome, but the enterprise space is also highly collaborative, which is why it's great to see Apple finally addressing sharing in the context of spatial computing.
This means that users can finally share spatial content and experiences, whether we're talking about professional applications or 3-D movies and games. What I like to see is that people can also collaborate on projects while adding remote participants via FaceTime. I expect that enterprise users will also appreciate being able to pair all these capabilities with additional enterprise license management features such as Vision Entitlement Services, which streamlines license status checks and app approvals.

VisionOS 26 And MacOS Introduce Better External Device Support

Last but certainly not least, Apple has brought the Mac and iPhone closer to the Vision Pro with VisionOS 26. For one thing, a user can finally unlock their iPhone while wearing the Vision Pro, even inside fully immersive experiences. This was one of my biggest pet peeves with the Vision Pro: it has the most accurate eye tracking available and uses iris scans for authentication, so unlocking my iPhone while wearing the headset shouldn't introduce any friction. Thanks to VisionOS 26, that is now true. The update also allows phone calls to come into the headset through the iPhone, so a user no longer needs to take off the headset to answer a call.

There also appears to be enhanced support for streaming applications from MacOS with spatial rendering. I believe this reflects Apple's approach to wireless VR connectivity with MacOS, using the Mac for compute and the Vision Pro as the display. With MacOS now supporting Steam natively on Apple Silicon, we could potentially see all kinds of VR applications become available on MacOS/Vision Pro. This is especially important considering that Apple is going to sunset support for Intel-based Macs and will cease Rosetta 2 support after macOS 27.
Apple's VisionOS 26 summary bento box of the latest updates. Apple

The Importance Of VisionOS For Apple, And The Importance Of AI For VisionOS

While lots of people have criticized many of Apple's moves with iOS 26 and the Liquid Glass design (which Apple has already dialed back in the latest iOS 26 beta), I think a lot of people outside the spatial computing world missed how seriously Apple is taking VisionOS. If anything should be learned from VisionOS 26, it is that Apple has shown unwavering commitment to the platform, and its investment isn't going away anytime soon. Sure, plenty of people have critiques of the Vision Pro, which in my opinion tend to be slightly premature. The reality is that Apple is showing that the Vision Pro is very much a development platform for the improvements it wants to make to VisionOS. Rumor has it that Apple's next headset will be lighter, faster and cheaper than the Vision Pro; if that headset comes to market anytime soon, it will benefit greatly from the last year and a half of improvements to VisionOS.

Mind you, Apple still has plenty of room for improvement in how AI is integrated into VisionOS, but that is unfortunately a broader problem for Apple's AI strategy, connected to its troubles with Siri's generative AI relaunch. I believe that Apple will eventually work out these challenges, but I also feel sure that having a subpar AI experience in an XR platform can only hurt that platform's growth potential. There is no doubt in my mind that AI and XR are highly complementary technologies that could even act as catalysts for each other's growth. While I welcome many of Apple's VisionOS 26 improvements with open arms and commend Apple's commitment to XR, I still think Apple needs to get competitive on AI, whether through an acquisition or by accelerating current development.
This will be especially important if the company wants to ship AI smart glasses — which are heavily dependent on quality AI performance and accuracy — to compete with the likes of Meta and Google.

Moor Insights & Strategy provides or has provided paid services to technology companies, like all tech industry research and analyst firms. These services include research, analysis, advising, consulting, benchmarking, acquisition matchmaking and video and speaking sponsorships. Of the companies mentioned in this article, Moor Insights & Strategy currently has (or has had) a paid business relationship with Google, Intel, Meta and Sony.


CNET
11-06-2025
Apple's VisionOS 26 Hands-On: Virtual Me and 3D Memories Are Stunning
My virtual Scott Stein persona is hauntingly real, spatial scenes feel like living 3D memories, and even the experience of sticking widgets to virtual walls – and virtual windows – is better than I ever expected.

Hey. That's me. My first experience in Apple's new VisionOS 26, announced Monday at WWDC, was making my new 3D-scanned Persona, a feature that Apple says is finally out of beta. I used to find its uncanny style funny, but not anymore. I find it unsettlingly real. Like, I feel like I'm watching myself.

New Personas are one of several upgrades to Apple's $3,500 Vision Pro headset announced at this year's WWDC, but there's more that surprised me. There's a 3D-converting Spatial Scenes mode that also works in iOS and looks absolutely wild in-headset. The new widgets in VisionOS can be stuck to walls – even into walls – and looked convincingly real in my demo at Apple Park. I felt like I could stick my head through a virtual window into a panorama photo of Tokyo that wasn't there. None of these improvements are game-changers, but they all far exceeded my expectations when I actually tried them. Apple's on-stage demos during its keynote really didn't do them justice. As usual, I had to experience them in the actual headset to appreciate the impact.

Unfortunately, Apple still hasn't made any headway in camera-enabled AI for Vision Pro, something Google is already planning for Android XR, and an extra I can't wait to see in action. But Apple's skill at adding other features to Vision, its AR/VR platform, once it actually does them, is impressive. If updates continue to be this eye-popping, I'm really curious where things go next, as Apple heads toward what should be a lower-priced, lighter version of Vision in the next year or two. Maybe by then it will finally work in camera-supported AI, too.

Yeah, Personas look this good.
Apple/Screenshot by CNET

Virtual me is almost me now (except for my hands)

Apple says its Personas in Vision are now officially out of beta with VisionOS 26, and it shows. The previous versions of Personas, the 3D-scanned avatars Apple uses in VisionOS, improved over time, but their uncanny vibe remained off-putting. The new scanning now captures more of the sides of people's heads, and Apple is no longer windowing off Personas in FaceTime calls in the headset; they pop right into your room.

I scanned myself using the Vision Pro like always, but this time I was greeted with a more realistic version of me, for better and for worse. I saw the bags under my eyes. I saw my beard's salt-and-pepper details. I looked like me. Personas can't be scanned while wearing glasses, so my specs still can't come aboard, but the virtual glasses options are far better now. I could pick from a variety of frames, colors and materials, and size them up, too. That feature alone felt like a preview of potential Vision glasses-shopping apps to come. My expressions feel right, too. I couldn't make every expression, but I tried a bunch and didn't see many fail. Sadly, my hands were still just ghostly things that vanished as I brought them closer to my face. I also recorded a test clip of myself using a third-party app, and the result was good enough that I think it captures me now. Will people think it's me? I started wondering how Personas could be used as virtual stand-ins for myself – not just in Vision, but in 2D apps. Will Apple bring Personas everywhere across iOS and Macs someday? I think it will.

There's no way to show in 2D how Spatial Scenes feel. Apple/Screenshot by CNET

Virtual memories via Spatial Scenes feel like 3D dioramas

Apple is finally living up to the initial Vision Pro ads, where people watched photo memories in 3D like they were moments from Minority Report.
A previous auto-converting tool turned 2D photos into 3D, but the Spatial Scenes upgrade lets you actually move back and forth, and even deeper, into a photo. The frame's larger field of view feels like a window into a museum diorama. A few demo examples made my jaw drop. They're not the same as full volumetric 3D scans, but the tool magically fills in fuzzy details at the fringes as I shift my point of view, making it feel like the whole thing really is a window into somewhere else. That scene in Ready Player One where Wade visits a museum full of 3D memories of James Halliday's life? It's sort of like that. But this is only for still photos. Spatial Scenes also work in iOS, but I'm telling you, the effect isn't nearly as compelling.

Apple/Screenshot by CNET

Widgets to fill my rooms

Another demo showed me how Apple's widgets can be pinned to walls and other surfaces. I walked into a virtual room and found widgets suddenly popping up everywhere: a music poster on one wall, a window on another, calendars and clocks somewhere else. The OS update can also recognize and remember room layouts and turn off the virtual overlays until you enter a room, preventing bleed-through where you might see another room's screens through the walls. I've seen pinned displays and windows before in other headsets, glasses and apps, but these still surprised me with their fidelity. An Apple Music poster looked convincingly real and revealed extra details as I approached, then played music when I tapped it. Clocks look like actual wall clocks. And widgets can be virtually inset into walls, which is wild. The panorama window, which pulls photos from your library, had reflective detail around its white curved pane that made it feel genuinely there and inset. The 3D effect was convincing enough that I felt I could walk up to it and be transported – it was even better at close range. Would I actually use these widgets? I don't know, but I feel the mixed-reality blend more than ever.

There's more.
A 3D "spatial browser" turns Safari into a larger reading mode that auto-converts the images inside to 3D. There's also a new interactive environment in the headset that shows Jupiter viewed from one of its moons, with an interactive panel that can change the time of day. The interactive features aren't coming to Apple's other Vision 3D environment backgrounds yet, but I hope they will.

Collaboration in the Vision Pro is going to be more of a thing. Apple/Screenshot by CNET

More to come this year

Apple has other updates that are useful, too. Collaboration in apps can now work with other people in the room, or mix in others joining as Personas. And I didn't get to try any spatial controller support, which will work with PlayStation VR2 controllers and third-party styluses. That's coming later this year, I was told, likely because the apps aren't there to work with it yet. Apple still has a long way to go to make Vision really feel like a face-mounted computer for everyone, but the updates in VisionOS 26 are more impressive than I expected. Apple is pushing boundaries that competitors like Meta and Google, with their focus on AI, aren't even tapping into yet.
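For developers, the wall-pinned widgets described in this article come through the standard WidgetKit framework. A minimal sketch, assuming the `supportedMountingStyles` and `widgetTexture` configuration APIs Apple showed for visionOS 26 (the "WallClock" widget itself is hypothetical, and these beta-era names could change):

```swift
import SwiftUI
import WidgetKit

// A single timeline entry carrying the time to display.
struct ClockEntry: TimelineEntry {
    let date: Date
}

// Standard WidgetKit timeline provider that refreshes each entry.
struct ClockProvider: TimelineProvider {
    func placeholder(in context: Context) -> ClockEntry { ClockEntry(date: .now) }
    func getSnapshot(in context: Context, completion: @escaping (ClockEntry) -> Void) {
        completion(ClockEntry(date: .now))
    }
    func getTimeline(in context: Context, completion: @escaping (Timeline<ClockEntry>) -> Void) {
        completion(Timeline(entries: [ClockEntry(date: .now)], policy: .atEnd))
    }
}

// "WallClock" is an illustrative widget kind, not a shipping Apple widget.
struct WallClockWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "WallClock", provider: ClockProvider()) { entry in
            Text(entry.date, style: .time)
                .font(.largeTitle)
        }
        .configurationDisplayName("Wall Clock")
        // visionOS 26: let the widget sit flat on a wall (.elevated) or
        // appear recessed into it (.recessed), with a glass finish.
        .supportedMountingStyles([.elevated, .recessed])
        .widgetTexture(.glass)
    }
}
```

Because the system handles anchoring, persistence and room-aware occlusion, an existing iOS widget needs little more than these declarations to show up pinned to a wall.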


The Verge
09-06-2025
Apple's Liquid Glass redesign doesn't look like much
Design, to quote a wildly overused Steve Jobs-ism, is how it works. And if that's the case, Apple's new design language, which the company is calling 'Liquid Glass' and just announced at WWDC 2025, is really nothing new at all.

The Liquid Glass look comes largely from VisionOS, which shipped with a particular constraint: it had to layer digital information over your physical world, without occluding that physical world. That's why everything in VisionOS is translucent and glassy, so you can both see it and see through it. Everything is layered and three-dimensional, an effort to make digital experiences feel more like objects in space than objects on a screen.

The impetus for turning the VisionOS look into the Liquid Glass system, Apple software boss Craig Federighi said at the beginning of this year's developer conference keynote, was that Apple's devices are more closely connected than ever. That's certainly true: Apple's ecosystem remains tight, and there are lots of good reasons to buy an iPad if you have an iPhone or an Apple TV if you have a Mac. (Call it synergy, call it illegal monopoly maintenance, you pick.) Most of Apple's devices have a lot of features in common, and it makes sense to bring them all more closely together. Putting elements in familiar places, making sure things work the same everywhere — these are all good things!

Leaving aside the somewhat wild decision to pivot your entire UI system around a prohibitively expensive headset hardly anyone has ever even tried, the thing about most of Apple's devices is that they aren't overlaying digital information on the physical world. They're just screens! So the little glass loupe that slides over text as you highlight on a webpage won't feel like you're moving something around; it'll feel like you're poking at a fake water droplet on the screen. The playback controls that seem to float slightly above your content, refracting its light and colors, look to my eyes a little like a hokey 3D effect.
The navigation buttons that ripple as you scroll a webpage don't look like physical objects — they just look busy and hard to read. Apple executives frequently made a point of noting that Liquid Glass is minimalist and 'keeps your content in focus,' but the constantly morphing interface feels to me like it might be even more noticeable.

There is one thing about Liquid Glass I really like. Now, when you tap on an alert or a menu item, the rest of the content appears from within, as if it were contained by the thing you just tapped. That's a clever way to keep people anchored in place. You won't tap something, only to be taken to another screen, with no obvious way back to where you were. The menu just radiates out over top of whatever you were doing, and then folds back in on itself when you're done. It's far too easy to get lost in your phone, and this is a nice touch.

You really can't look at Liquid Glass without thinking of Windows Aero, the similarly glassy and translucent design language that shipped with… Windows Vista. (Tough comparison, that.) With Aero, Microsoft made an effort to make it easy to know where you were on your computer, and to find everything you needed. You could see through windows to other windows; app borders would change to match the content within; you could use widgets and live thumbnails to get quick access to information. Aero didn't last, in part because it was a huge resource suck to render something so graphically intense. Now, Microsoft's design is much more colorful, and even more aggressively physical — there are drop shadows everywhere. The ideas behind both Liquid Glass and Windows Aero are good ones! They stand for personalization, customization, for helping people figure out where they are and what they're doing on their device.
Apple has long been unmatched in executing this kind of stuff, too, and the demos we saw at WWDC today suggest that this layered, three-dimensional effect will work smoothly across all of Apple's devices. But for all the epic language of the unveiling, I don't see much in Liquid Glass that will matter. Maybe we'll get more in the months to come, and maybe developers will figure out how to make the best of the layers. But for every place this kind of layered translucency makes sense, there will be lots of places it just looks like a mess. It won't change much about how you use your devices or the way you perceive them, and at least to my eyes, it doesn't even make them better-looking. It's just … slightly different.

Watching Apple's announcement, it's hard not to read the whole thing as borne of efficiency rather than of inspiration. Alan Dye, Apple's vice president of design, started his portion of the keynote by harkening back to iOS 7, and its simple, layer-based look. 'Now, with the powerful advances in our hardware, silicon, and graphics technologies,' he said, 'we have the opportunity to lay the foundation for the next chapter of our software.' He called Liquid Glass 'our broadest design update ever.' Not biggest. Not best. Just broadest.

In that broadest sense, it's logical that this is where Apple landed. It obviously wouldn't, and probably couldn't, fundamentally change the look and feel of every device it makes for billions of users around the world. No one wants that. So Apple just took all its elements and made them more universal: everything's a little more round, a little more contained, a little less designed for a specific screen size. A floating menu of black and white icons works pretty much anywhere, you know? By turning menus into lists that pop out of buttons, Apple prevents itself from having to optimize every menu for every device and screen orientation. Liquid Glass is the lowest common denominator, done about as well as you could.
But I'm not impressed, and I'm not optimistic. Apple is at its best when it has strong opinions about how things should work; even the attempt to get out of the way and let your content dictate everything feels like the wrong tack. Plus, I've spent the past year tinkering with Apple's new tinted and color-matching iPhone home screens, which mostly serve to make your device uglier. I don't see a reason that Liquid Glass would make my devices better, simpler, or more personal. I just see buttons that are harder to read.