
Good Taste Is More Important Than Ever
There's a lesson I once learned from a CEO—a leader admired not just for his strategic acumen but also for his unerring eye for quality. He's renowned for respecting the creative people in his company. Yet he's also unflinching in offering pointed feedback. When asked what guided his input, he said, 'I may not be a creative genius, but I've come to trust my taste.'
That comment stuck with me. I've spent much of my career thinking about leadership. In conversations about what makes any leader successful, the focus tends to fall on vision, execution, and character traits such as integrity and resilience. But the CEO put his finger on a more ineffable quality. Taste is the instinct that tells us not just what can be done, but what should be done. A corporate leader's taste shows up in every decision they make: whom they hire, the brand identity they shape, the architecture of a new office building, the playlist at a company retreat. These choices may seem incidental, but collectively, they shape culture and reinforce what the organization aspires to be.
Taste is a subtle sensibility, more often a secret weapon than a person's defining characteristic. But we're entering a time when its importance has never been greater, and that's because of AI. Large language models and other generative-AI tools are stuffing the world with content, much of it, to use the term du jour, absolute slop. In a world where machines can generate infinite variations, the ability to discern which of those variations is most meaningful, most beautiful, or most resonant may prove to be the rarest—and most valuable—skill of all.
I like to think of taste as judgment with style. Great CEOs, leaders, and artists all know how to weigh competing priorities, when to act and when to wait, how to steer through uncertainty. But taste adds something extra—a certain sense of how to make that decision in a way that feels fitting. It's the fusion of form and function, the ability to elevate utility with elegance.
Think of Steve Jobs unveiling the first iPhone. The device itself was extraordinary, but the launch was more than a technical reveal—it was a performance. The simplicity of the black turtleneck, the deliberate pacing of the announcement, the clean typography on the slides—none of this was accidental. It was all taste. And taste made Apple more than a tech company; it made it a design icon. OpenAI's recently announced acquisition of Io, a startup created by Jony Ive, the longtime head of design at Apple, can be seen, among other things, as an opportunity to increase the AI giant's taste quotient.
Taste is neither algorithmic nor accidental. It's cultivated. AI can now write passable essays, design logos, compose music, and even offer strategic business advice. It does so by mimicking the styles it has seen, fed to it in massive—and frequently unknown or obscured—data sets. It has the power to remix elements and bring about plausible and even creative new combinations. But for all its capabilities, AI has no taste. It cannot originate style with intentionality. It cannot understand why one choice might have emotional resonance while another falls flat. It cannot feel the way in which one version of a speech will move an audience to tears—or laughter—because it lacks lived experience, cultural intuition, and the ineffable sense of what is just right.
This is not a technical shortcoming. It is a structural one. Taste is born of human discretion—of growing up in particular places, being exposed to particular cultural references, developing a point of view that is inseparable from personality. In other words, taste is the human fingerprint on decision making. It is deeply personal and profoundly social. That's precisely what makes taste so important right now. As AI takes over more of the mechanical and even intellectual labor of work—coding, writing, diagnosing, analyzing—we are entering a world in which AI-generated outputs, and the choices that come with them, are proliferating across, perhaps even flooding, a range of industries. Every product could have a dozen AI-generated versions for teams to consider. Every strategic plan, numerous different paths. Every pitch deck, several visual styles. Generative AI is an effective tool for inspiration—until that inspiration becomes overwhelming. When every option is instantly available, when every variation is possible, the person who knows which one to choose becomes even more valuable.
This ability matters for a number of reasons. For leaders or aspiring leaders of any type, taste is a competitive advantage, even an existential necessity—a skill they need to take seriously and think seriously about refining. But it's also in everyone's interest, even people who are not at the top of the decision tree, for leaders to be able to make the right choices in the AI era. Taste, after all, has an ethical dimension. We speak of things as being 'in good taste' or 'in poor taste.' These are not just aesthetic judgments; they are moral ones. They signal an awareness of context, appropriateness, and respect. Without human scrutiny, AI can amplify biases and exacerbate the world's problems. Countless examples already exist: Consider a recent experimental-AI shopping tool released by Google that, as reported by The Atlantic, can easily be manipulated to produce erotic images of celebrities and minors.
Good taste recognizes the difference between what is edgy and what is offensive, between what is novel and what is merely loud. It demands integrity.
Like any skill, taste can be developed. The first step is exposure. You have to see, hear, and feel a wide range of options to understand what excellence looks like. Read great literature. Listen to great speeches. Visit great buildings. Eat great food. Pay attention to the details: the pacing of a paragraph, the curve of a chair, the color grading of a film. Taste starts with noticing.
The second step is curation. You have to begin to discriminate. What do you admire? What do you return to? What feels overdesigned, and what feels just right? Make choices about your preferences—and, more important, understand why you prefer them. Ask yourself what values those preferences express. Minimalism? Opulence? Precision? Warmth?
The third step is reflection. Taste is not static. As you evolve, so will your sensibilities. Keep track of how your preferences change. Revisit things you once loved. Reconsider things you once dismissed. This is how taste matures—from reaction to reflection, from preference to philosophy.
Taste needs to be considered in both education and leadership development. It shouldn't be left to chance or confined to the arts. Business schools, for example, could do more to expose students to beautiful products, elegant strategies, and compelling narratives. Leadership programs could train aspiring executives in the discernment of tone, timing, and presentation. Case studies, after all, are about not just good decisions but also how those decisions were expressed, when they went into action, and why they resonated. Taste can be taught, if we're willing to make space for it.

Related Articles


Tom's Guide
43 minutes ago
Get more from your Apple Watch — 5 hidden Control Center features worth knowing
The Apple Watch Control Center is designed for quick access to essential features, but most users only scratch the surface of what's possible. While everyone knows the basics, there's a whole layer of hidden functionality waiting to be discovered. Many Control Center icons do much more than their simple tap functions suggest. Press and hold various controls, and you'll unlock advanced options that can save time and add convenience to your daily routine. These hidden features turn the Control Center from a basic toggle panel into a powerful shortcut hub that puts advanced controls right at your fingertips. These lesser-known capabilities make your Apple Watch significantly more useful once you know they exist.
Most people know tapping the Wi-Fi icon turns wireless on and off, but there's more hidden underneath. Press and hold the Wi-Fi icon to access a full network selection screen where you can switch between available networks without going into Settings. You'll also find Auto Hotspot controls at the bottom, letting you choose between "Never," "Ask to Join," and "Automatic" options. If your watch has the network password saved, it connects instantly — otherwise you'll need to use Scribble to enter it. This feature is particularly useful when you're moving between locations with different Wi-Fi networks, like switching from home to office networks without having to dig through Settings menus.
The Ping My iPhone feature gets even better when you know this trick. While tapping the icon makes your phone beep loudly, pressing and holding it activates both the sound and the LED flash on your iPhone's back. This dual approach makes finding your phone much easier, especially in dark rooms or if your phone is buried under cushions. It's a simple upgrade that transforms an already useful feature into something even more effective. The flashing light is visible even when your phone is face-down or in silent mode, making it incredibly useful in noisy environments where the ping sound might not be audible.
Instead of manually remembering to turn Silent Mode back on, you can set it to activate for specific timeframes. Press and hold the Silent Mode icon to reveal three timing options: "On," "On for 1 hour," and "On until this evening" or "On until tomorrow morning." This prevents those awkward moments when your watch starts buzzing during meetings because you forgot to re-enable silent notifications. The timed options are perfect for situations like movies, meetings, or sleep, where you know exactly how long you need notifications silenced without worrying about forgetting to turn them back on.
When you have an active Focus mode running, pressing and holding its icon brings up the complete list of available Focus options. This lets you switch directly to a different Focus without having to turn off the current one first, then navigate back to select a new mode. You'll also get timing options for the new Focus, making it faster to manage your notification preferences throughout the day. This streamlined switching is especially valuable for people who use multiple Focus modes throughout their day, like Work, Personal, Sleep, and Do Not Disturb, allowing seamless transitions without multiple taps.
This feature requires setup but offers powerful quick access once configured. Go to Settings, Accessibility, Accessibility Shortcut, and select the accessibility features you want rapid access to. A new icon appears in Control Center that provides instant access to tools like Color Filter, which can reduce distractions by making your watch display monochrome. You can also triple-press the Digital Crown to access these same shortcuts from anywhere on your watch.
These shortcuts aren't just for users with accessibility needs — features like Zoom and Color Filter can be helpful for anyone in bright sunlight or low-light conditions where screen visibility becomes challenging.


Tom's Guide
an hour ago
iOS 26's Liquid Glass looks amazing, except for this one glaring issue
WWDC 2025 revealed some pretty amazing new features and developments that will land on Apple products later this year. And certainly the biggest change figures to be Liquid Glass, the new design language coming to all of Apple's different software platforms.
I'll have a better idea of how Liquid Glass performs once I get a chance to use Apple's beta software, particularly when it comes to iOS 26 on my iPhone. But watching Apple's keynote on Monday (June 9) and looking at the iOS 26 preview page, I'm concerned that the updated software may not fix one of my big issues introduced with iOS 18.
As you may recall, iOS 18 brought a new way to customize your phone's home screen, including app tinting. This feature allowed you to change the color of your app icons — at least in theory. In practice, the finished result made it look like you poured sauce all over your apps. Ultimately, I never wound up using app tinting to adjust the color of my home screen icons — except when I wanted to show how weird it looked. I had hoped that iOS 26 would improve things, but everything I've seen so far doesn't fill me with hope.
As a reminder, Liquid Glass is Apple's new design language that aims to improve the look of, well, everything. Essentially, Apple wants to create an iconic, stylized design that's uniform across everything from iPhones to Macs to Apple Watches. And from what we saw at WWDC, Apple seems to have managed this, thanks to the new translucent look, expandable menus and a design that works with the contours of the screen of whatever device you happen to be using.
During the part of the WWDC keynote that focused on Liquid Glass, Apple gave us a glimpse at what tinting your apps will look like in iOS 26. When you change the color of an app in the new update, the main body of the icon will still be slightly translucent, but will take on a slight hue that matches your chosen color.
Meanwhile, the main element of the icon will be brighter, usually close to white, but again with a shade of the chosen color.
I liked the look of the new tints when I first saw them, although that's changed the more I look at them. The final product just looks dull, as the icons don't jump out at me. Take a look at the screenshot above with the Map widget in the top right corner. The person icon in that map looks really low-res as a result of that tinting.
And this is just with Apple's built-in apps on the iPhone. iOS 18's app tinting doesn't just look blocky — the real issues surface with third-party apps downloaded from the App Store. In the above image, you can see the home screen for my iPhone 15 Pro Max, which includes a mix of Apple's apps and a few third-party ones. Notice how Apple's apps split the colors fine: the background is black and the main element is red. However, if you look at my Warhammer 40,000 app or the MCM ComicCon app, you'll notice that the phone just shoves a red screen over them. There's no attempt made to make them match the other apps, and in fairness, I don't see how you could. So what you end up with is these odd-looking red blotches on the screen. Consider the Kindle app, which features a lovely design of someone reading normally; with the tint, though, it's just a block of color.
There's more than just aesthetics at play here. App tinting as it stands makes it just a little more difficult to quickly pick the right app. It can be annoying when you're in a rush and you accidentally tap the wrong icon because they all start to look the same. This is made worse for someone with a vision problem like myself, where images on a screen can be harder to read.
We won't really know what app tinting in iOS 26 looks like until the update comes out and we see how developers tweak their apps for the Liquid Glass interface.
But there are ways Apple could bolster the chances of icons looking their best. The most obvious answer is to simply make the icons more translucent, while keeping their original color. This isn't a terrible idea, although it would remove the feeling of uniformity that Apple appears to be going for.
Alternatively, Apple could make sure each developer codes in the same design options for their apps so that they match the look of the native apps. That said, developing an app is difficult enough, and I can't see companies making sure each has an option in case someone tweaks the tint of an icon. The final option would be to repeat what was done in iOS 18, but that seems even less likely, as it would ruin the sleek look of the new home screen.
While I don't think any design will entice me to actually tint my apps, I am curious what you all think. Let me know: do you already make use of app tinting and will you continue to do so, or do you view it as more of an option you won't bother to use?


Android Authority
an hour ago
After three days with iOS 26, I'm amazed by Apple's Liquid Glass redesign, but I have concerns
Dhruv Bhutani / Android Authority
The biggest buzz at WWDC 2025 was around Apple's spanking new Liquid Glass interface. From a unified year-based naming scheme for its platforms to what might be the most extensive visual overhaul to iOS in years, iOS 26 marks a significant shift in Apple's software approach. But is there substance beneath the divisive shiny sheen? I dove into the developer betas to give it a try.
Let me preface this by saying this first beta is very buggy, and I wouldn't recommend installing it on your primary phone. Still, if you're eager to explore it, just go to the 'Software Update' section under Settings and select 'Beta Updates.' That's all it takes. Since last year, Apple has dramatically simplified the beta sign-up process. Regardless, I'd highly recommend waiting for next month's public beta before installing the update. With that said, here are some of the most significant additions to iOS 26.
Liquid Glass: The most dramatic design overhaul since iOS 7
Apple's biggest change this year is the introduction of a new design language called Liquid Glass. If you're a design enthusiast or have experience in web design, you're likely familiar with glassmorphism. Liquid Glass builds on that aesthetic and makes extensive use of transparency and floating elements. More importantly, this redesign spans every Apple platform from the iPhone to the iPad, Mac, Watch, TV, and even Vision Pro. It's Apple's first real attempt to unify the visual language across its entire ecosystem.
In practice, Liquid Glass means layers of translucent color, soft reflections, and depth that shift as you interact with your device. It's playful, dramatic, and distinctly Apple — for better or worse. The Home Screen shows this off best. App icons appear like digital glass, glinting based on the background. You'll notice bubble-like UI elements across the Photos app, the Fitness app, and even the Camera.
On the Lock Screen and in Control Center, most flat backgrounds are now translucent layers. It's a subtle but impactful shift that makes everything feel like it's floating rather than just sitting on top of your wallpaper.
In day-to-day use, not everything works perfectly yet. Readability suffers under all that transparency, especially in Control Center when it overlaps busy apps like the music player. The Lock Screen has similar issues. Some animations also feel inconsistent.
The interface tweaks continue on to the browser, where you now get a near-full-screen view of the webpage with glass-inspired elements that pop out. As with the rest of the interface, there is ample reason to be concerned about readability (especially for those with accessibility needs), and your experience is entirely dependent on the background. Still, this is early beta territory, and Apple typically refines things by the time of public release.
Despite the mixed public consensus, I quite like the general direction that Apple is taking here. The interface looks futuristic to a fault, like something straight out of an Apple TV science fiction show, and I'm personally here for it. But even at this early stage, it is clear that a lot of pain points need to be addressed before the public rollout this September.
The new camera experience
The Camera app, too, has received a major, and much-needed, overhaul. In fact, this is the first time in years that Apple has rethought the camera UI from the ground up. While the basics remain the same, Apple has refined the layout to provide quicker access to controls. The refreshed interface makes it easy to move between modes like photo, video, portrait, and more with a single swipe along the bottom edge. This feels intuitive and much more useful when composing shots.
Similarly, a subtle but welcome touch is how Apple now surfaces adjustments. In some ways, the Camera app has finally gained the 'Pro' mode users have been waiting for. Features such as switching between different recording settings, LOG video, and camera resolution are infinitely more straightforward to access. While it's nowhere close to the level of Pro mode features in the best Android camera phones or dedicated third-party camera apps, it's a good compromise for casual enthusiasts who desire more control without sacrificing simplicity.
A side effect of these changes is that the overwhelming amount of animations and floating elements makes the interface feel slower than it is, with everything taking just half a second too long. I can't say for sure if Apple will allow for toned-down animations, but as it stands, the floaty feeling of the UI wears you down pretty quickly.
Apple Intelligence everywhere
It's fair to say that Apple's initial AI push has been somewhat underwhelming. When Apple Intelligence was announced last year, well behind the competition, it distinguished itself with a strong promise of privacy. A year later, a large portion of last year's promised features are still unavailable, making it difficult to take Apple's 2025 claims entirely seriously.
Regardless, among the newly announced features is deeper integration with the entire suite of on-device communication apps. Moreover, this year, Apple is opening up access to its on-device LLM to third-party developers. That is bound to open up some very interesting and innovative use cases.
In Messages, FaceTime, and the Phone app, Live Translation now enables real-time translation of both text and audio. It functions within message threads and during calls, providing quick responses without requiring you to leave the app. I couldn't find a way to activate the feature in the beta.
Apple Intelligence still lags in effectiveness despite the interesting platter of system-wide integrations.
Similarly, Visual Intelligence now understands what's on your screen and can surface related results, links, or suggestions. For instance, if someone sends you a product image, you can ask the on-device intelligence to show you similar items from the web or pull up information about it without ever leaving the thread. Think of it as Apple's take on 'Circle to Search,' but leveraging the power of Apple's on-device LLM and ChatGPT. This is one of iOS 26's more exciting features, but once again, it is not yet available in the developer beta.
Genmoji and Image Playground are also part of this AI layer. You can now combine existing emoji, photos, and descriptive text prompts to generate custom stickers and images. While these tools feel like fun party tricks for now, their true power lies in deep system-wide integration. The results are pretty good, as you can see in the screenshot above. It's not really something I'd use very often, but better on-device image and emoji generation is effectively table stakes, so an improved experience is very welcome.
The other feature that I found exciting was deeper integration of AI into Apple's on-device scripting service. Apple Intelligence is now available to the Shortcuts app, enabling you to create smarter automations. This means you can integrate Apple's on-device LLM or even ChatGPT into a shortcut and use it to parse data before passing it on to another app. I can envision use cases like instantly splitting a tab or summarizing any on-screen content, such as an Instagram post. In fact, it took me minutes to get a shortcut up and running that automatically creates a note from a shared Instagram post after passing it through the on-device LLM. That's very cool.
A smarter battery dashboard
Talking about everyday-use features, Apple has finally overhauled the Battery section in Settings. The new interface replaces the 24-hour and 10-day views with a more digestible weekly breakdown. It then compares your average battery consumption to your daily usage, highlighting which apps are consuming power and why. Tapping into any given day reveals a split between active screen time and idle background use. It's very similar to the battery insights available to Android users and is a welcome addition.
Dig deeper, and you'll also find a new Adaptive Power Mode. Unlike the static Low Power Mode, Adaptive adjusts in real time based on how aggressively you're using your phone. It can dim the screen or scale back background tasks without requiring user input. You still get the manual 80% charge limiter and battery health metrics, but the focus here is on smarter defaults.
Settings, Keyboard, Messages, and other subtle improvements
In addition to the big hits, numerous smaller quality-of-life improvements are sprinkled throughout the OS. The keyboard feels chunkier and more precise, with better haptic feedback. There's a new Preview app that lets you perform a wide range of file-based functions, including, of course, previewing files. The Settings app has undergone minor restructuring. While not a radical shift, the app feels cleaner and faster to navigate with its revamped font sizing and kerning.
In Messages, you can now set custom backgrounds per conversation, adding a bit more personality to threads. Apple has also added a polls feature for group chats, something that arguably should have existed years ago.
The Phone app has also received some attention. It now unifies the Recents, Favorites, and Voicemails tabs into a single, streamlined interface. The most significant addition is Call Screening. It screens unknown callers by gathering context and offering options to respond or dismiss them without ever answering.
Hold Assist is another helpful tool. If you're stuck in a call queue with customer support, your iPhone can now wait on hold for you and alert you when a human finally joins the line.
iOS 26 also introduces a dedicated Apple Games app. It acts as a central hub for all things gaming on your device, effectively serving as a lightweight but genuinely useful Game Center replacement. The app pulls in your installed games, offers personalized recommendations, and allows you to see what your friends are playing. Achievements, leaderboards, and Game Center invites are now neatly tucked into this space. Apple is clearly trying to make iOS gaming feel more like a platform and less like a series of one-off downloads, but it remains to be seen if there's significant adoption.
So, is iOS 26 worth the hype?
It's hard to say definitively at this early stage. There's no doubt that Liquid Glass gives iOS a bold new face, and the updated Apple Intelligence features feel like the beginning of something genuinely useful. But right now, it's mostly potential. Many features are buggy or half-baked, and even improvements like those in the camera app require further polish.
To be fair, this is a developer beta. I'll reserve judgment until the final release rolls out later this year, but what is undeniable is that this is the most ambitious update Apple has shipped in years.