
Arts: The secret of the best software teams
If you've spent more than a few quarters in the software world, you know we move fast. Agile sprints, continuous delivery—we're always optimizing. But in the rush to automate, scale, and code our way to market dominance, we often forget a powerful source of competitive advantage: the arts. It's the A in STEAM, and it's what separates excellence from adequacy.
That's right. Painting, baking, music, writing, theater—these aren't just leisure pastimes. They are potent engines of innovation, empathy, and long-term resilience in a tech ecosystem that rewards novelty and emotional intelligence just as much as it does technical prowess.
THE SCIENCE BEHIND ART AND INNOVATION
Creativity is no longer a soft skill—it's a differentiator. A 2024 Statista analysis, drawing on survey data covering more than 11 million employees across 803 organizations globally, found that companies rank creative thinking as the skill most likely to rise in importance over the next five years. As the report highlights, 'Creative thinking is expected to grow in importance for 73% of companies by 2027, making it the most highly ranked core skill in the survey.' This underscores the growing emphasis employers place on creativity as a critical skill in the evolving workplace.
Researchers at Michigan State University found that Nobel laureates in the sciences were more likely than their peers to have artistic hobbies—including music, acting, writing, and visual arts. The arts, it turns out, don't just make us more interesting—they make us more innovative.
Apple Inc. serves as a prime example of the successful integration of arts into technology. Steve Jobs famously stated, 'It's in Apple's DNA that technology alone is not enough. It's technology married with liberal arts, married with the humanities, that yields us the results that make our hearts sing.' This philosophy led to products that are not only technologically advanced, but also aesthetically pleasing and user-friendly, demonstrating the value of incorporating artistic sensibilities into software and hardware design.
HOW TO BRING THE ARTS INTO YOUR TEAM
1. Encourage Cross-Disciplinary Collaboration
Foster an environment in which software developers work alongside designers, artists, and writers. This collaboration can lead to more holistic and innovative solutions.
2. Host Creative Events And Challenges
Organize events that challenge employees to think outside the box and apply their skills in new, creative ways. Host a maker day or run a collaborative art project as an activity at an offsite. You could even do something as simple as hosting a lunch-and-learn where someone shares their experience with art. These activities can stimulate innovation and team cohesion.
3. Support Continuous Learning In The Arts
Provide resources and opportunities for employees to engage in artistic pursuits, such as painting, music, or creative writing. This not only enhances individual creativity, but also brings fresh perspectives to problem-solving in software development.
4. Practice What You Preach
Set the standard. Actively engage your team with creative outlets and share how they have shaped your journey as a leader. Spark conversations around the role of the arts in personal and professional growth, and be sure to highlight your own experiences, celebrate colleagues who are exploring their creative sides, and elevate those stories. You never know who you might inspire or what magic it could unlock.
WHY THIS MATTERS FOR YOUR TEAM
As a leader, I've witnessed the impact of the arts firsthand. Teams that embrace creativity don't just produce better designs—they communicate better, solve problems faster, and cultivate higher morale. When engineers write poetry or dabble in improv, they start seeing multiple solutions instead of one. They pitch better. They listen better.
In the software world, we pride ourselves on being logic-driven. But the truth is, our users are emotional beings. Whether we're building an app to manage diabetes or a platform to deliver government programs at scale, the way we solve problems must be as human-centered as it is data-driven.
The arts teach us how to be human—how to understand nuance, embrace ambiguity, and connect deeply with others. Magic isn't found in spells or stars—it's found in the moment someone dares to see possibility beyond the ordinary.
