Spotify executives banned a common phrase from their weekly 3-hour meeting
Every Tuesday afternoon, Spotify's top brass — all of its vice presidents — pile into a room for a standing three-hour meeting with a key rule.
"You're not allowed to say the word 'offline' or 'later' — because that person is in the room," said Gustav Söderström, Spotify's co-president, on an episode of the "Invest Like The Best" podcast published Tuesday.
At other companies, when conversations get uncomfortable or someone hasn't delivered, people tend to punt the issue. But that's not Spotify's ethos, Söderström, who also leads tech and product, said.
Instead of circling back, people are expected to hash things out.
"It's real-time resolution — very simple in theory but incredibly powerful in practice. Most companies don't do it," he said.
Another rule: No bringing direct reports. Everyone in the room is expected to know the discussion's details.
"I'm trying to literally force the VPs to solve it themselves because I want them to be in the details. So, you're not allowed to bring anyone else in to explain your thing," Söderström said.
"You have to be on top of it enough to explain it to yourself," he added.
Without direct reports coming and going, the same group shows up each week. Over time, it becomes a tight-knit, high-trust team, Söderström said.
Spotify and Söderström did not respond to a request for comment.
Spotify's 'bets' process
The marathon Tuesday sessions are part of what Spotify calls its "bets" process — a structured way of deciding what the company builds next.
Every six months, each VP pitches a bet.
"It's very much like a startup process," Söderström said. "You don't get to use the fact that Gustav or Alex or Daniel may like you. This is like a VC meeting, you have to convince us."
After the pitches, the leaders "stack rank" all 30 to 50 of them. Teams then allocate resources based on that list and execute what makes the cut over the next six months.
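The mechanics Söderström describes amount to a simple rank-and-allocate loop: score the pitches, sort them, and fund down the list until capacity runs out. The Swift sketch below illustrates that idea under stated assumptions; the Bet type, the scores, and the headcount budget are hypothetical, not Spotify's actual tooling or numbers.

    struct Bet {
        let name: String
        let score: Double   // aggregate score from the leadership review (assumed)
        let headcount: Int  // people the pitching VP says the bet needs (assumed)
    }

    // Stack-rank bets by score, then fund them in order until capacity runs out.
    func allocate(bets: [Bet], headcountBudget: Int) -> [Bet] {
        var remaining = headcountBudget
        var funded: [Bet] = []
        for bet in bets.sorted(by: { $0.score > $1.score }) where bet.headcount <= remaining {
            funded.append(bet)
            remaining -= bet.headcount
        }
        return funded
    }

    let pitches = [
        Bet(name: "Bet A", score: 8.5, headcount: 12),
        Bet(name: "Bet B", score: 7.1, headcount: 6),
        Bet(name: "Bet C", score: 6.4, headcount: 9),
    ]
    print(allocate(bets: pitches, headcountBudget: 20).map(\.name))
    // ["Bet A", "Bet B"] -- Bet C misses the cut and waits for the next cycle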
"It's a good mix of bottom-up innovation," Söderström said. Instead of relying on the company's top executives, Spotify brings in ideas from across its leadership and "all the layers below."
"You're going to be much better at delivering something if you were the one who said, 'I can do this,' than if your boss said you can do this," Söderström said.
The company's stock is up nearly 116% in the last year.

Related Articles
Yahoo · 10 hours ago
All of the new features coming to Apple's iOS apps
At WWDC 2025 on Monday, Apple unveiled a series of new features that will launch with iOS 26 this fall across its apps. These include significant updates like Call Screening, more travel-friendly features in Wallet, and highly requested group chat features in Messages. Some of these updates use Apple Intelligence, like Live Translation. Others, like Apple bringing back tabs in the Photos app, just make devices a bit easier to use. We'll update this post as new features come out.

Call Screening lets you determine what a call is about before picking it up. When you get a call from an unknown number, Call Screening will automatically answer silently in the background. When the caller shares their name and the reason for their call, the iPhone will ring, and you can view their response before deciding whether to pick up or ignore it. Hold Assist will detect hold music and stay on the line for you until a live agent is available, and Live Translation will translate conversations on the fly. Your words will be translated as you talk, and the translation is spoken out loud via an AI voice for the call recipient. As the person you're speaking to responds in their own language, you'll hear a spoken translation of their voice.

Members of group chats in Messages can now create polls to better plan events and make quick decisions, and Apple Intelligence will detect when a poll might be useful and suggest that users start one. Group chats can also have custom backgrounds and typing indicators, and you can now request, send, and receive Apple Cash in group chats. The app also lets you screen messages from unknown senders: they will appear in a dedicated folder where you can mark the number as known, ask for more information, or delete the message, and Apple notes that these messages will remain silenced until a user accepts them. Live Translation is also coming to Messages. The feature will automatically translate text for you as you type and deliver it in your preferred language, and when the person you're texting responds, their message will be translated for you.

Apple Music users will get access to a Lyrics Translation feature to help them understand the words of their favorite songs in other languages, and a new Lyrics Pronunciation feature will display phonetic lyrics so that listeners can sing along in a different language. The app is also adding an AudioMix feature that will transition from one song to the next using time stretching and beat matching, like a DJ, to deliver continuous playback; the feature could be seen as a competitor to Spotify's AI DJ. Apple is also introducing a karaoke feature that turns your iPhone into a handheld microphone for Apple TV, amplifying your voice as you belt out your favorite songs while real-time lyrics and visuals appear on the TV screen. You can also pin your favorite music to the top of your Library in Apple Music for easier access.

Apple Maps is going to get better at understanding your daily commute. It will now use on-device intelligence to start showing you preferred routes when you're headed home or to the office, and it will notify you of delays and offer alternative routes. Apple Maps is also getting a new Visited Places feature to help you remember the places you've been: you can choose to have your iPhone detect when you're at a restaurant or shop, then view all of your Visited Places in Maps.
Apple notes that Visited Places are protected with end-to-end encryption and that it cannot access them.

Apple Wallet is introducing the ability to store a digital version of your passport, called a Digital ID. The tech giant notes that this won't be a replacement for your actual passport, but it can be used in apps that need to verify age and identity, and at supported TSA checkpoints. With Real ID implementation in effect, Digital ID will give you another way to present an ID in person during domestic travel. In addition, another feature will let you present your driver's license or state ID in Wallet to websites for age and identity verification, starting with Chime, Turo, Uber Eats, and U.S. Bank, as well as the Arizona MVD, Georgia DDS, and Maryland MVA.

Apple is also refreshing its boarding pass experience in Wallet. You'll now get real-time updates about flights with Live Activities, and you can share your flight's Live Activities with others so they can stay updated on your travels. You'll be able to access Maps from your boarding pass in Wallet to navigate to the airport, use Find My to track items and report lost baggage from the boarding pass, and view key services in an airline's app, such as seat upgrades and standby lists. Apple also announced that Wallet now uses Apple Intelligence to automatically summarize and display order tracking details from emails sent by merchants or delivery carriers.

Like Phone and Messages, FaceTime will leverage Live Translation to let people communicate in different languages. When you're talking to someone in a different language, FaceTime will display translated captions so that the two of you can understand each other.

After receiving significant user backlash over its Photos app redesign in iOS 18, Apple is bringing back a tabbed interface to Photos. In Collections, you'll find your favorites, albums, and the ability to search across your library, while the Library tab makes it easier to scroll through recent photos. The Photos app can also transform your 2D photos into 3D spatial photos.

The iPhone's Camera app will showcase the two capture modes you use most on the main screen: photo and video. To reveal additional modes, like Portrait Mode and Cinematic Mode, you can swipe left or right, and to access other settings, such as flash, timer, and aperture, you'll now swipe up from the bottom of the screen. You can also change formats with a tap, which can be helpful when switching between HD and 4K resolution or adjusting the frame rate on video.

Apple Podcasts is getting a new customized playback experience that lets you choose playback speeds from 0.5x to 3x, bringing the listening experience more in line with Spotify, which already lets you choose playback speed for podcasts. Apple Podcasts is also getting an 'Enhance Dialogue' feature that will use audio processing and machine learning to make speech clearer over background sounds.

Apple is adding a ChatGPT integration to supercharge Image Playground. You'll be able to access new styles, such as vector art or oil painting, or tap the 'Any Style' option to describe exactly what you want; Image Playground will send the description or photo to ChatGPT to create a unique image.


The Verge · 10 hours ago
Barnes & Noble's Nook app has joined
The Nook app has started linking to outside purchases on iOS, too. Kindle and Spotify in making it easier to buy e-books and audiobooks on the iPhone. A recent update has added a new 'buy on option to the iOS app, as spotted earlier by Good e-Reader. The change comes after a judge ordered Apple to lift restrictions on web links and outside payment options, a decision that a higher court upheld earlier this month.
Yahoo · 18 hours ago
Apple redesigns its operating systems with 'Liquid Glass' at WWDC 25
Apple's iPhone may not be getting a significant AI upgrade, but it is getting a fresh coat of paint, as are Apple's other operating systems. At WWDC 2025, the company announced a refreshed user interface called Liquid Glass, which features shiny, reflective, and transparent visual elements that give the software a more "glassy" look and feel.

The design refresh is inspired by Apple's VR headset, the Vision Pro. It unifies the design of the iPhone and Apple's other devices with the interface built for the spatial computing headset. The change could also hint at a future in which Apple's operating system and software extend to other surfaces besides phones, tablets, and watches, like AR glasses.

Introduced at WWDC by Alan Dye, Apple's vice president of design, the Liquid Glass interface represents the biggest visual update to iOS, the software powering the iPhone, since the move from the original skeuomorphic design to a flat design style in iOS 7. With skeuomorphism, the idea was to translate real-world objects to the touch screen, like a Notes app that looked like a yellow legal pad. Flat design upended this visual language, opting instead for simple shapes, clean lines, a minimalist user interface, and more colorful icons. Over time, iOS's flat design evolved to have more glossy and semi-translucent layers, like a Control Center that mimicked a frosted pane of glass.

As Dye explained, the Liquid Glass redesign includes the "optical qualities of glass and a fluidity that only Apple can achieve." The company says the update will bring more clarity to navigation and controls, refract light, and dynamically react to your movement. In addition, it will respond in real time to your content and your input, creating a "more lively experience," Dye said.

The Liquid Glass display is translucent and behaves like glass in the real world. The color of the screen is informed by your content and adapts between light and dark environments. In addition, alerts appear from where you tap, and context menus expand into a scannable list when you scroll and tap.

The design applies both to system experiences, like the Lock Screen, Notifications, and Control Center, and to app icons. The company says the new icons will look like they've been crafted from multiple layers of liquid glass and will come in light mode, dark mode, and a new clear mode. On the Lock Screen, the time appears in a glassy San Francisco typeface that dynamically adapts its weight, width, and height depending on the image on the screen. Plus, the iPhone can change your 2D photos into spatial scenes with 3D effects, which pairs nicely with the glass-like user interface.

When streaming from Apple Music, the player controls are also designed with Liquid Glass and show new animations provided by artists that enhance the playing experience, a feature Spotify offers today with its looping videos. Liquid Glass can be found in other elements too, like the FaceTime and Safari tab bars and their various controls.

Liquid Glass isn't only built for the iPhone's iOS 26. It's coming to all of Apple's operating systems, including iPadOS 26, macOS Tahoe 26, watchOS 26, and tvOS 26. Developers will also be able to build apps using new Liquid Glass materials and new APIs via SwiftUI, UIKit, and AppKit.
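The closing point about developer support suggests the new material will be exposed directly through Apple's UI frameworks. The SwiftUI sketch below shows what adoption might look like; the glassEffect() modifier name is an assumption inferred from Apple's WWDC 25 announcement, so verify it against the iOS 26 SDK documentation before relying on it.

    import SwiftUI

    // Minimal sketch: a small badge view that opts into the Liquid Glass material.
    // glassEffect() is an assumed iOS 26 API name, not verified against the SDK.
    struct NowPlayingBadge: View {
        var body: some View {
            Label("Now Playing", systemImage: "waveform")
                .padding()
                .glassEffect()  // assumed Liquid Glass material modifier
        }
    }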