Phonon interference hits new heights, promising leaps in quantum and energy tech
Researchers at Rice University have observed phonon interference two orders of magnitude stronger than any previously reported, opening new possibilities for quantum sensing and computing technologies.
This phenomenon, known as Fano resonance, occurs when two phonon modes with different frequency distributions, one sharp and one broad, interfere with each other, producing distinctive patterns of amplification or cancellation, similar to overlapping ripples on a pond.
'While this phenomenon is well-studied for particles like electrons and photons, interference between phonons has been much less explored,' said Kunyan Zhang, a former postdoctoral researcher at Rice and first author on the study.
'That is a missed opportunity, since phonons can maintain their wave behavior for a long time, making them promising for stable, high-performance devices.'
Phonons in quantum spotlight
By proving phonons can be harnessed as effectively as electrons or light, this discovery opens the door to a new generation of phonon-based technologies.
The researchers achieved their results by placing a two-dimensional (2D) metal layer atop a silicon carbide substrate using a method called confinement heteroepitaxy.
They intercalated just a few layers of silver atoms between graphene and silicon carbide, creating a tightly bound interface with unique quantum properties. 'The 2D metal triggers and strengthens the interference between different vibrational modes in silicon carbide, reaching record levels,' Zhang explained.
The team used Raman spectroscopy, a technique that measures vibrational modes, to study how phonons interfere. The resulting spectra displayed sharply asymmetric shapes and, in some cases, full dips forming antiresonance patterns, signatures of intense interference.
The phenomenon was highly sensitive to the precise nature of the silicon carbide surface, with three different surface terminations each producing distinct Raman line shapes.
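The asymmetric peaks and antiresonance dips described above are the classic signatures of a Fano line shape. As a rough illustration, the short Python sketch below evaluates the standard textbook Fano formula, I(ε) = (q + ε)² / (1 + ε²), where ε is the detuning from the phonon frequency in units of the half-width and q is the asymmetry parameter. The frequencies and q values are arbitrary placeholders, and this generic model is not the specific analysis used in the study.

```python
import numpy as np

def fano_lineshape(omega, omega0, gamma, q):
    """Textbook Fano profile I = (q + eps)^2 / (1 + eps^2),
    where eps is the detuning from omega0 in units of the half-width."""
    eps = (omega - omega0) / (gamma / 2.0)
    return (q + eps) ** 2 / (1.0 + eps ** 2)

# Illustrative numbers only (not taken from the study):
# a phonon mode near 780 cm^-1 with a 10 cm^-1 linewidth.
omega = np.linspace(700.0, 860.0, 1601)   # Raman shift axis, cm^-1
omega0, gamma = 780.0, 10.0

for q in (5.0, 1.0, 0.0):
    profile = fano_lineshape(omega, omega0, gamma, q)
    # Large |q| gives a nearly symmetric peak, q ~ 1 a strongly
    # asymmetric shape, and q = 0 a pure dip (antiresonance) at omega0.
    print(f"q={q}: max {profile.max():.2f} at {omega[profile.argmax()]:.1f} cm^-1, "
          f"min {profile.min():.3f} at {omega[profile.argmin()]:.1f} cm^-1")
```

Sweeping q from large values toward zero walks through the same progression seen in the reported spectra: a nearly symmetric peak, then a sharply asymmetric shape, then a full antiresonance dip.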
Vibrational signals unlock secrets
Remarkably, the presence of even a single dye molecule on the surface caused dramatic changes in the spectral line shape. 'This interference is so sensitive that it can detect the presence of a single molecule,' Zhang noted.
'It enables label-free single-molecule detection with a simple and scalable setup. Our results open up a new path for using phonons in quantum sensing and next-generation molecular detection.'
The study also confirmed that the interference arises purely from phonon interactions rather than electrons, marking a rare example of phonon-only quantum interference.
The effect appears only in the 2D metal/silicon carbide system studied and is absent in bulk metals, because the atomically thin metal layer creates unique transition pathways and surface configurations.
Looking ahead, researchers are exploring other 2D metals, like gallium or indium, to replicate and customize this effect.
'Compared to conventional sensors, our method offers high sensitivity without the need for special chemical labels or complicated device setup,' said Shengxi Huang, associate professor at Rice and corresponding author on the study.
'This phonon-based approach not only advances molecular sensing but also opens up exciting possibilities in energy harvesting, thermal management and quantum technologies, where controlling vibrations is key.'
Supported by the National Science Foundation, Air Force Office of Scientific Research, Welch Foundation, and University of North Texas, the study has been published in the journal Science Advances.