Scott Galloway has some choice metaphors to describe AI's impact on workers: 'I think of it as corporate Ozempic'
Scott Galloway, a New York University Stern School of Business professor and host of the podcast The Prof G Pod, shared some of the metaphors he's come up with to describe AI's impact on the workplace in a discussion with Microsoft Chief Scientist Jaime Teevan and Greg Shove, the CEO of Section, an AI education company.
Here are some of Galloway's best metaphors:
'Corporate Ozempic'
The leadership and boards of many companies are using AI to cut costs. In this scenario, Galloway said he thinks of AI as having some of the same properties as GLP-1s.
I think of AI as "corporate Ozempic." and that is, Ozempic goes into your brain and kind of switches off a switch that says 'you don't need more calories' even though your instincts are telling you to consume as many calories as possible if you're fortunate enough to have salty, or sugary, or fatty food in front of you.
And typically when you're a CEO, and you're growing, the signal is 'I need more calories. I need more people.' Musk, to a certain extent, by offering a minimum viable product with 20% of the staff of Twitter, and really Meta announcing what was the seminal earnings call where they said, 'We've laid off 20% of our staff, and meanwhile maintained growth of 23%, sending earnings up 70%,' everybody started thinking, 'I want the great taste of growth without the calories of more people.' And AI is the Ozempic.
'The East German Stasi with WiFi'
The "dark side" of AI lies in how easily it can identify low performers, Galloway said, comparing it to East Germany's Cold War-era secret police, notorious for their widespread surveillance.
Now, I can upload all the email and Slack interactions I have with an employee and say, 'Give me an estimate of how many hours a week this person is actually working.' And it'll give me what I believe, maybe incorrectly, but I believe, and that's all that's important, my perception of how many hours a week this person is actually working.
'Warrior-making machine'
For the top 10% of the US labor force, however, Galloway thinks AI is a boon.
If you're really good, this is really good news for you. America has essentially been optimized for the top 10%. Essentially companies, essentially America over the last 50 years, have transitioned to an economy where you use the bottom 90% as a nutrition bag to make the top 10% wealthier and wealthier. And AI kind of speedballs that. AI is gonna take the top 10%, who work really hard and are really creative and know how to leverage these tools, and just make them fu----g warriors. I mean, they're just going to be monsters.