Super Micro Computer Inc (SMCI) Q3 2025 Earnings Call Highlights: Navigating Growth Amid Challenges

Yahoo · 07-05-2025
Revenue: Fiscal Q3 net revenue totaled $4.6 billion, up 19% year over year, but down 19% quarter over quarter.
Non-GAAP EPS: $0.31 per share, compared to $0.66 last year.
Gross Margin: Non-GAAP gross margin was 9.7%, down from 11.9% in Q2.
Operating Expenses: Non-GAAP operating expenses were $216 million, a decrease of 5% quarter over quarter and an increase of 30% year over year.
Operating Margin: Non-GAAP operating margin was 5%, compared to 7.9% in Q2.
Cash Flow from Operations: $627 million generated in Q3.
Free Cash Flow: $594 million during the quarter (see the consistency check after this summary).
Inventory: Closing inventory was $3.9 billion, up 7.6% quarter over quarter.
Net Cash Position: $44 million, compared to a negative net cash position of $479 million last quarter.
Q4 Revenue Guidance: Expected net sales in the range of $5.6 billion to $6.4 billion.
Q4 Non-GAAP EPS Guidance: $0.40 to $0.50 per share.
CapEx: $33 million for Q3, with Q4 expected to be in the range of $45 million to $55 million.
Release Date: May 06, 2025
For the complete transcript of the earnings call, please refer to the full earnings call transcript.
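As a quick consistency check, the cash-flow and margin figures above hang together under the standard definitions: free cash flow is operating cash flow minus capital expenditures, and non-GAAP operating margin is gross profit less operating expenses, divided by revenue. The short Python sketch below uses only the numbers reported in the summary; small differences are rounding.

```python
# Consistency check on the reported fiscal Q3 figures (values taken from the
# summary above; the relationships are standard definitions, not a
# company-provided reconciliation).

revenue = 4_600          # $M, fiscal Q3 net revenue
gross_margin = 0.097     # non-GAAP gross margin
opex = 216               # $M, non-GAAP operating expenses
op_cash_flow = 627       # $M, cash generated from operations
capex = 33               # $M, capital expenditures

free_cash_flow = op_cash_flow - capex
print(f"Free cash flow: ${free_cash_flow}M")   # -> $594M, matching the report

gross_profit = revenue * gross_margin
operating_income = gross_profit - opex
print(f"Non-GAAP operating margin: {operating_income / revenue:.1%}")  # -> ~5.0%
```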
Positive Points
Super Micro Computer Inc (NASDAQ:SMCI) reported fiscal Q3 2025 revenues of $4.6 billion, up 19% year over year.
The company achieved volume shipments of new AI platforms, indicating strong demand and market leadership.
Super Micro Computer Inc (NASDAQ:SMCI) is expanding its global operations, including new facilities in Malaysia, Taiwan, and Europe.
The company is launching its Data Center Building Block Solution (DCBBS), which promises to reduce power consumption and optimize space.
Super Micro Computer Inc (NASDAQ:SMCI) maintains a strong cash position with $2.54 billion in cash and a net cash position of $44 million.
Negative Points
Fiscal Q3 net revenue of $4.6 billion was lower than the original forecast due to delayed customer commitments.
Non-GAAP EPS for fiscal Q3 was $0.31 per share, down from $0.66 last year, impacted by inventory write-downs.
The company's gross margin decreased to 9.7%, down from 11.9% in the previous quarter, due to higher inventory reserves and lower volume.
Super Micro Computer Inc (NASDAQ:SMCI) faces macroeconomic challenges and tariff impacts, creating uncertainty in forecasting.
The company experienced a 19% quarter-over-quarter revenue decline, attributed to delayed platform decisions by customers.
Q & A Highlights
Q: Are customers pulling back orders due to macroeconomic conditions, and how does this affect the June quarter outlook? A: Charles Liang, CEO, stated that despite macroeconomic uncertainties, they see strong orders and expect a robust June quarter. The September quarter is anticipated to be even stronger with the full production of new products like Blackwell.

Related Articles

What investors see in the sale of AI chips to China

Axios

Nvidia and AMD can sell their AI chips to China for the low price of 15% of their revenue, paid out to the U.S. government. Investors are unfazed.

Why it matters: Shareholders are focusing on the revenue opportunities that come with more access to Beijing, not on the unprecedented involvement of the Trump administration in Nvidia's business dealings.

What they're saying: "There's way more upside," Daniel Newman, principal analyst and CEO of The Futurum Group, tells Axios.

Catch up quick: The Trump administration previously backed export controls on Nvidia's H20 chips, which are "orders of magnitude" less powerful than Nvidia's Blackwell chips, Newman says. A month ago, the administration signaled that it was shifting course on these controls, but did not issue the licenses required for sales to be possible. That appeared to change after Nvidia CEO Jensen Huang met with President Trump. Nvidia walked away with promises of licenses so long as the chip giant cut the U.S. government a check for 15% of its China revenue.

Zoom in: Nvidia stock is up nearly 0.5% since the news broke Monday, with investors and analysts bullish on the deal. The lifting of the export controls could lead to a $15 billion revenue windfall for Nvidia. Both Nvidia and AMD have pricing power, given the strength of demand for AI chips in China, according to a note from Bank of America. That means the 15% expense could be passed on to Chinese customers.

Between the lines: While the deal could lead to billions of dollars in additional revenue for the U.S. government, it's not just about the money. It's also about access to rare earth magnets, Newman says. The U.S. has powerful AI chips that China wants; China has rare earth metals the U.S. wants. When the administration first changed course on export controls in July, Commerce Secretary Howard Lutnick told CNBC that selling the "fourth best" AI chip to China wasn't material. Lutnick also said the export control rollback was tied to a rare earths deal, though those details have not fully materialized.

Yes, but: Export controls are typically put in place for a reason, in this case national security concerns. The Financial Times, which first reported the 15% revenue split, quoted an anonymous source who pointed to those concerns: "What's next — letting Lockheed Martin sell F-35s to China for a 15% commission?"

Situational awareness: Beijing is urging local companies to avoid buying chips from American companies because of its own security concerns. Newman says that may be political theater – an effort for China to keep the upper hand in ongoing negotiations. Chinese companies will likely still want access to the best possible chips.

Be smart: As recently as January of this year, investors feared China was outpacing the U.S. in the AI arms race, given the reported success of DeepSeek.

What to Know About Trump's Nvidia Deal and China's Response

Time Magazine

The world's most valuable company is now at the center of President Donald Trump's trade war with China. Trump said Monday that he has cut a deal with chipmaker Nvidia, allowing it to sell certain artificial intelligence chips to China in exchange for a cut of the revenue, which would go to the U.S. government. Trump said he also negotiated a similar deal with chipmaker Advanced Micro Devices (AMD).

The deal is a marked departure from an effort by the U.S. to restrict China's access to advanced semiconductors over concerns that they would be used to advance the country's military technology. Washington began restricting exports of some semiconductors to China in 2022, although Nvidia was able to export a specially made-for-China chip, the H20, which is deliberately slowed down. In April, the Trump Administration announced that it would require a license to export the H20 chip, abruptly curbing $2.5 billion of H20 revenue from China in the fiscal quarter ending April 27. The announcement spurred months of lobbying by Nvidia CEO Jensen Huang, who committed a $500 billion investment from Nvidia to make AI servers in the U.S. Last month, Nvidia announced that it would resume sales of the H20 to China after Trump and Huang met in the Oval Office, and the Commerce Department began licensing the chips for export last week.

But the unusual deal is not just a return to the previous status quo—rather, analysts and lawmakers warn that it could open the doors to a 'pay-to-play' trade policy. Here's what to know.

What the deal means for Nvidia

The deal, which was first reported by the Financial Times, involves allowing exports of certain AI processors to China in exchange for Nvidia and AMD paying 15% of the proceeds to the U.S. government. 'I said, 'Listen, I want 20% if I'm going to approve this for you,'' Trump said at a White House press conference on Monday, adding that Huang negotiated that number down to 15%. 'For our country, for the U.S., I don't want it myself,' Trump clarified about the kickback.

Trump said the deal was limited to Nvidia's H20 chips, which he said are 'essentially old,' and a similarly slowed-down MI308 chip for AMD, but he indicated that a similar deal may be in the works for more advanced chips. 'The Blackwell is super-duper advanced. I wouldn't make a deal with that,' Trump said, referring to Nvidia's most advanced chip for AI. 'Although, it's possible I'd make a deal on a somewhat enhanced in a negative way Blackwell. In other words, take 30% to 50% off of it, but that's the latest and the greatest in the world. Nobody has it. They won't have it for five years.'

Nvidia said in a statement to media outlets, 'We follow rules the U.S. government sets for our participation in worldwide markets.' It added, 'While we haven't shipped H20 to China for months, we hope export control rules will let America compete in China and worldwide.'

China urges firms not to use the less-advanced chips

The Chinese government has reportedly discouraged local firms from using Nvidia's H20 chips. The guidance did not outright ban use of the chips, but it said that they should not be used in particular for national security-related work. Some Chinese companies have reportedly made plans to reduce their orders of Nvidia chips after the government asked firms to justify why they buy Nvidia's H20 over domestic alternatives, such as chips from Huawei, a Chinese firm on which the U.S. has imposed strict export controls.
But analysts have said that the American chips, even in their slowed-down form, are world-class and will continue to be sought after by companies. Although Trump characterized Nvidia's H20 chips as 'obsolete' and said that China 'already has it in a different form,' he also noted it 'still has a market.' Beijing in July raised security concerns with Nvidia, while Chinese state media reports in recent weeks have also highlighted these concerns. 'The H20 is not a military product or for government infrastructure,' Nvidia told Bloomberg.

Nvidia export controls have also been used as a 'negotiating chip' in U.S.-China trade talks, Treasury Secretary Scott Bessent said in July. The deadline for the U.S. and China to reach a trade deal was Aug. 12, but the two countries extended their truce by 90 days in order to continue negotiations. Should Chinese firms purchase fewer chips from Nvidia, Trump's much-touted deal—and any further attempts to use chips as a bargaining chip—may hold less weight.

Deal raises legal concerns

Nvidia's H20 chips are believed to have been used in developing Chinese company DeepSeek's open-source AI model, which caused a panic in the U.S. over whether China's AI technology was more advanced than previously believed. It prompted former President Joe Biden to further curb exports of AI chips to China, before the Trump Administration placed an effective ban on H20 exports in April. The loosening of those restrictions in exchange for what is essentially a sales commission has drawn criticism over both national security risks and legality concerns.

The White House said the details of the deal are still being worked out. 'The legality of it, the mechanics of it, is still being ironed out by the Department of Commerce, and I would defer you to them for any further details on how it will actually be implemented,' White House Press Secretary Karoline Leavitt said Tuesday, adding that the deal could be expanded to include other companies.

'I am concerned by reports that the U.S. government will be taking a cut of the proceeds from the sale of advanced H20 and equivalent chips to China,' Rep. John Moolenaar (R, Mich.), the head of the House Select Committee on the Chinese Communist Party, told the Financial Times. 'By putting a price on our security concerns, we signal to China and our allies that American national security principles are negotiable for the right fee,' said Rep. Raja Krishnamoorthi (D, Ill.), the committee's ranking member.

Bernstein analyst Stacy Rasgon said the deal sets a bad 'precedent' for other companies. Rasgon told Bloomberg on Monday that the 15% cut makes financial sense for the chipmakers and is better for the U.S. than losing its competitive edge in the Chinese market, but also that 'it feels like a very slippery slope.'

'It raises concerns, certainly for many national security minded folks, of—are we now selling export control licenses?' Owen Tedford, a senior research analyst at Beacon Policy Advisors, told the Hill. 'Is there a way that Nvidia will be able to buy licenses to sell more advanced chips than they're currently able to?' 'It raises questions about how—and I think this gets to some of more general concerns with the Trump Administration—just, policy feels like it's for sale in some ways, like policy outcomes,' Tedford added. 'If companies are big enough or strong enough, they can basically buy the policy that they want from the Trump Administration.'

Are Your Smart Devices Really Spying on You?

New York Times

CAIRA: Okay. Now we know what data they're collecting. Can you tell us who is this amorphous they? Who is actually collecting the data? JON: It's a guy named Gary. CHRISTINE: I'm Christine Cyr Clisset. CAIRA: I'm Caira Blackwell. ROSIE: I'm Rosie Guerin and you're listening to The Wirecutter Show. ROSIE: Hi friends. CHRISTINE: Hey there. CAIRA: Hi. ROSIE: So we are talking about data privacy on the show today and I've actually been thinking and kind of worrying a lot about it recently and maybe increasing since having kids. Okay, let me explain. I was in the car the other day and my wife texted me, sent me a text message. And so being the good safe driver I am, I had Siri play it. And my wife must have had the new Siri AI thing enabled because not only did it transcribe the words she wanted to text to me, it started transcribing or describing the photo she sent, which happened to be of my two children. And she's describing their features and their faces through the, I was like, cancel. Cancel. She being the AI. She being Siri. Yes. Was she accurate? Not really, but I assume they're learning, right? And so that really gave me pause, and it started me thinking about the age we live in with these two poles: on the one hand, all we have been able to achieve from technology, from our lives being very connected, and on the other hand, relinquishing control of our data and therefore our privacy. CAIRA: Yeah. CHRISTINE: Yeah, I mean, it's happening around us all the time. I think we all kind of understand, or most of us understand, that when we're using things on the internet, we're kind of relinquishing a little bit of our information every time we do that. We actually published recently a bunch of articles around data privacy and security, and one of them we're gonna talk about today, we're going to bring on Jon Chase, who is our supervising editor of Smart Home Coverage at Wirecutter. And his team did this pretty intense deep dive into looking at... the data that these different devices are collecting about us. It's not surprising if you have a quote unquote smart device like a smart speaker that it's collecting data on you. I know that some people choose not to get these devices because of that. But what this team found is that a lot of devices in our homes that we may not think of as smart devices are actually collecting quite a lot of data and it's pretty up in the air about what's actually happening with this data. CAIRA: So good. I'm so glad that we gave up our freedoms for this convenience. Well, but like, what's the re- ROSIE: You know, it's tough because what's the recourse? Going and living off the grid, which actually kind of sounds really nice. Learn how to farm. But we're in this, I think. CAIRA: Okay, so after the break, we're going to talk with Jon about which devices are spying on you, what they're looking for, and how to protect your data. Be right back. CAIRA: Welcome back. With us now is Jon Chase. He's a supervising editor on the tech team who covers smart home devices for Wirecutter. He's also been writing about tech for over two decades. A fun fact about Jon is that in addition to his impressive career in journalism, he's also worked as a TV writer for several game shows, including my favorite Cash Cab and Who Wants to Be a Millionaire. CHRISTINE: Jon, welcome to the show. JON: Happy to be here. ROSIE: That's incredible. And also, very much tracks. CHRISTINE: Yeah. CAIRA: Yeah. I want to know- CHRISTINE: You better say something funny on this episode. CAIRA: Oh. 
I just want to know if you got to ride in the Cash Cab. JON: We followed the Cash Cab, keeping up and making sure we could manage it. ROSIE: Ah. CHRISTINE: All right. CAIRA: Okay. Well, Jon, right off the bat, I want to know which of my smart devices are spying on me because I know it has to be more than just Alexa and Amazon. JON: First off, we should tell anyone who's listening to this to mute their smart speakers because they're going to be triggered in more ways than one. I would say spy is a very charged term. CAIRA: Okay. JON: You might just say they are paying attention, close attention. Yeah. We like to say that data is the fuel of the smartphone. These devices, all of the amazing things they can do, they fully depend on creating and collecting data, and synthesizing it, and sending it to the cloud. That's just table stakes. You can't really get around that. I don't think anyone would be surprised that smart home devices are collecting data. That's what they do. But I definitely think there's a few that we encountered that are doing a whole lot more than we suspected. Smart TVs were pretty egregious. CHRISTINE: I often think of a smart home device as a smart speaker, something that has smart in the title, a smart thermostat or something. But what are the devices we're really talking about in the home? JON: Yeah. A lot of times, there's devices that are just Wi-Fi connected, which aren't necessarily smart. We've always defined it as any device that has a control app, has the ability to be accessed remotely, and connects to the internet. Usually, we bias towards ones that can be controlled using a third party platform, which many people are familiar with if you have Google Home, Amazon Alexa, Apple Home. That's the broad definition. But you've got your smart speakers, you've got your smart light bulbs. Thermostats, like your Nest and Ecobee, things like that. There's also a lot of kitchen appliances have for a long time ... Amazon put out an Alexa-powered microwave. CAIRA: Why? JON: And an Alexa-powered clock. There's a lot of stuff out there that may not even be labeled smart, but has the ability to be connected. ROSIE: Until I have a robot that can take my food and put it into the microwave, I don't understand the purpose of being able to be far away and having Alexa turn on my microwave. JON: First of all, that's what kids are for. You tell the kids to put the stuff in the microwave. ROSIE: Fair point. JON: I get that. But I think one of the things, I sometimes feel like I'm a smart home apologist .. but really it's problem solving. There's a lot of people ... One of the things we've really learned in the past few years is with the accessibility community, people who have mobility issues, things like that, a bulb that goes on and off at a set time for someone who has the inability to turn off light bulbs is a godsend. CHRISTINE: Yeah. Turn fans on when it's hot. JON: Yeah. CHRISTINE: Turn your AC on. JON: Change the temperature, yeah. CHRISTINE: Yeah, there's definitely some real, real use cases that benefit. ROSIE: It's life-changing, yeah. CHRISTINE: Yeah, it can be really life-transforming. JON: Then, back to what I said before, all of that depends on data. CHRISTINE: I know a bunch of people who refuse to get a smart speaker because they are concerned with these devices listening in on them or collecting their data. You just listed many devices, many other types of devices, things that people might not think of as smart devices. 
What exactly is the type of data that they're collecting? JON: There's personal data type stuff, which I'll talk about in a second, and then there's also just functional data type stuff. I'll give the example of a thermostat. A smart thermostat, it's checking the temperature nearby. Some of them have a motion sensor. Some of them also have a presence sensor, and they might even have other sensors that are remote. And it also connects to the internet, and it learns over time if the temperature is X degrees and it is this inside, and the weather is this, it'll take this amount of time for your heat to fully heat or cool your house. That's functional data. Then on top of that though, it may have your address. If you pair it with other devices, then those devices- CAIRA: They talk to each other? JON: They talk to each other. And if you connect it to a third party service, say like Amazon Alexa or Google Home, or something like that, then things start to get, I will use the technical term, hinky because it becomes really, really confusing. I think the issue we're all going to be talking about here is just how no mere mortal has the capacity to really gauge what's going on. CHRISTINE: Right. None of us really know how much of our personal or situational data is out there at this point. JON: That's right. The smart speakers is probably the most obvious example of people getting skeeved out, another technical term. There's some truth, and then there's also a lot of anecdotal, weird stuff that happens and I think it colors the whole experience. Blankly, I'll just state we spoke with all these companies, we've tested these things for years. I've spoken with, there's this great researcher that works at Georgia Tech who has tested all these devices. They give you a signal when they're listening and they're always listening. I'll just use Alexa as an example because it's the foremost example. An Alexa is always listening, but it's literally listening for a particular wave form, a vocal code, and that's what's called the wake word. You can change it, it's Alexa by default, but you can say "computer" or you can say "Echo," that kind of stuff. Then eventually, it'll hear that tone and it perks up. And it signals that it's perked up. There's a light, so you know visually that it's happened. If it's a wrong thing, it'll just fade and delete the recording. If it's correct and you actually are communicating with it, it will interact with you and that kind of stuff. That does go to the cloud. Now, depending on your settings, we can talk about what that actually means. You can opt to have that recording saved or not. Depending on the platform, meaning the device, you can also decide whether it records or not. For instance, Google speakers don't actually save recordings by default, which was a great surprise to us. CAIRA: I feel like a lot of people might think that an Alexa speaker is listening literally all the time. You hear people say, "I was talking with my friend about a toothbrush that I really wanted, and then next thing you know, I get a toothbrush ad, so it must be listening to me all the time." It sounds like that's not actually happening. JON: Yeah. There is a phenomenon of what you're talking about. I don't know if there's a name for it, but there is a thing where people are like, "Oh, we were talking about Aruba," or a baseball bat, or some kind of thing like that, and then it shows up in your feed. 
My understanding, I spoke with a whole bunch of people about this, your device, almost all of us could walk around with a smartphone with us. When you connect to the internet, you have what's called an IP address which is specific to you. If you are connected to Wi-Fi, and someone else is connected to Wi-Fi, and someone else is connected to Wi-Fi, you become associated. Then you might also travel to other places and you might also search for certain things. Suddenly, it just all becomes algorithmic. There's basically an association, the data profiles of people. These live on your phones, they live in your laptops. These devices collect information on you, on your search habits, your location, things like that. They allegedly get anonymized, but ads are served to you based on those things. If there's an affinity, if you are around other people, they might be like, "Oh, okay, we think she's a white woman who's 21 obviously." CHRISTINE: Obviously. JON: Yes. She traveled to Florida, and also does this and does this, and they might serve you the same ads. It might be that any one of you might have searched for something recently and that ends up being the trigger. CHRISTINE: This happened to me recently because Caira showed me this swimming pool she went to and it popped up on my Instagram feed. CAIRA: Really? CHRISTINE: I was like, "I've never searched this." JON: But I'll bet you ... You searched it, right? CAIRA: Probably to show Christine. JON: But you searched it. CAIRA: Yeah. JON: And you're affiliated with her because of your address, so therefore there's a pretty good, better than not chance that this would interest her. CHRISTINE: Yeah. It wasn't because we vocally talked about it- JON: Nothing to do with that. CHRISTINE: ... together in this studio. JON: Yes. CHRISTINE: It was because she had searched for it on her phone and our phones were in the same room. JON: That's right. CHRISTINE: That's so creepy. JON: But also, you guys probably gallivant and you're associated in multiple places. That gives you, "Oh, well, she likes nice restaurants." You go clubbing a lot I know. All that stuff, it adds up to this profile. CAIRA: The data profile. CHRISTINE: Yeah. JON: The data profile, that's where you start getting really icky. ROSIE: Jon, you talked about the functional data that's being collected. What other kind of data is being collected from these smart devices? JON: Sure. As we talked about, when you're setting up a device, you use an app. The app will probably ask you, depending what the device is, it might say, "What's your address? Because I need to know that for geolocation," which is- CHRISTINE: Where you are on the Earth. JON: Where you are on the Earth and that's used for a lot of really cool functions. You can have things turn on and off when you leave and come home from your house, or something like that. It might be email, phone number. You might have billing associated with it, you might have a credit card. But there's also stuff like there's your IP address. That is not necessarily personal, but it's one of those things that once it becomes associated with you, all of the IP addresses that you travel around the world and connect to all become part of a profile, and they make you more and more findable. You also might have health data. CHRISTINE: I was going to ask you about health data. I have an Apple Watch, I work out with my Apple Watch. I've started using Apple Fitness, and the fitness app and my Apple watch are integrated. 
When you're using a device like that, then presumably the data that is going into your profile includes things like your weight, and other metrics that are in there, right? JON: Yes. I will say Apple actually has really good policies around that. But if you have a non-Apple thing like a headphone- CHRISTINE: Yeah. I sometimes use my Soundcore headphones with my Apple Watch. JON: Right. Those can, there are no rules really around health data. Any app ... A lot of headphones now have cardiac monitors, and sleep monitors, all this kind of stuff. They actually can access your health data and they have willy-nilly access to it. They can do whatever they want with it. HIPAA is… it doesn't actually protect people nearly as much as they think it does. CAIRA: Okay. Now we know what data they're collecting. Can you tell us who is this amorphous they? Who is actually collecting the data? JON: It's a guy named Gary. ROSIE: C'mon, Gary. JON: Yeah. The companies themselves. One of the things we learned is large companies actually tend to be much more trustworthy than small companies. Not out of any sense of malice, but because a lot of times, smaller companies simply do not have the technical chops to do the security testing. CAIRA: When things fall through the cracks, who is it going to? Who's buying it? JON: Oh, yeah. Great question. There's companies that own it themselves. People talk about Amazon. Amazon is a giant sales company. They want data about you to put products your way and they have millions of partners. They state unequivocally, "We do not sell data that is collected," but they're still using it. But also, Google. Google is an advertising company. They also help make ads for other companies. I spoke with someone who had been a higher level product specialist about this stuff. He was like, "Yeah, we don't give the data that we collect to those other companies, but they're putting your data to work." And then there's data brokers, which are these companies that scour public records, probably work with credit card companies. Credit card companies monetize your data. Data brokers find ways to get all this data outside of that. There's the companies themselves with their data, and then there's external companies that are just finding inroads and trying to monetize that. CHRISTINE: How are the data brokers ... You've got all this smart home data that has been collected about you. The companies that own the devices or that have made the devices have this data. How are these data brokers accessing your data? JON: I will say I can't state unequivocally that every company works this way. But we spoke with some people at DuckDuckGo, which is a privacy company. They did a sample of the top huge chunk of Android apps that are downloadable. It was in the high 90s: that was the percentage of them that had Google Analytics in those apps. That's because Google, Facebook, and other companies like that, they help small companies make apps easier. In doing so, they have their software in there. Even though you may not have an association with Google, you may be using an app that has Google Analytics in it, so they would get some of your data. The idea that even though these companies say your data is anonymous, outside companies, data brokers, they can ... It's a statistical ... CAIRA: They can just piece it all together. JON: They can piece it all together. "It's anonymous. Oh, no, we protected it. We have anonymized data." Well, yes. 
But once they have an association from here, from here, from here, from here, data brokers specialize in unearthing this information, selling it to the highest bidder, and it gets used for useful purposes. But at the same time, people have been stalked, law enforcement uses these, it's used at the border, ICE uses it. Insurance companies use it. They may or may not decide to cover you. The greater overarching concern here is you don't know what data is collected on you, you don't know how true it is, and you have no access to it. Those are the real problems and there's next to no regulation around any of this. We're just swimming around in this gray area and all of this is happening around us constantly. CHRISTINE: I got to know, why is this legal? Why do we live in a culture and a society where this can happen? ROSIE: Why is this okay? JON: Because cha-ching! (singing) Money, money, money, money. Money! I would say the history of technology and innovation when it comes to the government is the history of the government running frantically behind with a briefcase that has papers flying out of it and a floppy hat. Like, "Wait! We're trying to catch up." There's vested interests that are like, "Oh, we're going to make a lot of money," and people want to protect that. There's also just the wheels of justice move slowly kind of thing. You don't want the government to walk in with a hammer and just slam down and stop innovation. But at the same time, we've struggled to find a system that keeps tabs. The innovations happen so quickly. We're seeing this with AI especially. CAIRA: All right. Well, I don't want to hear any more about it. CHRISTINE: We're done here, we'll see you later. CAIRA: That's a wrap. CHRISTINE: I'm going to go into my bunker. JON: Totally, it's disturbing, it's annoying, et cetera. But the better news is on a state-by-state level, there are legislation. California, always at the fore with this stuff. They have a legislation that has come out where data brokers actually have to register. It isn't hopeless in so much as that states are taking on some of the burden here. There's also a program called the Cyber Trust Mark Program, which is supposed to basically be like a food label type thing, nutrition label, that will be on smart devices in particular. That's actually in the works and is hopeful they will basically say, "It does this, it does this, it meets these standards." Then there's also the example of Europe where they have passed I think it's GDPR. You probably had all those popups every time you access a website that are like, "Do you submit to these cookies," and all that kind of ... That's essentially from that. It's a step forward, it basically holds these companies accountable and allows people to opt out of data collection policies. CAIRA: Okay. All right, to quickly recap, it sounds like it's more than just your smart speaker that's kind of listening to you. It's anything that can be connected to the internet that you probably are using for convenience has the ability to maybe collect some of your data. And there isn't really much oversight on how much they can collect, just from a government level, and it sounds like there isn't much incentive for that to change, but things are maybe moving in the right direction on a state-by-state basis. Also, the way that you are describing these companies collecting data reminds me of when you go to a bookstore and they cover the bookcases in a brown paper bag essentially and they write a description on the bag. 
They're like, "If this sounds like it's for you, you should buy this book." Companies are doing that to our data and us, "anonymizing" us and selling it to the highest bidder essentially, right? JON: Yeah, I think that's right. CHRISTINE: All right. We're going to take a quick break, and when we're back we're going to get into more details about how specific devices you may own are collecting your data. We're also going to talk about the ways that you can keep these devices from collecting all this data, some safety measures you can take. We'll be right back. ROSIE: Welcome back. Jon, in your article you highlighted three main devices that are perhaps most culpable here. Smart speakers, smart TVs, security cameras, which include video doorbells. Can you briefly explain how each of these devices is collecting your data? Let's start with smart speakers. JON: Sure. With a smart speaker, all of them, there's a setup thing and it's related to an app. You're incorporating these devices into what we call a platform, it's essentially an app. They have your basic stuff. Your name, your home. They might have access to your contacts. They might have access to, depending on the device, it might have access to your photos, if you have one that has a screen on it. If you have sync them up with other devices, which is very common, they They may have access to whatever those devices are collecting. Then on top of that, with a smart speaker, you're talking to it, you're asking it questions. You are- ROSIE: Teaching it. JON: You are teaching it, exactly. Honestly, that's about to get many magnitudes greater because the current versions I'll say, of say Alexa, they learn a little bit, but they don't actually learn in a meaningful way about you. For instance, the new version of Alexa+, it will be learning deeply about you. You can tell it things like, "Oh, hey, Alexa, I'm allergic to gluten, I hate Bob Seeger, and I only drive Fords." It will internalize that and whenever it's answering you or things like that, it will in theory use that in making suggestions. ROSIE: Because of AI, right? JON: Because of AI, yeah. It's generative AI, there's language learning models, and all that kind of stuff. Basically, it's a different way of interacting with these things, they're actually learning. That's what's coming. ROSIE: Can we talk about TVs now? Because I think this is the most mind-blowing thing that I read in your article. It's like when I'm watching Severance, my TV is watching me. How is that happening? JON: It's watching your Innie. ROSIE: Oh, God. JON: No, your Outie. ROSIE: My Outie! JON: Your Outie. It's watching your Outie. Or is it? I don't know. ROSIE: It's both. JON: Yeah. TVs were ... After we did our initial research, Lee Neikirk who covers this, very casually presented this information and everyone's jaw was dropping. Essentially, there's a technology called ACR, automatic content recognition. It's essentially if you've ever used, the most common one is Shazam, or something like that, where you want to identify- CHRISTINE: A song? JON: A song, or something. CHRISTINE: Right. JON: There's technology like that built into most every TV. We don't know of a TV that doesn't have it that's been made in the last few years. ROSIE: It doesn't have to be labeled a smart TV to have this? JON: There's almost no such thing as smart TVs anymore because they're all smart. Any TV that you're going to connect to the internet or anything like that, they almost certainly have automatic content recognition, ACR. 
What happens is you're watching TV, every couple seconds, the TV's taking functionally what is a screenshot of what is on your screen, sends it up to the internet. It's analyzed, and then it's added to a data profile. Then that is sold, shared, packaged, whatever. Now, if you also have something plugged into your TV, anything that goes on your screen. It's this amazing thing where it recognizes what's there, it's not even what's streaming through your TV. CHRISTINE: Slideshows of your kids or a vacation. JON: Slideshow of your kids, yeah. CHRISTINE: Or of a vacation, or something. JON: Now, I don't know what happens on the far end. They might just be like, "Oh, blobby shapes." It's not identifying your kids or something like that. But also, it's out there and you have no idea of that. The crazy thing is you sort of opt into this almost certainly accidentally. If you buy a new TV and you hit yes, yes, yes, yes, yes, yes, boom, one of those is almost certainly ACR. They might have a branded name for it. Also, if you use a ... Let's say your TV itself isn't connected to the internet, but you have a Google TV, or Roku especially. CAIRA: A Fire Stick? JON: A Fire Stick, those also have ACR. I will say Apple TV is the only company we found of the large companies that does not have ACR built in. CHRISTINE: Well, they already have our phones. CAIRA: They already have all of my data. ROSIE: What more do you need? JON: But if you have your TVs connected to the internet and you plug in Apple TV into it, anything you watch through your Apple TV, the TV is going to see it and the TV is going to send it off. CHRISTINE: Okay. One last device. Let's talk about security cameras. JON: Yeah. CHRISTINE: I don't think it's surprising, these things are meant to watch you. JON: Yes. CHRISTINE: That's the whole point, right? JON: Yes. CHRISTINE: But it is surprising the data that these are collecting. Tell us about that. JON: Yeah. Rachel Cericola reported on this and she referenced the Surfshark data study. Surfshark is a VPN security company. They said that among all the typical smart home devices, security cameras actually collect the most data points. Not volume of data necessarily, but the different types of data. That's because cameras, they're visual. They have temperature sensors. Nest cameras have facial recognition. Some people were like, "Hey, that's very cool. I can go back and search for all the times that-" CAIRA: That thief came to my door. JON: Yes, Mr. Thief. That's my neighbor, Mr. Thief. But you can also understand why someone coming to your door might not want their image on there because who knows what's happening to it online. Again, Nest has a very comprehensive and relatively accessible privacy policy, but there's a lot of gray area in there. There are some states that have proactively banned this because they say, "Oh, you should not have the ability to take someone's face, and label it, and put it on the internet" kind of thing. CAIRA: Yeah, I like that. I like that rule. We should add more of that everywhere. CHRISTINE: Jon, based on the first half of this show, I am now completely terrified and I am not willing to have any smart devices in my home. JON: No! I have failed. I have failed you. CHRISTINE: I am going back to a rotary phone, I am moving to the country. No, in all honesty though, it sounds pretty concerning. I don't want these devices to collect so much about me. 
But I know that you have a lot of smart devices and clearly you're not getting rid of all of them, you're not going into your bunker. Why do you feel comfortable having all of these smart devices in your home? JON: I think there is absolutely a comfort level thing. I think also, having covered security for a good long while, I feel like for the most part, what happens with smart home is not that very different than what is happening by using the internet and using the smartphone. I do take basic measures that I think are useful and they do limit extreme exposure. But I also understand it's a cost-benefit type thing. One of the security people we spoke with, he talked about security cameras. He's like, "I analyze all these devices, I know what their pitfalls are." He's like "But I feel more protected having security cameras outside than the alternative, not having them." He's like, "I got little kids and I don't like having cameras inside." We cover smart security cameras a lot and we do recommend a bunch of indoor ones, and that you can have them only trigger when there's a pet. You can have them only trigger during certain hours. Some of them, you can have them as you come home, they turn off, when you leave they go back on, that kind of stuff. You shouldn't just be casual about bringing one of these devices into your home. You should really be thoughtful about it. I do think one of the things we've learned is do not buy rando devices that you see, the cheap thing with nine consonants in a row and one vowel named knockoff of some real device. We work really hard on our picks, and we vet them, and we trust them. Trust, but verify. That's our whole thing. I don't think you should be incautious around these at all, but I don't think it's completely justified that these are any worse than your own experiences. CHRISTINE: Right. You need to run a cost-benefit analysis on each device you're using in your life, that's the thing. JON: And just be prudent like you do with everything else. It's like use good passwords. Don't use bad passwords. Don't make it easy. CHRISTINE: Let's talk a little bit about what people can do to protect their data. Let's break it down by the devices that we just talked about. Let's talk through the smart speakers, let's talk through the TVs, and let's talk through the cameras. What are the steps people can do? JON: You interact with smart speakers with your voice, so if you're unhappy with your voice potentially being recorded and sent out there, you can go into any one of the control apps that are associated with the device and you can actually just turn that off. Within the privacy settings, there's a way to do it. I'm not going to describe it for each one right now, but there's the ability to either limit recordings or stop them all together, or you can go back and delete old ones, things like that. That's really basic. You can opt to not have a precise location. You can sometimes just put in a zip code or something like that when you're setting these up. You can also use email addresses that are not your main email address. CHRISTINE: Which we talked a lot about in our episode from a couple months ago with Max Eddy and we'll link that in the show notes. Okay, what about TVs? JON: TVs is more in-depth. Essentially, there is one regulation, yay! That is if you have ACR built into a TV, they are required to make it optional, you are allowed to opt out. They do not make it easy necessarily, but you're going to have to dig in on the settings. 
Same thing I believe with the Roku and stuff like that, you have to go in and actively turn it off. It may not be called ACR, it may have another name. You can probably go to the support page and they'll guide you through it or something like that. But basically, you're going to go in and turn off one or more, sometimes there's a few settings, it's about data collection, watching habits, other stuff like that. Yeah, turn it off. CHRISTINE: Okay. Let's talk about security cameras. What should people do there? JON: Yeah. I mentioned how one of the people we spoke with was like, "Yeah, I don't have them inside." But what you can do, you can shop for ones that have robust security settings and the ability to turn them on and off. Don't put them in sensitive areas. I know a lot of people, they have kids and they'll have a baby camera. It's not the same thing as a security camera, but those may be going up to the cloud. You may want to have thoughts around that, make sure that the company has really secure data handling practices. I just used one that was audio only. You can opt to not use AI with the cameras. That makes them helpful, but it's an option. CHRISTINE: It sounds like with all of these devices as well, you really want to know that you are buying a product that is from a relatively trustworthy company. Of course, I'm going to plug Wirecutter Picks. If you're listening and wondering, and you want to shortcut it, we've done a lot of research. But if you're going to go out into the wilderness and try to do this on your own, you need to do your homework and just really sort out whether the companies have good data privacy policies. JON: Absolutely. That's what we do is we do your homework for you. CHRISTINE: Right. JON: That's why these are especially important picks because the stakes are so high. CAIRA: Okay. Before we wrap, we always ask our guest one last question. Jon, what's the last thing you bought that you've really loved? JON: The last Wirecutter Pick I bought that I really loved, I will be honest with you, I haven't used it yet. It's actually a Christmas present under the tree, I'm so excited. One of my neighbors bought a Ryobi power washer. I'm going to give you the really eloquent name. ROSIE: That was a wind up to a power washer. JON: I live big. Maybe you've heard of the Ryobi RY 1419MTVNM 1900 PSI electric pressure washer. CHRISTINE: Wow! That does sound like a Christmas present right there. JON: Yeah. I was like, "What is that horrible noise? What is that been going on all day long?" Then I borrowed it and it was like I can't find enough uses for it. It's awesome. CAIRA: Yeah. JON: I'm unreasonably excited to get home and open that thing. CAIRA: Wow! JON: Yes. ROSIE: Jon, I'm so happy for you. I'm grateful that you have joined us today to talk about what has been undoubtedly harrowing, but also incredibly instructive. Thank you so, so much. JON: It is my pleasure. CHRISTINE: Okay, are we all creeped out now by all of the data that everything is collecting about us terrified. ROSIE: Thank you so much for checking in. CHRISTINE: Okay, well, what did you learn today? What are you taking away? ROSIE: I learned a lot. One of the things I'm taking away as soon as possible, Apple TVs are not opting into this automatic ACR. That's that feature that enables smart TVs to screenshot the TV and then send your data right up into the cloud. That makes me wanna buy one. Like today. 
Another takeaway is to just keep, generally speaking, keep an eye out on AI advances across all of these smart devices and just being more vigilant. CAIRA: Yeah, that's part of my takeaway too. I definitely need to be more diligent about opting out of things. You know, when you download an app, don't let it just take all your data immediately. Also, y'all will never catch me with a smart speaker ever. I think I'm just not going to risk it. How about you, Christine? CHRISTINE: Yeah, you know, I think I now understand why I am getting served certain ads when I didn't Google something. So now I'm understanding a little bit more about how I'm in a network with all of you, and then you're in a networks probably with people I know. So just the web, the matrix basically. And like Rosie, I'm going to switch my streaming stick. I have a Roku streaming stick, the first thing I'm gonna do when I get home is turn the ACR off. And then, yeah, I'm considering an Apple TV now. If I'm in the system, I am in the Apple system. They already have my info, so might as well. Might as well. ROSIE: I'll look online for a promo code. Two for one. There we go. CHRISTINE: There we go. Thank you. Thank you. ROSIE: And if you want to find out more about Wirecutter's coverage, or if you want to check out any of the products Jon recommended today, you can check out our website, you could find a link in our show notes. That's it for us. Have a good week. Bye. ROSIE: The Wirecutter Show is executive produced by Rosie Guerin and produced by Abigail Keel, engineering support from Maddy Masiello and Nick Pitman. Today's episode was mixed by Catherine Anderson. Original music by Dan Powell, Marion Lozano, Elisheba Ittoop, and Diane Wong. Wirecutter's deputy publisher is Cliff Levy. Ben Frumin is Wirecutter's editor-in-chief. I'm Rosie Guerin. CAIRA: I'm Caira Blackwell. CHRISTINE: And I'm Christine Cyr Clisset. ROSIE: Thanks for listening. CHRISTINE: By the way, why is this legal? JON: Sleep well tonight, my children.
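A note on the wake-word behavior Jon describes in the episode: the flow amounts to a local gate in front of any cloud upload. The device keeps a short rolling audio buffer in memory, runs a small on-device detector over it, and only when the trigger phrase fires does it light up and stream audio out; otherwise the buffer simply rolls over and is discarded. The Python sketch below is purely conceptual and assumes nothing about Amazon's or Google's actual software; every function, constant, and name in it is a hypothetical placeholder.

```python
# Conceptual sketch of wake-word gating: always listening into a local buffer,
# but nothing leaves the device until a small on-device detector fires.
# All names below are hypothetical placeholders, not a real speaker API.

from collections import deque

SAMPLE_RATE = 16_000                      # samples per second (assumed)
buffer = deque(maxlen=2 * SAMPLE_RATE)    # ~2 seconds of audio, memory only

def detect_wake_word(samples) -> bool:
    """Stand-in for a tiny on-device keyword model ("Alexa", "Echo", ...)."""
    return False  # a real detector would score the buffered audio here

def stream_to_cloud(samples) -> None:
    """Stand-in for the upload that happens only after the wake word fires."""
    print(f"uploading {len(samples)} samples for full speech recognition")

def on_audio_chunk(chunk) -> None:
    buffer.extend(chunk)                  # always listening, but only locally
    if detect_wake_word(buffer):
        # Only now does the indicator light come on and audio leave the device;
        # otherwise the rolling buffer simply overwrites itself.
        stream_to_cloud(list(buffer))
```

Whether audio that does reach the cloud is then saved as a recording is a separate, per-platform setting, which is the control Jon points to when he mentions limiting or deleting recordings in the privacy settings.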
