WIRED
23-05-2025
Let's Talk About ChatGPT and Cheating in the Classroom
Photo-Illustration: WIRED Staff/Getty Images All products featured on WIRED are independently selected by our editors. However, we may receive compensation from retailers and/or from purchases of products through these links. There's been a lot of talk about how AI tools like ChatGPT are changing education. Students are using AI to do research, write papers, and get better grades. So today on the show, we debate whether using AI in school is actually cheating. Plus, we dive into how students and teachers are using these tools, and we ask what place AI should have in the future of learning. You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Katie Drummond on Bluesky at @katie-drummond. Write to us at uncannyvalley@ How to Listen You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how: If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for 'uncanny valley.' We're on Spotify too. Transcript Note: This is an automated transcript, which may contain errors. Michael Calore: Hey, this is Mike. Before we start, I want to take the chance to remind you that we want to hear from you. Do you have a tech-related question that's been on your mind, or just a topic that you wish we'd talk about on the show? If so, you can write to us at uncannyvalley@ and if you listen to and enjoy our episodes, please rate the show and leave a review on your podcast app of choice. It really helps other people find us. How's everybody doing? How are you feeling this week? Katie Drummond: I'll tell you how I'm feeling. It's Katie here. My vibe levels are up. I'm feeling really good.
I was at Columbia University earlier this week with five of our fantastic editors and reporters at WIRED because we were honored at the Columbia Journalism School for our politics reporting. And so we got dressed up, I gave a speech, and it was so wonderful to have a minute to sit back and take a breath and think about all of the journalism we've done in the last several months and celebrate that. And it was also really, really cool to just see and talk to journalists who were graduating from journalism school and feel their energy and their excitement and their drive to do this work. Because I think, as you guys know, and you probably agree, we're all quite tired. Lauren, how are you? Lauren Goode: When you said, "Because we're tired," I wasn't sure if you meant we're just tired in this moment or we are existentially tired, because I am a little tired in this moment, but I am not existentially tired. I'm here for the fight, Katie. Katie Drummond: Oh, I'm so glad to hear that. Lauren Goode: Yeah. Katie Drummond: Yeah, I'm tired in this moment. I just think it's so nice to spend some time with a couple hundred people who are new to this and just so excited to get down to business. It was very cool. Michael Calore: How much ChatGPT use is there at Columbia University in the journalism department, do we think? Lauren Goode: Good question, Mike. Katie Drummond: I really hope very little. Michael Calore: Me too. For the sake of us all. This is WIRED's Uncanny Valley, a show about the people, power, and influence of Silicon Valley, and today we are talking about how AI tools like ChatGPT are changing education, from middle school to graduate school. More and more students are using generative chatbot tools to gather information, finish assignments faster, and get better grades, and sometimes just to write things for them.
Just this month, there has been a ton of reporting and discourse on this trend, and some of it has been fairly optimistic, but a lot of it has also been critical. As one user on X put it, "The kids are cooked." Lauren Goode: The kids are all right. Katie Drummond: Which X user was it? I can think of a few. I'm just curious. We don't actually know. Michael Calore: So on this episode, we're going to dive into how students are using ChatGPT, how professors are using it, whether we think this trend is, in fact, cheating when students use it, and what AI's place could be in the future of learning. I'm Michael Calore, director of consumer tech and culture here at WIRED. Lauren Goode: I'm Lauren Goode. I'm a senior correspondent at WIRED. Katie Drummond: And I'm Katie Drummond, WIRED's global editorial director. Michael Calore: So before we dive into what has been happening with AI and students potentially using ChatGPT to cheat in their coursework, I want to have all of our cards on the table. Did either of you cheat in high school or in college? And if so, how? Katie Drummond: I feel like I should go first here because I'm the boss and I want to set Lauren up for success in her answer. I did not cheat in college. I was a very serious person in college. I was getting an undergraduate degree in philosophy, which felt like a very serious thing to be doing at the time. So I was totally above board. And also, as I was thinking about this earlier, this was in the early 2000s, and it wasn't, I don't think, or wouldn't have been, particularly easy to cheat at philosophy back then, whereas interestingly, it would be pretty easy to cheat at philosophy now. You're reading a lot. You're writing a lot of essays. It's hard to imagine how I would've effectively cheated, but I didn't cheat. I did cheat in high school though. Everybody cheated all the time. I'm not saying I cheated all the time. I'm not going to answer that question, but I did cheat.
I specifically remember we had graphing calculators, and we would program equations and answers into the calculators using special code so that if teachers went through our calculators, they wouldn't be able to tell that they were cheats. But we went to pretty great lengths to cheat on math exams, which is so stupid because I would've done great on the math exam regardless, but there was just something about being able to get away with it. Lauren Goode: Do you feel like a weight has been lifted from you now that you have confessed? Katie Drummond: No, I don't care. Look, I think that most students, at least in middle school and high school, dabble with cheating, and so I have no shame. What are they going to do? Strip me of my high school diploma? Good luck. Lauren Goode: Yeah, it's kind of a rite of passage. Katie Drummond: Exactly. Lauren Goode: I was very similar to Katie in that I did not cheat in college. In high school though, I remember reading Cliff's Notes for some book assignments. My best friend and I also did some light cheating in high school because the first initials of our last names weren't that far apart, and it was a small school as well, so she was often sitting in front of me and I was directly behind her. And we had a tapping scheme where we'd tap our pencils during Scantron tests. Katie Drummond: Wow. Michael Calore: Oh, like sending secret messages to each other. Lauren Goode: Yeah, yeah. So if she was on question 13, she would sort of slide her Scantron to the side of the desk so that you could see which question she was on, question number 13, and then the person who had the answer would tap their pencil a corresponding number of times to be like, answer A, answer B, answer C. Anyway, I don't want to implicate her. Totally. She's an adult now with a career and two grown children, and I'm not sure if the statute of limitations has expired on this grand felony from Notre Dame Catholic High School. So maybe we can scrap that from the record.
Thank you very much. Mike, did you cheat? Michael Calore: No, I was a total goody-goody, like a super-duper, do-everything-by-the-book, Eagle Scout kind of kid. Didn't cheat in high school. I did encounter a course in college that I had a really hard time keeping up with. It was the 19th-century British novel, and the reading list was absolutely brutal. It was one super long, boring book every week. And I mean, there was some good stuff in there, like Jane Eyre and Frankenstein. And then there were absolutely terrible books in there, like Barchester Towers and The Mayor of Casterbridge. So I learned the art of the shortcut. I would zoom in on one chapter and I would read the Cliff's Notes, and then I would read that chapter and I would be able to talk about that chapter in depth on a test. Katie Drummond: Oh, that's very smart. That's smart. But not cheating. Michael Calore: Not necessarily cheating. I don't consider Cliff's Notes to be cheating. I'm one of those people. Lauren Goode: Why not? Michael Calore: Well, because you're still actually doing the work and comprehending. And I think some of the examples that we're going to talk about don't even have that step in them. They just sort of skip over all the learning. Lauren Goode: Yeah, but you're not understanding the full context of where that author fits into a certain category of other writers. Katie Drummond: Lauren, I think that what you're trying to do right now is distract both us and our audience from your Scantron felony, when in fact, it seems like Mike is the most innocent party here. I just need to say. Lauren Goode: Fair enough. Michael Calore: At least I did the reading. All right, well, we've all come clean. So thank you for all of that. And we can acknowledge that, of course, cheating is nothing new, but we're talking about it now because the use of AI tools like ChatGPT by students has exploded in recent years. It's become a topic of debate in both the tech and education spheres.
So just to get a sense of the scale of how much students are using AI, one estimate by the Digital Education Council says that around 86% of students, globally, regularly use AI. During the first two years that ChatGPT was publicly available, monthly visits to ChatGPT steadily grew and then started to dip in June, when school gets out. Katie Drummond: 86%. Michael Calore: 86%. So yeah, I've used AI in my school. Katie Drummond: That is an astonishing figure. Michael Calore: So the appeal of something like ChatGPT, if you've used it, you understand why it would be useful to students. The appeal of using it is pretty obvious. It can write, it can research, it can summarize, it can generate working code, but the central question remains: Is using ChatGPT in schoolwork cheating? Where do we draw the line here? Katie Drummond: So I don't think that there's a black-and-white answer, which is good for the length of this episode, but I think that that informs my overall view about AI and education, which is that this technology is here, you can't hide it, you can't make it go away. You can't prevent teenagers and young adults from accessing it. So you need to learn to live with it and evolve and set new rules and new guardrails. So in that context, I think there are a lot of uses of AI for students that I would not qualify as cheating. So getting back to the Cliff's Notes debacle, I think using AI to summarize information, like say you're coming up with notes to help you study and you use AI to summarize information for you and come up with a study guide for you, I think that's a fantastic use of AI, and that would actually just save you a lot of time and allow you to focus on the studying part instead of the transcription and all of that stuff. Or honestly, to me, using it to compile research for you that you'll use to then write a paper, I think use cases like that are a natural evolution of technology and what it can help us do.
I think for me, where AI becomes cheating is when you use AI to create a work product that was assigned and meant to come from you and now doesn't. But Lauren, I'm curious to hear what you think. Lauren Goode: Well, it would make for a really good podcast if I vehemently disagreed with you right now. I think we're pretty aligned on this. Earlier this week I happened to be at the Google I/O conference, which is their annual software conference, and it's a huge AI fest. It's an AI love fest. And so I had the opportunity to talk to a bunch of different executives, and many of these conversations were off the record. But after we got through the round of like, "Okay, what's the latest thing you announced?" I just said, "How are you feeling about AI and education? What's your framework for thinking about this?" And one of the people said, "Are you using it to replace the goal of the exercise?" And it's a blurry line, but it's, I think, a line to draw in terms of whether or not you're "cheating." So if you're going to ask that question, you first have to determine the goal, and then you have to determine what the product is. The product of an education is not actually test scores or assignments. The product is: are you learning something from doing it? So if you're using AI to generate an output, it's understandable that you would say, "Does this output demonstrate cheating?" But the cheating actually happens during the generative part of generative AI. And once again, that's very fuzzy, but I think that the goal of an assignment is not just to turn this thing in on your teacher's desk on Tuesday morning; the goal of it is: did you learn something? And so if you're using AI to cheat through the learning part, which is, I think, what we're going to be discussing, then yes, I guess that is cheating. But the use of these tools in education, just broadly speaking, doesn't scream cheating to me.
Katie Drummond: I think that's a really interesting way of thinking about it, actually. I like that a lot. Thank you, person at Google. Michael Calore: Yeah. If the assignment is to write 600 words about the French Revolution, then that's obviously something that ChatGPT can do for you pretty easily. But if the assignment is getting knowledge into your brain and then being able to relay it, to prove that you've memorized it and internalized it and understand it, then I think there are a lot of things that ChatGPT and tools like it can do for you. Like you mentioned, Katie, you can use it to summarize books, you can use it to help you with the research. One of the most ingenious uses that I've seen is people asking it to generate practice tests. They upload their whole textbook and they say, "I have a test on Friday on chapters four and six, can you generate five practice tests for me to take?" And then that helps them understand what sort of questions they would be getting, and the kinds of things that keep popping up in all of those practice tests are probably the most important things to learn. So let me quickly share a real-world example of AI cheating to see what you think about it. The most infamous case perhaps comes from a recent New York Magazine story about students using ChatGPT for their coursework. The story starts off with Chungin Roy Lee, a former Columbia student who created a generative AI app explicitly to cheat on his computer science schoolwork. He even ended up using it in job interviews with major tech companies. He scored an internship with Amazon after using his AI helper during the interview for the job. He declined to take that job, by the way. So that's pretty ingenious. He's coding an app. He's using generative AI to make an app to help him cheat on things and get jobs. Do you think that the "ingenuity" behind building something like this is cheating? Do we think that his creation of this AI tool carries any merit?
Lauren Goode: I mean, it's so clearly cheating because the intent is to cheat. If we go back to that question of, are you using it to replace the goal of what you're trying to do? His goal is cheating. His goal is like, "Look how clever I am, and then I'm cheating." Lee strikes me as the irritant in the room. What it's doing is bubbling to the surface a lightning-rod topic that is much bigger than this one specific app. Katie Drummond: Well, here's something I thought was interesting: he's the irritant, but how many complicit irritants does he have on his team? In April of this year, Lee and a business partner raised $5.3 million to launch an app that scans your computer screen, listens to the audio, and then gives AI-generated feedback and answers to questions in real time. And my question when I read that was, "Who are these investors? Who are these people?" The website for this company says, "We want to cheat on everything." And someone was like, "Yes, I am writing a check." Of course it's cheating. They say that it's cheating. I mean, I appreciate the creativity. It's always interesting to see what people dream up with regards to AI and what they can create. But using AI to ace a job interview in real time, not to practice for the job interview beforehand, but to, in real time, answer the interviewer's questions, you're setting yourself up and your career up for failure. If you get the job, you do need to have some degree of competence to actually perform the job effectively. And then I think something else that I'm sure we'll talk about throughout this show is the erosion of skill. It's knowing how to think on your feet or answer tough questions or engage with a stranger, make small talk. There are all of these life skills that I worry we're losing when we start to use tools like the tools that Lee has developed.
And so of course I think there are interesting potential use cases for AI; interview prep or practice is an interesting way to use that technology. So again, it's not about the fact that AI exists and that it's being used in the context of education or a job interview, but it's about how we're using it. And certainly in this case it's about the intent. Someone is developing these tools specifically with the intention of using them and marketing them for cheating. And I don't like that. I don't like a cheater, other than when I cheated in high school. Michael Calore: Well, we've been talking a lot about ChatGPT so far, and for good reason, because it's the most popular of the generative AI tools that students are using, but there are other AI tools that they can use to help with their coursework or even just do their schoolwork for them. What are some of the other ones that are out there? Lauren Goode: I think you can literally take any of these AI products that we write about every day in WIRED, whether it's ChatGPT, whether it's Anthropic's Claude, whether it's Google Gemini or the Perplexity AI search engine, or Gamma for putting together fancy decks. There are also some highly specialized AI tools, like Wolfram or MathGPT, which are both math-focused models. And you can see folks talking about that on Reddit. Katie Drummond: Something interesting to me too is that there are now also tools that basically make AI detectors pretty useless. So there are tools that can make AI-generated writing sound more human and more natural. So you basically would have ChatGPT write your paper, then run it through an additional platform to finesse the writing, which helps get that writing around any sort of AI detection software that your professor might be using.
Some students have one LLM write a paper or an answer, and then they sort of run it through a few more to basically make sure that nothing can show up or nothing can be detected using AI detection software. Or students, I think too, are getting smarter about the prompts they use. So there was a great anecdote in this New York Magazine story about asking the LLM to make you sound like a college student who's kind of dumb, which is amazing. It's like maybe you don't need the A plus, maybe you're okay getting the C plus or the B minus. And so you set the expectations low, which reduces your risk, in theory, of getting caught cheating. Michael Calore: And you can train a chatbot to sound like you. Katie Drummond: Yes. Yeah. Michael Calore: To sound actually like you. One of the big innovations that's come up over the last year is a memory feature, especially if you have a paid subscription to a chatbot, you can upload all kinds of information to it in order to teach it about you. So you can give it papers, you can give it speeches, YouTube videos of you speaking so it understands the words that you'd like to use. It understands your voice as a human being. And then you can say, "Write this paper in my voice." And it will do that. It obviously won't be perfect, but it'll get a lot closer to sounding human. So I think we should also talk about some of the tools that are not necessarily straight chatbot tools that are AI tools. One of them is called Studdy, which is study with two Ds, which I'm sure the irony is not lost on any of us that they misspelled study in the name, but it's basically an AI tutor. You download the app and you take a picture of your homework and it acts like a tutor. It walks through the problem and helps you solve it, and it doesn't necessarily give you the answer, but it gives you all of the tools that you need in order to come up with the answer on your own. And it can give you very, very obvious hints as to what the answer could be. 
There's another tool out there called Chegg, C-H-E-G-G. Katie Drummond: These names are horrific, by the way. Just memo to Chegg and Studdy, you have some work to do. You both have some work to do. Lauren Goode: Chegg has been around for a while, right? Katie Drummond: It's a bad name. Lauren Goode: Yeah. Michael Calore: It has been, it's been very popular for a while. One of the reasons it's popular is the writing assistant. Basically you upload your paper and it checks it for grammar and syntax and it just helps you sound smarter. It also checks it for plagiarism, which is kind of amazing because if you're plagiarizing, it'll just help you not get caught plagiarizing and it can help you cite research. If you need to have a certain number of citations in a paper, oftentimes professors will say, "I want to see five sources cited." You just plug in URLs and it just generates citations for you. So it really makes that easy. Katie Drummond: I mean, I will say there are some parts of what you just described that I love. I love the idea of every student, no matter what school they go to, where in the country they live, what their socioeconomic circumstances are, that they would have access to one-on-one tutoring to help support them as they're doing their homework, wherever they're doing it, whatever kind of parental support they do or don't have. I think that that's incredible. I think the idea of making citations less of a pain in the ass is like, yeah, that sounds good. Not such a huge fan of helping you plagiarize, right? But it's again, it's like this dynamic with AI in education where not all good, not all bad. I've talked to educators and the impression I have gotten, and again, this is just anecdotal, but there is so much fear and resistance and reluctance and this feeling among faculty of being so overwhelmed by, "We have this massive problem, what are we going to do about it?" 
And I just think that too often people get caught up in the massive-problem part of it and aren't thinking enough about the opportunities. Michael Calore: Of course, it's not just students who are using AI tools in the classroom; teachers are doing it too. In an article for The Free Press, an economics professor at George Mason University says that he uses the latest version of ChatGPT to give feedback on his PhD students' papers. So kudos to him. Also, The New York Times recently reported that in a national survey of more than 1,800 higher education instructors last year, 18% of them described themselves as frequent users of generative AI tools. This year, that percentage has nearly doubled. How do we feel about professors using generative AI chatbots to grade their PhD students' papers? Lauren Goode: So I have what may be a controversial opinion on this one, which is: just give teachers all the tools. Broadly speaking, I don't think it is wrong for teachers to use the tools at their disposal, provided it aligns with what their school system or university policies say, if it is going to make their lives easier and help them to teach better. So there was another story in The New York Times, written by Kashmir Hill, about a woman at Northeastern University who caught her professor using ChatGPT to prepare lecture notes because of a prompt string that he accidentally left in the output for the lecture notes. And she basically wanted her $8,000 back for that semester because she was thinking, "I'm paying so much money to go here and my teacher is using ChatGPT." It currently costs $65,000 per year to go to Northeastern University in Boston. That's higher than the average for ranked private colleges in the US, but it's all still very expensive.
So for that price, you're just hoping that your professors will saw off the top of your head and dump in all the knowledge that you need, and then you'll enter the workforce and nab that six-figure job right out of the gate. But that's not how that works, and that is not your professor's fault. At the same time, we ask so much of teachers. At the university level, most are underpaid. It is increasingly difficult to get a tenure-track position. Below the university level, teachers are far outnumbered by students. They're dealing with burnout from the pandemic. They were dealing with burnout before then, and funding for public schools has been on the decline at the state level for years because fewer people are choosing to send their kids to public schools. Katie Drummond: I mean, I totally agree with you in terms of one group of people in this scenario are subject matter experts, and one group of people in this scenario are not. They are learning a subject. They are learning how to behave and how to succeed in the world. So I think it's a mistake to conflate or compare students using AI with teachers using AI. I think that what a lot of students, particularly at a university level, are looking for from a professor is that human-to-human interaction, human feedback, human contact. They want to have a back-and-forth dialogue with their educator when they're at that academic level. And so if I wrote a paper and my professor used AI to read the paper and then grade the paper, I would obviously be very upset to know that. That feels like cheating at your job as a professor, and cheating the student out of that human-to-human interaction. Ostensibly, they are paying for access to these professors; they're not paying for access to an LLM. Lauren Goode: Lesson plan, yeah.
Katie Drummond: But for me, when I think about AI as an efficiency tool for educators: should a professor use AI to translate a written syllabus into a deck that they can present to the classroom for students who are maybe better visual learners than they are written learners? Obviously. That's an amazing thing to be able to do. You could create podcast versions of your curriculum so that students who have that kind of aptitude can learn through their ears. You know what I mean? There are so many different things that professors can do to create more dynamic learning experiences for students, and also to save themselves a lot of time. And none of that offends me; all of that, actually, I think is a very positive and productive development for educators. Michael Calore: Yeah, I mean, essentially what you're talking about is people using AI tools to do their jobs in a way that's more efficient. Katie Drummond: Right, which is sort of the whole promise of AI. In theory, in a best-case scenario, that's what we're hoping for. Lauren Goode: What it's supposed to be. Yeah. Katie Drummond: Yeah. Michael Calore: Honestly, some of these use cases that we're talking about, that we agree are acceptable, are much the same way that generative AI tools are being used in the corporate world. People are using AI tools to generate decks. They're using them to generate podcasts so that they can understand things that they need to do for their job. They're using them to write emails, take meeting notes, all kinds of things that are very similar to the way that professors are using it. I would like to ask one more question before we take a break, and I want to know if we can identify some of the factors or conditions that we think have contributed to this increasing reliance on AI tools by students and professors. They feel slightly different because the use cases are slightly different.
Katie Drummond: I think that Lauren had a really good point about teachers being underpaid and overworked. So the desire for some support via technology and some efficiency, in the context of educators, makes total sense as a factor. But when I think about this big picture, I don't really think that there is a specific factor or condition here other than just the evolution of technology. The sometimes slow, but often very fast, march of technological progress. And students have always used new technology to learn differently, to accelerate their ability to do schoolwork, and yes, to cheat. So now AI is out there in the world, it's been commercialized, it's readily available, and they're using it. Of course they are. I will acknowledge, though, that AI is an exponential leap, I think, in terms of how disruptive it is for education compared to something like a graphing calculator or Google search. But I don't think there is necessarily some new and novel factor other than the fact that the technology exists and that these are students in this generation who were raised with smartphones and smartwatches and readily accessible information in the palms of their hands. And so I think for them, AI just feels like a very natural next step. And I think that's part of the disconnect. Whereas for teachers in their thirties or forties or fifties or sixties, AI feels much less natural, and therefore the idea that their students are using this technology is a much more nefarious and overwhelming phenomenon. Michael Calore: That's a great point, and I think we can talk about that forward march of technology when we come back. But for now, let's take a break. Welcome back to Uncanny Valley. So let's take a step back for a second and talk about that slow march of technology and how various technologies have shaped the classroom in our lifetimes. So the calculator first made its appearance in the 1970s. Of course, critics were up in arms.
They feared that students would no longer be able to do basic math without the assistance of a little computer on their desk. The same thing happened with the internet when it really flowered and came into being in the late '90s and early 2000s. So how is this emergence of generative AI similar to or different from the arrival of these other technologies? Lauren Goode: I think the calculator is a false equivalence. And let me tell you, there is nothing more fun than being at a tech conference where there's a bunch of Googler PhDs and you ask this question. And they go, "But the calculator." Everyone's so excited about the calculator, which is great, an amazing piece of technology. But I think it's normal that when new technology comes out, our minds tend to reach for these previous examples that we now understand. It's the calculator, but a calculator is different. A standard calculator is deterministic. It gives you a true answer: one plus one equals two. The way that these AI models work is that they are not deterministic. They're probabilistic. The type of AI we're talking about is also generative, or originative. It produces entirely new content. A calculator doesn't do that. So if you sort of broadly categorize them all as new tools that are changing the world, yes, absolutely, tech is a tool, but generative AI, I think, is in a different category. I was in college in the early 2000s when people were starting to use Google, and you're sort of retrieving entirely new sets of information in a way that's different from using a calculator, but also different from using ChatGPT. And I think if you were to use that as the comparison, the question is: is skipping all of those processes through which you typically learn something the critical part? Does that make sense? Katie Drummond: That makes sense.
And this is so interesting because when I was thinking about this question and listening to your answer, I was thinking about it in that same way: thinking about the calculator, thinking about the advent of the internet and search, comparing them to AI. Where my brain went was what skills were lost with the advent of these new technologies, and which of those losses was real and serious and which maybe wasn't. And so when I think about the calculator, to me that felt like a more salient example vis-à-vis AI. Are we all dumber at doing math on paper because we can use calculators? Michael Calore: Yes. Katie Drummond: For sure. Lauren Goode: Totally, one hundred percent. Katie Drummond: For sure. You think I can multiply two or three numbers? Oh no, my friend, you are so wrong. I keep tabs on my weekly running mileage, and I will use a calculator to be like, seven plus eight plus 6.25 plus five. That's how I use my calculator. So has that skill atrophied as a result of this technology being available? 100%. When I think about search and the internet, I'm not saying there hasn't been atrophy of human skill there, but that to me felt more like a widening of the aperture in terms of our access to information. It doesn't feel like this technological phenomenon where you are losing vital brain-based skills, the way a calculator does. And to me, AI feels that way. It's almost like when something is programmed or programmable, that's also where I feel like you start to lose your edge. Now that we program phone numbers into our phones, we don't know any phone numbers by heart. I know my phone number, I know my husband's phone number. I don't know anyone else's phone number. Maybe Lauren, maybe you're right. It's this false equivalence where you can't draw any meaningful conclusion from any one new piece of technology. And AI, again, I think is just on this exponentially different scale in terms of disruption.
But are we all bad at math? Yes, we are. Michael Calore: Yeah. Lauren Goode: Well, I guess I wonder, and I do still maintain that it's kind of a false equivalence to the calculator, but there were some teachers, I'm sure we all had them, who would say, "Fine, use your calculator, bring it to class." Or, "We know you're using it at home for your homework at night, but you have to show your work." What's the version of "show your work" when ChatGPT is writing an entire essay for you? Michael Calore: There isn't one. Katie Drummond: Yeah, I mean, I think some professors have had students submit chat logs with their LLMs to show how they used the LLM to generate a work product, but that starts from the foundational premise that ChatGPT or AI is integrated into that classroom. I think if you're just using it to generate the paper and lying about it, you're not showing your work. But I think some professors who maybe are more at the leading edge of how we're using this technology have tried to introduce AI in a way that then allows them to keep tabs on how students are actually interacting with it. Lauren Goode: Mike, what do you think? Do you think it's like the calculator or Google or anything else you can think of? Michael Calore: Well, so I started college in 1992, and then while I was at college, the web browser came around, and I graduated from college in 1996. So I saw the internet come into being while I was in the halls of academia. And I actually had professors who were lamenting the fact that when they were assigning us work, we were not going to the library and using the card catalog to look up the answers to the questions that we were being asked in the various texts that were available in the library. Because all of a sudden we basically had the library in a box in our dorm rooms and we could just do it there. I think that's fantastic. Katie Drummond: Yes.
Michael Calore: I think having access at your fingertips to literally the knowledge of the world is an amazing thing. Of course, the professor who had that view also thought that the Beatles ruined rock and roll and loved debating us about it after class. But I do think that when we think about using ChatGPT and whether or not it's cheating, like yes, absolutely, it's cheating if you use it in the ways that we've defined, but it's not going anywhere. And when we talk about these things becoming more prevalent in schools, our immediate instinct is like, "Okay, well how do we stop it? How do we contain it? Maybe we should ban it." But it really is not going anywhere. So I feel like there may be a missed opportunity right now to actually have conversations about how we can make academia work better for students and faculty. How are we all sitting with this? Lauren Goode: I mean, banning it isn't going to work, right? Do we agree with that? Is the toothpaste out of the tube? Katie Drummond: Yes, I think- Lauren Goode: And you could be a school district and ban it and the kids are going to go, "Haha, Haha, Ha." Michael Calore: Yeah. Katie Drummond: I mean that's a ridiculous idea to even... Lauren Goode: Right. Katie Drummond: If you run a school district out there in the United States, don't even think about it. Lauren Goode: Right. And what's challenging about the AI detection tools that some people use is that they're often wrong. So I think, I don't know, I think we all have to come to some kind of agreement around what cheating is and what the intent of an educational exercise is in order to define what this new era of cheating is. So a version of that conversation has to happen at all these different levels of society: "What is acceptable here? What are we getting from this? What are we learning from this? Is this bettering my experience as a participant in society?" Katie Drummond: And I think ideally from there, it's sort of, "Okay, we have the guardrails.
We all agree what cheating is in this context of AI." And then it's about how do we use this technology for good? How do we use it for the benefit of teachers and the benefit of students? What is the best way forward there? And there are some really interesting thinkers out there who are already talking about this and already doing this. So Chris Ostro is a professor at the University of Colorado at Boulder, and they recommend actually teaching incoming college students about AI literacy and AI ethics. So the idea being that when students come in for their first year of college, we need to actually teach them about how and where AI should be used and where it shouldn't. When you say it out loud, you're like, "That's a very reasonable and rational idea. Obviously we should be doing that." Because I think some students aren't even aware that this use of AI is cheating, while that use of AI is something their professor thinks is above board and really productive. And then there are professors who are doing, I think, really interesting things with AI in the context of education in the classroom. So they'll have AI generate an essay or an argument, and then they will have groups of students evaluate that argument, basically deconstruct it and critique it. So that's interesting to me because I think that's working a lot of those same muscles, the critical thinking, the analysis, the communication skills, but it's doing it in a different way than asking students to go home and write a paper or go home and write a response to that argument. The idea being, "No, don't let them do it at home, because if they go home, they'll cheat." It's an interesting evolution. Lauren, it gets to the point that you've brought up repeatedly, which I think is totally right: think about what the goal is here, and then, given that AI is now standard practice among students, figure out how to get to that goal in a new way.
Michael Calore: Yeah, and we have to figure out what we're going to do as a society with this problem because the stakes are really, really high. We are facing a possible future where there are going to be millions of people graduating from high school and college who are functionally illiterate because they never learned how to string three words together. Katie Drummond: And I have a second grader, so if we could figure this out in the next 10 years, that would be much appreciated. Lauren Goode: So she's not using generative AI at this point? Katie Drummond: Well, no, she's not. Certainly not. She gets a homework packet and she loves to come home and sit down. I mean, she's a real nerd. I love her, but she loves to come home and sit down and do her homework with her pencil. But my husband is a real AI booster. We were playing Scrabble a couple of months ago, adult Scrabble with her. She's seven, Scrabble is for ages eight and up, and she was really frustrated because we were kicking her ass, and so he let her use ChatGPT on his computer and she could actually take a photo of the Scrabble board and share her letters. Like, "These are the letters that I have, what words can I make?" And I was like, "That's cheating." And then honestly, as we kept playing, it was cool because she was discovering all of these words that she had never heard of before and so she was learning how to pronounce them. She was asking us what they meant. My thinking about it softened as I watched her using it. But no, it's not something that is part of her day to day. She loves doing her homework and I want her to love doing her homework until high school when she'll start cheating like her mother. Michael Calore: This is actually a really good segue into the last thing that I want to talk about before we take another break, which is the things that we can do in order to make these tools more useful in the classroom.
So, thought exercise: if you ran a major university, or if you were in the Department of Education before you lose your job, what would you be doing over your summer break coming up in order to get the institutions under your stewardship ready for the fall semester? Katie Drummond: I love this question. I have a roadmap. I'm ready. I love this idea of AI ethics, so I would be scouring my network, I would be hiring a professor to teach that entry-level AI ethics class. And then I would be asking each of my department heads to weigh in, because every realm of education within a given college is very different. If you have someone who runs the math department, they need to think about AI very differently than whoever runs the English department. So I would be asking each of my department leads to write AI guidelines for their faculty and their teachers. You can tell I'm very excited about my roadmap. Michael Calore: Oh yes. Katie Drummond: I would then review all of those guidelines by department, sign off on them, and also make sure that they laddered up to a big-picture, institutional point of view on AI. Because obviously it's important that everyone is marching to the beat of the same drum, that you don't have wildly divergent points of view within one given institution. Lauren Goode: What do you think your high-level policy on AI would be right now, if you had to say? Katie Drummond: I think it would really be that so much of this is about communication between teachers and students, that teachers need to be very clear with students about what is and is not acceptable, what is cheating, what is not cheating, and then they need to design a curriculum that incorporates more, I would say, AI-friendly assignments and work products into their education plan. Because again, what I keep coming back to is, you can't send a student home with an essay assignment anymore. Lauren Goode: No, you can't. Katie Drummond: You can't do that. So it comes down to: what do you do instead?
Lauren Goode: I like it. Katie Drummond: Thank you. What would you do? Lauren Goode: I would enroll at Drummond. Drummond, that actually sounds like a college. Where did you go to school? Drummond. Michael Calore: It does. Lauren Goode: Well, I was going to say something else, but Katie, now that you said you might be hiring an ethics professor, I think I'm going to apply for that job, and I have this idea for what I would do as an ethics professor teaching AI to students right now. On the first day of class, I would bring in a couple groups of students. Group A would have to write an essay right there on the spot, and group B would presumably be doing the same, but actually they wouldn't be. They would just be stealing group A's work and repurposing it as their own. And I haven't quite figured out all the mechanics of this yet, but basically I would use it as an example of what it feels like when you use ChatGPT to generate an essay: you're stealing some unknown person's work, essentially cut up into bits and pieces, and repurposing it as your own. Katie Drummond: Very intense, Lauren. Lauren Goode: I would start off the classroom fighting with each other, basically. Katie Drummond: Seriously? Michael Calore: It's a good illustration. I would say that if I were running a university, I would create a disciplinary balance in the curriculum across all of the departments. You want to make sure that people have a good multi-disciplinary view of whatever it is that they're studying. So what I mean is that some percentage of your grade is based on an oral exam or a discussion group or a blue book essay, and some other percentage is based on research papers and tests and other kinds of traditional coursework. So, I think there has to be some part of your final grade that is based on things you cannot use AI for.
Learning how to communicate, how to work in teams, sitting in a circle and talking through diverse viewpoints in order to understand an issue or solve a problem from multiple different angles. This is how part of my college education worked, and in those courses where we did that, where one third of our grade was based on a discussion group, one class during the week was devoted to sitting around and talking. I learned so much in those classes, and not only about other people, but also about the material. The discussions that we had about the material went places that my brain would not normally have gone. So yeah, that's what I would do. I think that's the thing that we would be losing if we all just continued to type into chatbots all the time. There are brilliant minds out there that need to be unleashed, and the only way to unleash them is to not have them staring at a screen. Lauren Goode: Mike's solution is touch some grass. I'm here for it. Michael Calore: Sit in a circle, everybody. Okay, let's take one more break and then we'll come right back. Welcome back to Uncanny Valley . Thank you both for a great conversation about AI and school and cheating, and thank you for sharing your stories. Before we go, we have to do real quick recommendations. Lightning round. Lauren, what is your recommendation? Lauren Goode: Ooh. I recommended flowers last time, so... Katie Drummond: We are going from strength to strength here at Uncanny Valley . Lauren Goode: My recommendation for flowers has not changed, for what it's worth. Hood River, Oregon. That's my recommendation. Michael Calore: That's your recommendation. Did you go there recently? Lauren Goode: Yeah, I did. I went to Hood River recently and I had a blast. It's right on the Columbia River. It's a beautiful area. If you are a Twilight fan, it turns out that much of the first Twilight movie was filmed right where we were. We happened to watch Twilight during that time just for kicks.
Forgot how bad that movie was, but every time the River Valley showed up on screen, we shouted, "Gorge." Because we were in the gorge. I loved Hood River. It was lovely. Michael Calore: That's pretty good. Katie? Katie Drummond: My recommendation is very specific and very strange. It is a 2003 film called What a Girl Wants, starring Amanda Bynes and Colin Firth. Michael Calore: Wow. Katie Drummond: I watched this movie in high school, where I was cheating on my math exams. Sorry. For some reason, just the memory of me cheating on my high school math exams makes me laugh, and then I rewatched it with my daughter this weekend, and it's so bad and so ludicrous and just so fabulous. Colin Firth is a babe. Amanda Bynes is amazing, and I wish her the best. And it's a very fun, stupid movie if you want to just disconnect your brain and learn about the story of a seventeen-year-old girl who goes to the United Kingdom to meet the father she never knew. Michael Calore: Wow. Lauren Goode: Wow. Katie Drummond: Thank you. It's really good. Lauren Goode: I can't decide if you're saying it's good or it's terrible. Katie Drummond: It's both. You know what I mean? Lauren Goode: It's some combination of both. Katie Drummond: It's so bad. She falls in love with a bad boy with a motorcycle, but a heart of gold who also happens to sing in the band that plays in UK Parliament, so he just happens to be around all the time. He has spiky hair. Remember 2003? All the guys had gel, spiky hair. Lauren Goode: Yes, I still remember that. Early 2000s movies, boy, did they not age well. Katie Drummond: This one though, aged like a fine wine. Michael Calore: That's great. Katie Drummond: It's excellent. Lauren Goode: It's great. Katie Drummond: Mike, what do you recommend? Lauren Goode: Yeah. Michael Calore: Can I go the exact opposite? Katie Drummond: Please, someone. Yeah. Michael Calore: I'm going to go literary. Katie Drummond: Okay. 
Michael Calore: And I'm going to recommend a novel that I read recently that just shook me to my core. It is called The Days of Abandonment, by the great pseudonymous novelist Elena Ferrante. It was written in Italian and has been translated into English and many other languages. And it is about a woman who wakes up one day and finds out that her husband is leaving her, and she doesn't know why, and she doesn't know where he's going or who he's going with, but he just disappears from her life, and she goes through it. She accidentally locks herself in her apartment. She has two children that she is now all of a sudden trying to take care of, but somehow neglecting because she's- Katie Drummond: This is terrible. Michael Calore: But the way that it's written is really good. It is a really heavy book. It's rough, really rough, subject-matter-wise, but the writing is just incredible, and it's not a long book, so you don't have to sit and suffer with her for a great deal of time. I won't spoil anything, but I will say that there is some resolution in it. It's not a straight trip down to hell. It is really just a lovely observation of how human beings process grief and how human beings deal with crises, and I really loved it. Katie Drummond: Wow. Michael Calore: I kind of want to read it again, even though it was difficult to get through the first time. Katie Drummond: Just a reminder to everyone, Mike was the one who didn't cheat in high school or college, which totally tracks from the beginning of the episode to the end. Michael Calore: Thank you for the reminder. Katie Drummond: Yeah. Michael Calore: All right, well, thank you for those recommendations. Those were great, and thank you all for listening to Uncanny Valley . If you liked what you heard today, make sure to follow our show and to rate it on your podcast app of choice.
If you'd like to get in touch with us with any questions, comments, or show suggestions, write to us at uncannyvalley@ We're going to be taking a break next week, but we will be back the week after that. Today's show is produced by Adriana Tapia and Kiana Mogadam. Greg Obis mixed this episode. Jake Loomis was our New York studio engineer, Daniel Roman fact-checked this episode. Jordan Bell is our executive producer. Katie Drummond is WIRED's global editorial director, and Chris Bannon is the head of Global Audio.


WIRED
17-05-2025
- Business
- WIRED
Is Elon Musk Really Stepping Back From DOGE?
Elon Musk is apparently turning his attention away from Washington and back to Tesla. On this episode of Uncanny Valley , the hosts unpack what Musk's pivot means for the future of DOGE. Elon Musk arrives for a town hall meeting wearing a cheesehead hat at the KI Convention Center on March 30 in Green Bay, Wisconsin. Photo-Illustration: WIRED Staff. Elon Musk says he's stepping back from his role with the so-called Department of Government Efficiency to turn his attention to his businesses—most urgently to Tesla, which has faced global sales slumps in recent months. In this episode, we discuss how our understanding of DOGE has evolved over the past five months and what we think will happen when Musk scales back. Transcript Note: This is an automated transcript, which may contain errors. Michael Calore: Hey, this is Mike. Before we start, I want to take the chance to remind you that we want to hear from you. Do you have a tech-related question that's been on your mind, or maybe a topic that you wish we'd talk about on the show? If so, you can write to us at uncannyvalley@ and if you listen to and enjoy our episodes, please rate the show and leave a review on your podcast app of choice. It really helps other people find us.
Hi folks, co-hosts. How's it going? Katie Drummond: Ugh. Michael Calore: That good? Katie Drummond: That was me, Katie. That was me speaking. No, it's going all right. It's been a stressful 90 minutes leading up to recording this podcast, but I'm okay. Michael Calore: Did you just fly through Newark? Katie Drummond: No, actually I didn't. Although I know that that is in the cards for you in the near future. I actually rescheduled a flight to avoid Newark, so I'm now taking a red eye for no reason other than I don't want to fly into Newark Airport. Lauren Goode: Smart. Katie Drummond: Thank you. Michael Calore: I'm jealous. Lauren Goode: Mike, I'm sending you all of the good wishes. Michael Calore: Thank you. I hope to listen to this podcast on an airplane that takes off on time and lands on time without incident on Thursday. Lauren Goode: I hope you return next week able to tape another podcast because you didn't get stuck somewhere. Michael Calore: I think metaphysically, we're all stuck somewhere right now. Lauren Goode: Yeah, we're in the middle of some big transitions. That's probably the one thing that we have in common with Elon Musk. Katie Drummond: Touché. Michael Calore: Back in the first week of January, we put out an episode of this show that was all about DOGE, the so-called Department of Government Efficiency. I would say it was our very first DOGE episode, if I'm remembering correctly. And we talked about the key players, the goals of the group, and the ins and outs of government spending. A lot has happened since then. And now Elon Musk says that he's stepping back from his full-time role at DOGE. There are still many unanswered questions about where DOGE stands now, including if and when Elon's exit will happen, but we're wondering what actually has been accomplished during Musk's time with the DOGE Bros. So, today on the show, the latest on DOGE and what it may look like post-Elon.
This is WIRED's Uncanny Valley , a show about the people, power, and influence of Silicon Valley. I'm Michael Calore, Director of Consumer Tech and Culture here at WIRED. Lauren Goode: I'm Lauren Goode, I'm a Senior Writer at WIRED. Katie Drummond: And I'm Katie Drummond, WIRED's Global Editorial Director. Michael Calore: So, I want to start by asking a question that we asked in our last deep dive on DOGE, because I think the answer may have changed since then. At this moment, just a few months into Trump's second term as President, May 2025, what exactly is DOGE? Lauren Goode: Well, I wish it was a figment of our imagination. Katie Drummond: Yes, I wish that it was a fever dream, but that is still the big question, incredibly enough. And I think at WIRED, we've actually been very careful when we characterize DOGE in our reporting; we often, or always, use the term "so-called," the so-called Department of Government Efficiency, because it doesn't really actually exist. And as some WIRED reporters pointed out last month, I think it was Zoë and Kate, it's almost a metaphysical question at this point. And that was in relation to employees at the General Services Administration: despite the fact that there are at least half a dozen DOGE operatives on payroll at that administration, despite the fact that there is a section of that building that is for DOGE use only and is a secure facility within the GSA, the acting head of the GSA actually said, in an all-hands, that there was no DOGE team working at the GSA. Which begs the question, well, who are these people then, and who do they work for? I think in a more practical way, there are two DOGEs. There's US Digital Service, which was essentially hijacked and repurposed by the administration, now known as the US DOGE Service. Sure. And then there's a temporary organization within the US DOGE Service, called, obviously, the US DOGE Service Temporary Organization.
And that organization is ostensibly in charge of carrying out the DOGE agenda. So, I think, all of this semantic BS aside, what is DOGE? Well, it is the brainchild of Elon Musk. It is something that the president got on board with very early, and DOGE is effectively a collection of typically young, I think almost always male, technologists who come from companies that Musk and Peter Thiel run or have run. Despite what the acting head of GSA says, there is a DOGE, and it is made up of these dozens and dozens of technologists who are working inside all of these different agencies. That is what DOGE is, whether it's a real department or agency or not, that's what it is. And we have a pretty good sense now, in May, of what they're actually doing. Michael Calore: And it's important to note that they did make a number of hires, dozens and dozens of people who they hired to be a part of DOGE, who are now installed in various agencies around the federal government. Lauren Goode: And a lot more layoffs too. Michael Calore: Yeah. Well, we have been doing a lot of reporting on DOGE. As Katie, as you just mentioned, WIRED has been on top of the story ever since the beginning, because we know Elon and we know his playbook. So, what are some of the stories that WIRED has done over the last few months on DOGE that have just totally blown your mind? Katie Drummond: Wow. There are a lot. I think the reporting that we have done around what DOGE is doing using AI, and using all of the data that they've been able to access, to actually surveil immigrants, I think that that reporting is incredibly disturbing. I think it goes beyond the worst fears of folks in late January, early February, as DOGE's work was getting underway. The idea that this kind of thing could happen, and that it could happen so quickly, certainly was talked about. There was speculation in terms of, what do you think they're going to do? What are they after? There were a lot of hypotheses at the time.
I don't think anyone anticipated that we would see that kind of work happen so quickly and in such a dystopian way. And then, I think, it hasn't blown my mind, but I really like the coverage that we've done around how recruiting for DOGE happens. And we just published another story on this recently, I think it was a couple of weeks ago. It was in early May, from Caroline Haskins and Tori Elliott, and it was about another round of recruiting that's happening for DOGE. And this recruiting always seems to happen in these Slack groups for alumni of various tech companies, this time it was Palantir, and this guy, this entrepreneur, went into the Slack room and basically said, "Hey, I'm looking for people who would be excited to design and deploy AI agents who could free up at least 70,000 full-time government workers over the next year." And in the way he phrased it, he was saying, "These agents could free up these 70,000 people for," quote, "higher impact work." Which begs the question, higher impact work in the private sector after you fire all of them? Exactly what is the plan? And that story was really interesting to me because, first of all, I think how the recruiting happens is really interesting. I think the fact that it's happening, that they're specifically targeting alums from certain companies, that this is happening in Slack groups and message boards. I think that's interesting. But I thought that the way that message was received was fascinating, given that we're now in May. And so, people have seen DOGE play out over the last few months. We wrote, "Eight people reacted with clown face emojis, three reacted with a custom emoji of a man licking a boot. Two reacted with a custom emoji of Joaquin Phoenix giving a thumbs down in the movie Gladiator. 
And three reacted with a custom emoji with the word 'fascist.'" So, it was just interesting to me to note that alums of a company like Palantir are looking at that message, and at least some of them are saying, like, "Nah, I see what you're doing here. And this is not only not compelling to me as a recruitment effort, but actually fascist." Lauren Goode: Now, I should mention that I happen to have been on a short book leave at the start of this year— Katie Drummond: Good timing. Lauren Goode: When ... Great timing. Katie knows I came back, and I was lamenting to her via our Slack, like, "Katie, I'm literally never taking leave again because so much happened." And starting in late January, I started to see WIRED's incredible reporting, watching it from afar and seeing all this news come out about DOGE, and just was like, "What is happening?" And one of the things that stood out to me almost immediately was this juxtaposition of cuts to the federal workforce and also cuts to federal spending, like the $1 limit that was placed on federal employees' credit cards— Michael Calore: Oh, gosh. Lauren Goode: And how much this limited their ability to do their job, like running out of toilet paper, running out of printer paper, not being able to just do office functions as a federal employee, juxtaposed with Trump's incredibly lavish candlelight dinners and the crypto scheme we talked about last week, and all of the ways in which it seems like there are members of this administration who are simply lining their pockets as they have dispatched DOGE to make all of these cuts. If you just step back from that, it's hard to see, at this point, how this benefits America. What has actually happened here? Michael Calore: I think probably my favorite story is one of our most recent ones about the Library of Congress, and how two gentlemen showed up to the Library of Congress and said, "Hi, we work here. You need to let us in." Capitol Police said, "No. Who are you? 
Can you identify yourselves?" And they showed him a note from DOGE saying that they worked there and that they should let them in. And the Capitol Police turned them away. And it turns out they did actually work there. They had a note from Daddy. Lauren Goode: Please never call him that again. Katie Drummond: Oh, boy. Michael Calore: So, back when we first started talking about DOGE, at the beginning of the year, it was actually two people. It was Elon Musk and Vivek Ramaswamy. I think a week after we published that episode, Vivek was out. Lauren Goode: Has anyone heard from Vivek? Katie Drummond: I don't think about him. I don't know him. I don't know that man. No. Isn't he running for governor? Lauren Goode: I was going to say he's running for governor of Ohio. Wasn't that the plan? I like how we're all Googling this. Katie Drummond: He's pivoted. Michael Calore: Well, it's important to think about who's running it now, because Elon says he's only going to be around one to two days a week. He says he will continue to do work for DOGE and for President Trump until the end of Trump's term, whatever year that may be. He's going to be scaling back. He's going to go on 20% time, basically. So, who are the people who are still there? Who are the people? Who are the names that we now need to know? Lauren Goode: I think AI agents are going to be running all of it. Katie Drummond: Well, obviously they're apparently replacing 70,000 federal workers with them within the year. Obviously, there are some very high-profile members of DOGE after just a few short months. There's Edward "Big Balls" Coristine, this 19-year-old appointed by Musk who owns LLC. I'm sure everyone is familiar with Big Balls at this point. There are plenty of other young inexperienced engineers working across these agencies, and then there are the adults in the room. 
There are people like Steve Davis, who is one of Musk's, really, right-hand men who works closely alongside him at a number of his companies, and has been working with him in the federal government. And we also, of course, know that they are still actively recruiting, again, largely from companies that Musk himself owns. So, I think that the whole point of all of this is that, yes, Elon Musk is scaling back. So, let's say he scales back, let's say he decides to part ways with DOGE and the administration altogether. DOGE is already embedded in the federal government. He accomplished what he set out to do, insofar as we now have DOGE team members, DOGE operatives at dozens and dozens and dozens of federal agencies. They very clearly have their marching orders, they're carrying out work. So, at this point, you can't claw that all back, and that doesn't leave the federal government just because Elon Musk potentially leaves the government. The damage is done. I do think it's important to note here, and I know this will come up over and over because I'm going to keep bringing it up. Elon Musk at two days a week is a lot of Elon Musk. 20% of Elon Musk's time going to the federal government, sure, he won't be in the weeds seven days a week, 24 hours a day, but that's a lot of Musk time. So, I do think it's important to be cautious, and I just say this to all of our listeners and to everyone out there, this idea that Musk is disappearing from the federal government or disappearing from DOGE, the administration might want you to think that that's what's happening. I suspect that that is not at all what's happening. That said, from all appearances, Elon Musk might be less involved in DOGE, but DOGE is going to keep on keeping on. Michael Calore: And while it's trucking, what is Elon going to be doing? What does he say? Lauren Goode: Yeah, what is he going to be doing? 
Katie, do you have a sense of how much of this is related to the fact that Tesla isn't doing so well right now? Katie Drummond: Well, I suspect that that's a big factor, but I think so much of the narrative externally, and even people at Condé Nast who have come up to me to be like, "Elon, he's out. Is it Tesla? Why is he leaving DOGE?" This is optics. This is narrative. His company is in the tubes, it is really struggling. They needed a way to change that story, and they needed a way to change that story very quickly. The best way that they could change that story was to say, "No, no, no, no, no. Don't worry. Elon Musk is not all in on DOGE and the federal government. He is going to be stepping back and he's going to be focusing on his other companies." Even just Trump saying that, Musk saying that, that being the narrative that plays out in the media is incredibly helpful for Musk, particularly in the context of Tesla, and just the board, and shareholders, and their confidence in his ability to bring this company back from the brink. So, do I think that he's pulling back and will be spending less time with DOGE? Yes. Do I think a lot of this was just smoke, and mirrors, and optics, and narrative and PR? Yes, it was incredibly well-timed right as Tesla was really, really, really in the tubes and getting a ton of bad press. Elon Musk makes this very convenient announcement, right? Lauren Goode: Mm-hmm. Right. And this is something that the venture capitalist and Musk's fellow South African, David Sacks, has said, "It's just what Musk does." He said he has these intense bursts where he focuses on something, gets the right people and the structure in place, feels like he understands something, and then he can delegate. And he's just reached that point with DOGE. He's in delegation mode. Katie Drummond: Yes, it seems like he has all the right people in place, and a structure that is so clear and transparent to the American people, that it's time for him to move on. 
Michael Calore: And I do think that he is going to have to figure out the Tesla situation. As you said, the company's really struggling, and there are a lot of reasons for that. There are no new Tesla models for people to buy, even though they were promised. There have been a bunch of recalls. People are just hesitant about buying a new EV right now anyway, for a number of reasons. But it's really, it's him that people don't like. So much like the damage that he has done to the structure of the federal government with DOGE, similarly, he has done damage to Tesla, the brand, by his association with the policies of the Trump Administration, and his cozying up to the President, and his firing, and destroying the rights of people. Katie Drummond: And isn't it also true that all of these problems with Tesla, all of the problems, aside from Elon Musk himself, those problems were happening or were poised to happen regardless, like issues with new models, with recalls, that all predates his work with DOGE, unless I'm drastically misunderstanding how time works. So, those problems with the company existed and were bound to become a bigger deal at some point, and then it really feels like his work with DOGE and the federal government just added fuel to the fire. He just poured gasoline on all of his company's problems by participating with the Trump Administration in the way that he did. But the fact that Tesla is a troubled company is old news, and has nothing to do with the fact that Elon Musk is not a well-liked individual. So, it's just problem on top of problem. Michael Calore: That's right. That's right. And the damage is done, I think, at this point. He would probably have to move on from that company in order to fully turn it around. Katie Drummond: Well, we still have a lot of time left in the year, so we'll see. Michael Calore: All right, well let's take a break and we'll come right back. Welcome back to Uncanny Valley . 
When we talked about DOGE at the beginning of the year, it still felt just like an idea. The tone was decidedly different. We talked about how the group was named after a meme coin, and we all had a good laugh at the absurdity of it all. It was still unclear what would happen. And of course, since then, DOGE has gutted multiple federal agencies, dismantled so many programs, fired a bunch of people, built a giant database to track and surveil people, among other things. Katie Drummond: So, I wasn't actually with you guys on the show when you talked about DOGE in January, but I was listening to the show, and I remember you talking about Musk's plans to, quote, "open up the books and crunch the numbers to cut costs." Sounds very exciting. And cutting some of those costs, of course, had to do with laying people off. Now, I remember that because Zoë Schiffer, who hosts the other episodes of Uncanny Valley , said she would be surprised if any, quote, "books were even opened." So, what did we see actually happen from that prediction to now, from January to May? Lauren Goode: I want to give Zoë a shout-out here because I think the context of that was me saying, "Oh, I wonder how they're going to go about this careful, methodical process of doing the thing." And so she was like, "This is going to be utter chaos. They're not going to open any books." Katie Drummond: She was right. It has been chaos. Lauren Goode: So we also said that the New Yorker reported Vivek had joked at one point that he was going to do a numbers game. You would lose your job if you had the wrong Social Security number. That didn't actually happen, but Zoë surmised at the time that this was potentially going to be run off of the Twitter/X playbook, run like a chaotic startup. And that's true. I definitely did think there would be more of a process to what DOGE was doing, so I was wrong. There was process. 
They have systematically terminated leases for federal office buildings, or taken over other buildings. They're reportedly building out this big master database. They've gutted public agencies like the CDC, and regulatory bodies like the CFPB, the Consumer Financial Protection Bureau. So they've done a lot. I think the part where I thought there would be more process was around the people, the human capital of all this, like the federal workforce. And so, maybe in a lot of ways, this is just like some startup, you're acting recklessly and worrying about the human beings you're affecting later. Michael Calore: And I think the thing that we also predicted correctly was that if DOGE has a chance to shape the regulatory agencies in the federal government, they would shape those agencies in a way that benefits people who are in their industry. Lauren Goode: Right. Katie Drummond: I think one of the questions you guys were asking back in January was whether or not the administration was bringing in these guys. It was Musk and Ramaswamy at the time, because they actually wanted them to advise on how technology is used as part of government services, as part of the way the government works, or because they thought the two would be influential over the types of regulations that are rolled back or introduced. So, man, it's crazy to even say all of that, knowing what we know now about ... It's just interesting, in January, we knew so little, we were so naive. But what do you think now about why Musk, in particular, was actually brought on board? Lauren Goode: Well, honestly, I think that they have done both. WIRED has reported that DOGE is building out a master database of sensitive information about private citizens, and a database that will reportedly help them track immigrants. And we know they're playing around with these AI agents, like you just talked about, Katie. 
And so, we know that they were brought in to apply that technology building mindset to government services, if you want to call it that. But I think that they also are influencing policy, because on the policy side, we've seen, I mentioned David Sacks, he's Trump's crypto and AI Czar, and he's been weighing in on cryptocurrency and stablecoin regulations. Even if that hasn't been pushed through yet, he's certainly in Trump's ear about it. Musk has also been pushing back on Trump's tariff policies. Musk has been expressing his opinion on immigration policies. Those are just a few examples, but safe to say, he has Trump's ear. Michael Calore: I think at the beginning I was cautiously interested in the IT consultant part of it, like the DOGE mission to come in and modernize the federal government. Obviously, if you've ever dealt with federal government agencies, as a person who's computer-literate, sometimes you are just completely flabbergasted by the tools that you have to use to get access to services in this country. So yes, guys, come in, do your thing, zhuzh it up, make it work better. Of course, that is absolutely not what happened. But I was excited about the prospect of that maybe happening. And it turns out that they really took the opportunity to take all of the data that are in all of these agencies and put it all together into one giant input, fed into various systems that are going to process that data and find efficiencies in ways that are probably going to affect human beings negatively. A computer is really good at doing very simple tasks over and over again. It doesn't necessarily understand the nuances of how things are divided up equitably among different sectors of society, it doesn't understand the nuances of people's personal situations. So, that's the modernization that we're going to see, I think, of government systems. And that's frightening, that wasn't what I was expecting. 
Katie Drummond: Now, we've talked a little bit on and off in this episode already about AI. AI has played a much bigger role with DOGE than maybe we thought it would, maybe we hoped it would, in January. So, let's talk about that. As far as we know now, what does DOGE aspire to do with AI, and how were you thinking about that in January, if you were thinking about it at all? Lauren Goode: I still feel like I don't really understand what they're trying to do with AI, frankly. Katie Drummond: Maybe they don't. Lauren Goode: We know at this point that there are AI officers and leaders in the federal government. We mentioned David Sacks before, who was put in charge of crypto and AI. There is now the first ever AI officer at the FDA, Jeremy Walsh. WIRED has reported that OpenAI and the FDA are collaborating for an AI-assisted scientific review of products. Our colleague, Brian Barrett, has written about the use of AI agents. In particular, Brian wrote, "It's like asking a toddler to operate heavy machinery." Workers at the Social Security Administration have been asked to incorporate an AI chatbot into their jobs. And we've also reported on how the GSA, the General Services Administration, has launched something called GSAi. But we also later found out that that's something that was based on an existing code base, a project that existed prior to DOGE taking over the building. I think the short answer is that when DOGE first started, we didn't really have a clear sense of how they were going to use AI. And even right now, after saying all that on this podcast, I cannot pretend to understand fully what they are doing with AI. And that's either due to a lack of transparency, or just the fact that it all seems very disparate, very scattered. I'm not going to sit here on this podcast and pretend to make sense of it. Michael Calore: With a lot of this stuff, it's hard to understand where the DOGE initiatives end, and where just other initiatives in the federal government begin. 
I think simply because there's a lack of transparency about how these decisions are being made, who's advising who, and who's really drafting the memos. When we think about what is AI going to do, we have to consider what an AI agent is. It is a program that can do the same work as a human being. And that's just the broad definition of it. So, you can deploy an AI agent to write emails, make phone calls, fill out paperwork, whatever it is. You're just basically doing admin work, and there are a lot of admins in the federal government, and I think that that is in our future. People have this cozy idea that their experience with AI is maybe ChatGPT or Siri, or something like that. So, "Oh, you have a problem with your taxes, you can just talk to the IRS chatbot and it'll solve it for you." That sounds like a nightmare. I can't imagine that any IRS chatbot is going to be able to solve any problems for me. It'll probably just make me mad and make the problems worse or the same. But when you think about, "Okay, here is an opportunity for us to use these AI agents in a way that will increase efficiency across the government," what you're really talking about is just that we don't need these people anymore and we just need to replace them with the technology. Katie Drummond: One of the pieces of this that I think is so consequential, I remember maybe a year and a half ago, talking to a bunch of civil servants, people in decision-making roles across federal agencies, and they were all asking a lot of questions about AI. They were very curious about AI. The Biden Administration executive order had put forth all of these different demands of different agencies to investigate the potential for AI to do X, Y, or Z within their agencies. So they were in that exploratory process. 
They were very slow to think about how AI could be useful within those agencies, and that's for bureaucratic reasons, but it's also because the work of these federal agencies, you don't really want to get it wrong. When we're talking about the IRS or we're talking about payments from Treasury, we're talking about evaluating new drugs via the FDA, you want to be right. You want to reduce the risk of error as much as possible. And I think for so many people in technology, there's this notion that technology outdoes human performance just inevitably. It's inevitable that a system will do a better job than a human being who is fallible, who makes mistakes. That said, what we know about AI so far, generative AI in particular, is that it makes a lot of mistakes. This is very imperfect technology. AI agents are not even really ready for primetime within a private company for one individual to use in their own home, let alone inside the federal bureaucracy. So, I do think that a lot of what DOGE has done with AI, like Lauren, to your point about them building on top of this existing AI initiative at the GSA, is they're taking very preliminary work in AI at these agencies, and they're just fast tracking it. They're saying, "This is going to take three years. No, no, we're doing this in three weeks." And that's scary, given what we know about AI and how effective and how reliable it is right now. So, does anything stand out to you guys about that in the context of what we're talking about around AI and DOGE, and AI in the federal government? What are some of the risks that really stand out to you guys? Lauren Goode: I think that it is consequential when you think about AI being used in such a way that it ends up impacting people's jobs, right? Katie Drummond: Right. Lauren Goode: But I actually think that that idea of AI agents doing the jobs of humans at this point is a little bit optimistic. 
And when I think about what feels more consequential, it's this idea of AI just becoming a code word or a buzzword for what is essentially very, very, very advanced search. So, if they are able to build this master database that creates some sort of profile of every US citizen, or every US non-citizen, and is pulling in from all these different data sources, both within government agencies, but public documents, and across the web and across social media, and anything you've ever tweeted, and anything you've ever said, and anything you've ever done, and if you've ever gotten a parking ticket or a DUI, or you've committed a crime, or anything like that, to just hoover that all into one centralized location and be able to pull that up on a citizen at the drop of a hat, that, to me, feels more consequential and potentially more dangerous than going to the Social Security website and having an annoying bot trying to answer your questions for you. Michael Calore: It's surveillance creep, really is what it is. And marry that with computer vision, like face recognition and the ability to photograph everybody who's in a car at the border, cross-reference that with government documentation like passports and driver's licenses, and you have a whole new level of surveillance that we have not dealt with before in our society. Katie Drummond: Now, not to be all negative Nelly, because we often are, but does any ... What? Michael Calore: What show are you on? Katie Drummond: You know me, the Canadian. Does anything stand out to both of you as having actually been good from all of this? So, DOGE takeover January to May, anything potentially exciting? Any bright spots, anything where we should be a little bit more generous in our assessment and say, "You know what, actually, as dystopian and scary as a lot of this is, this is potentially a good thing, or this is unequivocally a good thing"? Anything like that that stands out to either of you? 
Lauren Goode: I would say that if there's one area where we could be a little bit more generous, it might be that if this turnaround of the federal government was something that was being done in good faith, then I might give them a pass after just five months. I might say ... Katie, you've done turnarounds before? Katie Drummond: I have. Lauren Goode: They take longer than five months, right? Katie Drummond: They do. Lauren Goode: Yes. Okay. Katie Drummond: Depends on the size of the organization. With the federal government, you're looking at five to 10 years. Lauren Goode: Right. Exactly. So there's that. In terms of the actual cuts to fraud and abuse as promised, as far as we know and has been reported by other outlets, the actual cuts that DOGE has made fall far below what Trump and Musk had promised. Initially, they said that they were going to slash $2 trillion from the federal budget. That goal was cut in half almost immediately. The latest claims are that $160 billion has been saved through firing federal workers, canceling contracts, selling off the buildings, other things. And NPR just reported that the tracker on DOGE's own website is rife with errors and inaccuracies, though. The wall of receipts that DOGE has been posting totals just $63 billion in reductions, and actually, as of late March, government spending was up 10% from a year earlier. Revenue was still low. So, we're still in a deficit, in terms of federal spending. There is one thing I've heard from folks in Silicon Valley they think is a good thing. It's Musk's pushback on some of Trump's immigration policies, specifically those that affect high-tech workers. During Trump 1.0, the denial rates for H-1B visas spiked, and Trump said he wanted to end, forever, the use of H-1B visas; he called it a cheap labor program. Now, he has flip-flopped a bit. Stephen Miller, his Homeland Security Advisor and Deputy Chief of Staff, has been pushing for more restrictions on this worker visa. 
But Musk, who actually understands how critical this visa is for the talent pipeline in Silicon Valley, maybe because he's an immigrant, I think has managed to sway Trump a bit on that. And so, for obvious reasons, perhaps people in Silicon Valley say, "Well, I think this is actually a good thing that Musk is doing." Michael Calore: I'll point out two things. Lauren Goode: Go ahead. Michael Calore: One, the LOLs. The press conference that they did in the Oval Office where Elon brought his child— Katie Drummond: Oh, that was good. Michael Calore: That was definitely a big highlight for me. But seriously, the other thing is that people are really engaged now. You talk to people who are somewhat politically minded, and they have opinions about government spending, they have opinions about oversight and transparency, they have opinions about what actually matters to them. Like what do they need from their government, what do they want their government to do for them. Those were all nebulous concepts even five, six months ago that I think are at the top of everybody's mind now. And I think that is a good thing. Katie Drummond: Oh, I love that. A galvanized and engaged public— Michael Calore: That's right. Katie Drummond: As a plus side to DOGE. I love it. We're going to take a quick break and we'll be right back. Michael Calore: Welcome back to Uncanny Valley . Before we wrap up, let's give the people something to think about, our recommendations. Katie, why don't you go first? Katie Drummond: I have an extremely specific recommendation. Do either of you use TikTok? Lauren Goode: I do sometimes. Michael Calore: Define use. Katie Drummond: Scroll. Lauren Goode: Yeah, scroll maybe like once every couple weeks. Katie Drummond: Do you thumb through TikTok? Michael Calore: I'm familiar with it, yes. Katie Drummond: There is an account on TikTok called Amalfi Private Jets. It is the account of a private jet company. 
This is the most genius marketing I have ever seen in my life. For someone who likes reality TV and trash, which is me. It's these little 60-second reality TV episodes, where the CEO of Amalfi Private Jets is on the phone or he's on a Zoom with one of his clients, often, I think her name is McKenna. She's a young, extremely wealthy, entitled little brat, and she'll call him up in the clip, he's at his office. He's young and handsome, and he's like, "Hey, McKenna." And she's like, "Hey, Colin. So, my dad said that I had to fly from Geneva to London," and blah, blah, blah. And then there's this whole dramatic narrative around McKenna and why she needs a $75,000 jet immediately, and she needs it to have vegan spinach wraps refrigerated. It's just these very dramatic little vignettes of what life is like for the rich and fabulous who are calling Amalfi Private Jets to book their private jets. So there's that account. And then, once you go down the rabbit hole of that account, the TikTok algorithm will start serving up these companion accounts they've created, like the CEO of the company has one, his girlfriend has one. I think McKenna now has one. And so, there's this little cinematic universe of Amalfi Private Jets on TikTok, and you get sucked in, and you get to know all of these people. And it's a little vertical video reality show experience that I highly recommend if you only have 60 seconds, which then turn into two hours, which then turn into pulling an all-nighter to learn everything about Amalfi Private Jets, their CEO, his girlfriend, and their wealthy clientele. This is the TikTok for you. Enjoy. Michael Calore: This is genius. Katie Drummond: Thank you. Lauren Goode: This is the reality TV of the future. Katie Drummond: It's incredible. Lauren Goode: It has arrived. Katie Drummond: And you know what? And I just did their job for them, because it's marketing for their company. They got me. Michael Calore: All right, Lauren, what's your recommendation? 
Lauren Goode: My recommendation might go nicely on your Amalfi Private Jet. Hear me out, peonies. You guys like flowers? Michael Calore: Oh, peonies. Lauren Goode: Peonies. Katie Drummond: I like flowers. Michael Calore: Sure. Lauren Goode: Do you like peonies? Katie Drummond: I couldn't tell one from another, but I like them. Lauren Goode: They're beautiful. It's peony season here. I'm saying that now with the O enunciated, which is how I would do it if I were giving my Architectural Digest home tour. Michael Calore: I see. Lauren Goode: Yes, these are peonies. Katie Drummond: Oh, I'm just looking at Google images of them. They're very nice. Lauren Goode: Aren't they beautiful? Katie Drummond: They're very nice. Lauren Goode: The cool thing is they do have a very short-lived season. In this part of the world, it's typically late May through June. If you plant them, they only bloom for a short period of time. If you buy them, they're these closed balls, not to be confused with Edward Coristine "Big Balls." They're these closed balls, and then after a few days they open up and they're the most magnificent looking things. They're really, really pretty. And I got some last week at the flower shop, and when they opened, I was like, "Oh my God." It just made me so happy. And they're bright pink. And so, if you're just looking to do something nice for yourself, or you just want to pick up a nice little thoughtful gift for someone, get them some peonies. You know what? I didn't check to see if they're toxic to pets. So, check that first, folks. But, yes. Michael Calore: That's great. Katie Drummond: Mike, what's yours? Michael Calore: So, I'm going to recommend an app. If you follow me on Instagram, Snackfight on Instagram, you may notice that I have not posted in a long time, and that's because I stopped posting on Instagram, and I basically just use it as a direct message platform now. 
But there are still parts of my brain that enjoy sharing photos with my friends, so I found another app to go share photos on and it's called Retro. Lauren Goode: Yeah, Retro. Michael Calore: So, it's been around for a while, but I went casting about for other things out there, and I found that there was a group of my friends who are on Retro, and I was like, "Oh, this is great." It's very private. By default, somebody can only see back a couple of weeks. But if you would like to, you can give the other user a key, which unlocks your full profile so that they can look at all of your photos going back to the beginning of time, according to whenever you started posting on Retro. I really like that about it, the fact that when I post a photo, I know exactly who's going to see it. There are no Reels, there's no ads, there's no messaging features, there's no weird soft-core porno on there, there's no memes. It's just pictures. And I really like that. It's like riding a bicycle through the countryside after driving a car through a city. It's like a real different way to experience photo sharing, because it's exactly like the original way of experiencing photo sharing, and I'd forgotten what that feels like. Katie Drummond: Oh, it sounds lovely. Lauren Goode: What's cool about the app too is when you open it and you haven't filled out that week's photos, when you tap on it, it automatically identifies those photos from that week in your camera roll. It's like, "You shot these photos between Sunday and Saturday, and here's where you can fill this week in." Michael Calore: And— Lauren Goode: It's pretty cool. Michael Calore: And all the photos from the week stack up. So, if you post 12 photos, and then you look at my profile, you can just tap through all 12 photos, and then that's it. That's all you get. Lauren Goode: Good job, Nathan and team. Michael Calore: Who's Nathan? Who are you shouting out? Lauren Goode: Nathan Sharp is one of the cofounders of it. 
He's a former Instagram guy. I think his cofounder is as well. It was founded by two ex-Instagram employees. And the whole idea is they're trying to make something that's not the anti-Instagram, but it is more private. Michael Calore: Feels like the anti-Instagram right now. Lauren Goode: It's nice. It's a nice place to hang out. Michael Calore: Well, thanks to both of you for those great recommendations. Lauren Goode: Thanks, Mike, for yours. Katie Drummond: Yeah, Mike, thanks. Lauren Goode: Thanks, Mike. Katie Drummond: Bye. Lauren Goode: See you on the jet. Michael Calore: And thanks to you for listening to Uncanny Valley . If you liked what you heard today, make sure to follow our show and rate it on your podcast app of choice. If you'd like to get in touch with us with any questions, comments, or show suggestions, please write to us at uncannyvalley@ We'd love to hear from you. Today's show is produced by Kyana Moghadam. Amar Lal at Macro Sound mixed this episode. Jake Loomis was our New York Studio engineer. Daniel Roman fact-checked this episode. Jordan Bell is our Executive Producer. Katie Drummond is WIRED's Global Editorial Director, and Chris Bannon is the Head of Global Audio.


WIRED
01-05-2025
- Health
- WIRED
The Dangerous Decline in Vaccination Rates
Measles Vaccinations offered by Harris Public Health are photographed on Saturday, April 5, 2025 in Houston. Photo-Illustration: WIRED Staff; Photograph: All products featured on WIRED are independently selected by our editors. However, we may receive compensation from retailers and/or from purchases of products through these links. In the year 2000, measles was declared eliminated from the United States. But thanks to declining vaccination rates, Americans may have to contend with a much scarier future for the deadly disease. Today on the show, we talk about the state of measles, and we explain the role Robert F. Kennedy Jr., Secretary of Health and Human Services, has played in the shifting culture around vaccines in America. You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Katie Drummond on Bluesky at @katie-drummond. Write to us at uncannyvalley@ How to Listen You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how: If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for 'uncanny valley.' We're on Spotify too. Transcript Note: This is an automated transcript, which may contain errors. Michael Calore: How's everybody feeling? Katie Drummond: I'm on the road this week, which listeners might notice, but I feel okay, I feel good. Lauren, how do you feel? Lauren Goode: I'm doing good, actually. I'm feeling better than I've been feeling in months. Katie Drummond: Wow! Lauren Goode: Somehow, some way. I'm not on the road. I think maybe that's why. I've had a long bout without travel. Katie, I thought you were going to say that after you came back from France and you developed this new French butter habit that you were feeling better than ever before. Katie Drummond: I do. The butter is so life-affirming. 
I've been eating a lot of French butter. And I feel great! I feel incredible. Lauren Goode: Amazing. Did you run five miles this morning? That's what I want to know. Katie Drummond: I ran seven miles this morning. Sorry. Lauren Goode: Stop it! Michael Calore: Seven? Katie Drummond: Yeah. Michael Calore: Wow. Lauren Goode: Flex. Michael Calore: I have run zero miles in the last month. No, no, that's not true. Lauren Goode: No, we went running. Michael Calore: In the last two weeks. We did. Lauren Goode: Yeah. Michael Calore: It's an important part of my health routine and when I don't do it, I definitely feel it. Katie Drummond: Oh, yeah. Lauren Goode: Yeah. Michael Calore: Today we are going to be talking about our health, and not just our own health, but the health of all Americans, because it has been on everybody's mind lately. Here at WIRED, we've been reporting on the current administration's dismantling of the public health agencies and defunding of research programs. We've tracked all the ways that Elon Musk and his DOGE cohort have been hoovering up the sensitive health data of millions of Americans without offering a clear explanation of why they're doing it. And we've been watching the shift in culture around vaccines in America, and changing attitudes about what the government's role should be in our collective well-being. On this episode, we're going to talk about all of that. We'll talk about the measles outbreak. We'll talk about all of the other health crises that we really thought we would never have to talk about anymore. But it won't be all heavy stuff; I'm sure we will find a way at some point to have some fun in this show. Katie Drummond: We just had fun. 30 seconds ago we were having fun. Lauren Goode: Can we just go back to talking about butter and running? Katie Drummond: We'll have fun again. We will have fun. Michael Calore: We promise. 
This is WIRED's Uncanny Valley, a show about the people, power, and influence of Silicon Valley. I'm Michael Calore, director of consumer tech and culture here at WIRED. Lauren Goode: I'm Lauren Goode, I'm a senior writer at WIRED. Katie Drummond: And I'm Katie Drummond, WIRED's global editorial director. Michael Calore: Let's start by talking about the measles outbreak. Oddly, this is not the first time that we've brought up measles on Uncanny Valley. Because last month, Katie, you talked with Emily Mullin from our science desk on one of our Tuesday episodes about the rise in measles cases that has happened under the watch of Robert F. Kennedy, Jr., who is the country's head of health and human services, appointed by the Trump Administration. Can you tell us what is going on with measles? Katie Drummond: Unfortunately, I can. I wish that I didn't have to talk about measles, but here we are. Look, measles cases are on the rise in this country. Measles has not been a going concern in the United States for a very, very long time. But that is because of vaccinations. That is because of successful campaigns to get American parents in particular to vaccinate their children. But an outbreak in Texas that started earlier this year is changing that. We have seen more than 600 cases of measles and two deaths, two children who have died from measles, this year alone. It's the largest outbreak of measles in Texas since 1992. Nationally in the United States, we've seen 800 cases of measles so far this year, which is the most since 2019. Just by comparison, take that 800 number so far this year. Last year, how many measles cases did we see in the United States in the entire year? 285. This is an exponential increase in measles cases in the United States, primarily concentrated in Texas around that outbreak. Michael Calore: We also have some new research about measles that we should talk about, some research out of Stanford. Lauren, your alma mater. Lauren Goode: Oh my gosh. 
I wish Zoe was here just so that she could groan and say, "Do we need to talk about Stanford again?" Yes, there is new analysis from epidemiologists out of Stanford; our colleague Emily wrote about this last week. The research was published in the Journal of the American Medical Association, and they used a computer model basically to look a little bit into the future. They determined that, with current state-level vaccination rates, measles could reestablish itself and become consistently present in the United States in the next two decades. Then they tried a variety of different simulations, and their model predicted this exact outcome in 83% of the simulations that they did. What's interesting is that if the current vaccination rates just stayed the same, the model estimated that we could see more than 850,000 cases, 170,000 hospitalizations, and 2,500 deaths over the next 25 years. We need to get our vaccination rates higher in order to thwart that. Basically, they said measles could become endemic if we don't course-correct quickly. There is a difference between endemic, an epidemic, and a pandemic, which we've all just been living through. It's still not good. Michael Calore: Right. Lauren Goode: No matter which way you look at it. Michael Calore: Right, because it's a deadly disease. Lauren Goode: Correct. Michael Calore: This goes beyond just measles. What we're talking about is the MMR vaccine: measles, mumps, and rubella. But there are other vaccines that people are supposed to be taking that they're not taking, and it's mostly among children. I think all the numbers show that kindergarten vaccination rates are down, which is one of the big factors that public health experts study. Lauren Goode: That's the scary part. By the way, for all of us here, do you remember getting MMR? Michael Calore: Oh, of course. Lauren Goode: Right. I'm looking at Katie, too. I'm like, "Katie?" 
Katie Drummond: I was just going to say, as you were asking that question, in my head I was like, "Who remembers what they did when they were five?" Lauren Goode: Well, that's the thing. Katie Drummond: I don't remember that happening to me, but I know from ... I remember, I think when I was pregnant, you have to get some vaccinations and some updates. I think I remember checking in with my dad just to be sure. But it wasn't really a question in our family. I think still, for the majority of families in the United States, it's not really a question of whether or not you're going to vaccinate your children, whether or not you were vaccinated. But there is this growing minority of people who are making a different choice, who are choosing not to vaccinate their kids, who are changing the vaccine schedules, who are spreading out vaccinations because they incorrectly think that that is a safer way to vaccinate. It is that minority, as those percentages shift, that makes a really, really big difference when you're talking about herd immunity and you're talking about protecting an entire community. But no, I don't remember being vaccinated, but I was. Lauren Goode: That's exactly it. I don't remember the shot going into my arm, but I remember it was just standard that you got MMR. Then subsequently, for example when I did go to grad school, which happened to be a little bit later in life, I was in my early 30s, I remember asking my mother, because I literally would not have been allowed to go to school if I didn't have evidence of these vaccines. We were looking for a little piece of paper- Katie Drummond: Right. Lauren Goode: ... from the late 1980s that had this. But it was just assumed that we did it. The vaccines we're talking about here include MMR (measles, mumps, and rubella), DTaP, polio, and chicken pox. The drop in vaccination rates is especially dangerous for babies and kids. That decline, which I believe is at the state level from 95 percent to 93 percent, may seem small. 
But when you consider other factors, like how contagious some of these diseases are, how contagious measles is, that's the alarming part. Michael Calore: Yeah. All of this data coming out, and the outbreak, and vaccine hesitancy that we're seeing in society are happening at a moment when we have a new cabinet secretary for health and human services, Robert F. Kennedy Jr., who has brought many of these beliefs about vaccinations being bad and about how public health should be managed into his job. What kind of energy is he bringing? We all know the answer to this, but I want to break it down. I want to talk about what his role in this moment will be. Katie Drummond: I think it's really important to be really, really clear about RFK Jr., about his legacy, about the damage that he and others have done to this country and to the integrity of trust in science and in scientific research in the United States. I think one of the really interesting things we're seeing play out now with RFK Jr. is that he is walking back, or modifying, or trying to tread this very careful line where he doesn't come out and enthusiastically deny that vaccines are safe and effective, which they are. But he doesn't want to go so far in the other direction, either. He's essentially trying to launder his history in the eyes of the American public. But the reality is Robert F. Kennedy Jr. has been leading the charge against vaccinations in this country for decades. He was the chair of Children's Health Defense, which is a nonprofit that campaigns very vigorously against vaccinations. He has many times suggested that vaccines cause autism. I remember during the pandemic he said that COVID-19 is targeted to attack Caucasians and Black people. He said, "The people who are most immune are Ashkenazi Jews and Chinese." More recently, we have seen him try to tread the line where he is essentially saying things like, "People should think about vaccines. They should talk to their doctor. 
This is a personal choice." I don't really think it is actually a personal choice. I think it is a choice that you make with the knowledge that you live among a community of other people. You don't necessarily get vaccines just to protect yourself or just to protect your child. You get vaccines to protect the entire community that you live within. This is a population-wide imperative. That's something that I think even now in his current role, where he does need to tread a more careful line or he is trying to tread a more careful line, he has failed spectacularly to communicate that to the American public. Lauren Goode: Katie, right now, is Kennedy in support of MMR, or is he still toeing the line on vaccines? What's the latest? Katie Drummond: Well, I think the most recent comments he has made about MMR, after months of a lot of pressure and a lot of back-and-forth, he said, "The MMR vaccine is the most effective way to prevent measles." He has said that. That being said, in recent months he has also said things that directly contradict that statement or that call that statement into question. He did an interview with Fox News in March, so just a little over a month ago, where he said, "There are adverse events from the vaccine. It does cause deaths every year. It causes all the illnesses that measles itself causes, encephalitis and blindness, et cetera. People ought to be able to make that choice for themselves." I want to be very clear here. Healthy people, generally speaking, healthy kids, healthy adults who go get the MMR vaccine do not die from that vaccine. That is not a thing. What he is saying is false. He's saying it on Fox News and he's saying it to millions of Americans. Many of whom, if they are regular viewers of Fox News and regular consumers of right-leaning and far-right news organizations, they are already asking questions about vaccines. They are already potentially deciding not to vaccinate their kids. 
Maybe they are deciding not to get their own vaccines, not to get the flu shot every year. They are already a vulnerable community of people. What he is doing in interviews like that is further sowing doubt in that community, in those populations of people, around the safety and efficacy of these vaccines. I don't really care if, at some point now, he says the MMR vaccine is the most effective way to prevent measles. Well, cool, dude. You have spent the last several months in your role as a government official, and the last several decades as a high-profile person on this planet, telling everybody that this vaccine and other vaccines are not safe. That they might kill you. That is not true. Michael Calore: We're snapping our fingers here in the studio, but we need to take a break and we're going to come right back. Welcome back to Uncanny Valley. Ever since Donald Trump's inauguration and Robert F. Kennedy's approval as the cabinet secretary for health and human services, we have seen a rapid dismantling of HHS and all of the agencies that work underneath it. They're not going away, but there have been job cuts, there have been consolidations, there have been funding cuts for research, and all kinds of chaos. Where should we start with what's been going on in Washington? Katie Drummond: Oh, boy. These are, as WIRED and so many other outlets have reported, these are huge cuts. These are tens of thousands of employees at these agencies losing their jobs. I think it's important to note, just in the context of this conversation today, that at the same time as RFK Jr., and DOGE, and the administration are making these sweeping cuts to federal health agencies, they are also targeting what appears to be a lot of vaccine-related infrastructure. I think one of the most notable examples to me, and something I found particularly disturbing, is that the NIH is actually asking researchers to scrub references to mRNA vaccine research in grant proposals. 
Essentially suggesting, we don't know for sure, but there are strong indications that the federal government under Donald Trump and these health agencies under RFK Jr. will be deprioritizing mRNA vaccine research. Now, I should remind everyone that mRNA vaccines, and the incredible research that has allowed them to be possible, are the reason several years ago we were able to get shots in arms to make sure that millions more Americans, not to mention people around the entire world, did not die of COVID. mRNA vaccines were the key to thwarting a devastating pandemic. This was just a couple of years ago. It's an incredibly promising field of research, and it's one that now potentially looks like it's at risk because of the approach RFK Jr. is taking to what he describes as vaccine safety. "We need to make sure these things are safe." God forbid everybody die of measles because they got a vaccine for it. Which, again, doesn't happen. Under the auspices of safety, that really promising, experimental work into vaccination technology will not happen. That's what really stands out to me from all of this, among other things. Michael Calore: They have to know that this is going to have a destabilizing effect on the health of Americans. Because the plan that they're instituting right now involves rolling back not only research and funding for new medicine, but also vital systems that people of lower means in our society rely on to access healthcare. Healthcare for minority communities, healthcare for people who are suffering from addiction issues, healthcare for single mothers on public assistance. These are the programs that are all being rolled up into a new agency called the Administration for a Healthy America. When those programs are rolled up, they're going to be smaller, and they're going to have less funding and fewer people working there than they did. 
It's this odd moment that we have where not only are we putting less research and less effort into finding new cures for things, but we're also providing less public support in general. My big question is, what's the plan here? What do we expect is going to happen? Katie Drummond: What we expect is going to happen is that some of these illnesses that were very much under control in this country will, as Lauren said earlier in the show, become endemic again. Or that when the next COVID, the next devastating pandemic, arrives, we will not have the resources to contain it, and communities won't have access to the information, let alone the vaccinations, that they need to take care of their families. Some of the grants that have already been canceled in this mass culling of federal agencies and this realignment of federal priorities, these are grants that provide measles vaccination centers in Texas. Mike, you were talking a minute ago about what the administration thinks is going to happen and what the plan is here. I think it's one of two scenarios to me, and neither one is particularly reassuring. Scenario one is that they genuinely think that the United States will be a healthier country if they eliminate experimental research into vaccinations, if they provide less access to this medical care to communities across the country. They might actually think that, based on what they seem to believe, what RFK Jr. seems to believe, that this will be a healthier country if that happens. That's scenario one. Scenario two is that they just don't care. Especially when we're talking about vulnerable communities. One of two scenarios, not sure which one it is, don't like either of them. Michael Calore: Yeah. I feel like both are on the table right now. The goal of DOGE is to eliminate waste, fraud, and abuse; that's something that you see in all of the executive orders and all of the communications coming out of the government right now. 
"We're stamping out waste, fraud, and abuse." Sure, there was probably some waste, there was probably some fraud. I'm not so sure about abuse. But the wholesale dismantling of these programs that people rely on for their day-to-day lives to work just doesn't feel like the right path forward for America. I do not feel bad about saying that on a podcast. Lauren Goode: No. I think all of this actually threatens to make America as a nation weaker. Not great again. The mRNA research that Katie mentioned earlier that led to the COVID vaccines, that led to "Operation Warp Speed, look, we've done this so quickly," was actually years in the works, as was the foundational technology for mRNA. All of our most pivotal research around cancer treatments and other diseases takes years. Then when we have a fractured system, a fractured healthcare system, we also become unable to respond as quickly as we should to threats of bioterrorism. Just picture all of the misinformation that, in some ways, we've been faced with for years, but now it's amplified because of internet culture, too. Just picture all of that flying around in a moment, in a very acute moment of needing a clear leader with evidence-backed knowledge in the room, and we don't have that right now. Michael Calore: Yeah. I want to dig into something you just said, which is internet culture. Lauren Goode: It's a big part of this. Michael Calore: Yeah. Can you talk us through how big of a part it is, and what the influencers are doing in this moment? Lauren Goode: Well, one of the trademarks of the wellness industry, and particularly on the internet, is that it doesn't really have an established standard of credibility. And that it's constantly suggesting information, and tips, and hacks to people that put something just out of reach for them. Just one more thing that you should be doing to optimize your health; it keeps that machine turning. 
There was this cultural critic who popped into my feed recently, and I can't remember his name, but he made a great point about what happened after GLP-1s became widely accessible to people: you started to see all the wellness influencers start to hype Pilates. Pilates is having a moment because it's the next thing in this flywheel of health hacks that just makes it a little bit more expensive, inaccessible, a thing that all the celebrities are doing that you can't do, but you should be doing. In a nutshell, that is the health and wellness industry online. You combine the psychology of that with the fact that a lot of people do feel utterly disgusted with the US traditional healthcare system, with health hacks that seem to give you this sense of control, with a total lack of enforcement around bogus health claims on the internet, and then you add someone like RFK Jr. to the mix, who is supposedly speaking from this position of authority. It's a powder keg. It's this non-toxic, fluoride-free vitamin A powder keg. Katie Drummond: Save us from fluoride, Robert. Lauren Goode: Right. Save our teeth from fluoride. Then occasionally, you have these outlier examples that come up that end up supporting these claims. Someone does happen to have an adverse reaction to a vaccine. A family member who has been shunned by traditional healthcare, but actually did self-diagnose and is now thriving. It becomes an example in people's minds. Very occasionally, RFK Jr. will say something that a majority of people can glom onto, like wanting to ban those ridiculous pharmaceutical ads you see on television. Everyone goes, "Okay, yeah, that makes sense." Michael Calore: Yeah. Lauren Goode: You just combine all of these things and you just have this perfect storm of the potential for misinformation that actually seriously harms people's health. Michael Calore: Yeah. Katie Drummond: Yeah, I think that that's all right. 
I was thinking about this last night, knowing that we were going to be talking about this today on the podcast. I remember in 2011, I went to Minneapolis. I reported a story about Andrew Wakefield. If that name rings a bell, it should. There's a very large community of Somali immigrants in Minneapolis, and Andrew Wakefield was spending a lot of time with them, essentially telling them not to vaccinate their kids. That the MMR vaccine caused autism. He created this massive public health catastrophe in this one city, in this one community that he decided to target. It started in 1998, when Andrew Wakefield published a paper saying, "It sure looks like this vaccine causes autism." That was this seminal turning point, to me at least in our lifetimes, around this idea of vaccine hesitancy. Around this idea of just asking questions about vaccines, which ultimately became a very dangerous thing. I think through the 2000s, we're now talking about 25 years of history, through the 2000s, the emergence of social media, of online connectivity, really built up this mistrust and this anti-vaccine crusader movement. Along with that came anti-vax activists' very smart use of celebrities. People like Jenny McCarthy, who I remember came out and said, "My son has autism and I think that vaccines are the reason." Michael Calore: Yeah. Katie Drummond: Celebrities like Jenny McCarthy and RFK Jr., who has spent the last 20-plus years of his career parroting a lot of the language, a lot of the ideas, of Andrew Wakefield, who has been discredited over, and over, and over, and over, and over again. You have people like RFK Jr. picking up that mantle and taking it, and then feeding it into this social media machine where, to Lauren's point, that information, or more accurately that misinformation, can propagate and reach communities not only all around the country, but all around the world. 
You saw it in the late '90s with Andrew Wakefield, and really, to me, as I think about it in my lifetime, it has just metastasized from there. It has been taken on by high-profile people. It has made its way onto the internet. It has made its way into influencer culture. Of course, then the COVID pandemic was the perfect storm for all of this to spiral, I think, really out of control. Lauren Goode: I think we all remember that moment during the COVID pandemic when information was scant. I think we were all very afraid. Donald Trump said something about injecting disinfectant into your body. I think we can safely say that was misinformation. Katie Drummond: Yes, I think we can safely say that, injecting bleach or whatever horse medication was being bought up across the country by desperate people ... Look, COVID was this very, very scary, very isolating, very unprecedented moment in American history, in world history. You think about that moment, the year 2020. Well, we had all just spent the last 20 years being fed anti-vax narratives, or at the very least "just asking questions about vaccines" narratives. First in the analog media, and then through social media and all over the internet. You have people who are already very alienated from the US healthcare system. They don't trust big corporations, they don't trust big pharma. There are good reasons for all of those things. They're sitting at home, they're by themselves. They don't have access to their broader community. What they do have access to is the internet, and they start hearing the President of the United States talking about injecting God knows what nonsense into their veins, and there you have it. I think COVID was the rock-bottom moment for this country, at least so far, in adoption of vaccines. I will say, I have family members, I'm sure so many people listening do, maybe you guys do, too. 
I have family members who chose not to be vaccinated for COVID in 2020 or 2021, and to this day are not vaccinated against COVID because they are scared of mRNA technology, they're scared of vaccines. They think that the vaccines do more harm than good. That is the institutional leadership that has brought us to this moment where we have children in the United States of America dying of measles. That is where we are. Michael Calore: That is a rough place to be. Katie Drummond: Yeah, it sucks. Michael Calore: Okay, let's take another break and then we'll come right back. Welcome back to Uncanny Valley. We're going to shift away from talking about health and we're going to talk about Signal. Because the thing that has been blowing up our group chat this week is, in fact, a group chat. It's a Silicon Valley group chat, it's a bunch of elites talking about God knows what. What do we know about the group chats? Lauren Goode: Katie, do you want to take this one? Katie Drummond: Oh, Lauren, this is so yours. Lauren Goode: Well, normally with Overheard, we would talk about something we've each overheard in Silicon Valley, but in this case, we are talking about what Ben Smith overheard. Ben Smith is the founder of a news outlet called Semafor, and he published a story this week about the Silicon Valley private group chats that have been shaping politics for years now. There are several chats referred to in this story, but they basically fall under the umbrella of something called Chatham House rules, based on the idea that, in order for people to express their ideas freely, they have to be able to speak in a private space. These chats that Ben reported about are made up of billionaires, venture capitalists, and thought leaders, and their views are mostly right-leaning or even fringe. These are members of a technocratic society who are expressing ideas that they believe would get them canceled online or shot down by the woke mob. 
Now instead of expressing them on Twitter like they might have a little while ago, they're putting them in group chats. These include reactions to the Harper's Letter that came out back in 2020, which was somewhat controversial. But also, more recently, these guys are responding to Trump's tariffs, where people aren't necessarily falling along party lines. They're actually criticizing Trump's tariffs. What's interesting to me, aside from the content of these chats, is that we're in this moment where reactionary right-wing politics are playing out in private Signal chats, but actually have so much influence over the public sphere right now. It's the modern-day version of salons. Katie Drummond: That's what I thought was so interesting about the story. It's a fascinating story, and honestly it was the kind of story that made me wish there was more. I was like, "Come on, Ben. Get some screenshots, man." Lauren Goode: Get the goods. Katie Drummond: Show us the goods. Lauren Goode: Yeah. Katie Drummond: But it was this idea that, for so many of these people ... We are talking about millionaires, billionaires, we're talking about the wealthiest, most powerful people running businesses or VC firms, or what have you, in this country, who are effectively saying, "I'm too scared to go on social media anymore because people are mean to me in the comments. I'm going to go hide with my other rich friends and we're going to talk on Signal instead." One of the big takeaways for me was the incredibly thin skin of some of these people, to feel like they can't vocalize an opinion or share a point of view on social media. And that they feel like they need to take it to a safe space and workshop it with 100 of their closest billionaire friends before they can all put it out in public together as a united front. I thought that was fascinating. I would say there are plenty of good reasons to shy away from using social media or sharing your opinions on social media. 
That is very real: the mob mentality, people going after you for what you think. Or your opinion, if you are a public figure, making news in a way you may not like. But I had to laugh at the idea that, for some of these people, sharing a thought on Twitter of all places, which is a pretty safe space for people with pretty extreme points of view, I will say, felt like too high a risk in this woke world that we live in, and that they need to go hide away in a confidential group chat to talk about what they really think. I thought that was a little bit ridiculous. Lauren Goode: Well, right. Then the moment that someone says something that goes against their ideologies, like Marc Andreessen saying, "I think it's time to take a Signal break." I also thought it was interesting how Marc Andreessen appears to have a couple of lackeys who he just tells to assemble these group chats for him. "Put me in with smart people!" Then someone goes and assembles a group chat of 20 people, and Marc Andreessen apparently is one of the most prolific texters. Katie Drummond: Yes, I loved that. Lauren Goode: Someone else in the article was saying, "I don't know how he has the time to do this. He's much busier than me, and yet he's the most active participant in this group chat." Katie Drummond: Honestly, just imagining him frantically toggling between different text groups throughout his day and his night while trying to do his job made me feel very stressed out. Take a breather. Please. Lauren Goode: Yes. Katie Drummond: Please, just chill out a little bit, man. Lauren Goode: Yes. Katie Drummond: It's too much. But also, add Lauren and me to your group chats. Lauren Goode: Right. I was just going to say, add us, you cowards. Add us to your group chats. We are open to joining the group chats. Max Read did a pretty good analysis of this in his Substack newsletter. He described how this is the perfect confluence of events for people to be radicalized within these group chats. 
Katie Drummond: Right. Lauren Goode: Because they started back in 2019, 2020, and then Clubhouse was a thing. Clubhouse was a moment when people were saying the quiet parts out loud on Clubhouse. But really, there were all of these little networks and groups that were forming behind the scenes because people were sitting at home with nothing to do except be online and live online. That led to these people coming together, but coming together along these explicitly political lines, and then radicalizing each other. Katie Drummond: So they're saying the quiet part quietly over Signal, privately. I don't love where all of this leads. I was very glad to see this story come out and to have a little bit of sunlight cast on this phenomenon. Thank you, Ben Smith. You did a good one. Lauren Goode: Yeah, I don't think that there's anything else in Silicon Valley that people are talking about quite as much right now. Maybe we should be talking about other things, though. Michael Calore: Yeah, we did just talk about RFK and Health and Human Services for 30 minutes, so thank you for the levity at the end of the show. Thank you all for listening to Uncanny Valley. If you liked what you heard today, make sure to follow our show and rate it on your podcast app of choice. If you'd like to get in touch with us with any questions, comments, or show suggestions, write to us at uncannyvalley@ Today's show was produced by Kyana Moghadam. Amar Lal at Macro Sound mixed this episode. Paige Oamek fact-checked this episode. Jordan Bell is our executive producer. Katie Drummond is WIRED's global editorial director. Chris Bannon is our head of global audio.


WIRED
24-04-2025
- Entertainment
- WIRED
Protecting Your Phone—and Your Privacy—at the US Border
A close-up of the US Customs and Border Protection badge. Photograph: All products featured on WIRED are independently selected by our editors. However, we may receive compensation from retailers and/or from purchases of products through these links. Under the new Trump administration, more and more visa holders and foreign visitors are being detained or denied entry at the border. It's also becoming more common for people to be questioned or detained because of content on their phones, laptops and cameras. In today's episode, we'll tell you what you need to know about carrying your devices across the US border, and how to stay safe. Plus, we share some pretty spectacular recommendations for your downtime. You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Katie Drummond on Bluesky at @katie-drummond. Write to us at uncannyvalley@ How to Listen You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how: If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for 'uncanny valley.' We're on Spotify too. Transcript Note: This is an automated transcript, which may contain errors. Michael Calore: Welcome back, Katie. Katie Drummond: Thank you so much. I am so happy to be back. My brain barely works. That's how great the trip was. We had a fantastic time. We did travel internationally. I took my family on a really fantastic vacation. Lauren Goode: What vibes did you get from people outside of the US about the US? Katie Drummond: It's interesting. If they were dealing with me or my child, I have a very cute child, they were lovely. I speak French. Look, I was in France. I don't know why I'm trying to conceal this. I have already come back. So I think that helped a lot. 
My husband, I'm married to someone who's very loud, and so he gives very American energy, and they didn't love that. But I don't know how much of that is just like loud man versus like, "Ugh, loud American." You know what I mean? Depending on which member of the family we're talking about, I think experiences differ. But I did great. Michael Calore: I remember going to France during the George W. Bush administration and I would tell people I'm from California. I wouldn't say I'm from the United States. I'd say I'm from California. And then they would express sympathy toward me. They would say, "Oh, I'm so sorry." They were very, very aware that California was politically left from the rest of the country. Katie Drummond: That's a very smart tactical move. Lauren Goode: It's probably more extreme now. Michael Calore: Yes, probably. Well let's get on with the show because today we are talking about travel, specifically international travel. Katie, you've inspired us. We're going to talk about how to protect yourself from phone searches when you're crossing the border into the US. This is WIRED's Uncanny Valley , a show about the people, power and influence of Silicon Valley. I'm Michael Calore, Director of Consumer Tech and Culture here at WIRED. Lauren Goode: I'm Lauren Goode. I'm a senior writer at WIRED. Katie Drummond: And I'm Katie Drummond, WIRED's Global Editorial Director. Michael Calore: Travel to the US is getting dicey. With travel warnings being issued, Customs and Border Protection cracking down on people, and President Trump's anti-immigrant rhetoric and policies, many people are reevaluating their travel to the US or abandoning their travel plans altogether. Lauren Goode: Yeah, there's a lot happening, and last week on the show we did an episode, sort of similar, about surveillance and protests and how to stay safe if you decide to take to the streets, exercise your First Amendment rights. 
We had a lot of good tips in that episode on safety, including what to do with your smartphone, how to protect your personal information. This week we are really homing in on the phone and we're talking specifically about what to do when you go through customs. Michael Calore: So this is the second part of a two-parter, we could say. Lauren Goode: That's right, and we should give a lot of credit, by the way, to our security team here at WIRED, our security desk because they've been doing a fantastic job reporting on all of this. Michael Calore: Yes, and there are several pieces that you should read right now if you are traveling internationally. We'll of course link to all of them in the show notes, or you can just go to WIRED. They're close to the top of the page. So let's start this conversation with the basics. What do we need to know about how phones are searched at the border when you're coming into the United States? Katie Drummond: Well, I think the most important thing for people to realize, the very basic premise, which I think can be surprising, is that when you come into the United States, whether you're a US citizen or a visa holder or just a tourist, Customs and Border Protection, so when you clear Customs at an airport or you're driving over the border between the US and Canada, they do actually have the authority to ask to search your phone when you enter the country. That is within their rights, which I actually think, until the last several months, if you haven't been paying close attention to border security and how we travel, you're just an everyday traveler, you might actually not be aware of. So I think that's the foundation of this entire conversation is that yeah, they can ask to take your phone. Lauren Goode: Right. And that can happen in other countries, too. Michael Calore: Yes. Katie Drummond: Absolutely. Lauren Goode: Not just the US. 
So when you're traveling internationally, it's part of the deal, and it has been, but there's a lot of attention on this right now. Michael Calore: So it feels like new information even though it's not new information, because I think most people would assume that if you're a US citizen, even though you're inside the border, standing in San Francisco or standing in New York City, you would have the protection of the Fourth Amendment, which protects you against unlawful search and seizure. But the Fourth Amendment works a little bit differently in certain parts of the country. Lauren Goode: Right. But then when you're at the airport, you're in a border zone, so all bets are off. Katie Drummond: Well wait, sorry. Explain what you mean by a border zone. If I am at JFK airport in New York City, I'm in a border zone? Lauren Goode: That's correct. Basically, areas within a hundred miles of any border, which includes airports, typically fall outside of the standard Fourth Amendment protections. The Fourth Amendment basically means that if an authority was going to search your device, they'd need a warrant first. But once you are in one of those border zones, basically there's no need for that warrant; customs can search any traveler's phone or electronic device. And that's not just phones. That's your iPad, it's your laptop, it's cameras, it's any electronic device. Michael Calore: Your iPod. Lauren Goode: Sure. For those of us carrying vintage iPods, that might be the way to go. Now it's just carry an iPod. Michael Calore: Well, it gets to an important point, which is what types of things are they looking for when they search a device? Katie Drummond: I think that the answer to that question has actually changed a lot as far as we can tell in just the last few months. Ostensibly, and I think the way this has typically worked as far as we are aware, is that they would be looking for potential terrorist activity, criminal activity. 
The majority of these searches, at least last year, were on non-US citizens. They're looking for people coming into the country who have the potential or the intention of committing criminal acts inside the United States. I think though that in this era, in everything we have seen so far with travelers being searched or detained or even denied entry into the United States, there is an overarching sense that border patrol officials are actually looking for something a little bit different. Even expressing your dislike for the Trump administration seems to be enough to cause you pretty serious problems at the border. Lauren Goode: What's concerning about what's happening right now is it's hard to say exactly why certain people are being targeted or detained at the border. In the past there was this understanding that maybe someone might be targeted for suspicion of terrorist or criminal activity, but now it just seems like anyone who is potentially deemed antagonistic in some way to the Trump administration, whether that's an academic doing a certain type of research or a journalist reporting critically on the administration or a student activist, a protester, it's just unclear what is going to be the thing that is getting you denied entry to the US. And there have been certain examples, like in March, a French scientist was denied entry to the US. There's also a story of a Wall Street Journal journalist returning from Beirut who was detained and searched after customs realized she had traveled to dangerous places. She did not allow them to search her phone. But that's one example of a journalist from a very established, reputable outlet being detained. 
Katie Drummond: One of the pieces of all of this in the last few months that has made it so difficult is that the press is reporting on these isolated incidents, like a scientist here, a journalist there, a tourist here, but there is not a codified or any sort of institutional guidance from the Trump administration about any new policies at the border, anything specific that they are now looking for or targeting. It all feels still very anecdotal, certainly anecdotal enough and common enough to be very scary, but there isn't one prevailing trend that we can point to. It's a diffuse range of very disturbing stories that are showing up in news outlets. It is a little tough to figure out exactly what's going on, how methodical it is, how systematic it is. We have been obviously covering the Department of Government Efficiency, DOGE, quite a bit, and one of the major efforts they appear to be undertaking is an attempt to pull data from all of these different federal agencies into one readily accessible database. So they want to be able to access a whole range of information about a given person in one place. And my suspicion, and I think what we have reported, is that that information, let's say they have a bunch of information about me in one place in some sort of database, all of a sudden if that information is then tied to whatever border security is looking at, it allows them to basically systematically start targeting people at borders, which is a very scary prospect. It sounds very far-flung and dystopian, but I think it is actually closer to becoming a reality than we might assume. Michael Calore: Yeah. Lauren Goode: I don't think it's far-flung at all. Honestly, I think that reality is here. Katie Drummond: Well, two years ago it would sound a little nuts. Lauren Goode: Yeah, it would. 
And I think that there's this feeling right now that if you end up in that position where someone wants to pull you aside and search your phone and they're presenting you with this "Enter your passcode" or they're probably stern about it, there's probably this sense of no one can help you now. Who's going to help you? You're going to say, "I would like to talk to a lawyer," or "No, you can't search my device," or "Sure, you can search my device, but when am I getting it back and what are you doing with that information?" You're just going to be in that moment. And so the best thing that you can do is arm yourself with all the information you need in advance, and also just take the steps that we're about to talk about. Michael Calore: And also I think we should quickly outline what happens in that situation, in that scenario. A Customs and Border Protection agent can ask you to open your phone either using Face ID or a fingerprint or by entering your PIN. So when somebody says, "Will you unlock your phone for me?" at the border, what they can do is they can search it manually. And that's what happens in most cases. They'll open up various apps, they'll look at things, they'll scroll through. They don't just pick up and start scrolling through all your Twitter messages, but they will look at your social media accounts, they'll look at emails, they'll look at contacts. Whatever they're looking for, they'll have access to the phone to find it. In some cases, it can involve an electronic search where they hook your phone up to a machine and it sucks all the information off of your phone and then they can do more complex data analysis on it. But what you should know is that if you're a US citizen, or if you're a green card holder, you can refuse to have your device searched at the border without being denied entry into the United States. You can have your phone confiscated, you can be brought into a little room and asked more questions. 
You will be scrutinized if you deny them the opportunity to search your phone, but they can't keep you from entering the country under normal circumstances. Katie Drummond: I think even just listening to you outline all of that out loud is very helpful. That is such an invasive prospect, and such a terrifying prospect to, Lauren, what you were saying a minute ago about you're talking to a stranger, they're in uniform, they're very stern, they are essentially blocking you from entering the country where potentially you are just visiting or potentially where you have a home. So much personal, private information is on this device, and I think just even imagining what it would feel like to be taken to a small room in an airport with a very mean person who's looking at your email, that's a really daunting thought and a really daunting prospect, and I think that just underlines how important it is to be prepared before you go to the airport, before you travel. Even to practice. I'll talk a little bit later about what I did to get ready for this trip, but it involved practicing. I did dry runs in my head of how I would handle different situations, and I think that's a really important thing, journalists or not, for anyone to be doing right now. Lauren Goode: And that's the best case scenario, if you're a US citizen. You can decline the search. You can still be allowed in. They might take your device, you might get it back at a later point, right? Michael Calore: Yeah. Lauren Goode: If you're a visa holder or a foreign visitor and you refuse, you can be detained or deported. Michael Calore: Yeah, they can say, "Well then, we're not going to let you into the country. You need to go talk to those people and they'll help you fly home." Lauren Goode: Right. Michael Calore: We're going to take a quick break and when we come back we'll talk about what you can do to prepare your phone and your devices to cross the border. Welcome back to Uncanny Valley . 
Let's talk about what folks can do to feel safe and to protect themselves when traveling into the US. What is the first thing that we want to recommend? Lauren Goode: Ahead of the traveling part, ahead of considering using a burner phone or a second phone or something like that, we should just talk about the type of apps that you're using, and particularly for messaging. This is very familiar for journalists already. We all use it, and of course now I think this has more brand awareness at a national level now because of Signalgate, which we've talked about before. But you should be using encrypted messaging, and an app like Signal in particular has a disappearing messages function so that after a certain period of time, you can set it to be a day, a week, a month, whatever it is, those messages will disappear. They're untraceable at that point, to the best of our knowledge. And so even ahead of a trip like that, you should be thinking about, are my accounts locked down? What do I have backed up to the cloud? Am I using truly encrypted messaging apps? Basically, you should have a good sense of digital hygiene, privacy hygiene, before you've even gotten to the border. Katie Drummond: Yeah, I think it's a good question to ask yourself. If you're spending time trash talking the President and the administration with your dad, should you really be doing that on iMessage? For journalists, I think we often think about Signal and encrypted messaging as a way to communicate with sources, and of course it's a very valuable tool for that, but this extends to all of the texting that you do with everyone in your life. And even for me, I would just say anecdotally, Mike and Lauren, I don't know about you guys, but a few months ago I actually had my family and all of my close friends start using Signal to text with me. 
I stopped using iMessage to text with them just because I don't want those personal interactions and off-the-cuff comments that everybody makes to be visible in any way. So it does suck when I go look at my family text group and I can't read anything that we sent yesterday. We've lost some very valuable and hilarious Drummond family jokes as a result, but it's worth it, because that long trail of conversation, just those very casual chats that you have during the day, are not visible to any prying eyes. Michael Calore: So if you're traveling into the US, through an airport or across a border, and you want to protect your device in the instance that somebody asks to search it, here are some things that you can do. You should disable biometrics on your phone. Turn off Face ID, or whatever face unlock feature you have if you have an Android phone, turn off fingerprint unlock, and only use a PIN. Katie Drummond: Can I interject here very quickly, Mike? Michael Calore: Yes. Katie Drummond: I do not understand for the life of me, truly, outside of even the context of travel, why anyone would be using Face ID to unlock their phone. It feels like such a high-risk move, that it's like, it's really that hard to remember four numbers? You want your face to be able to unlock your phone? Are you nuts? Are you nuts? Michael Calore: Well, Katie, just to interject on your interjection, you should be using a six-number PIN instead of a four-number PIN. Katie Drummond: Oh, my God. Hold on. I got to look at how many numbers my password is. Hold on. Oh, no, it's six. It's six. It's six. I relent. Michael Calore: But convenience. Convenience. Katie Drummond: No. Michael Calore: Human beings have been trading privacy for convenience— Katie Drummond: I know, I know. Michael Calore: —for— Lauren Goode: I know. Michael Calore: —millennia. Katie Drummond: Convenience is one of technology's most dangerous offerings. Michael Calore: And the convenience narrative is real though. 
We all really enjoy- Katie Drummond: It is, of course. Lauren Goode: Yes. Katie Drummond: —those conveniences that technology has brought us, but if you want the ultimate security— Lauren Goode: That's right. Michael Calore: —turn that shit off. Just in case anybody's wondering why you would want to turn it off, if somebody has your phone, they can just hold it up to your face and unlock it, or they can put you in handcuffs and then use your fingerprint to unlock your phone. Whereas if they need a PIN, you can say the helpful words, "I can't help you with that, Officer." Lauren Goode: Is that how you would say it in that tone? I just want to— Michael Calore: Yes. Lauren Goode: Have you practiced that? Michael Calore: Yes. Katie Drummond: That's how he practices. Michael Calore: Because you don't want to say, "I forgot" because then you might be lying. Lauren Goode: That's right. Michael Calore: You might be caught in a lie. You can just say, "I can't help you with that, Officer." Helpful tips on talking to cops here on Uncanny Valley . Katie Drummond: That is actually a very helpful tip. That's good. Michael Calore: Something else you can do: update your operating system. Lauren Goode: That's right. Michael Calore: Make sure it's fully up-to-date. Lauren Goode: That's right. You know when you go to the notes for OS updates and it will say- Michael Calore: Known security blah blah blah blah. Lauren Goode: —known security issues, fixes, bugs and security issues? Security issues is key here. Pretty much every major operating system update is patching some vulnerability, fixing some flaws in security, and so you want the latest. Michael Calore: And law enforcement has technological tools that can exploit those vulnerabilities in order to gain access to your phone. So by updating your phone, you are rendering their tools ineffective. Lauren Goode: Keep it going. 
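Katie and Mike's four-versus-six-digit exchange comes down to simple arithmetic: each extra digit multiplies the number of possible PINs by ten. Here is a minimal, illustrative sketch of that math (keyspace alone, of course, isn't the whole picture; real iOS and Android lock screens add escalating retry delays and optional erase-after-repeated-failures protections):

```python
def pin_keyspace(digits: int) -> int:
    """Number of possible numeric PINs of a given length (each digit 0-9)."""
    return 10 ** digits

four_digit = pin_keyspace(4)  # 10,000 possible PINs
six_digit = pin_keyspace(6)   # 1,000,000 possible PINs

# A six-digit PIN offers 100x the keyspace of a four-digit one.
print(four_digit, six_digit, six_digit // four_digit)
```

The same reasoning is why an alphanumeric passphrase is stronger still: widening the alphabet grows the keyspace far faster than adding digits does.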
Michael Calore: Turn off your phone and put it in your luggage when you're in the airport or when you're approaching border patrol. If it's out of sight, it may make it more difficult for somebody to be reminded that they should be checking your phone. It's just a small thing you can do. Something much bigger you can do: print your boarding pass. Do what Katie does and go fully analog. Lauren Goode: Yep. Katie Drummond: I always ... The first thing I do when I get to the airport, what is it? I print my boarding pass. Lauren Goode: Yeah. Michael Calore: Right. Lauren Goode: I do, too, actually. Michael Calore: And the point here is that you don't want to have to open your phone at all if you don't have to. So we should talk about who should be doing this. We mentioned that US citizens and people who hold green cards are more safe in border zones than folks who are here on a visa or some sort of non-citizen status, but should everybody be doing this? Lauren Goode: Yeah, this is a good question. A lot of this is going to depend on your own personal risk factor. We've already talked about the difference between being a US citizen or a green card holder versus being a visa holder or a foreign visitor. I hate that we're at this point, I hate that I'm saying this, but you should also factor in things like your nationality or your profession. Katie Drummond: I'll take this one step further and just say the very provocative thing, which I think race and sexuality and gender identity are also big pieces of this, unfortunately. Lauren Goode: Yes. Katie Drummond: That has been a real thing for a very long time. I think it is even more real now. If you are a middle-aged white man with a US passport traveling in a suit with a briefcase, that is a very different scenario than a person of color who identifies as queer and is crossing the border into the United States. It is just different, and it shouldn't be, but it is. 
Michael Calore: Yeah, and I think if you particularly are somebody who's outspoken in communities that have proven to have been targeted by the Trump administration in recent months, like if you are an active member of the trans rights community, if you're active in pro-Gaza groups, then those are things that, if your phone was searched, they could learn about on your phone. So if you have queer dating apps on your phone, if you belong to a bunch of organizations that send you email newsletters that are particularly leftist, those may be things that you would want to hide from anybody who wants to search your phone. Katie Drummond: To offer the furthest extreme point of view on this question that you asked, which is who should be thinking about this and acting on it when they're traveling, look, my view is everybody. Because I think that these are relatively simple steps. There is an inconvenience to some of what we are recommending, but we are in an unprecedented moment in this country. There is so much that we do not know about what is happening behind the scenes about how these decisions are being made, about what's possible at the border and with border security, that my feeling is very, very, very, very, very much better safe than sorry. For the vast majority of people, it is incredibly unlikely that their devices will be seized, that anything will happen, but why take the risk? Sure, there are risk factors that you can consider when you're deciding how extreme to go. Do you bring a burner phone and leave your phone at home? Do you just delete certain apps from your phone? There are different levels of intensity that you can take this in, but I think it would be incredibly naive for anyone at this moment in time to just treat this all as business as usual. This is not business as usual in this country right now. 
Lauren Goode: And if it sounds like we're flailing about or casting a very wide net of who could potentially be targeted by this, it's because we don't actually know— Katie Drummond: True. Lauren Goode: —and it's my belief that this is actually what the administration may want. There is a method to the chaos, and it is actually to instill the fear that we are all experiencing at this moment and forcing us to take these measures, because it is completely unprecedented for us, I think for many of us. Michael Calore: Yeah, agreed. Well, let's fight the fear by going extremely hardcore into safety territory, because we've talked about the first option, which is turn off these things on your phone. What else should people be doing with their phones? Do we delete apps? Do we hide apps in specific folders? And I have some thoughts about this. I think that you may just be better off, if you're really worried about crossing the border, you may be better off just starting over with a new phone. You set up a separate phone just for travel where you have absolute minimum amount of stuff that you need on it before you cross the border. Lauren Goode: Right. Yeah, and Lily and Matt, our colleagues who wrote the guide to this, they talk about a burner phone but also make the clarification that this is not a burner phone that you go to the corner store and you buy with cash and it doesn't have any apps on it. It's just like a... What are the old things ... The flip phones. Michael Calore: Flip phones, yes. Lauren Goode: Oh my gosh, yes. Michael Calore: What are those things called? Yes, Lauren. Lauren Goode: I know, because I'm just so young. I don't know what a flip phone is. You can use a smartphone, but it's a secondary smartphone that you're building from scratch. You're installing only the apps you need. You're not putting sensitive information or messages or accounts on there. You may even be using a different phone number so that your main phone number is not ... 
Because that can actually be tied to a bunch of digital services. And so you can still use a smartphone if you're traveling, is the thinking, but you're starting from a clean slate. Michael Calore: Katie, is this what you did? Katie Drummond: I did, actually. And I will add, I think, a couple pieces of context. One is that I'm a journalist. WIRED has been publishing a lot about the administration, so there's a certain risk assessment that goes into that. And then I would also add that I have the benefit of working for a big company, and because of my job, I was able to say, "Hey, I'm traveling internationally. I don't want to bring my computer. I don't want to bring my phone." And they were able to provide me with clean devices, and it definitely took some getting used to. It was hard to be separated from my devices, which are what I spend most of my time with, sadly. Michael Calore: Yeah, honestly, this is the hardest part about giving this advice, because people say, "Well, what can I do to be safe?" And we say, "Leave your phone at home, go out and purchase a second phone or borrow a second phone, and put the absolute minimum amount of stuff on it." Like you said, everybody has to have their phone on them at all times and that is not what anybody wants to hear when they ask that question. Katie Drummond: And I will say, though, as I mentioned earlier, I was traveling with my very loud American husband, and our risk profiles are a little bit different for lots of reasons. He traveled with his phone, but he went through it and he deleted a bunch of apps. He logged out of all of his social media. He communicates using Signal as well. So he took a more measured approach. The idea of us going out and buying him a second iPhone, that sounds crazy. That's a lot of money to spend. Michael Calore: So did he just log out of social media apps or did he delete the apps from his phone? Katie Drummond: I believe he logged out and deleted the apps. 
He just wanted to travel with as clean a slate as possible. He backed a bunch of stuff up, and then just deleted it off of his hardware. Michael Calore: And that's the correct approach for social media. You want to log out of the app and delete it from your phone. Even if you delete the apps off of your phone and they know who you are, so they can look up and see that you have a social media presence online, they may turn a laptop towards you and compel you to log into your social media accounts, right there on the spot on a different device. In which case, it's very important that you have two-factor authentication turned on, and that the device that you use for two-factor authentication is the one that you left at home. Lauren Goode: Right. Michael Calore: It's not the one that you're traveling internationally with. Lauren Goode: Right, right. Katie Drummond: That's a really interesting point. I feel like that's very smart-in-the-weeds guidance. Mike, here is a follow-up question: if someone turned a device to you and said, "Log into your X account on this computer," can you say, "I'm sorry, Officer, I can't help you with that"? Michael Calore: Yes. And the explanation there, which should be what you actually do so that you're not lying, is that you're using a strong password that is stored in your password manager and that you do not have access to your password manager. Katie Drummond: Great answer. Lauren Goode: Here's maybe a dumb question about social media, but you're describing logging into a Twitter account. They can also just go to your public-facing Twitter account. And in our case, ours are public-facing. Michael Calore: Yes. Lauren Goode: And see what you're tweeting, anyway. Michael Calore: Yes. 
Lauren Goode: The next layer would be DMs or something like that, which Border Patrol could get into, but otherwise, anything you're tweeting, anything you're putting on Bluesky, anything you're putting on a public-facing Instagram at this point, is fair game, easily searched. Katie Drummond: Which I think is really important for folks to keep in mind, particularly people who are here on visas, people who are not US citizens. There is a very, very, very real risk that your public-facing communications, what you put on social media, can be used against you at the border. And I think that that's a great point. There's only so much that logging out and deleting apps will accomplish. I think it is one step up from doing nothing, and so for some people that might be enough to make them feel safe crossing the border if their risk profile is relatively low; but if your risk profile is not relatively low, to Lauren's point, there are some pretty easy ways that Border Patrol can see what you're posting on the internet. Michael Calore: One more note about cleaning up what's on your phone. A few people have asked me if it's possible to take the sensitive apps, like let's say your gay dating app or your social media apps, and use that feature in iOS that lets you hide those apps in a folder. And that's useful to keep people from just casually snooping on your phone because they won't see that you have those apps, but it's protected by your Apple ID and it's protected by your PIN, so they can't get into them if you have biometrics turned off. However, people can still see that those apps are on your phone even if you have them in a hidden folder because they can look at your battery usage stats and they can look at your screen time stats and they can see that those apps show up in those lists. So that's why it's important to just delete them off of your phone instead of just hiding them in a hidden folder. Lauren Goode: That's smart. That's really smart. 
What do we know about this tool that Border Patrol is using to actually get this data from your phone? Because in the past, and I'm thinking particularly of the example of San Bernardino, when authorities had the shooter's iPhone but were unable to extract the information from messages that they wanted to extract, and so kept going to Apple and saying, "Could you give us access?" And Apple was responding and saying, "We don't even have backdoor access." What exactly are they able to get from our phones? Michael Calore: The sky's the limit. If you can access the phone, if you can get past that PIN, then the sky's the limit. Lauren Goode: But if it's locked? If it's locked and you said, "I'm sorry, Officer, I cannot help you with that," then exactly what information are they able to get? Michael Calore: Not much. Well, before we move on, I have to ask, what do we make of this moment? When we were trying to decide what we were going to talk about on this week's episode, we came back to privacy for the second week in a row, and it feels like an important decision that we made, and I just want to talk about why we're spending so much time on this. Lauren Goode: This is all just incredibly disturbing, is what it is. Legal scholars and historians are saying right now that, for other reasons, they believe the US is already in a constitutional crisis, and all of what we're describing here today are activities and actions that could be taken legally, prior to this, at the border, for anyone who was coming through. It was extremely rare that someone would be detained and have their device searched, but the way that this is being done now feels as though it is part of a much bigger picture around people's rights being threatened. The fact that, Mike, even you said earlier, well, if you're using a gay dating app, you might want to hide that. Katie, you brought that up as well with regards to the LGBTQ community feeling vulnerable. 
The fact that we're talking about academics, professors, scientists, researchers, journalists, people being potentially targeted, to me, it's incredibly disturbing. Katie Drummond: I think it's really important to continue to acknowledge, to ourselves, it's a hard thing, I find, to acknowledge to myself, and in our roles as journalists to acknowledge to our audience, to all of you who are listening, we have never been here before. As I said earlier, this is not business as usual in this country. What is happening right now is, to Lauren's point, very disturbing; it is very distressing; and it is very much unknown, and that's not a very safe feeling. There are things about the way we all live that we need to change. The best thing that we can do at WIRED right now is to provide as much of that expertise and information and guidance as we possibly can, even as we're navigating it ourselves in our own personal lives. Michael Calore: Very well said. Thank you for that. Let's take another break and we'll come back and have some fun. Welcome back to Uncanny Valley . So it's been a very dark episode, so let's spread some love and light through the halls, shall we? Katie Drummond: Let's do it, Mike. Lauren Goode: Let's do it. Michael Calore: So we asked you to write in, and you have been writing in. Thank you very much for writing in, and we do read your messages, and there's one thing that we want to do because there has been a popular demand for it, and that is to give you our latest recommendations before we end the show. So we're going to talk about some things that we enjoy that you might also enjoy. Who's going to go first? Katie, Lauren? Lauren Goode: I think Katie should go first. She's the boss. 
Katie Drummond: So as you all know, because I can't stop talking about it, I recently returned from France, and one thing in particular that I am now pathologically fixated on, that I am recommending to all of you, if you want to feel just so good, if you want to just feel so good when you sit down and eat, is French butter. And so I ate so much butter last week. My entire family was obsessed with the butter. All we did was eat butter. We got home over the weekend, I went to a specialty grocery store, and I purchased French butter. And I have now been eating the French butter in the privacy of my own residence, and it's just been a very lovely thing. I got some nice bread, I got my French butter, I'm watching the country fall apart, and I'm very well nourished, and that is my recommendation for the week. Go get some French butter. You will feel so much better after you do. Lauren Goode: Salted or unsalted? Katie Drummond: Salted. As salted as possible. Lauren Goode: Room temperature? Katie Drummond: Oh, yeah. Cold, I would say, cold to room temperature. Lauren Goode: Cold to room temperature. Katie Drummond: You don't want that stuff soft and melted. That's not the energy. Lauren Goode: Yeah, no. This is so inspirational. Michael Calore: That's the only way. That's the only way I eat butter, is soft. Lauren Goode: Yeah. Katie, do you watch Temptation Island while you are having your bread and butter? Katie Drummond: I have actually now moved on. I am watching, throwback alert: The Bear . Lauren Goode: Oh, not stressful at all. Katie Drummond: The guy in the show, the main guy, is hot, so I feel like that checks the trash box. He's hot. Michael Calore: Yeah, okay. We stan. Katie Drummond: Thank you. Lauren Goode: We do stan. Michael Calore: Lauren. Lauren, what's your recommendation? Lauren Goode: I can't top that. Michael Calore: Sure, you can. Lauren Goode: That share had everything. Jeremy Allen White, butter, reality TV. Katie Drummond: It's too late. 
You should have gone first. Now you have to go second. Lauren Goode: I'm going to recommend a couple of movies that are about popes. Michael Calore: Why? Lauren Goode: Pope Francis just died on Easter Monday, April 21st. And- Katie Drummond: Remind me, Lauren, who did he meet with right before he died? It's just—this is important. Lauren Goode: I'm trying to think of who it was. Yeah, it was, I think, JD Vance? Was it JD Vance? Katie Drummond: Oh, it was JD, yes. Lauren Goode: That's right. It was JD Vance. Katie Drummond: Right. So the Pope met with JD Vance and then died. Lauren Goode: That is correct. Katie Drummond: Thank you. Keep going. Lauren Goode: I would like to believe Pope Francis was holding on just so he could get one more message, one more scolding, across. I actually saw a meme on the internet, of course. What else are we on the internet for these days? That said JD Vance stands for just killed de Pope Vance. Katie Drummond: Oh, no. Michael Calore: Oh, no. Katie Drummond: That's pretty good. Lauren Goode: I know. Katie Drummond: Sorry, that's pretty good. Lauren Goode: Yeah, I'll send it around. We're not going to link to it in the show notes, though, but I happen to have been on a Pope kick prior to this. I know that- Katie Drummond: Wait, really? Lauren Goode: Yes, I know that's a weird- Katie Drummond: What does it mean to be on a Pope kick? Lauren Goode: It just means- Katie Drummond: Don't tell me this is like my Jeremy Allen White kick, Lauren. Lauren Goode: It just means ... So Katie, Katie, this is the moment when I tell you that I'm leaving WIRED to go to divinity school. Katie Drummond: Oh no, Lauren. Lauren Goode: Yes. Michael Calore: Oh boy. Lauren Goode: I watched Conclave a couple of months ago, was fascinated by it. I was messaging with ... One of my aunts is a retired religion teacher, and I was messaging with her about it, and she said, "Oh, well, if you liked that, you have to watch The Two Popes." 
So then I watched The Two Popes, which is streaming on Netflix. It's a 2019 movie. Fantastic. I loved it, and I only realized after the fact that it's directed by Fernando Meirelles, who's this incredible Brazilian director. He also directed City of God and The Constant Gardener. That is the Fernando Meirelles trifecta, by the way. Now I'm just giving you too many recommendations. I recommend Conclave and The Two Popes. Michael Calore: But not The Young Pope? Lauren Goode: I think I've seen it, but I can't wholeheartedly recommend it at this moment. Michael Calore: It's surreal and weird. Katie Drummond: This is a very deep well of papal knowledge that you are bringing to the table here. Lauren Goode: Yes. Katie, just watch the movies and tell me you aren't at least curious about going down a rabbit hole. Katie Drummond: I have so many questions. Lauren Goode: Mike, what's your recommendation? Michael Calore: Oh, speaking of rabbit holes. So I try to maintain a veneer of professionalism on this show, but I have decided that today is the day when I bring my authentic self to the microphone. Katie Drummond: Oh my God. Michael Calore: And recommend something that is entirely on brand for me, if you know me personally, but that I don't really talk about a lot here, which is: in the current issue of The New Yorker, which you can get on newsstands now and on The New Yorker's website, there is a very long profile of the band Phish. Katie Drummond: Oh my God. Lauren Goode: Yes. Yes. Michael Calore: P-H-I-S-H. Lauren Goode: Yes. Katie Drummond: Yeah, we know the band. Lauren Goode: Can I just say, last weekend, Mike and I were at brunch, and Mike ran into an old friend, and I immediately said, "Oh, are you a Phishhead?" Because I knew that Mike's community of friends around San Francisco, they've seen some shit together. Michael Calore: We have, yes. Many, many colors. 
I went to school in Vermont in the 1990s, so it's like I'm genetically a Phishhead. There have been a lot of profiles of Phish, but this one, written by Amanda Petrusich, is by far the definitive profile of the band. Katie Drummond: Oh, wow. Michael Calore: First of all, it has all of the things that you need to know. There's a bunch of little minutiae. They've been together for 40 years, so there's this long story about how they got together and how they feel about music and blah, blah, and every profile has that. But the thing that this profile has that no other profile has is the outsider's perspective of somebody who has been to shows and understands the band but has not fully gone into the whole lifestyle, doesn't go to every show, is not a super-duper fan, but understands them enough and loves them enough to write about them in a way that is compassionate and intelligent. The other thing this profile has is that it talks about this thing that happens during their performances where, for a stretch, the whole place just turns into this giant blob of humanity that's all on the same mental wavelength. Phish fans call it the portal, and the band calls it the portal, too. When they go into the portal, this happens, and it lasts 15 or 20 minutes, and then when it closes and they come back to reality, the entire stadium erupts and cheers because everybody was feeling the moment. And that's the ineffable thing about going to one of their shows that I've never read before. And when I read it in this profile, I was like, thank God. Somebody got it. Katie Drummond: Well, maybe I'll like them. Lauren Goode: This is great. Butter, the Pope, and Phish. Katie Drummond: In The New Yorker. Phish in The New Yorker. That's the thing: it's not just Phish, it's Phish in The New Yorker. Thank you for listening to Uncanny Valley. If you like what you heard today, make sure to follow our show and rate it on your podcast app of choice. 
If you'd like to get in touch with us with any questions, comments, or show suggestions, or to rip on me about Phish, write to us at uncannyvalley@ Today's show is produced by Kyana Moghadam. Amar Lal at Macro Sound mixed this episode. Jake Lummus was our New York studio engineer. Samantha Spangler fact-checked this episode. Jordan Bell is our executive producer. Katie Drummond is WIRED's global editorial director, and Chris Bannon is the head of global audio.


WIRED
17-04-2025
- Politics
- WIRED
How Americans Are Surveilled During Protests
Protesters rally in Manhattan to demand an end to cuts in science, research, education and other areas by the Trump administration on April 08, 2025 in New York City. Photo-Illustration: WIRED Staff; Photograph: Spencer Platt There have been a number of protests in the past few months pushing back against President Trump's most recent policy changes, and we're likely to see more. Today on the show, WIRED's senior editor of security and investigations, Andrew Couts, talks us through the technology being used by law enforcement to surveil protests, how surveillance tech has evolved over the years, and what it means for anyone taking to the streets or posting to social media to voice their concerns. Plus, we share WIRED tips on how to stay safe, should you choose to protest. You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Andrew Couts on Bluesky at @couts. Write to us at uncannyvalley@ Transcript Note: This is an automated transcript, which may contain errors. [Archival audio]: No justice, no peace. Ho ho. Trump and Musk have got to go. Michael Calore: People are taking to the streets to challenge President Donald Trump's most recent policy changes, some of which have been created with the aid of Elon Musk and his so-called Department of Government Efficiency. [Archival audio]: All 50 states saw these so-called hands-off rallies and so did a few cities in Europe. Michael Calore: The first hands-off protests occurred earlier this month. 
The Tesla Takedown demonstrations have been rolling for weeks, and from the feel of it, we're looking at a summer full of protests. So today we're talking about the risks of being surveilled by law enforcement during protests. We'll talk about how surveillance tech is being used, how it's evolved over the years, and what it means for anyone taking to the streets or posting to social media to voice their concerns. This is WIRED's Uncanny Valley, a show about the people, power, and influence of Silicon Valley. I'm Michael Calore, Director of Consumer Tech and Culture here at WIRED. Lauren Goode: And I'm Lauren Goode. I'm a senior writer at WIRED. Michael Calore: Katie Drummond is out today, but we're joined by WIRED's Senior Editor of Security and Investigations, Andrew Couts. Andrew Couts: Thanks so much for having me. Michael Calore: So let's start by talking about what's going on right now. There are the hands-off protests, there are the Tesla Takedown protests. Are these related at all? Lauren Goode: The hands-off protests and the Tesla Takedown movement are not the same, but they are related. They're both in some way resisting some of the policies that Donald Trump has quickly enacted without congressional approval in the short time since he took office in January. Tesla Takedown is pegged directly at Elon Musk, who has this official-but-unofficial role in Trump's administration as the leader of DOGE. We sometimes refer to him as the Buddy in Chief, and the idea there is to challenge Musk's power as one of the world's richest men by devaluing one of his most important businesses in the private sector, which is Tesla, whereas the hands-off protests are about all kinds of things. They're protesting the firing of federal workers, the overreaching and potentially unconstitutional immigration policies, threats to women's rights and LGBTQ rights, threats to Social Security, threats to healthcare. The list goes on. The idea is basically: get your hands off my rights. 
Michael Calore: And how are the protests looking? Lauren Goode: They're fairly significant. Tesla Takedown is a grassroots movement that started outside of Tesla dealerships and showrooms back in February and has been happening on an ongoing basis and has gotten quite a bit of attention. Hands-off had its biggest day so far on April 5th, I think, and organizers said that there were more than 1,300 rallies of varying sizes across the United States on that Saturday. And if you haven't heard of these rallies or seen the sizes of the crowds that people like AOC and Bernie Sanders have been pulling in, then I would seriously question the media that you're consuming, because this is really happening. Michael Calore: Yeah, there's been really striking footage of people walking in Manhattan, just wall-to-wall people down one of the major avenues for like a mile. Lauren Goode: Right, and not AI-generated. Michael Calore: The people who are out taking the streets and engaging in their constitutional right of free speech and assembly, what are they worried about? Lauren Goode: I can't speak for everyone, and I want to toss this to Andrew because I think Andrew's going to give us the real meat here in terms of digital surveillance, but I would just say that with any protest, even before we all had smartphones and there were surveillance cameras on every street corner and in every train station, you always had to weigh the risks of doing the surveilling, as in being a watchdog of the powerful and questioning abuses of power and civil rights, against being surveilled at the same time you're doing it. But because we live in this digital world now, I think surveillance really is one of the biggest threats today. Andrew, do you want to say more about that? Andrew Couts: Yeah, I mean, surveillance is just constant, and we are all being surveilled constantly if you have a smartphone or are just on the internet. 
So whether someone is being surveilled at a protest, the answer is a hundred percent yes, especially if they have their phone with them, and there's obviously other types of surveillance. But I think one of the things that you have to think about, if you're going to engage in any type of protest and exercise your First Amendment right to speak out against whatever you want to speak out against, is that it's not just what's happening at the protest that matters. It's also the constant surveillance that's happening of your social media feeds or any other types of publishing you might do online. You really need to be thinking about your entire life and your entire data footprint and how that's going to be contextualized with you being at a protest. The other thing I'd be worried about is bad actors, or anybody committing crimes while you're at that protest. There's a difference between going and exercising your constitutional rights and committing crimes, and I think these days those two get conflated a lot, especially after the 2020 protests, where there was a lot of vandalism and violence, and the protesters and the people committing crimes got all lumped together. It's very easy to lump people together these days, and I feel like that's happening on an official level in terms of immigration right now, with the Department of Justice and the State Department categorizing anybody they deem problematic either as a criminal outright, they'll say that, or just canceling visas because somebody spoke out against the war in Gaza. These things are all getting conflated, and so you don't necessarily have power over how you're going to be perceived if you go to a protest and something happens, or somebody just decides to characterize that activity in a way that's inaccurate but is potentially consequential for your life. 
Michael Calore: And to get into how exactly that conflation happens, I want to talk a little bit about how devices and certain signals on social media are used to identify you, and to identify you as a certain type of person or a person who was somewhere. So let's talk specifically about the phone for a minute. What specifically does the phone do to identify you? Andrew Couts: So there's a few ways. The first is, even if you had no apps on your phone except for the phone app, basically, probably even not then, if you just have the device with you and it's powered on, your phone is going to be pinging the nearby cell towers. It's going to ping whatever tower has the highest signal close to you, and that tower is going to be collecting your device ID and the time and date when your phone pinged it. And so that information can easily be obtained by police with subpoenas, and anything to get just whatever devices were pinging a specific tower. So that's one way. The other way is through the apps on your phone. We've done a ton of reporting at WIRED about the ways in which advertising data, which can be collected in a few different ways but is often collected through software development kits, or SDKs, can include very, very precise location data, down to which parking spot you parked your car in in front of a Home Depot or something. It can be extremely precise, and it's constant. And so as long as your phone is on and is communicating with any server that's connected to an SDK in whatever random apps are on your phone, that data is being backed up and used, typically to serve you ads, but it can also be purchased by governments, by police departments, or by anybody, me or you, if you have the money to buy that data, and you can see exactly where someone was at a specific time, or at least you can see where the device was. 
And so it's not too difficult to figure out where somebody was at any given time if you had your device with you. And so that's one of the main reasons that, with having a phone with you at a protest, you've got to make a decision about whether that's the best choice. Michael Calore: Right. The idea is that as you move around in the world, if law enforcement wants to draw any sort of conclusions about what kind of person you are and who you hang out with and what sorts of places you go, it's relatively easy for them to do so. Andrew Couts: Yeah, absolutely. And the fact is that they're not going to just be using one or the other. They're going to be using basically every tool available to them. So that can include other people's social media posts that show you in photographs or videos. It's going to be police body cameras, it's going to be your own social media posts or statements saying that you were at a certain place at a certain time, and so it's all going to be used together to show, like, yes, this person was at X place at X time. Lauren Goode: What is your advice, then, for sharing on social media from a protest, particularly since social media can be an important tool for getting a message out or letting people know there is a rally happening? Andrew Couts: When making these decisions, it really depends on your risk threshold. I think if you are really concerned about your safety, and maybe your immigration status or your ability to live freely in the United States, I would definitely limit your exposure to other people's social media posts, meaning wear a mask if you're able to, remove or cover up any identifying features that you can, and make sure you don't have your name on your shirt or anything like that. And definitely don't post to your own social media about the protest if you're really concerned about that. Not everybody's risk levels are going to be the same, though. 
Maybe getting the word out is the most important thing to you, maybe that's your job, but it is definitely something to factor in that you are almost certainly going to be subjected to other people's video and photos, and you need to take that into consideration before you decide to go to a protest, or how you decide to conduct yourself there. Michael Calore: So if we can assume that not only your movements in the world but also the things that you're doing online are being monitored, then what about your private conversations? What about if you're using Twitter DMs, or if you're on Facebook and you're private messaging with people there? Lauren Goode: Or WhatsApp or any of the Facebook-owned apps? Michael Calore: Yeah, sure. Is it possible for those types of things to also be exposed through, like, a subpoena? Basically, my question here is: are tech companies protecting us in any way against governments prying into our DMs? Andrew Couts: So there's a difference between active surveillance and passive surveillance, especially when we're talking about social media. There are companies that are constantly collecting everything that is posted publicly online about a particular keyword or a hashtag or anything like that. So anytime you're posting about a certain protest or a certain political thing, you might be getting subjected to some kind of surveillance there, but it's very passive. You're part of many people who are talking about a thing, presumably, and it's not targeted at you. Then there's active surveillance, where you are a subject of an investigation or you're a person of interest to authorities, and that can be much more invasive. 
So if somebody suspects that you, say, set a car on fire at a protest, you or your communications may be subjected to subpoenas or warrants, search warrants, and the sky's the limit on how much the police are going to be able to get about your communications if you are subjected to a police investigation or some other government investigation. Private messages, on the other hand, might not be swept up in passive surveillance, because those messages are much more limited in their availability. So that's going to be a big difference: if you're just at a protest, nothing has happened, and you're just posting about stuff on social media, that's probably just going to be passively surveilled to one degree or another. If you're subject to an active investigation, that's a much more serious type of surveillance, and you're in a much more serious situation. Michael Calore: So there are several companies in Silicon Valley that specialize in surveillance technology. They basically make products that law enforcement and governments can use to surveil people. So I think we should identify some of them. Who are the big names here? Lauren Goode: Well, there are some companies that are specifically in data intelligence, and I think the Silicon Valley company that comes to mind for most people is Palantir. Palantir is building ICE's case management software. That's just one example. There's also Clearview AI, which is a facial recognition company, and then there are data aggregators like Dataminr. And then of course there's the whole network of other tech companies too, whether they're chipmakers like Nvidia or Intel, or cloud service providers like Amazon, that directly or indirectly power some of the systems that governments around the world would use in their surveillance technology, if you want to call it a surveillance technology, but there are different contexts for all of these too. 
For example, Andrew, one of the things that you mention in your video series Incognito Mode is you call out Dataminr, but you also say, "But as a journalist, I've used that too." Andrew Couts: Yeah, I mean, there's a lot of overlap between what reporters do, what journalists do, and what other types of investigators do. You're trying to get the information, connect dots, and see what you can prove. And so the motivation, or the end product of that, is going to be very different depending on what your job is. The thing, I think, for anybody using these tools, regardless of why, is just how powerful they are and how much data we're all producing all the time. And I think Dataminr is a good example. It's really one of the main ways that social media is surveilled, and I think when we're talking about social media, we're not just talking about X and Instagram and TikTok; we're talking about all of those, plus Reddit, forums, everything where there's user participation online, which is often getting sucked up into these tools as long as those posts are publicly available. A lot of these companies are now using AI to perform additional data analysis, at least on these conversations that are happening online, and kind of flagging things to say, "This looks like it's maybe a threat," or, "This looks like it maybe falls into whatever parameters an investigator of any type wants to look into." And so we're taking the human element out of it, so it's not just some guy watching your Bluesky feed; it's a computer watching everybody's Bluesky feed and then using AI to flag things for human beings, who can then maybe look into it further. It's happening constantly. We just have to assume everything you post, even if you delete it, is all being vacuumed up into these big data tools and then potentially used by authorities in whatever way they're going to use them. 
And I think the biggest change from, say, the 2020 protests is that we don't know how they're going to be used, what the authorities are going to be going after, or what they could go after a year from now. And so when we're talking about assessing our own personal risks, that has to be at the forefront: we don't know what's going to matter, or what's going to be a problem, or what's even going to be a crime in the near future. Michael Calore: All right, that feels like a good place to take a break. We'll be right back. Okay, let's go back in time a little bit, about five years ago to be exact. It's May 2020. We're in the first year of the pandemic, and George Floyd has been murdered by police in Minneapolis. This sparked nationwide and international protests. It also sparked a huge conversation about surveillance technology and how it was being used to monitor protesters. And Andrew, you wrote a story around this time about how hundreds of protesters in New York were arrested and eventually won a landmark settlement against the city of New York. Can you tell us about it and where the surveillance tech came in? Andrew Couts: Yeah, so this is an interesting case where the police body cam footage was ultimately used against the police department in the form of a lawsuit, because the plaintiffs in this case and their legal team were able to gather, I think, around 6,300 videos from protests around New York City and use the body cam footage to document instances of police abuse in various ways against the protesters. And so they were able to win millions of dollars by doing this, and they were using the body cam footage that the police were capturing themselves. This is one instance where the system worked how it was supposed to, in certain ways. 
They also used a tool that allowed them to go through the many, many hours of footage to be able to pinpoint instances of police use of force, use of pepper spray, and other types of police infractions against the protesters. So it was really an interesting use of surveillance technology against the police themselves, as well as of custom big data tools that are able to make sense of all this data, because a lot of times when we're talking about surveilling protests, we're talking about massive, massive amounts of data, and the data doesn't matter unless you're able to make some sense of it. And so I think the tools that are used to analyze big batches of data are just as important as the tools capturing the activity or the speech or whatever it is. Michael Calore: Back at the time of the 2020 protests, one of the tools that was used to identify who was in a specific location was a geofence warrant. How have geofence warrants evolved since 2020? Andrew Couts: First, let's just start with what a geofence warrant is. A geofence warrant essentially allows law enforcement to go to a tech company and ask for every device that was in a specific location at a specific time. Now, very often police departments would go to Google for this, because Google's apps are on so many people's phones, or Google makes people's phones, and so they're going to have the most data. They're going to probably get something on every single person who had a phone in that location, in that geofence area. Google has since said that it's no longer going to provide information that way. That doesn't mean police aren't still going to be able to get that data in some form or another, but Google isn't going to just hand over a big batch of data the way that it used to. And so that's one big change. They can also go to another company; they can go to TikTok, they can go to whatever. 
That said, there have been a couple of changes on the legal front as well. Last year there were two court rulings, one in the Fourth Circuit and one in the Fifth Circuit, specifically about geofence warrants. And these cases looked almost identical from the beginning, but the rulings were complete opposites. So essentially the Fourth Circuit ruled that a geofence warrant doesn't constitute a search in the way that the Fourth Amendment defines one. The Fifth Circuit ruled that it does. Michael Calore: And as of April, the Fourth Circuit Court is actively reconsidering its stance on geofence warrants. So there's still more to come, right? Andrew Couts: There's still a lot of ambiguity around it, and the changes that Google made definitely impacted police's ability to get that information in such a clean, one-shot way, but they're still happening. Michael Calore: What if I'm just walking by a protest going from one bus stop to another, or getting a bagel? Do I get trapped in the circle that they've drawn on the map? Andrew Couts: Yeah, if you're there during the specific timeframe that the police have stipulated in their geofence warrant, then yeah, you would. Michael Calore: That's super reassuring. So we've talked a lot about police, specifically law enforcement and cities, but the US government is also collecting this information and analyzing the data that they're getting. What agencies are using these technologies to surveil people? Andrew Couts: So we know for certain that the FBI is going to be collecting data for national security purposes. We're likely seeing the Department of Homeland Security collecting a lot of data. Customs and Border Protection is using social media surveillance. ICE is using social media surveillance. At this point, I think you just have to assume all of them are. I mean, part of the capitalism of it all is that these companies are competing, and that means prices get lower. And so it's not just one company that's offering it.
It's multiple companies that are offering different surveillance platforms or technologies. And so it gets cheaper for governments to get it, and at some point it's going to make a lot more sense for a certain agency to have it, even if five or 10 years ago they wouldn't have had it. Michael Calore: Okay, let's take another break and then come right back. Welcome back to Uncanny Valley . Okay, let's talk now about what our listeners can do if they want to go and protest out in the streets, or if they want to tweet through it and express themselves online. What measures should they take to protect themselves if they're worried about surveillance and don't want to share as much information as we now know law enforcement and the government can collect on them? Now, Lauren, you co-authored a piece a few years ago, and just recently updated it, with advice for people to go out and protest safely. And I know we have a few different guides on WIRED that people can read, but let's talk through some of the high-level stuff here. This question is for both of you: what are the top things that you would recommend for people who want to go out and protest in person? Andrew Couts: I think the top thing I would consider is whether you should bring your phone with you at all, or potentially put it in a Faraday bag, which can block all signals to and from the device and limit that surveillance. Your phone is going to be one of the greatest sources of data for anybody who wants to investigate anyone who's at a specific protest. Your phone is a surveillance machine. The best thing you can do is throw it in the sea if you want to protect your privacy overall, but that's not practical, so consider leaving it at home. I would also be really careful about what you're posting online. If you're serious about an issue, avoid making flippant jokes that are going to be misconstrued by prosecutors, basically.
And don't joke about spray-painting Teslas. Don't joke about committing crimes of any kind. Don't joke about engaging in violence, because that will be used against you if something happens and you find yourself under arrest. Michael Calore: Would you recommend that people turn off biometrics on their phone? That's a tip I see a lot. Lauren Goode: Yeah, that's one of our biggest pieces of advice. Turn off your Face ID. Michael Calore: Face ID. Lauren Goode: What do they call it on the Google phone? Michael Calore: They call it fingerprint detection. Lauren Goode: Fingerprint. Sure. The idea being that if you are approached by authorities, and this goes for if you're even traveling through an airport, by the way, and you're concerned that you might be detained, someone could basically hold the phone up to your face or force you to unlock it, versus using a numeric passcode. Michael Calore: Okay, and what stops somebody from holding up your phone and saying, "Plug in your passcode"? Andrew Couts: You can just say, "I am exercising my right to remain silent," and you can say, "I'm exercising my Fifth Amendment rights." That's the law from which that advice actually stems: police can't compel you to turn over evidence against yourself, which is ostensibly what a password is if they go into your phone and find something there. I think that advice is especially important. You mentioned airports, but the ACLU has pointed out the so-called hundred-mile zone, which is a hundred miles from any US border or coastline, where ICE and other immigration authorities can basically just search anybody for any reason. You just have to be much more cognizant of that. And if you're in the US on a visa, I'd be really, really careful about that, because we've seen people who are here perfectly legally, and then their visas just get canceled.
So if for some reason you're at a protest that is deemed not to be on the Trump administration's okay list, you might find your visa automatically canceled just for going to a protest. So I would just add being realistic about your own personal risk thresholds and what personal risks you probably face. One answer to that is to not go, and that's also very problematic, because then you are limiting your own First Amendment rights and that's the chilling effect, but you have to balance those two things out. We're in kind of a no man's land at the moment, and so you have to be really realistic about what makes sense for your own personal life. Michael Calore: So Lauren, what are some of the other things that you would recommend people do to stay safe if they want to go out and protest? Lauren Goode: Well, our guide recommends that you don't go alone. So travel in groups. I would also throw in there: avoid taking your own car. Not only is your license plate likely to be scanned, but the location of your vehicle can be pinpointed to a specific parking spot. Also, if for whatever reason you have to get out of there quickly, having to get to your car and possibly out of a logjam doesn't make any sense. So use public transit or travel in groups. Certainly back in 2020, we saw a lot of people wearing masks during the protests because it was COVID times. It's still not a bad idea to wear a mask, not just for health reasons, but because it obscures some of your face, and therefore less of your face is being recorded and stored somewhere. This is kind of social media hygiene, which Andrew has given us a lot of great tips on, but don't capture people's faces in photos and videos. Be considerate. If you are going to take an image, maybe shoot from behind so you can't see people's faces.
Try not to capture any sort of distinctive outfits, tattoos, anything that could set someone apart, because you don't want to be a narc, basically. Use encrypted messaging once you're on the ground. I mean, I think that these are all kind of standard good safety practices. If you suspect things are really going to get pretty hairy, it's a good idea to have important phone numbers written directly on your body. We sort of joke these days about how we don't remember anyone's phone number. It could be the most important person in your life, your partner, and you're like, "I don't know their phone number because it's stored in my phone." But that can become a real issue if your stuff has been confiscated and you've been detained or arrested. A couple other things. Keep in mind the ACLU says you can protest at government buildings, but you should maybe try to stick to traditional public grounds like public streets and the sidewalks outside of government buildings. Don't block access to a government building if you're protesting. Don't do what January 6 protesters did. And Andrew mentioned your immigration status as well, but basically you really do have to consider the risks quite carefully if you are someone who is here on any kind of student visa or any kind of non-immigrant visa like an H-1B or an O-1. I spoke to an immigration attorney who just said, really think twice about going. And she said, "It pains me not to tell people to exercise their First Amendment rights, but you're much more vulnerable in that situation and the risks are much higher for you." Michael Calore: Okay, well, this is all very good advice, and I would just add to all of that: hydrate, because it's going to be a very long summer and a very hot summer, and you need to make sure that you don't pass out while you're out there. Lauren Goode: That's good advice. Michael Calore: Andrew, thanks for joining us today for this conversation.
It was filled with a lot of great info. Thank you. Lauren Goode: Thanks, Andrew. Andrew Couts: Thanks so much for having me. Michael Calore: And of course, everybody should check out Andrew's YouTube series on WIRED's channel. It's called Incognito Mode, and it's all about surveillance and digital privacy. Thanks for listening to Uncanny Valley . If you liked what you heard today, make sure to follow our show and rate it on your podcast app of choice. If you'd like to get in touch with us with any questions, comments, or show suggestions, write to us at uncannyvalley@ Today's show is produced by Kyana Moghadam. Amar Lal at Macro Sound mixed this episode. Page Oamek fact-checked this episode. Jordan Bell is our executive producer, Katie Drummond is WIRED's Global Editorial Director, and Chris Bannon is the Head of Global Audio.