
Latest news with Andrew Couts

The Texting Network for the End of the World

WIRED

6 days ago


By Andrew Couts and Dhruv Mehrotra | Jun 4, 2025 6:00 AM

Everyone knows what it's like to lose cell service. A burgeoning open source project called Meshtastic is filling the gap for when you're in the middle of nowhere—or when disaster strikes.

Photograph: Michael Tessier

Hypothetical: You wake up tomorrow morning to find that a superstorm that developed overnight thanks to climate change has sparked a chain of events that abruptly ushers in a new ice age and alters human society as we know it. (Yes, this is the plot of The Day After Tomorrow. Stick with us.) All the communication networks you relied on are down. Your phone is basically worthless. The internet has functionally ceased to exist. But you need to connect with people you trust to get help and survive. What do you do? More importantly, how did you prepare?

Less Hollywood-esque versions of having no cell service or Wi-Fi happen all the time, of course; maybe you're hiking in a secluded area, white-knuckling through a major natural disaster, or living under a repressive regime that cuts internet access to quash public protests. Fortunately, for all these scenarios, there's a low-budget solution: Meshtastic.

Meshtastic is a program that enables devices to send text messages over long distances without needing Wi-Fi or cell service. Long range radio (LoRa) nodes help pass messages along, forming a network of devices that can talk to each other even in remote areas. Messages hop from device to device, with each node relaying messages it hasn't seen before—extending the network's reach across miles using minimal power. That is to say, Meshtastic is designed specifically for sending text messages over free-to-use radio frequencies to both groups and individuals, even when cell service and internet connections are nowhere to be found.

'The cool thing about Meshtastic is that it's like a radio infrastructure without the infrastructure. It's ad hoc,' says Eric Kristoff, a volunteer member of the Chicago chapter of the Mars Society, a nonprofit that advocates for the human exploration and colonization of Mars. Kristoff says the group has been testing the use of Meshtastic as a way to give Mars Society 'analog astronauts' the ability to communicate and keep track of each other's locations without the use of earthly infrastructure. 'We have a set of Meshtastic T-Echo radios, about the size of a deck of cards, and they are worn on the person of the analog astronaut,' he says.

Photograph: Michael Tessier

The radios that use Meshtastic cost roughly $30 (though you can spend two, three, or four times that if you want to). And because they operate over unlicensed radio frequencies on a network created by personal devices, they're essentially free to use. Each message is end-to-end encrypted, ensuring privacy while it's relayed through the network. And Meshtastic's optional location-tracking capabilities give people a way to monitor their communities and keep tabs on their kids—or fellow analog astronauts—without using invasive, data-hungry apps.

Kristoff says that Mars Society members will take weeks-long excursions in remote areas with little cellular or Wi-Fi connectivity, which creates additional risks. 'There is heat stroke. We are two hours from the nearest hospital. If you go too far from the campus, it can get dangerous,' Kristoff says of the experience. 'So anytime there's a risk, the risk is made worse if people don't know where you are.'
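The relaying behavior described above, in which each radio rebroadcasts any packet it hasn't already seen until a hop limit runs out, is the core of how a small mesh covers distance. The sketch below illustrates that flood-routing idea in simplified form; the class and field names are hypothetical, and this is not the actual Meshtastic firmware.

```python
# Simplified illustration of flood routing with duplicate suppression.
# Hypothetical names; not the Meshtastic firmware or its API.

class Node:
    def __init__(self, name):
        self.name = name
        self.neighbors = []    # nodes currently within radio range
        self.seen_ids = set()  # packet IDs already handled, so nothing loops forever
        self.inbox = []

    def receive(self, packet):
        if packet["id"] in self.seen_ids:
            return  # already relayed this packet; drop the duplicate
        self.seen_ids.add(packet["id"])
        self.inbox.append(packet["text"])
        if packet["hops_left"] > 0:
            # Rebroadcast to everyone in range, with one fewer hop remaining.
            relay = {**packet, "hops_left": packet["hops_left"] - 1}
            for neighbor in self.neighbors:
                neighbor.receive(relay)

# Three radios in a line: A and C are out of each other's range, but B bridges them.
a, b, c = Node("A"), Node("B"), Node("C")
a.neighbors, b.neighbors, c.neighbors = [b], [a, c], [b]
a.receive({"id": 1, "text": "meet at the trailhead", "hops_left": 3})
print(c.inbox)  # ['meet at the trailhead']
```

In a real deployment the "neighbors" are simply whichever radios happen to hear the broadcast, which is why coverage grows as more people carry nodes.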
Most Meshtastic devices currently on the market need to pair with a phone over Bluetooth to function as a texting alternative. Some devices are just a radio, antenna, and battery, with the expectation that you'll make the housing yourself. The radio does all the device-to-device communication, while the iOS and Android apps or the web client let you read and compose messages that are received or sent over the network—no service plan needed. The apps also allow you to see the approximate location of nearby nodes and a map of the Meshtastic network.

But fancier stand-alone devices are already available, like a line of Meshtastic-enabled gadgets from maker-friendly tech firm LilyGo that, in addition to the T-Echo model used by Mars Society members, includes BlackBerry-like handhelds with their own keyboards, a smartphone-like device with an e-paper screen, and even a Meshtastic-enabled smartwatch.

Meshtastic was created by technologist Kevin Hester in early 2020 as a way to communicate while doing 'any hobby where you don't have reliable internet access,' and it remains a grassroots endeavor, with established local communities spanning from Argentina to China that are rife with a DIY ethos. The software itself is open source, meaning anyone can theoretically contribute, and hundreds have. Still, as with many open source projects, a core group of volunteer developers helps maintain the Meshtastic firmware, mobile apps, and more.

Jonathan Bennett, a self-described 'Linux guy' who upgraded Meshtastic to stronger end-to-end encryption for direct messaging and keeps the software working on Linux, says he first got involved in the project after a listener of one of his podcasts wanted a way to communicate with friends while attending a festival where the cell network could get overloaded. 'I put my open source enthusiast hat on and I went looking, and I came across Meshtastic,' he says. 'And it immediately tickled my interest.'

Bennett says he ultimately connected with Garth Vander Houwen, a C# developer who wrote Meshtastic's iOS app, and Ben Meadows, another C# developer who took on maintaining the Android app, web client, firmware, and other parts of the Meshtastic ecosystem after Hester needed to step back due to health issues. Like Bennett, Vander Houwen and Meadows got involved with Meshtastic while looking for solutions to real-life problems.

Vander Houwen, an iPhone user, says he found Meshtastic while on the hunt for a way to communicate as he hiked on remote trails in the Seattle area, only to find that it had just an Android app. He decided to write the iOS app himself. 'So the fact that there was not an iOS app for Meshtastic was kind of how I got started,' he says, 'and it's been a lot of fun.'

Meadows says he came to Meshtastic after a dangerous tornado hit his home state of Arkansas, causing major damage. 'My kind of initial use case was honestly a backup communication for storm-related outages,' Meadows says. After taking part in the cleanup effort around Little Rock, he realized the value of a decentralized, off-grid communication network like Meshtastic. 'It's just really handy to have anywhere where you've got a limited connection to the grid.'

None of which is to say you should throw your cell phone in the sea and go all-in on Meshtastic. At least not yet. First, getting into the world of LoRa remains a little bit technical, so if the idea of 'flashing' your device with new firmware makes you instinctively pick up your phone to scroll TikTok, it might not be the hobby for you.
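For a sense of what that more technical side looks like, the project also publishes an open source Python library and command-line tool that can drive a USB-connected radio directly, without the phone apps. The snippet below is a minimal sketch of that workflow; treat the exact module and method names as an assumption and check the current Meshtastic Python documentation before relying on them.

```python
# Minimal sketch: send one broadcast message through a Meshtastic radio plugged
# in over USB, using the project's Python library (pip install meshtastic).
# Verify module and method names against the current Meshtastic docs.

import meshtastic.serial_interface

interface = meshtastic.serial_interface.SerialInterface()  # auto-detects the serial port
interface.sendText("Radio check from the laptop")          # broadcast on the default channel
interface.close()
```

Hobbyists use the same library to script things like logging received messages, again per the project's documentation, but for most people the phone apps remain the everyday interface.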
Even if you are tech savvy, the system has some notable limitations. Using the decentralized mesh network requires having your Meshtastic device in range of at least one other radio; obstructions like buildings, trees, and hills or mountains can prevent the line-of-sight communication needed to join the mesh network. This means it may only be reliable when there are a variety of other Meshtastic nodes in the area.

Next is what Meadows calls the 'narrowness' of the network's technical capabilities. 'One of the most frequent things that we get is, 'Can I replace the internet with this?' No, no you cannot,' he says. 'You can send text messages.' Mercifully, that does include emoji.

Photograph: Michael Tessier

This may be obvious, but you also need to have a network set up before disaster hits, Meadows says. So set up anyone you might need to communicate with during a cell and internet blackout before it actually happens. And, due to relatively frequent firmware updates, you can't just toss your device in a bug-out bag and forget about it. But 'if it's something that you actually use, like if you pull it out and use it once a month, you'll be good to go,' Bennett says.

Then there's the issue of raw bandwidth. This limitation can cause issues when a lot of people are trying to use the network at the same time. At a ham radio convention in Dayton, Ohio, last year (yes, it's called Hamvention), the Meshtastic network crashed after someone ran a program that flooded the network with additional traffic, pushing the Meshtastic network to its limits. 'Because literally one person turned on this MQTT bridge, which then joined the rest of us into this mesh in a metal building in Dayton, it crashed the whole mesh immediately,' Vander Houwen says. After this incident, Vander Houwen, Bennett, and Meadows went to work to prepare for the upcoming Defcon hacker convention in Las Vegas, ultimately releasing a special firmware for the much larger event that Vander Houwen estimates allows 'somewhere between 2,000 and 2,500 nodes' to operate on the network simultaneously. A similar firmware released ahead of the 2025 Hamvention in May drew praise from the community.

Despite Meshtastic's limitations, its promise as a backup communication system—and the sheer fun you can have with it—continues to pull in new enthusiasts. The Android app alone has drawn thousands of reviews, and the Meshtastic subreddit has grown to nearly 50,000 members. Some municipalities are even hoping to launch Meshtastic networks to help protect their communities in the event of natural disasters.

Bennett, Meadows, and Vander Houwen are excited not just to see the number of Meshtastic nodes increase, but to see the technology develop into something anyone can use without having to become an enthusiast or 'analog astronaut' at all. 'I think the biggest thing for me too is that it's not just accessible from the aspect of the hardware being available to more people. I want to make the software more accessible,' Meadows says. 'I want to make the experience such that I can hand this device to anybody and have them download the app and start messaging. We've come a long way. I think there's still some room to grow there.'

How Americans Are Surveilled During Protests

WIRED

April 17, 2025


Protesters rally in Manhattan to demand an end to cuts in science, research, education, and other areas by the Trump administration on April 8, 2025, in New York City. Photo-Illustration: WIRED Staff; Photograph: Spencer Platt

There have been a number of protests in the past few months pushing back against President Trump's most recent policy changes, and we're likely to see more. Today on the show, WIRED's senior editor of security and investigations, Andrew Couts, talks us through the technology being used by law enforcement to surveil protests, how surveillance tech has evolved over the years, and what it means for anyone taking to the streets or posting to social media to voice their concerns. Plus, we share WIRED tips on how to stay safe, should you choose to protest.

You can follow Michael Calore on Bluesky at @snackfight, Lauren Goode on Bluesky at @laurengoode, and Andrew Couts on Bluesky at @couts. Write to us at uncannyvalley@

How to Listen

You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how: If you're on an iPhone or iPad, open the app called Podcasts, or just tap this link. You can also download an app like Overcast or Pocket Casts and search for 'uncanny valley.' We're on Spotify too.

Transcript

Note: This is an automated transcript, which may contain errors.

[Archival audio]: No justice, no peace. Ho ho. Trump and Musk have got to go.

Michael Calore: People are taking to the streets to challenge President Donald Trump's most recent policy changes, some of which have been created with the aid of Elon Musk and his so-called Department of Government Efficiency.

[Archival audio]: All 50 states saw these so-called hands-off rallies and so did a few cities in Europe.

Michael Calore: The first hands-off protests occurred earlier this month. The Tesla Takedown demonstrations have been rolling for weeks, and from the feel of it, we're looking at a summer full of protests. So today we're talking about the risks of being surveilled by law enforcement during protests. We'll talk about how surveillance tech is being used, how it's evolved over the years, and what it means for anyone taking to the streets or posting to social media to voice their concerns. This is WIRED's Uncanny Valley, a show about the people, power, and influence of Silicon Valley. I'm Michael Calore, Director of Consumer Tech and Culture here at WIRED.

Lauren Goode: And I'm Lauren Goode. I'm a senior writer at WIRED.

Michael Calore: Katie Drummond is out today, but we're joined by WIRED's Senior Editor of Security and Investigations, Andrew Couts.

Andrew Couts: Thanks so much for having me.

Michael Calore: So let's start by talking about what's going on right now. There are the hands-off protests, there are the Tesla Takedown protests. Are these related at all?

Lauren Goode: The hands-off protests and the Tesla Takedown movement are not the same, but they are related. They're both in some way resisting some of the policies that Donald Trump has quickly enacted without congressional approval in the short time since he took office in January. Tesla Takedown is pegged directly at Elon Musk, who has this official but unofficial role in Trump's administration as the leader of DOGE.
We sometimes refer to him as the Buddy In Chief, and the idea there is to challenge Musk's power as one of the world's richest men by devaluing one of his most important businesses in the private sector, which is Tesla, whereas the hands-off protests are about all kinds of things. They're protesting the firing of federal workers, the overreaching and potentially unconstitutional immigration policies, threats to women's rights and LGBTQ rights, threats to Social Security, threats to healthcare. The list goes on. The idea is basically get your hands off my rights.

Michael Calore: And how are the protests looking?

Lauren Goode: They're fairly significant. Tesla Takedown is a grassroots movement that started outside of Tesla dealerships and showrooms back in February and has been happening on an ongoing basis and has gotten quite a bit of attention. Hands-off had its biggest day so far on April 5th, I think, and organizers said that there were more than 1,300 rallies of varying sizes across the United States on that Saturday. And if you haven't heard of these rallies or seen the sizes of the crowds that people like AOC and Bernie Sanders have been pulling in, then I would seriously question the media that you're consuming, because this is really happening.

Michael Calore: Yeah, there's been really striking footage of people walking in Manhattan and just wall-to-wall people down one of the major avenues just for like a mile.

Lauren Goode: Right, and not AI generated.

Michael Calore: The people who are out taking the streets and engaging in their constitutional right of free speech and assembly, what are they worried about?

Lauren Goode: I can't speak for everyone, and I want to toss this to Andrew because I think Andrew's going to give us the real meat here in terms of digital surveillance, but I would just say that I think with any protest, even before we all had smartphones and there were surveillance cameras everywhere on every street corner in every train station, you always had to weigh the risks of doing the surveilling, as in being a watchdog of the powerful and questioning abuses of power and civil rights, versus being surveilled at the same time you're doing it. But because we live in this digital world now, I think surveillance really is one of the biggest threats today. Andrew, do you want to say more about that?

Andrew Couts: Yeah, I mean, surveillance is just constant, and we are all being surveilled constantly if you have a smartphone or are just on the internet. So whether someone is being surveilled at a protest, the answer is a hundred percent yes, especially if they have their phone with them, and there's obviously other types of surveillance. But I think one of the things that you have to think about, if you're going to engage in any type of protest and engage in your First Amendment right to speak out against whatever you want to speak out against, is that it's not just what's happening at the protest that matters, it's also the constant surveillance that's happening of your social media feeds or any other types of publishing you might do online. You really need to be thinking about your entire life and your entire data footprint and how that's going to be contextualized within you being at a protest. The other thing I'd be worried about is bad actors or anybody committing crimes while you're at that protest; there's a difference between going and exercising your constitutional rights and committing crimes.
And I think these days those two get conflated a lot, especially after the 2020 protests, where there was a lot of vandalism and violence and the protesters and the people committing crimes get all lumped together. And it's very easy to lump people together these days, and I feel like that's happening on an official level in terms of immigration right now, with the Department of Justice and the State Department categorizing anybody who they deem as problematic as either a criminal outright, they'll say that, or just canceling visas because somebody spoke out against the war in Gaza. These things are all getting conflated, and so you don't necessarily have power over how you're going to be perceived if you go to a protest and something happens, or somebody just decides to characterize that activity in a way that's inaccurate but is potentially consequential for your life.

Michael Calore: And to get into how exactly that conflation happens, I want to talk a little bit about how devices and certain signals on social media are used in order to identify you and identify you as a certain type of person or a person who was somewhere. So let's talk specifically about the phone for a minute. What specifically does the phone do to identify you?

Andrew Couts: So there's a few ways. The first is, even if you had no apps on your phone except for the phone app basically, probably even not then, if you just have the device with you and it's powered on, your phone is going to be pinging the nearby cell towers. It's going to ping whatever tower has the highest signal that's close to you, and that tower is going to be collecting your device ID and the time and date when your phone pinged the tower. And so that information can easily be obtained by police with subpoenas and anything to get just whatever devices were pinging a specific tower. So that's one way. The other way is through the apps on your phone. And so we've done a ton of reporting at WIRED about the ways in which advertising data, which can be collected in a few different ways, but is often collected through developer kits or SDKs, and these can often include very, very precise location data, down to which parking spot you parked your car in in front of a Home Depot or something. It can be extremely precise and it's constant. And so as long as your phone is on and is communicating with any server that's connected to an SDK on whatever random apps are on your phone, that data is then being backed up and used typically to serve you ads, but it can also be purchased by governments, it can be purchased by police departments or anybody, me or you, if you have the money to buy that data, and you can see exactly where someone was at a specific time, or at least you can see where the device was. And so it's not too difficult to kind of figure out where somebody was at any certain time if you have your device. And so that's one of the main reasons that, having a phone with you at a protest, you got to make that decision about whether that's the best choice.

Michael Calore: Right. The idea is that as you move around in the world, if law enforcement wants to sort of draw any sort of conclusions about what kind of person you are and who you hang out with and what sorts of places you go, it's relatively easy for them to do so.

Andrew Couts: Yeah, absolutely. And the fact is that they're not going to just be using one or the other. They're going to be using basically every tool available to them.
So that can include other people's social media posts that show you in photographs or videos. It's going to be police body cameras, it's going to be your own social media posts or statements saying that you were at a certain place at a certain time, and so it's all going to be used together to show like, yes, this person was at X place at X time.

Lauren Goode: What is your advice then for sharing to social media from a protest, particularly since social media can be an important tool for getting a message out or letting people know there is a rally happening?

Andrew Couts: When making these decisions, it really depends on your risk threshold. I think if you are really concerned about your safety and maybe your immigration status or your ability to live freely in the United States, I would definitely limit your exposure to other people's social media posts, meaning wear a mask if you're able to, remove any identifying features that you can cover up, or make sure you don't have your name on your shirt or anything like that. And definitely don't post to your own social media about the protest if you're really concerned about that. Not everybody's risk levels are going to be the same, though. Maybe getting the word out is the most important thing to you, maybe that's your job, but it is definitely something to factor in that you are almost certainly going to be subjected to other people's video and photos, and you need to take that into consideration before you decide to go to a protest or how you decide to conduct yourself there.

Michael Calore: So if we can assume that what you're doing online and not only moving around in the world, but the things that you're doing online are being monitored, then what about your private conversations? What about if you're using Twitter DMs or if you're on Facebook and you're private messaging with people on Facebook?

Lauren Goode: Or WhatsApp or any of the Facebook-owned apps?

Michael Calore: Yeah, sure. Is it possible for those types of things to also be exposed through like a subpoena? Basically my question here is, are tech companies protecting us in any way against governments prying into our DMs?

Andrew Couts: So there's a difference between active surveillance and passive surveillance, especially when we're talking about social media. There are companies that are constantly collecting everything that is posted publicly online about a particular keyword or a hashtag or anything like that. So anytime you're posting about a certain protest or a certain political thing, you might be getting subjected to some kind of surveillance there, but it's very passive. You're part of many people who are talking about a thing, presumably, and it's not targeted at you. Then there's active surveillance, where you are a subject of an investigation or you're a person of interest to authorities, and that can be much more invasive. So if somebody suspects that you, say, set a car on fire at a protest, you may be subjected to subpoenas, or your communications may be subjected to subpoenas or warrants, search warrants, and the sky's the limit on how much the police are going to be able to get about your communications if you are subjected to a police investigation or some other government investigation. So those private messages might not be subjected to that passive collection, because those messages are much more limited in their availability.
So that's going to be a big difference in terms of whether you're just at a protest, nothing has happened, you're just posting about stuff on social media, that's just going to be probably passively surveilled in one degree or another. If you're subject to an active investigation, that's a much more serious type of surveillance and you're in a much more serious situation.

Michael Calore: So there are several companies in Silicon Valley that specialize in surveillance technology. They basically make products that law enforcement and governments can use to surveil people. So I think we should identify some of them. Who are the big names here?

Lauren Goode: Well, there are some companies that are specifically in data intelligence, and I think the Silicon Valley company that comes to mind for most people is Palantir. Palantir is building ICE's case management software. That's just one example. There's also Clearview AI, which is a facial recognition company, and then there are data aggregators like Dataminr, and then of course there's the whole network of other tech companies too, whether they're chip makers like Nvidia or Intel or they're cloud service providers like Amazon, that directly or indirectly power some of the systems that governments around the world would use in their surveillance technology, if you want to call it a surveillance technology. But there are different contexts for all of these too. For example, Andrew, one of the things that you mentioned in your video series Incognito Mode is you call out Dataminr, but you also say, "But as a journalist I've used that too."

Andrew Couts: Yeah, I mean, there's a lot of overlap with what reporters do, what journalists do, and what other types of investigators do. You're trying to get the information and connect dots and try to see what you can prove. And so the motivation or the end product of that is going to be very different depending on what your job is. The thing, I think, for anybody using them, regardless of why, is just how powerful they are and how much data we're all producing all the time. And I think Dataminr is a good example. It's really one of the main ways that social media is surveilled, and I think when we're talking about social media, we're not just talking about X and Instagram and TikTok, we're talking about all of those plus Reddit, forums, everything where there's user participation online is often getting sucked up into these tools as long as those posts are publicly available. A lot of these companies, they're now using AI to perform additional data analysis, at least on these conversations that are happening online, and kind of flagging things to say, "This looks like it's maybe a threat," or, "This looks like it maybe falls into whatever parameters that an investigator of any type wants to look into." And so we're taking the human element out of it, so it's not just some guy watching your Bluesky feed, it is a computer watching everybody's Bluesky feed and then using AI to flag that for human beings who can then maybe look into it further. It's happening constantly. We just have to assume everything you post, even if you delete it, whatever, it's all being vacuumed up into these big data tools and then potentially used by authorities in whatever way they're going to use them. And I think the biggest change from, say, the 2020 protests is we don't know how they're going to be used, what the authorities are going to be going after, what they could go after in a year from now.
And so when we're talking about assessing our own personal risks, that has to be at the forefront of it: that we don't know what's going to matter or what's going to be a problem or what's going to even be a crime within the near future.

Michael Calore: All right, that feels like a good place to take a break. We'll be right back. Okay, let's go back in time a little bit, about five years ago to be exact. It's May 2020, and we're in the first year of the pandemic, and George Floyd has been murdered by police in Minneapolis. This sparked nationwide and international protests. It also sparked a huge conversation about surveillance technology and how it was being used to monitor protesters. And Andrew, you wrote a story around this time about how hundreds of protesters in New York were arrested and eventually won a landmark settlement against the city of New York. Can you tell us about it and where the surveillance tech came in?

Andrew Couts: Yeah, so this is an interesting case where the police body cam footage was ultimately used against the police department in the form of a lawsuit, because the plaintiffs in this case and their legal team were able to gather, I think, around 6,300 videos from protests around New York City and use the body cam footage to document instances of police abuse in various ways against the protesters. And so they were able to win millions of dollars by doing this, and they were using the body cam footage that the police were capturing themselves. This is one instance where the system worked how it was supposed to in certain ways. They also used a tool that allowed them to go through the many, many hours of footage to be able to pinpoint instances of police use of force, use of pepper spray, other types of police infractions against the protesters. So it was really an interesting use of surveillance technology used against the police themselves, as well as custom big data tools that are able to make sense of all this data, because a lot of times when we're talking about surveilling protests, we're talking about just massive, massive amounts of data, and the data doesn't matter unless you're able to make some sense of it. And so I think the tools that are used to analyze big batches of data are just as important as the tools capturing the activity or the speech or whatever it is themselves.

Michael Calore: Back at the time of the 2020 protests, one of the tools that was used to identify who was in a specific location was a geofence warrant. How have geofence warrants evolved since 2020?

Andrew Couts: First, let's just start with what a geofence warrant is. A geofence warrant essentially allows law enforcement to go to a tech company and say, give us all the devices that were in a specific location at a specific time. Now, very often police departments would go to Google for this, because Google's apps are on so many people's phones or Google makes people's phones, and so they're going to have the most data. They're going to probably get something on every single person who had a phone in that location, in that geofence area. Google has since said that it's no longer going to provide information that way. That doesn't mean police aren't going to still be able to get that data in some form or another, but Google isn't going to just hand over this big batch of data the way that it used to. And so that's one big change. They can also go to another company, they can go to TikTok, they can go to whatever.
That said, there have been a couple of changes on the legal front as well. Last year there were two court rulings, one in the Fourth Circuit and one in the Fifth Circuit, specifically about geofence warrants. And these court rulings looked almost identical from the beginning of the case, but the rulings were completely the opposite. So essentially the Fourth Circuit ruled that a geofence warrant doesn't constitute a search in the way that the Fourth Amendment requires. The Fifth Circuit ruled that it does.

Michael Calore: And as of April, the Fourth Circuit Court is actively reconsidering its stance on geofence warrants. So there's still more to come, right?

Andrew Couts: There's still a lot of ambiguity around it, and the changes that Google made definitely impacted police's ability to get that information in such a clean, one-shot way, but they're still happening.

Michael Calore: What if I'm just walking by a protest going from one bus stop to another or getting a bagel? Do I get trapped in the circle that they've drawn on the map?

Andrew Couts: Yeah, if you're there at the specific timeframe that the police have stipulated in their geofence warrant, then yeah, you would.

Michael Calore: That's super reassuring. So we've talked a lot about police, specifically law enforcement and cities, but also the US government is collecting this information and analyzing the data that they're getting. What agencies are using these technologies to surveil people?

Andrew Couts: So we know for certain that the FBI is going to be collecting data for national security purposes. We're likely seeing the Department of Homeland Security collecting a lot of data. Customs and Border Protection is using social media surveillance. ICE is using social media surveillance. At this point, I think you just have to assume all of them are. I mean, part of the capitalism of it all is that these companies are competing, and that means prices get lower. And so it's not just one company that's offering it. It's multiple companies that are offering different surveillance platforms or technologies. And so it gets cheaper for governments to get it, and then at some point it's going to make a lot more sense for a certain agency to have it, even if five, 10 years ago they wouldn't have had it.

Michael Calore: Okay, let's take another break and then come right back. Welcome back to Uncanny Valley. Okay, let's talk now about what our listeners can do if they want to go and protest out in the streets, or if they want to tweet through it, if they want to express themselves online. What measures should they take to protect themselves if they're worried about surveillance and if they feel as though they would not want to share as much information as we now know law enforcement and the government can collect on them? Now, Lauren, you co-authored a piece a few years ago and then just recently updated it with advice for people to go out and protest safely. And I know we have a few different guides on WIRED that people can read, but let's talk through some of the high-level stuff here. This question is for both of you: What are the top things that you would recommend for people who want to go out and protest in person?

Andrew Couts: I think the top thing I would consider is whether you should bring your phone with you or not, or potentially put it in a Faraday bag, which can block all signals to and from the device and limit that surveillance.
That's going to be one of the greatest sources of data for anybody who wants to investigate anyone who's at a specific protest. Your phone is a surveillance machine. The best thing you can do is to throw it in the sea if you want to protect your privacy overall, but that's not practical, so consider leaving it at home. I would also be really careful about what you're posting online. If you're serious about an issue, avoid making flippant jokes that are going to be misconstrued by prosecutors, basically. And don't joke about spray-painting Teslas. Don't joke about committing crimes of any kind. Don't joke about engaging in violence, because that will be used against you if something happens and you find yourself under arrest.

Michael Calore: Would you recommend that people turn off biometrics on their phone? That's a tip I see a lot.

Lauren Goode: Yeah, that's one of our biggest pieces of advice. Turn off your face ID.

Michael Calore: Face ID.

Lauren Goode: What do they call it on the Google phone?

Michael Calore: They call it fingerprint detection.

Lauren Goode: Fingerprint. Sure. The idea being that if you are approached by authorities, and this goes for if you're even traveling through an airport, by the way, and you're concerned that you might be detained, the idea is that someone could basically hold the phone up to your face or force you to unlock it, versus using a numeric passcode.

Michael Calore: Okay, and what stops somebody from holding up your phone and saying, "Plug in your passcode"?

Andrew Couts: You can also just say, "I am exercising my right to remain silent," and you can say, "I'm exercising my Fifth Amendment rights." That's the law from which that advice actually stems: Police can't tell you to turn over evidence against yourself, which is ostensibly what a password is if they go in your phone and find something there. I think that advice is especially important. You mentioned airports, but the ACLU has pointed out the so-called hundred-mile zone, which is a hundred miles from any US border or any ocean, where ICE and other immigration authorities can basically just search anybody for any reason. You just have to be much more cognizant of that. And if you're in the US on a visa, I'd be really, really careful about that, because we've seen people who are here perfectly legally, and then their visas get just canceled. So if for some reason you're at a protest that is deemed not within the Trump administration's okay list, you might find yourself just automatically getting your visa canceled or anything like that if you're going to a protest. So I would just add being realistic about your own personal risk thresholds and what personal risks you probably face. The answer to that is to not go, and that's also very problematic because then you are limiting your First Amendment rights yourself and it's the chilling effect, but you have to balance those two things out. We're in kind of no man's land at the moment, and so you have to be really realistic about what makes sense for your own personal life.

Michael Calore: So Lauren, what are some of the other things that you would recommend people do to stay safe if they want to go out and protest?

Lauren Goode: Well, our guide recommends that you don't go alone. So travel in groups. I would also throw in there: avoid taking your own car. Not only is your license plate likely to be scanned, but the location of your vehicle can be pinpointed specifically to a parking spot.
Also, if for whatever reason you have to get out of there sort of quickly, having to get to your car and possibly get out of a logjam doesn't make any sense. So use public transit or travel in groups. Certainly back in 2020, we saw a lot of people wearing masks during the protests because it was Covid. It was Covid times. It's still not a bad idea to wear a mask, not just for health reasons, but because it obscures some of your face and therefore less of your face is being recorded and stored somewhere. This is kind of social media hygiene, which Andrew has given us a lot of great tips on, but don't capture people's faces in photos and videos. Be considerate. If you are going to take an image, maybe shoot from behind so you can't see people's faces. Try not to capture any sort of distinctive outfits, tattoos, something that could sort of set someone apart, because you don't want to be a narc for them, basically. Use encrypted messaging once you're on the ground. I mean, I think that these are all kind of standard good safety policies. If you suspect things are really going to get pretty hairy, it's a good idea to have important phone numbers written directly on your body. We sort of joke these days about how we don't remember anyone's phone numbers in our lives. They could be the most important person in your life. It could be your partner, and you're like, "I don't know anyone's phone number because it's stored in my phone." But that can become a real issue if your stuff has been confiscated and you've been detained or arrested. A couple other things. Keep in mind the ACLU says you can protest at government buildings, but you should maybe try to stick to traditional public grounds like public streets and the sidewalks outside of government buildings. Don't block access to a government building if you're protesting. Don't do what January 6 protesters did. And Andrew mentioned your immigration status as well, but basically you really do have to consider the risks quite carefully if you are someone who is here on any kind of student visa or any kind of non-immigrant visa like an H-1B or an O-1. I spoke to an immigration attorney who just said, really think twice about going. And she said, "It pains me not to tell people to exercise their First Amendment rights, but you're much more vulnerable in that situation and the risks are much higher for you."

Michael Calore: Okay, well, this is all very good advice, and I would just add to all of that: hydrate, because it's going to be a very long summer and it's going to be a very hot summer, and you need to make sure that you don't pass out while you're out there.

Lauren Goode: That's good advice.

Michael Calore: Andrew, thanks for joining us today for this conversation. It was filled with a lot of great info. Thank you.

Lauren Goode: Thanks, Andrew.

Andrew Couts: Thanks so much for having me.

Michael Calore: And of course, everybody should check out Andrew's YouTube series on WIRED's channel. It is called Incognito Mode, and it's all about surveillance and it's all about digital privacy. Thanks for listening to Uncanny Valley. If you liked what you heard today, make sure to follow our show and rate it on your podcast app of choice. If you'd like to get in touch with us with any questions, comments, or show suggestions, write to us at uncannyvalley@

Today's show is produced by Kyana Moghadam. Amar Lal at Macro Sound mixed this episode. Page Oamek fact-checked this episode.
Jordan Bell is our executive producer, Katie Drummond is WIRED's Global Editorial Director, and Chris Bannon is the Head of Global Audio.

How Governments Spy On Protestors—And How To Avoid It

Yahoo

April 16, 2025


Law enforcement's ability to track and profile political protestors has become increasingly multifaceted and technology driven. In this edition of Incognito Mode, WIRED Senior Editor, Security & Investigations Andrew Couts and WIRED Senior Writer Lily Hay Newman discuss the technologies used by law enforcement that put citizens' privacy at risk—and how to avoid them.

Products discussed in this episode of Incognito Mode: Silent Pocket SLNT Faraday Waterproof Backpack; the SLNT Storefront. If you buy something through our affiliate links, we earn a commission.

Director: Efrat Kashai; Director of Photography: Brad Wickham; Editor: Matthew Colby; Host: Andrew Couts; Guest: Lily Newman; Line Producer: Joseph Buscemi; Associate Producer: Paul Gulyas; Production Manager: Peter Brunette; Production Coordinator: Rhyan Lark; Camera Operator: Mar Alfonso; Gaffer: Niklas Moller; Sound Mixer: Sean Paulsen; Production Assistant: Malaia Simms; Post Production Supervisor: Christian Olguin; Supervising Editor: Erica DeLeo; Assistant Editor: Justin Symonds

- Protests, almost by definition, are points of contention between citizens and their governments. [subdued music] Police tracking of protestors is multifaceted and includes a variety of tactics and gear that generate different data. Some surveillance is done at the protests, while other methods are used outside of it.

- It's just like all different ways to get at this core thing of who was there, what are they up to, what do they think about things? I think that's sort of how I break it down, because so many of these technologies are unseen or not intuitive.

- In this episode, we'll discuss the technologies used by law enforcement that put citizens' privacy at risk. This is "Incognito Mode." [moody music]

- The movies were way ahead on this, right? Like they were depicting, it's like the yellow box that goes around the face type of thing. Now, that is very real. This technology is more and more available to law enforcement.

- Although law enforcement have had access to facial recognition tools for about 20 years, they previously were only able to search government images such as mugshots. This changed in 2018, when many police departments started using Clearview AI, a facial recognition app that allows them to match photos from around the web. Once a photo is uploaded, the app pulls up matches found online along with links to the source of those photos.

- [Newsreader] Clearview says more than 600 law enforcement agencies across the country use this software.

- Based on the person's facial geometry, the images are converted by the system into a formula measuring things like eye distance. This means that law enforcement can use any image to search for a person who doesn't currently have a police record and isn't known to authorities, and potentially identify them in seconds.

- I wanted to ask you, since you've covered this a lot, how do you view the risk of these platforms as they proliferate?

- To be quite frank, it freaks me the hell out. Image recognition is just really, really good now and cheaper to deploy, and so, you know, I think it's more just kind of accepting that this is just part of life. Like just commuting every day, you're probably being subjected to some of these systems in one form or another. It's not just the systems where you have face rec built in. It can be deployed after the fact: if you're in people's pictures that are posted on social media, it can get uploaded to these systems and then you can get picked out of a crowd in that way.

- [Rioters] USA! USA!
- We saw that with, you know, the January 6th Insurrection videos that were posted to Parler and other social media platforms.

- [Newsreader] News tonight, an Auburn man has been found guilty of federal charges for his actions during the January 6th insurrection.

- You know, the FBI took those, they saw people in the videos, they went back and kind of looked to see like, "Okay, here's proof you were there." Governments in 78 countries use public facial recognition systems with varying degrees of support from their citizens. Many countries use the technology without transparent regulations. In Russia, facial recognition tools have been used not only to detain people protesting the war in Ukraine, but also to identify and arrest opponents of the government before they joined any demonstrations. Reuters reported that the facial recognition systems used in Moscow are powered by Western companies including NVIDIA and Intel. Other companies such as Amazon have also launched software that allows users to build a facial recognition database using their own photos. These systems, they're everywhere, and things that you might think could kind of thwart these systems, even like wearing a mask and these kinds of things, some of the technologies can get around that. I don't know what to do with that information, to be honest.

- There are a lot of police here. Are you not frightened?

- We are, but you know, we are together. That gives a real power.

- I am frightened. Of course I'm frightened. That's why I'm just covering up all my face just so that they cannot even, you know, find my ID, but me being afraid doesn't mean that I'm not going to be here today and fight for my future.

- I agree 100% with what you were saying about how masks and other deterrent measures aren't always effective at defeating these identification technologies. But clearly they are at least somewhat effective sometimes, because, you know, in a lot of crackdowns we've seen in the last few years by multiple governments, like one thing they'll do is try to ban mask wearing in certain settings. Yeah, are there any other things? Please tell me that you have more.

- Yeah, I mean, I think there are ways to minimize the data and thus minimize the risks. Just simple things like not shooting pictures and videos while you're at a protest, so you're not capturing yourself and anybody else who's around you, is one way to keep it out of some types of systems. Avoiding some systems is better than avoiding no systems. You are going to be subjected to this technology in one way or the other, and you just kind of have to proceed as best you can and minimize your contributions to those systems as much as possible.

- CCTVs or security cameras have been ubiquitous for a few decades now. One could have thought 20 or 30 years ago, like, "Well, now everything is going to be captured on film all the time." But there are limitations still to just how much data is stored, for how long. You know, there've been a lot of high-profile events around the world in recent years where there wasn't adequate security footage to really know what had happened. It's not like every step you take, someone is paying to run the system and store the data to identify you. [subdued music]

- In 2010, "Wired" reported on federal agents friending crime suspects on sites like MySpace in order to see their photos, communications, and personal relationships.
More recently, police have used companies like Dataminr to more easily sift through massive amounts of data in order to glean information about how protests are organized, to identify activists, and to piece together people's connections to each other.

- So, social media accounts, right? It's a lot of data on everyone who's using these platforms. But I kind of think of these surveillance technologies in two buckets. One would be if authorities want to find out more about a specific person, right? What has Andrew been posting about or saying, and are there photos, you know, of Andrew online? Things like that. But then the other one would be coming at it flipped, where it's like they're looking for anyone who has been talking about X thing, or, you know, anyone marking their location in a certain place on a certain day. Authorities can go directly to the sites, or they might wanna use a service that kind of pulls a ton of data from social platforms together, you know, aggregates all of it, and getting kind of lists of names. It gives the ability to like have this vibe check. Like, those platforms themselves aren't inherently a surveillance tool, right? Sometimes we use them for journalism.

- I've used some of these services like Dataminr before, and once you see just the fire hose of information that you can get access to when you use it, it becomes clear just how easy it is to kind of figure out what is going on, even if it's not obvious to you in your own, like, curated timeline. Just the use of them has become more widespread. You wouldn't know without doing some investigating, "Definitely my local police department is using this or not." That creates an environment where you have to assume that that's what's happening.

- Steps like making your account private or setting something to expire quickly, maybe they can help. But I wouldn't assume those types of settings can really truly protect data on big mainstream platforms.

- An example of how social media surveillance was used can be found through the MPD surveillance of the George Floyd protests in 2020. It was found that the MPD collected data about protest events including dates, locations, organizers, and estimated crowd sizes. The MPD shared this information with the Secret Service, National Park Service, and the Department of Defense.

- So I think the other huge advice is about data minimization and not posting about things that you worry about getting into other people's hands. There's a tension here with chilling speech, right? The nature of the internet is to share information, right? That's like the whole purpose of the platform. When you put stuff out there, it's hard to say like, "Okay, it's out there but only for certain people," and control it.

- Our perspective on it is probably a little bit different because we're journalists, we're kind of in the public eye in a way that some other people aren't, but I think anybody, no matter if you have one follower or a million, you should be really careful about what you post online and when you post it online. You know, if you're gonna post vacation pictures, I never post them while I'm actually on vacation, because then that signals to somebody like, "Hey, my house is empty." You can apply that to all different types of risks, and I think generally posting less is the way to go.

- But also some people really wanna post, or that's their, like, job, or, you know, that's how they make money.
It's just helpful to understand that the greater volume you're posting, the more there could be things you didn't think of that's exposing information that you didn't realize is now out there. [subdued music]

- IMSI catchers, also known as cell site simulators and formerly referred to as StingRays, are devices that impersonate cell towers, causing cell phones within a certain radius to connect to them. Initially designed for military and national security purposes, this technology has emerged in routine police use. Until recently, the use of IMSI catchers was withheld from the public. The FBI has even forced state and local police agencies to sign NDAs in order to use their devices. I mean, I find IMSI catchers fascinating just in that their use is really secretive, like there was a long time that police weren't allowed to say that they had them or that they were using them, so there's just-

- And no one had seen one.

- Right. Yeah, exactly. Can you tell us just a little bit about how that works?

- These are devices that, at their core, just identify that your phone was physically in a certain location, like that's the baseline thing they're trying to achieve. Sometimes called an IMSI catcher because of this IMSI number that it's trying to pick up. They can work in different ways: they can work passively, to just sort of sweep around and say what devices are in the area and let me try to, you know, decrypt their signal and catch, you know, an ID number. More often, they work actively, as like a fake cell tower, taking advantage of the way the system works, that your phone is going to connect to the cell tower that's emitting the strongest signal in the area to give you the best service, and then grab that ID number. Sometimes they can also potentially grab other stuff like unencrypted communications, like SMS text messages. It's important to know that one of the things that can happen when you bring a phone to an event like a protest is that the fact that you were there, and potentially some other information, could be sort of pulled out of the air by one of these devices.

- Records show that IMSI catchers are used by 23 states and the District of Columbia, the DEA, ICE, FBI, NSA, and DHS, along with many additional agencies. In terms of how people gauge the risk of these, I mean, for one thing, like you said, a lot of times they're looking to target one person or maybe a couple of people, and it does end up looping in a lot of people just by the nature of how it works. But it's also one that I think is expensive and complicated to deploy, and so it's probably not gonna be the top concern. If I were going to a protest, I don't think it's the thing I would be so concerned about, just as an average person.

- Another thing in that vein, you know, if this technology that we're talking about is rogue cell towers, it means that actual cell towers also have all this information, right? Like your wireless provider knows where you go. So that data exists anyway, and there are potentially other ways that, you know, authorities can get that information. [brooding music]

- Geofence warrants, or reverse location warrants, allow law enforcement to request location data from apps or tech companies like Google or Apple for all devices in a specific area during a set time. Authorities can then track locations, identify users, and collect additional data like social media accounts.
- This is yet another layer in these multiple approaches to getting the same information: who was at a certain place at a certain time, and what can we find out about what they were up to?

- A lot of it's advertising data, or what's being shared all the time from your device that you probably aren't paying much attention to and is used in a much more innocuous way typically.

- And it's sort of slurping up all the data from this area, which is constrained in a way but doesn't account for passersby, people, you know, getting coffee at the deli next door, people just sort of coming up to a location to see what's going on. Like, this is just bulk, indiscriminate data. I am worried about it, but maybe not specifically. Like, it's in the category to me of all the reasons that I might consider leaving a device at home or putting it in a Faraday bag. It's sort of just on that list of reasons that you might wanna minimize the data that your device is emitting. [subdued music]

- Data brokers collect and sell personal data from public sources, websites, and apps people use every day. They aggregate all this info to build detailed profiles of people and to group them into simplified categories such as high income, new moms, pet owners, impulse buyers, and more. While advertisers are usually their primary clients, police can also purchase this data. Some of the largest data broker companies include Experian, Acxiom, and Equifax. The amount of data Equifax collected came to light in 2017, when a data breach exposed 147 million people's personal data.

- I think it just fuels this ability to identify someone and track kind of their behavior across the web and potentially their speech. Similar to the way law enforcement can track people and surveil people through social media platforms, information from data brokers can aid investigations in two ways. They can be coming at it from a person of interest who they're trying to find out more about, or authorities can be coming at it from, "I want information on anyone who has had an IP address in this area or anyone who has keyword searched, you know, and been shown these types of ads."

- So how do data brokers collect information? The most common ways include web browsing history, everything from your Google searches, sites or apps you visit, cookies, social media activity, or even a quiz you just filled out for fun. All of that can be scraped and tracked. This data creates each person's online history map, which in turn allows brokers to build a profile on each user. The data that companies collect often includes name, address, phone number and email address, date of birth, gender, marital and family status, Social Security number, education, profession, income level, and cars and real estate you own. It also comes from public sources. This can be anything in the public domain, such as birth certificates, driver's or marriage licenses, court or bankruptcy records, DMV records, and voter registration information. It can also include commercial sources, such as your purchase history, loyalty cards, coupon use, and so forth. And finally, some websites or programs will ask for your consent to share your data.
Sometimes it's anonymized in certain ways, especially when it comes to advertising data, but it's pretty trivial for law enforcement or other investigators to tie certain advertising behavior to a specific device, especially if it's collecting precise location data, and there's also data brokers that are building network profiles so you can get information not just about yourself, but about everybody you've interacted with, whether it's on social media or actually in real life. In the United States at least, we just lack laws that kind of regulate what these companies are able to collect. And if you have to participate in modern society, as nearly everyone does, it's almost impossible to avoid. I think in the context of protests, it's not an acute concern I would say, but it is generally speaking really freaky when the sky's the limit on what they could potentially use because there's just so much data. - I agree with what you said, sort of low on the acute scale, but high on the existential scale. [subdued music] - One of the big surveillance technologies that probably everyone who's driven on a highway knows about is license plate readers. Really just capturing what your license plate is and showing that your vehicle was at a certain place at a certain time. - Similar to like your phone, your car, it's a proxy for you. Maybe you were in the car, maybe you weren't, but that's where your car went. - There are three types of ALPR systems. The first is stationary or fixed ALPR cameras, which are installed in a fixed location like a traffic light, telephone pole, or a freeway exit ramp. The second type is mobile ALPR cameras, which are attached to police patrol cars, garbage trucks, and other vehicles, and allow them to capture data from license plates as they drive around the city. They can also assist law enforcement in gridding, which is when police officers drive up and down a neighborhood collecting license plates of all parked cars. There are also private vendors like Vigilant Solutions, which collect license plate data and sell that back to police. The third type is ALPR trailers, which are trailers police can tow to a particular area and leave for extended periods of time. It's been reported that the DEA has disguised ALPR trailers as speed enforcement vehicles and placed them along the US-Mexico border. The things I'm concerned about aren't necessarily even about it being used for license plates. Our colleague Dhruv Mehrotra has done some reporting showing that license plate readers can also capture any words that are visible, so that can be what's on your t-shirt, that could be political signs in your yard. This technology may be used in ways that we're not even familiar with or wouldn't imagine. You know, a lot of times when we're talking about any surveillance technologies, it's really about creating data that then is there and could potentially be used in any number of ways at any point in the future depending on who gets access to it and what they want to do with it. [moody music] - The key thing here is that these drones, even small quadcopters, like what we think of as consumer drones, they can carry a fair amount of cargo, meaning like cameras. - There are a number of different drones used by law enforcement varying in size and ability. For example, some drones have thermal imaging capabilities for night operations while others specialize in long periods of surveillance. Protestors have in the past reported drones flying overhead, for example in Minneapolis during the George Floyd protests.
Police and government drones usually fly in the range of 11,200 feet above the ground. However, it's been reported that the drone used to surveil protests in Minneapolis in 2020 flew at 20,000 feet, nearly invisible to protestors on the ground. This was a Customs and Border Protection drone; these drones are often equipped with advanced cameras, radar, and potentially cell phone geolocation tools. In terms of how freaked out you are about drones, how do you think about that? - Yeah, I would say fairly freaked out. But again, like you were saying about the layering of these technologies, I think it's not the drones themselves, it's everything they can do and how cheap they are and how easy it would be to deploy even more of this tech. When we talk about sort of evolution of different technologies, this capability is sort of similar to police helicopters and now it's just cheaper, lighter, easier. Even these sort of benign-seeming quadcopters that we see around all the time could be carrying equipment on them to do like very granular, detailed surveillance of something like a protest. [subdued music] - There are some technologies that are really just emerging and we don't even know if they've been used at protests or even used by authorities in the United States. - Right, and your face isn't the only thing sort of outside your body that can potentially identify you. For example, analyzing your gait, like how you walk. - Gait recognition technology can identify individuals by analyzing their unique walking patterns using machine learning. It captures movements through cameras, motion sensors, or even radar. It then processes this information, breaking it down into contours, silhouettes, and other distinguishing features. It offers high accuracy, but its effectiveness can be influenced by things like injuries or the types of terrain the subject is traversing. This tech is especially useful for authorities when people's faces are obscured. While there haven't been any reports of widespread use of this tech by law enforcement agencies in the US, Chinese authorities have been utilizing it on the streets of Shanghai and Beijing since at least 2018. In recent years, there have also been a number of companies working on creating emotional detection technology where AI uses biometric data to determine a person's emotional state and the likelihood they will become violent or cause a disturbance. WIRED reporting found that Amazon-powered cameras have been scanning passengers' faces in eight train stations in the UK to trial this new technology. The trials were testing the system for age and gender recognition as well as the emotional state of the person on camera. While there's no current documentation of this tech being used at protests, the BBC reported that emotional-detection tech has been used on Uyghurs in China. - Some of these could be really invasive because, you know, reading your emotions, there start to be maybe inferences that someone could make about how you were feeling in a certain moment that may or may not be accurate, right? Because it's sort of being taken out of context. So it's difficult to have an algorithm just sort of come to one conclusion. Like sometimes I think you're doing your angry walk coming over when I haven't filed my story, but really then you're really nice about it and you're like, "It's okay, Lily, you can do it." And you know, I took it totally the wrong way. But potentially there are more sort of in terms of just identifying someone in a certain place.
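To give a feel for what "breaking a walk down into distinguishing features" might look like, here is a heavily simplified, hypothetical sketch: each person is reduced to a small numeric feature vector (say, stride length, cadence, and arm swing, extracted upstream by some vision model), and an unknown walker is matched against a watchlist by distance. Real gait-recognition systems use learned features from video or radar; the feature choices, scales, and threshold here are invented.

```python
import math

# Hypothetical enrolled gait signatures: [stride_m, cadence_steps_per_min, arm_swing_deg]
watchlist = {
    "person_1": [0.74, 112.0, 31.0],
    "person_2": [0.81, 98.0, 45.0],
}

# Rough scales to put the features on comparable footing (invented values).
SCALES = [1.0, 120.0, 60.0]

def distance(a, b):
    """Euclidean distance between two scaled feature vectors."""
    return math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, SCALES)))

def identify(unknown, threshold=0.05):
    """Return the closest watchlist entry if it's within the match threshold."""
    name, dist = min(((n, distance(unknown, sig)) for n, sig in watchlist.items()),
                     key=lambda pair: pair[1])
    return (name, dist) if dist <= threshold else (None, dist)

# A walker whose measured features closely resemble person_2's enrolled signature.
print(identify([0.80, 99.0, 44.0]))
```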
It is scary that there's something characteristic about your walk. They're not saying, "Oh, it's Andrew's angry walk," but they're saying, "Oh, that's Andrew." - Certainly creating more systems that are replicating what other things like facial recognition do and applying it to other biometrics of a person. That definitely is gonna create all the same concerns as we've seen with these other technologies that were emerging, you know, years or decades ago. But now it's your entire body, how you walk, and like you mentioned, like if we're having computers analyze like how I'm feeling in a certain moment, effectively establishing intent of whatever my actions are in that moment, that gets really scary because it might be completely inaccurate. Every time there's one of these new AI technologies, there's always some bias built in. There are gonna be people who suffer consequences unnecessarily because these systems are deployed without being fully debugged. Experts in the AI field have previously noted that emotional-detection tech is unreliable and immature, and some even call for the technology to be banned altogether. [subdued music] Here are a few simple and effective ways to protect yourself and your personal information at a protest. First, if you can, leave your phone at home. I know this might sound drastic, but the most effective way to ensure that your personal data isn't compromised and that your phone won't fall into the hands of law enforcement is by not having it with you. If that's not an option, you can put your phone in a Faraday bag so data can't be accessed. You should also turn off biometrics on your phone, like facial recognition or the fingerprint scanner, meaning you'll need a code to access it. That way your face or fingerprints can't be forcefully used to access your personal information. You can always say you just don't remember the code. Don't unlock it. Another thing to keep in mind is posting on social media. Jay Stanley, a senior policy analyst at the ACLU, says, "If you post something online, you should do so under the assumption that it might be viewed by law enforcement." You should always check your sharing settings and make sure you know what posts are public. Try to minimize the amount of other people's faces you capture in your photos or videos, use end-to-end encrypted messaging services like Signal when possible, wear a mask in case photos or videos are taken, and finally, know your personal risks. Is your immigration status exposing you to additional dangers? Are you part of a minority group that is more likely to be targeted by law enforcement? Keep these things in mind for yourself and your loved ones when deciding if you should go out to a protest. For more information about surveillance at protests, check out WIRED.com. This was "Incognito Mode." Until next time. [otherworldly music]

Your Tesla Is Watching

Yahoo
26-02-2025

  • Automotive
  • Yahoo

The 2024 Tesla Model 3 has some of the most advanced navigation, autonomous driving, and safety features currently on the market, meaning it's full of equipment that can record and track your surroundings—and you. How much data does Tesla collect? Where is it stored? And can you trust them to protect your sensitive information? WIRED decided to investigate. This is Incognito Mode.

Director: Efrat Kashai
Director of Photography: Brad Wickham
Editor: Katie Wolford; Brady Jackson
Host: Andrew Couts
Line Producer: Joseph Buscemi
Associate Producer: Brandon White
Production Manager: Peter Brunette
Camera Operator: Caleb Weiss
Gaffer: David Djaco
Sound Mixer: Sean Paulsen
Production Assistant: Kameryn Hamilton
Set Designer: Jeremy Derbyshire-Myles
Writer: Eric Geller
Researcher: Paul Gulyas
Post Production Supervisor: Christian Olguin
Supervising Editor: Doug Larsen
Additional Editor: Jason Malizia
Assistant Editor: Justin Symonds
Special Thanks: P & P Shipping

- Is this Tesla spying on me right now? Probably not. Can it? It definitely can. [mellow electronic music] This is a 2024 Tesla Model 3. It has some of the most advanced navigation, autonomous driving, and safety features currently on the market, meaning it's full of equipment that can record and track you and your surroundings, but what happens to that data? How much of it does Tesla collect and can you trust the company to protect your sensitive information? I'm Andrew Couts. Today, we're doing a deep dive into Tesla's privacy issues. This is "Incognito Mode." [somber electronic music] There have been roughly seven million Teslas sold, and each one of them is essentially a surveillance system on wheels. The manual says there are Wi-Fi and GPS antennas in the passenger-side mirror, which lets your car see where you are and get directions and download software updates when it's parked in your garage. Then, of course, there are all the cameras. Every Tesla has a rear-view camera, a common feature which has actually been required in all new cars since 2018, but Teslas also have either seven or eight other cameras depending on the model, two mounted on the door pillars, two mounted on the front fenders, one inside the car, and either two or three mounted on the windshield above the rear-view mirror. These cameras record around the car and feed that information to an onboard AI called Tesla Vision, which figures out things like whether you're staying in your lane, getting too close to other vehicles, or if you're approaching a red light. With this 360 view, you're really recording a massive amount of information about everything the car drives by, license plates, people's faces, where people are on the street. Basically, public information, but it's a whole lot for a car to collect. With all these sensors and cameras, your Tesla is awash in data, data that the company says your car needs to stay safe and get smarter, and Tesla says that it keeps your data private, but is it really that simple? Let's look at what happens to the most sensitive data that your Tesla collects. [somber electronic music] Your Tesla is constantly using its GPS sensor to monitor where you go, but according to Tesla, the company doesn't collect your location data, with two exceptions. The first exception is if you experience what the company calls a safety event. Basically, if you get into an accident. If your car senses a crash, it'll send your location to Tesla. The second is if you turn on location data sharing.
If you do that, your car will send location data to Tesla so it can evaluate how well its cars analyze and respond to road conditions. Tesla promises that, if you give it permission to collect this data, it'll make sure it's anonymized. The company says that it does not link your location with your account or your identity or keep a history of where you've been. [somber electronic music] Tesla says that it only collects camera footage in limited circumstances. For the exterior cameras, your car processes the data itself unless you enable data sharing, in which case recordings of up to 30 seconds are shared with Tesla so it can evaluate how well its cars analyze and respond to road conditions. Just like when you voluntarily share location data, Tesla says this information is anonymized. According to Tesla, the only time your car will share your camera footage and link it to your identity is if you're in an accident. Tesla's exterior cameras are also used for a feature called Sentry Mode, which basically lets you monitor your car's surroundings, or if your car detects a threat or unusual movement, you can watch a recording of the event. Tesla promises that these features are secure. The live view is protected with end-to-end encryption, meaning that even Tesla can't access it, and the recordings can only be saved on a USB drive. That means they aren't sent to Tesla. Still, you're gonna wanna keep Sentry Mode in mind, and we'll explain more later. As for the interior camera, the one that might be seeing more sensitive things, Tesla says that all its footage stays in the car unless you enable data sharing. If you turn on that sharing and you get in an accident, your car will send Tesla short anonymized clips. It's the same as with the exterior cameras, but what's different about the interior camera is, if you don't turn on data sharing, the footage will never be sent to Tesla, and if you're wondering about audio recordings generated by voice commands like asking your car to turn on the AC, Tesla says those recordings stay in the car too unless you enable data sharing so it can improve the accuracy of your car's responses. Just like with the cameras, Tesla says it doesn't capture continuous audio recordings, but can you trust Tesla's promises? The truth is your privacy isn't as simple as the company makes it sound. [somber electronic music] There are two big problems with Tesla's privacy claims. The first has to do with anonymization. Remember how Tesla claims that it anonymizes all the data so it can't be traced back to you? Well, this anonymization isn't fool-proof. Each piece of data gets a temporary ID when it's sent to Tesla servers, but as IEEE Spectrum explained, that temporary ID can stay active for days or even weeks, and during that time, everything associated with that ID is clearly linked for everyone at Tesla to see. That could include repeated visits to places that could clearly identify someone like homes, schools, and office buildings. As one Tesla owner who reverse-engineered the car's data collection system told IEEE Spectrum, "You could probably match everything to a single person if you wanted to." And that leads to our second big problem, how much you can trust Tesla itself. Tesla makes a lot of privacy guarantees, but when the Mozilla Foundation analyzed those promises, it wasn't very impressed. "Tesla does brag on its privacy pages about how they're committed to protecting your data privacy. However, we worry that their actions too often show otherwise." 
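To see why a rotating temporary ID is weak protection on its own, here is a hedged sketch of the re-identification idea described above: if a single ID stays attached to days of driving data, the most frequently visited overnight location is a strong guess at a home address, which can then be matched against public records. The data, ID format, and night-hours heuristic are all invented for illustration; this is not Tesla's actual pipeline.

```python
from collections import Counter
from datetime import datetime

# Hypothetical pings all tagged with the same 'anonymous' temporary ID.
pings = [
    ("temp-7f3a", datetime(2024, 3, 1, 23, 40), (40.7130, -74.0060)),
    ("temp-7f3a", datetime(2024, 3, 2, 2, 15),  (40.7130, -74.0060)),
    ("temp-7f3a", datetime(2024, 3, 2, 9, 5),   (40.7500, -73.9900)),  # office
    ("temp-7f3a", datetime(2024, 3, 3, 0, 30),  (40.7130, -74.0060)),
]

def likely_home(pings, night_start=22, night_end=6):
    """Guess a home location as the most common place seen during night hours."""
    night_spots = Counter()
    for _id, ts, (lat, lon) in pings:
        if ts.hour >= night_start or ts.hour < night_end:
            # Round coordinates so nearby pings count as the same spot.
            night_spots[(round(lat, 3), round(lon, 3))] += 1
    return night_spots.most_common(1)[0] if night_spots else None

print(likely_home(pings))  # ((40.713, -74.006), 3)
```

A single stable nighttime cluster like that turns an "anonymous" ID into something one address lookup away from a name.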
Mozilla criticized the policy's really vague language and lack of clarity on sharing with third parties, and it said it was very worried about Tesla's privacy. They even went as far as saying it's hard to trust them with their current track record. Tesla's had some pretty serious privacy and security incidents over the past few years. The biggest scandal involved Tesla employees spying on customers through the images and videos recorded by their car's cameras. - [Reporter] A car owner filing a potential class action suit after a Reuters report published Thursday said a small group of former employees described sharing sensitive customer videos internally. - According to Reuters, between 2019 and 2022, Tesla employees sent each other highly invasive images and videos from their customers' cars. This included footage of naked people and even a car hitting a child riding a bike. Tesla employees could see inside homes and garages as well as the GPS locations of their recordings. After the news broke in 2023, members of Congress took notice. Senators Ed Markey and Richard Blumenthal wrote to Tesla CEO Elon Musk, saying, "The apparent willful disregard of Tesla customers' privacy is unacceptable and raises serious questions about Tesla's management practices." Employees spying isn't the only reason to worry about how Tesla protects your privacy. The company has also had trouble holding on to its data. - You have privacy questions because information is being gathered about you at all times, where you park, where you go, and it's being kept by the automotive industry without any real standards that are in place today. - In May 2023, a whistleblower gave a German newspaper 100 gigabytes of internal Tesla documents, including sensitive employee details, customers' bank information, and even production secrets. A German data protection officer told The Guardian that the data breach was unprecedented in scale. Now that we've covered what's in a Tesla and what to make of the company's privacy promises, let's go over what could go wrong if you own, rent, ride in, or just walk by one of these cars. [somber electronic music] When it comes to Tesla's privacy risks, there's really three categories you have to consider, government surveillance, hacking, and good old-fashioned forgetfulness. You own a Tesla, the government can just search your car. They may not even need a warrant to do so. The Fourth Amendment's so-called automobile exception lets the police search a car without a warrant as long as they have probable cause to be suspicious of you, and searching a Tesla can involve plugging your car into a computer and analyzing all the data it's collected about your travel history, and here's something a lot of people don't know. Even if you don't own a Tesla and you don't take a ride in one, you could still get caught in the company's de facto surveillance. Remember that Sentry Mode feature we told you about earlier? If you're walking down the street near a Tesla and a crime occurs nearby, you could get caught up in a police investigation without having anything to do with the incident. Tesla's privacy policy contains vague and broad language about when the company will share car data with government authorities. The company says that it will share information in response to subpoenas, but it also says it will turn over data if the company thinks the law requires it for purposes of security or other issues of public importance. It's anyone's guess what that means. 
Apart from government investigations, there are also criminal gangs that like to hack companies, take huge piles of customer data, and charge ransoms not to post it online. There's also the possibility that hackers could steal your data right off your car. In early 2023, researchers at Pwn2Own, a security conference in Canada, hacked into a Tesla Model 3 in less than two minutes. "They went from what's essentially an external component, the Bluetooth chipset, to systems deep within the vehicle." In addition to police and hackers, there's a third privacy risk you should consider, yourself. When people rent Teslas or trade in their used ones, they sometimes forget to erase their car's memory. Researchers have found sensitive unencrypted data sitting on Teslas purchased at scrap yards, where their previous owners probably thought it would never be accessed again. [somber electronic music] Don't other cars have these problems too? It's a fair question. A 2023 Mozilla Foundation review of connected vehicle privacy concluded, "Modern cars are a privacy nightmare." The truth is all car companies collect a lot of data on their customers. Companies like Toyota use the car's interior camera to make sure drivers are paying attention, even to verify a driver's identity to prevent theft. Another example is Nissan's privacy policy, which states it could collect not just your location history, but anything you might do inside the privacy of your own car, but not all of your car's data is equally sensitive and not all car makers collect the same data and handle it in the same way, and when it comes to the most sensitive data, Tesla stands out. Given its unique quantity of cameras and sensor data, its alarming privacy and security failures, and its vague policy about sharing information with the government. "As far as we know," one researcher told IEEE Spectrum, "Tesla vehicles collect the most amount of data." And in its 2023 review of car companies, Mozilla said, "Tesla is only the second product we've ever reviewed to receive all of our privacy dings." Of course, we can't talk about Tesla's privacy risks without talking about Elon Musk. The world's richest man has recently inserted himself into the center of American politics and aligned himself with President Donald Trump. He's repeatedly twisted the policies of his social media network X to satisfy his own personal whims and support Trump's political agenda, and remember how Tesla says it doesn't collect continuous video from its cars? We don't know if that's a technological limitation or just a company policy. Basically, if the technology allows it, all it takes is a policy tweak for your level of privacy to drastically change. [somber electronic music] Now that you know how Tesla collects your information and why it might not be as private as the company claims, you're probably wondering how you can protect yourself. If you own a Tesla, the one thing you can do is disable data sharing. Of course, we should point out that disabling some of these features will make your Tesla less smart. Using technology is all about making trade-offs. You'll have to decide for yourself what you're most comfortable with. If you're getting rid of your Tesla, you can use the company's website or app to request a total account deletion, and if you're a pedestrian worried about ending up in the background of a Tesla recording, maybe just keep an eye out for that sleek T logo and cross the street any time you see one. Until next time. [playful electronic music]

[somber electronic music] Don't other cars have these problems too? It's a fair question. A 2023 Mozilla Foundation review of connected vehicle privacy concluded, "Modern cars are a privacy nightmare." The truth is all car companies collect a lot of data on their customers. Companies like Toyota use the car's interior camera to make sure drivers are paying attention, even to verify a driver's identity to prevent theft. Another example is Nissan's privacy policy, which states it could collect not just your location history, but anything you might do inside the privacy of your own car. But not all of your car's data is equally sensitive and not all car makers collect the same data and handle it in the same way, and when it comes to the most sensitive data, Tesla stands out, given its unique quantity of cameras and sensor data, its alarming privacy and security failures, and its vague policy about sharing information with the government. "As far as we know," one researcher told IEEE Spectrum, "Tesla vehicles collect the most amount of data." And in its 2023 review of car companies, Mozilla said, "Tesla is only the second product we've ever reviewed to receive all of our privacy dings." Of course, we can't talk about Tesla's privacy risks without talking about Elon Musk. The world's richest man has recently inserted himself into the center of American politics and aligned himself with President Donald Trump. He's repeatedly twisted the policies of his social media network X to satisfy his own personal whims and support Trump's political agenda, and remember how Tesla says it doesn't collect continuous video from its cars? We don't know if that's a technological limitation or just a company policy. Basically, if the technology allows it, all it takes is a policy tweak for your level of privacy to drastically change. [somber electronic music] Now that you know how Tesla collects your information and why it might not be as private as the company claims, you're probably wondering how you can protect yourself. If you own a Tesla, the one thing you can do is disable data sharing. Of course, we should point out that disabling some of these features will make your Tesla less smart. Using technology is all about making trade-offs. You'll have to decide for yourself what you're most comfortable with. If you're getting rid of your Tesla, you can use the company's website or app to request a total account deletion, and if you're a pedestrian worried about ending up in the background of a Tesla recording, maybe just keep an eye out for that sleek T logo and cross the street any time you see one. Until next time. [playful electronic music]
