
Latest news with #Gaggle

'Spy High': Amazon Documentary Probes Dangers of Online Student Surveillance

Yahoo · 21-04-2025

It all began with a pixelated image of a Mike and Ike: the colorful, fruity candy that, with a digital blur and authorities' preconceived notions, could perhaps be mistaken for a pill. That's what happened to 15-year-old Blake Robbins, who was accused by officials in Pennsylvania's affluent Lower Merion School District of dealing drugs in 2009 after they surreptitiously snapped a photo of him at home with the chewy candy in his hand. The moment was captured by the webcam on his school-issued laptop, one of some 66,000 covert student images collected by the district, including one of Robbins asleep in his bed.

Robbins sued, and the subsequent case, dubbed 'WebcamGate,' is at the center of Spy High, a four-part documentary series now streaming on Amazon Prime that examines the high-profile student surveillance scandal and the explosion of student privacy threats that followed it.

The Lower Merion School District, which settled the class-action lawsuit, was an early adopter of one-to-one education technology programs that provide school-issued laptops to students. Such programs have since become mainstream nationwide, particularly since the pandemic. So, too, have digital surveillance tools like Gaggle and GoGuardian, which alert educators when students express thoughts of self-harm or discuss topics deemed taboo, like sex, violence or drugs.

Directed by Jody McVeigh-Schultz and executive produced by Mark Wahlberg, the documentary offers a cautionary tale about what happens when student monitoring initiatives — often intended to promote young people's safety and well-being — go awry. It also explores how covert student surveillance intersects with far-reaching school equity issues involving race, disability, privilege and discipline.
After years of reporting on digital student surveillance myself, I caught up last week with McVeigh-Schultz, whose other documentaries include Shiny Happy People, about reality TV's seemingly wholesome Duggar family, and the Emmy-nominated The Murders at Starved Rock, which delves into the brutal 1960 killing of three women in an Illinois state park. We talked about what he wants viewers to take away from the Robbins scandal 15 years after it unfolded and the lessons it holds for contemporary student privacy debates and schools' growing reliance on ed tech. The interview was edited for length and clarity.

What motivated you to take a deep dive into the Robbins case, and why is it important right now?

I grew up just outside of Philly in a suburb called Cheltenham, and I had heard about this story. I knew Lower Merion as the high school that Kobe [Bryant] went to. That's what it was famous for, but I knew about the Robbins story and I was like, 'That's crazy,' when I heard about it back in 2010, and then I kind of never heard anything more about it. It was a really big story and then just kind of went away.

When we talked to folks from wealthy suburbs outside of Philadelphia, it was very clear to me that one of the key indicators of status is education. It's more important than anything else to people. The public schools in Lower Merion are really highly rated, and people care a ton about the quality of the education and the image of the institution. What are the real-world implications of that? In this case, the way it played out, some of the things that happened were counterintuitive. Many folks from that community didn't want to see a lawsuit come to bear against their school. It was like, 'Oh well, you know, this actually is perhaps going to affect our home values,' if you're selling your home and the biggest selling point is the quality of the education.
That's something you wouldn't expect to be one of the first reactions to finding out that the schools may be surveilling your kids. But it was. And the fact that the Robbins family had lived in the community for a long time but just weren't considered part of the in-group because of who they were was very interesting and, I think, led to people being skeptical of them.

The documentary leaves it up to you to decide whether that skepticism is deserved or not.

Absolutely. The documentary certainly highlights how people are complex and have complicated stories.

What did you learn about debates over personal privacy, especially when it comes to information about children?

People's expectations of how much privacy you should be afforded, and how much you should expect without having to ask any questions, vary a lot. Somebody who was interviewed in a news piece that ran in 2010 said, 'You know, this is the school district's laptop, they could tap in at any time and rightfully so.' I'm a parent, I have a 2-year-old and a 7-year-old who's in first grade. To me, that seems a bit absurd, but the truth is, I think there are certain contexts where a school-issued laptop is going to be surveilled. We know it's going to be surveilled, but we don't expect that it will be able to take pictures in our kids' bedrooms.

To me it's a matter of where are [the] spaces where we should reasonably expect privacy? Transparency is the most important aspect of all of this. Not only were there no conversations going on like, 'Hey look, these laptops are going to be surveilled in a number of ways. You should not be leaving them open in your bedroom. You should not be going on any website you wouldn't want your principal to also see.' The IT department specifically thought it would be a bad idea if parents and students were alerted to the existence of the software that could take images.
They felt like, 'Well then we won't be able to recover the stolen laptop because people will just put tape over it.' Well, that is their decision not to have images taken of them in their bedroom, right? One of the journalists we interviewed said it was like trying to kill a fly with a bazooka. This level of surveillance was not required to track inventory. It just wasn't. Hindsight is 20/20, but it's obvious from what transpired that they spent a lot more money on legal fees and settling these lawsuits than they ever saved by making sure a handful of laptops were not stolen or lost.

What did you learn about the motives of the school district officials, the lawyers and the families involved?

When I'm making a documentary I'm never thinking in terms of quote-unquote good guys and bad guys. Everyone in this story thought they were doing what was best for the students involved. But in the end, I think there was this balance of protecting students' privacy and protecting the image of the school district. When a mistake is made, there is a reluctance to admit it, take responsibility and accept blame. Once you do that, you are admitting to what happened, and then there are all these legal ramifications. Multiple people are like, you know, these kids need therapists, they need somebody to check on them and to be like, 'Hey, your privacy was violated, are you doing OK?' and that did not happen. I can't say why that didn't happen, but to me it seems likely that part of not offering people help is that the minute you say this person needs a therapist because of what we did, you're admitting to a pretty major violation.

The documentary doesn't focus just on the Robbins case. It offers a deep dive into education policy debates around racial inequities, school integration, gender equality and LGBTQ+ rights. What did you find were the implications of surveillance for these populations?
We talked to Elizabeth Laird at the Center for Democracy and Technology, and one of the things she said she sees all the time is that when surveillance is ubiquitous and regularly used in education, vulnerable populations end up feeling the brunt of the negative repercussions. In this case, back in 2010, people discovered that a disproportionate number of the students who were surveilled were African American. There was a sense that if this technology was being misused to discipline students or to check up on students, then chances are it was going to be misused against a student of color.

When we started talking to students of color who had their images taken, we started to understand, 'Oh, there is this whole context to what they're experiencing.' Somebody said you can't understand the laptop issue without understanding all these other battles that were happening at the time. There was a history of an achievement gap there, and African-American parents felt like if you wanted to get an equal education for your kids, you had to fight for it. In this context, there was a real lack of trust of the school district by African-American parents.

Keron Williams and his mother really wanted to tell his story. It was a story of somebody suspecting him of stealing a bracelet and him being brought into the principal's office. He says his laptop webcam was activated a couple of days later, after they searched his pockets and found nothing but a Boy Scouts handkerchief. There's racial profiling but also this idea of the misuse of technology meant to keep laptops from being stolen. If something like this is misused, vulnerable populations are going to feel the brunt of it more.

That brings me to one of the other stories we talked about, which was more recent. In 2020, with the pandemic, school-issued devices and remote learning became the norm.
We talked to two students who started high school online, went to classes on Zoom, and were using their school-issued laptops for everything. The way they communicated, instead of seeing their friends at lunch, was through a Google Hangouts chat. What they didn't realize was that their school was using monitoring software that essentially scooped up everything they wrote while logged into their school account, including private chats. They were brought to the principal's office and confronted with what they wrote. The context of it is that the school decided it was bullying. What we reveal is that they were using the word 'gay' because they were. The term they used was, 'We're a pretty gay friend group. Gay was a descriptor to us.' One of these kids had to come out in the principal's office with his father there. Luckily his parents were pretty great about it, but that's a really awful position to put a kid in and, you know, again, a vulnerable population bearing the brunt of overzealous surveillance.

The goal of this surveillance is to protect kids, to make sure kids aren't hurting themselves or hurting other students. There's obviously a mental health crisis going on among high school-aged kids, but there really has to be a discussion about whether these tactics are making the mental health crisis better or worse.

You're talking about the tools that schools nationally have increasingly used to collect and analyze reams of information about students in the name of keeping them safe. This includes tools like Gaggle and GoGuardian. Given the growth in these tools, do any guardrails need to be put in place?

First of all, it's so important that students know what is being used to surveil any device they're using. The fact that kids hadn't heard of Gaggle is really a problem.
But if they know about it, that doesn't solve all the problems, because what you're asking high schoolers especially to do is to find their own voice, understand how to freely express themselves, to be vulnerable. In some of my best creative writing courses my teachers were saying, 'Look, if it scares you to write this, you're probably going in the right direction.' The minute a kid realizes, 'Well, everything that I'm writing in a creative writing class — a poem, a personal essay — is going through this software, maybe going to my principal, maybe going to law enforcement,' they're going to express themselves differently. That's just a really dangerous road to go down.

Students and parents have to be aware, but also I just think it should be less powerful. I don't think we should be able to say there is no way you can use our technology, which is kind of unavoidable if you're a high school student, without being constantly surveilled. In Minnesota, the story we cover, they changed the law to outlaw surveillance software. That's a pretty huge step, and I think that'll happen more and more as people become more aware of this stuff. There are just places where we should not be allowing this.

School Surveillance Systems Threaten Student Privacy, New Knight Institute Lawsuit Alleges

Yahoo · 02-04-2025

What happens when the era of AI-powered surveillance coincides with an authoritarian assault on public education? The transparency needed to answer that question does not exist, which is why the Knight First Amendment Institute, where I work, has filed a lawsuit seeking information about how one school district uses surveillance systems to monitor student laptops.

It is estimated that millions of children — nearly half of K-12 students across the nation, according to a recent New York Times report — are subject to digital surveillance systems that can potentially monitor every word or phrase they type on school-issued laptops, tablets, and software. These software systems, supplied to school districts by private education technology companies like GoGuardian, Gaggle, and Lightspeed, scan students' communications, internet searches, and assignments, searching for keywords or phrases that may indicate cyberbullying, thoughts of self-harm, or thoughts of harming others. If a student uses a word or phrase deemed inappropriate by the vendor or the school district using the technology, administrators are notified and, in some instances, police get involved. The systems also have filtering capabilities, which can restrict students' access to certain websites and pages based on their content.

Given the widespread concerns about youth mental health, many people might view such digital surveillance as a godsend, a critical tool for combatting tragically high rates of youth suicide and depression, as well as school shootings. But that viewpoint likely rests on two assumptions: first, that the systems are as effective as the edtech industry claims, and second, that school districts are limiting the systems' purview to content that relates to student safety. It's not clear, however, that those assumptions have merit.
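The scanning pipeline described above (match typed text against a keyword list, then alert an administrator) can be illustrated with a toy sketch. The categories and phrases below are hypothetical stand-ins: actual vendor keyword lists are proprietary, far larger, and, as the article notes, customizable by each district.

```python
import re

# Hypothetical watch list; real vendors' lists are not public.
FLAG_TERMS = {
    "self-harm": ["hurt myself", "suicide"],
    "bullying": ["everyone hates you"],
}

def scan(text):
    """Return (category, phrase) pairs for every watched phrase found in text."""
    hits = []
    lowered = text.lower()
    for category, phrases in FLAG_TERMS.items():
        for phrase in phrases:
            # Whole-phrase match, case-insensitive, with word boundaries.
            if re.search(r"\b" + re.escape(phrase) + r"\b", lowered):
                hits.append((category, phrase))
    return hits
```

String matching of this kind also hints at why the false positives reported elsewhere in these articles occur: the matcher sees only the text, not the context, so a word used descriptively triggers the same alert as a genuine threat.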
The scant information we have about school surveillance suggests that the systems can lead to erroneous flags and can be used as a tool of overzealous surveillance by school administrators. Public records obtained by the Electronic Frontier Foundation have exposed several examples of students being flagged for innocuous visits to websites containing the text of Genesis, classic literature like Romeo and Juliet, and publications about Martin Luther King Jr. and the Civil Rights Movement. In other cases, surveillance of students' online activity has led to the unwanted disclosure of private details about their sexuality and the flagging or blocking of race-related content, according to a report from the Center for Democracy & Technology. And just last year, journalism students at a Lawrence, Kansas, high school successfully campaigned to get their files omitted from the purview of Gaggle, arguing that the school's use of the technology would have a chilling effect on critical reporting on district staff and administration.

Despite the edtech industry's claims that their artificial intelligence products have saved thousands of students' lives, the Associated Press reports that no independent research has corroborated their efficacy. Industry data regarding the frequency and accuracy of the systems' alerts is usually kept behind closed doors, exclusively in the hands of the for-profit companies that develop and market them. And while school districts maintain that their use of these tools is aimed at deterring self-harm, cyberbullying, and other harmful situations, their ability to customize the standard list of keywords and blocked websites provided by edtech companies raises serious concerns about the extent of surveillance and censorship school administrators can carry out, and the potential impact on students' privacy, speech, and associational rights.
For example, school districts could easily block 'Black Lives Matter' websites, as at least one district reportedly did in the past, and flag any research or discussion of sexual orientation. At a moment when many children are prohibited from discussing racial history, learning about gender identity, or reading banned books in the classroom, it is critical that they have alternative channels for exploration and expression, including the internet and messaging platforms. Digital surveillance endangers both the sanctity of their private communications and their freedom to access more accurate, complex, and engaging ideas than their schools might permit.

These concerns have intensified in recent weeks as the Trump administration escalates efforts to control how K-12 schools teach history, race, and gender. It's not clear what role, if any, schools' deeply entrenched surveillance capabilities will play in their efforts to comply, but it's important that the public knows about it.

In an effort to learn more about how schools are monitoring our students, the Knight Institute submitted a records request to the Grapevine-Colleyville Independent School District in Texas. The district has resisted our efforts to obtain information about student surveillance, claiming the records are statutorily exempt from disclosure. The district's position is that all information about the keywords and websites it monitors or blocks falls within the public records statute's computer network security exception, and that disclosure of that information would expose the district's network to security threats. But it's not clear how transparency about the scope of student surveillance would lead to a security incident, and the district has offered no basis for its contention that it could. Until the public has access to this information, the extent of the district's monitoring remains unknown.
We're hopeful that our new lawsuit succeeds in compelling the disclosure of this information at a critically important time. The public — particularly the students, parents, and teachers most directly impacted by this technology — deserves to know whether tools that purport to protect student safety are being exploited to stifle and censor student expression.

Editor's note: Gaggle, GoGuardian, and the Grapevine-Colleyville Independent School District did not respond to Teen Vogue's request for comment. In an email, Lightspeed chief of staff Amy Bennett said, 'Monitoring activity on district devices to keep students safe isn't just the right thing to do—it's also required by federal law under the Children's Internet Protection Act (CIPA). To minimize false alerts, Lightspeed Alert uses a combination of AI monitoring and trained human Safety Specialists.' Bennett added that their programs are 'built with Privacy by Design. We hide personally identifiable information (PII), share only the minimum data needed with only the personnel required, and fiercely secure our systems. And our AI is specifically trained to look for signs of distress and potential harm—not things like bad words or flirting.'

Originally appeared on Teen Vogue.

Takeaways from our investigation on AI-powered school surveillance

Yahoo · 12-03-2025

Thousands of American schools are turning to AI-powered surveillance technology for 24/7 monitoring of student accounts and school-issued devices like laptops and tablets. The goal is to keep children safe, especially amid a mental health crisis and the threat of school shootings. Machine-learning algorithms detect potential indicators of problems like bullying, self-harm or suicide and then alert school officials.

But these tools raise serious questions about privacy and security. In fact, when The Seattle Times and The Associated Press partnered to investigate school surveillance, reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request. The documents were stored without a password or firewall, and anyone with the link could read them.

Here are key takeaways from the investigation.

Surveillance tech like Gaggle isn't always secure

The privacy and security risks became apparent when Seattle Times and AP reporters submitted a public records request to Vancouver Public Schools in Washington, seeking information about the kind of content flagged by the monitoring tool Gaggle. Used by around 1,500 districts, Gaggle is one of many companies offering surveillance services, including GoGuardian and Securly.

Gaggle saved screenshots of digital activity that set off each alert. School officials accidentally provided the reporters with links to them, not realizing they weren't protected by a password. Students in these documents opened up about the most intimate aspects of their personal lives, including suicide attempts. After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots.
Gaggle said this feature was already in the works but had not yet been rolled out to every customer. The company says the links must be accessible without a login during those 72 hours so the school's emergency contacts — who often receive these alerts late at night on their phones — can respond quickly.

There's no independent research showing surveillance tech increases safety

The long-term effects of surveillance technology on safety are unclear. No independent studies have shown it measurably lowers student suicide rates or reduces violence. A 2023 RAND report found only 'scant evidence' of either benefits or risks from artificial intelligence surveillance. 'If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,' said report co-author Benjamin Boudreaux, an AI ethics researcher.

Experts warn that having privacy to express feelings is important to healthy child development. But proponents of digital monitoring point out that school computers are not the appropriate setting for this kind of unlimited self-exploration.

LGBTQ+ students are particularly vulnerable

Surveillance software poses unique risks to LGBTQ+ students, advocates warn. In the records released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria. When Durham Public Schools in North Carolina piloted Gaggle, an LGBTQ+ advocate reported that a Gaggle alert about self-harm had led to a student being outed to their family. Another student raised concerns about losing trust with teachers. The board voted to stop using the technology, finding it wasn't worth the risk of eroding relationships with adults.

Parents often don't know their kids are being watched

Parents interviewed for this article said their child's school either did not disclose it used surveillance software or buried the disclosure in long technology-use forms.
Even when families are aware of surveillance, schools may refuse to let them opt out. 'Imagine growing up in a world where everything you've ever said on a computer is monitored by the government,' said Tim Reiland, who unsuccessfully lobbied his kids' school district in Owasso, Oklahoma, to let his children opt out of Gaggle. 'And you just have to accept it and move on. What kind of adults are we creating?'

____

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at

