Latest news with #JeffPatterson
Yahoo
30-05-2025
- Climate
- Yahoo
WCWS weather updates: Texas Tech-Ole Miss softball game starts after delay
Texas Tech and Ole Miss softball's highly anticipated meeting at the Women's College World Series was briefly interrupted by Mother Nature. Due to inclement weather in the Oklahoma City area, Thursday's WCWS game between the Red Raiders and Rebels was delayed roughly 75 minutes by rain and lightning.

The game was originally scheduled for a 7 p.m. ET first pitch at Devon Park in Oklahoma City on Thursday.

REQUIRED READING: Texas Tech vs. Ole Miss softball: Live score and highlights of the WCWS game

To make the WCWS, Texas Tech took down No. 5 Florida State in the Tallahassee Super Regional, while Ole Miss, the lone unseeded team at the WCWS this year, took down No. 4 Arkansas in the Fayetteville Super Regional.

Here are the latest weather updates on Texas Tech-Ole Miss at the WCWS:

WCWS weather updates: Latest on Texas Tech vs. Ole Miss game

All times Eastern.

8:10 p.m.: After a 75-minute delay, Texas Tech and Ole Miss have started at the WCWS. Click here for live updates of the Red Raiders and Rebels game.

7:50 p.m.: Texas Tech has taken the field at Devon Park in Oklahoma City for pregame warm-ups. First pitch is right around the corner.

7:33 p.m.: The tarp appears to be off the field in Oklahoma City. The newly announced first pitch between Texas Tech and Ole Miss is roughly 35 minutes away.

7:27 p.m.: The NCAA announces on X that first pitch between Texas Tech and Ole Miss is now scheduled for 8:15 p.m. ET in Oklahoma City.

7:26 p.m.: As noted by The Oklahoman's Jeff Patterson, Thursday's Texas Tech vs. Ole Miss WCWS game has been delayed due to lightning in the Oklahoma City area. Per NCAA rules, if lightning strikes within six miles of the venue, the game must be suspended for at least 30 minutes. Every lightning strike that follows the initial strike resets the 30-minute clock.

7:09 p.m.: Texas Tech's official X account (formerly Twitter) announced the originally scheduled 7 p.m. ET first pitch won't happen due to a weather delay.

WCWS weather forecast: Hour-by-hour weather in OKC

According to The Weather Channel, once the current patch of inclement weather rolls through Oklahoma City, there should not be any more rain for the rest of Thursday night. Here's the hourly forecast from The Weather Channel for Oklahoma City:

8 p.m.: Partly cloudy (5% chance of rain)
9 p.m.: Partly cloudy (2% chance of rain)
10 p.m.: Partly cloudy (2% chance of rain)

What TV channel is Texas Tech softball vs. Ole Miss in the WCWS on today?

TV channel: ESPN2
Streaming options: ESPN app | Fubo (free trial)

ESPN2 will nationally televise Thursday's WCWS game between Texas Tech and Ole Miss. Streaming options include the ESPN app (with a TV login) and Fubo, which carries ESPN networks.

This article originally appeared on USA TODAY: WCWS weather updates: Texas Tech-Ole Miss softball starts after delay
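As described in the 7:26 p.m. update, the NCAA rule suspends play for at least 30 minutes once lightning strikes within six miles of the venue, and each subsequent qualifying strike resets the clock. A minimal sketch of that timing logic (the function and constant names here are illustrative, not NCAA code):

```python
from datetime import datetime, timedelta

SUSPENSION = timedelta(minutes=30)  # minimum wait after a qualifying strike
RADIUS_MILES = 6.0                  # strikes inside this distance count

def earliest_restart(strikes):
    """Given (time, distance_in_miles) pairs, return the earliest time play
    may resume, or None if no strike fell within the six-mile radius."""
    restart = None
    for when, distance in strikes:
        if distance <= RADIUS_MILES:
            # each qualifying strike restarts the 30-minute clock
            candidate = when + SUSPENSION
            if restart is None or candidate > restart:
                restart = candidate
    return restart
```

For example, qualifying strikes at 7:00 and 7:10 push the earliest restart to 7:40; that resetting behavior is how a single storm cell can stretch a delay well past its first 30-minute window.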
Yahoo
12-03-2025
- Yahoo
Schools use AI to monitor kids, hoping to prevent violence. Our investigation found security risks
One student asked a search engine, 'Why does my boyfriend hit me?' Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves.

In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state. Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings.

The goal is to keep children safe, but these tools raise serious questions about privacy and security — as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district's surveillance technology.

___

The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times.

___

The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives. Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots. Vancouver school staff and anyone else with links to the files could read everything.
Firewalls or passwords didn't protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk.

The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology's unintended consequences in American schools. In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.

Gaggle Safety Management, the company that developed the software that tracks Vancouver schools students' online activity, believes not monitoring children is like letting them loose on 'a digital playground without fences or recess monitors,' CEO and founder Jeff Patterson said.

Roughly 1,500 school districts nationwide use Gaggle's software to track the online activity of approximately 6 million students. It's one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance. The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian's surveillance products in 2021.

Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students' well-being. 'I don't think we could ever put a price on protecting students,' said Andy Meyer, principal of Vancouver's Skyview High School. 'Anytime we learn of something like that and we can intervene, we feel that is very positive.'

Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations. 'That's not good at all,' Foster said after learning the district inadvertently released the records. 'But what are my options? What do I do? Pull my kid out of school?'
Foster says she'd be upset if her daughter's private information was compromised. 'At the same time,' she said, 'I would like to avoid a school shooting or suicide.'

How student surveillance works

Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years — approximately the cost of employing one extra counselor.

The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.

A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately. 'A lot of times, families don't know. We open that door for that help,' the counselor said. Gaggle is 'good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.'

Seattle Times and AP reporters saw what kind of writing set off Gaggle's alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren't protected by a password.

After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots.
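The triage flow described above (an automated scan flags content, human reviewers confirm it, and only confirmed items produce a school alert) can be sketched as follows. This is a hypothetical illustration, not Gaggle's actual code: simple keyword matching stands in for the machine-learning classifier, and all function and variable names are invented.

```python
# Hypothetical two-stage triage: automated flagging, then human review.
# Keyword matching is a stand-in for the real classifier described in
# the article; every name here is illustrative.

RISK_TERMS = {"suicide", "self-harm", "bullying", "violence"}

def machine_flag(text):
    """Stage 1: automated scan. Returns the matched risk terms, if any."""
    words = set(text.lower().split())
    return words & RISK_TERMS

def triage(text, human_confirms):
    """Stage 2: a flagged item goes to a human reviewer; only confirmed
    items are escalated to the school. Returns 'no_flag', 'dismissed',
    or 'alert_school'."""
    hits = machine_flag(text)
    if not hits:
        return "no_flag"
    return "alert_school" if human_confirms(text, hits) else "dismissed"
```

In the article's terms, a goofy chat that happens to trip a keyword would be flagged by stage 1 and dismissed by the human reviewer, while a confirmed serious item would be escalated to the school.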
Gaggle said this feature was already in the works but had not yet been rolled out to every customer. The company says the links must be accessible without a login during those 72 hours so emergency contacts — who often receive these alerts late at night on their phones — can respond quickly.

In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.

Foster's daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal's office after writing a short story featuring a scene with mildly violent imagery. 'I'm glad they're being safe about it, but I also think it can be a bit much,' Bryn said.

School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly. 'It allows me the opportunity to meet with a student I maybe haven't met before and build that relationship,' said Chele Pierce, a Skyview High School counselor.

Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district's enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.

While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There's no independent research showing it measurably lowers student suicide rates or reduces violence. A 2023 RAND study found only 'scant evidence' of either benefits or risks from AI surveillance, concluding: 'No research to date has comprehensively examined how these programs affect youth suicide prevention.'
'If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,' said report co-author Benjamin Boudreaux, an AI ethics researcher.

LGBTQ+ students are most vulnerable

In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria. LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and turn to the internet for support. 'We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,' said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.

In one screenshot, a Vancouver high schooler wrote in a Google survey form they'd been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: 'I am not a mandated reporter, please tell me the whole truth.'

When North Carolina's Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful. But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.

Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then 'blindsided' when Gaggle alerted school officials about something private they'd disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle. 'You can't just (surveil) people and not tell them. That's a horrible breach of security and trust,' said Thompson, now a college student, in an interview.
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.

Parents don't really know

The debate over privacy and security is complicated, and parents are often unaware it's even an issue. Pearce, the University of Washington professor, doesn't remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district's responsible use form before her son received a school laptop.

Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class. For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns. The district refused Reiland's request.

When Reiland's daughter, Zoe, found out about Gaggle, she says she felt so 'freaked out' that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn't want to get called into the office for 'searching up lady parts.' 'I was too scared to be curious,' she said.

School officials say they don't track metrics measuring the technology's efficacy but believe it has saved lives. Yet technology alone doesn't create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with 'deliberate indifference' to some families' reports of sexual harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide. When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology, responded: 'This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying.'

Long-term effects unknown

Despite the risks, surveillance technology can help teachers intervene before a tragedy. A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former Superintendent Susan Enfield. 'They knew that the staff member was reading what they were writing,' Enfield said. 'It was, in essence, that student's way of asking for help.'

Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support. 'The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,' said Boudreaux, the AI ethics researcher.

Gaggle's Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, 'the school's going to be held liable,' he said. 'If you're looking for that open free expression, it really can't happen on the school system's computers.'

____

The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at


The Independent
12-03-2025
- The Independent
Schools use AI to monitor kids, hoping to prevent violence. Our investigation found security risks
One student asked a search engine, 'Why does my boyfriend hit me?' Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves. In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state. Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings. The goal is to keep children safe, but these tools raise serious questions about privacy and security — as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district's surveillance technology. ___ The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times. ___ The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives. Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots. Vancouver school staff and anyone else with links to the files could read everything. Firewalls or passwords didn't protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk. 
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology's unintended consequences in American schools. In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe. Gaggle Safety Management, the company that developed the software that tracks Vancouver schools students' online activity, believes not monitoring children is like letting them loose on 'a digital playground without fences or recess monitors,' CEO and founder Jeff Patterson said. Roughly 1,500 school districts nationwide use Gaggle's software to track the online activity of approximately 6 million students. It's one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance. The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian's surveillance products in 2021. Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students' well-being. 'I don't think we could ever put a price on protecting students,' said Andy Meyer, principal of Vancouver's Skyview High School. 'Anytime we learn of something like that and we can intervene, we feel that is very positive.' Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations. 'That's not good at all,' Foster said after learning the district inadvertently released the records. 'But what are my options? What do I do? Pull my kid out of school?' Foster says she'd be upset if her daughter's private information was compromised. 'At the same time,' she said, 'I would like to avoid a school shooting or suicide.' 
How student surveillance works Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years — approximately the cost of employing one extra counselor. The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check. A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately. 'A lot of times, families don't know. We open that door for that help,' the counselor said. Gaggle is 'good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.' Seattle Times and AP reporters saw what kind of writing set off Gaggle's alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren't protected by a password. After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer. 
The company says the links must be accessible without a login during those 72 hours so emergency contacts — who often receive these alerts late at night on their phones — can respond quickly. In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends. Foster's daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal's office after writing a short story featuring a scene with mildly violent imagery. 'I'm glad they're being safe about it, but I also think it can be a bit much,' Bryn said. School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly. 'It allows me the opportunity to meet with a student I maybe haven't met before and build that relationship,' said Chele Pierce, a Skyview High School counselor. Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district's enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert. While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There's no independent research showing it measurably lowers student suicide rates or reduces violence. A 2023 RAND study found only 'scant evidence' of either benefits or risks from AI surveillance, concluding: 'No research to date has comprehensively examined how these programs affect youth suicide prevention.' 
'If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,' said report co-author Benjamin Boudreaux, an AI ethics researcher. LGBTQ+ students are most vulnerable In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria. LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and turn to the internet for support. 'We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,' said Katy Pearce, a University of Washington professor who researches technology in authoritarian states. In one screenshot, a Vancouver high schooler wrote in a Google survey form they'd been subject to trans slurs and racist bullying. Who created this survey is unclear, but the person behind it had falsely promised confidentiality: 'I am not a mandated reporter, please tell me the whole truth.' When North Carolina's Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful. But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive. Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers promised a student confidentiality for an assignment related to mental health. A classmate was then 'blindsided' when Gaggle alerted school officials about something private they'd disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle. 'You can't just (surveil) people and not tell them. That's a horrible breach of security and trust,' said Thompson, now a college student, in an interview. 
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults. Parents don't really know The debate over privacy and security is complicated, and parents are often unaware it's even an issue. Pearce, the University of Washington professor, doesn't remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district's responsible use form before her son received a school laptop. Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class. For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking if his daughter could bring her personal laptop to school instead of being forced to use a district one because of privacy concerns. The district refused Reiland's request. When Reiland's daughter, Zoe, found out about Gaggle, she says she felt so 'freaked out' that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn't want to get called into the office for 'searching up lady parts.' 'I was too scared to be curious,' she said. School officials say they don't track metrics measuring the technology's efficacy but believe it has saved lives. Yet technology alone doesn't create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with 'deliberate indifference' to some families' reports of sexual harassment, mainly in the form of homophobic bullying. 
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide. When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology responded: 'This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying.' Long-term effects unknown Despite the risks, surveillance technology can help teachers intervene before a tragedy. A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former Superintendent Susan Enfield. 'They knew that the staff member was reading what they were writing,' Enfield said. 'It was, in essence, that student's way of asking for help.' Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support. 'The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,' said Boudreaux, the AI ethics researcher. Gaggle's Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, 'the school's going to be held liable,' he said. 'If you're looking for that open free expression, it really can't happen on the school system's computers.' ____ The Associated Press' education coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at

Associated Press
12-03-2025
- Associated Press
Schools use AI to monitor kids, hoping to prevent violence. Our investigation found security risks
One student asked a search engine, 'Why does my boyfriend hit me?' Another threatened suicide in an email to an unrequited love. A gay teen opened up in an online diary about struggles with homophobic parents, writing they just wanted to be themselves. In each case and thousands of others, surveillance software powered by artificial intelligence immediately alerted Vancouver Public Schools staff in Washington state. Vancouver and many other districts around the country have turned to technology to monitor school-issued devices 24/7 for any signs of danger as they grapple with a student mental health crisis and the threat of shootings. The goal is to keep children safe, but these tools raise serious questions about privacy and security — as proven when Seattle Times and Associated Press reporters inadvertently received access to almost 3,500 sensitive, unredacted student documents through a records request about the district's surveillance technology. ___ The Education Reporting Collaborative, a coalition of eight newsrooms, is investigating the unintended consequences of AI-powered surveillance at schools. Members of the Collaborative are The Associated Press, The Christian Science Monitor, The Dallas Morning News, The Hechinger Report, Idaho Education News, The Post and Courier in South Carolina, and The Seattle Times. ___ The released documents show students use these laptops for more than just schoolwork; they are coping with angst in their personal lives. Students wrote about depression, heartbreak, suicide, addiction, bullying and eating disorders. There are poems, college essays and excerpts from role-play sessions with AI chatbots. Vancouver school staff and anyone else with links to the files could read everything. Firewalls or passwords didn't protect the documents, and student names were not redacted, which cybersecurity experts warned was a massive security risk. 
The monitoring tools often helped counselors reach out to students who might have otherwise struggled in silence. But the Vancouver case is a stark reminder of surveillance technology's unintended consequences in American schools. In some cases, the technology has outed LGBTQ+ children and eroded trust between students and school staff, while failing to keep schools completely safe.

Gaggle Safety Management, the company that developed the software that tracks Vancouver students' online activity, believes not monitoring children is like letting them loose on 'a digital playground without fences or recess monitors,' CEO and founder Jeff Patterson said.

Roughly 1,500 school districts nationwide use Gaggle's software to track the online activity of approximately 6 million students. It's one of many companies, like GoGuardian and Securly, that promise to keep kids safe through AI-assisted web surveillance. The technology has been in high demand since the pandemic, when nearly every child received a school-issued tablet or laptop. According to a U.S. Senate investigation, over 7,000 schools or districts used GoGuardian's surveillance products in 2021.

Vancouver schools apologized for releasing the documents. Still, the district emphasizes Gaggle is necessary to protect students' well-being.

'I don't think we could ever put a price on protecting students,' said Andy Meyer, principal of Vancouver's Skyview High School. 'Anytime we learn of something like that and we can intervene, we feel that is very positive.'

Dacia Foster, a parent in the district, commended the efforts to keep students safe but worries about privacy violations.

'That's not good at all,' Foster said after learning the district inadvertently released the records. 'But what are my options? What do I do? Pull my kid out of school?'

Foster says she'd be upset if her daughter's private information was compromised. 'At the same time,' she said, 'I would like to avoid a school shooting or suicide.'
How student surveillance works

Gaggle uses a machine-learning algorithm to scan what students search or write online via a school-issued laptop or tablet 24 hours a day, or whenever they log into their school account on a personal device. The latest contract Vancouver signed, in summer 2024, shows a price of $328,036 for three school years — approximately the cost of employing one extra counselor.

The algorithm detects potential indicators of problems like bullying, self-harm, suicide or school violence and then sends a screenshot to human reviewers. If Gaggle employees confirm the issue might be serious, the company alerts the school. In cases of imminent danger, Gaggle calls school officials directly. In rare instances where no one answers, Gaggle may contact law enforcement for a welfare check.

A Vancouver school counselor who requested anonymity out of fear of retaliation said they receive three or four student Gaggle alerts per month. In about half the cases, the district contacts parents immediately.

'A lot of times, families don't know. We open that door for that help,' the counselor said. Gaggle is 'good for catching suicide and self-harm, but students find a workaround once they know they are getting flagged.'

Seattle Times and AP reporters saw what kind of writing set off Gaggle's alerts after requesting information about the type of content flagged. Gaggle saved screenshots of activity that set off each alert, and school officials accidentally provided links to them, not realizing they weren't protected by a password.

After learning about the records inadvertently released to reporters, Gaggle updated its system. Now, after 72 hours, only those logged into a Gaggle account can view the screenshots. Gaggle said this feature was already in the works but had not yet been rolled out to every customer.
The company says the links must be accessible without a login during those 72 hours so emergency contacts — who often receive these alerts late at night on their phones — can respond quickly.

In Vancouver, the monitoring technology flagged more than 1,000 documents for suicide and nearly 800 for threats of violence. While many alerts were serious, many others turned out to be false alarms, like a student essay about the importance of consent or a goofy chat between friends.

Foster's daughter Bryn, a Vancouver School of Arts and Academics sophomore, was one such false alarm. She was called into the principal's office after writing a short story featuring a scene with mildly violent imagery.

'I'm glad they're being safe about it, but I also think it can be a bit much,' Bryn said.

School officials maintain alerts are warranted even in less severe cases or false alarms, ensuring potential issues are addressed promptly.

'It allows me the opportunity to meet with a student I maybe haven't met before and build that relationship,' said Chele Pierce, a Skyview High School counselor.

Between October 2023 and October 2024, nearly 2,200 students, about 10% of the district's enrollment, were the subject of a Gaggle alert. At the Vancouver School of Arts and Academics, where Bryn is a student, about 1 in 4 students had communications that triggered a Gaggle alert.

While schools continue to use surveillance technology, its long-term effects on student safety are unclear. There's no independent research showing it measurably lowers student suicide rates or reduces violence. A 2023 RAND study found only 'scant evidence' of either benefits or risks from AI surveillance, concluding: 'No research to date has comprehensively examined how these programs affect youth suicide prevention.'
'If you don't have the right number of mental health counselors, issuing more alerts is not actually going to improve suicide prevention,' said report co-author Benjamin Boudreaux, an AI ethics researcher.

LGBTQ+ students are most vulnerable

In the screenshots released by Vancouver schools, at least six students were potentially outed to school officials after writing about being gay, transgender or struggling with gender dysphoria.

LGBTQ+ students are more likely than their peers to suffer from depression and suicidal thoughts, and to turn to the internet for support.

'We know that gay youth, especially those in more isolated environments, absolutely use the internet as a life preserver,' said Katy Pearce, a University of Washington professor who researches technology in authoritarian states.

In one screenshot, a Vancouver high schooler wrote in a Google survey form that they'd been subjected to trans slurs and racist bullying. Who created the survey is unclear, but the person behind it had falsely promised confidentiality: 'I am not a mandated reporter, please tell me the whole truth.'

When North Carolina's Durham Public Schools piloted Gaggle in 2021, surveys showed most staff members found it helpful. But community members raised concerns. An LGBTQ+ advocate reported to the Board of Education that a Gaggle alert about self-harm had led to a student being outed to their family, who were not supportive.

Glenn Thompson, a Durham School of the Arts graduate, spoke up at a board meeting during his senior year. One of his teachers had promised a student confidentiality for an assignment related to mental health. A classmate was then 'blindsided' when Gaggle alerted school officials about something private they'd disclosed. Thompson said no one in the class, including the teacher, knew the school was piloting Gaggle.

'You can't just (surveil) people and not tell them. That's a horrible breach of security and trust,' said Thompson, now a college student, in an interview.
After hearing about these experiences, the Durham Board of Education voted to stop using Gaggle in 2023. The district ultimately decided it was not worth the risk of outing students or eroding relationships with adults.

Parents don't really know

The debate over privacy and security is complicated, and parents are often unaware it's even an issue. Pearce, the University of Washington professor, doesn't remember reading about Securly, the surveillance software Seattle Public Schools uses, when she signed the district's responsible use form before her son received a school laptop.

Even when families learn about school surveillance, they may be unable to opt out. Owasso Public Schools in Oklahoma has used Gaggle since 2016 to monitor students outside of class.

For years, Tim Reiland, the parent of two teenagers, had no idea the district was using Gaggle. He found out only after asking, over privacy concerns, whether his daughter could bring her personal laptop to school instead of being forced to use a district one. The district refused Reiland's request.

When Reiland's daughter, Zoe, found out about Gaggle, she says she felt so 'freaked out' that she stopped Googling anything personal on her Chromebook, even questions about her menstrual period. She didn't want to get called into the office for 'searching up lady parts.'

'I was too scared to be curious,' she said.

School officials say they don't track metrics measuring the technology's efficacy but believe it has saved lives. Yet technology alone doesn't create a safe space for all students. In 2024, a nonbinary teenager at Owasso High School named Nex Benedict died by suicide after relentless bullying from classmates. A subsequent U.S. Department of Education Office for Civil Rights investigation found the district responded with 'deliberate indifference' to some families' reports of sexual harassment, mainly in the form of homophobic bullying.
During the 2023-24 school year, the Owasso schools received close to 1,000 Gaggle alerts, including 168 alerts for harassment and 281 for suicide.

When asked why bullying remained a problem despite surveillance, Russell Thornton, the district's executive director of technology, responded: 'This is one tool used by administrators. Obviously, one tool is not going to solve the world's problems and bullying.'

Long-term effects unknown

Despite the risks, surveillance technology can help teachers intervene before a tragedy. A middle school student in the Seattle-area Highline School District who was potentially being trafficked used Gaggle to communicate with campus staff, said former Superintendent Susan Enfield.

'They knew that the staff member was reading what they were writing,' Enfield said. 'It was, in essence, that student's way of asking for help.'

Still, developmental psychology research shows it is vital for teens to have private spaces online to explore their thoughts and seek support.

'The idea that kids are constantly under surveillance by adults — I think that would make it hard to develop a private life, a space to make mistakes, a space to go through hard feelings without adults jumping in,' said Boudreaux, the AI ethics researcher.

Gaggle's Patterson says school-issued devices are not the appropriate place for unlimited self-exploration. If that exploration takes a dark turn, such as making a threat, 'the school's going to be held liable,' he said. 'If you're looking for that open free expression, it really can't happen on the school system's computers.'
Yahoo
04-03-2025
- Automotive
- Yahoo
A Tesla Supercharging station was engulfed in flames. Police suspect arson.
Massachusetts police are investigating seven Tesla charging stations that were set on fire. Police told BI that the charging stations were the only ones in the small town outside Boston. Investigators suspect the fires were intentional.

A group of Tesla Superchargers in a small town outside Boston caught fire this week — and investigators suspect it was arson.

Massachusetts police are working with local officials to investigate seven Tesla charging stations that were engulfed in flames early Monday morning. Littleton Police said in a Monday press release that local officials have "determined that the fire appears to have been intentionally set."

Littleton Police Deputy Chief Jeff Patterson told Business Insider that the seven charging stations that were damaged are the only ones in the town, and none of them are usable. However, he said they are actively being repaired. Tesla's charging account on X responded to a post about the incident on Monday and said the charging posts and wiring would be replaced in under 48 hours.

Police Chief Matthew Pinard said that officers were dispatched to The Point Shopping Center at 1:10 a.m., following reports of fires at the Tesla charging station. The officers said that "several Tesla charging stations were engulfed in flames and heavy, dark smoke" and another caught fire while they waited for the Electric Light & Water Department to arrive to shut down the power.

Seven charging stations suffered heavy fire-related damage. There were no reported injuries, and all of the fires were extinguished. The Littleton police and fire departments, along with the Massachusetts State Police Fire and Explosion Investigation Unit, are investigating the incident. The Arson Watch Reward Program is offering rewards of up to $5,000 for information about the incident.

Patterson told BI that he wasn't aware of any Tesla protests or vandalism incidents in the town.
There have been dozens of demonstrations against Elon Musk and Tesla around the country in recent weeks in response to the Tesla CEO's efforts with the Trump administration and DOGE. Demonstrators have gathered in cities around the country to participate in "Tesla Takedown" protests, many of which have occurred outside Tesla showrooms. Some Tesla owners have also reported being subjected to insults while driving, or to vandalism of their vehicles.

Some of the anti-Tesla and Musk efforts have resulted in arrests. Colorado police arrested a woman last week on suspicion of involvement in a series of vandalism incidents at a Tesla dealership, including painting "Nazi cars" in graffiti on the dealership building and throwing Molotov cocktails at vehicles. The suspect was charged with criminal intent to commit a felony, criminal mischief, and using explosives or incendiary devices during a felony, according to police records.

Nine people were also arrested at a Tesla showroom protest in Manhattan on Saturday, Reuters reported. Police said hundreds of people showed up to the protest, some of whom entered the building, prompting employees to close the store. Videos from the protest also showed some of the store's glass shattered.

Are you a Tesla driver or employee with a story to share? Contact the reporter at aaltchek@

Read the original article on Business Insider