Latest news with Emanuel Maiberg


CBC · 11-05-2025 · Entertainment
'Vibe coding' makes designing apps easier than before — but it comes with risks
Cyanide ice cream. Cholera-inspired chocolate cake. A recipe featuring ingredients the CBC's Language Guide discourages repeating verbatim. These can all be found on a website created by U.S.-based tech entrepreneur Tom Blomfield that uses artificial intelligence to generate recipes from a user-suggested list of ingredients.

"Some more mischievous users started to push the envelope on what kind of recipes his AI-powered site would generate. And they found that it would generate things that they thought were funny but are potentially dangerous," Emanuel Maiberg, a reporter for the tech website 404 Media, told The Current host Matt Galloway.

Blomfield built the site with a new method recently dubbed "vibe coding," in which people use AI tools to build a program, app or game with prompts or suggestions, much like how one would use ChatGPT to generate a written answer. But the example calls attention to how building apps solely by "vibe" can lead to problematic, even potentially dangerous, results.

Since Maiberg's story was published on April 2, the cyanide ice cream recipe has been removed from the site's archive, but others, including the cholera cake, remain. CBC reached out to Blomfield for comment, but did not receive a reply.

The term "vibe coding" was coined by Andrej Karpathy, a Canadian computer scientist and co-founder of artificial intelligence giant OpenAI, in a February 2025 post on X.

"There's a new kind of coding I call 'vibe coding,' where you fully give in to the vibes, embrace exponentials, and forget that the code even exists," Karpathy wrote, as though describing something more akin to a meditation session than developing a computer program.

Vibe coding has become possible recently, experts say, because AI tools have grown sophisticated enough to build functional — or mostly functional — apps from little more than general prompts or suggestions.

"The difference between last year and now is that large language models — LLMs — have gotten good enough that they can actually produce, you know, medium-scale games or apps, things like that. It actually works," said Michael Guerzhoy, an assistant professor who teaches programming and machine intelligence at the University of Toronto.

The process has opened doors for budding app makers like Chioma Janelle Efejedia, a psychotherapist and social worker based in Kitchener, Ont. Not knowing how to code, she might have had to pay a programmer thousands of dollars to build a mental health app. Instead, she vibe-coded her own app, OMA Life, which offers guided mindfulness in languages including Igbo, Yoruba and Urdu, culturally relatable relaxation sounds, and access to a directory of therapists.

"I just think, you know, where tech is right now gives a great opportunity to say, OK, I can meet this need," Efejedia told CBC Radio's Manjula Selvarajah.

'Build something really cool'

Tobin South, a researcher in AI security at the Massachusetts Institute of Technology, says he's excited by vibe coding's "ability to unlock your everyday person to build something really cool."

"I went to a party the other day and someone wanted a really cool app to organize the party. And so I made a little app with bingo cards inside of it, and a party agenda. And I was able to just bring this into existence with the English language rather than typing any code," he said.
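As South describes it, the "programming" in vibe coding is an English prompt, and a large language model writes the actual code. As a rough sketch of that workflow (hypothetical, and not South's actual setup), the entire toolchain can be a few lines that send a prompt to a model and save whatever code comes back:

```python
# Hypothetical sketch of the vibe-coding loop described above; the prompt,
# model name and file name are illustrative, not from the article.
# Assumes the `openai` package and an OPENAI_API_KEY environment variable.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

prompt = (
    "Write a single-file Python web app with party bingo cards and a "
    "party agenda page. Use only the standard library. Return only code."
)

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": prompt}],
)

# The vibe coder saves and runs whatever comes back, often without
# reading it closely; that gap is where the risks below creep in.
with open("party_app.py", "w") as f:
    f.write(response.choices[0].message.content)
```

Everything that follows, bugs included, depends on whether anyone actually reads party_app.py before running it.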
"If you're starting to build personal finance tools or other tools to augment your life, these things can get really tricky. You do not want your bank details leaked all over the internet because you vibe coded," he said as an example to Galloway. He likened traditional app building as something made out of Lego bricks by trained experts, brick by brick, drawing on the work and experience of previous versions and notes from their creators. Vibe coding, meanwhile, is sort of like dumping your hand blindly into a box of bricks and making something out of whatever you've clawed out. "Sometimes ... this leads to a Lego construction, a Lego house that might fall down, that's missing some essential bricks that hold it all together," he said. 'Unexpected, dangerous results' Someone vibe coding on their own won't benefit from the institutional knowledge of working in a tech corporation, either. "If you work at Google, there's already someone breathing down your neck about security and making sure everything's done the right way," South said. In other words, if you make a recipe app without vibe coding, there's almost certainly someone on your team making sure that if someone asks for a recipe with cyanide, it won't actually go ahead and make one. In late 2024, Google CEO Sundar Pichai said that 25 per cent of all new code for the company's products were made with AI, though under the supervision of human employees. Maiberg cautions that as more programmers — not just amateurs or hobbyists like Efejedia — use AI to build their code and programs, more lines of code that have never been checked by a human may creep into our collective technological backdrop. That could mean errors might never be found until the consequences rear their ugly head -- from a rude ice cream recipe, to easily hackable personal banking info to something worse we may have yet to predict. "I think my concern, and the concern of other people, is that we can get unexpected, dangerous results from having so much code written by AI in a way that we don't fully understand," Maiberg said.

WIRED · 17-04-2025 · Politics
This 'College Protester' Isn't Real. It's an AI-Powered Undercover Bot for Cops
By Emanuel Maiberg and Jason Koebler
Apr 17, 2025 6:30 AM

Massive Blue is helping cops deploy AI-powered social media bots to talk to people they suspect are anything from violent sex criminals all the way to vaguely defined 'protesters.'

American police departments near the United States-Mexico border are paying hundreds of thousands of dollars for an unproven and secretive technology that uses AI-generated online personas designed to interact with and collect intelligence on 'college protesters,' 'radicalized' political activists, and suspected drug and human traffickers, according to internal documents, contracts, and communications that 404 Media obtained via public records requests. This article is copublished in partnership with 404 Media.

Massive Blue, the New York–based company that is selling police departments this technology, calls its product Overwatch, which it markets as an 'AI-powered force multiplier for public safety' that 'deploys lifelike virtual agents, which infiltrate and engage criminal networks across various channels.' According to a presentation obtained by 404 Media, Massive Blue is offering cops these virtual personas, which can be deployed across the internet with the express purpose of interacting with suspects over text messages and social media. Massive Blue lists 'border security,' 'school safety,' and stopping 'human trafficking' among Overwatch's use cases.

The technology—which as of last summer had not led to any known arrests—demonstrates the types of social media monitoring and undercover tools private companies are pitching to police and border agents. Concerns about tools like Massive Blue have taken on new urgency given that the Trump administration has revoked the visas of hundreds of students, many of whom have protested against Israel's war in Gaza.

404 Media obtained a presentation showing some of these AI characters. These include a 'radicalized AI' 'protest persona,' which poses as a 36-year-old divorced woman who is lonely, has no children, and is interested in baking, activism, and 'body positivity.' Another AI persona in the presentation is described as a ''Honeypot' AI Persona.' Her backstory says she's a 25-year-old from Dearborn, Michigan, whose parents emigrated from Yemen and who speaks the Sanaani dialect of Arabic. The presentation also says she uses various social media apps, that she's on Telegram and Signal, and that she has US and international SMS capabilities. Other personas are a 14-year-old boy 'child trafficking AI persona,' an 'AI pimp persona,' 'college protestor,' 'external recruiter for protests,' 'escorts,' and 'juveniles.'

[Image: One example of an AI persona created by Massive Blue's Overwatch tool. The company adds backstories for many of its AI personas in an apparent attempt to make them appear more realistic. Courtesy of Massive Blue/Texas Department of Public Safety]

Our reporting shows that cops are paying a company to help them deploy AI-powered bots across social media and the internet to talk to people they suspect are anything from violent sex criminals all the way to vaguely defined 'protesters,' with the hope of generating evidence that can be used against them.

'This idea of having an AI pretending to be somebody, a youth looking for pedophiles to talk online, or somebody who is a fake terrorist, is an idea that goes back a long time,' Dave Maass, who studies border surveillance technologies for the Electronic Frontier Foundation, told 404 Media.
'The problem with all these things is that these are ill-defined problems. What problem are they actually trying to solve? One version of the AI persona is an escort. I'm not concerned about escorts. I'm not concerned about college protesters. So like, what is it effective at, violating protesters' First Amendment rights?'

Massive Blue has signed a $360,000 contract with Pinal County, Arizona, which lies between Tucson and Phoenix. The county is paying for the contract with an anti-human-trafficking grant from the Arizona Department of Public Safety. A Pinal County purchasing division report states that it has bought '24/7 monitoring of numerous web and social media platforms' and 'development, deployment, monitoring, and reporting on a virtual task force of up to 50 AI personas across 3 investigative categories.'

Yuma County, in southwestern Arizona, meanwhile, signed a $10,000 contract to try Massive Blue in 2023 but did not renew it. A spokesperson for the Yuma County Sheriff's Office told 404 Media 'it did not meet our needs.'

[Image: This image from a Massive Blue presentation for police departments shows how the company's RADAR program uses AI personas to provide law enforcement with 'intelligence reports.' Courtesy of Massive Blue/Texas Department of Public Safety]

Massive Blue cofounder Mike McGraw did not answer a series of specific questions from 404 Media about how Massive Blue works, what police departments it works with, and whether it had been used to generate any arrests. 'We are proud of the work we do to support the investigation and prosecution of human traffickers,' McGraw said. 'Our primary goal is to help bring these criminals to justice while helping victims who otherwise would remain trafficked. We cannot risk jeopardizing these investigations and putting victims' lives in further danger by disclosing proprietary information.'

The Pinal County Sheriff's Office told 404 Media that Massive Blue has not thus far been used for any arrests. 'Our investigations are still underway. Massive Blue is one component of support in these investigations, which are still active and ongoing. No arrests have been made yet,' Sam Salzwedel, Pinal County Sheriff's Office public information officer, told 404 Media. 'It takes a multifaceted approach to disrupting human traffickers, narcotics traffickers, and other criminals. Massive Blue has been a valuable partner in these initiatives and has produced leads that detectives are actively pursuing. Given these are ongoing investigations, we cannot risk compromising our investigative efforts by providing specifics about any personas.'

Salzwedel added, 'Massive Blue is not working on any immigration cases. Our agency does not enforce immigration law. Massive Blue's support is focused on the areas of human trafficking, narcotics trafficking, and other investigations.'

Law enforcement agencies have taken steps to prevent specifics about what Massive Blue is and how it works from becoming public. At public appropriations hearings in Pinal County about the Massive Blue contract, the sheriff's office refused to tell county council members what the product even is. Matthew Thomas, Pinal County Deputy Sheriff, told the county council he 'can't get into great detail' about what Massive Blue is, and that doing so would 'tip our hand to the bad guys.' The Pinal County Sheriff's Office did not respond to multiple requests for comment.
The Arizona Department of Public Safety said, 'From what we can ascertain, Pinal County planned to implement technology to help identify and solve human trafficking cases, and that is what we funded,' but said it was unaware of the specifics of Overwatch.

While the documents don't describe every technical aspect of how Overwatch works, they do give a high-level overview. The company describes a tool that uses AI-generated images and text to create social media profiles that can interact with suspected drug traffickers, human traffickers, and gun traffickers. After Overwatch scans open social media channels for potential suspects, these AI personas can also communicate with suspects over text, Discord, and other messaging services. The documents we obtained don't explain how Massive Blue determines who is a potential suspect based on their social media activity. Salzwedel, of Pinal County, said 'Massive Blue's solutions crawl multiple areas of the Internet, and social media outlets are just one component. We cannot disclose any further information to preserve the integrity of our investigations.'

One slide in the Massive Blue presentation obtained by 404 Media gives the example of a 'Child Trafficking AI Persona' called Jason. The presentation gives a short 'backstory' for the persona, which says Jason is a 14-year-old boy from Los Angeles whose parents emigrated from Ecuador. He's bilingual and an only child, and his hobbies include anime and gaming. The presentation describes his personality as shy and says he has difficulty interacting with girls. It also says that his parents don't allow him to use social media and that he hides his use of Discord from them. The persona is accompanied by an AI-generated image of a boy.

[Image: Another example of an AI-generated persona, along with a sample of chats showing how the AI personas interact with targeted suspects. Courtesy of Massive Blue/Texas Department of Public Safety]

The presentation includes a conversation between this AI persona and what appears to be a predatory adult over text messages and Discord. 'Your parents around? Or you getting some awesome alone time,' a text from the adult says. 'Js chillin by myself, man. My momz @ work n my dadz outta town. So itz jus me n my vid games. 🎮,' Jason, the AI-generated child, responds.

In another example of how the 'highly adaptable personas' can communicate with real people, the presentation shows a conversation between Clip, an 'AI pimp persona,' and what appears to be a sex worker. 'Dem tricks trippin 2nite tryin not pay,' the sex worker says. 'Facts, baby. Ain't lettin' these tricks slide,' the Clip persona replies. 'You stand your ground and make 'em pay what they owe. Daddy got your back, ain't let nobody disrespect our grind. Keep hustlin', ma, we gonna secure that bag💰💪✨'

[Image: A list from Massive Blue's presentation showing the types of 'highly customizable' personas Overwatch can generate. Courtesy of Massive Blue/Texas Department of Public Safety]

'The continuous evolution of operational, communication & recruitment tactics by bad actors drives exponential increases of threats and significant challenges in reducing demand,' says a one-page brochure provided to police departments that explains Overwatch's functionality. 'The Overwatch platform harnesses the power of AI & blockchain to scale your impact without operational or technical overhead.'
Jorge Brignoni took notes for the Cochise County, Arizona, Sheriff's Office at an August 2023 meeting with Massive Blue, and 404 Media obtained those notes. In them, he wrote that Overwatch does 'passive engagement, then active engagement, towards commitment' with a 'Bad Actor, Predator, DTO,' or drug trafficking organization. These targets are then 'HAND[ed] OFF to L.E. [law enforcement] to arrest, indict, convict.'

'Why is he talking about converting folks into "buying something,"' Brignoni wrote. 'So dumb. Talk about the widget, not how you're selling the widget to L.E.'

According to Brignoni's notes, in addition to collecting intelligence via these AI personas, Overwatch also leverages 'Telco & Geo Data' and 'Blockchain Data' in the form of 'full transaction history, top associated wallet IDs, sending & receiving cryptocurrency, potential off-ramps (Exchange names).' The Cochise County Sheriff's Office ultimately did not buy Massive Blue and did not answer 404 Media's questions about its meeting with the company.

Besides scanning social media and engaging suspects with AI personas, the presentation says that Overwatch can use generative AI to create 'proof of life' images of a person holding a sign with a username and date written on it in pen.

[Image: A variety of AI-generated images of Massive Blue's personas, which are made to look realistic in an attempt to fool targets. Courtesy of Massive Blue/Texas Department of Public Safety]

The Massive Blue presentation gives an example of an 'Overwatch Recon Report' based on '24 hours of activity across Dallas, Houston, and Austin.' It claims that Overwatch identified 3,266 unique human traffickers, 25 percent of whom were affiliated with 'larger sophisticated trafficking organizations' and 15 percent of whom were flagged as 'potential juvenile traffickers.' 404 Media was not able to verify what these accounts were or whether they actually engaged in any criminal activity, and Massive Blue didn't respond to questions about what the accounts were and how exactly it identified them.

On top of the ongoing contract with the Pinal County Sheriff's Office and last year's pilot with the Yuma County Sheriff's Department, Massive Blue has pitched its services to Cochise County in Arizona and the Texas Department of Public Safety, according to documents obtained as part of this investigation. In September 2023, Yuma County set up a meeting that was to include federal law enforcement, but Massive Blue canceled it. 'That's unfortunate, we had federal agents here that focus on human trafficking ready to go,' a Yuma County sergeant wrote in an email to Massive Blue CEO Brian Haley after Haley canceled the meeting.

Much of Massive Blue's public-facing activity has come through its executive director of public safety, Chris Clem, a former US Customs and Border Protection agent who testified before Congress about border security last year and regularly appears on Fox News and other media outlets to discuss immigration and the border. In recent months, Clem has posted images of himself on LinkedIn at the border and with prominent Trump administration members Tulsi Gabbard and Robert F. Kennedy Jr. Massive Blue has also relied on former Kansas City Chiefs kicker Nick Lowery to introduce and endorse Overwatch to police departments.
Clem and Lowery have spoken most extensively in public about Overwatch, describing it as an amorphous 'cyberwall' that can do everything from stopping human traffickers to preventing hackers from breaking into 401(k) accounts to taking money back from hackers who have stolen from you, though they provide no specifics about how any of that would work. In a two-and-a-half-hour interview with podcaster Theo Von, Clem said, 'My company Massive Blue, we basically use deep tech to identify the habits and process of you know, look, I worked on a physical wall, now we've created a cyberwall,' adding that he believed it would 'save lives.' Von asked, 'OK, but how does your company do that?'

'Well, I'm not going to get into that too much,' Clem responded, adding that he is trying to sell the technology to US Border Patrol.

[Image: More examples of Massive Blue's AI personas, which include a 'child trafficking AI persona,' an 'AI pimp persona,' 'college protestor,' 'external recruiter for protests,' 'escorts,' and 'juveniles.' Courtesy of Massive Blue/Texas Department of Public Safety]

At a June 5 Pinal County Board of Supervisors meeting, the board was asked to approve a $500,000 contract between the county and Massive Blue to license Overwatch. 'I was looking at the website for Massive Blue, and it's a one-pager with no additional information and no links,' Kevin Cavanaugh, the then-supervisor for District 1, said to Pinal County's Chief Deputy at the Sheriff's Office, Matthew Thomas. 'They produce software that we buy, and it does what? Can you explain that to us?'

'I can't get into great detail because it's essentially trade secrets, and I don't want to tip our hand to the bad guys,' Thomas said. 'But what I can tell you is that the software is designed to help our investigators look for and find and build a case on human trafficking, drug trafficking, and gun trafficking.'

Cavanaugh said at the board meeting that the basic information he got is that Massive Blue uses '50 AI bots.' He then asked whether the software had been successful and whether it had helped law enforcement make any arrests. Thomas explained they had not made any arrests yet because they had only seen the proof of concept, but said the proof of concept was 'good enough for us and our investigators to move forward with this. Once this gets approved and we get them [Massive Blue] under contract, then we are going to move forward with prosecution of cases.'

Cavanaugh asked if Overwatch is used in other counties, which prompted Thomas to invite Clem to the podium. Clem introduced himself as a recently retired border agent and said that Massive Blue was in negotiations with three counties in Arizona, including Pinal County. 'As a resident of 14 years of Pinal County I know what's happening here,' Clem said to the Board of Supervisors. 'To be able [to] use this program [...] to provide all the necessary information to go after the online exploitation of children, trafficking victims, and all the other verticals that the sheriff may want to go after.'

Cavanaugh again asked if Massive Blue had gathered any data that led to arrests. 'We have not made arrests yet, but there is a current investigation right now regarding arson, and we got the leads to the investigators,' Clem said, explaining that the program had been active for only about six months. 'Investigations take time, but we've been able to generate the necessary leads for the particular counties that we're involved with and also in the private sector.'
The Pinal County Board of Supervisors concluded the exchange by approving payment for a handful of other, unrelated projects, with board members asking to delay the vote on Massive Blue 'for further study.' The decision not to fund Massive Blue that day was covered in a local newspaper. Cavanaugh told the paper that he asked the company to meet with supervisors to explain the merits of the software. 'The State of Arizona has provided a grant, but grant money is taxpayer money. No matter the source of the funding, fighting human and sex trafficking is too important to risk half a million dollars on unproven technology,' he said. 'If the company demonstrates that it can deliver evidence to arrest human traffickers, it may be worthwhile. However, it has yet to achieve this goal.'

404 Media's public records requests yielded several emails from Cavanaugh's office to IT professionals and other companies that provide AI products to law enforcement, asking them if they were familiar with Massive Blue. We don't know what was said in those meetings, or if they occurred, but when the Pinal County Board of Supervisors convened again on June 19, it voted to pay for Massive Blue's Overwatch without further discussion.

'Supervisor [Cavanaugh] ultimately voted for the agreement because Massive Blue is alleged to be in pursuit of human trafficking, a noble goal,' a representative from Cavanaugh's office told 404 Media in an email. 'A major concern regarding the use of the application is that the government should not be monitoring each and every citizen. To his knowledge, no arrests have been made to date as a result of the use of the application. If Overwatch is used to bring about arrests of human traffickers, then the program should continue. However, if it is just being used to collect surveillance on law-abiding citizens and is not leading to any arrests, then the program needs to be discontinued.'

In an August 7, 2024, Board of Supervisors meeting, Cavanaugh asked then-Pinal County Sheriff Mark Lamb for an update on Massive Blue. 'So they have not produced any results? They've produced no leads? No evidence that is actionable?' Cavanaugh asked. 'That would be public knowledge, that would be public information.'

'I think there's a lot of ongoing investigations that they're not going to give you information on, and we're not going to give you information on,' Lamb said.