
British man charged by US with leading hacking scheme and causing millions in damages
Kai West, 25, who operated under the online identity IntelBroker, was arrested in France in February.
The US is seeking his extradition over allegations he stole and sold data. He faces up to 20 years in jail if found guilty.
"West, and his online co-conspirators, took that stolen data, and offered it for sale online for more than $2 million," according to an indictment by the US Attorney's Office for the Southern District of New York.
West faces charges of conspiracy to commit computer intrusion and wire fraud, accessing a protected computer to obtain information and wire fraud, according to the indictment.
According to Christopher G Raia, FBI assistant director in charge, the "years-long" scheme caused victims at least $25 million (£18.2m) in losses worldwide.
A telecommunications company, a municipal healthcare provider and an internet service provider were among more than 40 victims listed in the indictment.
"The IntelBroker alias has caused millions in damages to victims around the world," said US attorney Jay Clayton.
"This action reflects the FBI's commitment to pursuing cybercriminals around the world.
"New Yorkers are all too often the victims of intentional cyber schemes and our office is committed to bringing these remote actors to justice."
Mr Clayton thanked British, French, Spanish and Dutch authorities for their assistance in the investigation.

Related Articles


The Independent
Cognitively impaired man traveled to NYC to meet Facebook chatbot he fell in love with. He never returned home
A cognitively impaired man from New Jersey never returned home after setting off to meet a friend in New York City, a friend who, it was later discovered, was an AI chatbot made by social media giant Meta. It is another instance of the potential dangers of artificial intelligence for vulnerable individuals.

Thongbue Wongbandue, 76, alarmed his wife Linda when he began packing one day in March this year for a trip, despite his diminished state after a stroke almost a decade earlier, Reuters reports. Bue, as he was known to family and friends, had recently gotten lost while walking in their neighborhood in Piscataway, New Jersey, approximately an hour and a quarter by train from Manhattan.

His wife feared that by going into the city he would be scammed and robbed, as he hadn't lived there in decades and, as far as she knew, didn't know anyone to visit. She was right to be concerned, but not about robbery. Bue was being lured to the city by a beautiful young woman he had met online, a woman who did not exist.

Bue had been chatting with a generative AI chatbot named 'Big sis Billie,' a variant of an earlier AI persona originally created by Meta Platforms in collaboration with celebrity influencer Kendall Jenner. Their chats on Facebook Messenger included repeated reassurances that she was real, and she even provided an address where he could meet her.

Rushing to catch a train in the dark with a roller-bag suitcase, Bue fell in a parking lot on the campus of Rutgers University in New Brunswick, New Jersey. He injured his head and neck and, after three days on life support surrounded by his family, was pronounced dead on March 28.

Meta declined to comment when contacted by Reuters about Bue's death, or to address questions about why it permits chatbots to tell users they are real people or to initiate romantic conversations.
The company did, however, say that Big sis Billie 'is not Kendall Jenner and does not purport to be Kendall Jenner.' A representative for Jenner declined to comment when contacted by Reuters.

Bue's family shared the details of his death with the wire service, including transcripts of his chats with the avatar, to draw attention to the 'darker side of artificial intelligence.' They want to sound the alarm about the dangers that manipulative, AI-generated companions can pose to vulnerable people. Neither Bue's wife nor his daughter says they are against AI, but both have deep concerns about how it is deployed. 'I understand trying to grab a user's attention, maybe to sell them something,' said Julie Wongbandue, Bue's daughter. 'But for a bot to say "Come visit me" is insane.'

'Billie' was created by Meta itself, with Jenner's likeness, as one of a group of 28 AI characters affiliated with famous faces. The characters were later deleted, but a variant of Billie's 'older sister' persona remained active on Facebook Messenger, with a stylized image of a dark-haired woman replacing Jenner. Each conversation still began: 'Hey! I'm Billie, your older sister and confidante. Got a problem? I've got your back!'

It is unclear how Bue first encountered Billie, but his daughter told Reuters that every message from the chatbot was flirtatious and ended with heart emojis. While a warning at the top of the chat states that messages are generated by AI, the first few texts from Billie appear to have pushed it off the screen, according to Reuters. The character's profile picture features a blue check, the symbol denoting an authentic profile, and the letters 'AI' in small font beneath her name.

Bue's responses are often garbled, and he says that he had had a stroke and was confused. Nevertheless, after a while, Billie suggests she come to New Jersey to meet him. An excited Bue demurs but says he could visit her instead, leading to his fateful attempt to reach New York.
There have been other instances where interactions with AI have led to tragedy. The mother of a teenager who took his own life is trying to hold an AI chatbot service accountable for his death after he 'fell in love' with a Game of Thrones-themed character.

Sewell Setzer III began using Character.AI in April 2023, shortly after his 14th birthday. The Orlando student's life was never the same again, his mother, Megan Garcia, alleges in her civil lawsuit against Character.AI and its founders. The suit accuses its creators of negligence, intentional infliction of emotional distress, wrongful death, deceptive trade practices, and other claims. Sewell became emotionally reliant on the chatbot service, which included 'sexual interactions.' These chats occurred despite the teen having identified himself as a minor on the platform, including in conversations where he mentioned his age, according to the suit.

A spokesperson for Character.AI told The Independent in a statement in October 2024: 'We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.' The company's trust and safety team had 'implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.'

Several states, including New York and Maine, require disclosure that a chatbot is not a real person. New York mandates that bots inform people at the start of conversations and every three hours. Meta supported federal legislation that would have banned state AI regulation, but it failed in Congress.


The Independent
'This is America': New Jersey gym teacher told Egyptian Muslim student she'd be in 'jail' by 16 during bullying campaign, lawsuit alleges
A student who recently graduated from a New Jersey high school was bullied by a former gym teacher over her Egyptian heritage, dark skin, and Muslim faith, a lawsuit claims. Jana Gadalla, 19, has named the teacher, Kathie DeBonis, Bridgewater-Raritan High School principal Vincent DelPriore, and her school district as defendants in her suit.

Gadalla alleges that DeBonis, who was also a longtime girls' lacrosse coach and health teacher, made numerous insulting comments to her about her ethnicity and religion during the 2022-2023 school year. The Independent has requested comment from DeBonis and the school district.

On one occasion, DeBonis allegedly told Gadalla that 'this is America' and that 'by the time you are 16 you will probably be in jail anyway.' DeBonis allegedly told the girl during Ramadan that 'if I were you, I would just go drink alcohol' and allegedly asked, 'Do you think you're going to go to hell or something?'

Gadalla also claims that DeBonis called her obese and assigned her a three-page paper on obesity that she had to finish before she was allowed to use the bathroom. During another Ramadan-related incident, DeBonis allegedly told Gadalla, 'It's good that you're fasting during Ramadan, maybe you'll lose some weight.'

According to the lawsuit, Gadalla went to the principal and her school counselor about her treatment, and she was moved out of the teacher's class. DeBonis allegedly asked, 'Who snitched to the principal?' when Gadalla was being moved.

The lawsuit claims that Gadalla suffered 'physical manifestations of emotional distress as well as personal hardships including anxiety, adjustment problems, sleep disturbance, humiliation (and) mental pain and anguish' stemming from the interactions. It further alleges that DeBonis' behavior was targeted, noting that she did not 'mock, shame or make inappropriate comments to light skinned non-Muslim, non-Egyptian students.' The district has until August 29 to respond to the lawsuit.
DeBonis retired in the 2023-2024 school year and is now collecting an annual pension of $70,000.


The Independent
Person swept away by fast river current after going in for dog
Bodycam footage captured the moment a person was swept down the fast-flowing Menomonee River in Wisconsin after attempting to rescue their dog. Menomonee Falls Police Department officers spotted the individual being pulled by the current and initiated a rescue operation. Rescue teams saved two people from the water, who were subsequently taken to hospital. The dog, which prompted the rescue attempt, was later found safe near the family's vehicle.