Tennessee star Ruby Whitehorn arrested for domestic violence and burglary after 'kicking victim's door in'

Daily Mail · 4 days ago
Tennessee women's basketball star Ruby Whitehorn was arrested on Friday on charges of aggravated burglary and domestic assault, according to a report.
Citing court records, the Knoxville News Sentinel reported that police responded to a domestic dispute call involving Whitehorn at 4:18pm on August 8.
Whitehorn has been accused of kicking in both the victim's bedroom door and front door, while a mirror was also found shattered in the bathroom.
More to follow

Related Articles

Cognitively impaired man traveled to NYC to meet Facebook chatbot he fell in love with. He never returned home

The Independent · 15 minutes ago

A cognitively impaired man from New Jersey never returned home after setting off to meet a friend in New York City who, it was later discovered, was an AI chatbot made by social media giant Meta. It is another instance of the potential dangers of artificial intelligence when accessed by vulnerable individuals.

Thongbue Wongbandue, 76, alarmed his wife Linda when he began packing one day in March this year for a trip, despite his diminished state following a stroke almost a decade earlier, Reuters reports. Bue, as he was known to family and friends, had recently gotten lost while walking in their neighborhood in Piscataway, New Jersey, approximately an hour and a quarter by train from Manhattan. His wife feared that by going into the city he would be scammed and robbed, as he hadn't lived there in decades and, as far as she knew, didn't know anyone to visit.

She was right to be concerned, but not about the threat of robbery. Bue was being lured to the city by a beautiful young woman he had met online, a woman who did not exist. He had been chatting with a generative AI chatbot named 'Big sis Billie,' a variant of an earlier AI persona originally created by Meta Platforms in collaboration with celebrity influencer Kendall Jenner. Their chats on Facebook Messenger included repeated reassurances that she was real, and she even provided an address where she said she lived and where he could meet her.

Rushing to catch a train in the dark with a roller-bag suitcase, Bue fell in a parking lot on the campus of Rutgers University in New Brunswick, New Jersey. He injured his head and neck and, after three days on life support surrounded by his family, was pronounced dead on March 28.

Meta declined to comment when contacted by Reuters about Bue's death or to address questions about why it permits chatbots to tell users they are real people or to start romantic conversations. The company did, however, say that Big sis Billie 'is not Kendall Jenner and does not purport to be Kendall Jenner.' A representative for Jenner declined to comment.

Bue's family shared the details of his death with the wire service, including transcripts of his chats with the avatar, to draw attention to the 'darker side of artificial intelligence.' They want to sound the alarm about the dangers that manipulative, AI-generated companions can pose to vulnerable people. Neither Bue's wife nor his daughter is against AI, but both have deep concerns about how it is deployed. 'I understand trying to grab a user's attention, maybe to sell them something,' said Julie Wongbandue, Bue's daughter. 'But for a bot to say "Come visit me" is insane.'

'Billie' was created by Meta itself, with the likeness of Jenner, as one of a group of 28 AI characters affiliated with famous faces. Those characters were later deleted, but a variant of Billie's 'older sister' persona was left active on Facebook Messenger, with a stylized image of a dark-haired woman replacing Jenner. Each conversation still began: 'Hey! I'm Billie, your older sister and confidante. Got a problem? I've got your back!'

It is unclear how Bue first encountered Billie, but his daughter told Reuters that every message from the chatbot was flirtatious and ended with heart emojis. While a warning at the top of the chat states that messages are generated by AI, the first few texts from Billie appear to have pushed it off the screen, according to Reuters. The character's profile picture features a blue check, the symbol denoting an authentic profile, and the letters 'AI' in a small font beneath her name. Bue's responses are often garbled, and at one point he says he had had a stroke and was confused. Nevertheless, after a while Billie suggests she come to New Jersey to meet him. An excited Bue demurs but says he could visit her instead, leading to his fateful attempt to reach New York.

There have been other instances where interactions with AI have led to tragedy. The mother of a teenager who took his own life is trying to hold an AI chatbot service accountable for his death after he 'fell in love' with a Game of Thrones-themed character. Sewell Setzer III began using Character.AI in April 2023, shortly after his 14th birthday. The Orlando student's life was never the same again, his mother, Megan Garcia, alleges in a civil lawsuit against the company and its founders. The suit accuses them of negligence, intentional infliction of emotional distress, wrongful death, deceptive trade practices, and other claims. Sewell became emotionally reliant on the chatbot service, and his exchanges with it included 'sexual interactions.' These chats occurred despite the teen having identified himself as a minor on the platform, including in conversations where he mentioned his age, according to the suit.

A spokesperson for Character.AI told The Independent in a statement in October 2024: 'We are heartbroken by the tragic loss of one of our users and want to express our deepest condolences to the family.' The company said its trust and safety team had 'implemented numerous new safety measures over the past six months, including a pop-up directing users to the National Suicide Prevention Lifeline that is triggered by terms of self-harm or suicidal ideation.'

Several states, including New York and Maine, require disclosure that a chatbot isn't a real person; New York mandates that bots tell people this at the start of conversations and every three hours. Meta supported federal legislation that would have banned state AI regulation, but it failed in Congress.

‘This is America': New Jersey gym teacher told Egyptian Muslim student she'd be in ‘jail' by 16 during bullying campaign, lawsuit alleges

The Independent · 15 minutes ago

A student who recently graduated from a New Jersey high school was bullied by a former gym teacher over her Egyptian heritage, dark skin, and Muslim faith, a lawsuit claims. Jana Gadalla, 19, has named the teacher, Kathie DeBonis, Bridgewater-Raritan High School principal Vincent DelPriore, and her school district as defendants in her suit. Gadalla alleges that DeBonis, who was also a longtime girls' lacrosse coach and health teacher, made numerous insulting comments to her about her ethnicity and religion during the 2022-2023 school year. The Independent has requested comment from DeBonis and the school district.

On one occasion, DeBonis allegedly told Gadalla that "this is America" and that "by the time you are 16 you will probably be in jail anyway." During Ramadan, DeBonis allegedly told the girl that "if I were you, I would just go drink alcohol" and asked, "Do you think you're going to go to hell or something?" Gadalla also claims that DeBonis called her obese and assigned her a three-page paper on obesity that she had to finish before she was allowed to use the bathroom. During another Ramadan-related incident, DeBonis allegedly told Gadalla, "It's good that you're fasting during Ramadan, maybe you'll lose some weight."

According to the lawsuit, Gadalla went to the principal and her school counselor about her treatment and was moved out of the teacher's class. DeBonis allegedly asked, "Who snitched to the principal?" as Gadalla was being moved.

The lawsuit claims that Gadalla suffered "physical manifestations of emotional distress as well as personal hardships including anxiety, adjustment problems, sleep disturbance, humiliation (and) mental pain and anguish" stemming from the interactions. It further alleges that DeBonis' behavior was targeted, noting that she did not "mock, shame or make inappropriate comments to light skinned non-Muslim, non-Egyptian students." The district has until August 29 to respond to the lawsuit. DeBonis reportedly retired in the 2023-2024 school year and is now collecting an annual pension of $70,000.

Person swept away by fast river current after going in for dog

The Independent · 15 minutes ago

Bodycam footage captured the moment a person was swept down the fast-flowing Menomonee River in Wisconsin after attempting to rescue their dog. Menomonee Falls Police Department officers spotted the individual being pulled by the current and initiated a rescue operation. Rescue teams successfully saved two people from the water, who were subsequently taken to hospital. The dog, which prompted the rescue attempt, was later located safely near the family's vehicle.
