Huang Yi saddened by "daughter of a drug trafficker" comments
The actress and her daughter Duoduo became the target of online ridicule after Duoduo expressed her desire to quit her studies and train as a performer, with netizens making insulting comments about the girl's looks.
In the most recent comments, netizens claimed that Duoduo resembles Huang Yi's ex-husband, Huang Yiqing, who was sentenced to 15 years in prison in 2020 for drug trafficking.
"The daughter of a drug dealer must not be allowed to enter the public eye," said some of the netizens, triggering a heated debate.
Huang Yi has since responded to the issue, saying that it was just a conversation between her and her daughter, and asked everybody to stop hurting Duoduo's feelings.
But as the controversy persisted and Huang Yiqing's name continued to dominate the discussion, the actress finally spoke about it on her latest livestream.
Defending her daughter, she said, "She is just 12 years old. There is no need to say harsh things, let alone involve her father in this. The whole chat is talking about 'daughter of a drug dealer'. Even I can't stand it, let alone a child."
(Photo Source: HK01)