
Unauthorized AI Generation: How Can Rights of Voice Actors, Singers Be Protected?
The act of using artificial intelligence to generate voices that sound identical to those of voice actors and singers, and uploading them to the internet, has been rampant. Discussions should be deepened on how to protect their 'voice rights.'
The internet is flooded with videos of voice actors speaking lines unrelated to works in which they participated and of singers performing other singers' songs. Such content is believed to have been created by training generative AI without permission to produce highly similar voices.
The Japan Actors Union, whose members include many voice actors, has reported that it confirmed about 270 such cases in a three-month investigation. The union and other organizations have called for obtaining consent from rights holders when AI is used to learn voices, and they have also urged people to clearly indicate when content is generated by AI.
For voice actors and singers, their voices are 'products' crafted through years of accumulated training and experience. If videos posted online attract views and earn profits by using their voices, that would appear to constitute a clear infringement of their rights.
Works such as text and musical compositions are protected by the Copyright Law, but voices have conventionally been interpreted as falling outside this protection.
Celebrities and athletes are said to have the 'right of publicity,' which allows them to exclusively use their names and likenesses. However, whether this right also covers voices remains unclear.
The Economy, Trade and Industry Ministry has warned that such acts could constitute a violation of the Unfair Competition Prevention Law. In one case it cited, the voices of voice actors and other related parties were recreated by AI without their consent to produce alarm clocks, which were then sold.
However, some argue that applying this law is difficult because proving a violation requires, among other things, that the AI-generated voice be immediately recognizable as that of a specific person.
In the U.S. state of Tennessee, a law was enacted last year to protect individuals' rights to their voices and likenesses from digital reproduction. It is hoped that Japan will explore ways to institute legal protection by referring to such instances.
In the process of considering that, various issues will likely arise, such as how to handle impersonations by entertainers. Certain considerations may be necessary to prevent a decline in forms of cultural expression such as parody.
Japan amended the Copyright Law in 2018 to allow AI tools to learn text and music without permission. As a result, the act of having AI learn animation and other works has become legal, creating a situation in which voices have been used freely without consent from rights holders.
The law needs to be amended again to prevent unauthorized AI training in the first place.
The use of AI has also raised concerns over issues such as fake videos featuring politicians that could influence elections and the spread of sexually explicit fake images, including child pornography. Measures against these deepfakes, among other problems, also must be implemented urgently.
(From The Yomiuri Shimbun, July 27, 2025)