
Latest news with #ErinKunz

Groundbreaking brain chip decodes people's inner monologues in real time
Yahoo · 17 hours ago · Health

A new brain-computer interface technology can decode a user's inner monologue, an advance that scientists say can help patients with severe speech paralysis communicate their thoughts. The research, published in Cell, found the new brain-computer interface, or BCI, could decode a user's inner speech on command with up to 74 per cent accuracy. Scientists hope the new technology can help people who are unable to speak audibly to communicate more easily.

'This is the first time we've managed to understand what brain activity looks like when you just think about speaking,' Erin Kunz, a co-author of the study from Stanford University, said. 'For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally.'

A BCI uses sensors implanted in brain regions that control movement to decode neural signals and translate them into actions such as operating a prosthetic hand. Recent research shows that BCIs can even decode attempted speech among people with paralysis: when users attempt to speak by activating the muscles involved in producing sounds, BCIs can interpret the corresponding brain activity and convert it into text, even if the spoken words are unintelligible.

'If you just have to think about speech instead of actually trying to speak, it's potentially easier and faster for people,' Benyamin Meschede-Krasa, another author of the study, said.

In the latest study, researchers implanted microelectrodes in the motor cortex – the region of the brain responsible for speech – of four participants with severe paralysis from either amyotrophic lateral sclerosis or a brainstem stroke. Participants were asked to either attempt to speak or imagine saying a set of words. While attempted speech and inner speech activated overlapping regions in the brain and evoked similar patterns of neural activity, the two were different enough to be reliably distinguished from each other, researchers found; inner speech tended to show a weaker magnitude of activation overall.

The researchers then used artificial intelligence to interpret the participants' imagined words, demonstrating that the BCI could decode imagined sentences from a vocabulary of up to 125,000 words with an accuracy rate as high as 74 per cent. The technology could even pick up some inner speech the participants were never instructed to produce, such as numbers when they were asked to tally the pink circles on a screen.

The researchers also developed a password-controlled mechanism to prevent the BCI from decoding inner speech unless it is temporarily unlocked with a chosen keyword. In one experiment, users could think of the phrase 'chitty chitty bang bang' to begin inner-speech decoding; the system recognised this password with more than 98 per cent accuracy, the study found.

'This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech,' Frank Willett, another author of the study, said.
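The coverage describes the unlock keyword only in prose, but the gating idea is simple enough to sketch. The toy Python below is purely illustrative: plain strings stand in for the decoder's output, and an exact string match stands in for the keyword classifier the researchers report as roughly 98 per cent accurate. It is not the Stanford team's implementation, only a sketch of how decoding could stay idle until the imagined passphrase is recognised.

```python
# Toy illustration of the passphrase gating described above: the decoder
# stays idle until it recognises an imagined keyword, and only then
# transcribes inner speech. Everything here is a stand-in -- strings play
# the role of decoded neural activity, and an exact match replaces the
# real keyword classifier.

PASSPHRASE = "chitty chitty bang bang"


def gated_transcribe(decoded_stream):
    """Yield decoded sentences only after the passphrase has been detected."""
    unlocked = False
    for decoded in decoded_stream:
        if not unlocked:
            unlocked = (decoded == PASSPHRASE)
        else:
            yield decoded


# Simulated output of an inner-speech decoder over time.
stream = [
    "what should i have for lunch",   # private thought; decoder still locked
    "chitty chitty bang bang",        # imagined passphrase unlocks decoding
    "i would like a glass of water",  # now transcribed for communication
]

for sentence in gated_transcribe(stream):
    print(sentence)
```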

How a brain-computer chip can read people's minds
Euronews · 4 days ago · Health

An experimental brain implant can read people's minds, translating their inner thoughts into text. In an early test, scientists from Stanford University used a brain-computer interface (BCI) device to decipher sentences that were thought but not spoken aloud. The implant was correct up to 74 per cent of the time.

BCIs work by connecting a person's nervous system to devices that can interpret their brain activity, allowing them to take action – like using a computer or moving a prosthetic hand – with only their thoughts. They have emerged as a possible way for people with disabilities to regain some independence. Perhaps the most famous is Elon Musk's Neuralink implant, an experimental device in early trials testing its safety and functionality in people with specific medical conditions that limit their mobility.

The latest findings, published in the journal Cell, could one day help people who cannot speak to communicate more easily, the researchers said. 'This is the first time we've managed to understand what brain activity looks like when you just think about speaking,' said Erin Kunz, one of the study's authors and a researcher at Stanford University in the United States.

Working with four study participants, the research team implanted microelectrodes – which record neural signals – into the motor cortex, the part of the brain responsible for speech. The researchers asked participants to either attempt to speak or to imagine saying a set of words. Both actions activated overlapping parts of the brain and elicited similar types of brain activity, though to different degrees.

They then trained artificial intelligence (AI) models to interpret words that the participants thought but did not say aloud. In a demonstration, the brain chip could translate the imagined sentences with an accuracy rate of up to 74 per cent.

In another test, the researchers set a password to prevent the BCI from decoding people's inner speech unless they first thought of the code. The system recognised the password with around 99 per cent accuracy. The password? 'Chitty chitty bang bang'.

For now, brain chips cannot interpret inner speech without significant guardrails, but the researchers said more advanced models may be able to do so in the future. Frank Willett, one of the study's authors and an assistant professor of neurosurgery at Stanford University, said in a statement that BCIs could also be trained to ignore inner speech. 'This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech,' he said.

New Brain Device Is First to Read Out Inner Speech
Scientific American · 5 days ago · Health

After a brain stem stroke left him almost entirely paralyzed in the 1990s, French journalist Jean-Dominique Bauby wrote a book about his experiences—letter by letter, blinking his left eye in response to a helper who repeatedly recited the alphabet. Today people with similar conditions often have far more communication options. Some devices, for example, track eye movements or other small muscle twitches to let users select words from a screen. And on the cutting edge of this field, neuroscientists have more recently developed brain implants that can turn neural signals directly into whole words. These brain-computer interfaces (BCIs) largely require users to physically attempt to speak, however—and that can be a slow and tiring process. But now a new development in neural prosthetics changes that, allowing users to communicate by simply thinking what they want to say.

The new system relies on much of the same technology as the more common 'attempted speech' devices. Both use sensors implanted in a part of the brain called the motor cortex, which sends motion commands to the vocal tract. The brain activation detected by these sensors is fed into a machine-learning model that interprets which brain signals correspond to which sounds for an individual user, and the model then uses those data to predict which word the user is attempting to say.

But the motor cortex doesn't only light up when we attempt to speak; it's also involved, to a lesser extent, in imagined speech. The researchers took advantage of this to develop their 'inner speech' decoding device and published the results on Thursday in Cell. The team studied three people with amyotrophic lateral sclerosis (ALS) and one with a brain stem stroke, all of whom had previously had the sensors implanted. Using the new 'inner speech' system, the participants needed only to think a sentence they wanted to say, and it would appear on a screen in real time. While previous inner speech decoders were limited to only a handful of words, the new device allowed participants to draw from a dictionary of 125,000 words.

'As researchers, our goal is to find a system that is comfortable [for the user] and ideally reaches a naturalistic ability,' says lead author Erin Kunz, a postdoctoral researcher who is developing neural prostheses at Stanford University. Previous research found that 'physically attempting to speak was tiring and that there were inherent speed limitations with it, too,' she says. Attempted speech devices such as the one used in the study require users to inhale as if they are actually saying the words, but because of impaired breathing, many users need multiple breaths to complete a single word with that method. Attempting to speak can also produce distracting noises and facial expressions that users find undesirable.

With the new technology, the study's participants could communicate at a comfortable conversational rate of about 120 to 150 words per minute, with no more effort than it took to think of what they wanted to say. Like most BCIs that translate brain activation into speech, however, the new technology only works if people are able to convert the general idea of what they want to say into a plan for how to say it.

Alexander Huth, who researches BCIs at the University of California, Berkeley, and wasn't involved in the new study, explains that in typical speech, 'you start with an idea of what you want to say. That idea gets translated into a plan for how to move your [vocal] articulators. That plan gets sent to the actual muscles, and then they carry it out.' But in many cases, people with impaired speech aren't able to complete that first step. 'This technology only works in cases where the "idea to plan" part is functional but the "plan to movement" part is broken'—a collection of conditions called dysarthria—Huth says.

According to Kunz, the four research participants are eager about the new technology. 'Largely, [there was] a lot of excitement about potentially being able to communicate fast again,' she says, adding that one participant was particularly thrilled by his newfound potential to interrupt a conversation—something he couldn't do with the slower pace of an attempted speech device. To ensure private thoughts remained private, the researchers implemented a code phrase: 'chitty chitty bang bang.' When internally spoken by participants, this would prompt the BCI to start or stop transcribing.

Brain-reading implants inevitably raise concerns about mental privacy. For now, Huth isn't concerned about the technology being misused or developed recklessly, citing the integrity of the research groups involved in neural prosthetics research. 'I think they're doing great work; they're led by doctors; they're very patient-focused. A lot of what they do is really trying to solve problems for the patients,' he says, 'even when those problems aren't necessarily things that we might think of,' such as being able to interrupt a conversation or 'making a voice that sounds more like them.'

For Kunz, this research is particularly close to home. 'My father actually had ALS and lost the ability to speak,' she says, adding that this is why she got into her field of research. 'I kind of became his own personal speech translator toward the end of his life since I was kind of the only one that could understand him. That's why I personally know the importance and the impact this sort of research can have.' The contribution and willingness of the research participants are crucial in studies like this, Kunz notes. 'The participants that we have are truly incredible individuals who volunteered to be in the study not necessarily to get a benefit to themselves but to help develop this technology for people with paralysis down the line. And I think that they deserve all the credit in the world for that.'
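None of the articles includes code, but the two-stage design described above (motor-cortex activity is mapped to sound-unit probabilities, which are then matched against a vocabulary to predict words) can be roughly sketched. The Python below is a toy under invented assumptions: the 'classifier' is random, the vocabulary holds two words instead of 125,000, and the feature windows are zeros. It only illustrates how per-phoneme probabilities might be combined into a word choice, not how the Stanford decoder actually works.

```python
# Rough sketch of a two-stage decoding idea: (1) a per-window classifier
# produces phoneme probabilities from motor-cortex features, (2) each
# vocabulary word is scored against those probabilities and the best one
# is returned. All values and names are invented for illustration.
import numpy as np

PHONEMES = ["h", "eh", "l", "ow", "w", "er", "d"]
VOCAB = {"hello": ["h", "eh", "l", "ow"], "world": ["w", "er", "l", "d"]}


def phoneme_probs(neural_window, rng):
    """Stand-in for a trained classifier: one probability per phoneme."""
    logits = rng.normal(size=len(PHONEMES))  # a real model would use neural_window
    return np.exp(logits) / np.exp(logits).sum()


def decode_word(neural_windows, rng):
    """Score every vocabulary word against the per-window phoneme probabilities."""
    probs = [phoneme_probs(w, rng) for w in neural_windows]

    def log_score(word):
        return sum(
            np.log(probs[t][PHONEMES.index(p)])
            for t, p in enumerate(VOCAB[word])
            if t < len(probs)
        )

    return max(VOCAB, key=log_score)


rng = np.random.default_rng(0)
windows = [np.zeros(96) for _ in range(4)]  # fake feature windows for 4 time bins
print(decode_word(windows, rng))            # prints whichever word scores higher
```

In practice, systems like this typically also lean on a language model to choose among acoustically similar candidates from a large vocabulary; the articles above don't detail that step.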

New Brain Interface Interprets Inner Monologues With Startling Accuracy
Gizmodo · 5 days ago · Health

Scientists can now decipher brain activity related to the silent inner monologue in people's heads with up to 74% accuracy, according to a new study. In new research published today in Cell, scientists from Stanford University decoded imagined words from four participants with severe paralysis due to ALS or brainstem stroke. Aside from being absolutely wild, the findings could help people who are unable to speak communicate more easily using brain-computer interfaces (BCIs), the researchers say.

'This is the first time we've managed to understand what brain activity looks like when you just think about speaking,' lead author Erin Kunz, a graduate student in electrical engineering at Stanford University, said in a statement. 'For people with severe speech and motor impairments, BCIs capable of decoding inner speech could help them communicate much more easily and more naturally.'

Previously, scientists have managed to decode attempted speech using BCIs. When people physically attempt to speak out loud by engaging the muscles related to speech, these technologies can interpret the resulting brain activity and type out what they're trying to say. But while effective, the current methods of BCI-assisted communication can still be exhausting for people with limited muscle control.

The new study is the first to directly take on inner speech. To do so, the researchers recorded activity in the motor cortex—the region responsible for controlling voluntary movements, including speech—using microelectrodes implanted in each of the four participants. They found that attempted and imagined speech activate similar, though not identical, patterns of brain activity. They then trained an AI model to interpret these imagined speech signals, decoding sentences from a vocabulary of up to 125,000 words with as much as 74% accuracy. In some cases, the system even picked up unprompted inner thoughts, like numbers participants silently counted during a task.

For people who want to use the new technology but don't always want their inner thoughts on full blast, the team added a password-controlled mechanism that prevented the BCI from decoding inner speech unless the participants thought of a password ('chitty chitty bang bang' in this case). The system recognized the password with more than 98% accuracy.

While 74% accuracy is high, the current technology still makes a substantial number of errors. But the researchers are hopeful that more sensitive recording devices and better algorithms could soon boost performance further. 'The future of BCIs is bright,' Frank Willett, an assistant professor in the department of neurosurgery at Stanford and the study's senior author, said in a statement. 'This work gives real hope that speech BCIs can one day restore communication that is as fluent, natural, and comfortable as conversational speech.'
