For some patients, the 'inner voice' may soon be audible

Time of India · 2 days ago
For decades, neuro-engineers have dreamed of helping people who have been cut off from the world of language. A disease like amyotrophic lateral sclerosis, or ALS, weakens the muscles in the airway.
A stroke can kill neurons that normally relay commands for speaking. Perhaps, by implanting electrodes, scientists could record the brain's electrical activity and translate it into spoken words.
Now a team of researchers has made an important advance toward that goal. Previously they succeeded in decoding the signals produced when people tried to speak. In the new study, published Thursday in the journal Cell, their computer often made correct guesses when the subjects simply imagined saying words.
Christian Herff, a neuroscientist at Maastricht University in the Netherlands who was not involved in the research, said the result went beyond the merely technological and shed light on the mystery of language. "It's a fantastic advance," Herff said. The new study is the latest result in a long-running clinical trial, called BrainGate2, that has already seen some remarkable successes.
One participant, Casey Harrell, now uses his brain-machine interface to hold conversations.
In 2023, after ALS had made his voice unintelligible, Harrell agreed to have electrodes implanted in his brain. A computer recorded the electrical activity from the implants as Harrell attempted to say different words. Over time, with the help of AI, the computer predicted 6,000 words, with 97.5% accuracy.
But successes like this raised a troubling question: Could a computer accidentally record more than patients actually wanted to say? Could it eavesdrop on their inner voice? "We wanted to investigate if there was a risk of the system decoding words that weren't meant to be said aloud," said Erin Kunz, a neuroscientist at Stanford University and an author of the study.
She and her colleagues also wondered if patients might actually prefer using inner speech.
Kunz and her colleagues decided to investigate the mystery for themselves. The scientists gave participants seven different words, including "kite" and "day," then compared the brain signals when participants attempted to say the words and when they only imagined saying them.
As it turned out, imagining a word produced a pattern of activity similar to that of trying to say it, but the signal was weaker.
The computer did a good job of predicting which of the seven words the participants were imagining. For Harrell, it didn't do much better than a random guess would have, but for another participant it picked the right word more than 70% of the time.
The researchers then put the computer through more training, this time specifically on inner speech. Its performance improved significantly, including for Harrell. Now when the participants imagined saying entire sentences, such as "I don't know how long you've been here," the computer could accurately decode most of the words.
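In machine-learning terms, the word task described above is a small closed-vocabulary classification problem, and the retraining step amounts to fitting the decoder to inner-speech data directly. The sketch below is a toy illustration of that two-stage idea on synthetic data; it is not the study's decoder, and the channel count, the word list beyond "kite" and "day", and the signal model are all invented.

```python
# Toy two-stage decoding sketch (hypothetical; synthetic signals throughout).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Only "kite" and "day" are named in the article; the rest are placeholders.
WORDS = ["kite", "day", "w3", "w4", "w5", "w6", "w7"]
N_CHANNELS, TRIALS = 96, 40   # invented electrode and trial counts

# Each word gets a mean "neural template". Imagined speech reuses a
# scaled-down version of it (the "similar but weaker" signal), plus a small
# component of its own that an attempted-speech decoder has never seen.
attempt_mu = rng.normal(size=(len(WORDS), N_CHANNELS))
inner_mu = 0.1 * attempt_mu + 0.15 * rng.normal(size=attempt_mu.shape)

def sample(means):
    """Draw noisy trials around each word's mean pattern."""
    X = np.vstack([m + rng.normal(size=(TRIALS, N_CHANNELS)) for m in means])
    y = np.repeat(np.arange(len(means)), TRIALS)
    return X, y

X_att, y_att = sample(attempt_mu)   # attempted-speech trials
X_inn, y_inn = sample(inner_mu)     # inner-speech trials

# Stage 1: decoder trained on attempted speech, tested on imagined words.
stage1 = LogisticRegression(max_iter=1000).fit(X_att, y_att)
print("attempted-trained:", stage1.score(X_inn[::2], y_inn[::2]))

# Stage 2: retrain specifically on inner speech, as the researchers did.
stage2 = LogisticRegression(max_iter=1000).fit(X_inn[1::2], y_inn[1::2])
print("inner-trained:    ", stage2.score(X_inn[::2], y_inn[::2]))
```

Because the imagined-word "templates" here are weaker and slightly shifted versions of the attempted-speech ones, the transferred decoder scores above chance but the retrained one does better, mirroring the pattern the researchers report.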
Herff, who has done his own studies, was surprised that the experiment succeeded. Before, he would have said that inner speech is fundamentally different from the motor cortex signals that produce actual speech. "But in this study, they show that, for some people, it isn't that different," he said.
Kunz emphasized that the computer's current performance involving inner speech would not be good enough to let people hold conversations.
"The results are an initial proof of concept more than anything," she said. But she is optimistic that decoding inner speech could become the new standard for brain-computer interfaces. In recent trials, she and her colleagues have improved the computer's accuracy. "We haven't hit the ceiling yet," she said. NYT

Related Articles

Connexin proteins rally arteries to nourish brain on demand

The Hindu · 18 hours ago

The brain is a guzzler, burning through about a fifth of our resting energy and keeping almost nothing in reserve. When a few thousand neurons suddenly burst into activity (say, when you spot a familiar face in a crowd), the fuel has to arrive immediately. Blood vessels open wide to let it in, but they can't rob neighbouring regions to pay for the rush. The whole supply network must pitch in, and here lies the mystery: even the most distant arteries seem to respond almost instantly.

Scientists call this process neurovascular coupling. Neurons fire, nearby capillaries widen, and blood flow rises as arteries join in, pushing more fuel into the pipeline. Researchers have seen messages travelling 'upstream' from smaller vessels to bigger ones, but the known chemical messengers moved too slowly to explain the brain's split-second feats. Something else was clearly at work, passing the call to action almost instantaneously.

Cells lining the brain's blood vessels are linked by gap junctions, narrow portals that let neighbouring cells exchange ions and small molecules. When Chenghua Gu's lab at Harvard University introduced serotonin into one cell, it slipped through the junctions to its neighbours. A later test revealed a web of connections that was strongest in the arteries and weaker in the veins. The team found that two connexin proteins, Cx37 and Cx40, were especially abundant in the arteries and inferred that they may be responsible for the rapid call to action. The findings were published in Cell in July.

University College London neuroscientist David Attwell said this arrangement lets signals travel along vessel walls to widen upstream arteries, boosting blood flow to active brain areas. Brant Isakson, a vascular physiologist at the University of Virginia, added that different vessels use different connexins to pass certain signals better, 'like specific pipes for specific fluids'.

To prove the link, the Harvard team bred mice that lacked Cx37 and Cx40 in their artery walls. In healthy mice, a burst of brain activity sent a widening signal along the arteries that travelled more than a millimetre in a quarter of a second. In the modified mice, the signal moved at a third of that speed. The gap became most obvious when large swaths of the brain lit up: in healthy mice, the widening spread rapidly and in sync across the arterial network; in the modified mice, it was slower, weaker, and stuck near the source. The results suggested that gap junctions act as a 'scaling mechanism' that lets blood delivery grow to match bursts of brain activity.

Anna Devor, a neuroscientist at Boston University who studies how blood flow shapes fMRI signals, said the study nailed down the mechanism that lets vessel-widening signals travel along vessel walls, and measured how fast that happens. 'Knowing both the mechanism and the speed is priceless for computer models linking brain activity to blood flow,' she said. Such models, according to her, could help detect vascular problems, test drugs virtually, and guide therapies, especially when paired with artificial intelligence models.

The results could also help explain mismatches between brain activity and blood flow. Devor recalled the late imaging pioneer Amiram Grinvald likening the brain's oxygen supply to 'watering the entire garden for one thirsty flower'. Signals to widen vessels often travel upstream, adding delays: hundreds of milliseconds in small arteries and over a second in larger ones. This study shows that gap junctions account for much of that lag, with the rest due to slower chemical messengers reaching their target vessels.

The work also raises questions about disease. Attwell noted that it's possible, but unproven, that losing gap-junction connections in ageing or small-vessel diseases could lower brain blood flow. Testing that idea, he said, would mean boosting the proteins in lab animals and seeing if that improved brain function. According to Isakson, the findings could help develop drugs to activate connexins, as well as reveal how the brain's 20-plus connexin protein types combine into mosaic junctions that fine-tune messages from cell to cell.

The brain's energy efficiency depends on more than just responsive neurons: it requires a hidden vascular network in which arteries exchange rapid messages through gap junctions, coordinating supply lines across millimetres in the blink of an eye. This chatter is a reminder that the brain's lifeblood is as much in its wiring as in its firing.

Anirban Mukhopadhyay is a geneticist by training and a science communicator from Delhi.
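For a sense of scale, here is a quick back-of-the-envelope calculation based on the mouse figures quoted above (an illustration, not values taken from the paper):

```python
# Rough conduction speeds implied by the figures in the article.
# Assumption: "more than a millimetre in a quarter of a second" is read
# as roughly 1 mm per 0.25 s; the paper's exact measurements may differ.
distance_mm = 1.0
time_s = 0.25

healthy = distance_mm / time_s   # ~4 mm/s along healthy arteries
knockout = healthy / 3           # "a third of the speed" without Cx37/Cx40

print(f"healthy mice:  ~{healthy:.1f} mm/s")
print(f"knockout mice: ~{knockout:.1f} mm/s")
```

At roughly 4 mm/s, a widening signal can span the millimetre scale of a mouse's cortical supply network in a fraction of a second, which is what the blink-of-an-eye coordination described above requires.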

How scientists built a password-protected mind-reading brain implant

Indian Express · a day ago

Scientists have developed a brain-computer interface (BCI) that works only when the user thinks of a preset password. A BCI is a device that allows the human brain to communicate with external software or hardware. The findings were detailed in a study, 'Inner speech in motor cortex and implications for speech neuroprostheses', published in the journal Cell on August 14. The new system was developed by researchers at Stanford University in the United States. Here is a look at how they built it.

But first, why are brain-computer interfaces significant?

BCIs allow the user to control an application or a device using only their mind. Usually, when someone wants to interact with a device (say, to switch on a lamp), they first decide what they want to do, then coordinate and use the muscles in their arms, legs or feet to perform the action, such as pressing the lamp's on/off switch with their fingers. The device, in this case the lamp, then responds to the action. BCIs skip the second step of coordinating and using the muscles: a computer identifies the desired action and controls the device directly. This is why BCIs have emerged as promising tools for people with severe physical disabilities. They are also being used to restore speech in people who have limited reliable control over their muscles.

How was a password-protected BCI developed?

The researchers focused on 'internal-speech' BCIs, which translate brain signals into text or audio. While these devices do not require users to speak out loud, there is always a risk that they could accidentally decode sentences users never intended to say.

To resolve this issue, the researchers first 'analysed brain signals collected by microelectrodes placed in the motor cortex — the region involved in voluntary movements — of four participants,' according to a report by the journal Nature. All of the participants had trouble speaking and were asked either to try to say a set of words or to imagine saying them. Analysing the recordings of the participants' brain activity, the researchers found that attempted and internal speech originated in the same brain region and generated similar neural signals, though the signals associated with internal speech were weaker. This data was used to train artificial intelligence models, which helped the BCIs interpret sentences the participants imagined after being asked to think of specific phrases. The devices correctly interpreted 74% of the imagined sentences.

To ensure that the BCIs do not decode sentences users never intend to utter, the researchers added a password to the system, allowing users to control when decoding began. 'When a participant imagined the password 'Chitty-Chitty-Bang-Bang' (the name of an English-language children's novel), the BCI recognised it with an accuracy of more than 98%,' the Nature report said. (With inputs from Nature)
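To make the password gate concrete, here is a minimal sketch of the idea. It is hypothetical throughout (the data types, threshold, and stream of decoder outputs are invented; the Stanford system's internals are not described at this level): the decoder discards everything until it recognises the imagined password phrase with high confidence, and only then starts emitting text.

```python
# Minimal sketch of a password-gated speech decoder (hypothetical).
from dataclasses import dataclass

PASSWORD = "chitty chitty bang bang"  # password phrase from the article
THRESHOLD = 0.9                       # assumed confidence cut-off

@dataclass
class Decoded:
    text: str          # decoder's best-guess transcript of imagined speech
    confidence: float  # decoder's probability for that transcript

def gated_output(decoder_stream):
    """Yield decoded text only after the password has been recognised."""
    unlocked = False
    for item in decoder_stream:
        if not unlocked:
            # Gate: nothing is emitted until the password phrase appears.
            if item.text == PASSWORD and item.confidence >= THRESHOLD:
                unlocked = True
            continue  # everything decoded before unlocking is discarded
        yield item.text

# Usage with a made-up stream of decoder outputs:
stream = [
    Decoded("i am hungry", 0.80),              # ignored: still locked
    Decoded("chitty chitty bang bang", 0.99),  # unlocks decoding
    Decoded("i would like some water", 0.74),  # now passed through
]
for text in gated_output(stream):
    print(text)  # prints only the post-unlock sentence
```

In a real system the gate would sit on continuous neural features rather than finished transcripts, but the control flow is the point: the user decides when decoding starts by imagining the password.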

