Active police presence reported at Reid Health in Richmond
A Reid Health supervisor confirmed to News Center 7 by phone that the hospital is on lockdown due to a threat.
Several Richmond Police officers are on the scene investigating, according to a Richmond Police dispatcher.
No other information is available.
We will update this developing story.

Boston Globe
2 days ago
For some patients, the 'inner voice' may soon be audible
Christian Herff, a neuroscientist at Maastricht University in the Netherlands who was not involved in the research, said the result went beyond the merely technological and shed light on the mystery of language. 'It's a fantastic advance,' Herff said.

The new study is the latest result in a long-running clinical trial, called BrainGate2, that has already seen some remarkable successes. One participant, Casey Harrell, now uses his brain-machine interface to hold conversations with his family and friends.

In 2023, after ALS had made his voice unintelligible, Harrell agreed to have electrodes implanted in his brain. Surgeons placed four arrays of tiny needles on the left side, in a patch of tissue called the motor cortex. The region becomes active when the brain creates commands for muscles to produce speech.

A computer recorded the electrical activity from the implants as Harrell attempted to say different words. Over time, with the help of artificial intelligence, the computer learned to predict almost 6,000 words with 97.5 percent accuracy. It could then synthesize those words using Harrell's voice, based on recordings made before he developed ALS.

But successes like this one raised a troubling question: Could a computer accidentally record more than patients wanted to say? Could it eavesdrop on their inner voice?

'We wanted to investigate if there was a risk of the system decoding words that weren't meant to be said aloud,' said Erin Kunz, a neuroscientist at Stanford University and an author of the new study.

She and her colleagues also wondered if patients might actually prefer using inner speech. They noticed that Harrell and other participants became fatigued when they tried to speak; could simply imagining a sentence be easier for them and allow the system to work faster?
'If we could decode that, then that could bypass the physical effort,' Kunz said. 'It would be less tiring, so they could use the system for longer.'

But it wasn't clear if the researchers could decode inner speech. Scientists don't even agree on what 'inner speech' is. Some researchers have argued that language is essential for thought. But others, pointing to recent studies, maintain that much of our thinking does not involve language at all and that people who hear an inner voice are just perceiving a kind of sporadic commentary in their heads.

'Many people have no idea what you're talking about when you say you have an inner voice,' said Evelina Fedorenko, a cognitive neuroscientist at the Massachusetts Institute of Technology. 'They're like, "You know, maybe you should go see a doctor if you're hearing words in your head."' Fedorenko said she has an inner voice, while her husband does not.

Kunz and her colleagues decided to investigate the mystery for themselves. The scientists gave participants seven different words, including 'kite' and 'day,' then compared the brain signals recorded when participants attempted to say the words with those recorded when they only imagined saying them. As it turned out, imagining a word produced a pattern of activity similar to that of trying to say it, but the signal was weaker.

The computer did a fairly good job of predicting which of the seven words the participants were thinking. For Harrell, it did little better than a random guess would have, but for another participant, it picked the right word more than 70 percent of the time.

The researchers put the computer through more training, this time specifically on inner speech. Its performance improved significantly, including for Harrell. Now, when the participants imagined saying entire sentences, such as 'I don't know how long you've been here,' the computer could accurately decode most or all of the words.
Herff, who has done studies on inner speech, was surprised that the experiment succeeded. Before, he would have said that inner speech is fundamentally different from the motor cortex signals that produce actual speech. 'But in this study, they show that, for some people, it really isn't that different,' he said.

Kunz emphasized that the computer's current performance on inner speech would not be good enough to let people hold conversations. 'The results are an initial proof of concept more than anything,' she said. But she is optimistic that decoding inner speech could become the new standard for brain-computer interfaces. In more recent trials, the results of which have yet to be published, she and her colleagues have improved the computer's accuracy and speed. 'We haven't hit the ceiling yet,' she said.

As for mental privacy, Kunz and her colleagues found some reason for concern: On occasion, the researchers were able to detect words that the participants weren't imagining out loud.

The team explored ways to prevent the computer from eavesdropping on private thoughts and came up with two possible solutions. One would be to decode only attempted speech while blocking inner speech. The new study suggests this strategy could work: even though the two kinds of thought are similar, they are different enough that a computer can learn to tell them apart. In one trial, the participants mixed attempted and imagined speech in their minds, and the computer was able to ignore the imagined speech.

For people who would prefer to communicate with inner speech, Kunz and her colleagues came up with a second strategy: an inner password to turn the decoding on and off. The password would have to be a long, unusual phrase, they decided, so they chose 'Chitty Chitty Bang Bang,' the name of a 1964 novel by Ian Fleming as well as a 1968 movie starring Dick Van Dyke.
One of the participants, a 68-year-old woman with ALS, imagined saying 'Chitty Chitty Bang Bang' along with an assortment of other words. The computer eventually learned to recognize the password with 98.75 percent accuracy, and it decoded her inner speech only after detecting the password.

'This study represents a step in the right direction, ethically speaking,' said Cohen Marcus Lionel Brown, a bioethicist at the University of Wollongong in Australia. 'If implemented faithfully, it would give patients even greater power to decide what information they share and when.'