'New pathway' to cure for HIV discovered using tech from COVID-19 vaccine
Researchers have taken a giant leap in the search for an HIV cure by discovering a way to identify the virus even while it lies hidden inside the body's cells.
HIV spreads by invading and multiplying within white blood cells, which fight disease and infection. One of the main roadblocks in developing a cure has been finding a way to isolate and kill the virus without also killing white blood cells and harming the body's immune system.
Researchers from the Peter Doherty Institute for Infection and Immunity in Melbourne, Australia, have now developed a method to identify the virus within white blood cells and isolate it for potential treatment, as demonstrated in a recent paper published in Nature Communications.
The technology involves mRNA, molecules that carry genetic instructions telling cells how to make a specific protein, the same approach used in the COVID-19 vaccines. Delivering mRNA into infected white blood cells can force them to reveal the hidden virus.
Using mRNA in this way was 'previously thought impossible,' research fellow at the Doherty Institute and co-first author of the study Paula Cevaal told The Guardian, but the new development 'could be a new pathway to an HIV cure.'
'In the field of biomedicine, many things eventually don't make it into the clinic – that is the unfortunate truth; I don't want to paint a prettier picture than what is the reality,' Cevaal said. 'But in terms of specifically the field of HIV cure, we have never seen anything close to as good as what we are seeing, in terms of how well we are able to reveal this virus.'
A cure is still years away: Cevaal said the approach would need to be tested in animals and then in humans for safety before researchers can determine whether a potential treatment would even work. However, she added that 'we're very hopeful that we are also able to see this type of response in an animal, and that we could eventually do this in humans.'
Related Articles


WebMD
Living with HIV in the Rural South
Living with HIV can be challenging anywhere. But stigma can be devastatingly severe in small communities, where privacy is harder to maintain, services can be limited and difficult to access, and the disease can be more deadly due to religious culture.

The rural South can seem to be littered with a church of every denomination at almost every intersection. Local legislation and governing bodies reflect highly conservative views. And the culture is permeated with religious ideologies even in people who aren't religious, because Southern culture is pervasively influenced by the strictest traditions of Christianity. I'm not saying there's anything wrong with religion. But it should be a matter of choice, not coercion. In my opinion, Southern culture makes Christianity feel more coerced than chosen.

Religious people have been at war with LGBT people for years, and LGBT people are disproportionately affected by HIV. At the beginning of the AIDS epidemic in 1981, the then-mysterious disease seemed to affect only gay men. So it was stigmatized by the church. This had a significant consequence: the fear of being judged or ostracized creates a perfect environment for the disease to flourish and spread, because people are afraid to be tested.

In small communities, privacy is usually heavily infringed upon and everyone knows everyone. So it only takes one person to reveal your HIV status. And disclosure of status can have devastating effects. Being tested for HIV may seem like a small thing, but it's not. If you're afraid to get tested, you might contract the virus and not know until you're very sick. Sometimes it's too late. Stigma alone perpetuates the spread of HIV and AIDS, and it's a barrier to prevention efforts. That's how religion makes the virus more deadly.

But not all woes of rural communities are rooted in religion. The sprawling distances common in such areas can create barriers to accessing health services.
This is another factor contributing to health care gaps in rural communities. Not only are distances longer, transportation can be less reliable or even nonexistent. Many rural areas have little or no public transportation. Not everyone has family or other means of travel to basic necessities like groceries and doctor appointments. Telemedicine can help, but it can't fill all needs.

Rural isolation and loneliness can lead to mental health concerns. When you live with HIV, depression is common, partly because the disease is highly stigmatized. Loneliness intensifies the pain of depression.

I live in a rural community, but I have quick access to the amenities of a small city. I'm blessed to live where I can find health care with HIV expertise, but resources can be limited. Wraparound services often fall through the cracks of a sparsely funded health care system. Some rely on help like the Ryan White HIV/AIDS Program, which provides assistance for low-income people living with HIV. Without this assistance, people would die.

Many rural areas have rising numbers of new HIV diagnoses due to intravenous drug use. Needle exchange programs can help prevent HIV transmission and eliminate the risk of a community outbreak, as they have in the past. But many of these communities have religious objections to free needle exchanges. They fear that supplying clean needles would make them complicit in the drug problem, without realizing that the greater danger is contracting HIV.

In 2014-2015, in Scott County, Indiana, there was an HIV outbreak among drug-injecting populations. A temporary needle exchange program was established and the outbreak was curbed, showing that needle exchanges work. But the program was halted once the outbreak was under control, allowing HIV transmission to continue. But why do anything about it until it makes the news, right?
Our HIV prevention system is broken, largely because of religious beliefs influencing legislation and policy pertaining to this disease. People will continue to contract and transmit HIV because of it, regardless of geographic location. But religious stigma is strongest in rural America, where there are greater numbers of new HIV cases.

Religion isn't the only factor behind this. But you can change stigma and stereotypes before you can change the distance of a mile. Rural areas will always have more miles to traverse and fewer reliable methods of transportation. They'll always be poorer and more vulnerable to HIV transmission. Some of these factors can't be changed. But harmful beliefs about HIV must change, or this disease will never be eliminated. Conservative, rural communities will continue to be hotbeds for HIV to thrive.
Yahoo
14,000-year-old mummified ‘puppies' weren't dogs at all, new research shows
Two well-preserved ice age 'puppies' found in Northern Siberia may not be dogs at all, according to new research. Still covered in fur and naturally preserved in ice for thousands of years, the 'Tumat Puppies,' as they are known, contain hints of a last meal in their stomachs, including meat from a woolly rhinoceros and feathers from a small bird called a wagtail. Previously thought to be early domesticated dogs or tamed wolves living near humans, the animals' remains were found near woolly mammoth bones that had been burned and cut by humans, suggesting the canids lived near a site where humans butchered mammoths. By analyzing genetic data from the gut contents and chemical signatures in the bones, teeth and soft tissue, researchers now think the animals were 2-month-old wolf pups that show no evidence of interacting with people, according to findings published Thursday in the journal Quaternary Research. Neither of the mummified wolf cubs, believed to be sisters, shows signs of having been attacked or injured, indicating that they died suddenly when their underground den collapsed and trapped them inside more than 14,000 years ago. The den collapse may have been triggered by a landslide, according to the study. The wealth of data from the remains is shedding light on the everyday life of ice age animals, including how they ate, which is similar to the habits of modern wolves. 'It was incredible to find two sisters from this era so well preserved, but even more incredible that we can now tell so much of their story, down to the last meal that they ate,' wrote lead study author Anne Kathrine Wiborg Runge, formerly a doctoral student at the University of York and the University of Copenhagen, in a statement.
'Whilst many will be disappointed that these animals are almost certainly wolves and not early domesticated dogs, they have helped us get closer to understanding the environment at the time, how these animals lived, and how remarkably similar wolves from more than 14,000 years ago are to modern day wolves.' The multitude of research on these pups and other specimens also illustrates how difficult it is to prove when dogs, widely regarded as the first domesticated animal, became a part of human society. Trapped in thawing permafrost, the Tumat Puppies were discovered separately at the Syalakh site, about 25 miles (40 kilometers) from the nearest village of Tumat — one in 2011 and the other in 2015. They are approximately 14,046 to 14,965 years old. Hair, skin, claws and entire stomach contents can survive eons under the right conditions, said study coauthor Dr. Nathan Wales, senior lecturer in archaeology at the University of York in England. 'The most surprising thing to me is that the archaeologists managed to discover the second Tumat Puppy several years after the first was found,' Runge told CNN. 'It is very rare to find two specimens that are so well preserved and then they turn out to be siblings/littermates. It's extraordinary.' Like modern wolves, the pups ate both meat and plants. Though a woolly rhinoceros would be rather large prey for wolves to hunt, the piece of woolly rhino skin in one pup's stomach is proof of the canids' diet. The rhino skin, bearing blond fur, was only partially digested, suggesting the pups were resting in their den and died shortly after their last meal, Runge said. The color of the woolly rhino fur is consistent with that of a calf, based on previous research of a juvenile woolly rhino specimen found in the permafrost. Adult woolly rhinos likely had darker fur. The pack of adult wolves hunted the calf and brought it back to the den to feed the pups, according to the study authors. 
'The hunting of an animal as large as a wooly rhinoceros, even a baby one, suggests that these wolves are perhaps bigger than the wolves we see today,' Wales wrote in a statement. The researchers also analyzed tiny plant remains fossilizing in the cubs' stomachs, revealing that the wolves lived in a dry, somewhat mild environment that could support diverse vegetation including prairie grasses, willows and shrub leaves. In addition to eating solid food, the pups were likely still nursing milk from their mother, according to the researchers. What scientists didn't find was evidence that mammoths were part of the cubs' diet, meaning it was unlikely that humans at the site were feeding the canids. Is it possible, though, that people shared woolly rhino meat with the cubs? That's something Wales considered, but now he believes the evidence points in the other direction. 'We know that modern wolves will hunt large prey like elk, moose and musk ox, and anyone who watches animal documentaries will know wolves tend to single out juvenile or weaker individuals when they hunt,' Wales wrote in an email. 'I lean toward the interpretation that the Tumat Puppies were fed part of a juvenile wooly rhino (by adult wolves).' The origin of the woolly rhino meat is impossible to pinpoint — the wolf pack could have hunted the calf or scavenged it from a carcass or even a butchering site — but given the age of the cubs and the fact that the den collapsed on them, it seems less likely that humans fed them directly, Runge said. That the cubs were being reared in a den and fed by their pack, similarly to how wolves breed and raise their young today, further suggests that the Tumat Puppies were wolves rather than dogs, Wales said. Painting a broader picture of ice age wolves is difficult because no written sources or cave art depicting them have been found, so it is unclear how wolf packs and ancient humans would have interacted, Runge said. 
'We have to try to account for our own biases and preconceived notions based on human-wolf interactions today,' she wrote. 'And then we have to be okay with knowing we'll never be able to answer some of the questions.' Researchers are still trying to understand how domesticated dogs became companions to humans. One hypothesis is that wolves lived near humans and scavenged their food. But the domestication process would take generations and require humans to tolerate this behavior. Another hypothesis is that humans actively captured and hand-raised wolves, causing some of them to become isolated from wild populations, resulting in early dogs. Previous DNA tests on the cubs suggested they could have come from a now extinct population of wolves that eventually died out — and a population that did not act as a genetic bridge to modern dogs. 'When we're talking about the origins of dogs, we're talking about the very first domesticated animal,' Wales said. 'And for that reason, scientists have to have really solid evidence to make claims of early dogs.' All the evidence the authors of the new study found was compatible with the wolves living on their own, Wales said. 'Today, litters are often larger than two, and it is possible that the Tumat Puppies had siblings that escaped (the same) fate,' he said. 'There may also be more cubs hidden in the permafrost or lost to erosion.' Pinpointing where and when dogs were domesticated is still something of a holy grail in archaeology, evolutionary biology and ancient DNA research, said Dr. Linus Girdland-Flink, a lecturer in biomolecular archaeology at the University of Aberdeen in Scotland. Though Girdland-Flink's research is on ancient wolves and dogs, he was not involved in the new study. But determining whether ancient remains like the Tumat Puppies are early domestic dogs, wild wolves, scavengers or tamed individuals isn't straightforward because of the fragmented archaeological record, he said. 
No one piece of evidence can lead to a definitive answer. And it's even harder to do a comparison involving cubs because adult traits help distinguish between wild wolves and domesticated dogs. 'Instead, we have to bring together different lines of proxy evidence — archaeological, morphological, genetic, ecological — and think about how they all fit,' Girdland-Flink wrote in an email. 'So, I really welcome this new multi-disciplinary reanalysis of the Tumat puppies.' Girdland-Flink wasn't surprised the cubs weren't associated with the mammoth butchering site — an absence of evidence that matters. And combined with the lack of strong genetic ties to domestic dogs, he agreed the cubs must have come from a wolf population that did not live with humans.
Yahoo
Using AI makes you stupid, researchers find
Artificial intelligence (AI) chatbots risk making people less intelligent by hampering the development of critical thinking, memory and language skills, research has found. A study by researchers at the Massachusetts Institute of Technology (MIT) found that people who relied on ChatGPT to write essays had lower brain activity than those who used their brain alone. The group who used AI also performed worse than the 'brain-only' participants in a series of tests. Those who had used AI also struggled when asked to perform tasks without it. 'Reliance on AI systems can lead to a passive approach and diminished activation of critical thinking skills when the person later performs tasks alone,' the paper said. Researchers warned that the findings raised 'concerns about the long-term educational implications' of using AI both in schools and in the workplace. It adds to a growing body of work suggesting that people's brains switch off when they use AI. The MIT study monitored 54 people who were asked to write four essays. Participants were divided into three groups. One wrote essays with the help of ChatGPT, another used internet search engines to conduct research and the third relied solely on brainpower. Researchers then asked them questions about their essays while performing so-called electroencephalogram (EEG) scans that measured activity in their brains. Those who relied on ChatGPT, a so-called 'large language model' that can answer complicated questions in plain English, 'performed worse than their counterparts in the brain-only group at all levels: neural, linguistic, scoring', the researchers said. The EEG scans found that 'brain connectivity systematically scaled down with the amount of external support' and was weakest in those who were relying on AI chatbots to help them write essays. The readings in particular showed reduced 'theta' brainwaves, which are associated with learning and memory formation, in those using chatbots.
'Essentially, some of the 'human thinking' and planning was offloaded,' the study said. The impact of AI contrasted with the use of search engines, which had relatively little effect on results. Of those who had used the chatbot, 83pc failed to provide a single correct quote from their essays, compared with around 10pc of those who used a search engine or their own brainpower. Participants who relied on chatbots were able to recall very little information about their essays, suggesting either that they had not engaged with the material or that they had failed to remember it. Those using search engines showed only slightly lower levels of brain engagement than those writing without any technical aids, and similar levels of recall. The findings will fuel concerns that AI chatbots are causing lasting damage to our brains. A study by Microsoft and Carnegie Mellon, published in February, found that workers reported lower levels of critical thinking when relying on AI. The authors warned that overuse of AI could leave cognitive muscles 'atrophied and unprepared' for when they are needed. Nataliya Kosmyna, the lead researcher on the MIT study, said the findings demonstrated the 'pressing matter of a likely decrease in learning skills' in those using AI tools when learning or at work. While the AI-assisted group was allowed to use a chatbot in their first three essays, in their final session they were asked to rely solely on their brains. The group continued to show lower memory and critical thinking skills, which the researchers said highlighted concerns that 'frequent AI tool users often bypass deeper engagement with material, leading to 'skill atrophy' in tasks like brainstorming and problem-solving'. The essays written with the help of ChatGPT were also found to be homogenous, repeating similar themes and language.
Researchers said AI chatbots could increase 'cognitive debt' in students and lead to 'long-term costs, such as diminished critical inquiry, increased vulnerability to manipulation, decreased creativity'. Teachers have been sounding the alarm that pupils are routinely cheating on tests and essays using AI chatbots. A survey by the Higher Education Policy Institute in February found that 88pc of UK students were using AI chatbots to help with assessments and learning, and that 18pc had included AI-generated text directly in their work. OpenAI, the developer of ChatGPT, was contacted for comment.