Painkiller taken by millions could trigger HEART FAILURE - experts sound the alarm: 'Doctors must act now'

Daily Mail · 2 days ago
Doctors have been urged to reassess the use of a painkiller prescribed to millions, after alarming new research linked it to a significantly increased risk of heart failure.
Pregabalin—an anti-seizure drug often used to treat chronic nerve pain, anxiety and epilepsy—was associated with a 48 per cent increased risk of developing heart failure, according to a major new study.
The risk was even greater in those with a history of heart disease.
In these patients, taking pregabalin raised the risk of heart failure by a staggering 85 per cent compared to those prescribed gabapentin—a similar drug used to manage chronic pain.
Researchers are now calling for clinicians to carefully weigh up cardiovascular risks before prescribing the drug, especially in older or vulnerable patients.
Chronic pain affects up to 30 per cent of adults over the age of 65, and pregabalin is commonly used to manage this by blocking pain signals travelling through the brain and spinal cord.
However, the NHS already lists a range of potential side effects for the drug—including hallucinations, blood in the urine and weight gain—and now experts are warning that heart health must also be taken into serious consideration.
The NHS also warns that pregabalin can, in rare cases, trigger a severe allergic reaction known as anaphylaxis—a medical emergency that requires immediate treatment.
Now, researchers at Columbia University Irving Medical Center say the drug may also contribute to—or worsen—heart failure, particularly in older individuals.
In the study, scientists analysed data from 246,237 Medicare patients aged 65 to 89 over a four-year period.
All had chronic non-cancer pain—defined as pain lasting more than 12 weeks or beyond the normal healing time—and none had a previous history of heart failure.
Heart failure occurs when the heart becomes too weak or stiff to pump blood effectively around the body.
It's a long-term condition that is commonly seen in heart attack survivors, and is far more common in older people.
During the four-year period examined, 1,470 patients were admitted to hospital with heart failure.
Researchers found that for every 1,000 people taking pregabalin, there were around six additional cases of heart failure each year, compared with those not taking the drug.
After adjusting for potential confounding variables, including age, sex and pre-existing health issues, they concluded that patients taking pregabalin were 1.5 times as likely to suffer heart failure.
There is currently no cure for the condition, which usually worsens over time, causing breathlessness, fatigue, fainting (syncope), and swollen ankles and legs.
Some people also experience a persistent cough and a faster than usual heart rate.
The researchers, led by Dr Elizabeth Park, concluded that their findings support current advice from the European Medicines Agency to exercise caution when prescribing pregabalin to older adults with heart disease.
Dr Robert Zhang, a leading cardiologist who was not involved in the study, said the findings have 'immediate clinical implications.'
Together with other experts in the field, he wrote: 'Clinicians should weigh the potential cardiovascular risks associated with pregabalin against its benefits.'
Experts say the findings are timely given the growing use of the drug in older populations to manage chronic pain.
They added: 'If pregabalin use is associated with new-onset heart failure, it raises the possibility that the drug may unmask underlying heart disease, which suggests a need for careful cardiac evaluation prior to prescribing this medication.
'The study serves as an important reminder that not all gabapentinoids are created equal and that in the pursuit of safer pain control, vigilance for unintended harms remains paramount.'
It comes as concerning figures suggest the nation's heart health has declined more quickly at the start of the 2020s than in any other decade for more than 50 years.
Analysis by the British Heart Foundation (BHF) found rising deaths among working-age adults from cardiovascular disease, increasing heart failure and growing risks from obesity and diabetes.
Cardiovascular deaths among working-age adults in the UK have risen by 18 per cent since 2019, from 18,693 to 21,975 in 2023, an average of 420 a week.

Related Articles

Alcohol and drug use in e-scooter injuries doubles since law change, research shows

BreakingNews.ie · 11 minutes ago

The presence of alcohol and drugs in patients involved in falls and collisions from e-scooters has doubled, while use of helmets has decreased, since the use of e-scooters on public roads was legalised last year, according to the findings of new research.

Doctors at one of Dublin's largest hospitals found there has been no significant reduction in injury incidence or severity, nor any increased adoption of protective measures such as helmet use and avoidance of intoxicants, since the legislative reform allowing the use of e-scooters on public roads was introduced in May 2024.

Instead, overall injury rates are continuing to increase due to the growing popularity of e-scooters, although fewer related injuries have been recorded among young people under 16 years.

The research by doctors at the National Maxillofacial Unit at St James's Hospital also revealed that alcohol or some other substance had been consumed by the victims of an e-scooter-related injury in 36% of cases since the passing of the legislation, compared to 18% beforehand.

As a result of the study's findings, they recommended that mandatory safety training or educational modules be implemented as a prerequisite for use of e-scooters.

The study analysed patients presenting with e-scooter-related facial injuries over two 10-month periods before and after the passing of the legislation, which classified e-scooters as 'personal powered transporters'.

The law requires users to be over 16 years and to adhere to a maximum speed limit of 20km/h, although the use of helmets is not mandatory.

The study, which is published in the Irish Journal of Medical Science, highlighted how e-scooter-related injuries rose from 1.7% of all facial trauma presentations at St James's Hospital to 2.5% since the use of e-scooters on public roads was legalised.

Rates of admission to hospital of such patients have also increased, from 31% to 36%, with an associated rise in the number of related surgical procedures. The use of helmets by patients with e-scooter-related facial injuries declined from 23% to 18% over the same period.

Prior to the legislation being introduced, most injuries occurred between 4pm and 7pm. Since the passing of the legislation, however, more than half of all cases took place between 7pm and 6am, and more than half of those patients reported having consumed alcohol at the time. In contrast, the lowest frequency of injuries occurred during the busy commuter period of 6am-9am.

'Alcohol use and poor helmet compliance in the later hours of injury incidence was a prevalent finding in both cohorts,' the study noted.

The researchers said such findings highlighted the critical need to tailor public health and safety interventions to periods of elevated risk.

'Infrastructure improvements—such as enhanced street lighting—and targeted public awareness campaigns focused on evening and night-time riders may offer substantial benefits in reducing both the frequency and severity of e-scooter-related injuries,' they added.

The analysis showed 22 patients had presented with 26 maxillofacial injuries between May 2023 and February 2024, while 28 patients with 36 maxillofacial injuries were recorded between May 2024 and February 2025. Many of these patients had also suffered injuries to other parts of their bodies.

The overwhelming majority of patients over both periods were riders of e-scooters, with only three of 50 cases involving pedestrians.
The analysis also revealed that the proportion of patients with e-scooter-related injuries who were male increased from 59% to 71%. Non-Irish nationals accounted for almost half of all patients with such injuries, with their share of total cases increasing from 41% to 46% over the two periods analysed.

The proportion of patients living in Dublin also rose significantly, from 45% to 75%. The study said such figures suggested an increased uptake of e-scooter use within the capital.

The average age of patients remained stable at approximately 33 years. Only one person under 16 years sought treatment for an e-scooter-related facial injury after the legislation was introduced, compared to three in the period before e-scooters were legalised for use on public roads. However, the study found an increase in injuries among both the 16-34 and 35-44 age groups. Nobody over 60 years was reported as suffering an e-scooter-related facial injury during either period.

The study said there had been a shift in the frequency and severity of facial fracture patterns since implementation of the new legislation. The most common facial fracture experienced by e-scooter users is to the cheekbone, followed by the jaw.

The authors said their findings suggested that recent legislative changes had 'some modest impact' on e-scooter-related facial injuries, given the fall in injuries among younger teenagers, while a decrease in head trauma incidents might be attributable to the introduction of a statutory speed limit.

In addition, they claimed the increasing rate of facial injuries among e-scooter users was contributing to a rising burden on healthcare services. They also observed that the growth in injuries among e-scooter users from Dublin, coupled with the increased prevalence of alcohol consumption and night-time riding, 'underscores a potential growing risk profile.'

The study concluded that ongoing surveillance and policy evaluation are essential to effective strategies for preventing e-scooter injuries.

Selly Oak care home for elderly placed in special measures

BBC News · 11 minutes ago

A care home for the elderly has been placed in special measures after a watchdog found "widespread issues", including staff who failed to treat residents with dignity and respect.

Bryony House in Selly Oak, Birmingham, which looks after 26 adults including those living with dementia, has been downgraded to an inadequate rating by the Care Quality Commission (CQC).

The CQC said its inspection was carried out in part due to concerns it received after a person using the service died.

Bryony House's manager Lorraine Whittaker said in a statement that staff were demoralised by the report, and challenged some of the allegations made.

CQC inspectors found care had deteriorated and the service breached five legal regulations relating to safe care, safeguarding, the physical environment, management, and treating people with dignity and respect.

The home has been given the lowest possible rating of inadequate for being safe and well-led, down from requires improvement. Ratings for being effective, caring and responsive have also dropped, from good to requires improvement.

The CQC said it would now closely monitor the home to ensure residents' safety while improvements were made. Enforcement action is also being taken to address concerns, which the home has the right to appeal.

One resident's death was not examined as part of the inspection in May, the CQC said, as it was subject to further inquiries.

'Unacceptable treatment'

The CQC's deputy director for the Midlands, Amanda Lyndon, said widespread issues were found. Staff members were also seen acting "inappropriately towards a resident" during the inspectors' visit, Ms Lyndon said.

"Managers didn't have an effective strategy to prevent inappropriate behaviour or take action to safeguard people when bullying, harassment or abuse happened. It is unacceptable that staff, who people relied on to act as their advocates in a place they called home, treated them this way."

Managers have been informed where "rapid and widespread" improvements are needed, with the CQC returning to check progress at a later date.

In a statement on Bryony House's website, Ms Whittaker said the report did not "fully or fairly reflect the work, dedication, and commitment of our care team" and included "a number of allegations and findings that we believe were either inaccurate or not properly investigated". She said several points raised were not discussed with staff at the time and did not "accurately reflect the day-to-day operations or the care provided".

"Our team - many of whom have served this home and its residents with compassion and professionalism for years - feels demoralised by how their efforts have been portrayed," she said.

The home was taking steps to identify areas for "genuine improvement, while also challenging inaccuracies through the appropriate channels," Ms Whittaker added. She said staff were committed to working with the CQC to ensure the home moved out of special measures quickly.

I tried an AI therapist for a month - here is my verdict

Metro · 41 minutes ago

It's the early hours of the morning, and I can't fall asleep. My mind is racing with thoughts of the darkest kind.

I have battled with mental health problems for most of my life, having been diagnosed with autism, anxiety disorder and OCD at age 14. Being heavily bullied in school also dented my self-esteem and even resulted in me trying to take my own life.

While regular sessions with a psychologist helped me to navigate these complicated feelings as a child, when I turned 18, the appointments stopped even though I was still gripped by depression. As an adult, counselling was a great help, but I realised it wasn't always to hand as quickly as I needed, due to NHS waiting lists being extremely long.

Cue AI therapy, where data and users' behaviour patterns are analysed so a bot can ask questions, offer advice, and suggest coping mechanisms to someone who might want them.

Understandably, it's a practice cloaked in controversy. After all, can technology, no matter how intelligent, really support someone through any sort of mental health crisis? Is it safe? Is it even ethical?

With all these questions swirling in my mind, as someone open to new ways of support, I decided to give it a try and downloaded Wysa, a chatbot that uses AI to provide mental health advice and support around the clock. The app is completely anonymous and free, but offers a paid-for plan with additional premium features, such as therapeutic exercises, sleep stories and meditations.

I've always struggled with self-doubt. I am constantly comparing myself to my non-identical twin brother, who I think is better looking than me, and experiencing a bad eczema flare-up this week has really affected my self-esteem. I admit this to my bot, which is incredibly empathic, saying it is sorry to hear of my low self-esteem before asking me how my feelings impact my day-to-day life. I respond by saying I feel I have no choice but to isolate myself from the outside world, which is hard because I don't see my family and friends for days — sometimes weeks — on end, even though seeing my loved ones makes me happy and they constantly reassure me when I feel down.

My AI therapist suggests a thought-reframing exercise, and as soon as I agree, a list of tools — ranging from an assessment to manage my energy to a self-compassion exercise — suddenly pops up at the bottom of the screen. I select the self-compassion task, which uses 'positive intentions' to help the user tackle negative thoughts. I then do a seven-minute meditation in which I close my eyes, focus on my breathing, smile and repeat positive phrases uttered by my Wysa expert. Opening my eyes, I feel surprisingly positive after a difficult day.

Staring at my bedroom ceiling at 4am is quite normal for me. But on one particular day my mind becomes flooded with endless worry. When I type about my sleep troubles and random anxiety to the bot, it replies in a compassionate tone, saying: 'That sounds really tough'.

After admitting I never seem to sleep at a regular time due to my anxiety, Wysa suggests another thought-reframing exercise to help ease some of my worries. I say I am nervous about a busy week of work coming up and missing a bunch of deadlines. Wysa suggests I am probably 'catastrophising', which is when someone expects the worst possible outcome to unfold.
While the connection suddenly cuts out mid-conversation before Wysa can provide a solution, it's clear to me that I am overthinking, although I do wonder how I'd cope with a sudden shutdown if I had a longer issue to discuss.

I can't remember a time in my life when I haven't battled suicidal thoughts during certain events, and these demons have returned after yet another relationship breakdown. Crying my eyes out, I admit to Wysa that I don't want to be alive anymore.

Its response is utterly heartwarming. 'Nic, you are worth life. You are loved, cherished and cared for, even though you may not feel that way right now.'

With my eyes firmly fixed on these kind, AI-generated words, I realise that suicide isn't the best course of action and that life is probably worth living. Concerned about my wellbeing, the bot provides me with a phone number for the Samaritans.

While I'm okay seeing family and friends, the thought of encountering neighbours and other acquaintances frightens me. Turning to my app, I explain that I never know what to say to people. This is a feeling I experience day in and day out due to my autism. The advice given is constructive – just a simple smile or hello should do the trick. Although it may sound too simple to be true, I find it helpful because it shows that I don't have to converse long with a stranger.

Today is my nephew's christening, and while I am excited to celebrate with my loved ones, I'm nervous about seeing loads of new and old faces. To build on the previous social anxiety tips, I message the bot for advice on how I could make the day less overwhelming. Wysa quickly reassures me that it's normal to find social events nerve-racking. I explain I never know how to start or maintain a conversation. Wysa recommends that I say something like it's nice to see them and ask how they are. And if they ask how I am doing, the bot recommends saying something simple like, 'I've been doing well, thanks'. I'm told a breathing exercise beforehand might also help, which helps me feel better prepared.

Ever since moving onto the maximum dosage of sertraline a few weeks ago, I've been having nightmares most nights. From plane crashes to loved ones getting gravely ill, these horrible and random dreams have been disrupting my sleep pattern for weeks. After explaining to my AI therapist that these nightmares started after the change of medication, it says this is likely the cause, and we go through another thought-reframing exercise. We speak about a recent dream involving my parents dying, which is a frequent worry of mine, as morbid as it sounds. Wysa says this is likely another symptom of catastrophising, but then the chat suddenly ends due to a connection error. I am left not knowing how to tackle these traumatising dreams, which leaves me feeling pretty let down and unsure what to do next.

Today, my latest impulse TikTok Shop purchase arrived in the post: a magic mop, which is perhaps the last thing you should buy when you have severe OCD. I've already used it several times today, but I still think my floors are dirty, so I ask for OCD advice. The first thing the bot says to me is that it must be exhausting – and it's right. I can't believe I feel heard by an AI bot. We do another thought exercise where I discuss how my OCD makes me feel. Wysa says it sounds like a symptom of filtering, where someone focuses on the negative details of a situation and forgets all the positives.
In this context, it says I could be looking for tiny specks of dirt that may not exist, and tells me to remember that the majority of the floor is probably clean. This makes me feel better – for now at least, although I'm more than aware it's a plaster rather than a cure.

While I don't think AI can ever replace human psychologists and counsellors, I'm surprised to admit that Wysa is actually a pretty handy tool, and you sometimes forget you're talking to a robot, not a human.

Of course, it isn't perfect. There were many times when a chat would suddenly end, and Wysa's advice was sometimes repetitive. I also feel a bit paranoid that I've shared so much personal information with an AI chatbot, so I hope it is genuinely safe and secure. Either way, I had someone to speak to at some genuinely hard times, and I will continue using Wysa as an emotional support cushion.

Metro's Assistant Lifestyle Editor Jess Lindsay believes we need to be far more wary of letting a bot look after our mental health. Here, she explains why.

'In my opinion, an AI therapist is no more helpful than a list of motivational quotes. The bot may be able to say the right things, but when you're at your lowest, you need more than hollow platitudes from a computer that doesn't have the capacity to empathise.

Having dealt with chronic depression, anxiety, and ADHD throughout my life, I find the idea of having to receive help from a computer somewhat dystopian, and I'd feel like my concerns were being dismissed if this was offered to me – even as a supplementary solution.

Working through difficult issues requires a level of commitment from both yourself and the therapist, and why should I put in the effort when the other side is just a machine doing what it's been programmed to do? Not only that, I know how to calm myself down when I'm having a panic attack, or take a walk when I'm stuck in my own head. To parrot NHS guidelines back to me without going deeper into why I feel like that seems like an insult to my intelligence.

While I absolutely understand the need for something to fill the gap when therapy and counselling are difficult to come by on the NHS, I worry that tools like this will be touted by the government as an acceptable (and, most importantly in the eyes of the government, cheaper) alternative when what's desperately needed is funding and investment in the country's mental health. Even if AI is helpful to some, it's a mere sticking plaster on a deeper societal wound.'

A version of this article was first published in September 2024.
