A New Study Reveals The #1 Sleep Mistake That Harms Brain Health—And It Has Nothing To Do With Your Bedtime
The study found that 'long sleepers' were more likely to report symptoms of depression and worse cognitive performance.
Here's what to know about how much sleep you actually need, with insight from experts.
When you're struggling to get the recommended seven-plus hours of sleep each night, logging anything more than that sounds like a dream come true. But new research suggests that there is actually a sleep sweet spot you should aim for—and that making sure you don't sleep *too* much could affect how well your brain works.
The study, which was published in the journal Alzheimer's & Dementia, found that sleeping too much was linked with worse cognitive performance. Here's why, what the tipping point was, and how to figure out the best amount of sleep for you.
Meet the experts: Vanessa Young, MS, lead study author and clinical research project manager at the Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases at UT Health San Antonio; W. Christopher Winter, MD, a neurologist and sleep medicine physician with Charlottesville Neurology and Sleep Medicine and host of the Sleep Unplugged podcast
For the study, researchers analyzed cognition and sleep-duration data from more than 1,800 people without dementia who participated in the Framingham Heart Study, a community-based cohort study of residents of Framingham, Massachusetts. Participants ranged in age from 27 to 85.
The researchers found that people who slept for nine hours or more a night had worse cognitive performance. That was especially pronounced in participants with depression, regardless of whether they used antidepressants.
The researchers also discovered that so-called 'long sleepers' were more likely to report symptoms of depression and that sleep might be a modifiable risk factor for cognitive decline in people who have depression.
This isn't the first study to find a link between sleeping for longer periods and lower cognitive performance. 'Regularly sleeping more than nine hours a night has been linked to lower cognitive performance in some studies—including ours,' says Vanessa Young, MS, lead study author and clinical research project manager at the Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases at UT Health San Antonio.
There is a 'J curve' relationship between sleep and health, points out W. Christopher Winter, MD, a neurologist and sleep medicine physician with Charlottesville Neurology and Sleep Medicine and host of the Sleep Unplugged podcast. In other words, more sleep isn't always better. 'Generally, the best health outcomes in adults are at seven hours,' he says.
As for why that is, Young says sleeping for longer periods of time is likely 'a sign that something else is happening beneath the surface.' That could mean vascular issues, depression, changes in brain health, or something else, she says.
'While we can't say for certain whether longer sleep leads to worse cognition—or if people with emerging cognitive issues start sleeping more—our findings suggest that unusually long sleep might be worth paying attention to, especially if it's a change from your normal routine,' Young says.
First of all, everyone is different and requires different amounts of sleep, but research generally suggests that getting between seven and nine hours of sleep a night is best for cognitive performance.
Still, Dr. Winter stresses that sleep needs are individual and it's a good idea to pay attention to certain elements of your sleep routine to see what your needs are. He suggests looking at how long it takes for you to conk out at night, along with how you feel during the day.
'If it takes a while to fall asleep, you might be seeking too much time in bed,' Dr. Winter says. 'But if you struggle to stay awake during the day or fall asleep rapidly at night, you may not be sleeping enough.'
But Young says you shouldn't automatically assume that more sleep is better. 'Like many things in health, balance is important—and sleep may be just one part of a larger puzzle when it comes to brain health,' she says.
Related Articles


Time Magazine, 30-07-2025
How Bureaucracy and Budgets Shape American Medical Research
President Donald Trump's proposed budget for the 2026 fiscal year includes drastic cuts to the National Institutes of Health (NIH) budget, sparking alarm among many. While the level of proposed cuts is unprecedented, calls for efficiency are nothing new. In fact, they echo decades-old efforts to make publicly funded science more accountable. What has gone largely unnoticed, however, is how these reforms reshaped how NIH research is managed—as well as the very definition of what counts as rigorous, worthwhile health research in the first place.

Even as the NIH's budget soared over the past half-century, much of that growth came at a price: a narrowing of the agency's scientific imagination. Driven by bureaucratic reforms and the need to demonstrate fiscal responsibility, the NIH gradually shifted away from large, community-based, longitudinal studies aimed at understanding what keeps people healthy. Instead, it prioritized smaller, faster studies that produced statistical significance and quantifiable data, but far less explanatory power about how to stay healthy.

In the late 1950s, the NIH was beginning to expand its mission to address chronic ailments like heart disease and cancer. These growing health threats required a fundamentally different kind of science—slower, more complex, and deeply embedded in communities. Early NIH leaders, such as James Shannon, embraced this challenge with a bold vision: government-led, multi-site observational studies tracking large populations over decades. The Framingham Heart Study, launched in 1948, embodied this approach. It aimed to enroll over 5,000 healthy residents of Framingham, Mass., and follow them for at least 20 years to understand how lifestyle factors and social context shaped long-term health outcomes.

Over the next decade, the NIH became the de facto institution for carrying out this sort of bold population-based investigation into health and disease. But as the 1960s progressed, this vision ran afoul of a growing government-wide push for budgetary control. Reforms like Planning, Programming, Budgeting, and Execution and Zero-Base Budgeting demanded that all federal agencies and initiatives define outcomes in advance and justify expenses with quantifiable projections. Large-scale observational studies—by their very nature exploratory, slow, and expensive—were easy targets for government watchdogs obsessed with efficiency. For example, the Wooldridge Committee, a task force appointed by President Lyndon B. Johnson's Office of Science and Technology to review the federal research enterprise, sharply criticized the NIH in 1965 for failing to provide adequate oversight of its biggest studies. The committee warned that scientific freedom could no longer excuse a lack of fiscal discipline.

The NIH responded not by defending the long arc of discovery required for understanding the causes of chronic disease, but by adapting. Researchers were asked to project statistical returns on investment. Studies were re-evaluated not just for scientific merit, but for how likely they were to generate measurable results within a budget cycle. Framingham, once a flagship of public health research, was deemed too open-ended. By 1970, it had lost its privileged status and instead had to compete for grants like any university-based project. This shift marked an institutional pivot away from NIH-led, community-grounded studies and toward a more manageable model of research.
During this time, the NIH also shelved several other large, prospective population-based studies of health and disease, including the Diet-Heart Study—an ambitious effort to definitively test the role of high-fat diets in causing heart disease. In their place, a new framework for investigating chronic diseases emerged, one built around smaller, investigator-initiated grants awarded to outside researchers. These grants, and the peer-review process that governed their approval, increasingly relied on the tools of biostatistics to demonstrate methodological rigor and fiscal discipline.

From an administrative perspective, these outside projects were easier to justify: they were shorter in duration, cleaner in design, and more narrowly focused. Politically, they were appealing too—distributed across universities in different congressional districts, they helped spread NIH funding across the country. By encouraging investigators to design studies with tightly defined objectives, measurable outcomes, and clear statistical models, the NIH was able to present its growing budget as aligned with the broader federal push for transparency and accountability. In effect, the agency avoided deeper scrutiny by embedding oversight expectations into the very structure of scientific inquiry. By doing so, it created the conditions for its outside grant program to flourish.

Yet this shift also produced a subtle but profound change in the kinds of questions NIH research was designed to answer. Rather than pursuing the fundamental causes of health and disease, the population-based investigations that received NIH grants examined discrete, isolated lifestyle factors and their relative impact on specific conditions—an approach that has come to be known as risk factor epidemiology. In the case of heart disease, this involved studies of how certain foods affected conditions commonly associated with heart disease, especially high cholesterol, high blood pressure, and elevated body mass index. And while these investigations yielded a flood of peer-reviewed publications and some effective interventions at the individual level, they also left crucial questions unanswered. After decades of risk factor research, for example, we still do not fully understand the causes of heart disease—or how best to prevent it.

In the decades since, many investigators and commentators have criticized the dominance of risk factor epidemiology. Critics include Gary Taubes, a science journalist known for his writing on nutrition science and the history of dietary guidelines, and John Ioannidis, a Stanford researcher who has long argued that most epidemiological studies of nutrition are limited in scope and contradictory. They and others contend that risk factor–driven research has led to public health guidance built on fragile associations and patterns in data that do not reflect causality. These critics often point to the decades-long emphasis on reducing dietary fat to lower cholesterol and prevent heart disease as problematic. This advice led many Americans to adopt low-fat, high-carbohydrate diets—an eating pattern now linked to obesity, diabetes, and, ironically, heart disease. Today, many health experts and institutions have reversed course, encouraging the consumption of healthy fats and warning against excess sugar and refined carbohydrates.
The result has been public confusion, eroded trust in nutrition science, and a generation of health advice that, in retrospect, may have done more harm than good.

These studies have flourished since the 1970s not because they promised definitive answers on how to stay healthy, but because they appeared to offer a clear return on investment. Their study designs were statistically rigorous and focused on narrowly defined variables and outcome measures, which enabled these projects to routinely yield statistically significant results for the questions they were designed to answer. That gave policymakers and funders the impression that public dollars were driving scientific progress, even as it provided few answers to the biggest scientific questions. Ironically, it was the promotion of this particular style of research—narrow in scope, statistically precise, and managerially friendly—that helped the NIH expand its budget and reach. But the accumulation of these rigorous, but smaller-in-scope, findings rarely translated into an applicable understanding of the complex, long-term, and interconnected forces that truly shape health.

Today, as the NIH again faces oversight and budget pressures, the American scientific establishment has a chance to course-correct. The current administration has emphasized health promotion and the importance of diet. But if those goals are to be more than talking points, President Trump, Congress, and the NIH must be willing to invest in the kind of science that can actually reveal what keeps us well. That means returning to community-based, long-term observational studies—even if they are expensive, even if they take decades, and even if they do not fit neatly into the bureaucratic logic of annual performance metrics.

Sejal Patel-Tolksdorf is a health policy analyst and former chief research historian at the National Institutes of Health. Her work focuses on the politics and policy of American health research.

Made by History takes readers beyond the headlines with articles written and edited by professional historians. Learn more about Made by History at TIME. Opinions expressed do not necessarily reflect the views of TIME editors.
Yahoo, 30-07-2025
NYC Gunman Blamed This Rare Brain Disease For His Mental Illness
On Monday night, a gunman killed four people and himself in Manhattan in an attack that reportedly targeted the NFL's headquarters in the city. The gunman, Shane Tamura, left behind a note in which he said he'd been suffering from chronic traumatic encephalopathy, or CTE, as a result of playing football. He had asked for his brain to be studied as part of CTE research.

The discourse about football's impact on the brain is not new to the NFL. Both players and their families have sued the league time and time again over brain damage and its effects on players' post-career lives.

CTE is a 'progressive neurodegenerative disease,' explained Dr. Jeremy Tanner, an assistant professor of neurology at the Glenn Biggs Institute for Alzheimer's and Neurodegenerative Diseases at UT Health San Antonio. Research shows that football players, along with other people who participate in high-contact sports and activities, are more likely to develop the disease. A study out of Boston University's CTE Center found that 40% of athletes under 30 had developed early signs of the disease upon their death. Another BU study found that roughly 91% of studied NFL players had the disease when they died.

Experts told HuffPost that CTE research is ongoing, but there are clear links between head injuries, behavioral changes and the disease itself. Here's what to know:

What is CTE?

Like other brain diseases, such as Alzheimer's disease or Parkinson's disease, CTE affects how we think, behave, move and 'really anything that the brain is responsible for,' said Dr. Daniel H. Daneshvar, the co-director of the Mass General Brigham Sports Concussion Clinic.

'It occurs in patients that have sustained traumatic brain injury, so blows to the head that may or may not have resulted in concussions,' said Dr. Aaron S. Lord, the chief of neurology and program director for clinical research at NYU Langone Hospital–Brooklyn. The more head injuries that occur, the higher the risk of developing chronic traumatic encephalopathy, Lord added.

The prevailing thinking used to be that concussions increased CTE risk, but this isn't the case, added Daneshvar. 'Concussions themselves aren't what drives CTE risk. It's a repeated traumatic brain injury to the tune of hundreds or thousands over the course of decades that... increases the risk of someone having CTE,' he said, adding that the number of traumatic brain injuries also affects the severity of the disease.

Can someone be diagnosed with CTE?

You can't walk into a doctor's office and get a diagnosis of CTE. Right now, it can only be diagnosed after death, during an autopsy, said Lord. That means that people who may or may not have CTE can still get a gun license, a fact that can get lost in the discourse on social media about Tamura carrying the weaponry he had. The New York Police Department reported he had a 'documented mental health history,' but it's unclear whether this would restrict his gun access.

People who play contact sports are at higher risk — but any activity that involves frequent head impact is a risk factor.

Specifically, CTE has been identified in players who engage in contact sports such as football, rugby, hockey and rodeo, said Tanner, adding that it's also been seen in soccer players. 'And it seems... the more total years played, the higher the risk,' Tanner said. The disease often develops over time, after someone stops playing the sport or the head trauma ends, explained Tanner.
Daneshvar added that 'we've also seen [it] in individuals who experience intimate partner violence, in individuals who serve in the military.' Again, Daneshvar notes, the more often head injury happens, the higher the risk of CTE. 'Famously, in the literature, there was a circus clown who was repeatedly shot out of a cannon and who was found to have CTE,' he said.

CTE can cause behavioral changes and memory problems.

'Chronic traumatic encephalopathy [is] typically associated with changes in cognition and in behavior,' said Tanner. 'In behavior, a common symptom is what's called neurobehavioral dysregulation. For some people, this can present as a shortened fuse or more irritability or agitation,' Tanner said. For others, this can mean paranoia, aggression, impulse control issues and trouble regulating emotions, he added.

Neurobehavioral dysregulation tends to be more common in 'those affected by the disease in younger stages,' Tanner noted. 'Additionally, chronic traumatic encephalopathy is associated with changes in memory and with executive function, particularly planning, organization, multitasking... managing information and integrating it.' 'And those symptoms seem to be more common in older adults with the disease,' he added.

It's important to note that a lot of these symptoms can also be signs of other neurological disorders or mental health conditions, said Tanner. Ultimately, individuals with a CTE diagnosis had problems with thinking, memory and behavior, said Daneshvar. But, once again, these issues can be related to a multitude of other causes. 'We can't say for sure what clinical signs someone presents with are related to CTE pathology versus something else because humans are complex. We have a lot of different reasons for the way we behave,' said Daneshvar.

If you notice mental health changes, memory issues or other neurological problems, Tanner advises seeing a specialist for an evaluation. 'It's often a neurologist or a psychiatrist or a sports medicine specialist who has expertise in evaluating those with repetitive head impacts in sports,' Tanner said. 'It can be hard to distinguish what's the primary cause, and so looking at the number of years of head impact exposure can be a clue that there could be an increased risk for CTE.'

Seeking medical attention for any neurological changes is essential, whether you're dealing with CTE or not. 'I see individuals who have histories of repeated traumatic brain injuries and are experiencing problems now, and I can't say with certainty whose problems are related to CTE versus not... but what I can do, and what I do every day, is treat them, and our treatments for people's symptoms are successful,' said Daneshvar. While there is no cure for CTE, doctors can still help, Daneshvar noted. 'I think that's a really important message, too. I think people think that CTE is some incurable, immovable thing, right? And it is treatable.'

CTE has been linked to violence in some cases, but not always.

'In some cases, [CTE] has been linked to violent and aggressive behaviors,' Tanner said. One of the most talked-about cases of CTE is that of Aaron Hernandez, a deceased former football player who was convicted of murder. 'The short fuse, the impulse control. One way to think about it could be when you're playing sports, you can turn [your aggression] on and off. You lose that ability to control the 'on and off' switch you use to regulate your aggression when on the field and off the field,' Tanner explained.
There are some things you can do to lower your risk of CTE.

As mentioned above, people who take part in particular sports or activities, such as football and rugby, are at higher risk of CTE. But there are a few habits that can help protect your brain. Lord added that wearing a helmet — whether that's on a bike ride or while playing football — is an important way to protect yourself. Tanner suggests playing flag or touch football instead of tackle.

'We have a lot more to learn about how to prevent this disease,' Tanner said. 'What I would suggest is trying to minimize, as much as [you're] able, head contact and head impacts.'

'For former football players and others, there's the new Diagnose CTE study that's really looking at trying to understand how we can identify these symptoms during life and better understand this disease,' Tanner said. The study is actively recruiting former football players to learn about the unknowns of the disease.

If you or someone you know needs help, call or text 988 or chat for mental health support. Additionally, you can find local mental health and crisis resources. Outside of the U.S., please visit the International Association for Suicide Prevention.
Yahoo, 29-07-2025
Park Avenue shooter Shane Tamura's claims put NFL's CTE problem in spotlight
NEW YORK — The NFL's long history with CTE has come under new attention after Monday's deadly shooting at the Manhattan office building that houses the league's headquarters. The gunman, identified as Shane Tamura, was said to be in possession of a note claiming he had CTE, a neurodegenerative disease, and asking for his brain to be studied. 'You can't go against the NFL,' the note read, police sources told the Daily News. 'They'll squash you.'

The rifle-wielding shooter killed a police officer and three others after he entered the tower at 345 Park Ave. in Midtown, officials said. Tamura, who later took his own life, was believed to have been targeting the NFL offices but went to the wrong elevator, Mayor Eric Adams said Tuesday in interviews with MSNBC and PIX11.

CTE, or chronic traumatic encephalopathy, is linked to repeated head trauma but cannot be diagnosed without a postmortem brain autopsy. Tamura, 27, never played in the NFL but played football when he was younger. 'Please study brain for CTE. I'm sorry,' Tamura's note read, according to the police sources. 'The league knowingly concealed the dangers to our brains to maximize profits. They failed us.'

CTE has repeatedly been at the forefront of discussions involving NFL player safety, with recent studies further illuminating concerns about the brain disease. In 2023, Boston University diagnosed 345 former NFL players with CTE out of the 376 who were studied, or 91.7%. 'For comparison, a 2018 Boston University study of 164 brains of men and women donated to the Framingham Heart Study found that only 1 of 164 (0.6 percent) had CTE,' read the Boston University study. 'The lone CTE case was a former college football player.'

Last year, a Harvard University study determined that about one-third of the nearly 2,000 former NFL players included in the research believed they had CTE. 'According to the study, players who believed they had CTE reported significantly more cognitive problems and a higher proportion of low testosterone, depression, mood instability, headaches, chronic pain, and head injury compared with those who did not have concerns about CTE,' read the Harvard report.

Several cases have been particularly high-profile. In 2011, former NFL safety Dave Duerson sent a message to his family asking to have his brain studied, then died of a self-inflicted gunshot wound to the chest. Boston University researchers determined Duerson, an 11-year NFL veteran who won Super Bowl XXV with the Giants, had CTE. In 2012, retired 12-time Pro Bowl linebacker Junior Seau fatally shot himself in the chest. His brain was also studied, and his family later revealed that Seau had been diagnosed with CTE, citing research by the National Institutes of Health. And in 2017, the director of Boston University's CTE Center said former New England Patriots tight end Aaron Hernandez suffered from a severe case of CTE before hanging himself inside a Massachusetts prison cell. Hernandez was serving a life sentence for the 2013 shooting of Odin Lloyd. Days before Hernandez's death, he had been acquitted in a 2012 double homicide.

About a decade ago, the NFL reached a concussion settlement with former players. As of October, the league said it had paid more than $1.2 billion to more than 1,600 former players and their families, according to Boston University. Last season, the NFL said it recorded its fewest concussions since it began tracking the data in 2015.
Last year's total was down by 17% from 2023, with the league pointing to improved helmets and its new dynamic kickoff rule. 'Today is an important milestone but not the end of our work,' Jeff Miller, the NFL executive vice president overseeing player health and safety, said in February. 'Through improved equipment, rules modifications and a continued culture change, we will make the game safer and more exciting.'

NFL Commissioner Roger Goodell confirmed Tuesday in a message to staffers that one league employee was 'seriously injured' in the shooting. Goodell did not name the victim but said he was hospitalized in stable condition. A report by The Athletic identified the wounded NFL employee as Craig Clementi, who works in the league's finance department, and said he continued to make calls urging his coworkers to evacuate the building even after he was struck in the back by a bullet.

Goodell encouraged New York-based employees to work from home on Tuesday and said it would be understandable to take the day off. 'Every one of you is a valued member of the NFL family,' Goodell wrote. 'We will get through this together.'