Prehistoric diets were maggot-heavy, a new study suggests

CNN · 3 days ago
Neanderthals had a voracious appetite for meat. They hunted big game and chowed down on woolly mammoth steak as they huddled around a fire. Or so thought many archaeologists who study the Stone Age.
Fresh meat was far from the only thing on the menu, according to a growing body of research that has revealed our archaic cousins ate a varied diet that included pulses and shellfish.
Still, a chemical signature in Neanderthal remains that suggests robust meat eating — observed at higher levels than those seen in top predators such as lions and wolves — has puzzled researchers for decades. Now, new research hints at an unexpected Stone Age food.
Maggots — the larvae of flies, which hatch in and feed on decaying animal tissue — may also have been a staple of prehistoric diets, a study published Friday in the journal Science Advances suggests.
Lead author Melanie Beasley, an assistant professor of biological anthropology at Purdue University in West Lafayette, Indiana, found that a taste for maggots could explain a distinctive chemical signature detected in the bones of prehistoric humans, including Homo sapiens and Neanderthals, a species that went extinct 40,000 years ago.
The findings back up a hypothesis that had been put forward by Beasley's coauthor John Speth, an anthropologist at the University of Michigan, who has for nearly a decade argued that putrid meat and fish would have formed a key part of prehistoric diets. His work was based on ethnographic accounts of the diets of Indigenous groups, who he said found rotten meat and maggots acceptable fare.
'Not a lot of people took notice, because it was like this is an out-there idea. And there wasn't any data,' said Beasley, who heard Speth give a talk in 2017 and subsequently decided to test his hypothesis.
To understand past diets and where an animal sat in the ancient food chain, scientists study the chemical signature of different isotopes, or variants, of elements such as nitrogen or carbon, which are preserved in teeth and bones over thousands of years.
Researchers first found in the 1990s that the fossilized bones of Neanderthals unearthed in Northern Europe had particularly elevated levels of the nitrogen-15 isotope, a chemical signature that suggests their meat consumption was on par with hypercarnivores such as lions or wolves.
'Grass will have one (nitrogen) value, but then the deer that eats the grass is going to have a higher value, and then the carnivore that eats the deer is going to have an even higher value,' Beasley explained. 'So you can track nitrogen through this trophic food web system.' Neanderthal remains, she said, had even higher nitrogen values than carnivores.
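The stepwise enrichment Beasley describes can be sketched numerically. As a rough illustration, and assuming the textbook figure of roughly 3 to 4 per mille enrichment in nitrogen-15 per trophic step (a standard value from isotope ecology, not a number taken from the study):

```python
# Illustrative sketch of trophic nitrogen-15 enrichment.
# All values are hypothetical textbook approximations, not data from the study.
TROPHIC_ENRICHMENT = 3.5  # per mille (permil) increase in delta-15N per trophic step


def delta15n(plant_baseline: float, trophic_level: int) -> float:
    """Expected delta-15N at a given number of trophic steps above a plant baseline."""
    return plant_baseline + TROPHIC_ENRICHMENT * trophic_level


grass = 2.0                 # hypothetical plant baseline, in permil
deer = delta15n(grass, 1)   # herbivore eating the grass
wolf = delta15n(grass, 2)   # carnivore eating the deer

print(f"grass: {grass:.1f}, deer: {deer:.1f}, wolf: {wolf:.1f} (permil)")
```

In this toy ladder, each consumer sits a fixed increment above its food; the puzzle the article describes is that Neanderthal bones sit even higher than the carnivore rung.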
This was puzzling, however, because modern-day humans, unlike wolves and lions, cannot stomach large quantities of lean meat. Overindulging in it can lead to a potentially lethal form of malnutrition in which the liver fails to break down the protein and rid the body of excess nitrogen.
Known today as protein poisoning, the condition afflicted European explorers of North America, who dubbed the illness 'rabbit starvation' or 'mal de caribou,' since the wild game they relied on was far leaner than today's farmed meat. Archaeologists believe that Neanderthals understood the importance of fatty nutrients and, at least at one site in what's now Germany, processed animal bones on a large scale to extract the fat.
Rotten meat might be higher in nitrogen than fresh tissue and may have been responsible for boosting nitrogen levels in Neanderthal bones, Speth's research has suggested.
Not long after hearing Speth speak, Beasley, who was previously a postdoctoral fellow at the University of Tennessee, Knoxville, where she conducted research at its Forensic Anthropology Center, decided to investigate. The research facility, sometimes described as a body farm, was established to study how the human body decomposes.
There, she analyzed nitrogen levels in the rotting tissue of donated human corpses left outdoors and the fly larvae that formed in the muscle tissue. The work, conducted over a two-year period, required a strong stomach, she said.
Beasley found that nitrogen levels increased modestly over time in the human tissue. However, she observed much higher nitrogen levels in the fly larvae, suggesting that Neanderthals and early modern humans likely consumed animal meat laced with maggots on a regular basis.
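That inference can be framed as a simple two-source mixing calculation. The sketch below uses made-up numbers, not the study's measurements: if larvae carry a much higher delta-15N than the meat they infest, even a modest dietary fraction of maggots pulls the consumer's signature above what meat alone would produce.

```python
# Two-source linear mixing sketch: a consumer's delta-15N modeled as a
# weighted average of fresh meat and maggots, plus one trophic enrichment step.
# All numbers are hypothetical, chosen only to illustrate the reasoning.


def consumer_delta15n(meat_d15n: float, maggot_d15n: float,
                      maggot_fraction: float, enrichment: float = 3.5) -> float:
    """Expected consumer delta-15N for a diet mixing meat and maggots."""
    diet = (1 - maggot_fraction) * meat_d15n + maggot_fraction * maggot_d15n
    return diet + enrichment


meat, maggots = 6.0, 20.0  # permil; larvae assumed far more enriched than tissue
for frac in (0.0, 0.1, 0.3):
    print(f"{frac:.0%} maggots -> {consumer_delta15n(meat, maggots, frac):.1f} permil")
```

Even at a 10% maggot fraction in this toy model, the consumer's value climbs well past the meat-only case, which is the shape of the argument Beasley and Speth make.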
'I started getting the (nitrogen) values back, and they were just astronomically high,' Beasley recalled.
'John (Speth) and I started talking: What if it's not just the putrid meat, but it's the fact that … they're never going to be able to prevent flies from coming and landing on the meat, and so fly larva just become part of the delicacy,' she said.
The data from her work not only provides insight into the Neanderthal diet but also informs modern forensic science, with nitrogen levels in maggots that form in human corpses helping scientists pinpoint time since death, she noted.
It was a 'no brainer' that Neanderthals ate maggots, said Karen Hardy, a professor of prehistoric archaeology at the University of Glasgow in Scotland.
Hardy, who wasn't involved in the study, said the authors provided a 'strong argument in favor of maggot consumption,' although such behavior is unlikely to be conclusively proven because maggot remains do not survive in the archaeological record.
'The surprise element is more to do with our Western perspective on what is edible and what is not,' she added.
Today, at least 2 billion people worldwide are estimated to consume insects as part of traditional diets, according to the United Nations Food and Agriculture Organization.
The study also noted that, according to historical accounts, many Indigenous peoples such as the Inuit 'viewed thoroughly putrefied, maggot-infested animal foods as highly desirable fare, not starvation rations.' Many such groups, according to the study, 'routinely, often intentionally, allowed animal foods to decompose to the point where they were crawling with maggots, in some cases even beginning to liquify, and inevitably emitting a stench so overpowering that early European explorers, fur trappers, and missionaries were sickened by it.'
Knud Rasmussen, a polar explorer from Greenland, recorded the following culinary experience, cited in the study, in his 1931 book 'The Netsilik Eskimos: Social Life and Spiritual Culture.'
'The meat was green with age, and when we made a cut in it, it was like the bursting of a boil, so full of great white maggots was it. To my horror my companions scooped out handfuls of the crawling things and ate them with evident relish. I criticised their taste, but they … said, not illogically: 'You yourself like caribou meat, and what are these maggots but live caribou meat? They taste just the same as the meat and are refreshing to the mouth.'
The study also pointed out that maggots are not unknown in Western culinary traditions: the Sardinian cheese casu marzu, for example, is replete with the larvae of cheese skipper flies.
Beasley said that groups at northern latitudes still process these foods today and consume them safely when prepared following traditional practices.
Beasley's research on modern-day corpses was exploratory and had several limitations, she cautioned.
The work, which involved small sample sizes, focused on human muscle tissue, not the tissue or organs of animals that might have been hunted by Neanderthals. What's more, the fly larvae, which came from three different families, might have differed from those that existed in the late Pleistocene, which ended around 11,000 years ago.
Nor did the study account for the wide variety of climates and temperatures that would have affected stored meat in the Stone Age, and the donated tissue wasn't cooked, processed or prepared in any way, she added.
Beasley has spoken with researchers in Alaska in the hopes of connecting with Native groups that would be interested in sharing traditional food preparations. Her goal is to better understand how those preparation methods might affect nitrogen levels.
The new research has 'opened a fascinating line of inquiry' into the culinary practices of Stone Age hunter-gatherers such as Neanderthals, said Wil Roebroeks, professor emeritus of paleolithic archaeology at Leiden University in the Netherlands. He wasn't involved in the research.
'It certainly gives a fresh — if that is the right word here — perspective on Neanderthal and other Late Pleistocene humans' diets,' Roebroeks added.