
Michelle McKeown: Crawford Lake and the marking of human impact on nature
Nestled in the heart of southern Ontario, Canada, Crawford Lake is a tranquil body of water with an extraordinary secret: it preserves history with uncanny precision. Its deep, undisturbed waters have acted like a natural time capsule, trapping layers of sediment year after year, each recording a snapshot of the environment at the time of deposition.
Because of this, Crawford Lake became a star candidate in a global scientific quest: to identify the Golden Spike – a single, globally synchronous signal in Earth's geological record that would define the start of a new epoch, the Anthropocene.
Age of humans
The Anthropocene, meaning 'the age of humans,' is the proposed name for a new geological epoch to mark the profound and accelerating influence of humans on the Earth's systems.
For decades, scientists have debated whether our species' impact, through greenhouse gas emissions, industrial agriculture, nuclear testing, plastic pollution, and accelerated biodiversity loss, has been so significant that it warrants a formal entry into the geologic timescale.
The current epoch, the Holocene, began approximately 11,700 years ago following the last Ice Age. It marks a period of relative climate stability during which human civilisations flourished.
But the 20th century, particularly the post-World War II era known as the Great Acceleration, saw an unprecedented surge in human activity: industrialisation, population growth, and consumption all soared, leaving a lasting imprint on the planet. The markers of that imprint include rising carbon dioxide levels, radioactive isotopes from nuclear tests, and microplastic deposits, now preserved in sediments, ice cores, and even coral reefs.
To formally define the Anthropocene, geologists needed more than just evidence of human impact.
They needed a precise Global Boundary Stratotype Section and Point (GSSP): a physical reference point in the geological record that demarcates the boundary between epochs. This is where Crawford Lake came in.
Crawford Lake
What made Crawford Lake such a strong contender was its unique ability to record annual layers of sediment (known as varves) with exceptional clarity.
These fine layers act like tree rings, preserving an exact year-by-year account of environmental change.
Sediment cores taken from the lake bed revealed tell-tale signs of the Anthropocene's onset, including plutonium isotopes from nuclear bomb testing in the early 1950s, along with spikes in fly ash, heavy metals, and chemical pollutants.
In 2023, the Anthropocene Working Group (AWG), an international body of scientists studying the issue, voted in favour of using Crawford Lake as the site to define the start of the new epoch.
But in a twist worthy of geological drama, the proposal was rejected.
What happened?
In March 2024, the Subcommission on Quaternary Stratigraphy, the official scientific body tasked with approving changes to the geologic timescale, which advises the International Union of Geological Sciences (IUGS), voted against formalising the Anthropocene as a new epoch. The decision halted the proposal and left Crawford Lake without its golden spike.
Why the rejection?
At the heart of the issue is a fundamental debate about what geology should, and should not, do.
Critics of the Anthropocene designation argued that the concept is more cultural than geological, better suited to environmental studies, history, or political discourse than to a rigid geological framework.
They questioned whether the changes observed in the mid-20th century are truly global, continuous, and long-lasting enough to warrant a formal stratigraphic boundary.
After all, many of the markers, such as plastics and radionuclides, are relatively new, and their long-term persistence in the geological record remains uncertain.
Others expressed concern that the proposal was too narrow, focused excessively on recent decades without sufficient regard for earlier human impacts on the planet.
For example, large-scale deforestation, species extinctions, and agricultural transformations have been reshaping the Earth for thousands of years. So why draw the line in the 1950s?
Is the Anthropocene real?
But rejection of the proposal doesn't mean the Anthropocene isn't real.
In fact, most scientists agree that human activity has pushed the Earth into a new state, marked by climate breakdown, biodiversity collapse, and novel materials like concrete and plastics.
What's in dispute is not whether we've altered the planet, but how best to categorise that change.
Crawford Lake, meanwhile, remains a place of global importance. It tells the story not just of atomic fallout and pollution, but of Indigenous communities who lived around its shores centuries ago, leaving behind traces of corn and wood ash in its sediments.
It is both a natural archive and a cultural mirror, reflecting the deep entanglement of humans and nature.
A story still worth telling
Perhaps the Anthropocene doesn't need a formal boundary, or a single lake, to change how we see our place in the world.
While Crawford Lake offered a strikingly clear and symbolic record of recent human impact, no single site can fully capture the complexity or timeline of our planetary influence.
What matters more is the broader shift in awareness the debate has sparked.
The true legacy of the Anthropocene may lie not in a line drawn in the mud, but in how it urges us to confront the scale of our actions, and to choose, with urgency and humility, what kind of future we want to leave behind.
Read More
Michelle McKeown: Shedding light on the wild world of bioluminescence
Irish Examiner, 11-07-2025