Your eyes can reveal the accuracy of your memories

Yahoo · 20-05-2025
We like to think our brains are reliable recorders—but reality says otherwise. From misremembered childhood moments to mistakenly 'recalling' that you took your pills when you didn't, false memories are surprisingly common. And in high-stakes situations like courtroom testimony, these errors can have devastating consequences. Wouldn't it be amazing if there were an objective way to measure just how accurate someone's memory really is?
New research suggests we might be able to do just that—by watching the eyes.
Scientists have known since the 1960s that our pupils tend to widen when we're thinking hard—whether we're remembering something, solving a problem, or paying close attention. But those early studies mostly looked at short-term memory, so it wasn't clear whether the same effect applied to long-term recall.
Then came a curious discovery in the 1970s: people's pupils also dilated when they recognized something they'd seen before. This phenomenon, called the 'pupil old/new effect,' has since been confirmed in multiple experiments. But recent research has taken this a step further, suggesting that pupil dilation may not just reflect whether something feels familiar, but also how clearly and precisely it's remembered.
In a new study published in the Journal of Experimental Psychology: Learning, Memory, and Cognition, researchers Ádám Albi and Péter Pajkossy from the Budapest University of Technology and Economics set out to test this idea. They recruited 28 volunteers in Hungary and asked them to study 80 two- or three-syllable words that are infrequently used in the Hungarian language. The words were presented to the participants on a screen at a specific point along the edge of an invisible circle.
Later, participants were shown a mix of old and new words, this time centered on the screen. For each word they recognized, they were asked to recall where it had originally appeared. While participants responded, the researchers tracked their pupil size.
The results were striking. When people recognized a word they'd seen earlier, their pupils dilated—and the effect was more pronounced when participants could precisely remember the word's original location.
Even when participants recognized a word as familiar but couldn't recall where on the screen it had appeared, their pupils still dilated more than when they saw a brand-new word. This suggests our eyes reflect two layers of memory: a general sense of familiarity, and the precision of specific details, Albi tells Popular Science.
So, what's actually going on inside the brain?
'To date, there is no consensus on the precise cognitive and neurobiological mechanisms that drive pupil responses during different forms of memory retrieval, such as recognition,' Albi says.
But one leading theory centers on the concept of attentional salience—how much something grabs our focus. A vivid memory might not just come to mind; it demands attention. That memory could trigger activity in a region of the brain called the locus coeruleus–noradrenergic system, which regulates attention. When activated, this system also causes the pupils to dilate.
This growing understanding opens up some exciting possibilities. 'Pupil dilation could serve as a non-invasive marker of memory quality in settings such as education, clinical assessment, or legal testimony—especially when evaluating the depth or reliability of someone's memory,' Mohamed El Haj, a neuropsychologist and professor at the University of Nantes in France, who was not involved in the study, tells Popular Science.
And because pupil measurement is noninvasive, cost-effective, and methodologically simpler than other brain analysis techniques like magnetic resonance imaging (MRI) or electroencephalogram (EEG), as Albi points out, it holds real promise for widespread use.
Imagine being able to gauge the reliability of an eyewitness just by tracking their pupils. That future may not be far off.

Related Articles

Want to get smarter? Neuroscience says 5 simple steps significantly boost memory, learning, and cognition

Fast Company

2 days ago


Since no one ever does anything worthwhile on their own, who you know is important. But what you know — and what you do with what you know — is crucial. Learning, memory, and cognitive skills are a competitive advantage. Here are five neuroscience-based ways to learn more quickly and, even more importantly, better retain what you learn. Best of all, each takes a couple of minutes at most, and one requires no effort at all.

Say it out loud. We took the grandkids to surf lessons. They wanted to go back for another session, the instructor was great, so I asked him his name. Problem is, I'm terrible at remembering names. So I said it aloud three or four times. Why? A study published in the Journal of Experimental Psychology found that saying words out loud (or even just mouthing them) makes them more memorable. While the underlying mechanism is unclear, neuroscientists theorize that saying something out loud separates and distinguishes it from 'mere' thoughts. (You didn't just think it. You also heard it.) That makes the information, idea, or plan you want to remember even more memorable. When you need to remember something, say it aloud, or mouth it to yourself. Your cerebral cortex will help you retain it longer. Then…

Do a 40-second replay. Remembering a name is fairly simple. Remembering something more complex requires memory consolidation, the process of transforming temporary memories into more stable, long-lasting ones. Even though memory consolidation can be sped up, storing a memory in a lasting way takes time. A good way to increase the odds is to mentally replay whatever you want to remember for 40 seconds. A 2015 study published in the Journal of Neuroscience found that a brief period of rehearsal — replaying an event in your mind, going over what someone said in a meeting, mentally mapping out a series of steps, etc. — makes it significantly more likely you will remember what you replayed.

As the researchers write: 'A brief period of rehearsal has a huge effect on our ability to remember complex, lifelike events over periods of one to two weeks. We have also linked this rehearsal effect to processing in a particular part of the brain, the posterior cingulate.' A week or two? That should be long enough for you to actually do something with whatever you wanted to remember. Then…

Make a prediction. While it sounds odd, a study published in the Canadian Journal of Experimental Psychology shows that the act of asking yourself whether you will remember something significantly improves the odds that you will, in some cases by as much as 50 percent. That's especially true for prospective memories: remembering to perform a planned action or intention at some point in the future. Following up with a customer. Checking on a vendor's status. Determining the root cause after you deal with a problem. Why playing the prediction game works is also somewhat unclear. Possibly the act of predicting is a little like testing yourself; as research shows, quizzing yourself is a highly effective way to speed up the learning process. What is clear is that the act of predicting helps your hippocampus better form and index those episodic memories for later access. Want to remember to do something in the future? Take a second and predict whether you will remember. That act alone makes it more likely you will. Then…

Zone out for two minutes. According to a study published in Nature Reviews Psychology, 'even a few minutes of rest with your eyes closed can improve memory, perhaps to the same degree as a full night of sleep.' Psychologists call it 'offline waking rest.' In its purest form, offline waking rest can be closing your eyes and zoning out for a couple of minutes. But you can also daydream. Meditate. Clear your mind and think happy thoughts. While none of those sound productive — should you really be wasting time you could be learning? — intermittent lack of focus improves memory consolidation; in simple terms, constantly going from one thing to the next makes it hard for your brain to keep up.

As the researchers write: 'Periods of reduced attention to the external world are a universal feature of human experience, which suggests that spending a portion of time disengaged from the sensory environment … permit[s] the reactivation of recently formed memory traces. This iterative reactivation of memory could strengthen and stabilize newly formed memories over time, contributing to early stages of memory consolidation during the first few minutes following encoding.'

The key is to be intentional about it. First, replay what you want to remember for 40 seconds or so. Then, predict whether you will remember it. Then, close your eyes, zone out, and engage in a minute or two of offline waking rest. As the researchers write, 'Moments of unoccupied rest should be recognized as a critical contributor to human waking cognitive functions.' And finally…

Get a good night's sleep. Here's the effortless aspect of improving your memory. According to a study published in Psychological Science, people who studied before bed, slept, and then did a quick review the next morning spent less time studying — and increased their long-term retention by 50 percent. The underlying mechanism is what psychologists call sleep-dependent memory consolidation: 'Converging evidence, from the molecular to the phenomenological, leaves little doubt that offline memory reprocessing during sleep is an important component of how our memories are formed and ultimately shaped.' In simple terms, sleeping on it helps your brain file away what you've learned and makes it easier to access when you need it. That's also true where longer-term memory is concerned. Learning, then getting a good night's sleep, and then learning again is an extremely effective way to boost intelligence and skill.

As the researchers write: 'We found that interleaving sleep between learning sessions not only reduced the amount of practice needed by half but also ensured much better long-term retention. Sleeping after learning is definitely a good strategy, but sleeping between two learning sessions is a better strategy.'

Say you're learning a new sales demo. After a practice session, say the main bullets of your presentation out loud. Then mentally replay key elements of your presentation. Then predict whether you'll remember what you've learned. Then take a minute or two to zone out. Then get a good night's sleep, do a quick review the next day, and work on the next chunk of information. Rinse and repeat, and neuroscience says you'll spend less time learning — and you'll remember a lot more. Which means you'll be able to do more.

100 years ago, scientists thought we'd be eating food made from air

Yahoo

2 days ago


In the early 1920s, on the left bank of the Seine just outside Paris, a small laboratory garden bloomed on a plot of land sandwiched between the soaring Paris Observatory and the sprawling grounds of Chalais Park. Unlike a typical garden filled with well-groomed plants and the smell of fresh-turned soil, this garden had an industrial feel. Dubbed 'the Garden of Wonders' by a contemporary journalist, the plot was lined with elevated white boxes fed with water from large glass canisters. Nearby greenhouses included equally unusual accessories. But it's what happened inside the low-slung laboratory buildings that made this garden so wondrous. In August 1925, Popular Science contributing writer Norman C. McCloud described how Daniel Berthelot—a decorated chemist and physicist from France—was conducting revolutionary 'factory-made vegetable' experiments in his Garden of Wonders. Berthelot, son of Marcellin Berthelot, a renowned 19th century chemist and French diplomat, was using the garden to expand upon his father's groundbreaking work. Starting in 1851, the elder Berthelot began creating synthetic organic compounds, such as fats and sugars (he coined the name 'triglyceride'), from inorganic compounds like hydrogen, carbon, oxygen, and nitrogen. It was a revolutionary first step toward artificial food. '[The younger] Berthelot already has produced foodstuffs artificially by subjecting various gases to the influence of ultra-violet light,' wrote McCloud. 'These experiments,' he added, quoting Berthelot, 'show that by means of light, vegetable foods can be manufactured from air gases.' But Berthelot's experiment didn't exactly catch on. A century later, most food is still grown the traditional way—by plants—but the idea of manufacturing food in controlled, factory environments has been gaining ground. In fact, Berthelot's revolutionary idea may finally be bearing fruit—just not in the way he imagined. 
A revolution in food chemistry

Berthelot never fully accomplished his goal of trying to artificially reproduce what plants do naturally. Nonetheless, his experiments, as sensational as they might seem today, would have been considered quite plausible in 1925. That's because his father's discoveries had unleashed a revolution in chemistry and a tidal wave of optimism about the future of food. By the 1930s, chemists had begun synthesizing everything from basic nutrients, like vitamins, to medicines, like aspirin (acetylsalicylic acid), to food additives, such as artificial thickeners, emulsifiers, colors, and flavors. In an 1894 interview for McClure's magazine dubbed 'Foods in the Year 2000,' Berthelot's father boldly predicted that all foods would be artificial by the year 2000. 'The epicure of the future is to dine upon artificial meat, artificial flour, and artificial vegetables,' wrote Henry Dam for McClure's, articulating Marcellin Berthelot's vision. 'Wheat fields and corn fields are to disappear from the face of the earth. Herds of cattle, flocks of sheep, and droves of swine will cease to be bred because beef and mutton and pork will be manufactured direct[ly] from their elements.'

Welcome to the Garden of Wonders

Such was the vision that the younger Berthelot was pursuing in his Garden of Wonders. His goal, he told McCloud, was to produce 'sugar and starch from the elements without the intervention of living organisms.' To achieve this, Berthelot envisioned a factory with 'glass tanks of great capacity.' Gases would be pumped into the tanks, and 'suspended from the ceiling [would] be lamps producing the rays of ultra-violet light.' Berthelot imagined that when the chemical elements combined, 'through the glass walls of the tank we shall see something in the nature of a gentle snowfall that will accumulate on the floor of the tanks…our finished product—vegetable starches and vegetable sugars created in a faithful reproduction of the works of nature.'
By 1925, he had succeeded in using light and gas (carbon, hydrogen, oxygen, and nitrogen) to create the basic compound formamide, which is used to produce sulfa drugs (a kind of synthetic antibiotic) and other medicines as well as industrial products. But his progress toward reproducing photosynthesis ended there. Berthelot died in 1927, just two years after McCloud's story ran in Popular Science, without ever realizing his dream. Despite the bold predictions of the time, producing food from only air and light was wildly aspirational in 1925, if for no other reason than that photosynthesis was poorly understood. The term had been coined only a few decades earlier, when Charles Barnes, an influential American botanist, lobbied for a more precise description of a plant's internal mechanisms than the generic 'assimilation' then in favor. Chlorophyll had been discovered in the prior century, but what happened at a cellular level in plants remained largely theoretical until the 1950s. Although Berthelot may have been onto something with his experiments, adding to the momentum that became the artificial food industry, he was a long way from replicating what comes naturally to plants. We still are, but recent discoveries may have enabled a workaround—depending on your definition of 'food.'

A modern answer to Berthelot's innovative garden

From vertical indoor farms to hydroponics to genetically modified crops, commercial agriculture since the 1960s has focused on coaxing more yield from fewer resources, including land, water, and nutrients. The drive began when Nobel Peace Prize winner Norman Borlaug, an American biologist, helped spark the Green Revolution by selectively breeding a grain-packed, dwarf variety of wheat. The theoretical limit of that revolutionary goal would liberate food production from traditional agriculture altogether, eliminating all resources except air and light—Berthelot's original vision.
In the last century, we've inched toward creating food from nothing, making progress by teasing apart the incredibly complex biochemical pathways associated with plant physiology. But if we've learned anything since Berthelot's experiments, it's that photosynthesis—what plants are naturally programmed to do—can't be easily replicated industrially. That hasn't stopped a handful of companies from trying, though. In April 2024, Solar Foods opened a factory in Vantaa, Finland—a sleek facility where workers monitor large tanks filled with atmospheric gases. Inside the tanks, water transforms into a protein-rich slurry. Dehydrated, the slurry becomes a golden powder packed with protein and other nutrients, ready to be turned into pasta, ice cream, and protein bars. The powdery substance, Solein, resembles Berthelot's vision, as does the factory, which uses atmospheric gases to enable 'food production anywhere in the world,' according to a 2025 company press release, 'as production is not dependent on weather, climate conditions, or land use.' But the similarities with Berthelot's vision end there. Solar Foods may not require land or plants to produce food, but its technology derives from a living organism. Using a form of fermentation, it relies on a microbe to digest air and water to produce protein. The U.S.-based company Kiverdi uses a similar microbial fermentation process, first devised by NASA as far back as the 1960s for deep space travel, to convert carbon dioxide into protein. Austria-based Arkeon Technologies has developed its own microbial fermentation process to also produce food from carbon dioxide without the need for land or other nutrients.
Microbial fermentation may represent a promising new chapter in synthetic foods, but don't expect tomatoes or corn to materialize from thin air anytime soon—it's not artificial photosynthesis. While Berthelot's understanding of photosynthesis was primitive a century ago, he was ahead of his time in many ways, and his vision was remarkably prescient. Although we still haven't figured out how to replicate photosynthesis chemically—literally growing fruits and vegetables as plants do from air and light—it's worth acknowledging the strides we've made in just the last decade: Companies like Arkeon Technologies and Kiverdi may help remove excess carbon dioxide from the atmosphere while offering solutions to future food shortages. Or they may not. Only the next century will tell.

Scientists Are Hunting Down Humanity's Earliest Artificial Memories

Yahoo

3 days ago


Here's what you'll learn when you read this story:

  • Researchers analyzed bone markings from artifacts up to 70,000 years old to determine if artificial memory systems were in use.
  • The team found that patterns in the markings differentiated between butchering, decorative, and counting use cases.
  • While not conclusive, the findings could show that cognitive abilities developed much earlier in the human timeline than previously believed.

Humanity's ancestors could have been counting long before any formal writing system existed. And it may have been advanced enough that entire societies knew what was going on. A team of researchers recently published a study detailing an analysis of 22 artifacts (dated to between 15,000 and 70,000 years ago) intended to determine how scientists could better identify artificial memory systems, or AMSs. As the authors describe them, AMSs are 'tools that allow for the storage and retrieval of coded information beyond the physical body.' Nowadays, we use computers and smartphones for the bulk of our out-of-body memory tracking, but the concept includes everything from systems of writing to sticks carved with tally marks, known appropriately as 'tally sticks.' Anything that would allow you to 'make a note' of an idea and come back to it later—that counts. But these things aren't easy to identify—especially the earliest examples on record, which consist of nothing more than a few marks on an object. And over the course of millennia, an object can collect marks for a variety of reasons. So, the team behind this new paper set out to find a better method for identifying these early recording devices through the use of 'new statistical tools and empirical evidence.' Basically, the team wanted to be able to identify what markings meant AMS, and what markings meant, say, butchery or art.
So, they analyzed their 22 artifacts—which included a 44,000-year-old baboon bone from South Africa, a 70,000-year-old bone, a reindeer antler from France, medieval tally sticks, and Indigenous calendars—in the hopes of gathering enough information to describe what features markings made for different purposes would have. In particular, the team looked for signs of regular, intentionally spaced markings that would indicate an object was being used as an AMS. And according to the study, the team found what they were looking for. 'Upper Paleolithic AMSs,' the authors wrote, 'are endowed with systematically different signatures that distinguish them from the other artifacts.' Butchery marks were found to be clustered, and abstract decorative motifs displayed randomness in spacing (both had significant variation in angle). But potential AMSs show regular spatial patterns—what the researchers describe as 'distinct separation[s]' between the different types of markings. 'These findings,' the researchers wrote, 'suggest that modern humans in at least Africa and Europe had sophisticated cognitive capabilities for information storage and retrieval, providing insights into the possible development of quantity-related cognition.' Some examples come from regions and time periods linked to Homo sapiens, while others could be tied to Neanderthals. Questions, however, still abound. For instance, just what information these AMSs were recording remains a mystery. Possibilities include days of the week, lunar movements, community events, objects, and even numbers of people. The researchers caution that they 'cannot truly speculate about the exact nature of these population's precise linguistic repertoire.'
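The core statistical intuition—tally-like marks have evenly sized gaps, while butchery clusters and decorative motifs do not—can be sketched with a toy measure of spacing regularity. This is only an illustration of the general idea, not the study's actual method; the threshold and the mark positions below are invented for the example.

```python
import statistics

def classify_markings(positions):
    """Toy classifier: score how regular the gaps between marks are.

    Evenly spaced marks (low variation in gap size) look tally-like,
    i.e. consistent with an artificial memory system (AMS); highly
    variable gaps look more like butchery clusters or decorative
    randomness. The 0.2 cutoff is an arbitrary illustrative threshold.
    """
    gaps = [b - a for a, b in zip(positions, positions[1:])]
    # Coefficient of variation: spread of the gaps relative to their mean.
    cv = statistics.stdev(gaps) / statistics.mean(gaps)
    return "regular (AMS-like)" if cv < 0.2 else "irregular (butchery/decorative)"

# Evenly spaced tally marks vs. clustered cut marks (positions in mm).
print(classify_markings([0, 5, 10, 15, 20, 25]))  # regular (AMS-like)
print(classify_markings([0, 1, 2, 14, 15, 30]))   # irregular (butchery/decorative)
```

A real analysis would also use the angle of each incision (the study notes both butchery and decorative marks vary significantly in angle) and proper statistical tests rather than a fixed cutoff.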
But they surmise that the recording of information by marking objects offers a 'form of external representation, defining a communication technology that implies the transmitting of knowledge within a community, necessitating a shared understanding of the processes behind the production and use of these devices.' Despite the mystery that still surrounds many of these artifacts, there is no question as to the importance of the finds. 'These marks,' the authors wrote, 'could reflect a crucial step in the transition from basic cognitive abilities—like distinguishing between 'few' and 'many'—toward abstract concepts like numbers.'
