Scientists Quantified The Speed of Human Thought, And It's a Big Surprise
A new study has investigated just how quickly the human brain processes information, and according to scientists, we're not as mentally quick as we might like to think.
In fact, the research suggests our brains process information at just 10 bits per second. How is that possible, when computers can perform trillions of operations every second?
The answer seems to lie in how we process thoughts internally: in single file, making for a slow, congested queue.
This stands in stark contrast to the way the peripheral nervous system operates, amassing sensory data at gigabits a second in parallel, orders of magnitude faster than our paltry 10-bit-per-second cognitive computer.
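To make the scale of that mismatch concrete, here is a minimal back-of-envelope sketch in Python. The receptor count and per-receptor rate are illustrative assumptions chosen to be consistent with the gigabit figure above (roughly 10^8 photoreceptors at around 10 bits per second each), not numbers taken from the paper.

```python
# Back-of-envelope comparison of sensory input vs. conscious throughput.
# Assumed values (illustrative only, not from Zheng & Meister's paper):
n_receptors = 1e8        # ~100 million photoreceptors in one retina
bits_per_receptor = 10   # assumed bits per second conveyed per receptor

sensory_rate = n_receptors * bits_per_receptor  # ~1e9 bits/s ("gigabits")
cognitive_rate = 10                             # bits/s, the paper's estimate

compression = sensory_rate / cognitive_rate
print(f"Sensory input:  {sensory_rate:.0e} bits/s")
print(f"Conscious rate: {cognitive_rate} bits/s")
print(f"Discarded: all but 1 part in {compression:.0e}")
```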
To neurobiologists Jieyu Zheng and Markus Meister from the California Institute of Technology, this mismatch in sensory input and processing speed poses something of a mystery.
"Every moment, we are extracting just 10 bits from the trillion that our senses are taking in and using those 10 to perceive the world around us and make decisions," says Meister.
"This raises a paradox: What is the brain doing to filter all of this information?"
In their recently published paper, Zheng and Meister mount a firm defense of the claim that, despite the richness of the scenery in our mind's eye, reports of photographic memory, and the potential of unconscious processing, our brains really do operate at a mind-numbingly slow pace that rarely peaks above tens of bits per second.
According to the researchers, solving a Rubik's cube blindfolded involves a processing rate of just under 12 bits a second. Playing the strategy computer game StarCraft at a professional level? Around 10 bits a second. Reading this article? That might stretch you to 50 bits a second, at least temporarily.
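Where does a figure like "just under 12 bits a second" come from? Here is a minimal sketch of the underlying arithmetic, assuming the standard count of roughly 4.3 × 10^19 reachable Rubik's cube configurations and a memorization time of about 5.5 seconds for an elite blindfolded solver (the timing is an assumption for illustration):

```python
import math

# A 3x3x3 Rubik's cube has 43,252,003,274,489,856,000 reachable states.
cube_states = 43_252_003_274_489_856_000

# Picking out one state among N carries log2(N) bits of information.
bits_per_cube = math.log2(cube_states)  # ~65.2 bits

# Assumed memorization time for an elite blindfolded solver (illustrative).
memorization_seconds = 5.5

rate = bits_per_cube / memorization_seconds
print(f"Cube configuration: {bits_per_cube:.1f} bits")
print(f"Implied processing rate: {rate:.1f} bits/s")  # ~11.9 bits/s
```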
Taking that figure as a given, the pair lay out the state of research on the disparity between the "outer brain's" processing of external stimuli and the "inner brain's" calculations, demonstrating just how little we know about our own thinking.
"The current understanding is not commensurate with the enormous processing resources available, and we have seen no viable proposal for what would create a neural bottleneck that forces single-strand operation," the authors write.
The human brain is a beast when it comes to pure analytical power. Its 80-odd billion neurons form trillions of connections, grouped in ways that allow us to feel, imagine, and plan our way through existence with other humans by our sides.
Fruit flies, on the other hand, have maybe a hundred thousand or so neurons, which is plenty for them to find food, flap about, and talk fly business with other flies. So why couldn't a single human brain behave like a swarm of flies, each unit processing a handful of bits per second, collectively adding up to super speed?
Though there are no obvious answers, Zheng and Meister propose it may simply have to do with necessity. Or rather, a lack of necessity.
"Our ancestors have chosen an ecological niche where the world is slow enough to make survival possible," the team writes.
"In fact, the 10 bits per second are needed only in worst-case situations, and most of the time our environment changes at a much more leisurely pace."
Research into comparable processing rates in other species is remarkably limited, the pair explain, though what they could find supports the view that the external environment generally changes slowly enough that decisions need only be made at a few bits a second.
What might we make of a future where we demand more of our bottlenecked brains, perhaps through technological advances that link our single-file cognitive computing directly with a computer's parallel processing?
Knowing how our brains evolved could give us insights into both improving artificial intelligence and shaping it to suit our own particular neural architecture. At the very least, it could reveal the deeper benefits of slowing down and approaching the world one simple question at a time.
This perspective was published in Neuron.