'Actual intelligence': Franken-PC debuts in Melbourne with a $35,000 price tag and claims of exceptional performance
When you buy through links on our articles, Future and its syndication partners may earn a commission.
Cortical Labs has built the first deployable biological computer, priced at $35,000
The CL1 integrates living neurons with silicon for real-time computation
The next step will be to build a biological neural network server stack
Despite the unquestionably impressive advancements we've witnessed in recent years, AI still lags far behind human intelligence. While it can process vast amounts of data, recognize patterns, and generate responses at speed, it lacks true understanding and reasoning, and although it is improving, hallucinations (instances where the AI confidently makes things up) remain a problem.
Two years ago, researchers from Johns Hopkins University in the US, together with scientists at Cortical Labs in Melbourne, Australia, suggested that the answer to real, less artificial AI was organoids - computers built with human brain cells. Fast forward to today, and Cortical Labs has turned that theory into reality with the production of the world's first commercialized biological computer.
The CL1 is a Synthetic Biological Intelligence (SBI) system. It will be manufactured to order but is available for purchase online, and Cortical Labs will also offer the option to buy time on the chips remotely.
'Real neurons are cultivated inside a nutrient-rich solution, supplying them with everything they need to be healthy. They grow across a silicon chip, which sends and receives electrical impulses into the neural structure,' the company says.
The world the neurons exist in is created by Cortical Labs' Biological Intelligence Operating System (biOS) and 'runs a simulated world and sends information directly to the neurons about their environment. As the neurons react, their impulses affect their simulated world. We bring these neurons to life, and integrate them into the biOS with a mixture of hard silicon and soft tissue. You get to connect directly to these neurons.'
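The biOS description amounts to a closed stimulation/read-out loop: the host encodes the simulated world's state as electrical stimulation, reads the culture's spiking response, and feeds the decoded response back into the simulation - the same pattern Cortical Labs' earlier DishBrain experiment used to have neurons play a Pong-like game. Below is a minimal, purely illustrative sketch of such a loop; every name here (`NeuralInterface`, `encode`, `decode`) is a hypothetical stand-in, not Cortical Labs' actual biOS API.

```python
import random

class NeuralInterface:
    """Stand-in for the electrode array: stimulate the culture, then read spikes."""
    def stimulate(self, pattern):
        self._last = pattern

    def read_spikes(self):
        # Fake response for illustration: a noisy echo of the stimulus.
        return [p + random.random() for p in self._last]

def encode(state):
    """Map simulated-world state to a stimulation pattern."""
    return [state["paddle_x"], state["ball_x"]]

def decode(spikes):
    """Map the spiking response back to an action in the simulated world."""
    return 1 if spikes[0] > spikes[1] else -1

def closed_loop_step(world, chip):
    chip.stimulate(encode(world))          # world state -> stimulation
    action = decode(chip.read_spikes())    # spikes -> action
    world["paddle_x"] += action            # the response changes the world
    return world

world = {"paddle_x": 0.0, "ball_x": 3.0}
chip = NeuralInterface()
for _ in range(5):
    world = closed_loop_step(world, chip)
```

The point of the sketch is the feedback structure, not the biology: the neurons never see the simulation directly, only its encoding as stimulation, and the simulation only ever sees their decoded response.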
By deploying code directly to the real neurons, the company claims the CL1 can solve today's most difficult challenges: 'The neuron is self-programming, infinitely flexible, and the result of four billion years of evolution. What digital AI models spend tremendous resources trying to emulate, we begin with.'
"Today is the culmination of a vision that has powered Cortical Labs for almost six years," noted Dr. Hon Weng Chong, Founder and CEO of Cortical Labs. "However, our long-term mission has been to democratize this technology, making it accessible to researchers without specialized hardware and software. The CL1 is the realization of that mission. While today's announcement is incredibly exciting, it's the foundation for the next stage of innovation. The real impact and the real implications will come from every researcher, academic, or innovator that builds on top of it."
A report from New Atlas claims Cortical is constructing a 'first-of-its-kind biological neural network server stack, housing 30 individual units that each contain the cells on their electrode array, which is expected to go online in the coming months.' The site reports the company is aiming to have four stacks available for commercial use via a cloud system by the end of 2025.
As for pricing, the CL1 will be surprisingly affordable. 'The units themselves are expected to have a price tag of around US$35,000, to start with (anything close to this kind of tech is currently priced at €80,000, or nearly US$85,000),' New Atlas adds.
For context, Apple's 'best failure' the Lisa, which paved the way for the Macintosh and even Microsoft Windows, sold for $9,995 in January 1983, which, adjusted for inflation, works out to a comparable $32,500 today. Will the CL1 prove to be as important to computing's future as the Lisa was? It's impossible to say, but for now its impact will largely depend on scalability, practical applications, and how well it integrates into existing AI and computing systems.
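The inflation comparison is simple CPI scaling: multiply the 1983 price by the ratio of today's consumer price index to January 1983's. A quick sketch with approximate US CPI-U values (the index figures below are rounded estimates for illustration, not official BLS data):

```python
def inflation_adjust(price, cpi_then, cpi_now):
    """Scale a historical price into today's dollars via the CPI ratio."""
    return price * cpi_now / cpi_then

# Approximate CPI-U: ~97.8 for January 1983, ~317 for 2024 (rounded estimates).
lisa_1983 = 9995.00
adjusted = inflation_adjust(lisa_1983, cpi_then=97.8, cpi_now=317.0)
print(f"${adjusted:,.0f}")  # prints $32,397
```

With these rounded index values the Lisa lands at roughly $32,000-$32,500 in today's dollars, consistent with the figure quoted above.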