The world's first 'body in a box' biological computer costs $35,000 and looks both cool as hell and creepy as heck


Yahoo · 09-03-2025

Here's one for you: when is a 'body in a box' not as macabre as it sounds? Simple—when it's a tech startup. Wait! Put the turn-of-the-millennium trench coat and sunglasses combo down! Let me explain.
The CL1 is billed on its splashy website as "the world's first code deployable biological computer", incorporating human brain cells to send and receive electrical signals (via The Independent). These cells hang out on the surface of the computer's silicon chip, and the machine's Biological Intelligence Operating System (or biOS for short, which is admittedly cute) allows users to wrangle the neurons for a variety of computing tasks.
Organic hardware like this for research purposes isn't new—for just one example, FinalSpark's Neuroplatform began offering rentable 'minibrains' last year.
The neurons central to the CL1 are lab-grown, cultivated in a nutrient-rich solution and then kept alive by a tightly temperature-controlled environment working alongside an internal life support system. Under favourable conditions, the cells can survive for up to six months. Hence the project's chief scientific officer, Brett Kagan, pitching it as being "like a body in a box."
Should you be so inclined to pick up your own surprisingly fleshy, short-lived computer, you can do so from June…for $35,000. Now, I know what you're thinking—not because you're actually living life in a Matrix-style pod, but purely because I'm asking the same question: Why?
First, a smidge more background on this brain box, which is the latest project from Cortical Labs, and was unveiled this week at Mobile World Congress in Barcelona. We've covered this Melbourne-based company before, with highlights including that time their team coaxed brain cells in a petri dish to learn Pong faster than AI.
That experiment is the CL1's great-grandparent, with continued scientific interest fostered by the hope that 'wetware' like lab-grown brain cells could give robotics and AI a serious leg-up. Whereas traditional AI can play along with the theatre kid favourite of 'yes, and' while totally lacking any true understanding of context, lab-grown neurons could potentially learn and adapt.
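To picture how that learning loop works, here's a rough sketch in Python of the closed loop the Pong experiment relies on: the game state goes in as stimulation, neural activity comes back out as a paddle move, and feedback arrives via the next round of stimulation. To be clear, this is a toy simulation with invented names and numbers, not Cortical Labs' biOS or the real experimental protocol.

```python
import random

# Toy stand-in for a cultured-neuron Pong player. Nothing here is biological;
# it only illustrates the closed loop: stimulate with game state, read activity
# back as a paddle move, then feed back whether the move worked.
class ToyCulture:
    def __init__(self):
        self.gain = 0.0        # how strongly the "culture" tracks the ball
        self.last_error = 0.0  # last sensed ball-vs-paddle offset

    def stimulate(self, ball_y, paddle_y):
        """'Sensory' input: where the ball sits relative to the paddle."""
        self.last_error = ball_y - paddle_y

    def read_move(self):
        """'Motor' output: a paddle move, with a little noise."""
        return self.gain * self.last_error + random.uniform(-0.05, 0.05)

    def feedback(self, hit):
        """Predictable feedback after a hit changes nothing; the scrambled
        feedback after a miss perturbs the dynamics toward better tracking."""
        if not hit:
            self.gain += 0.05 * (1.0 - self.gain)

def hit_rate(rounds=2000):
    culture, hits = ToyCulture(), 0
    for _ in range(rounds):
        ball, paddle = random.random(), random.random()
        culture.stimulate(ball, paddle)
        paddle += culture.read_move()
        hit = abs(ball - paddle) < 0.1
        culture.feedback(hit)
        hits += hit
    return hits / rounds

print(f"hit rate over the session: {hit_rate():.2f}")
```

In the published work the feedback was reportedly delivered as predictable versus unpredictable stimulation rather than a tweak to a single number, but the shape of the loop is the same.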
Furthermore, the lab-grown cells are apparently far more energy-efficient than AI running on traditional, non-biological computers. Turns out the old noggin cells are still showing that new-fangled silicon a trick or two. Who would have thought?
However, there's no avoiding the question of ethics: what are these brain cells experiencing, and is it anything like sentience—or suffering? Perhaps my questions verge on the hyperbolic, but my own osseous brain box can do nothing but wonder.


Related Articles

Data centers are at the heart of the AI revolution and here's how they are changing

Yahoo · 13 hours ago

As demand for AI and cloud computing soars, pundits are suggesting that the world is teetering on the edge of a potential data center crunch, in which capacity can't keep up with the digital load. Concern and hype have led to plummeting vacancy rates: in Northern Virginia, the world's largest data center market, vacancy has fallen below 1%. Echoing past fears of "peak oil" and "peak food," the spotlight now turns to "peak data." But rather than stall, the industry is evolving, adopting modular builds, renewable energy, and AI-optimized systems to redefine how tomorrow's data centers will power an increasingly digital world.

Future data centers will increasingly move away from massive centralized facilities alone, embracing smaller, modular, and edge-based designs. The sector is already splitting into hyperscale data centers on one end and smaller, edge-oriented facilities on the other. Unlike hyperscale campuses, where facilities often cover millions of square feet, these leaner centers can be built in a few months, rapidly deployed, sited closer to end users to reduce latency, and tailored to specific workloads such as autonomous vehicles and AR. They are sometimes housed in repurposed buildings, such as abandoned shopping malls, empty office towers, and disused factories, helping to bring former industrial brownfield sites back into use.

To address energy demands and grid constraints, future data centers will increasingly be co-located with power generation facilities, such as nuclear or renewable plants. This reduces reliance on strained grid infrastructure and improves energy stability. Some companies are investing in nuclear power, which provides massive, always-on electricity free of carbon emissions, and modular reactors are being considered as a way around grid bottlenecks, long waits for power delivery, and local utility limits. Similarly, data centers will increasingly be built in areas where the climate reduces operational strain: lower cooling costs and access to water enable energy-efficient liquid cooling instead of air cooling, so expect more facilities to pop up in places like Scandinavia and the Pacific Northwest.

Artificial intelligence will also play a major role in managing and optimizing data center operations, particularly cooling and energy use. Reinforcement learning algorithms, for instance, are being used to cut energy consumption by tuning cooling systems, with reported savings of up to 21%. Simpler fixes help too: replacing legacy servers with more energy-efficient machines, built around newer chips and better thermal design, can significantly expand compute capacity without requiring new premises. Instead of only building new facilities, future capacity will be expanded by refreshing hardware with newer, denser, and more energy-efficient servers, packing more compute into the same footprint and enabling quick scaling to meet surges in demand, particularly for AI workloads.

These power-hungry centers are also putting a strain on electricity grids, so future data centers will lean on techniques such as load shifting to optimize energy efficiency, as sketched below.
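Load shifting is easier to picture with a toy example. The sketch below is a simplification and not any operator's real scheduler: deferrable batch jobs (nightly training runs, backups, transcodes) are pushed into the hours with the cleanest grid mix, while latency-sensitive traffic stays untouched. All of the job names and intensity figures are invented for illustration.

```python
# Minimal illustration of load shifting: deferrable jobs get scheduled into the
# hours with the lowest grid carbon intensity. Numbers are invented; a real
# scheduler would also weigh price, SLAs, and capacity limits.

# Hypothetical gCO2/kWh for each hour of the day (solar-heavy midday dip).
carbon_intensity = [420, 410, 400, 395, 390, 380, 350, 300,
                    250, 210, 190, 180, 175, 180, 200, 240,
                    300, 360, 400, 430, 440, 445, 440, 430]

deferrable_jobs = [          # (name, energy draw in kWh): flexible batch work
    ("model-training", 1200),
    ("video-transcode", 450),
    ("nightly-backup", 300),
]

def schedule(jobs, intensity, jobs_per_hour=1):
    """Greedily place each deferrable job into the cleanest hour still free."""
    cleanest_first = sorted(range(24), key=lambda h: intensity[h])
    plan, used = {}, {h: 0 for h in range(24)}
    for name, _kwh in sorted(jobs, key=lambda j: -j[1]):   # biggest jobs first
        for hour in cleanest_first:
            if used[hour] < jobs_per_hour:
                plan[name], used[hour] = hour, used[hour] + 1
                break
    return plan

for name, hour in schedule(deferrable_jobs, carbon_intensity).items():
    print(f"{name}: run at {hour:02d}:00 ({carbon_intensity[hour]} gCO2/kWh)")
```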
Google is already partnering with PJM Interconnection, the largest electrical grid operator in North America, to use AI to automate tasks such as viability assessments of connection applications, thus enhancing grid efficiency. The issue is typically not a lack of energy but insufficient transmission capacity. Fortunately, data centers usually run well below full capacity precisely to accommodate future growth, and that headroom helps facilities absorb unexpected traffic spikes and rapid scaling needs without new construction.

Future data center locations will be chosen based on climate efficiency, grid access, and zoning policy, but also on the availability of an AI-skilled workforce. Data centers aren't just server rooms; they're among the most complex IT infrastructure projects in existence, requiring seamless power, cooling, high-speed networking, and top-tier security. Building them involves a wide range of experts, from engineers to logistics teams, coordinating everything from semiconductors to industrial HVAC systems, and the build-out will drive up demand for engineers specializing in high-performance networking, thermal management, power redundancy, and advanced cooling.

It's clear that the recent surge in infrastructure demand, for GPUs and high-performance computing in particular, is being driven primarily by AI. Training massive models like OpenAI's GPT-4 or Google's Gemini requires immense computational resources, consuming GPU cycles at an astonishing rate; these training runs often last weeks, involve thousands of specialized chips, and draw heavily on power and cooling infrastructure. But the story doesn't end there: even once a model is trained, running it in real time to generate responses, make predictions, or process user inputs (so-called AI inference) adds a further layer of energy demand. While not as intense as training, inference must happen at scale and with low latency, which places a steady, ongoing load on cloud infrastructure.

However, here's a nuance that's frequently glossed over in much of the hype: AI workloads don't scale in a straightforward, linear fashion. Doubling the number of GPUs or the size of a model will not always lead to proportionally better results; experience has shown that as models grow, performance gains may taper off or new challenges may emerge, such as brittleness, hallucination, or the need for more careful fine-tuning (a toy illustration of that curve appears at the end of this piece). In short, the current AI boom is real, but it may not be boundless, and understanding the limits of scale and the nonlinear nature of progress is crucial for policymakers, investors, and businesses alike as they plan for AI-driven data center demand.

The data center industry therefore stands at a pivotal crossroads. Far from buckling under the weight of AI tools and cloud-driven demand, it's adapting at speed through smarter design, greener power, and more efficient hardware. From modular builds in repurposed buildings to AI-optimized cooling systems and co-location with power plants, the future of data infrastructure will be leaner, more distributed, and strategically sited. As data becomes the world's most valuable resource, the facilities that store, process, and protect it are becoming smarter, greener, and more essential than ever.
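Picking up the non-linear scaling point above: under the kind of power-law scaling behaviour reported in the AI literature, each doubling of compute buys a smaller absolute improvement than the last. The constants below are invented purely to show that shape; they are not measurements from GPT-4, Gemini, or any other model.

```python
# Illustrative only: if loss follows a power law in compute, loss ~ C**(-alpha),
# then every doubling of compute yields a smaller absolute gain than the last.
alpha = 0.05          # hypothetical scaling exponent
baseline_loss = 4.0   # hypothetical loss at 1x compute

def loss(compute_multiple):
    return baseline_loss * compute_multiple ** -alpha

previous = loss(1)
for doubling in range(1, 7):
    c = 2 ** doubling
    current = loss(c)
    print(f"{c:3d}x compute: loss {current:.3f} "
          f"(improvement from this doubling: {previous - current:.3f})")
    previous = current
```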
This article was produced as part of TechRadarPro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here.

Venus is at its farthest from the sun on June 1: Here's how to see the bright 'morning star' this weekend

Yahoo · 18 hours ago

Venus reaches its point of greatest western elongation on June 1, at which time the dazzling 'morning star' will be at its greatest apparent distance from the sun in Earth's sky during its pre-dawn apparition. The rocky planet will hit the orbital milestone at midnight EDT (0400 GMT) on June 1, while Venus is below the horizon for skywatchers in the U.S. At that time, Venus will be separated from the sun by a gulf of 46 degrees along the line of the ecliptic, the apparent path taken by the sun and planets as they journey through the constellations crowding the night sky (there's a quick geometry check of that figure below).

The best time to spot Venus for stargazers in the U.S. is during the pre-dawn hours of May 31 and June 1, when the planet will appear as a bright, magnitude -4.3 morning star rising over the eastern horizon, easily visible to the naked eye (remember, the brightest objects in the sky have lower or negative magnitudes). You'll need a telescope with an aperture of at least 60mm (2.4 inches) to see the planet's disk, which appears half lit at this point in the Venusian orbit, according to telescope-maker Celestron.

Venus has been a regular fixture in the morning sky since its inferior conjunction on March 22, when it passed between Earth and the sun, marking an end to its evening appearances. Its tight orbit around the sun ensures that Venus never strays far from the horizon, at least compared with Mars, Jupiter, Saturn, Uranus and Neptune, whose distant orbits allow them to be seen throughout the night when conditions allow.

TOP TELESCOPE PICK: Want to see the planets of our solar system for yourself? The Celestron NexStar 4SE is ideal for beginners wanting quality, reliable and quick views of celestial objects. For a more in-depth look, check out our Celestron NexStar 4SE review.

While June 1 may mark the point of greatest separation between the sun and Venus during this morning apparition, it won't be the highest the planet climbs above the eastern horizon in the coming months. That's because a planet's altitude in the sky depends in part on the inclination of the ecliptic relative to the horizon, which shifts throughout the year thanks to Earth's axial tilt.

Editor's Note: If you would like to share your astrophotography with readers, then please send your photo(s), comments, and your name and location to spacephotos@
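As for where that roughly 46-degree figure comes from: treating both orbits as circles, greatest elongation is the angle at which our line of sight just grazes Venus' orbit, so its sine is simply the ratio of the two orbital radii. The quick check below uses rounded mean radii, which is why it only approximates the exact value; the real figure drifts slightly from one elongation to the next because neither orbit is perfectly circular.

```python
import math

# Back-of-envelope check on Venus' greatest elongation, assuming circular,
# coplanar orbits. The real value varies by a degree or so per apparition.
r_venus = 0.723   # Venus' mean orbital radius in AU
r_earth = 1.000   # Earth's mean orbital radius in AU

elongation_deg = math.degrees(math.asin(r_venus / r_earth))
print(f"Greatest elongation is about {elongation_deg:.1f} degrees")   # ~46.3
```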
