Data centers powered by brain cells
A startup is growing human brain cells on silicon chips to create biological computers that use far less power.
Australian startup Cortical Labs is pioneering a radical new approach to computing with its CL1 chip, a system that integrates living biological neurons with traditional silicon hardware. The core technology involves cultivating a dense layer of human neurons—derived from stem cells—inside a specialized microfluidic chamber that provides a constant flow of nutrients and oxygen. These neurons grow across and interface with a high-density multi-electrode array (MEA) chip, which both stimulates the neural network with precise electrical impulses and reads its resulting electrical activity. The company claims this "wetware" system can be trained, through methods akin to biological learning, to perform specific computational tasks like pattern recognition and signal processing with extreme energy efficiency.
This development represents a significant step toward biological computing, where the innate parallel processing and adaptive learning capabilities of neural tissue are harnessed for information processing. Unlike large language models (LLMs) that run on power-hungry GPU clusters, the CL1's biological neural network (BNN) operates on a small fraction of that energy. The long-term vision is to create scalable, hybrid bio-silicon data centers where these low-power biological processors handle specific workloads, potentially reshaping the economics and environmental impact of large-scale AI. While still in the R&D phase, the technology has demonstrated proof-of-concept by learning to play simple video games like Pong, showing that it can process information and adapt its responses based on feedback.
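The stimulate-read-feedback cycle described above can be sketched as a closed loop in code. The snippet below is a toy, self-contained simulation of that loop structure only: every class, method, and constant is a hypothetical illustration, not Cortical Labs' actual API, and the "learning" is a crude numeric stand-in for how feedback could steer a network's response toward a target.

```python
import random

class MockBNN:
    """Stands in for a biological neural network on a multi-electrode array.

    Its response to a stimulation pattern drifts toward whatever output the
    feedback signal rewards, mimicking adaptive learning in the abstract.
    This mock is illustrative; real hardware and learning rules differ."""

    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.bias = 0.0  # crude stand-in for the network's learned state

    def stimulate_and_read(self, pattern):
        """Apply a stimulation pattern and return a noisy readout."""
        noise = self.rng.uniform(-1.0, 1.0)
        return self.bias + 0.1 * noise

    def apply_feedback(self, error):
        """Nudge the internal state to reduce the error, standing in for
        the reward/punishment stimulation used in closed-loop training."""
        self.bias -= 0.2 * error


def train(bnn, target=1.0, steps=200):
    """Run the stimulate -> read -> feedback loop, then return the
    final readout for the trained pattern."""
    pattern = [1, 0, 1, 0]  # arbitrary electrode stimulation pattern
    for _ in range(steps):
        response = bnn.stimulate_and_read(pattern)
        bnn.apply_feedback(response - target)
    return bnn.stimulate_and_read(pattern)
```

In a real system the stimulation and readout would go through the MEA hardware and wetware; the point here is the loop shape, the same closed feedback cycle that let the network learn Pong.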
- The CL1 chip grows a functional layer of living human neurons on a silicon multi-electrode array, creating a direct bio-electrical interface.
- The biological neural network (BNN) can be trained for computational tasks and operates with up to 100x greater energy efficiency than digital hardware for certain workloads.
- The technology is a proof-of-concept for future hybrid data centers, aiming to mitigate the massive power consumption of traditional AI training and inference.
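The "up to 100x" efficiency figure can be turned into rough arithmetic. The wattage below is an assumed, illustrative number, not a published measurement from Cortical Labs or anyone else.

```python
# Back-of-envelope only: the 10 kW cluster draw is an assumed figure,
# and the 100x factor is the claim quoted above, not a verified benchmark.
GPU_CLUSTER_KW = 10.0        # assumed continuous draw of a small GPU cluster
EFFICIENCY_FACTOR = 100      # the "up to 100x" efficiency claim
HOURS_PER_YEAR = 24 * 365

bnn_kw = GPU_CLUSTER_KW / EFFICIENCY_FACTOR
annual_savings_kwh = (GPU_CLUSTER_KW - bnn_kw) * HOURS_PER_YEAR
print(f"BNN draw: {bnn_kw} kW; annual savings: {annual_savings_kwh:.0f} kWh")
```

Under those assumptions, a single always-on workload saves tens of megawatt-hours per year, which is why the bullet above frames this as a data-center economics story rather than a lab curiosity.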
Why It Matters
It could fundamentally alter the power footprint of advanced computing, making large-scale AI more sustainable and economically viable.