Scientists Connect 16 Mini Human Brains to Create a Low-Energy 'Biocomputer'

Swiss startup FinalSpark uses an approach known as wetware computing, built on lab-grown human brain cells. The resulting bioprocessors consume a million times less energy than traditional digital processors.

It is becoming increasingly necessary to find ways to make computing more energy efficient. The wetware computing approach could be an interesting alternative.

At a time when artificial intelligence is advancing by leaps and bounds, a Swiss startup has unveiled a biocomputer wired to living, pulsating brain cells, demonstrating minimal energy consumption compared with the growing demands of bit-based computing.

FinalSpark's online platform harnesses spherical clusters of lab-grown human brain cells called organoids. A total of 16 organoids are housed in four arrays, each connected to eight electrodes and to a microfluidic system that supplies the cells with water and nutrients.

This approach, known as wetware computing, builds on researchers' relatively new ability to grow organoids in the laboratory, which lets scientists study what are essentially miniature replicas of individual organs.

Brains vs. machines: immeasurable energy savings

The rise of organoids as a research technique comes at a time when artificial neural networks, on which large language models such as ChatGPT are based, have also skyrocketed in use and processing power.

The mini-brains can be kept alive for up to 100 days while their electrical activity is measured around the clock.

FinalSpark claims that its bioprocessors, such as the brain-machine interface system it is developing, "consume a million times less energy than traditional digital processors."

According to Science Alert, although no figures are available on the specific system's energy consumption or processing power, the FinalSpark research team notes that training a single large language model such as GPT-3, the precursor of GPT-4, required 10 gigawatt-hours, or roughly 6,000 times the energy an average European citizen consumes in a year.

Meanwhile, the human brain runs its 86 billion neurons on a fraction of that energy: just 0.3 kilowatt-hours a day.
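To put the article's two figures side by side, a quick back-of-the-envelope calculation (a sketch using only the numbers cited above, not data from FinalSpark's system itself) shows just how wide the gap is:

```python
# Figures cited in the article
GPT3_TRAINING_KWH = 10_000_000   # 10 gigawatt-hours = 10 million kWh
BRAIN_KWH_PER_DAY = 0.3          # estimated daily energy use of a human brain

# How long could one human brain run on GPT-3's training energy budget?
days = GPT3_TRAINING_KWH / BRAIN_KWH_PER_DAY
years = days / 365

print(f"{days:,.0f} brain-days, or roughly {years:,.0f} years")
# -> 33,333,333 brain-days, or roughly 91,324 years
```

In other words, the energy used to train GPT-3 once could, by these estimates, power a single human brain for tens of thousands of years.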

Technological trends also suggest that the booming AI industry will consume 3.5% of the world's electricity by 2030. The computing industry as a whole is already responsible for about 2% of global CO2 emissions.

In this context, it is becoming increasingly necessary to find ways to make computing more energy efficient, and the parallel between biological neural networks and computer circuits is an obvious one to explore.

Steps to the future

FinalSpark is not the first team to try connecting probes to biological systems, or to reliably program neural networks to perform specific input-output functions on demand.

Although the ultimate goal may be new energy-efficient approaches to computing, for now the system, like its predecessors, is used to let researchers run long experiments on brain organoids.

There are some improvements, however: the FinalSpark team notes that researchers can connect to the system remotely, and that the mini-brains can be kept alive for up to 100 days while their electrical activity is recorded 24 hours a day.

"In the future, we plan to expand the capabilities of our platform to manage a wider range of experimental protocols relevant to wetware computing," such as injecting molecules and drugs into the organoids for testing, the team concludes.

News reference:

Jordan F., et al., Open and remotely accessible Neuroplatform for research in wetware computing, Frontiers in Artificial Intelligence.