Biologists are finally beginning to corral molecules, cells and whole organisms to carry out complex computations. These living processors could find use in everything from smart materials to new kinds of artificial intelligence

WHAT’S the difference between a thimbleful of bacteria and a supercomputer? Believe it or not, the bacteria contain more circuits and more processing power.

That is perhaps not so surprising when you consider that all life computes: from individual cells responding to chemical signals to complex organisms navigating their environment, information processing is central to living systems. What’s more intriguing, however, is that after decades of trying we are finally starting to corral cells, molecules and even whole organisms to carry out computational tasks for our own ends.

That isn’t to say biological computers will replace the microchips you find in your smartphone or laptop, never mind supercomputers. But as bioengineers get to grips with the wet and squishy components nature provides, they are beginning to figure out where biological computers might ultimately be useful – from smart materials and logistics solutions to intelligent machines powered by tiny amounts of energy.

If the applications seem unusual and eclectic, that is the point. “Biocomputing is not competing against conventional computers,” says Angel Goñi-Moreno at the Technical University of Madrid in Spain. “It’s a radically different point of view that could help us tackle problems in domains that were simply not reachable before.” It might even force us to rethink our assumptions about what computing is, and what it can do for us.

For decades, computing has been dominated by silicon chips. These are made up of billions of tiny switches called transistors that encode data in bits, or binary digits. If a switch is on and current is allowed to flow, this represents a 1. If it is off and the current is blocked, it represents a 0. What makes chips so powerful is the way they are wired up. Transistors are arranged into logic gates, which take one or more bits as input and then output a single bit based on simple rules. By piling millions of these simple operations on top of each other, it is possible to carry out incredibly complicated computations.
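The way simple gates stack up into computation can be sketched in a few lines of Python. This is an illustrative toy, not anything biological: it wires NAND gates, from which any other gate can be built, into a circuit that adds two binary digits.

```python
def nand(a, b):
    # The universal building block: every other logic gate
    # can be constructed from NAND gates alone.
    return 0 if (a and b) else 1

def xor(a, b):
    # Four NAND gates wired together give exclusive-or.
    n1 = nand(a, b)
    return nand(nand(a, n1), nand(b, n1))

def half_adder(a, b):
    # A handful of gates already does arithmetic: return the
    # sum bit and the carry bit of adding two binary digits.
    return xor(a, b), 1 - nand(a, b)  # AND is just NOT NAND

# Adding 1 + 1 in binary: sum bit 0, carry bit 1.
print(half_adder(1, 1))
```

Chaining such adders together, bit by bit, is essentially how a chip adds large numbers; every higher-level operation is built from similar stacks of gates.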

Read more: Memcomputer chips could solve tasks that defeat conventional computers

This has brought us a long way, and yet it isn’t the only way. At their heart, computers are just information processors, and there is growing recognition that nature is rich with such capabilities. The most obvious example lies in the nervous systems of complex organisms, which process data from the environment to direct all kinds of sophisticated animal behaviour. But even the tiniest cells are replete with intricate biomolecular pathways that respond to incoming signals by switching genes on and off, producing chemicals or self-organising into complex tissues. And ultimately, all of life’s incredible feats rely on DNA’s ability to store, replicate and transmit the genetic instructions that make them possible.

Building biological computers

Biological systems also have some peculiar advantages over existing technology. They tend to be far more energy efficient, they sustain and repair themselves, and they are uniquely adapted to processing signals from the natural world. They are also astonishingly compact, of course. “The incredible thing about biology is that if you take all the DNA that’s in 1 millilitre of bacteria, there’s enough information storage for the entire internet and there are as many circuits as billions of [silicon] processors,” says Chris Voigt, a synthetic biologist at the Massachusetts Institute of Technology.

We have been trying to leverage these abilities since the 1990s. In the past 20 years, armed with new and more powerful tools to engineer cells and molecules, researchers have finally begun to demonstrate the potential of using biological material to build computers that actually work.

At the core of the approach is the idea that cellular processes can be thought of as “biological circuits”, says Voigt – analogous to the electrical ones found in computers. These circuits involve various biomolecules interacting to take an input and process it to generate a different output, much like their silicon counterparts. By editing the genetic instructions that underpin these processes, we can now rewire these circuits to carry out functions nature never intended.

In 2019, a group at the Swiss Federal Institute of Technology in Zurich built the biological equivalent of a computer’s central processing unit (CPU) from a modified version of the protein used in CRISPR gene editing. This CPU was inserted into a cell where it regulated the activity of different genes in response to specially designed sequences of RNA, a form of genetic material, letting the researchers prompt the cell to implement logic gates akin to those in silicon computers.
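The published design is far more intricate, but the behaviour it achieves can be caricatured in code: treat each guide RNA as an input bit and expression of a target gene as the output. The gate names and truth tables below are illustrative assumptions, not the actual ETH construct.

```python
def gene_and_gate(guide_rna_1, guide_rna_2):
    # Toy cellular AND gate: the output gene is expressed
    # only when both guide-RNA inputs are present.
    return guide_rna_1 and guide_rna_2

def gene_nor_gate(guide_rna_1, guide_rna_2):
    # Toy cellular NOR gate: either guide RNA directs the
    # protein to repress the output gene, so it is expressed
    # only when neither input is present.
    return not (guide_rna_1 or guide_rna_2)
```

The point is that "expression on" and "expression off" can stand in for 1 and 0, which is what lets a gene-regulating protein play the role of a processor.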

Read more: Genetically engineered bacteria have learned to play tic-tac-toe

A group at the Saha Institute of Nuclear Physics in India took things a step further in 2021, coaxing a colony of Escherichia coli bacteria to compute the solutions to simple mazes. The circuitry was distributed between several strains of E. coli, each engineered to solve part of the problem. By sharing information, the colony successfully worked out how to navigate multiple mazes.

To be clear, these circuits operate orders of magnitude slower than electronic ones and are rudimentary in comparison. Their power lies in the opportunity they offer to implement programs that interface directly with living systems, says Voigt. They could be used to create everything from tiny robots that treat disease inside the body to complex, multi-step biomanufacturing processes. “You don’t have to beat computers to be useful. The real valuable stuff early on is just simple control over the biology,” he says.

However, thinking about cellular processes in terms of circuits could be short-sighted, says Goñi-Moreno: “We are trying to force our electrical engineering mindset into living systems and that’s not necessarily how they work.” The thing is, most biological systems aren’t limited to the binary logic of classical computers. They also don’t work through problems step-by-step, like computer chips. They are full of duplications, strange feedback loops and wildly different processes operating side-by-side at various speeds.

Failing to account for this complexity often results in biological circuits not performing as expected, says Goñi-Moreno, and it means the full functionality of cells isn’t being exploited. Conversely, finding ways to model and rewire biochemical interactions within and between living cells could bring more ambitious goals into reach, he says. To that end, Goñi-Moreno is trying to create multicellular communities of soil bacteria that can switch between removing different pollutants depending on which is more prevalent.

Biology’s powers of computation might also be exploited in ways that are entirely divorced from their natural context. Heiner Linke at Lund University in Sweden has been experimenting with a radically different approach to biocomputing, using tiny protein filaments propelled around a maze by molecular motors. The proposal is aimed at solving a class of tricky computational challenges called NP-complete, for which the only known way to find the answer is to try every possible solution.

These problems crop up in everything from logistics network planning to computer chip design. But they present a challenge for conventional computers because the sequential way in which they operate means that as the problem gets bigger, the time taken to check every solution rises exponentially.

Linke’s approach offers a workaround. The structure of the maze is carefully designed to encode the problem that needs to be solved, with every possible path through it corresponding to a potential solution. For instance, this could involve finding the quickest route a truck could take between multiple stops. As the filaments whizz around the maze, they explore every option such that, by counting the number of filaments that pop out at predetermined exits, you can work out which path represents the correct answer.
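The encoding idea can be sketched in software. The toy below uses the subset-sum problem as an assumed example (the article doesn't specify which NP-complete problem the maze encodes): at each junction a filament either picks up an element's value or skips it, so every path corresponds to one subset and the exit it reaches is that subset's sum. This simulates the counting sequentially, of course; the physical device would run all paths at once.

```python
from itertools import product

def explore_network(elements):
    # Toy model of a problem-encoding maze: each path either
    # includes or skips each element, so all 2^n subsets get
    # explored, and a path's exit is labelled by its sum.
    exits = {}
    for choices in product([0, 1], repeat=len(elements)):
        total = sum(e for e, c in zip(elements, choices) if c)
        exits[total] = exits.get(total, 0) + 1  # filaments per exit
    return exits

counts = explore_network([2, 5, 9])
# Any exit with a non-zero filament count marks an achievable sum.
print(sorted(counts))
```

Reading off which exits received filaments answers the question "which sums can be built from these numbers?" without inspecting the paths themselves, which is exactly what counting filaments at the maze exits achieves.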

The beauty of the approach, says Linke, is that the filaments explore all routes simultaneously. That means solving a bigger problem doesn’t require more time, just more filaments. And because they use about 10,000 times less energy per operation than electronic transistors, scaling up is far more feasible than with conventional computers.