In think tanks around the world, analysts try to identify new developments in science and technology that might spur the next big technology wave. These trend spotters will undoubtedly be interested in new research conducted by Prof. Elisha Moses of the Physics of Complex Systems Department and his former research students Drs. Ofer Feinerman and Assaf Rotem. This team has taken the first step in creating circuits and logic gates made of live nerves grown in the lab. The research, which recently appeared in Nature Physics, could have implications for the future creation of biological interfaces between human brains and artificial systems.
Neurons are complex cells, and the networks they form are even more complex: In the brain, large numbers of connections to other neurons are what enable us to think. But when the same cells are grown in lab cultures, they turn out to be quite sluggish, reverting to a very primitive level of functioning. Why does this happen, and how can cells grown in culture be coaxed into becoming more "brainy"?
Moses, Feinerman and Rotem investigated the idea that the physical layout of a nerve network might be the key to improving its function. Being physicists – scientists who are used to working with simplified systems – they created a model nerve network in a single dimension, one defined by a straight groove etched on a glass plate. The neurons were "allowed" to develop only along this line. The researchers found that nerve cells grown this way could, for the first time, be stimulated with a magnetic field. (Until then, neurons in culture had been successfully stimulated only with electricity.)
The scientists then investigated the lines further to see if the width of the neuron stripe had any effect on the signals transmitted from cell to cell. Growing the neuronal networks on lines of varying widths, they discovered a threshold thickness – one that allowed for the development of about 100 axons (nerve cell extensions). Below this number, the chances of a neuron passing on a signal it received were iffy. This is because the cells are "programmed" to receive a minimum number of incoming signals before they fire off another one in response. A large number of axonal connections to other nerve cells – say, several hundred – will more or less ensure a reliable response, whereas cells that are connected to just a few others will fire only sporadically. With about 100 neuron connections, signals are sometimes passed on and sometimes not, but adding just a few more raises the odds of a response quite a bit.
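The threshold behavior described above can be sketched as a simple Monte Carlo model. In this sketch, each axon in the stripe relays an incoming signal independently, and the downstream neuron fires only if enough inputs arrive together; the per-axon probability and the firing threshold are illustrative assumptions, not values reported in the study.

```python
import random

def relay_probability(n_axons, p_active=0.5, threshold=50, trials=10_000):
    """Estimate the chance a downstream neuron passes a signal on,
    given n_axons incoming axons that each deliver the signal
    independently with probability p_active. The neuron fires only
    when enough inputs arrive at once (the threshold). All parameter
    values here are illustrative assumptions."""
    fires = 0
    for _ in range(trials):
        arriving = sum(random.random() < p_active for _ in range(n_axons))
        if arriving >= threshold:
            fires += 1
    return fires / trials

# Relay reliability jumps sharply once the stripe carries ~100 axons.
for n in (20, 90, 100, 140, 300):
    print(f"{n:3d} axons -> relay probability {relay_probability(n):.2f}")
```

Running this shows the effect the researchers observed: well below 100 axons the relay almost never succeeds, near 100 it succeeds only sometimes, and a few dozen more axons make it nearly certain.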
Putting these findings to work, the scientists used two thin stripes of around 100 axons each to create a logic gate similar to one in an electronic computer. Both of these "wires" were connected to a small cluster of nerve cells. When the cluster received a signal along just one of the "wires," the outcome was uncertain; but a signal sent along both "wires" simultaneously was assured of a response. This type of structure is known as an AND gate.
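The AND-gate logic can be illustrated by extending the same threshold idea to two input stripes: one stripe alone only occasionally delivers enough spikes to make the receiving cells fire, while two stripes together almost always do. The axon counts, relay probability, and threshold below are illustrative assumptions for the sketch, not measured values.

```python
import random

def stripe_spikes(stimulated, n_axons=100, p_relay=0.5):
    """Spikes delivered by one ~100-axon stripe when stimulated; each
    axon relays independently with probability p_relay. Parameters
    are illustrative assumptions."""
    if not stimulated:
        return 0
    return sum(random.random() < p_relay for _ in range(n_axons))

def and_gate(a, b, threshold=55):
    """Receiving cells fire when the combined input from both stripes
    crosses a threshold that one stripe alone rarely reaches."""
    return stripe_spikes(a) + stripe_spikes(b) >= threshold

def rate(a, b, trials=5_000):
    """Empirical firing rate for a given pair of inputs."""
    return sum(and_gate(a, b) for _ in range(trials)) / trials

for a, b in [(False, False), (True, False), (False, True), (True, True)]:
    print(f"inputs ({a}, {b}) -> firing rate {rate(a, b):.2f}")
```

With both inputs active the gate fires essentially every time; with one input it fires only sporadically, matching the "uncertain" single-wire behavior described above.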
The next structure the team created was yet more complex: Triangles fashioned from the neuron stripes were lined up in a row, point to rib, in a way that forced the axons to develop and send signals in one direction only. Several of these segmented shapes were then linked to one another in a loop to create a closed circuit. The regular relay of nerve signals around the circuit turned it into a sort of biological clock or pacemaker.
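The pacemaker behavior of the closed loop can be sketched with a toy model: the one-way triangle segments act like diodes, so a single pulse circulates around the ring and passes a readout point once per lap, at a fixed period. The segment count and per-segment delay below are hypothetical numbers chosen only to illustrate the regular ticking.

```python
def pacemaker_ticks(n_segments=6, delay_ms=10.0, total_ms=200.0):
    """One-way ring of n_segments nerve-stripe 'diodes': a pulse
    entering segment 0 re-emerges there every n_segments * delay_ms.
    Returns the times (ms) at which the pulse passes the readout
    point. Segment count and delay are illustrative assumptions."""
    period = n_segments * delay_ms
    t, ticks = 0.0, []
    while t <= total_ms:
        ticks.append(t)
        t += period
    return ticks

print(pacemaker_ticks())  # → [0.0, 60.0, 120.0, 180.0]
```

The regular spacing of the output is what makes the circuit behave like a biological clock: the interval is set by the loop's geometry (number of segments times conduction delay), not by any external trigger.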
Moses: "By creating simple computation tools from neurons with unreliable connections, we learn about the problems the brain must overcome when designing its own complicated 'thought machinery'. Simple animals rely on single nerves to direct their behavior, so these must be very reliable; but our nervous system is built of huge networks of nerve cells that enable a wide range of responses. The extreme sophistication of our neurons carries with it a high risk of error, and a large number of cells is needed to reduce the likelihood of mistakes. Right now we're asking: 'What do nerve cells grown in culture require to be able to carry out complex calculations?' As we find answers, we get closer to understanding the conditions needed for creating a synthetic, many-neuron 'thinking' apparatus."
Rotem: "Neuron-based circuits are still much slower than today's electronic circuits. But they may hold the promise of massively parallel calculations of a kind that today's computers cannot perform."
Today, this research is contributing to our understanding of the physical principles underlying thought and brain function. In the more distant future, it might serve as the basis for designing new methods of connecting the human brain to various artificial devices and systems.