What can the human brain do for a computer? There's at least one team of researchers that thinks it might have the answer.
Working at IBM Research–Almaden in San Jose, California, they have just released more details of TrueNorth, a computer chip composed of one million digital "neurons".
Under way for several years, the project abandons traditional computer architecture for one inspired by biological synapses and axons. The latest results, published in Science, provide a timely reminder of the promise of brain-inspired computing.
The human brain still crushes any modern machine when it comes to tasks like vision or voice recognition. What's more, it manages to do so with less energy than it takes to power a light bulb.
Building those qualities into a computer is an alluring prospect to many researchers, like Kwabena Boahen of Stanford University in California.
"The first time I learned how computers worked, I thought it was ridiculous," he says. "I basically felt there had to be a better way."
Aping the brain's structure could help us build computers that are far more powerful and efficient than today's, says TrueNorth team leader Dharmendra Modha. "We want to approximate the anatomy and physiology, the structure and dynamics of the brain, within today's silicon technology," he says. "I think that the chip and the associated ecosystem have the potential to transform science, technology, business, government and society."
But how best to go about building a proper artificial brain is a matter of debate. Neurons are analogue: they operate using an ever-changing continuum of signals. This stands in stark contrast to computers, digital machines that work with discrete 1s and 0s. Brain-inspired computing projects must find a way to navigate somewhere between the analogue and digital realms. The former more accurately hews to how the brain works, but the latter can be easier to program and scale.
With TrueNorth, IBM is taking a digital approach, building a custom computer system that essentially mimics a neuron's activity.
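To get a feel for what "digitally mimicking a neuron" means, consider a leaky integrate-and-fire neuron, a textbook spiking model, simulated in discrete time steps. This is a rough illustration of the general idea, not IBM's actual neuron model, whose parameters and update rules are more elaborate; the weight, leak and threshold values below are arbitrary.

```python
# A minimal sketch of a digital spiking neuron: membrane potential
# accumulates weighted input spikes in discrete time steps, leaks a
# little each step, and emits a spike when it crosses a threshold --
# all ordinary arithmetic on numbers, no analogue circuitry.

def simulate_neuron(input_spikes, weight=0.3, leak=0.05, threshold=1.0):
    """Return the list of time steps at which the neuron fires."""
    potential = 0.0
    fired_at = []
    for t, spike in enumerate(input_spikes):
        potential += weight * spike             # integrate weighted input
        potential = max(0.0, potential - leak)  # constant leak toward rest
        if potential >= threshold:              # threshold crossing: spike
            fired_at.append(t)
            potential = 0.0                     # reset after firing
    return fired_at

# A steady train of input spikes repeatedly drives the neuron over threshold.
print(simulate_neuron([1, 1, 1, 1, 1, 1, 1, 1]))  # prints [3, 7]
```

Because every quantity here is just a number updated by simple rules, a digital chip can run millions of such neurons side by side, which is the trade-off TrueNorth makes in exchange for the fidelity of analogue signalling.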
Other groups, like Boahen's, are still largely digital but incorporate analogue components that can handle information streaming in from many sources.
And while there are a number of groups hacking away at the problem, there is no obvious test yet to declare who is doing best.
Comparisons involve factors such as device size, power consumption, speed and, of course, which system can handle the trickier tasks, but how much importance you give to each factor is a matter of preference.
Explaining the quirks
These projects may also help us accomplish quite the opposite goal: to see what we can learn about the brain by reverse-engineering it. There are still many unanswered questions about how the brain works and why it works so well. If a system builds in some of the brain's mysterious quirks – such as the time delay between a neuron firing and the next occasion it fires, or the occasional signal that gets sent backward along the line – perhaps it can help illuminate why they're there in the first place.
"These things, we know that they are there in biology. There is experimental proof for that. But we do not know what their computational purpose is," says Karlheinz Meier, co-director of the Human Brain Project based in Lausanne, Switzerland. "It may just be an artefact of evolution, which is not important, but it also may have a big impact on the ability to compute."
Journal reference: Science, DOI: 10.1126/science.1254642