The brain as a computer
Modelling computers after the brain could revolutionize robotics and big data
- In the human brain, computation and data storage are accomplished together locally in a vast network consisting of roughly 100 billion neural cells (neurons) and more than 100 trillion connections (synapses).
- A crucial difference between brains and computers is that the brain accomplishes all its information processing without a central clock to synchronize it.
- Deep-learning networks, however, remain a long way from the computational performance, energy efficiency, and learning capabilities of biological brains.
- The big gap between the brain and today’s computers is perhaps best underscored by looking at large-scale simulations of the brain.
- One such simulation, of 1.73 billion neurons, consumed 10 billion times as much energy as an equivalently sized portion of the brain, even though it used very simplified neuron models and did not perform any learning.
- The reason is that simulating the brain on a conventional computer requires solving billions of coupled differential equations that describe the dynamics of cells and networks: analog processes such as the movement of charges across a cell membrane.
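To make the kind of equation involved concrete, the sketch below integrates a single leaky integrate-and-fire neuron, one of the simplified membrane-dynamics models commonly used in large-scale simulations. All parameter values here are illustrative assumptions, not taken from the simulation described in the article; a real run couples billions of such equations through synaptic inputs.

```python
def simulate_lif(current=2.0, t_total=100.0, dt=0.1):
    """Forward-Euler integration of a leaky integrate-and-fire neuron.

    Membrane potential V obeys  tau * dV/dt = -(V - V_rest) + R * I,
    with a spike emitted and V reset whenever V crosses threshold.
    Parameters (ms, mV, Mohm, nA) are illustrative, not from any
    specific simulator.
    """
    tau, v_rest, v_thresh, v_reset, r = 10.0, -65.0, -50.0, -65.0, 10.0
    v = v_rest
    spike_times = []
    for step in range(int(t_total / dt)):
        dv = (-(v - v_rest) + r * current) / tau  # the coupled-ODE right-hand side
        v += dv * dt
        if v >= v_thresh:            # threshold crossing: record a spike
            spike_times.append(step * dt)
            v = v_reset              # reset after spiking
    return spike_times

spikes = simulate_lif()
print(f"{len(spikes)} spikes in 100 ms")
```

Even this one-neuron toy makes the cost visible: every neuron requires an update at every time step, which is why conventional hardware pays so dearly for brain-scale networks.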
- One of the core arguments behind what Carver Mead came to call “neuromorphic” computing was that semiconductor devices, operated in a certain mode, can follow the same physical rules as neurons, and that this analog behavior can be used to compute with a high level of energy efficiency.
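Mead's argument can be illustrated with the standard weak-inversion (subthreshold) transistor relation: drain current grows exponentially with gate voltage, governed by the same Boltzmann statistics that set the exponential voltage dependence of ionic currents through a neuron's membrane channels. The sketch below is illustrative only; the leakage scale `i0` and slope factor `n` are assumed placeholder values, not figures from the article.

```python
import math

def subthreshold_current(v_gs, i0=1e-12, n=1.5, v_t=0.0258):
    """Subthreshold (weak-inversion) MOSFET drain current.

    I = I0 * exp(Vgs / (n * VT)), where VT = kT/q is the thermal
    voltage (~25.8 mV at room temperature). i0 and n are illustrative
    assumptions; the exponential form is the point of the analogy.
    """
    return i0 * math.exp(v_gs / (n * v_t))

# Current changes by a factor of e for every n*VT (~39 mV) of gate voltage,
# the same exponential sensitivity that biological ion channels exhibit.
ratio = subthreshold_current(0.2) / subthreshold_current(0.2 - 1.5 * 0.0258)
print(f"current ratio per n*VT step: {ratio:.3f}")
```

This exponential regime is what lets analog neuromorphic circuits emulate neural dynamics directly in device physics, rather than by numerically solving equations.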
- The TrueNorth chip, developed by Dharmendra Modha and his colleagues at the IBM Research laboratory in Almaden, Calif., abandons the use of microprocessors as computational units and is a truly neuromorphic computing system, with computation and memory intertwined.
- The furthest away from conventional computing and closest to the biological brain is the BrainScaleS system, developed at Heidelberg University, in Germany, for the Human Brain Project.