Computer engineers and scientists have long recognized that the brain computes in an entirely different way from a classical digital computer. The brain is a massively parallel computer that, on certain tasks, outperforms any modern machine. It routinely performs, within milliseconds, pattern-recognition tasks such as picking out a face in a crowd, whereas a sequential computer may spend days on a recognition task of far less difficulty and still fail to converge.

In 1943 McCulloch and Pitts introduced the idea of an artificial neural network, modeled after the perceived behavior of the brain. An artificial neural network is usually trained for a given task, such as recognizing a specific pattern in its input signals, although with effort it can be trained to recognize several different patterns. Such networks are quite adaptable, and if designed to do so, they can be retrained to recognize a completely different set of patterns.
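The original McCulloch-Pitts neuron can be sketched in a few lines. This is a minimal illustration, not code from the book: the neuron takes binary inputs, any active inhibitory input vetoes firing, and the neuron fires when the sum of excitatory inputs reaches a fixed threshold.

```python
# Minimal sketch of a McCulloch-Pitts neuron (1943 model).
# Inputs and output are binary (0 or 1); weights are fixed, not learned.
def mcculloch_pitts(excitatory, inhibitory, threshold):
    if any(inhibitory):   # any active inhibitory input vetoes firing
        return 0
    return 1 if sum(excitatory) >= threshold else 0

# Example: a two-input AND gate (threshold 2, no inhibitory inputs).
def and_gate(a, b):
    return mcculloch_pitts([a, b], [], 2)
```

By choosing the threshold and wiring, such units can realize basic logic functions, which is why McCulloch and Pitts viewed networks of them as capable of general computation.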

Artificial neural networks employ standard cells of artificial neurons, usually connected as a tree with a great many inputs. The cells work in parallel and must be trained to meet given requirements. Training proceeds by systematically varying the synaptic weights associated with each artificial neuron. Synaptic weights are analog quantities: real numbers that may be positive or negative. Overall, the synaptic model is best described as an analog-to-digital converter. This is in contrast to the all-digital model for learning presented in this ...
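The idea of training by systematically varying real-valued synaptic weights can be illustrated with the classic perceptron learning rule. This is a hedged sketch of the general analog-weight approach described above, not the book's all-digital model; the function names and learning rate are illustrative choices.

```python
# Sketch of learning by adjusting analog (real-valued) synaptic weights,
# using the classic perceptron rule as a stand-in example.
def predict(weights, bias, x):
    s = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1 if s >= 0 else 0

def train(samples, epochs=20, lr=0.1):
    n = len(samples[0][0])
    weights, bias = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in samples:
            err = target - predict(weights, bias, x)
            # nudge each synaptic weight toward the desired response
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
            bias += lr * err
    return weights, bias

# Example: learn a two-input OR function from labeled samples.
or_samples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w, b = train(or_samples)
```

Note how the trained "knowledge" lives entirely in a handful of signed real numbers, which is the analog character of conventional neural-network learning that the passage contrasts with an all-digital model.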
