4 Artificial Neural Networks
There is nothing new about the idea of drawing inspiration from the brain for information processing. Back in 1943, Warren S. McCulloch and Walter Pitts (McCulloch and Pitts 1943) published an article attempting to understand how the brain could build very complex models out of basic cells, called neurons, linked together. The neuron model proposed by McCulloch and Pitts was highly simplified, yet this contribution proved essential to the development of artificial neural networks.
The next major contribution in the study of neural networks was the concept of the perceptron, introduced by Frank Rosenblatt in 1958 (Rosenblatt 1958). In essence, the perceptron is an improvement on the neuron proposed by McCulloch and Pitts: each input is first assigned a weight before being summed, and a threshold function, which may or may not be particularly complex, determines the state of the output. We will explore the perceptron in detail in section 4.1.
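As a rough illustration of this weighted-sum-and-threshold mechanism, the following Python sketch shows a single perceptron-style unit. The weights, bias and step function used here are illustrative assumptions for this example, not values or details taken from Rosenblatt's original formulation, which section 4.1 covers properly.

```python
# Minimal perceptron-style unit: a weighted sum of the inputs followed by a
# simple step threshold. All numerical values below are illustrative only.

def perceptron_output(inputs, weights, bias=0.0, threshold=0.0):
    """Return 1 if the weighted sum of the inputs exceeds the threshold, else 0."""
    weighted_sum = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if weighted_sum > threshold else 0

# Example: a unit that fires only when both inputs are active (an AND-like gate).
print(perceptron_output([1, 1], weights=[0.6, 0.6], threshold=1.0))  # -> 1
print(perceptron_output([1, 0], weights=[0.6, 0.6], threshold=1.0))  # -> 0
```

With suitably chosen weights and threshold, such a unit can reproduce simple logical functions, which hints at how networks of these units can be combined into more complex models.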
At around the same time as Rosenblatt's perceptron, J. von Neumann's book, "The Computer and the Brain", was published. The book remained unfinished, as the mathematician, who established the architecture of our computers (see Chapter 1), died before its publication. In it, von Neumann explains how the brain can be viewed as a computing machine. The book is speculative in nature, but von Neumann discusses the differences between the brain and the computers of his day, such as information ...