3. Neural Networks
We learned about perceptrons in the previous chapter, and there is both good news and bad news. The good news is that perceptrons can represent complicated functions. For example, a perceptron can (theoretically) represent even the complicated processes performed by a computer, as described in the previous chapter. The bad news is that the weights that produce the expected outputs for given inputs still have to be set by hand. In the previous chapter, we consulted the truth tables of the AND and OR gates to determine the appropriate weights manually.
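As a reminder of what "setting weights manually" means, here is a minimal sketch of a perceptron gate of the kind discussed in the previous chapter. The specific weights and threshold below (0.5, 0.5, 0.7 for AND; 0.5, 0.5, 0.2 for OR) are one choice among many that satisfies each truth table, not the only valid values.

```python
# A perceptron AND gate with hand-picked parameters (a sketch; the
# values 0.5, 0.5, 0.7 are one of many choices satisfying the truth table).
def AND(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.7
    tmp = x1 * w1 + x2 * w2
    return 1 if tmp > theta else 0

# OR differs only in its parameters: a lower threshold fires on any input.
def OR(x1, x2):
    w1, w2, theta = 0.5, 0.5, 0.2
    tmp = x1 * w1 + x2 * w2
    return 1 if tmp > theta else 0

# Verify both gates against their truth tables.
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, AND(x1, x2), OR(x1, x2))
```

The point is that a human chose those numbers by inspecting the truth table; nothing in the perceptron itself found them.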
Neural networks exist to solve this bad news. More specifically, one important property of a neural network is that it can learn the appropriate weights automatically from data.