This is such a common occurrence that many functions already exist to deal with Boolean data. Functions such as OR, AND, NAND, NOR, and NOT are all Boolean functions. They take inputs that are true or false and output something that is true or false.
These functions have driven great advances in electronics through digital logic gates, and they can be composed together to solve many problems. But how would we go about constructing something like this?
A simple example of modeling the OR function would be the following:
OR(a, b) = min(1, a + b)
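As a quick sketch, this version of OR can be checked against the Boolean truth table directly, using 0 for false and 1 for true:

```python
def OR(a, b):
    # Model OR as min(1, a + b): the sum is 0 only when both inputs
    # are 0, and min caps the (1, 1) case at 1.
    return min(1, a + b)

# Walk the full truth table.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", OR(a, b))
```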
Perceptrons
Perceptrons take the idea of Boolean logic even further, into fuzzier territory. They usually involve returning a value based on whether a threshold is met. Let's say that you're a teacher and you wish to assign pass/fail grades to your students at the end of the quarter. Obviously you need to come up with a way of separating the people who failed from the ones who didn't. This can be quite subjective, but it usually follows a general procedure of:
def threshold(x):
    if sum(w * xi for w, xi in zip(weights, x)) + b > 0.5:
        return 1
    else:
        return 0
x is a vector of all the grades you collected over the entire quarter, and weights is a vector of weightings. For instance, you might want to weight the final grade higher. b is just a freebie to the students for showing up.
Using such a simple formula, we could search for the optimal weightings by determining a priori how many people we'd like to fail. Let's say we have 100 students and only want to fail the bottom 10%. This goal is something we can actually code.
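One way to code that goal, sketched here with hypothetical grade data (the weights and grades below are illustrative, not the book's): pick the cutoff so that exactly the bottom 10% of weighted scores fall below it.

```python
import random

random.seed(42)

# Hypothetical data: 100 students, 3 grades each (two midterms and a final).
students = [[random.uniform(0.0, 1.0) for _ in range(3)] for _ in range(100)]

# Weight the final grade higher, as suggested above.
weights = [0.25, 0.25, 0.5]

def weighted_score(grades):
    return sum(w * g for w, g in zip(weights, grades))

# Choose the cutoff a priori so that exactly the bottom 10% fail:
# the 11th-lowest score becomes the passing threshold.
scores = sorted(weighted_score(s) for s in students)
cutoff = scores[len(scores) // 10]

def passes(grades):
    return 1 if weighted_score(grades) >= cutoff else 0

failed = sum(1 for s in students if passes(s) == 0)
print(failed)  # 10 students fail
```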
How to Construct Feed-Forward Neural Nets
There are many different kinds of neural networks, but this chapter will focus on feed-forward networks and recurrent networks.
What makes neural networks special is their use of a hidden layer of weighted functions, called neurons, with which you can effectively build a network that approximates many other functions (Figure 8-2). Without a hidden layer of functions, neural networks would be just a set of simple weighted functions.
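A minimal sketch of a feed-forward pass, assuming fully connected layers with a sigmoid activation; the weights below are hand-picked placeholders, not learned values:

```python
import math

def sigmoid(z):
    # Squash a weighted sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weights, biases):
    # One fully connected layer: each neuron takes a weighted sum of the
    # inputs, adds its bias, and passes the result through the activation.
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def feed_forward(x, layers):
    # Push the input vector through each layer in turn.
    for weights, biases in layers:
        x = layer(x, weights, biases)
    return x

# A tiny 2-2-1 network: two inputs, one hidden layer of two neurons,
# one output neuron.
hidden = ([[1.0, -1.0], [-1.0, 1.0]], [0.0, 0.0])
output = ([[1.0, 1.0]], [0.0])
print(feed_forward([0.5, 0.25], [hidden, output]))
```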
Figure 8-2: A feed-forward network

Networks are commonly described by the number of neurons in each layer. For example, a 20-10-5 network takes 20 inputs, has 10 hidden neurons, and produces 5 outputs, while a 20-7-7-5 network (the two middle 7s represent two hidden layers with 7 neurons each) has a second hidden layer (see Figure 8-3).

Figure 8-3: The input layer of a neural network
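Layer notation such as 20-10-5 translates directly into a parameter count, assuming fully connected layers with one bias per neuron (an assumption for illustration; the text does not specify biases):

```python
def parameter_count(layer_sizes):
    # Each neuron connects to every neuron in the previous layer
    # (one weight per connection) and carries one bias term.
    weights = sum(a * b for a, b in zip(layer_sizes, layer_sizes[1:]))
    biases = sum(layer_sizes[1:])
    return weights + biases

print(parameter_count([20, 10, 5]))    # 20*10 + 10*5 + (10 + 5) = 265
print(parameter_count([20, 7, 7, 5]))  # 20*7 + 7*7 + 7*5 + (7 + 7 + 5) = 243
```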

Take the exclusive-or function (XOR) as an example; its truth table is shown in Table 8-1.

Table 8-1: XOR truth table

A B XOR(A,B)
0 0 0
0 1 1
1 0 1
1 1 0

We can visualize XOR as a Venn diagram (Figure 8-4). Given two sets of data, the shaded region shows what the XOR function captures.

Figure 8-4: The XOR function in a Venn diagram (image source: Wikimedia)
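As a sketch of why the hidden layer matters, here is XOR built from the same kind of hard-threshold neurons as the perceptron above; the weights are hand-picked for illustration rather than learned:

```python
def step(z):
    # Hard threshold activation, mirroring the perceptron's cutoff.
    return 1 if z > 0.5 else 0

def xor(a, b):
    # One hidden neuron fires for OR, another for NAND; the output
    # neuron fires only when both hidden neurons do (an AND).
    h1 = step(a + b)          # OR: fires when either input is 1
    h2 = step(2 - a - b)      # NAND: fires unless both inputs are 1
    return step(h1 + h2 - 1)  # AND of the two hidden neurons

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))
```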

A perceptron can only output true or false. However, a neural network's neurons can output values in between true and false.
