April 2026
461 pages
This is very easy! First, the network is initialized, and then the predict method is called (see Listing 5.6). We initialize the network with weights that we had previously determined as a possible solution to the XOR problem. It's important to note that these weights are tailored to the sigmoid activation function, not to ReLU!
def main():
    # Initialization of the weights
    W_HI = np.matrix([[0.0, 0.0, 0.0], [-10, 20.0, 20.0], [30, -20.0, -20.0]])
    W_OH = np.matrix([[0.0, 0.0, 0.0], [-30, 20.0, 20.0]])
    weights = []
    weights.append(W_HI)
    weights.append(W_OH)
    nn = MLP(weights=weights)
    # Output network
    nn.print()
    # Test
    X = np.array([[1.0, 1.0, 1.0], [1.0, 0, 1.0], [1.0, 1.0, 0], [1.0, 0, 0]])
    y = np.array([0, 1.0, 1.0, 0])
    print('Predict:')
    for idx, x in enumerate ...
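Since the MLP class itself is not shown in this excerpt, here is a minimal stand-in for the forward pass that these weights rely on. It is a sketch under two assumptions: the activations are sigmoid (as the text stresses), and the first component of each layer is clamped to 1 to act as the bias unit, matching the all-zero first rows of W_HI and W_OH. The sigmoid and predict helpers are illustrative names, not the book's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(weights, x):
    # Forward pass through all layers. Assumption: after each
    # activation, component 0 is reset to 1 so it serves as the
    # bias input for the next layer (this is why the first row of
    # every weight matrix above is all zeros).
    a = np.array(x, dtype=float)
    for W in weights:
        a = sigmoid(np.asarray(W) @ a)
        a[0] = 1.0          # restore the bias unit
    return a[1]             # component 0 is the unused bias slot

W_HI = np.array([[0.0, 0.0, 0.0], [-10.0, 20.0, 20.0], [30.0, -20.0, -20.0]])
W_OH = np.array([[0.0, 0.0, 0.0], [-30.0, 20.0, 20.0]])
weights = [W_HI, W_OH]

# The four XOR patterns from Listing 5.6; the leading 1.0 is the bias input.
for x, y in [([1.0, 1.0, 1.0], 0), ([1.0, 0.0, 1.0], 1),
             ([1.0, 1.0, 0.0], 1), ([1.0, 0.0, 0.0], 0)]:
    print(x, round(predict(weights, x), 3), 'target:', y)
```

Walking one pattern through by hand shows why the weights work: for input (1, 1), the hidden pre-activations are -10+20+20 = 30 and 30-20-20 = -10, giving hidden activations of roughly 1 and 0; the output pre-activation is then -30+20·1+20·0 = -10, and sigmoid(-10) ≈ 0, which is exactly the XOR target for (1, 1).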