Appendix A. Interactive Neural Networks
This appendix explores fundamental notions of neural networks with basic Python code, using both simple and shallow neural networks as examples. The goal is to provide a solid grasp of, and intuition for, the important concepts that often disappear behind the high-level, abstract APIs of standard machine learning and deep learning packages.
The appendix has the following sections:
- “Tensors and Tensor Operations” covers the basics of tensors and the operations implemented on them.
- “Simple Neural Networks” discusses simple neural networks, or neural networks that only have an input and an output layer.
- “Shallow Neural Networks” focuses on shallow neural networks, or neural networks with one hidden layer.
Tensors and Tensor Operations
In addition to implementing several imports and configurations, the following Python code shows the four types of tensors relevant for the purposes of this appendix: scalar, vector, matrix, and cube tensors. Tensors are generally represented as potentially multidimensional ndarray objects in Python. For more details and examples, see Chollet (2017, ch. 2):
In [1]: import math
        import numpy as np
        import pandas as pd
        from pylab import plt, mpl
        np.random.seed(1)
        plt.style.use('seaborn')
        mpl.rcParams['savefig.dpi'] = 300
        mpl.rcParams['font.family'] = 'serif'
        np.set_printoptions(suppress=True)

In [2]: t0 = np.array(10)
        t0
Out[2]: array(10)

In [3]: t1 = np.array((2, 1))
        t1
Out[3]: array([2, 1])
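The cells above construct only the scalar tensor t0 and the vector tensor t1. To round out the four tensor types named in the text, the following sketch (an illustrative addition, not part of the original session; the names t2 and t3 and the chosen shapes are assumptions) builds a matrix tensor and a cube tensor as two- and three-dimensional ndarray objects:

    import numpy as np

    # matrix tensor: a two-dimensional ndarray (rank-2 tensor)
    t2 = np.arange(10).reshape(5, 2)
    print(t2.ndim, t2.shape)  # prints: 2 (5, 2)

    # cube tensor: a three-dimensional ndarray (rank-3 tensor)
    t3 = np.arange(16).reshape(2, 4, 2)
    print(t3.ndim, t3.shape)  # prints: 3 (2, 4, 2)

The ndim attribute reports the number of dimensions (the tensor's rank), while shape gives the number of elements along each dimension.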