Glossary

Activation function

The function at the last stage of a neural network layer. For example, a rectified linear unit (relu) function may be applied to the result of the matrix multiplication to generate the final output of a dense layer. An activation function can be linear or nonlinear. Nonlinear activation functions increase the representational power (or capacity) of a neural network. Examples of nonlinear activations include sigmoid, hyperbolic tangent (tanh), and the aforementioned relu.
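The element-wise behavior of these activations can be sketched in plain JavaScript (these hypothetical helpers mirror what TensorFlow.js provides as tensor operations such as tf.relu, tf.sigmoid, and tf.tanh):

```javascript
// Common nonlinear activation functions, defined on a single number.
const relu = (x) => Math.max(0, x);           // zero for negatives, identity otherwise
const sigmoid = (x) => 1 / (1 + Math.exp(-x)); // squashes to the range (0, 1)
const tanh = (x) => Math.tanh(x);              // squashes to the range (-1, 1)

// Applying an activation element-wise to a dense layer's raw
// (pre-activation) output:
const preActivation = [-2, -0.5, 0, 0.5, 2];
const activated = preActivation.map(relu); // [0, 0, 0, 0.5, 2]
```

In a layer, the activation is applied to every element of the matrix-multiplication result, which is why a single scalar function suffices to describe it.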

Area under the curve (AUC)

A single number used to quantify the shape of an ROC curve. It is defined as the definite integral under the ROC curve, from false positive rate 0 to 1. See ROC curve.
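Given a list of points on the ROC curve, that definite integral can be approximated numerically. A minimal sketch using the trapezoidal rule (the function name and input format are illustrative; real libraries derive the ROC points from classifier scores):

```javascript
// Approximate AUC from ROC points [fpr, tpr], sorted by ascending
// false positive rate, using the trapezoidal rule.
function auc(rocPoints) {
  let area = 0;
  for (let i = 1; i < rocPoints.length; i++) {
    const [x0, y0] = rocPoints[i - 1];
    const [x1, y1] = rocPoints[i];
    area += (x1 - x0) * (y0 + y1) / 2; // area of one trapezoid
  }
  return area;
}

// A random classifier's ROC curve is the diagonal, giving AUC = 0.5:
auc([[0, 0], [0.5, 0.5], [1, 1]]); // → 0.5
```

A perfect classifier's curve rises straight to a true positive rate of 1 at false positive rate 0, giving an AUC of 1.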

Axis

In the context of TensorFlow.js, ...
