Optimizing with the cross-entropy loss function

The last thing we need to do is define the loss function and set up the optimizer. As we have already seen, cross-entropy is a suitable loss for a binary classification problem. Fortunately, TensorFlow.js provides the major loss functions out of the box. This time we are going to use the tf.losses.sigmoidCrossEntropy API to compute the loss:

// A function to return the loss value calculated from the given label and prediction.
// Note: tf.losses.sigmoidCrossEntropy expects the labels first, then the logits.
const loss = (pred, label) => {
  return tf.losses.sigmoidCrossEntropy(label, pred).asScalar();
}

// Adam optimizer
const optimizer = tf.train.adam(0.07);

As usual, we can iterate the optimization process. The following experiment iterates the optimization ...
