The train function

We can begin to improve our model with a new function that repeatedly calls the optimization step until the cost reaches its minimum, at which point the model is fully trained. Here's what it looks like:

# Python implementation
def train(results, weight, bias, xs, learningRate, iterations):
    error = 0
    for i in range(iterations):
        weight, bias = optimizeWeightBias(results, weight, bias, xs, learningRate)
        error = cost(results, weight, bias, xs)
        print("Iteration: {}, weight: {:.4f}, bias: {:.4f}, error: {:.2}".format(i, weight, bias, error))
    return weight, bias
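Since optimizeWeightBias and cost are defined elsewhere, here is a self-contained sketch of the full training loop. The helper bodies below are illustrative assumptions (one gradient-descent step on a mean-squared-error cost for a simple linear model), not necessarily the exact implementations used earlier:

```python
def cost(results, weight, bias, xs):
    # Mean squared error between the model's predictions and the observed results.
    n = len(xs)
    return sum((results[i] - (weight * xs[i] + bias)) ** 2 for i in range(n)) / n

def optimizeWeightBias(results, weight, bias, xs, learningRate):
    # One gradient-descent step: move weight and bias against the MSE gradient.
    n = len(xs)
    dw = sum(-2 * xs[i] * (results[i] - (weight * xs[i] + bias)) for i in range(n)) / n
    db = sum(-2 * (results[i] - (weight * xs[i] + bias)) for i in range(n)) / n
    return weight - learningRate * dw, bias - learningRate * db

def train(results, weight, bias, xs, learningRate, iterations):
    error = 0
    for i in range(iterations):
        weight, bias = optimizeWeightBias(results, weight, bias, xs, learningRate)
        error = cost(results, weight, bias, xs)
    return weight, bias

# Usage: recover y = 2x + 1 from noiseless samples.
xs = [0, 1, 2, 3, 4]
results = [1, 3, 5, 7, 9]
w, b = train(results, 0.0, 0.0, xs, 0.05, 2000)
```

With a small enough learning rate the loop converges toward weight 2 and bias 1; too large a rate makes the updates overshoot and diverge, which is why the rate is a tunable parameter.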

The Solidity implementation looks pretty similar, although we have to make sure that the results and the independent variable values have the same length to avoid errors, ...
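As a minimal sketch of what that length check could look like in Solidity (the contract and helper names here are assumptions; Solidity has no floating-point types, so a real implementation would also need fixed-point helpers for the arithmetic):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract LinearRegression {
    // Hypothetical helpers, assumed to mirror the Python versions.
    function optimizeWeightBias(
        int256[] memory results, int256 weight, int256 bias,
        int256[] memory xs, int256 learningRate
    ) internal pure returns (int256, int256) { /* ... */ }

    function cost(
        int256[] memory results, int256 weight, int256 bias,
        int256[] memory xs
    ) internal pure returns (int256) { /* ... */ }

    function train(
        int256[] memory results, int256 weight, int256 bias,
        int256[] memory xs, int256 learningRate, uint256 iterations
    ) public pure returns (int256, int256) {
        // The length check mentioned above: revert early on mismatched inputs.
        require(results.length == xs.length, "results/xs length mismatch");
        for (uint256 i = 0; i < iterations; i++) {
            (weight, bias) = optimizeWeightBias(results, weight, bias, xs, learningRate);
        }
        return (weight, bias);
    }
}
```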
