The cost function

To train our linear-regression algorithm to generate accurate predictions, we need a cost function. A cost function measures how well our prediction function is working for our dataset. It gives us an error rate, which is essentially the difference between the real-world result and the prediction. The smaller the error, the better our predictions. The cost function takes the real result and the prediction and outputs the error from our model, like so:

error = result - prediction
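
For instance, here is a quick sketch of that calculation in TypeScript (the values are made up purely for illustration):

// Error for a single data point: the real value minus the model's prediction.
const result = 3.0;      // real-world value (illustrative)
const prediction = 2.5;  // model's output (illustrative)
const error = result - prediction; // 0.5 -- the smaller, the better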

There are many different types of cost functions. In our case, we'll use the mean squared error (MSE) cost function, which squares each error and averages the squares over the dataset so that positive and negative errors don't cancel each other out. It looks like this:

error = sum((result - prediction)^2) / numberOfDataPoints
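
Here is a minimal sketch of MSE in TypeScript, assuming the real results and the predictions are stored in two parallel arrays (the function and variable names are illustrative, not taken from any particular library):

function meanSquaredError(results: number[], predictions: number[]): number {
  const numberOfDataPoints = results.length;
  let sumOfSquaredErrors = 0;
  for (let i = 0; i < numberOfDataPoints; i++) {
    const error = results[i] - predictions[i];     // real result minus prediction
    sumOfSquaredErrors += error * error;           // square so misses in either direction count equally
  }
  return sumOfSquaredErrors / numberOfDataPoints;  // average over the dataset
}

For example, meanSquaredError([3, 5, 7], [2.5, 5.5, 6]) returns (0.25 + 0.25 + 1) / 3 = 0.5.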

To make it clearer, we can add ...
