As mentioned earlier, we need to tune the parameters in order to find the best possible machine learning model; parameter tuning is a de facto standard step for any machine learning model. In the following code, we try various combinations of the number of iterations, the number of latent factors, and the learning rate. The overall structure of the code remains more or less the same, but we always keep track of the lowest error seen so far; whenever a new combination produces a lower error than the current best, we print that combination:
# Grid Search on Collaborative Filtering
>>> niters = [20,50,100,200]
>>> factors = [30,50,70,100]
>>> lambdas = [0.001,0.01,0.05,0.1]
>>> init_error = float("inf")
...
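Since the body of the loop is elided above, the following is a minimal, self-contained sketch of the same idea: three nested loops sweep every combination of iterations, latent factors, and learning rates, retrain a model for each combination, and print a combination only when its error beats the best seen so far. The small SGD matrix factorizer (mf_rmse) and the toy ratings matrix are illustrative stand-ins, not the chapter's actual collaborative filtering implementation:

>>> import numpy as np

>>> def mf_rmse(ratings, n_iters, n_factors, learning_rate, reg=0.02, seed=42):
...     """Plain SGD matrix factorization; returns RMSE on the observed ratings."""
...     rng = np.random.RandomState(seed)
...     n_users, n_items = ratings.shape
...     P = 0.1 * rng.randn(n_users, n_factors)   # user latent factors
...     Q = 0.1 * rng.randn(n_items, n_factors)   # item latent factors
...     users, items = np.nonzero(ratings)        # indices of observed ratings
...     for _ in range(n_iters):
...         for u, i in zip(users, items):
...             err = ratings[u, i] - P[u].dot(Q[i])
...             # small L2 regularization keeps the factors from blowing up
...             P[u] += learning_rate * (err * Q[i] - reg * P[u])
...             Q[i] += learning_rate * (err * P[u] - reg * Q[i])
...     preds = np.array([P[u].dot(Q[i]) for u, i in zip(users, items)])
...     return np.sqrt(np.mean((ratings[users, items] - preds) ** 2))

>>> toy_ratings = np.array([[5, 3, 0, 1],
...                         [4, 0, 0, 1],
...                         [1, 1, 0, 5],
...                         [0, 0, 5, 4]], dtype=float)  # 0 = unrated

>>> niters = [20, 50, 100, 200]
>>> factors = [30, 50, 70, 100]
>>> lambdas = [0.001, 0.01, 0.05, 0.1]
>>> init_error = float("inf")

>>> for niter in niters:
...     for factor in factors:
...         for lamb in lambdas:
...             error = mf_rmse(toy_ratings, niter, factor, lamb)
...             # keep a tab on the lowest error seen so far
...             if error < init_error:
...                 init_error = error
...                 print("Better combination -> iterations: %d, factors: %d, "
...                       "learning rate: %.3f, RMSE: %.4f"
...                       % (niter, factor, lamb, error))

The key design point is the running minimum: init_error starts at infinity so the first combination always prints, and after that only genuine improvements are reported, which keeps the output short even though all 64 combinations are evaluated.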