
Hands-On Machine Learning with C# by Matt R. Cole


Optimizers

Optimization algorithms minimize (or maximize) an error function with respect to the model's parameters, such as the weights and biases. During training, the optimizer uses these parameters to compute the output and then updates them to reduce the loss, moving the model toward an optimal solution. Extending Kelp.Net with your own optimization algorithm is a simple process, although wiring up the OpenCL and resource side of things takes a coordinated effort.
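To make the update step concrete, here is a minimal sketch of the simplest optimizer, vanilla stochastic gradient descent, independent of Kelp.Net. The names (`LearningRate`, `Update`, the raw `double[]` arrays) are illustrative assumptions, not Kelp.Net's actual API:

```csharp
// Vanilla SGD sketch: nudge each weight opposite its gradient,
// scaled by the learning rate, so the loss decreases.
public class SgdSketch
{
    public double LearningRate { get; set; } = 0.01;

    // weights and gradients are assumed to have the same length
    public void Update(double[] weights, double[] gradients)
    {
        for (int i = 0; i < weights.Length; i++)
        {
            weights[i] -= LearningRate * gradients[i];
        }
    }
}
```

Every optimizer in the list below is a variation on this loop, differing in how the step size and direction are adapted per parameter.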

Kelp.Net comes with many predefined optimizers, such as:

  • AdaDelta
  • AdaGrad
  • Adam
  • GradientClipping
  • MomentumSGD
  • RMSprop
  • SGD

All of these derive from the abstract Optimizer class.
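As a rough illustration of that design, a custom optimizer overrides the base class's update step. The shape below is a simplified assumption: Kelp.Net's real abstract Optimizer class has a larger surface (parameter registration, the OpenCL side mentioned above), and the base-class name and `Update` signature here are hypothetical:

```csharp
// Hypothetical simplified base class; Kelp.Net's actual Optimizer differs.
public abstract class OptimizerSketch
{
    public abstract void Update(double[] weights, double[] gradients);
}

// Momentum SGD sketch: accumulate a velocity term so that updates
// smooth out noisy gradients instead of following each one directly.
public class MomentumSgdSketch : OptimizerSketch
{
    private readonly double _lr;
    private readonly double _momentum;
    private double[] _velocity;

    public MomentumSgdSketch(double lr = 0.01, double momentum = 0.9)
    {
        _lr = lr;
        _momentum = momentum;
    }

    public override void Update(double[] weights, double[] gradients)
    {
        _velocity ??= new double[weights.Length];
        for (int i = 0; i < weights.Length; i++)
        {
            // velocity blends the previous step with the new gradient
            _velocity[i] = _momentum * _velocity[i] - _lr * gradients[i];
            weights[i] += _velocity[i];
        }
    }
}
```

Keeping the base class abstract is what lets the training loop treat AdaDelta, Adam, MomentumSGD, and any optimizer you add yourself interchangeably.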

 
