Evaluation
Before applying our multilayer perceptron to understanding fluctuations in currency exchange markets, let's get acquainted with some of the key learning parameters introduced in the first section.
The execution profile
Let's take a look at the convergence of the training of the multilayer perceptron. The Monitor trait (refer to the Monitor section under Utility classes in Appendix A, Basic Concepts) collects and displays some execution parameters. We choose to extract the convergence profile of the multilayer perceptron using the difference of the backpropagation errors between two consecutive episodes (or epochs).
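To make the idea concrete, here is a minimal sketch of such a monitor in Scala. It is not the book's Monitor trait (defined in Appendix A); the trait, method, and object names are hypothetical and serve only to illustrate recording the backpropagation error at each epoch and computing the difference between two consecutive epochs:

import scala.collection.mutable.ArrayBuffer

// Hypothetical sketch only: this is not the Monitor trait from Appendix A,
// merely an illustration of collecting the backpropagation error per epoch
// and deriving the profile as the difference between consecutive epochs.
trait ConvergenceProfile {
  private[this] val errors = ArrayBuffer.empty[Double]

  // Record the cumulative backpropagation error at the end of an epoch
  def record(error: Double): Unit = errors += error

  // Convergence profile: delta of the error between consecutive epochs;
  // values shrinking toward 0 indicate that the training is stabilizing
  def profile: Seq[Double] =
    errors.toSeq.sliding(2).collect { case Seq(prev, curr) => curr - prev }.toSeq

  // Display the collected profile, one delta per pair of consecutive epochs
  def dump(): Unit = profile.zipWithIndex.foreach { case (delta, n) =>
    println(f"epochs ${n + 1}%3d -> ${n + 2}%3d  delta error = $delta%.6f")
  }
}

// Minimal usage: simulate a decreasing training error over 10 epochs
object ConvergenceProfileDemo extends App with ConvergenceProfile {
  (0 until 10).foreach(epoch => record(1.0 / (1 + epoch)))
  dump()
}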
The test profiles the convergence of the MLP using a learning rate of η = 0.03 and a momentum factor ...