5 Linear and Rule-Based Models

5.1 Least Squares Methods

The least squares approach finds the best-fitting curve, or line of best fit, for a set of data points by minimizing the sum of the squared offsets (residuals) of the points from the curve. Determining the relationship between two variables by statistically evaluating the trend of the outcomes is called regression analysis, and curve fitting is one technique for carrying it out. The least squares method of fitting equations approximates such curves from the given raw data.
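As a concrete illustration of minimizing the sum of squared residuals, the following sketch fits a quadratic curve to sample points with NumPy; the data and the choice of a degree-2 model are illustrative assumptions, not taken from the text.

```python
import numpy as np

# Illustrative data generated from a known quadratic, so the least
# squares fit should recover the coefficients almost exactly.
x = np.linspace(0.0, 4.0, 9)
y = 0.5 * x**2 - x + 2.0

# Fit a degree-2 polynomial by minimizing the sum of squared residuals.
coeffs = np.polyfit(x, y, deg=2)

# Residuals: vertical offsets of the points from the fitted curve.
residuals = y - np.polyval(coeffs, x)
print(coeffs)                 # approximately [0.5, -1.0, 2.0]
print(np.sum(residuals**2))   # near zero for this noise-free data
```

With noisy data the recovered coefficients would only approximate the true ones, but the fitting principle, minimizing the residual sum of squares, is the same.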

Depending on whether the residuals are linear in all unknowns, there are two forms of least squares problems: linear (or ordinary) least squares and nonlinear least squares. The linear least squares problem arises in statistical regression analysis and has a closed-form solution. Nonlinear problems are solved by iterative refinement: at each iteration the system is approximated by a linear one, so the core computation is the same in both cases.
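The closed-form solution of the linear case can be sketched via the normal equations, (AᵀA)w = Aᵀy, where A is the design matrix; the data below are made up for demonstration.

```python
import numpy as np

# Illustrative data: fit a line y = a*x + b by ordinary least squares.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

# Design matrix: a column for x and a column of ones for the intercept.
A = np.column_stack([x, np.ones_like(x)])

# Closed-form solution via the normal equations (A^T A) w = A^T y.
w_closed = np.linalg.solve(A.T @ A, A.T @ y)

# NumPy's least squares routine gives the same answer.
w_lstsq, *_ = np.linalg.lstsq(A, y, rcond=None)
print(w_closed)  # slope ~= 1.99, intercept ~= 1.04
```

In practice `np.linalg.lstsq` (or a QR/SVD-based solver) is preferred over forming AᵀA explicitly, since the normal equations square the condition number of the problem.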

5.2 The Perceptron

The perceptron is a linear classification algorithm: it learns a decision boundary in the feature space that divides two classes with a line (more generally, a hyperplane). As a result, it is best suited to problems in which the classes can be separated by a line or linear model, known as linearly separable problems [1]. It is made up of a single node or neuron ...
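A minimal sketch of the perceptron learning rule on a linearly separable toy problem follows; the data (logical AND), learning rate, and epoch count are assumptions for illustration, not details from the text.

```python
import numpy as np

def train_perceptron(X, y, lr=0.1, epochs=20):
    """Train a single-neuron perceptron; labels y must be in {-1, +1}."""
    w = np.zeros(X.shape[1])  # weights start at zero
    b = 0.0                   # bias starts at zero
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # Update only when the point is misclassified (or on the boundary).
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy linearly separable data: logical AND with {-1, +1} labels.
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([-1, -1, -1, 1])

w, b = train_perceptron(X, y)
preds = np.sign(X @ w + b)
print(preds)  # matches y, since the classes are linearly separable
```

Because the perceptron convergence theorem guarantees a solution only for linearly separable data, the same loop would cycle forever on a problem like XOR.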
