7 Feature Prioritization and Selection Methods

7.1 Introduction

Feature selection is an important dimensionality-reduction step in which less informative features are removed so that only a small subset of key features remains. Choosing optimal features can improve learning performance and accuracy while lowering computational costs. This section is devoted to optimal feature selection methods and provides an overview of feature types, methods, and techniques.

7.2 A Variety of Feature Selection Methods

There are various methods for selecting effective features, in which a subset of the input variables is chosen that describes the input data more effectively while mitigating the negative effects of superfluous variables and input errors such as noise. Subsequently, better predictive results, higher speeds, and lower computational complexity can be achieved. One analytical application of feature selection is to assess how well the properties of a class, or the extent to which features overlap, describe a problem. Typical data can contain hundreds of attributes, many of which may be closely related to other variables; for example, when two attributes are perfectly correlated, a single attribute is sufficient to describe the data, and the redundant attribute should be removed. Such dependent variables provide no additional information for classification and are therefore considered redundant features. This means that ...
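The redundancy argument above can be made concrete with a minimal sketch: a correlation-based filter that drops one feature from each highly correlated pair. The column names and the 0.95 threshold are illustrative assumptions, not values from the text.

```python
# Minimal sketch of correlation-based redundancy removal, assuming a
# pandas DataFrame of numeric features; the 0.95 threshold and the
# column names are illustrative assumptions.
import numpy as np
import pandas as pd

def drop_correlated_features(X: pd.DataFrame, threshold: float = 0.95) -> pd.DataFrame:
    """Remove one feature from each pair whose absolute Pearson
    correlation exceeds `threshold`, keeping the first occurrence."""
    corr = X.corr().abs()
    # Inspect only the upper triangle so each pair is checked once.
    upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool), k=1))
    redundant = [col for col in upper.columns if (upper[col] > threshold).any()]
    return X.drop(columns=redundant)

# Usage: two perfectly correlated columns collapse to one.
X = pd.DataFrame({
    "length_m": [1.0, 2.0, 3.0, 4.0],
    "length_cm": [100.0, 200.0, 300.0, 400.0],  # perfectly correlated with length_m
    "roughness": [0.2, 0.9, 0.4, 0.7],
})
print(drop_correlated_features(X).columns.tolist())  # ['length_m', 'roughness']
```

This is a filter-style method in the sense used later in this section: it scores features from the data alone, independently of any learning algorithm.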
