8 Hyperparameter Tuning of Ensemble Classifiers Using Grid Search and Random Search for Prediction of Heart Disease
Dhilsath Fathima M.1* and S. Justin Samuel2
1Sathyabama Institute of Science and Technology, Chennai, Tamil Nadu, India
2Department of Computer Science and Engineering, PSN Engineering College, Tirunelveli, Tamil Nadu, India
Abstract
Tuning model hyperparameters is an essential step in developing a high-performance machine learning model. Grid search and random search algorithms are used in machine learning to tune the hyperparameters of ML algorithms. Ensemble learners are a category of machine learning algorithms. Ensemble classifiers are divided into two types: bagging, a parallel ensemble model, and boosting, a sequential ensemble model. The proposed work uses two boosting classifiers, the AdaBoost algorithm and the gradient boosting algorithm, and one bagging classifier, the random forest algorithm. A model for early heart disease prediction has been developed using the AdaBoost, random forest, and gradient boosting classifiers. The Cleveland heart disease dataset is used to train and validate the ensemble classifiers in this heart disease prediction model. When the performance of these ensemble learners is compared, the gradient boosting algorithm outperforms the AdaBoost and random forest classifiers. This paper evaluates the efficiency of the grid search algorithm and the random search algorithm by tuning the hyperparameters of the ...
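The sketch below illustrates the kind of tuning the abstract describes: a grid search and a random search over a gradient boosting classifier using scikit-learn. It is a minimal example, not the chapter's implementation; the parameter ranges are assumptions, and a synthetic dataset stands in for the Cleveland heart disease data, which is not included in this excerpt.

# Minimal sketch of grid search vs. random search for a gradient boosting
# classifier; parameter ranges and data are placeholders, not the chapter's.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV, train_test_split

# Placeholder for the Cleveland heart disease dataset (13 features, binary target).
X, y = make_classification(n_samples=303, n_features=13, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Assumed search space; the chapter's actual hyperparameter grid is not given here.
param_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.01, 0.1, 0.2],
    "max_depth": [2, 3, 4],
}

# Grid search: exhaustively evaluates every combination in param_grid.
grid = GridSearchCV(GradientBoostingClassifier(random_state=42),
                    param_grid, cv=5, scoring="accuracy")
grid.fit(X_train, y_train)
print("Grid search best params:", grid.best_params_)
print("Grid search test accuracy:", grid.score(X_test, y_test))

# Random search: samples a fixed number of combinations from the same space.
rand = RandomizedSearchCV(GradientBoostingClassifier(random_state=42),
                          param_grid, n_iter=10, cv=5,
                          scoring="accuracy", random_state=42)
rand.fit(X_train, y_train)
print("Random search best params:", rand.best_params_)
print("Random search test accuracy:", rand.score(X_test, y_test))

The same pattern applies to the AdaBoost and random forest classifiers by swapping in the corresponding estimator and its hyperparameter names; random search trades exhaustiveness for a fixed evaluation budget, which is why the two strategies are compared in the chapter.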