We begin with a description of the techniques used in this recipe:
Undersampling: A very simple way of dealing with class imbalance is to undersample the majority class, that is, to draw random samples from it until we reach a 1:1 (or any other desired) ratio between the target classes. This method has some drawbacks. First, the model trained on the undersampled data can achieve lower accuracy, because discarding a large part of the majority class also discards information. Second, it can increase the number of false positives: after resampling, the class distribution of the training set no longer matches that of the test set, which results in a biased classifier.
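To make this concrete, below is a minimal sketch of random undersampling. It assumes the imbalanced-learn library is installed and uses a synthetic dataset created with scikit-learn's make_classification purely for illustration.

```python
# Minimal sketch of random undersampling (assumes imbalanced-learn is installed).
from collections import Counter

from sklearn.datasets import make_classification
from imblearn.under_sampling import RandomUnderSampler

# Illustrative, imbalanced toy dataset (roughly a 9:1 class ratio).
X, y = make_classification(
    n_samples=10_000,
    n_features=20,
    weights=[0.9, 0.1],
    random_state=42,
)
print("Before undersampling:", Counter(y))

# Randomly drop majority-class observations until a 1:1 ratio is reached.
rus = RandomUnderSampler(sampling_strategy=1.0, random_state=42)
X_rus, y_rus = rus.fit_resample(X, y)
print("After undersampling:", Counter(y_rus))
```

Note that in practice the resampling is applied only to the training data (after the train/test split), so that the test set keeps the original class distribution.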
Oversampling: In this approach, we sample multiple times with replacement from the minority class until we reach the desired ratio between the target classes.
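As a counterpart to the previous sketch, below is a minimal sketch of random oversampling. It again assumes imbalanced-learn and reuses the synthetic X and y created above.

```python
# Minimal sketch of random oversampling with replacement
# (assumes imbalanced-learn; reuses X and y from the previous sketch).
from collections import Counter

from imblearn.over_sampling import RandomOverSampler

# Randomly duplicate minority-class observations (sampling with replacement)
# until a 1:1 ratio is reached.
ros = RandomOverSampler(sampling_strategy=1.0, random_state=42)
X_ros, y_ros = ros.fit_resample(X, y)
print("After oversampling:", Counter(y_ros))
```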