We can get a more detailed look at our classifier's accuracy via a confusion matrix. You can have R produce a confusion matrix with the following command:
table(test[,9], preds)
     preds
      neg pos
  neg  86   9
  pos  28  31
The columns in this matrix represent our classifier's predictions; the rows represent the true classifications of the observations in our testing set. If you recall from Chapter 3, Describing Relationships, this means that the confusion matrix is a cross-tabulation (or contingency table) of our predictions against the actual classifications. The cell in the top-left corner holds the observations without diabetes that we correctly predicted as non-diabetic (true negatives). In contrast, the cell in the lower-right ...
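Once you have the four cell counts, the usual performance metrics fall out with a little arithmetic. The following sketch rebuilds the confusion matrix above by hand (the variable names are illustrative, not from our earlier code) and derives accuracy, sensitivity, and specificity from it:

```r
# Rebuild the confusion matrix from the counts shown above:
# rows are the actual classes, columns are the predictions.
cmat <- matrix(c(86, 28, 9, 31), nrow = 2,
               dimnames = list(actual = c("neg", "pos"),
                               preds  = c("neg", "pos")))

# Accuracy: correct predictions (the diagonal) over all observations.
accuracy <- sum(diag(cmat)) / sum(cmat)            # (86 + 31) / 154

# Sensitivity: of the truly positive cases, how many did we catch?
sensitivity <- cmat["pos", "pos"] / sum(cmat["pos", ])  # 31 / 59

# Specificity: of the truly negative cases, how many did we clear?
specificity <- cmat["neg", "neg"] / sum(cmat["neg", ])  # 86 / 95

round(c(accuracy = accuracy,
        sensitivity = sensitivity,
        specificity = specificity), 3)
```

Notice how lopsided the two rates are: the classifier is far better at recognizing non-diabetics than diabetics, a detail the overall accuracy alone would have hidden.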