- True Positives (TP):
- Predicted YES, and the actual value is YES (the prediction is correct).
- True Negatives (TN):
- Predicted NO, and the actual value is NO (the prediction is correct).
- False Positives (FP):
- Predicted YES, but the actual value is NO. (Also known as a "Type I error.")
- False Negatives (FN):
- Predicted NO, but the actual value is YES. (Also known as a "Type II error.")
- Confusion Matrix Rates (see the code sketch after this list):
- Accuracy (Overall, how often is the classifier correct?):
- (TP+TN)/(TP+TN+FP+FN)
- Misclassification Rate or Error Rate (Overall, how often is the classifier wrong?):
- (FP+FN)/(TP+TN+FP+FN), equivalent to 1 - Accuracy
- Sensitivity, Recall, or True Positive Rate (When it's actually YES, how often does it predict YES?):
- TP/(TP+FN)
- Specificity (When it's actually NO, how often does it predict NO?):
- TN/(TN+FP)
- Precision (When it predicts YES, how often is it correct?):
- TP/(TP+FP)
- Prevalence (How often does the YES condition actually occur in our sample?):
- (TP+FN)/(TP+TN+FP+FN)
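
To make these concrete, here is a minimal sketch in plain Python (no libraries assumed) that counts the four confusion-matrix cells from a pair of hypothetical actual/predicted label lists and then derives each rate defined above. The example labels and variable names are illustrative only, not from any particular dataset.

```python
# Hypothetical ground-truth labels and model predictions (illustrative data).
actual    = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
predicted = ["yes", "no", "no",  "yes", "yes", "no", "yes", "no"]

# Count the four confusion-matrix cells.
tp = sum(1 for a, p in zip(actual, predicted) if a == "yes" and p == "yes")
tn = sum(1 for a, p in zip(actual, predicted) if a == "no"  and p == "no")
fp = sum(1 for a, p in zip(actual, predicted) if a == "no"  and p == "yes")
fn = sum(1 for a, p in zip(actual, predicted) if a == "yes" and p == "no")

total = tp + tn + fp + fn

# Derive each rate from the cell counts.
accuracy    = (tp + tn) / total   # overall, how often the classifier is correct
error_rate  = (fp + fn) / total   # 1 - accuracy
sensitivity = tp / (tp + fn)      # recall / true positive rate
specificity = tn / (tn + fp)      # when actually NO, how often NO is predicted
precision   = tp / (tp + fp)      # when YES is predicted, how often it is correct
prevalence  = (tp + fn) / total   # how often YES actually occurs in the sample

print(f"TP={tp} TN={tn} FP={fp} FN={fn}")
print(f"accuracy={accuracy:.2f} error={error_rate:.2f} "
      f"recall={sensitivity:.2f} specificity={specificity:.2f} "
      f"precision={precision:.2f} prevalence={prevalence:.2f}")
```

For the sample lists above this gives TP=3, TN=3, FP=1, FN=1, so Accuracy and Error Rate come out to 0.75 and 0.25, Sensitivity, Specificity, and Precision are each 0.75, and Prevalence is 0.50.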