A confusion matrix is a summary of prediction results on a classification problem. The numbers of correct and incorrect predictions are summarised with count values and broken down by class.
Each row of the matrix represents the instances in a predicted class, while each column represents the instances in an actual class (or vice versa). In abstract terms, the confusion matrix is as follows:

                     Predicted Positive      Predicted Negative
Actual Positive      True Positive (TP)      False Negative (FN)
Actual Negative      False Positive (FP)     True Negative (TN)
A much better way to evaluate the performance of a classifier than accuracy alone is to look at the confusion matrix. The general idea is to count the number of times instances of class A are classified as class B (see the example below). From the matrix we can derive accuracy, misclassification rate, and the other metrics discussed in this section.
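As a concrete illustration, here is a minimal sketch using scikit-learn's confusion_matrix; the labels are made-up toy data, not from any real model:

```python
# A minimal sketch with scikit-learn; the labels below are made-up toy data.
from sklearn.metrics import confusion_matrix

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]  # actual classes
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]  # model predictions

# scikit-learn puts actual classes in rows and predicted classes in
# columns, so for binary labels the layout is [[TN, FP], [FN, TP]].
cm = confusion_matrix(y_true, y_pred)
print(cm)  # [[4 1]
           #  [1 4]]
```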
Precision is the ratio of correct positive predictions to all cases predicted as positive:
Precision = True Positives / (True Positives + False Positives)
Recall is the ratio of correct positive predictions to all cases that are actually positive:
Recall = True Positives / (True Positives + False Negatives)
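Both metrics are one call away in scikit-learn; a minimal sketch on the same toy labels as above:

```python
# A minimal sketch on the same toy labels as above.
from sklearn.metrics import precision_score, recall_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

# Precision = TP / (TP + FP) = 4 / (4 + 1)
print("precision:", precision_score(y_true, y_pred))  # 0.8
# Recall = TP / (TP + FN) = 4 / (4 + 1)
print("recall:", recall_score(y_true, y_pred))        # 0.8
```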
ML model selection criteria
Which metric to prioritise depends on the problem at hand. For example,
For rare cancer data modeling, anything that doesn't account for false negatives is a crime: missing an actual case is far costlier than a false alarm, so recall is a better measure than precision here.
For YouTube recommendations, false negatives are less of a concern; recommending an irrelevant video (a false positive) hurts the experience more, so precision is the better measure here.
When both error types matter, the F-score (the harmonic mean of precision and recall) also helps in the decision; see the sketch below.
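A minimal sketch of the F-score on the same toy labels, using scikit-learn's f1_score:

```python
# A minimal sketch: the F-score balances precision and recall
# as their harmonic mean, 2 * (p * r) / (p + r).
from sklearn.metrics import f1_score

y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 1, 1, 0, 1, 0]

print("F1:", f1_score(y_true, y_pred))  # 2 * (0.8 * 0.8) / (0.8 + 0.8) = 0.8
```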
An ROC curve (receiver operating characteristic curve) is a graph showing the performance of a classification model at all classification thresholds. The curve plots two parameters: the True Positive Rate and the False Positive Rate.
True Positive Rate = True Positives / (True Positives + False Negatives)
False Positive Rate = False Positives / (False Positives + True Negatives)
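As an illustration, here is a minimal sketch computing these two rates at every threshold with scikit-learn's roc_curve; the scores stand in for the probabilities a real model would produce:

```python
# A minimal sketch with scikit-learn's roc_curve; y_score stands in for
# the probabilities a real model would produce (e.g. via predict_proba).
from sklearn.metrics import roc_curve, roc_auc_score

y_true  = [0, 0, 1, 1, 0, 1, 0, 1]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.5, 0.9]

# Each threshold yields one (FPR, TPR) point; together they trace the curve.
fpr, tpr, thresholds = roc_curve(y_true, y_score)
print("FPR:", fpr)
print("TPR:", tpr)
print("AUC:", roc_auc_score(y_true, y_score))
```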
ROC curves should be used when there are roughly equal numbers of observations for each class.
Precision-Recall curves should be used when there is a moderate to large class imbalance.
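Under the same assumptions, a Precision-Recall curve can be traced with scikit-learn's precision_recall_curve; a minimal sketch:

```python
# A minimal sketch with scikit-learn's precision_recall_curve on the
# same made-up scores; this view is more informative when positives are rare.
from sklearn.metrics import precision_recall_curve

y_true  = [0, 0, 1, 1, 0, 1, 0, 1]
y_score = [0.1, 0.4, 0.35, 0.8, 0.2, 0.7, 0.5, 0.9]

precision, recall, thresholds = precision_recall_curve(y_true, y_score)
for p, r in zip(precision, recall):
    print(f"precision={p:.2f} recall={r:.2f}")
```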
References
https://en.wikipedia.org/wiki/Confusion_matrix
https://www.dataschool.io/simple-guide-to-confusion-matrix-terminology/
https://machinelearningmastery.com/confusion-matrix-machine-learning/
https://www.kdnuggets.com/2020/01/guide-precision-recall-confusion-matrix.html
https://coursera.org/share/5dfd8f6f955441c8e08d294a4ff78aa7
https://en.wikipedia.org/wiki/Precision_and_recall
https://datascience.stackexchange.com/questions/30881/when-is-precision-more-important-over-recall
https://images.app.goo.gl/izb1N6mJspvJGxsp9
https://youtu.be/VPZiJGNX4_s
https://machinelearningmastery.com/roc-curves-and-precision-recall-curves-for-classification-in-python/
https://developers.google.com/machine-learning/crash-course/classification/roc-and-auc