Classification Report
Confusion matrix
Precision
Recall
f1-score
Support
But before this, we need to know what TP, TN, FP, and FN mean!
True positive (TP) / True negative (TN) = Correct! The model is right: it is present (positive), or it is absent (negative)
False positive (FP) / False negative (FN) = Wrong! The model said yes but the answer is actually no (FP), or said no but the answer is actually yes (FN)!
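These four counts can be sketched in plain Python. The toy label lists below are made up for illustration; they are not the Iris results from these notes.

```python
# Made-up true and predicted labels for a tiny toy example.
y_true = ["virginica", "versicolor", "virginica", "setosa"]
y_pred = ["virginica", "virginica", "virginica", "setosa"]

def counts(y_true, y_pred, positive):
    # TP: truly positive and predicted positive; TN: truly negative and
    # predicted negative; FP: said yes but actually no; FN: said no but
    # actually yes.
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    return tp, tn, fp, fn

print(counts(y_true, y_pred, "virginica"))  # (2, 1, 1, 0)
```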
What is Support
The number of samples of each class that actually exist in the test set.
For example, there are:
13 setosa in the test set,
16 versicolor in the test set,
9 virginica in the test set.
At the y-axis = true label (the actual classes)
At the x-axis = predicted label (the predictions)
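As a sketch, the confusion matrix these notes describe can be written out by hand with NumPy; the row sums then give the support of each class:

```python
import numpy as np

# Confusion matrix for the notes' Iris example
# (rows = true label, columns = predicted label;
# class order: setosa, versicolor, virginica).
cm = np.array([
    [13,  0, 0],   # all 13 setosa predicted correctly
    [ 0, 15, 1],   # 1 versicolor mistakenly predicted as virginica
    [ 0,  0, 9],   # all 9 virginica predicted correctly
])

# Support = row sum = how many samples truly belong to each class.
support = cm.sum(axis=1)
print(support.tolist())  # [13, 16, 9]
```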
Precision
Precision = TP / (TP + FP): of all the samples predicted as positive, how many really are positive. High = better
Example:
Virginica
TP = 9, FP = 1 (one sample is versicolor but it was identified as virginica)
Precision => 9 / (9 + 1) => 0.9
Versicolor
TP = 15, FP = 0
Precision => 15 / (15 + 0) => 1.0
Setosa
TP = 13, FP = 0
Precision => 13 / (13 + 0) => 1.0
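Rather than one class at a time, the per-class precisions above can be read straight off the confusion matrix: correct predictions sit on the diagonal, and a column sum is everything predicted as that class. A minimal sketch, reusing the same matrix:

```python
import numpy as np

# Same confusion matrix as in these notes (rows = true, columns = predicted;
# class order: setosa, versicolor, virginica).
cm = np.array([[13, 0, 0],
               [0, 15, 1],
               [0, 0, 9]])

tp = np.diag(cm)             # correct predictions per class
predicted = cm.sum(axis=0)   # column sum = TP + FP for each class
precision = tp / predicted
print(precision.round(2).tolist())  # [1.0, 1.0, 0.9]
```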
Recall
Recall = TP / (TP + FN): of all the samples that truly are positive, how many were correctly found. High = better
Example:
Virginica
TP = 9, FN = 0
Recall = 9 / (9 + 0) = 1.0
Versicolor
TP = 15, FN = 1 (one sample actually is versicolor but was identified as virginica, so it was treated as absent even though it is present)
Recall = 15 / (15 + 1) ≈ 0.94
Setosa
TP = 13, FN = 0
Recall = 13 / (13 + 0) = 1.0
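Recall comes from the confusion matrix the same way, except with row sums: a row sum is everything that truly belongs to that class (TP + FN). A sketch with the same matrix:

```python
import numpy as np

# Same confusion matrix as in these notes (rows = true, columns = predicted;
# class order: setosa, versicolor, virginica).
cm = np.array([[13, 0, 0],
               [0, 15, 1],
               [0, 0, 9]])

tp = np.diag(cm)
actual = cm.sum(axis=1)   # row sum = TP + FN for each class
recall = tp / actual
print(recall.round(2).tolist())  # [1.0, 0.94, 1.0]
```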
f1-score
When you don't know how good the model is because there are two separate numbers (precision and recall), you use this! The f1-score is their harmonic mean: 2 * precision * recall / (precision + recall).
Example:
Virginica
2 * 0.9 * 1 / (0.9 + 1) ≈ 0.95
Versicolor
2 * 0.94 * 1 / (0.94 + 1) ≈ 0.97
Setosa
2 * 1 * 1 / (1 + 1) = 1.0
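Putting it all together, the per-class f1-scores above can be computed from the same confusion matrix. (In practice, scikit-learn's `classification_report` prints precision, recall, f1-score, and support for every class in one call.) A minimal sketch:

```python
import numpy as np

# Same confusion matrix as in these notes (rows = true, columns = predicted;
# class order: setosa, versicolor, virginica).
cm = np.array([[13, 0, 0],
               [0, 15, 1],
               [0, 0, 9]])

tp = np.diag(cm)
precision = tp / cm.sum(axis=0)   # column sums = TP + FP
recall = tp / cm.sum(axis=1)      # row sums = TP + FN
# f1 = harmonic mean of precision and recall.
f1 = 2 * precision * recall / (precision + recall)
print(f1.round(2).tolist())  # [1.0, 0.97, 0.95]
```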