A confusion matrix is a table that is often used to describe the performance of a classification model (or "classifier") on a set of test data for which the true values are known.
For the following 2x2 matrix:
x-axis: ground truth (0: no airplane in the image; 1: airplane visible in the image)
y-axis: classification result (0: no airplane detected; 1: airplane detected)
Number in each cell: the number of test images that fall into that category.
(0,0)- No plane in the image, no plane detected (True Negative)
(0,1)- No plane in the image, plane detected (False Positive)
(1,0)- Plane exists in the image, but no plane is detected (False Negative)
(1,1)- Plane exists in the image, and a plane is detected (True Positive)
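As a minimal sketch of how such a matrix can be built (the function and variable names below are hypothetical and not taken from the project code), assuming the ground-truth and predicted labels are available as 0/1 lists:

```python
import numpy as np

def confusion_matrix_2x2(ground_truth, predictions):
    """Build a 2x2 confusion matrix indexed as cm[truth, pred],
    matching the (ground truth, classification result) convention above."""
    cm = np.zeros((2, 2), dtype=int)
    for truth, pred in zip(ground_truth, predictions):
        cm[truth, pred] += 1
    return cm

# Toy example with three test images (labels are made up for illustration)
gt   = [0, 1, 1]   # ground truth: one non-airplane image, two airplane images
pred = [0, 0, 1]   # classifier output
print(confusion_matrix_2x2(gt, pred))
# [[1 0]
#  [1 1]]  -> 1 TN, 0 FP, 1 FN, 1 TP
```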
The accuracy computed from this confusion matrix is 0.574468.
This relatively low value is due to the test image dataset being unbalanced (150 airplane images vs. 320 non-airplane images) and the large variation among the images.
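Accuracy is simply (TP + TN) divided by the total number of test images, so with 470 images a value of 0.574468 corresponds to roughly 270 correct classifications. The sketch below uses illustrative cell counts (assumed values consistent with the 150/320 split, not the project's actual results) and also computes a balanced accuracy, which is less affected by class imbalance:

```python
def accuracy(cm):
    """Accuracy = (TP + TN) / total, with cm[truth][pred] indexing as above."""
    tn, fp = cm[0][0], cm[0][1]
    fn, tp = cm[1][0], cm[1][1]
    return (tp + tn) / (tn + fp + fn + tp)

def balanced_accuracy(cm):
    """Mean of per-class recall; a fairer summary for an unbalanced test set."""
    tn, fp = cm[0][0], cm[0][1]
    fn, tp = cm[1][0], cm[1][1]
    recall_airplane     = tp / (tp + fn)   # fraction of airplane images detected
    recall_non_airplane = tn / (tn + fp)   # fraction of non-airplane images rejected
    return 0.5 * (recall_airplane + recall_non_airplane)

# Illustrative counts only: 320 non-airplane images, 150 airplane images,
# 270 correct classifications in total (NOT the actual cell values).
cm = [[190, 130],   # ground truth 0: 190 TN, 130 FP
      [ 70,  80]]   # ground truth 1:  70 FN,  80 TP
print(accuracy(cm))            # 0.574468...
print(balanced_accuracy(cm))   # ~0.564
```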