Accuracy

A Confusion Matrix is a simple method for calculating the accuracy of an AI model, and it is especially useful when working with an image-based model.

The Confusion Matrix table is as follows:

                         Actual: Positive        Actual: Negative
Predicted: Positive      True Positive (TP)      False Positive (FP)
Predicted: Negative      False Negative (FN)     True Negative (TN)

The Positive/Negative labels refer to the predicted outcome of an experiment, while True/False refers to whether that prediction matched the actual outcome. As an example, if the AI predicts that an insect is FAW but the insect actually belongs to another category, the result falls into False Positive: the prediction was Positive, but it turned out to be false. Similarly,

True Positive (TP) – the prediction is Positive and it is correct (predicted as FAW, and the insect actually is FAW).

False Positive (FP) – the prediction is Positive but it is incorrect (predicted as FAW, but the insect actually belongs to another category).

False Negative (FN) – the prediction is Negative but it is incorrect (predicted as another category, but the insect actually is FAW).

True Negative (TN) – the prediction is Negative and it is correct (predicted as another category, and the insect actually does belong to another category).
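The same mapping can be expressed in a few lines of code. Below is a minimal Python sketch; the function name outcome and its boolean arguments are our own illustration, not something defined in this article.

```python
def outcome(predicted_faw: bool, actually_faw: bool) -> str:
    """Map a single prediction to its confusion-matrix cell."""
    if predicted_faw and actually_faw:
        return "TP"  # predicted FAW, and it really is FAW
    if predicted_faw:
        return "FP"  # predicted FAW, but it belongs to another category
    if actually_faw:
        return "FN"  # predicted another category, but it is FAW
    return "TN"      # predicted another category, and it is another category

print(outcome(predicted_faw=True, actually_faw=False))  # FP
```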

This can be confusing at first (it is called a confusion matrix, after all), but it should become clear with the example below.

Calculating the Accuracy

The Accuracy of the model is calculated using the formula below:

Accuracy = (TP + TN) / (TP + TN + FP + FN)
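The formula translates directly into code. Here is a minimal sketch; the helper name accuracy is our own choice.

```python
def accuracy(tp: int, tn: int, fp: int, fn: int) -> float:
    """Accuracy: the share of correct predictions among all predictions."""
    return (tp + tn) / (tp + tn + fp + fn)
```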

Example

Let’s assume that we used an AI model to predict whether each insect in a sample population of 200 is FAW or belongs to another category, and that the sample contains 100 FAW and 100 insects from the other category. The first step would be to create the Confusion Matrix table, which would look as below:

                         Actual: FAW             Actual: Other
Predicted: FAW           TP = ?                  FP = ?
Predicted: Other         FN = ?                  TN = ?

As shown above, the Positive label is FAW and the Negative label is the Other category.

Let’s say that after testing, the model predicted 75 FAW correctly as FAW, predicted 25 FAW incorrectly as Other, predicted 90 Other-category insects correctly as Other, and predicted 10 Other-category insects incorrectly as FAW. Now let’s sort out which of these values are TP, TN, FP and FN.

Hence,

TP (predicted as FAW and is FAW in actuality) = 75

FN (predicted as other but is FAW in actuality) = 25

FP (predicted as FAW but is other in actuality) = 10

TN (predicted as other and is other in actuality) = 90

Now let’s add these values to our Confusion Matrix, and the table would look as below:

                         Actual: FAW             Actual: Other
Predicted: FAW           TP = 75                 FP = 10
Predicted: Other         FN = 25                 TN = 90
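If you have the raw predictions on hand, the cells do not have to be tallied by hand. The sketch below reconstructs this example with scikit-learn; the library choice is our assumption, as the article does not prescribe one.

```python
from sklearn.metrics import confusion_matrix, accuracy_score

# Recreate the example: 100 FAW insects and 100 insects from the other category.
actual = ["FAW"] * 100 + ["Other"] * 100

# Hypothetical model output matching the counts above: 75 FAW predicted
# correctly, 25 FAW predicted as Other, 90 Other predicted correctly,
# and 10 Other predicted as FAW.
predicted = (["FAW"] * 75 + ["Other"] * 25
             + ["Other"] * 90 + ["FAW"] * 10)

# Rows are the actual class, columns the predicted class.
cm = confusion_matrix(actual, predicted, labels=["FAW", "Other"])
tp, fn = cm[0]  # actual FAW row:   TP = 75, FN = 25
fp, tn = cm[1]  # actual Other row: FP = 10, TN = 90

print(cm)                                 # [[75 25]
                                          #  [10 90]]
print(accuracy_score(actual, predicted))  # 0.825
```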

With the confusion matrix filled in, we can now calculate the accuracy of the model using the formula mentioned earlier in the article:

Accuracy = (TP + TN) / (TP + TN + FP + FN) = (75 + 90) / (75 + 90 + 10 + 25) = 165 / 200 = 0.825, or 82.5%
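The same figure can be confirmed with the accuracy helper sketched earlier:

```python
# Uses the accuracy() helper defined in the earlier sketch.
print(accuracy(tp=75, tn=90, fp=10, fn=25))  # 0.825, i.e. 82.5%
```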

Hence, the accuracy of the AI model that predicts FAW vs Other is 82.5%.

We hope that this article gave you an understanding of how to calculate an AI model’s accuracy, as well as the confidence to do it yourself. If you have any questions about this, please contact Kasun via email at kasun@gomicro.co.