Understanding the Confusion Matrix
When we get the data, after data cleaning, pre-processing, and wrangling, the first thing we do is feed it to an outstanding model and, of course, get output as probabilities. But hold on! How in the hell can we measure the effectiveness of our model? The better the effectiveness, the better the performance, and that is exactly what we want. This is where the confusion matrix comes into the limelight: a performance measurement for machine learning classification.
This blog aims to answer the following questions:
- What is the confusion matrix, and why do you need it?
- How do you calculate a confusion matrix for a 2-class classification problem?
Today, let’s understand the confusion matrix once and for all.
What is the Confusion Matrix, and why do you need it?
Well, it is a performance measurement for machine learning classification problems where the output can be two or more classes. For a 2-class problem, it is a table with the 4 different combinations of predicted and actual values.
It is extremely useful for measuring Recall, Precision, Specificity, Accuracy, and, most importantly, AUC-ROC curves.
Let’s understand TP, FP, FN, and TN with a pregnancy analogy.
True Positive:
Interpretation: You predicted positive and it’s true. You predicted that a woman is pregnant and she actually is.
True Negative:
Interpretation: You predicted negative and it’s true. You predicted that a man is not pregnant and he actually is not.
False Positive: (Type I Error)
Interpretation: You predicted positive and it’s false. You predicted that a man is pregnant but he actually is not.
False Negative: (Type II Error)
Interpretation: You predicted negative and it’s false. You predicted that a woman is not pregnant but she actually is.
Just remember: Positive and Negative describe what the model predicted, while True and False describe whether that prediction matched the actual value.
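As a minimal sketch of how these four values fall out in code (assuming scikit-learn is available, with a made-up set of labels chosen purely for illustration):

```python
from sklearn.metrics import confusion_matrix

# Made-up labels: 1 = pregnant (positive), 0 = not pregnant (negative)
y_true = [1, 1, 1, 0, 0, 0, 1]  # actual values
y_pred = [1, 0, 1, 0, 1, 0, 0]  # model's predictions

# For binary labels, scikit-learn lays the matrix out as [[TN, FP], [FN, TP]]
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp}, TN={tn}, FP={fp}, FN={fn}")  # TP=2, TN=2, FP=1, FN=2
```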
How to Calculate a Confusion Matrix for a 2-class classification problem?
Let’s understand the confusion matrix through math.
Recall
Recall answers the question: from all the actual positive classes, how many did we predict correctly? Recall = TP / (TP + FN).
Recall should be as high as possible.
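As a quick check, reusing the made-up labels from the sketch above (scikit-learn's recall_score is assumed):

```python
from sklearn.metrics import recall_score

y_true = [1, 1, 1, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0]

# TP / (TP + FN): we caught 2 of the 4 actual positives
print(recall_score(y_true, y_pred))  # 0.5
```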
Precision
Precision answers the question: from all the classes we predicted as positive, how many are actually positive? Precision = TP / (TP + FP).
Precision should be as high as possible.
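Again with the same made-up labels (scikit-learn's precision_score is assumed):

```python
from sklearn.metrics import precision_score

y_true = [1, 1, 1, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0]

# TP / (TP + FP): 2 of our 3 positive predictions were right
print(precision_score(y_true, y_pred))  # 0.666...
```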
Accuracy
From all the classes (positive and negative), how many of them did we predict correctly? Accuracy = (TP + TN) / (TP + TN + FP + FN). With the made-up labels used in the sketches above, that works out to (2 + 2) / 7 = 4/7, as shown below.
Accuracy should be as high as possible.
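Verifying with the same labels (scikit-learn's accuracy_score is assumed):

```python
from sklearn.metrics import accuracy_score

y_true = [1, 1, 1, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0]

# (TP + TN) / total: 4 of the 7 predictions were correct
print(accuracy_score(y_true, y_pred))  # 0.571... (= 4/7)
```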
F-measure
It is difficult to compare two models when one has low precision and high recall, or vice versa. To make them comparable, we use the F-score. The F-score measures Recall and Precision at the same time: F1 = 2 × (Precision × Recall) / (Precision + Recall). It uses the Harmonic Mean in place of the Arithmetic Mean because it punishes extreme values more.
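To see the harmonic-mean behavior concretely, once more with the made-up labels from above (scikit-learn's f1_score is assumed):

```python
from sklearn.metrics import f1_score

y_true = [1, 1, 1, 0, 0, 0, 1]
y_pred = [1, 0, 1, 0, 1, 0, 0]

precision, recall = 2 / 3, 2 / 4
print((precision + recall) / 2)                       # arithmetic mean: 0.583...
print(2 * precision * recall / (precision + recall))  # harmonic mean:   0.571...
print(f1_score(y_true, y_pred))                       # matches the harmonic mean
```

The harmonic mean drags the score toward the smaller of the two values, so a model cannot hide a terrible precision behind a great recall, or vice versa.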
I hope I’ve given you some basic understanding of what exactly the confusion matrix is. If you liked this post, a tad of extra motivation in the form of some claps 👏 would be helpful. I am always open to your questions and suggestions. You can share this on Facebook, Twitter, or LinkedIn, so someone in need might stumble upon it.