I tried to display my confusion matrix from actual and predicted values using both plotconfusion and confusionmat. The two give different results, which seems really strange to me: the output of confusionmat appears to be the transpose of the output of plotconfusion. What should I do to make plotconfusion show the same results as confusionmat?
plotconfusion:
59 0 0
0 68 0
0 3 48
and confusionmat:
59 0 0
0 68 3
0 0 48
You've understood things correctly - the confusion matrix generated by plotconfusion is the transpose of the confusion matrix generated by confusionmat.
This is documented. The doc for plotconfusion says "the rows correspond to the predicted class (Output Class), and the columns show the true class (Target Class)", and the doc for confusionmat says "C(i,j) is a count of observations known to be in group i but predicted to be in group j".
If you want to convert between the two, just transpose the matrix using the ' operator.
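As a minimal sketch (the label vectors here are made up for illustration), you can check the relationship yourself: build the matrix with confusionmat, then transpose it to match what plotconfusion displays. Note that plotconfusion takes one-hot (dummy-coded) targets and outputs, so ind2vec is used to convert the class-index vectors:

```matlab
% Hypothetical 3-class example labels
trueLabels = [1 1 2 2 3 3];   % known classes
predLabels = [1 1 2 3 3 3];   % model predictions

% confusionmat: C(i,j) = count of observations of true class i
% predicted as class j (rows = true, columns = predicted)
C = confusionmat(trueLabels, predLabels);

% plotconfusion wants one-hot targets/outputs; the chart it draws
% has rows = predicted class, columns = true class, i.e. C'
plotconfusion(ind2vec(trueLabels), ind2vec(predLabels));

% To compare the numbers directly with the plotted chart, transpose:
Cplot = C';
```

Transposing only changes which convention the numbers are laid out in; the counts themselves are identical either way.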
Why is it like this? Mostly for not very good reasons. plotconfusion is from Neural Network Toolbox, whereas confusionmat is from Statistics Toolbox, and the two toolboxes have different histories, purposes and conventions.
Statistics Toolbox has always been developed directly by MathWorks. By contrast, Neural Network Toolbox was originally developed by external academic authors, and marketed and sold by MathWorks (although recently much development has been brought in-house). Early versions of Neural Network Toolbox were mostly focussed on the application of neural networks to control theory, not to predictive modelling. So the toolboxes had a different history and purpose, and built up a different set of conventions.
It would make sense nowadays to gradually make the toolboxes more consistent and unified in their design, but that hasn't been done yet.