I want to plot a confusion matrix for my model, which uses transfer learning on top of a deep learning model.
Here is the confusion matrix plotting code:
import itertools

import matplotlib.pyplot as plt
import numpy as np

def plot_confusion_matrix(cm, classes, normalize=False,
                          title='Confusion Matrix', cmap=plt.cm.Blues):
    plt.imshow(cm, interpolation='nearest', cmap=cmap)
    plt.title(title)
    plt.colorbar()
    tick_marks = np.arange(len(classes))
    plt.xticks(tick_marks, classes, rotation=45)
    plt.yticks(tick_marks, classes)

    if normalize:
        # Convert counts to per-class fractions (each row sums to 1)
        cm = cm.astype('float') / cm.sum(axis=1)[:, np.newaxis]
        print("Normalized Confusion Matrix")
    else:
        print("Confusion matrix, without normalization")
    print(cm)

    # Annotate each cell; use white text on dark cells
    thresh = cm.max() / 2
    for i, j in itertools.product(range(cm.shape[0]), range(cm.shape[1])):
        plt.text(j, i, cm[i, j],
                 horizontalalignment="center",
                 color="white" if cm[i, j] > thresh else "black")

    plt.tight_layout()
    plt.ylabel('True Label')
    plt.xlabel('Predicted Label')
The shapes of test_labels and predictions are given below:
test_labels.shape
(12,)
predictions.shape
(10,2)
The above code works fine, but the following line raises an error:
cm = confusion_matrix(test_labels, predictions.argmax(axis=1))
and here is the error:
ValueError Traceback (most recent call last)
<ipython-input-40-79fd4e2e074c> in <module>()
----> 1 cm = confusion_matrix(test_labels, predictions.argmax(axis=1))
2 frames
/usr/local/lib/python3.6/dist-packages/sklearn/utils/validation.py in check_consistent_length(*arrays)
210 if len(uniques) > 1:
211 raise ValueError("Found input variables with inconsistent numbers of"
--> 212 " samples: %r" % [int(l) for l in lengths])
213
214
ValueError: Found input variables with inconsistent numbers of samples: [12, 10]
Note: This is a ValueError and I am confused by it. I have tried repeatedly but failed, so I need help solving this error.
As the error suggests, you have different sample sizes for test_labels and predictions. This can happen when you predict in batches, which may result in the last few samples being dropped.
One possibility is to use:
cm = confusion_matrix(test_labels[:-2], predictions.argmax(axis=1))
This may solve the shape-mismatch problem (but it assumes that the last two samples are the ones missing from the predictions).
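A more robust fix is usually to regenerate the predictions over the entire test set instead of trimming the labels. Here is a rough sketch, assuming a Keras model and a non-shuffled test generator; the names model and test_batches are placeholders, since the prediction code was not shared:

import math

from sklearn.metrics import confusion_matrix

# Placeholder names: `model` is your trained model and `test_batches` is your
# test data generator (e.g. from ImageDataGenerator.flow_from_directory with
# shuffle=False, so predictions stay aligned with test_labels).
steps = math.ceil(test_batches.n / test_batches.batch_size)  # cover ALL samples
predictions = model.predict(test_batches, steps=steps, verbose=0)
# On older Keras versions this would be:
# predictions = model.predict_generator(test_batches, steps=steps)

print(predictions.shape)  # should now be (12, 2), matching the 12 test labels
cm = confusion_matrix(test_labels, predictions.argmax(axis=1))

With shuffle=False and a step count that rounds up, no trailing samples are dropped, so the lengths of test_labels and predictions.argmax(axis=1) should match.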
I may be able to provide a more useful answer if you can share the code used for prediction.