I don't understand what is wrong with my RMSE implementation. I'm training my model with MSE as the loss function and also as the metric. After training, I call evaluate
to score the model on the test set, and then predict
to get the predicted values. Then I compute the RMSE myself. My code is:
obs= model.compile(loss='mse', optimizer=keras.optimizers.Adam(lr=0.001),metrics=['mse'])
.......
test_eval = model.evaluate(X_test, Y_test, verbose=1)
print('Test loss (MSE):', test_eval[0])
predicted= model.predict(X_test, verbose=0)
rMSE = np.sqrt(pow(np.mean(predicted - Y_test), 2))
print(rMSE)
And I had this results:
Test loss (MSE): 12.0075311661
2.90274470011
But the square root of 12.0075311661 isn't 2.90274470011. So, what is wrong?
Square the differences elementwise before taking the mean. You want the average of the squared differences, not the square of the average difference. In your code, `np.mean(predicted - Y_test)` averages the signed errors first (letting positive and negative errors cancel), and the subsequent `pow(..., 2)` and `np.sqrt` just undo each other, so you end up with the absolute value of the mean error rather than the RMSE.
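For example, here is a minimal sketch with NumPy, using small made-up arrays in place of your `predicted` and `Y_test`:

```python
import numpy as np

# Hypothetical stand-ins for model.predict(X_test) and the test targets
predicted = np.array([1.5, 1.5, 2.5, 5.0])
Y_test = np.array([1.0, 2.0, 3.0, 4.0])

# Wrong: mean of the signed errors, then square, then sqrt.
# The square and sqrt cancel, leaving |mean error|, and signed
# errors partially cancel each other inside the mean.
wrong = np.sqrt(pow(np.mean(predicted - Y_test), 2))

# Right: square each error first, then average, then take the root.
mse = np.mean((predicted - Y_test) ** 2)
rmse = np.sqrt(mse)

print(wrong)  # |mean error|, not the RMSE
print(rmse)   # matches sqrt of Keras's reported MSE
```

With the corrected version, `rmse` equals the square root of the MSE that `model.evaluate` reports, which is the consistency check your numbers were failing.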