
Implementing the loss function using MSE


I'm using MSE (mean squared error) to measure the loss. In the code below, I implemented a loss_mse function that should compute the MSE for the input set with the given theta:

import numpy as np

def loss_mse(X, y, theta):
    length = len(y)
    predictions = X.dot(theta)
    error = (1/2*length) * np.sum(np.square(predictions - y))
    return error

To test the above function, I wrote the following test cases:

X = np.array([[2.0, 1.0, 3.0], [3.0, 6.0, 2.0]])
y =  np.array([1.0, 1.0])
theta = np.array([[1.0], [2.0],[1.0]])
error = loss_mse(X, y, theta)
print(error)

I'm supposed to get 73 as the answer, but I'm getting 584. I don't understand what I'm doing wrong.


Solution

  • Because of operator precedence, 1/2*length evaluates as (1/2) * length, so you are multiplying by length rather than dividing by it.

    Try

    1/(2*length) * np.sum(np.square(predictions - y))
    

    For the given input, this results in 146; did you perhaps mean for it to be 1/(4*length)?
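
    Another possible source of the remaining factor of 2 (146 vs. the expected 73) is a shape mismatch rather than the scaling: X.dot(theta) has shape (2, 1) while y has shape (2,), so predictions - y broadcasts to a (2, 2) array and each squared error gets summed twice. A minimal sketch under that assumption, flattening the prediction so both arrays are 1-D:

    import numpy as np

    def loss_mse(X, y, theta):
        # Half-MSE with the usual 1/(2*m) scaling.
        length = len(y)
        # ravel() keeps predictions 1-D, so subtracting y does not broadcast to (2, 2).
        predictions = X.dot(theta).ravel()
        return 1 / (2 * length) * np.sum(np.square(predictions - y))

    X = np.array([[2.0, 1.0, 3.0], [3.0, 6.0, 2.0]])
    y = np.array([1.0, 1.0])
    theta = np.array([[1.0], [2.0], [1.0]])

    print(loss_mse(X, y, theta))  # 73.0

    With the shapes aligned this way, the 1/(2*length) factor alone already produces 73, so no extra 1/(4*length) scaling is needed.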