I am taking Andrew Ng's Machine Learning class and implementing the linear regression algorithm. What is wrong with my code?
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y);
    J_history = zeros(num_iters, 1);
    h = (X*theta)
    for iter = 1:num_iters
        theta(1,1) = theta(1,1)-(alpha/m)*sum((h-y).*X(:,1));
        theta(2,1) = theta(2,1)-(alpha/m)*sum((h-y).*X(:,2));
        J_history(iter) = computeCost(X, y, theta);
    end
end
The cost function is given as:
function J = computeCost(X, y, theta)
    m = length(y);
    h = (X*theta)
    J = (1/(2*m))*sum((h-y).^2)
end
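For reference, computeCost implements the standard squared-error cost from the course:

J(\theta) = \frac{1}{2m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr)^2, \qquad h_\theta(x) = \theta^{T} x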
The value of J_history keeps increasing. It gives very abnormal (large) values, about 1000 times more than it should be.
The problem is that h = X*theta is computed once, before the for loop, so every iteration applies the gradient of the initial theta. The update step never changes, theta keeps moving by the same amount in the same direction, and the cost grows instead of shrinking. You need to recompute h and update theta inside the for loop, as below.
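Each pass of the loop should apply the standard gradient descent update rule from the course:

\theta_j := \theta_j - \frac{\alpha}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}

The vectorized version below computes this update for all j at once.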
function [theta, J_history] = gradientDescent(X, y, theta, alpha, num_iters)
    m = length(y);
    J_history = zeros(num_iters, 1);
    for iter = 1:num_iters
        % Row vector of summed errors per feature: h(j) = sum((X*theta - y) .* X(:,j))
        h = ((X*theta)-y)'*X;
        % Simultaneous, vectorized update of every element of theta
        theta = theta - alpha*(1/m)*h';
        J_history(iter) = computeCost(X, y, theta);
    end
end
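A minimal sketch to sanity-check the fix (the toy X, y, alpha, and iteration count here are my own assumptions, not the course data):

% Hypothetical toy data: y = 2*x, with an intercept column of ones
X = [ones(5,1), (1:5)'];
y = [2; 4; 6; 8; 10];
theta = zeros(2,1);        % usual zero initialization
alpha = 0.01;              % assumed learning rate
[theta, J_history] = gradientDescent(X, y, theta, alpha, 1500);
% With the fix, J_history should decrease monotonically toward zero
plot(1:numel(J_history), J_history);

If J_history still increases after this change, the learning rate alpha is too large; try a smaller value.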