I want to solve linear regression in the following way. When I minimize the sum of squared residuals it works fine:
import cvxpy as cp
import numpy as np

n = 5
np.random.seed(1)
x = np.linspace(0, 20, n)
y = np.random.rand(x.shape[0])

theta = cp.Variable(2)  # theta[0]: slope, theta[1]: intercept

# This way it works
objective = cp.Minimize(cp.sum_squares(theta[0]*x + theta[1] - y))
prob = cp.Problem(objective)
result = prob.solve()
print(theta.value)
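For what it's worth, the result can be cross-checked against NumPy's closed-form fit; this is just a sketch using the same x and y as above:

# Sanity check (sketch): np.polyfit(x, y, 1) returns [slope, intercept],
# which should match theta.value from the CVXPY solve above.
print(np.polyfit(x, y, 1))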
I want to try minimizing the quadratic cost as follows:
# This way it does not work
X = np.row_stack((np.ones_like(y), x)).T  # design matrix with columns [1, x]
objective_function = (y - X*theta).T*(y - X*theta)
obj = cp.Minimize(objective_function)
prob = cp.Problem(obj)
result = prob.solve()
print(theta.value)
Instead, I get the following error:
raise DCPError("Problem does not follow DCP rules. Specifically:\n" + append)
cvxpy.error.DCPError: The problem does not follow DCP rules. Specifically:
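In case it helps, here is what CVXPY reports about the pieces of the expression (a sketch, reusing X, theta and objective_function from above; I'm assuming Expression.curvature and is_dcp() behave as in recent CVXPY versions):

# Sketch: inspect the curvature CVXPY assigns to each sub-expression.
print((y - X*theta).curvature)       # the residual itself is affine
print(objective_function.curvature)  # the product of two affine terms is not recognized as convex
print(objective_function.is_dcp())   # expected: False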
Any idea why this happens?
I think CVXPY does not recognize that the two factors y - X*theta in
objective_function = (y - X*theta).T*(y - X*theta)
are the same expression.
Is
objective = cp.Minimize(cp.norm(y - X*theta)**2)
or
objective = cp.Minimize(cp.norm(y - X*theta))
acceptable? (Both give the same solution.)
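For reference, this is roughly the comparison I ran (a sketch, reusing X, y and theta from above):

# Sketch: solve with each objective and compare the fitted coefficients.
for objective in (cp.Minimize(cp.norm(y - X*theta)**2),
                  cp.Minimize(cp.norm(y - X*theta))):
    prob = cp.Problem(objective)
    prob.solve()
    print(theta.value)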