Solving linear regression minimizing quadratic cost
I want to solve a linear regression problem in the following way. Minimizing the sum of squared costs works fine:
import cvxpy as cp
import numpy as np
n = 5
np.random.seed(1)
x = np.linspace(0, 20, n)
y = np.random.rand(x.shape[0])
theta = cp.Variable(2)
# This way it works
objective = cp.Minimize(cp.sum_squares(theta[0]*x + theta[1] - y))
prob = cp.Problem(objective)
result = prob.solve()
print(theta.value)
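(As a quick sanity check — an illustrative aside, not part of the original question, reusing x, y, and theta from above — the same fit can be recovered with NumPy's closed-form least squares:)

# Illustrative sanity check: closed-form least squares with NumPy.
# Columns are [x, 1] to match theta[0]*x + theta[1] above.
A = np.column_stack((x, np.ones_like(x)))
theta_ls, *_ = np.linalg.lstsq(A, y, rcond=None)
print(theta_ls)  # should agree with theta.value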
Now I would like to minimize the quadratic cost in its matrix form instead:
# This way it does not work
X = np.row_stack((np.ones_like(y), x)).T
objective_function = (y - X*theta).T*(y - X*theta)
obj = cp.Minimize(objective_function)
prob = cp.Problem(obj)
result = prob.solve()
print(theta.value)
However, I get the following error:
raise DCPError("Problem does not follow DCP rules. Specifically:\n" + append)
cvxpy.error.DCPError: The problem does not follow DCP rules. Specifically:
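(Inspecting the expression directly — a small sketch added here, reusing X and theta from above and CVXPY's standard expression attributes — shows what the DCP check sees:)

# Illustrative: why the product of two affine expressions fails DCP.
residual = y - X @ theta       # affine in theta
print(residual.is_affine())    # True
quad = residual.T @ residual   # affine * affine: curvature unknown to CVXPY
print(quad.is_dcp())           # False, hence the DCPError at solve time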
Any idea why this happens? I think CVXPY does not understand that the y - X*theta in

objective_function = (y - X*theta).T*(y - X*theta)

is the same expression on both sides of the product.

Would

objective = cp.Minimize(cp.norm(y - X*theta)**2)

or

objective = cp.Minimize(cp.norm(y - X*theta))

be acceptable? (Both give the same solution.)
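(For comparison, here is a matrix form of the same cost that does pass the DCP check — a sketch assuming CVXPY 1.x, where @ denotes matrix multiplication:)

# DCP-compliant matrix form of the same least-squares cost.
# Sketch: assumes CVXPY 1.x, where @ is matrix multiplication.
objective = cp.Minimize(cp.sum_squares(y - X @ theta))
prob = cp.Problem(objective)
prob.solve()
print(theta.value)

This passes because sum_squares is a recognized convex atom and its argument is affine, so the composition satisfies the DCP rules.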