Linear Regression with one variable
While implementing the gradient descent algorithm for linear regression, the predictions my algorithm makes and the regression line it produces come out wrong. Could anyone take a look at my implementation and help me? Also, please advise me on how to choose the values of the "learning rate" and the "number of iterations" for a particular regression problem.
import numpy as np
import matplotlib.pyplot as plt

# X and Y are assumed to be NumPy arrays holding the training data
theta0 = 0  # first parameter (intercept)
theta1 = 0  # second parameter (slope)
alpha = 0.001  # learning rate (denoted by alpha)
num_of_iterations = 100  # total number of iterations performed by Gradient Descent
m = float(len(X))  # total number of training examples

for i in range(num_of_iterations):
    y_predicted = theta0 + theta1 * X
    # partial derivatives of the cost with respect to theta0 and theta1
    derivative_theta0 = (1/m) * np.sum(y_predicted - Y)
    derivative_theta1 = (1/m) * np.sum(X * (y_predicted - Y))
    # simultaneous update of both parameters
    temp0 = theta0 - alpha * derivative_theta0
    temp1 = theta1 - alpha * derivative_theta1
    theta0 = temp0
    theta1 = temp1
    print(theta0, theta1)

y_predicted = theta0 + theta1 * X
plt.scatter(X, Y)
plt.plot(X, y_predicted, color='red')
plt.show()
[Figure: the resulting regression line about which I need some help]
Your learning rate is too high. I got it to work by lowering the learning rate to alpha = 0.0001.
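As for how to pick the learning rate in general: a common rule of thumb is to run gradient descent for a fixed number of iterations at several candidate values of alpha, track the cost function, and keep the largest alpha for which the cost still decreases steadily. If the cost grows, alpha is too high; if it barely moves, alpha is too low (or you need more iterations). A minimal sketch of that idea, using synthetic data in place of your X and Y (the data and the candidate alpha values here are assumptions for illustration):

```python
import numpy as np

# Synthetic training data standing in for the real X and Y
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50)
Y = 3.0 + 2.0 * X + rng.normal(0, 1, 50)
m = len(X)

def cost(theta0, theta1):
    """Mean squared error / 2 -- the quantity gradient descent minimizes."""
    return np.sum((theta0 + theta1 * X - Y) ** 2) / (2 * m)

def run_gd(alpha, num_iters):
    """Batch gradient descent; returns the parameters and the cost history."""
    theta0 = theta1 = 0.0
    history = []
    for _ in range(num_iters):
        y_pred = theta0 + theta1 * X
        d0 = np.sum(y_pred - Y) / m
        d1 = np.sum(X * (y_pred - Y)) / m
        theta0 -= alpha * d0
        theta1 -= alpha * d1
        history.append(cost(theta0, theta1))
    return theta0, theta1, history

# Try a few learning rates; keep the largest one whose cost still decreases.
for alpha in (0.1, 0.01, 0.001):
    _, _, hist = run_gd(alpha, 100)
    trend = "diverging" if hist[-1] > hist[0] else "converging"
    print(f"alpha={alpha}: final cost {hist[-1]:.3f} ({trend})")
```

With this particular data, alpha = 0.1 blows up while 0.01 and 0.001 converge, so 0.01 would be the better choice of the three. The number of iterations can then be chosen by running until the cost stops changing meaningfully between iterations.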