Full Batch, Stochastic and Mini Batch gradient descent in Python, Linear Regression

I am trying to understand these algorithms and implement them in Python. I am using sklearn.linear_model.SGDRegressor for this purpose, and my code looks like this:

import numpy as np
from sklearn import linear_model
from sklearn.metrics import mean_squared_error
from math import sqrt

X = np.array([1, 2, 4, 3, 5]).reshape(-1, 1)
y = np.array([1, 3, 3, 2, 5])  # reshape(-1,1).ravel() round-trips back to 1-D, so a flat array suffices

# Plain SGD: constant learning rate, no regularization (alpha=0),
# up to four passes over the shuffled training data
Model = linear_model.SGDRegressor(learning_rate='constant', alpha=0,
                                  eta0=0.01, shuffle=True, max_iter=4)

Model.fit(X, y)
y_predicted = Model.predict(X)

mse = mean_squared_error(y, y_predicted)
print("RMSE: ", sqrt(mse))
print("The intercept is:", Model.intercept_)
print("The slope is: ", Model.coef_)

I got the following results:

RMSE:  0.7201328561288026
The intercept is: [ 0.21990009]
The slope is:  [ 0.79460054]

The results are similar to those in this article: https://machinelearningmastery.com/linear-regression-tutorial-using-gradient-descent-for-machine-learning/ so I assume everything is fine.
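As a quick sanity check (my addition, not part of the original post), the exact least-squares line for this toy dataset can be computed in closed form; a few epochs of SGD with a small constant learning rate should land near it but not exactly on it:

import numpy as np

x = np.array([1, 2, 4, 3, 5], dtype=float)
y = np.array([1, 3, 3, 2, 5], dtype=float)

# A degree-1 polyfit returns [slope, intercept] of the least-squares line
slope, intercept = np.polyfit(x, y, deg=1)
print(slope, intercept)  # 0.8 and 0.4 for this data

The fitted slope of about 0.79 is close to the exact 0.8, while the intercept of about 0.22 is still drifting toward 0.4, which is plausible after only four epochs.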

Now I am trying to implement the following code:

from sklearn import linear_model
import numpy as np
from sklearn.metrics import mean_squared_error
from math import sqrt

X = np.array([1, 2, 4, 3, 5]).reshape(-1, 1)
y = np.array([1, 3, 3, 2, 5])

numtraining = len(X)

def iter_minibatches(chunksize):
    # Yield the training data one chunk at a time
    chunkstartmaker = 0
    while chunkstartmaker < numtraining:
        # Clamp the last chunk so indexing never runs past the data
        chunkrows = range(chunkstartmaker,
                          min(chunkstartmaker + chunksize, numtraining))
        X_chunk = X[chunkrows]
        y_chunk = y[chunkrows]
        yield X_chunk, y_chunk
        chunkstartmaker += chunksize

batcherator = iter_minibatches(chunksize=1)

# Note: max_iter only affects fit(); partial_fit() always makes a single pass
Model = linear_model.SGDRegressor(learning_rate='constant', alpha=0,
                                  eta0=0.01, shuffle=True, max_iter=4)

for X_chunk, y_chunk in batcherator:
    # Caution: the third positional argument of SGDRegressor.partial_fit
    # is sample_weight, so np.unique(y_chunk) acts as a per-sample weight
    Model.partial_fit(X_chunk, y_chunk, np.unique(y_chunk))

y_predicted = Model.predict(X)

mse = mean_squared_error(y, y_predicted)

print("RMSE: ", sqrt(mse))
print(Model.coef_)
print(Model.intercept_)

I got the following results:

RMSE:  1.1051202460564218
[ 1.08765043]
[ 0.29586701]

As far as I understand the theory, mini-batch gradient descent with chunksize = 1 should be identical to stochastic gradient descent, but that is not what happens in my code. Is the code wrong, or am I missing something?
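To make the equivalence claim concrete, here is a minimal hand-rolled sketch (my illustration, not sklearn's internals; it assumes squared loss with the 1/2 convention and a constant learning rate). With batch_size=1 the averaged batch gradient is the gradient of a single sample, so each update is exactly one SGD step:

import numpy as np

x = np.array([1, 2, 4, 3, 5], dtype=float)
y = np.array([1, 3, 3, 2, 5], dtype=float)

def minibatch_gd(batch_size, eta=0.01, epochs=4, seed=0):
    # Mini-batch gradient descent on the loss 0.5 * (w*x + b - y)**2
    rng = np.random.default_rng(seed)
    w, b = 0.0, 0.0
    for _ in range(epochs):
        order = rng.permutation(len(x))
        for start in range(0, len(x), batch_size):
            idx = order[start:start + batch_size]
            err = w * x[idx] + b - y[idx]      # residuals on this batch
            w -= eta * np.mean(err * x[idx])   # averaged gradient w.r.t. w
            b -= eta * np.mean(err)            # averaged gradient w.r.t. b
    return w, b

print(minibatch_gd(batch_size=1))  # SGD: one sample per update
print(minibatch_gd(batch_size=5))  # full-batch gradient descent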

I am not entirely sure what is going on, but converting batcherator to a list helps.
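A likely reason (my note, not part of the original answer): iter_minibatches returns a generator, and a generator is exhausted after one pass, so it cannot be re-iterated for further epochs, whereas a list can:

def gen():
    yield from (1, 2, 3)

g = gen()
print(list(g))  # [1, 2, 3]
print(list(g))  # [] -- exhausted after the first pass

chunks = list(gen())
print(chunks)   # [1, 2, 3]
print(chunks)   # [1, 2, 3] -- a list can be traversed again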

Also, to implement mini-batch gradient descent correctly with SGDRegressor, you should iterate over the training set manually rather than rely on max_iter=4: max_iter only affects fit(), while partial_fit() performs exactly one pass over whatever data it is given. You can additionally shuffle the training chunks between passes for more randomness.

...

import random

Model = linear_model.SGDRegressor(learning_rate='constant', alpha=0,
                                  eta0=0.01, shuffle=True)

# Materialize the generator: a generator would be exhausted after the
# first pass, whereas a list can be shuffled and re-iterated each epoch
chunks = list(batcherator)
for _ in range(4):
    random.shuffle(chunks)
    for X_chunk, y_chunk in chunks:
        Model.partial_fit(X_chunk, y_chunk)

y_predicted = Model.predict(X)

...

This yields:

RMSE: 0.722033757406
The intercept is: 0.21990252
The slope is: 0.79236007