Why do sklearn's Lasso coefficients not equal the Linear Regression ones?

I am trying to use sklearn's Lasso in my code. To test it, I decided to run it with alpha = 0. By definition this should produce the same result as LinearRegression, but it does not.
Here is the code:

import pandas as pd
from sklearn.linear_model import Lasso
from sklearn.linear_model import LinearRegression

# Don't worry about this. It is made so that we can work with the same dataset.
df = pd.read_csv('http://web.stanford.edu/~oleg2/hse/Credit.csv').dropna()
df['Asian'] = df.Ethnicity=='Asian'
df['Caucasian'] = df.Ethnicity=='Caucasian'
df['African American'] = df.Ethnicity=='African American'
df = df.drop(['Ethnicity'],axis=1).replace(['Yes','No','Male','Female',True,False],[1,0,1,0,1,0])
# End of unimportant part

# Feature matrix: every column except the target, Balance
x = df.drop(['Balance'], axis=1)

ft = Lasso(alpha=0).fit(x, df.Balance)
print(ft.intercept_)
ft = LinearRegression().fit(x, df.Balance)
print(ft.intercept_)

Output:

-485.3744897927978
-480.89071679937786

The coef_ values differ as well.

What am I doing wrong?

Indeed, this does seem to produce different results. However, running your code (with X and y being the same features and target as in your snippet) also produces the following warning:

ft = Lasso(alpha=0).fit(X, y)
print(ft.intercept_)
ft = LinearRegression().fit(X, y)
print(ft.intercept_)

-485.3744897927984
-480.89071679937854

UserWarning: With alpha=0, this algorithm does not converge well. You are advised to use the LinearRegression estimator

This tells you that with alpha=0 the problem reduces to ordinary linear regression, for which Lasso's coordinate-descent solver is not designed and does not converge well. LinearRegression, by contrast, solves the least-squares problem directly. That is why you see the intercepts (and coefficients) differ, and why the fit metrics are likely worse for the unconverged Lasso.
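To see that the gap comes from the solver rather than from the model itself, here is a minimal sketch on a synthetic dataset (an assumption, used so the example runs without the network fetch): with alpha=0 the coordinate-descent solver emits the warning quoted above, while a tiny positive alpha combined with a tight tolerance brings Lasso's solution very close to LinearRegression's.

```python
import warnings
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import Lasso, LinearRegression

# Synthetic stand-in for the Credit data, so the sketch runs offline.
X, y = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)

lr = LinearRegression().fit(X, y)  # direct least-squares solve

# alpha=0 asks coordinate descent to solve an unpenalized problem,
# which triggers the UserWarning and may not fully converge.
with warnings.catch_warnings():
    warnings.simplefilter("ignore")
    lasso_zero = Lasso(alpha=0).fit(X, y)

# A tiny positive alpha with a tight tolerance behaves much better:
lasso_tiny = Lasso(alpha=1e-8, tol=1e-12, max_iter=100_000).fit(X, y)

print("alpha=0    max coef gap:", np.max(np.abs(lasso_zero.coef_ - lr.coef_)))
print("alpha=1e-8 max coef gap:", np.max(np.abs(lasso_tiny.coef_ - lr.coef_)))
```

On real, possibly ill-conditioned data such as Credit.csv, the alpha=0 gap can be noticeably larger, which matches the differing intercepts in the question.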