numpy polynomial linear regression with sklearn
I am trying to fit a linear combination of polynomials to data. numpy's polynomial module includes a fit function that works great. When I try to fit the model with an sklearn linear solver, the fit is terrible! I don't understand what is going wrong. I construct a matrix X where x_{ij} corresponds to the i-th observed input evaluated at the j-th polynomial. I know the X matrix is fine, because when I find the coefficients with numpy the data is fit very well. I use sklearn's fit function (I have tried several linear solvers), but the coefficients it solves for (the coef_ attribute) are wrong. What am I doing wrong? How can I make the coefficients found by the sklearn linear solver match the ones found by numpy?
import numpy as np
from sklearn import linear_model
from sklearn.linear_model import OrthogonalMatchingPursuit
import matplotlib.pyplot as plt
# accept x and polynomial order, return basis of that order
def legs(x, c):
    s = np.zeros(c + 1)
    s[-1] = 1
    return np.polynomial.legendre.legval(x, s)
# Generate normalized samples
samples = np.random.uniform(2, 3, 5)
evals = samples ** 2
xnorm = (samples - 2) * 2 / (3 - 2) - 1
# instantiate linear regressor
omp = linear_model.LinearRegression()
#omp = linear_model.Lasso(alpha=0.000001)
#omp = OrthogonalMatchingPursuit(n_nonzero_coefs=2)
# construct X matrix. Each row is an observed value.
# Each column is a different polynomial.
X = np.array([[legs(xnorm[jj], ii) for ii in range(5)] for jj in range(xnorm.size)])
# Perform the fit. Why isn't this working?
omp.fit(X, evals)
# Plot the truth data
plt.scatter(xnorm, evals, label='data', s=15, marker='x')
# Dot the coefficients found with sklearn against X
plt.scatter(xnorm, omp.coef_.dot(X.T), label='linear regression')
# Dot the coefficients found with numpy against X
plt.scatter(xnorm, np.polynomial.legendre.legfit(xnorm, evals, 4).dot(X.T), label='Numpy regression')
# complete the plot
plt.legend(ncol=3, prop={'size':3})
plt.savefig('simpleExample')
plt.clf()
Your omp.coef_.dot(X.T) does not include the intercept; add it manually, or use omp.predict directly.
That is:
plt.scatter(xnorm, omp.coef_.dot(X.T) + omp.intercept_, label='linear regression')
plt.scatter(xnorm, evals, label='data', s=15, marker='x')
Or:
plt.scatter(xnorm, omp.predict(X), label='linear regression')
plt.scatter(xnorm, evals, label='data', s=15, marker='x')
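If you also want coef_ itself to match numpy's legfit output, one option (a sketch, not part of the answer above) is to disable sklearn's separate intercept, since your X already contains the constant Legendre basis P0 in its first column:

import numpy as np
from sklearn import linear_model

# Assumes X, xnorm and evals are built exactly as in the question above.
# X's first column is P0(x) == 1, so a separate intercept is redundant;
# with fit_intercept=False the fitted coef_ are the Legendre coefficients.
omp_nointercept = linear_model.LinearRegression(fit_intercept=False)
omp_nointercept.fit(X, evals)

# These should now agree (up to floating-point noise):
print(omp_nointercept.coef_)
print(np.polynomial.legendre.legfit(xnorm, evals, 4))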