Simulate sklearn logistic regression predict_proba with only coefficients and intercept
I create some dummy data and train an sklearn logistic regression on it. I then want to reproduce the output of predict_proba using only coef_ and intercept_, but my calculation gives a different result. The setup:
from sklearn import linear_model
from scipy.special import expit, softmax
import numpy as np

X = [[0,0,0], [0,1,0], [0,2,0], [1,1,1], [0,1,0], [0,2,0]]
y = [0,0,0,1,1,2]
# Fit the classifier
clf = linear_model.LogisticRegression(C=1e5, multi_class="ovr", class_weight="balanced")
clf.fit(X, y)
Then, using what I know about the sigmoid and softmax, I compute the output myself:
softmax([
    expit(np.dot([[0,2,0]], clf.coef_[0]) + clf.intercept_[0]),
    expit(np.dot([[0,2,0]], clf.coef_[1]) + clf.intercept_[1]),
    expit(np.dot([[0,2,0]], clf.coef_[2]) + clf.intercept_[2])
])
But it returns different values. clf.predict_proba([[0,2,0]]) gives
array([[0.281399 , 0.15997556, 0.55862544]])
whereas my calculation gives
array([[0.29882052], [0.24931448], [0.451865 ]])
You can replicate the predict_proba computation from the estimated parameters as follows:
from sklearn import linear_model
from scipy.special import expit, softmax
import numpy as np
# Data
X = [[0,0,0], [0,1,0], [0,2,0], [1,1,1], [0,1,0], [0,2,0]]
y = [0,0,0,1,1,2]
# Classifier
clf = linear_model.LogisticRegression(C=1e5, multi_class="ovr", class_weight="balanced")
clf.fit(X, y)
# Predicted probabilities
print(clf.predict_proba([[0,2,0]]))
#[[0.281399 0.15997556 0.55862544]]
# Recalculated predicted probabilities without softmax
prob1 = np.array([expit(np.dot([[0,2,0]], clf.coef_[0]) + clf.intercept_[0]),
                  expit(np.dot([[0,2,0]], clf.coef_[1]) + clf.intercept_[1]),
                  expit(np.dot([[0,2,0]], clf.coef_[2]) + clf.intercept_[2])]).reshape(1, -1)
print(prob1 / np.sum(prob1))
#[[0.281399 0.15997556 0.55862544]]
# Recalculated predicted probabilities with softmax
prob2 = np.log(prob1)
print(softmax(prob2))
#[[0.281399 0.15997556 0.55862544]]
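In other words, for multi_class="ovr" sklearn does not apply a softmax to the sigmoid scores; it simply divides each per-class sigmoid by the row sum. The softmax route only agrees because softmax applied to the log of the sigmoids reduces to exactly that normalization: exp(log p_i) / Σ exp(log p_j) = p_i / Σ p_j. A minimal sketch of this identity, using made-up decision scores rather than the fitted coefficients:

```python
import numpy as np
from scipy.special import expit, softmax

# Hypothetical per-class decision scores (what x @ coef_.T + intercept_ would give)
scores = np.array([[0.3, -1.2, 2.0]])

# Per-class sigmoid, normalized by the row sum (what sklearn's OvR predict_proba does)
sig = expit(scores)
by_sum = sig / sig.sum(axis=1, keepdims=True)

# Equivalent route: softmax of the log of the sigmoids
by_softmax = softmax(np.log(sig), axis=1)

print(np.allclose(by_sum, by_softmax))  # True
```

This is why softmax applied directly to the sigmoid outputs (without the log) disagrees with predict_proba: it exponentiates values that are already probabilities.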