Sensitivity derived from Scikit_Learn Confusion Matrix and Scikit_Learn Recall_Score doesn't match
import sklearn as sk
import sklearn.metrics  # importing the submodule makes sk.metrics available

true = [1, 0, 0, 1]
predict = [1, 1, 1, 1]
cf = sk.metrics.confusion_matrix(true, predict)
print(cf)
array([[0, 2],
       [0, 2]])
tp = cf[0][0]
fn = cf[0][1]
fp = cf[1][0]
tn = cf[1][1]
sensitivity = tp / (tp + fn)
print(sensitivity)
0.0
print(sk.metrics.recall_score(true, predict))
1.0
According to the scikit-learn documentation, recall_score is defined exactly as tp / (tp + fn), so the two values should match.
Can someone explain the discrepancy?
scikit-learn's confusion_matrix is laid out as [[tn, fp], [fn, tp]] (rows are true labels, columns are predictions, in sorted label order), so the labels must be reassigned as follows:
tn = cf[0][0]
fp = cf[0][1]
fn = cf[1][0]
tp = cf[1][1]
sensitivity = tp / (tp + fn)
print(sensitivity)
1.0
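A more compact way to apply this layout, as a minimal sketch: since the 2x2 matrix is stored row by row as [[tn, fp], [fn, tp]], flattening it with numpy's ravel() yields the four counts in the order tn, fp, fn, tp, and the sensitivity computed from them agrees with recall_score.

```python
from sklearn.metrics import confusion_matrix, recall_score

true = [1, 0, 0, 1]
predict = [1, 1, 1, 1]

# ravel() flattens the 2x2 matrix in row order: [[tn, fp], [fn, tp]]
tn, fp, fn, tp = confusion_matrix(true, predict).ravel()

sensitivity = tp / (tp + fn)
print(sensitivity)                  # 1.0
print(recall_score(true, predict))  # 1.0
```

This avoids indexing mistakes entirely, since the unpacking order documents the layout in the code itself.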