Error: Metric Kappa not applicable for regression models
I am trying to make predictions with the KNN method. I have converted my factors to characters, my dataset contains both numeric and character variables, and rows with missing values have been removed. Here is my R code:
library(caret)
set.seed(142)
ctrl <- trainControl(method = "repeatedcv", number = 10, repeats = 5)
train(LateStage ~ .,
      data = finalP6,
      method = "knn",
      trControl = ctrl,
      preProcess = c("range"),
      tuneLength = 10,
      metric = "Kappa")
I get this error:
You are trying to do regression and your outcome only has two possible values Are you trying to do classification?
If so, use a 2 level factor as your outcome column.
Error: Metric Kappa not applicable for regression models
Thanks for your help!
It would help if you provided a sample of your data and explained what you are trying to do with the model.
I assume you are trying to use kNN for classification, which is fine even if your independent variables are factors. For caret, you need your dependent variable to be a factor.
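For example (a minimal sketch, not from the original answer; the level labels are illustrative), a numeric 0/1 outcome column can be turned into a two-level factor, optionally with readable level names, which caret also needs if you later request class probabilities via classProbs = TRUE in trainControl:

# Convert a numeric 0/1 outcome to a two-level factor; readable level
# names (rather than "0"/"1") are required if classProbs = TRUE is used later
finalP6$LateStage <- factor(finalP6$LateStage,
                            levels = c(0, 1),
                            labels = c("Early", "Late"))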
First, I generated some data that looks like yours (don't run this to solve your problem):
finalP6 = data.frame(LateStage = sample(0:1,1000,replace=TRUE),
matrix(runif(1000),ncol=10))
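As a quick check (not in the original answer), you can see why caret falls back to regression here: sample(0:1, ...) produces an integer column, so the outcome is numeric rather than a factor:

# The outcome is stored as an integer, so train() treats this as regression
class(finalP6$LateStage)
# [1] "integer"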
Then, running the model, you can see the same error:
library(caret)
set.seed(142)
ctrl <- trainControl(method = "repeatedcv", number = 10, repeats = 5)
train(LateStage ~ ., data = finalP6, method = "knn",
trControl = ctrl, preProcess = c("range"),tuneLength = 10,
metric = "Kappa")
Error: Metric Kappa not applicable for regression models
In addition: Warning message:
In train.default(x, y, weights = w, ...) :
You are trying to do regression and your outcome only has two possible values Are you trying to do classification? If so, use a 2 level factor as your outcome column.
What you need to do is:
finalP6$LateStage = factor(finalP6$LateStage)
train(LateStage ~ ., data = finalP6, method = "knn",
trControl = ctrl, preProcess = c("range"),tuneLength = 10,
metric = "Kappa")
k-Nearest Neighbors
1000 samples
10 predictor
2 classes: '0', '1'
Pre-processing: re-scaling to [0, 1] (10)
Resampling: Cross-Validated (10 fold, repeated 5 times)
Summary of sample sizes: 901, 900, 899, 900, 901, 900, ...
Resampling results across tuning parameters:
k Accuracy Kappa
5 0.4922998 -0.015433129
7 0.4889058 -0.022246806
9 0.5027024 0.005380261
11 0.5074765 0.014693566
13 0.5090744 0.017946500
15 0.5100462 0.019756722
17 0.5048402 0.009260903
19 0.4868552 -0.027103397
21 0.4842232 -0.032544799
23 0.4798550 -0.041286084
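Since the goal in the question was prediction, here is a minimal sketch (the object names fit and pred are illustrative, not from the original post) of how the fitted model could then be used:

# Store the trained model instead of just printing it
fit <- train(LateStage ~ ., data = finalP6, method = "knn",
             trControl = ctrl, preProcess = c("range"), tuneLength = 10,
             metric = "Kappa")
# Predicted classes for data with the same predictor columns; train()
# applies the selected k and the [0, 1] rescaling automatically.
# Here we predict on the training data only for illustration; in practice
# use held-out data.
pred <- predict(fit, newdata = finalP6)
head(pred)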