XGBoost: what does the parameter 'objective' set?
I want to use XGBoost to solve a regression problem. I am confused by the learning-task parameter objective [ default=reg:linear ] (XGBoost); **it seems that 'objective' is used for setting the loss function.** But I can't understand how 'reg:linear' influences the loss function. In the logistic regression demo (XGBoost logistic regression demo), objective = binary:logistic means the loss function is the logistic loss function. So which loss function does 'objective=reg:linear' correspond to?
Squared error.
You can look at the loss functions for logistic regression and linear regression (via their gradient and hessian) here:
https://github.com/dmlc/xgboost/blob/master/src/objective/regression_obj.cc
Note that the two loss functions are quite similar; the only difference is that SecondOrderGradient is a constant for the squared loss:
```cpp
// common regressions
// linear regression
struct LinearSquareLoss {
  static float PredTransform(float x) { return x; }
  static bool CheckLabel(float x) { return true; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) { return 1.0f; }
  static float ProbToMargin(float base_score) { return base_score; }
  static const char* LabelErrorMsg() { return ""; }
  static const char* DefaultEvalMetric() { return "rmse"; }
};
// logistic loss for probability regression task
struct LogisticRegression {
  static float PredTransform(float x) { return common::Sigmoid(x); }
  static bool CheckLabel(float x) { return x >= 0.0f && x <= 1.0f; }
  static float FirstOrderGradient(float predt, float label) { return predt - label; }
  static float SecondOrderGradient(float predt, float label) {
    const float eps = 1e-16f;
    return std::max(predt * (1.0f - predt), eps);
  }
  // ... (remaining members omitted in this excerpt)
};
```
The author mentions this here: https://github.com/dmlc/xgboost/tree/master/demo/regression
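To make the correspondence concrete, here is a minimal Python sketch (just the math, not the XGBoost API) of the gradient/hessian pairs that the two structs above compute. For reg:linear the loss is L(pred, label) = ½(pred − label)², so the gradient is pred − label and the hessian is the constant 1; for binary:logistic the raw margin is passed through a sigmoid, and the hessian becomes p(1 − p):

```python
import math

def linear_square_grad(pred, label):
    # d/dpred of 0.5 * (pred - label)^2
    return pred - label

def linear_square_hess(pred, label):
    # second derivative of the squared loss: always 1
    return 1.0

def logistic_grad(margin, label):
    # margin is the raw score; p = sigmoid(margin)
    p = 1.0 / (1.0 + math.exp(-margin))
    return p - label

def logistic_hess(margin, label):
    # p * (1 - p), clamped away from zero as in the C++ code
    p = 1.0 / (1.0 + math.exp(-margin))
    return max(p * (1.0 - p), 1e-16)

# the squared-loss hessian is constant, the logistic one is not
print(linear_square_grad(3.0, 1.0))   # 2.0
print(linear_square_hess(3.0, 1.0))   # 1.0
print(logistic_hess(0.0, 0.0))        # 0.25
```

This is also why a custom objective for XGBoost returning (pred - label, 1.0) per example reproduces reg:linear.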