xgb.set_hyperparameters(objective='binary:logistic', num_round=100)
xgb.fit({'train': s3_input_train})
...
from sagemaker.tuner import IntegerParameter, CategoricalParameter, ContinuousParameter, HyperparameterTuner
hyperparameter_ranges = {'eta': ContinuousParameter(0, 1),
                         'min_child_weight': ContinuousParameter(1, 10),
                         'alpha': ContinuousParameter(0, 2),
                         'max_depth': IntegerParameter(1, 10),
                         'num_round': IntegerParameter(1, 300),
                         'gamma': ContinuousParameter(0, 5),
                         'lambda': ContinuousParameter(0, 1000),
                         'max_delta_step': IntegerParameter(1, 10),
                         'colsample_bylevel': ContinuousParameter(0.1, 1),
                         'colsample_bytree': ContinuousParameter(0.5, 1),
                         'subsample': ContinuousParameter(0.5, 1)}
objective_metric_name = 'validation:aucpr'
tuner = HyperparameterTuner(xgb,
                            objective_metric_name,
                            hyperparameter_ranges,
                            max_jobs=50,
                            max_parallel_jobs=3)
tuner.fit({'train': s3_input_train, 'validation': s3_input_val}, include_cls_metadata=False, wait=False)

returns the error:

An error occurred (ValidationException) when calling the CreateHyperParameterTuningJob operation: The objective metric for the hyperparameter tuning job, [validation:aucpr], isn't valid for the [811284229777.dkr.ecr.us-east-1.amazonaws.com/xgboost:latest] algorithm. Choose a valid objective metric.

The same thing happens when substituting f1 and logloss for aucpr. They are explicitly defined in the documentation as evaluation metrics for classification: https://docs.aws.amazon.com/sagemaker/latest/dg/xgboost-tuning.html

What can I do to enable the f1, aucpr, and logloss evaluation metrics?
Posted on 2022-07-05 22:56:58
Although validation:auc, validation:f1, and validation:logloss are indeed evaluation metrics, they are not tunable XGBoost hyperparameters.
Please refer to the table of tunable hyperparameters in the documentation.
Your code is trying to set the objective metric to one that is not supported.
The evaluation metric is passed in as part of the hyperparameters:
例如,
xgb.set_hyperparameters(
    eval_metric="auc",
    objective="binary:logistic",
    num_round=10,
    rate_drop=0.3,
    tweedie_variance_power=1.4,
)

From the docs you shared:
a learning objective function to optimize during model training
an eval_metric to use to evaluate model performance during validation

https://stackoverflow.com/questions/72833918
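Because the ValidationException only surfaces once CreateHyperParameterTuningJob is actually called, it can help to validate objective_metric_name locally before creating the tuner. Below is a minimal sketch; the metric set is a subset copied from the xgboost-tuning documentation page, and the exact list depends on the algorithm version, so treat SUPPORTED_TUNING_METRICS as an assumption to be checked against your container rather than an authoritative list:

```python
# Subset of objective metrics listed on the SageMaker xgboost-tuning docs page.
# ASSUMPTION: the exact set varies by XGBoost container version; adjust this
# to match the metrics your algorithm version actually emits.
SUPPORTED_TUNING_METRICS = {
    'validation:accuracy', 'validation:auc', 'validation:error',
    'validation:f1', 'validation:logloss', 'validation:mae',
    'validation:map', 'validation:merror', 'validation:mlogloss',
    'validation:mse', 'validation:ndcg', 'validation:rmse',
}

def check_objective_metric(name):
    """Fail fast with a readable message instead of letting
    CreateHyperParameterTuningJob reject the job remotely."""
    if name not in SUPPORTED_TUNING_METRICS:
        raise ValueError(
            f"'{name}' is not a supported tuning objective metric; "
            f"choose one of: {sorted(SUPPORTED_TUNING_METRICS)}"
        )
    return name

# 'validation:auc' passes; 'validation:aucpr' would raise ValueError here,
# before any AWS API call is made.
objective_metric_name = check_objective_metric('validation:auc')
```

The same guard can wrap the eval_metric value passed to set_hyperparameters, since that string is forwarded verbatim to the training container.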