I am learning to use GridSearchCV for hyperparameter tuning of an XGBRegressor model. No matter which eval_metric I specify in .fit(), GridSearchCV produces the same score values. The docs at selection.GridSearchCV.html say about the scoring parameter: "If None, the estimator's score method is used." That cannot be what is happening here, since I always get the same value. How do I get results that correspond to the eval_metric?
This sample code:
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.datasets import load_boston
import xgboost as xgb
rng = np.random.RandomState(31337)
boston = load_boston()
y = boston['target']
X = boston['data']
kf = KFold(n_splits=2)  # random_state has no effect unless shuffle=True; recent scikit-learn raises an error if it is passed anyway
folds = list(kf.split(X))
xgb_model = xgb.XGBRegressor(objective='reg:squarederror', verbose=False)
reg = GridSearchCV(estimator=xgb_model,
param_grid= {'max_depth': [2], 'n_estimators': [50]},
cv=folds,
verbose=False)
reg.fit(X, y, **{'eval_metric': 'mae', 'verbose': False})
print('GridSearchCV mean(mae)?: ', reg.cv_results_['mean_test_score'])
# -----------------------------------------------
reg.fit(X, y, **{'eval_metric': 'rmse', 'verbose': False})
print('GridSearchCV mean(rmse)?: ', reg.cv_results_['mean_test_score'])
print("----------------------------------------------------")
xgb_model.set_params(**{'max_depth': 2, 'n_estimators': 50})
xgb_model.fit(X[folds[0][0],:],y[folds[0][0]], eval_metric='mae',
eval_set = [(X[folds[0][0],:],y[folds[0][0]])], verbose=False)
print('XGBRegressor 0-mae:', xgb_model.evals_result()['validation_0']['mae'][-1])
xgb_model.fit(X[folds[0][1],:],y[folds[0][1]], eval_metric='mae',
eval_set = [(X[folds[0][1],:],y[folds[0][1]])], verbose=False)
print('XGBRegressor 1-mae:', xgb_model.evals_result()['validation_0']['mae'][-1])
xgb_model.fit(X[folds[0][0],:],y[folds[0][0]], eval_metric='rmse',
eval_set = [(X[folds[0][0],:],y[folds[0][0]])], verbose=False)
print('XGBRegressor 0-rmse:', xgb_model.evals_result()['validation_0']['rmse'][-1])
xgb_model.fit(X[folds[0][1],:],y[folds[0][1]], eval_metric='rmse',
eval_set = [(X[folds[0][1],:],y[folds[0][1]])], verbose=False)
print('XGBRegressor 1-rmse:', xgb_model.evals_result()['validation_0']['rmse'][-1])返回(线上方的数字应该是线下数字的平均值)
GridSearchCV mean(mae)?: [0.70941007]
GridSearchCV mean(rmse)?: [0.70941007]
----------------------------------------------------
XGBRegressor 0-mae: 1.273626
XGBRegressor 1-mae: 1.004947
XGBRegressor 0-rmse: 1.647694
XGBRegressor 1-rmse: 1.290872
Posted on 2020-02-28 19:09:31
TL;DR: what you are getting back is the so-called R2, or coefficient of determination. It is the default scoring metric of XGBRegressor's score function, and it is what GridSearchCV falls back to when scoring=None.
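This fallback can be checked directly: for any sklearn-style regressor, the default .score method is R2. A minimal sketch, using a plain scikit-learn LinearRegression in place of XGBRegressor (the fallback behaviour is the same for either, and this keeps the snippet free of the xgboost dependency):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.RandomState(0)
X = rng.rand(100, 3)
y = X @ np.array([1.0, 2.0, 3.0]) + 0.1 * rng.rand(100)

model = LinearRegression().fit(X, y)
# For sklearn-style regressors, .score() is documented to return R^2,
# so it must agree with r2_score on the same predictions
assert np.isclose(model.score(X, y), r2_score(y, model.predict(X)))
```

The eval_metric passed to .fit() only controls what XGBoost logs during training (and what evals_result() reports); it never reaches GridSearchCV's scoring.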
Compare the result with an explicitly coded scoring:
from sklearn.metrics import make_scorer, r2_score, mean_squared_error
xgb_model = xgb.XGBRegressor(objective='reg:squarederror', verbose=False)
reg = GridSearchCV(estimator=xgb_model, scoring=make_scorer(r2_score),
param_grid= {'max_depth': [2], 'n_estimators': [50]},
cv=folds,
verbose=False)
reg.fit(X, y)
reg.best_score_
0.7333542105472226
with the one obtained with scoring=None:
reg = GridSearchCV(estimator=xgb_model, scoring=None,
param_grid= {'max_depth': [2], 'n_estimators': [50]},
cv=folds,
verbose=False)
reg.fit(X, y)
reg.best_score_
0.7333542105472226
If you read the GridSearchCV docstring:
estimator : estimator object. This is assumed to implement the scikit-learn estimator interface. Either estimator needs to provide a
score function, or scoring must be passed.
At this point, you may want to check the docstring of xgb_model.score?:
Signature: xgb_model.score(X, y, sample_weight=None) Docstring: Return the coefficient of determination R^2 of the prediction.
So, with the help of these docs: if you do not like XGBRegressor's default R2 scoring function, pass a scoring function explicitly to GridSearchCV.
For example, if you want RMSE, you could do:
reg = GridSearchCV(estimator=xgb_model,
scoring=make_scorer(mean_squared_error, squared=False),  # recent scikit-learn versions drop squared=False; use root_mean_squared_error instead
param_grid= {'max_depth': [2], 'n_estimators': [50]},
cv=folds,
verbose=False)
reg.fit(X, y)
reg.best_score_
4.618242594168436
https://stackoverflow.com/questions/60453878
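One caveat the RMSE example glosses over: make_scorer treats the metric as "higher is better" by default, so with more than one candidate in param_grid you would want greater_is_better=False, which makes GridSearchCV store the negated error and still pick the best model. A sketch with MAE, matching the eval_metric='mae' from the question (DecisionTreeRegressor and the synthetic data are illustrative stand-ins, not from the original post):

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, KFold
from sklearn.metrics import make_scorer, mean_absolute_error
from sklearn.tree import DecisionTreeRegressor

rng = np.random.RandomState(31337)
X = rng.rand(200, 4)
y = X.sum(axis=1) + 0.1 * rng.rand(200)

# greater_is_better=False makes GridSearchCV record -MAE,
# so "bigger score" still means "smaller error"
mae_scorer = make_scorer(mean_absolute_error, greater_is_better=False)

reg = GridSearchCV(
    estimator=DecisionTreeRegressor(random_state=0),
    scoring=mae_scorer,
    param_grid={'max_depth': [2, 4]},
    cv=KFold(n_splits=2, shuffle=True, random_state=42),
)
reg.fit(X, y)
print('best (negated) MAE:', reg.best_score_)  # <= 0 by construction
```

With a single parameter combination, as in the answer above, the sign convention changes nothing except the sign of the reported best_score_.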