XGBoost XGBClassifier Defaults in Python


That isn't how you set parameters in xgboost. You would either pass your parameter grid into your training function, such as xgboost's train or sklearn's GridSearchCV, or use your XGBClassifier's set_params method. Another thing to note is that if you're using xgboost's wrapper to sklearn (i.e. the XGBClassifier() or XGBRegressor() classes), then the parameter names used are the same ones used in sklearn's own GBM class (e.g. eta --> learning_rate). I'm not seeing where the exact documentation for the sklearn wrapper is hidden, but the code for those classes is here: https://github.com/dmlc/xgboost/blob/master/python-package/xgboost/sklearn.py
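For example, here's a minimal sketch of the GridSearchCV route. The dataset and grid values below are just placeholders for illustration, not anything from your code:

from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from xgboost import XGBClassifier

# Placeholder data purely for illustration
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Candidate values to search over (arbitrary choices here)
grid = {'max_depth': [3, 5, 10], 'learning_rate': [0.05, 0.1]}

search = GridSearchCV(XGBClassifier(), param_grid=grid, cv=3)
search.fit(X, y)
print(search.best_params_)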

For your reference, here is how you would set the model object parameters directly.

>>> grid = {'max_depth': 10}
>>>
>>> clf = XGBClassifier()
>>> clf.max_depth
3
>>> clf.set_params(**grid)
XGBClassifier(base_score=0.5, colsample_bylevel=1, colsample_bytree=1,
       gamma=0, learning_rate=0.1, max_delta_step=0, max_depth=10,
       min_child_weight=1, missing=None, n_estimators=100, nthread=-1,
       objective='binary:logistic', reg_alpha=0, reg_lambda=1,
       scale_pos_weight=1, seed=0, silent=True, subsample=1)
>>> clf.max_depth
10

EDIT: I suppose you can set parameters at model creation; it just isn't super typical to do so since most people grid search by some means. However, if you do, you would need to either list them as full keyword arguments or use **kwargs. For example:

>>> XGBClassifier(max_depth=10)
XGBClassifier(base_score=0.5, colsample_bylevel=1, colsample_bytree=1,
       gamma=0, learning_rate=0.1, max_delta_step=0, max_depth=10,
       min_child_weight=1, missing=None, n_estimators=100, nthread=-1,
       objective='binary:logistic', reg_alpha=0, reg_lambda=1,
       scale_pos_weight=1, seed=0, silent=True, subsample=1)
>>> XGBClassifier(**grid)
XGBClassifier(base_score=0.5, colsample_bylevel=1, colsample_bytree=1,
       gamma=0, learning_rate=0.1, max_delta_step=0, max_depth=10,
       min_child_weight=1, missing=None, n_estimators=100, nthread=-1,
       objective='binary:logistic', reg_alpha=0, reg_lambda=1,
       scale_pos_weight=1, seed=0, silent=True, subsample=1)

Using a dictionary as input without **kwargs will set that parameter to literally be your dictionary:

>>> XGBClassifier(grid)
XGBClassifier(base_score=0.5, colsample_bylevel=1, colsample_bytree=1,
       gamma=0, learning_rate=0.1, max_delta_step=0,
       max_depth={'max_depth': 10}, min_child_weight=1, missing=None,
       n_estimators=100, nthread=-1, objective='binary:logistic',
       reg_alpha=0, reg_lambda=1, scale_pos_weight=1, seed=0, silent=True,
       subsample=1)


The defaults for XGBClassifier are:

  • max_depth=3
  • learning_rate=0.1
  • n_estimators=100
  • silent=True
  • objective='binary:logistic'
  • booster='gbtree'
  • n_jobs=1
  • nthread=None
  • gamma=0
  • min_child_weight=1
  • max_delta_step=0
  • subsample=1
  • colsample_bytree=1
  • colsample_bylevel=1
  • reg_alpha=0
  • reg_lambda=1
  • scale_pos_weight=1
  • base_score=0.5
  • random_state=0
  • seed=None
  • missing=None

Link to XGBClassifier documentation with class defaults: https://xgboost.readthedocs.io/en/latest/python/python_api.html#xgboost.XGBClassifier
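If you want to verify the defaults on your own installation (they can change between xgboost versions), the wrapper follows the sklearn estimator API, so you can print them with get_params():

from xgboost import XGBClassifier

clf = XGBClassifier()

# get_params() comes from the sklearn estimator API the wrapper implements
for name, value in sorted(clf.get_params().items()):
    print(name, '=', value)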


For starters, it looks like you're missing an s on your variable param.

You wrote param at the top:

param = {}
param['booster'] = 'gbtree'
param['objective'] = 'binary:logistic'
  .
  .
  .

...but use params farther down, when training the model:

clf = xgb.XGBClassifier(params)   # <-- different variable!

Was that just a typo in your example?
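Either way, even with the name fixed, passing the dict positionally will misassign it (as the first answer shows, it ends up as the value of max_depth). Assuming you meant to pass the dict, unpack it with **. A sketch, reconstructing the dict from your snippet:

import xgboost as xgb

# Hypothetical reconstruction of the dict from the question; depending on
# your xgboost version, each key must be a parameter the wrapper accepts.
param = {'booster': 'gbtree', 'objective': 'binary:logistic'}

clf = xgb.XGBClassifier(**param)   # each key becomes a keyword argument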