
XGBoost plot_importance doesn't show feature names


If you're using the scikit-learn wrapper, you'll need to access the underlying XGBoost Booster and set the feature names on it, rather than on the scikit-learn model, like so:

import joblib
import xgboost

# Load the saved scikit-learn wrapper model
model = joblib.load("your_saved.model")

# Set the names on the underlying Booster, not on the wrapper itself
model.get_booster().feature_names = ["your", "feature", "name", "list"]
xgboost.plot_importance(model.get_booster())
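For reference, here is a minimal self-contained sketch of the same fix, using synthetic data and hypothetical feature names (the column names and model settings are made up for illustration):

import xgboost
from xgboost import XGBClassifier
from sklearn.datasets import make_classification
import matplotlib.pyplot as plt

# Synthetic data; the feature names below are hypothetical
X, y = make_classification(n_samples=100, n_features=4, random_state=0)
feature_names = ["age", "income", "score", "tenure"]

model = XGBClassifier(n_estimators=10).fit(X, y)

# Attach the names to the underlying Booster, then plot the Booster directly
model.get_booster().feature_names = feature_names
xgboost.plot_importance(model.get_booster())
plt.show()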


You want to use the feature_names parameter when creating your xgb.DMatrix:

dtrain = xgb.DMatrix(Xtrain, label=ytrain, feature_names=feature_names)
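To see why this works end to end, here is a minimal sketch (random data and parameters chosen purely for illustration): the Booster inherits the names from the DMatrix, so plot_importance labels the bars with them.

import xgboost as xgb
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical data and feature names for illustration
Xtrain = np.random.rand(100, 3)
ytrain = np.random.randint(0, 2, size=100)
feature_names = ["age", "income", "score"]

dtrain = xgb.DMatrix(Xtrain, label=ytrain, feature_names=feature_names)
bst = xgb.train({"objective": "binary:logistic"}, dtrain, num_boost_round=10)

# The Booster picked up the names from the DMatrix
xgb.plot_importance(bst)
plt.show()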


train_test_split will convert the dataframe to a numpy array, which no longer carries any column information.

Either you can do what @piRSquared suggested and pass the feature names as a parameter to the DMatrix constructor, or you can convert the numpy arrays returned by train_test_split back to DataFrames and then use your original code.

Xtrain, Xval, ytrain, yval = train_test_split(df[feature_names], y,
                                              test_size=0.2, random_state=42)
# Rebuild DataFrames so the column names survive the split
Xtrain = pd.DataFrame(data=Xtrain, columns=feature_names)
Xval = pd.DataFrame(data=Xval, columns=feature_names)
dtrain = xgb.DMatrix(Xtrain, label=ytrain)
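Continuing that sketch (the objective and round count below are hypothetical), the DMatrix built from a DataFrame picks the column names up automatically, so the plot labels them:

import matplotlib.pyplot as plt

# Train on the DMatrix built above; parameters are illustrative only
bst = xgb.train({"objective": "reg:squarederror"}, dtrain, num_boost_round=10)
xgb.plot_importance(bst)  # y-axis now shows the real column names
plt.show()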