Do we have a tutorial that sets up tuning multiple trainers at once (e.g., combining XGBoost, CatBoost, LightGBM, and logistic regression) while sampling (e.g., with hyperopt) from each trainer's own hyperparameter search space?
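I'm not aware of a single tutorial covering exactly that combination, but the underlying pattern is a conditional search space: first pick a trainer, then sample only from that trainer's own hyperparameter space (the same idea hyperopt expresses with `hp.choice` over per-model sub-spaces). Below is a minimal pure-Python sketch of that pattern using random search; the trainer names and parameter ranges are illustrative assumptions, not values from any official docs.

```python
import random

# Hypothetical per-trainer search spaces (illustrative values only).
# With hyperopt you would express the same structure via hp.choice over
# a list of per-model dicts; here plain random search keeps it stdlib-only.
SEARCH_SPACES = {
    "xgboost":  {"max_depth": range(3, 11), "eta": [0.01, 0.05, 0.1, 0.3]},
    "catboost": {"depth": range(4, 11), "learning_rate": [0.01, 0.03, 0.1]},
    "lightgbm": {"num_leaves": range(15, 256), "learning_rate": [0.01, 0.05, 0.1]},
    "logreg":   {"C": [0.01, 0.1, 1.0, 10.0], "penalty": ["l1", "l2"]},
}

def sample_trial(rng: random.Random) -> dict:
    """Pick a trainer uniformly, then sample only that trainer's space."""
    trainer = rng.choice(sorted(SEARCH_SPACES))
    params = {name: rng.choice(list(choices))
              for name, choices in SEARCH_SPACES[trainer].items()}
    return {"trainer": trainer, "params": params}

if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(3):
        print(sample_trial(rng))  # each trial carries its trainer + matching params
```

Each sampled trial only ever contains parameters valid for the trainer it selected, which is the key property you need when tuning heterogeneous models from one search loop.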