Tuning multiple model types in one hyperparameter search

Do we have a tutorial that sets up tuning across multiple trainers (e.g., combining XGBoost, CatBoost, LightGBM, and logistic regression), where the tuner (e.g., hyperopt) samples each model from its own hyperparameter search space?
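In hyperopt this pattern is usually expressed with `hp.choice` over per-model sub-spaces, so each trial first picks a model family and then samples only that family's parameters. As a library-free sketch of the same idea (hand-rolled random search standing in for hyperopt's TPE; all parameter names and ranges below are illustrative, not recommended defaults):

```python
import random

# One sub-space per model family; each entry is a sampler for that parameter.
SEARCH_SPACES = {
    "xgboost": {
        "max_depth": lambda: random.randint(3, 10),
        "learning_rate": lambda: 10 ** random.uniform(-3, -1),
    },
    "lightgbm": {
        "num_leaves": lambda: random.randint(15, 255),
        "learning_rate": lambda: 10 ** random.uniform(-3, -1),
    },
    "catboost": {
        "depth": lambda: random.randint(4, 10),
        "l2_leaf_reg": lambda: random.uniform(1.0, 10.0),
    },
    "logistic_regression": {
        "C": lambda: 10 ** random.uniform(-2, 2),
    },
}

def sample_config():
    """Pick a model family, then sample only from its own sub-space."""
    model = random.choice(list(SEARCH_SPACES))
    params = {name: draw() for name, draw in SEARCH_SPACES[model].items()}
    return {"model": model, "params": params}

def tune(objective, n_trials=50, seed=0):
    """Random search over the joint (model family, params) space.

    `objective` maps a config to a loss; in practice it would train the
    chosen model and return a cross-validated validation loss.
    """
    random.seed(seed)
    best = None
    for _ in range(n_trials):
        cfg = sample_config()
        loss = objective(cfg)
        if best is None or loss < best[0]:
            best = (loss, cfg)
    return best

if __name__ == "__main__":
    # Dummy objective for demonstration only.
    dummy = lambda cfg: abs(cfg["params"].get("learning_rate", 0.1) - 0.05)
    best_loss, best_cfg = tune(dummy)
    print(best_cfg["model"], best_cfg["params"])
```

With hyperopt itself, `SEARCH_SPACES` becomes a single `hp.choice("model", [...])` of dicts (one per family), and `tune` is replaced by `fmin(objective, space, algo=tpe.suggest, max_evals=...)`; the objective unpacks the chosen family and fits the corresponding trainer.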