Is there a tutorial that sets up tuning multiple trainers together (e.g., combining XGBoost, CatBoost, LightGBM, and logistic regression) while a search algorithm such as HyperOpt samples from each trainer's own hyperparameter search space?
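To make the question concrete, here is a rough sketch of the kind of setup I mean, assuming Ray Tune with `HyperOptSearch` and a native hyperopt conditional search space; the model choices, parameter names, and ranges below are just placeholders I picked for illustration, not taken from any official tutorial:

```python
from hyperopt import hp
from ray import tune
from ray.tune.search.hyperopt import HyperOptSearch
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
import lightgbm
import xgboost
from catboost import CatBoostClassifier

# Toy dataset just for the sketch.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# One conditional search space: hp.choice picks a trainer, and each branch
# carries that trainer's own hyperparameters.
search_space = hp.choice("model", [
    {"model_type": "xgboost",
     "max_depth": hp.uniformint("xgb_max_depth", 3, 10),
     "learning_rate": hp.loguniform("xgb_lr", -5, 0)},
    {"model_type": "lightgbm",
     "num_leaves": hp.uniformint("lgbm_num_leaves", 16, 256),
     "learning_rate": hp.loguniform("lgbm_lr", -5, 0)},
    {"model_type": "catboost",
     "depth": hp.uniformint("cat_depth", 3, 10),
     "learning_rate": hp.loguniform("cat_lr", -5, 0)},
    {"model_type": "logreg",
     "C": hp.loguniform("logreg_C", -4, 2)},
])

def train_model(config):
    # config is the dict produced by the chosen hp.choice branch.
    if config["model_type"] == "xgboost":
        model = xgboost.XGBClassifier(
            max_depth=int(config["max_depth"]),
            learning_rate=config["learning_rate"],
        )
    elif config["model_type"] == "lightgbm":
        model = lightgbm.LGBMClassifier(
            num_leaves=int(config["num_leaves"]),
            learning_rate=config["learning_rate"],
        )
    elif config["model_type"] == "catboost":
        model = CatBoostClassifier(
            depth=int(config["depth"]),
            learning_rate=config["learning_rate"],
            verbose=0,
        )
    else:  # logistic regression
        model = LogisticRegression(C=config["C"], max_iter=1000)
    model.fit(X_train, y_train)
    # Returning a dict reports it as the trial's final metrics.
    return {"accuracy": model.score(X_val, y_val)}

tuner = tune.Tuner(
    train_model,
    tune_config=tune.TuneConfig(
        search_alg=HyperOptSearch(search_space, metric="accuracy", mode="max"),
        num_samples=40,
    ),
)
results = tuner.fit()
print(results.get_best_result(metric="accuracy", mode="max").config)
```

Is there an existing example or tutorial along these lines, or a recommended way to structure the conditional search space across trainers?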