Does Ray Trainer allow/support Trainers as members (i.e., a "composite" Trainer)?
The idea is that I want to train a DAG of models/nodes, where nodes could run in parallel with different resource requirements.
Since Ray Trainer works nicely with Tune, if I could, say, subclass BaseTrainer and have it train the whole DAG (roughly like the sketch below), then I could tune the entire DAG in a configurable way fairly easily.
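Here's a rough sketch of what I mean by a "composite" trainer. `DagTrainer`, `node_trainers`, and `dag_edges` are all names I made up; each value in `node_trainers` would be an ordinary Ray Trainer constructed with its own `ScalingConfig`. Whether calling `.fit()` on sub-trainers from inside `training_loop()` is even allowed is exactly what I'm asking.

```python
from ray.train.trainer import BaseTrainer
from ray.air import session


class DagTrainer(BaseTrainer):
    """Hypothetical composite trainer that owns one sub-trainer per DAG node."""

    def __init__(self, node_trainers, dag_edges, **kwargs):
        # node_trainers: {node_name: Trainer}, each built with its own resources
        # dag_edges: {node_name: [upstream node names it depends on]}
        self.node_trainers = node_trainers
        self.dag_edges = dag_edges
        super().__init__(**kwargs)

    def training_loop(self):
        # Naive sequential topological walk; ideally the "ready" nodes
        # would run in parallel, each with its own resource requirements.
        finished = {}
        remaining = dict(self.node_trainers)
        while remaining:
            ready = [
                name
                for name in remaining
                if all(dep in finished for dep in self.dag_edges.get(name, []))
            ]
            for name in ready:
                # Not sure nested .fit() calls are supported -- part of the question.
                finished[name] = remaining.pop(name).fit()
            session.report({"nodes_finished": len(finished)})
```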
Any suggestions?