Hi!
I’m trying to use Ray Tune for hyperparameter search, where each model is trained with PyTorch Lightning (PTL). Oddly, I’m getting the following error:
lightning_lite.utilities.exceptions.MisconfigurationException: No supported gpu backend found!
The distributed hyperparameter search works on CPU, and training without Ray works fine on GPU. Could this be a compatibility issue between PTL and Ray Tune?
Note that I request only a single GPU for each training job:
metrics = {"loss": "val_loss"}
callbacks = [TuneReportCallback(metrics, on="validation_end")] if args.tune else []
trainer = pl.Trainer(
    devices=1,
    accelerator="gpu",
    precision=64,
    gradient_clip_val=args.grad_clip_val,
    limit_train_batches=1.0,
    log_every_n_steps=10,
    callbacks=callbacks,
)
trainer.fit(model, train_loader, val_loader)
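My current suspicion is that the GPU is visible to the driver but not to the Ray workers: as far as I understand, Ray controls each trial’s GPU visibility through CUDA_VISIBLE_DEVICES, and a trial scheduled without a GPU resource request sees an empty value, so PTL finds no CUDA device. A minimal stdlib sketch of that mechanism (the helper visible_gpu_ids and the environments are my own illustration, not Ray code):

```python
# Assumption: Ray restricts each trial's GPU visibility by setting
# CUDA_VISIBLE_DEVICES in the worker's environment. If the trial was
# scheduled with no GPU resources, the worker sees an empty value, so
# PyTorch finds no CUDA device and Lightning raises
# "No supported gpu backend found!".

def visible_gpu_ids(env: dict) -> list:
    """Parse the GPU device ids a process is allowed to see."""
    raw = env.get("CUDA_VISIBLE_DEVICES", "")
    return [d for d in raw.split(",") if d]

# What a worker launched without a GPU request would see:
cpu_only_env = {"CUDA_VISIBLE_DEVICES": ""}
print(visible_gpu_ids(cpu_only_env))  # []

# What a worker granted one GPU would see:
gpu_env = {"CUDA_VISIBLE_DEVICES": "0"}
print(visible_gpu_ids(gpu_env))  # ['0']
```

If that is right, the fix would presumably be to request GPU resources per trial when launching the search (e.g. a "gpu": 1 resource request), but I haven’t confirmed this.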