Ray Tune Error Help

Dear all,
I encountered the error shown in the attached image.

Here is part of my code:
config = {
    "n_hidden_size": 64,
    "kernel_size": 3,
    "lr": 1e-4,
    "batch_size": 32
}

def train_class_tune_checkpoint(config, checkpoint_dir=None, num_epochs=10):
    kwargs = {
        "max_epochs": num_epochs,
        "gpus": 1,
        "logger": TensorBoardLogger(
            save_dir=tune.get_trial_dir(), name="", version="."),
        "progress_bar_refresh_rate": 0,
        "callbacks": [
            TuneReportCheckpointCallback(
                metrics={"loss": "val_loss", "mean_accuracy": "val_accuracy"},
                filename="checkpoint",
                on="validation_end")
        ]
    }
    if checkpoint_dir:
        kwargs["resume_from_checkpoint"] = os.path.join(
            checkpoint_dir, "checkpoint")
    model = Classification(config, num_inputs=3, n_classes=4, n_layer=4)
    data_module = VehicleDataModule(final_train, final_test, config)
    data_module.setup()
    trainer = pl.Trainer(**kwargs)
    trainer.fit(model, data_module)

def tune_class_pbt(num_samples, num_epochs):
    config = {
        "n_hidden_size": tune.choice([32, 64, 128]),
        "kernel_size": 3,
        "lr": 1e-3,
        "batch_size": 64,
    }
    scheduler = PopulationBasedTraining(
        perturbation_interval=4,
        hyperparam_mutations={
            "lr": tune.loguniform(1e-4, 1e-1),
            "batch_size": [32, 64, 128]
        })
    reporter = CLIReporter(
        parameter_columns=["n_hidden_size", "lr", "batch_size"],
        metric_columns=["loss", "mean_accuracy", "training_iteration"])
    analysis = tune.run(
        tune.with_parameters(
            train_class_tune_checkpoint,
            num_epochs=num_epochs,
            num_gpus=1),
        resources_per_trial={
            "cpu": 1,
            "gpu": 1
        },
        metric="loss",
        mode="min",
        config=config,
        num_samples=num_samples,
        scheduler=scheduler,
        progress_reporter=reporter,
        name="tune_class_pbt")
    print("Best hyperparameters found were: ", analysis.best_config)

tune_class_pbt(10, 10)

I used tune.with_parameters, so why do I still encounter that error?
Could you also explain how to use ray.put?

Thanks

Some pieces of the code are missing, so I can't actually run it. Could you refactor train_class_tune_checkpoint a bit? For example, take data_module out of the function body and pass it in as an argument.
I think tune.with_parameters is probably the right approach here. One note about it: it only calls ray.put on the kwargs, not on train_class_tune_checkpoint itself, so it will only be meaningful after you refactor some of the objects the function builds internally into arguments.