TypeErrors when passing search spaces to trainable function

I'm trying to use Ray Tune to train a PyTorch model, but I'm getting TypeErrors that suggest the whole search-space distribution object is being passed to my trainable function rather than concrete values sampled from the distribution.

Code excerpts:
# search space
config = {
    'hidden_size': tune.sample_from(lambda _: 2 ** np.random.randint(7, 10)),
    'num_hidden': tune.sample_from(lambda _: np.arange(1, 5)),
    'lr': tune.loguniform(1e-4, 1e-1),
    'batch_size': tune.sample_from(lambda _: 2 ** np.random.randint(7, 10)),
}

result = tune.run(
    partial(train_model, config, checkpoint_dir),
    resources_per_trial={"cpu": cpus_per_trial, "gpu": gpus_per_trial},
)

The most relevant portion of the error message:
File "/Users/dombyrne/phd/synthetic_genotypes/src/synthetic_geno_nn_cls.py", line 87, in train_model
    model = MLP(config['hidden_size'], config['num_hidden'], config['input_size'])
File "/Users/dombyrne/phd/synthetic_genotypes/src/synthetic_geno_nn_cls.py", line 44, in __init__
    l_sizes = [input_size] + [hidden_size] * int(num_hidden) + [output_size]
TypeError: int() argument must be a string, a bytes-like object or a number, not 'Categorical'

I've tried reinstalling Ray Tune with pip, which made no difference. I'm using Ray version 1.9.0.

Best wishes,

Hey @DomByrne,

What happens if you just specify train_model as the trainable function?

- partial(train_model, config, checkpoint_dir),
+ train_model,

Tune samples a concrete configuration from the search space you pass via the config= argument of tune.run, and passes that dict into train_model at execution time. Wrapping the trainable in partial(train_model, config, ...) hands it the raw search-space objects instead.
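To make the mechanism concrete, here is a minimal stand-in (not Ray's actual implementation) for what Tune does internally: it resolves each search-space entry to a plain value before invoking the trainable, which is why the trainable must be passed bare. The resolve helper and the lambda-based search space below are illustrative stand-ins for tune.sample_from and friends.

```python
import random

def resolve(search_space):
    # Stand-in for Tune's sampling step: turn each distribution
    # (here, a plain callable) into a concrete value.
    return {k: (v() if callable(v) else v) for k, v in search_space.items()}

# Illustrative search space, mirroring the shape of the one in the question.
search_space = {
    "hidden_size": lambda: 2 ** random.randint(7, 9),
    "lr": lambda: 10 ** random.uniform(-4, -1),
}

def train_model(config, checkpoint_dir=None):
    # Receives plain ints/floats, so int()/arithmetic work as expected.
    return config["hidden_size"], config["lr"]

hidden, lr = train_model(resolve(search_space))
```

With the real API (as of Ray 1.9), the equivalent is tune.run(train_model, config=config, ...); if you need to bind extra arguments such as checkpoint_dir, tune.with_parameters(train_model, ...) is the supported way to do it rather than functools.partial with the config.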

That works, thank you!