Hi all. I'm wondering what the most Ray-suitable approach to performing a hyperparameter sweep is.
Here's my situation. I have a pseudo-black-box algorithm: it is not truly a black box, but I treat it as one. I want to see how sensitive the algorithm is to perturbations of its input (hyper)parameters.
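Concretely, this is the kind of sweep I mean, written sequentially (the quadratic cost here is just a hypothetical stand-in for my algorithm):

```python
def algo(p):
    # hypothetical stand-in for my pseudo-black-box algorithm
    return (p - 2) ** 2

# evaluate the cost at each hyperparameter value to gauge sensitivity
costs = {p: algo(p) for p in [1, 2, 3, 4]}
print(costs)  # {1: 1, 2: 0, 3: 1, 4: 4}
```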
In the beginning, I thought of using Ray Tune to implement this with Ray. However, it seems that Ray Tune is designed for sequentially performing hyperparameter tuning (i.e., searching for the best configuration). In my case, the objective is to "sweep" a given hyperparameter set to understand the algorithm's sensitivity to the hyperparameters.
One more thing I want is to log the outputs with wandb. The following is my trial:
```python
import ray
import wandb

@ray.remote
def run_algo(hyperparam):
    cost = algo(hyperparam)  # my pseudo-black-box algorithm
    wandb.init()
    wandb.log({'cost': cost})
    return 1

if __name__ == '__main__':
    ray.init(num_cpus=3)
    params = [1, 2, 3, 4, ...]  # list of hyperparameters
    # submit one task per hyperparameter
    results = ray.get([run_algo.remote(p) for p in params])
    print(results)
```
From here, I have a few questions about my implementation.
1. Am I correct that this code spawns three workers (i.e., `num_cpus=3`) and assigns jobs to the workers whenever they are idle? Also, do the workers process the jobs asynchronously?
2. Is there any way to achieve (1) without explicitly returning values from `run_algo`?
Moreover, what would be a better implementation?
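To clarify what I mean in question (1), here is the scheduling behavior I'm hoping for, sketched with the stdlib's `concurrent.futures` (the cost function is a hypothetical stand-in for my algorithm):

```python
from concurrent.futures import ThreadPoolExecutor

def algo(p):
    # hypothetical stand-in for my pseudo-black-box algorithm
    return (p - 2) ** 2

# three workers; each picks up the next pending job as soon as it is idle,
# which is the behavior I hope Ray gives me with num_cpus=3
with ThreadPoolExecutor(max_workers=3) as pool:
    costs = list(pool.map(algo, [1, 2, 3, 4]))
print(costs)  # [1, 0, 1, 4] -- map preserves input order
```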
Thank you so much!