How severely does this issue affect your experience of using Ray?
- Low: It annoys or frustrates me for a moment.
When setting the storage_path parameter of ray.train.RunConfig() to a local folder,
I've noticed that Ray saves the results of a Tune run in ~/ray_results in addition to the storage_path. Sample code:
from ray import tune
from ray.train import CheckpointConfig, RunConfig
from ray.tune.examples.mnist_pytorch import train_mnist

# Results should be written here instead of the default ~/ray_results
storage_path = "/tmp/ray_results"
exp_name = "tune_analyzing_results"

tuner = tune.Tuner(
    train_mnist,
    param_space={
        "lr": tune.loguniform(0.001, 0.04),
        "momentum": tune.grid_search([0.8, 0.9]),
    },
    run_config=RunConfig(
        name=exp_name,
        stop={"training_iteration": 100},
        checkpoint_config=CheckpointConfig(
            checkpoint_score_attribute="mean_accuracy",
            num_to_keep=5,
        ),
        storage_path=storage_path,
    ),
    tune_config=tune.TuneConfig(
        mode="max",
        metric="mean_accuracy",
        num_samples=1,
    ),
)
result_grid = tuner.fit()
If TUNE_RESULT_DIR is set to the same folder as storage_path before running the code, the duplication does not occur. This behavior seems strange: if storage_path is set but TUNE_RESULT_DIR is not, I would expect the results to be stored only in storage_path, not additionally in the default ~/ray_results folder used when TUNE_RESULT_DIR is unset.
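For reference, a minimal sketch of the workaround described above, assuming the environment variable is set from Python before the Tuner runs (exporting TUNE_RESULT_DIR in the shell before launching the script should work just as well):

import os

# Workaround (sketch, not a proper fix): point TUNE_RESULT_DIR at the same
# directory as storage_path before Ray Tune starts, so only one copy of the
# results is written instead of duplicating them under ~/ray_results.
os.environ["TUNE_RESULT_DIR"] = "/tmp/ray_results"  # same value as storage_path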
I'm using Ray 2.7.1 and PyTorch 2.0.0.post102 on macOS 13.6.