Hyperparameter tuning performance

I am trying to compare the performance of Ray Tune against traditional hyperparameter tuning using GridSearchCV. Please find the attached image for the results.

I tried tests 1 and 2 with the fewest hyperparameters for both cases, and the results seem convincing. When I increased the number of hyperparameters for tests 3 and 4, the traditional approach did not respond at all, whereas Ray Tune finished in less than a minute (faster than its previous run, test 2).

Question: Ray Tune stays quick even as the number of hyperparameters increases. Is this expected behavior, or is it random?

@pandiarajang these are some cool results. Did you use any schedulers?

Also, did you use tune-sklearn?
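
(For context: a scheduler such as ASHAScheduler early-stops trials whose reported metric lags behind, which can speed things up further. Below is a minimal sketch of how one could be attached to tune.run; the train_model and config names are placeholders standing in for whatever trainable and search space you used.)

from ray import tune
from ray.tune.schedulers import ASHAScheduler

# ASHA terminates underperforming trials early based on the reported metric.
scheduler = ASHAScheduler(
    metric="mean_accuracy",
    mode="max",
    grace_period=1,
)

analysis = tune.run(
    train_model,        # trainable that reports mean_accuracy via tune.report
    config=config,      # same search space as without a scheduler
    num_samples=10,
    scheduler=scheduler,
)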

I have used tune.run()

Please find the code snippet below.

from datetime import datetime

import xgboost as xgb
from ray import tune


def train_model(config):
    # train_test_split_telecom_churn() is my own helper that loads and splits
    # the telecom-churn dataset.
    train_x, test_x, train_y, test_y = train_test_split_telecom_churn()
    train_set = xgb.DMatrix(train_x, label=train_y)
    test_set = xgb.DMatrix(test_x, label=test_y)

    # Train with the sampled hyperparameters and collect evaluation results.
    results = {}
    xgb.train(
        config,
        train_set,
        evals=[(test_set, "eval")],
        evals_result=results,
        verbose_eval=False,
    )

    # Report the final accuracy back to Tune.
    accuracy = 1.0 - results["eval"]["error"][-1]
    tune.report(mean_accuracy=accuracy, done=True)


config = {
    "objective": "binary:logistic",
    "eval_metric": ["logloss", "error"],
    "max_depth": tune.choice([7, 20, 30, 120, 200]),
    "learning_rate": tune.choice([0.1, 0.4]),
    "min_child_weight": tune.choice([5, 10, 100]),
    "subsample": tune.uniform(0.5, 1.0),
    "n_estimators": tune.choice([5, 10, 100]),
    "eta": tune.loguniform(1e-4, 1e-1),
}

analysis = tune.run(
    train_model,
    resources_per_trial={"cpu": 3},
    config=config,
    num_samples=10,
)
end_time = datetime.now()  # timestamp taken after tuning completes

Interesting; perhaps this is because tune.choice performs random sampling, unlike GridSearchCV, which evaluates every possible combination. With num_samples=10, Ray Tune runs exactly 10 trials no matter how many hyperparameters you add, whereas grid search's trial count grows multiplicatively with each new hyperparameter.
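
For an apples-to-apples comparison, Ray Tune can also do an exhaustive search via tune.grid_search. Here is a minimal sketch reusing the train_model name from your snippet, with a smaller illustrative grid:

from ray import tune

# tune.grid_search expands into one trial per combination (3 x 2 = 6 here),
# so the trial count grows multiplicatively, just like GridSearchCV.
grid_config = {
    "objective": "binary:logistic",
    "eval_metric": ["logloss", "error"],
    "max_depth": tune.grid_search([7, 20, 30]),
    "learning_rate": tune.grid_search([0.1, 0.4]),
}

analysis = tune.run(
    train_model,
    config=grid_config,
    num_samples=1,  # one pass over the full grid
)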

I recommend trying out tune-sklearn! Here’s an xgboost example: https://github.com/ray-project/tune-sklearn/blob/master/examples/xgbclassifier.py
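
Roughly, tune-sklearn gives you a drop-in replacement for GridSearchCV/RandomizedSearchCV backed by Ray Tune. Below is a minimal sketch using TuneSearchCV with an XGBClassifier; the parameter space is illustrative (not from the linked example), train_test_split_telecom_churn() is your helper from the snippet above, and you should check the linked example for the exact API:

from tune_sklearn import TuneSearchCV
from xgboost import XGBClassifier

# Sklearn-style search-CV interface, with trials run by Ray Tune under the hood.
param_distributions = {
    "max_depth": [7, 20, 30],
    "learning_rate": [0.1, 0.4],
    "n_estimators": [5, 10, 100],
}

search = TuneSearchCV(
    XGBClassifier(objective="binary:logistic"),
    param_distributions,
    n_trials=10,  # number of sampled configurations
    cv=3,
)

train_x, test_x, train_y, test_y = train_test_split_telecom_churn()
search.fit(train_x, train_y)
print(search.best_params_)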