Accessing GP Surrogate and Adding Constraints in BayesOptSearch

It seems likely that Ray Tune’s BayesOptSearch does not provide a supported way to access the internal Gaussian Process (GP) surrogate model after a run, nor a built-in method to estimate confidence intervals for the optimized parameters. The BayesOptSearch implementation sets up the underlying optimizer with f=None and drives the optimization internally, without exposing the GP or its uncertainty estimates once tuning is complete. Neither the Ray documentation nor the BayesOptSearch source describes an API for extracting the GP or its confidence intervals post hoc. For confidence intervals, you may need to re-fit a GP externally on the observed parameter-result pairs from the Ray Tune results dataframe; this is not natively supported by Ray Tune itself (bayesopt_search.py, bayesopt_example.py).
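
As a rough sketch of that external re-fit approach, assuming a finished Tuner run and using scikit-learn’s GaussianProcessRegressor (the column names "config/x", "config/y", and "mean_loss" are illustrative assumptions; substitute the ones from your own search space and metric):

```python
import numpy as np
import pandas as pd
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern


def fit_external_gp(df: pd.DataFrame, param_cols, metric_col):
    """Re-fit a GP on the (config, metric) pairs of a finished Tune run.

    `df` is the dataframe from ResultGrid.get_dataframe(); `param_cols`
    are the "config/..." columns and `metric_col` is the reported metric.
    """
    X = df[param_cols].to_numpy()
    y = df[metric_col].to_numpy()

    # Matern(nu=2.5) roughly mirrors the default kernel of the
    # bayesian-optimization backend that BayesOptSearch wraps.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)

    # The predictive mean and std at the best observed point give an
    # approximate 95% interval for the surrogate's estimate there.
    x_best = X[np.argmin(y)].reshape(1, -1)
    mean, std = gp.predict(x_best, return_std=True)
    return mean[0], 1.96 * std[0]


# Usage (column names are assumptions -- adjust to your run):
# df = tuner.fit().get_dataframe()
# mean, half_width = fit_external_gp(df, ["config/x", "config/y"], "mean_loss")
# print(f"predicted loss at best point: {mean:.4f} +/- {half_width:.4f}")
```

Note that this GP is fit only to the observations Ray Tune recorded, so it approximates rather than reproduces the surrogate BayesOptSearch used internally.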

Regarding constraints, my understanding is that BayesOptSearch in Ray Tune does not support explicit parameter constraints or conditional search spaces in a “clean, integrated way.” The recommended workaround is to filter or penalize invalid configurations inside your objective function (e.g., by returning a large loss, or NaN), but this is not a true constraint mechanism and may hurt optimization efficiency. There is no official API for adding constraints directly to BayesOptSearch, and this limitation comes up in community discussions (bbc3cdf2, 8ebedd77). A sketch of the penalty workaround is shown below.
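
A minimal sketch of the penalty approach, assuming a toy two-parameter search space; the constraint x + y <= 1.5, the penalty value, and the quadratic loss are all illustrative assumptions, and a large finite penalty is generally safer for the GP backend than reporting NaN:

```python
from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch


def objective(config):
    x, y = config["x"], config["y"]

    # Illustrative constraint: reject configurations with x + y > 1.5
    # by returning a large finite penalty instead of the real loss.
    if x + y > 1.5:
        return {"mean_loss": 1e6}

    # Toy loss for valid configurations.
    return {"mean_loss": (x - 0.5) ** 2 + (y - 0.3) ** 2}


search_space = {"x": tune.uniform(0.0, 1.0), "y": tune.uniform(0.0, 1.0)}

tuner = tune.Tuner(
    objective,
    tune_config=tune.TuneConfig(
        search_alg=BayesOptSearch(metric="mean_loss", mode="min"),
        num_samples=20,
    ),
    param_space=search_space,
)
results = tuner.fit()
```

Because the GP still “sees” the penalized points, a very large penalty can distort the surrogate near the constraint boundary; a more moderate penalty or a smooth penalty term sometimes behaves better in practice.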
