Accessing GP Surrogate and Adding Constraints in BayesOptSearch

1. Severity of the issue: (select one)
None: I’m just curious or want clarification.

2. Environment:

  • Ray version: 2.50.1

I’m using BayesOptSearch for a Bayesian optimization of my simulation parameters. This works well in terms of finding the best parameter combination, but I have a couple of additional goals:

  • I would like to obtain a confidence interval for the final optimized parameters.
  • I would like to add constraints to the parameter space in a clean, integrated way.

I haven’t been able to find a way to either access the internal GP after the search is done or to add a constraint, so I have a few questions:

  • Is there a supported way in Ray Tune to access the final GP surrogate after a BayesOptSearch run?
  • If not, what is the best way to estimate confidence intervals for the parameters after such a run?
  • Is there a supported way in Ray Tune to add constraints to a BayesOptSearch run?

Any advice would be appreciated.
Thanks!

It seems likely that Ray Tune’s BayesOptSearch does not provide a supported way to access the internal Gaussian Process (GP) surrogate model after a run, nor a built-in method to estimate confidence intervals for the optimized parameters. The BayesOptSearch implementation constructs the underlying optimizer with f=None and drives the optimization internally, without exposing the GP or its uncertainty estimates once tuning is complete. There is also no documented API for extracting the GP or its confidence intervals post hoc in the available Ray or BayesOptSearch documentation. For confidence intervals, you may need to re-fit a GP externally on the observed (parameter, result) pairs from the Ray Tune results dataframe; this is not natively supported by Ray Tune itself (bayesopt_search.py, bayesopt_example.py).
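As a sketch of the external re-fit workaround: the snippet below fits a scikit-learn GP on observed (config, result) pairs and reads off a predictive confidence interval at the best-found point. The random data here stands in for your Ray Tune results dataframe (in practice you would pull the config columns and metric column from `results.get_dataframe()`; those column names depend on your search space and are not shown here).

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

# Synthetic stand-in for the observed (parameter, result) pairs you would
# extract from the Ray Tune results dataframe.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(40, 2))      # observed parameter configs
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2      # observed objective values

# Re-fit a GP surrogate externally on the observed data.
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
gp.fit(X, y)

# Predictive mean and std at the best-found config give an approximate
# confidence interval for the objective there.
x_best = X[np.argmin(y)].reshape(1, -1)
mean, std = gp.predict(x_best, return_std=True)
lower, upper = mean - 1.96 * std, mean + 1.96 * std  # ~95% interval
print(f"mean={mean[0]:.3f}, 95% CI=({lower[0]:.3f}, {upper[0]:.3f})")
```

Note this gives uncertainty on the *objective* at a given config; uncertainty on the parameters themselves would need a different treatment (e.g. profiling the GP posterior over the search space). The Matern(nu=2.5) kernel matches what the bayesian-optimization package uses internally, so the re-fit surrogate should be close in spirit to the one BayesOptSearch built.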

Regarding constraints, my understanding is that BayesOptSearch in Ray Tune does not support explicit parameter constraints or conditional search spaces in a “clean, integrated way.” The recommended workaround is to filter or penalize invalid configurations within your objective function (e.g., by returning NaN or a large loss), but this is not a true constraint mechanism and may affect optimization efficiency. There is no official API for adding constraints directly to BayesOptSearch, and this limitation is mentioned in community discussions (bbc3cdf2, 8ebedd77). Would you like more detail or example code for any of these workarounds?
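The penalty workaround can be sketched as follows. The feasibility rule (`x + y <= 1.0`) and the quadratic loss are hypothetical examples; in a real run you would pass a function like `objective` to `tune.Tuner` together with `BayesOptSearch`, reporting the (possibly penalized) loss as the metric.

```python
# Hypothetical constraint: only configs with x + y <= 1.0 are feasible.
def feasible(config):
    return config["x"] + config["y"] <= 1.0

def objective(config):
    # Infeasible configs receive a large penalty loss instead of the true
    # objective, steering the searcher away from that region. This is a
    # workaround, not a true constraint mechanism: the GP still models the
    # penalized region, which can hurt optimization efficiency.
    if not feasible(config):
        return {"loss": 1e6}  # returning float("nan") is another option
    # Hypothetical true objective.
    loss = (config["x"] - 0.3) ** 2 + (config["y"] - 0.2) ** 2
    return {"loss": loss}

print(objective({"x": 0.3, "y": 0.2}))  # feasible config, true loss
print(objective({"x": 0.9, "y": 0.9}))  # infeasible config, penalty loss
```

Whether to penalize or return NaN depends on the searcher: a large finite loss keeps the GP defined everywhere, while NaN marks the trial as failed and discards it.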
