class ray.tune.search.bayesopt.BayesOptSearch(space: Optional[Dict] = None, metric: Optional[str] = None, mode: Optional[str] = None, points_to_evaluate: Optional[List[Dict]] = None, utility_kwargs: Optional[Dict] = None, random_state: int = 42, random_search_steps: int = 10, verbose: int = 0, patience: int = 5, skip_duplicate: bool = True, analysis: Optional[ExperimentAnalysis] = None)[source]#

Bases: ray.tune.search.searcher.Searcher

Uses fmfn/BayesianOptimization to optimize hyperparameters.

fmfn/BayesianOptimization is a library for Bayesian Optimization. More info can be found here: https://github.com/fmfn/BayesianOptimization.

This searcher will automatically filter out any NaN, inf or -inf results.
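The filtering behavior can be sketched as follows (a hypothetical helper for illustration, not the library's actual implementation):

```python
import math

def is_usable_result(value):
    # Sketch of BayesOptSearch-style filtering: reject NaN, inf, and -inf
    # before a result reaches the Gaussian process (hypothetical helper,
    # not the library's actual code).
    return isinstance(value, (int, float)) and math.isfinite(value)

print(is_usable_result(1.5))           # True
print(is_usable_result(float("nan")))  # False
print(is_usable_result(float("-inf")))  # False
```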

You will need to install fmfn/BayesianOptimization via the following:

pip install bayesian-optimization

Initializing this search algorithm with a space requires that it’s in the BayesianOptimization search space format. Otherwise, you should instead pass in a Tune search space into Tuner(param_space=...), and the search space will be automatically converted for you.

See this BayesianOptimization example notebook for an example.

Parameters

  • space – Continuous search space. Parameters will be sampled from this space which will be used to run trials.

  • metric – The training result objective value attribute. If None but a mode was passed, the anonymous metric _metric will be used per default.

  • mode – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

  • points_to_evaluate – Initial parameter suggestions to be run first. This is for when you already have some good parameters you want to run first to help the algorithm make better suggestions for future parameters. Needs to be a list of dicts containing the configurations.

  • utility_kwargs – Parameters that define the utility function. The default value is a dictionary with three keys: kind: "ucb" (Upper Confidence Bound), kappa: 2.576, and xi: 0.0.

  • random_state – Used to initialize BayesOpt.

  • random_search_steps – Number of initial random searches. This is necessary to avoid initial local overfitting of the Bayesian process.

  • verbose – Sets the verbosity level for the BayesOpt package.

  • patience – If set and the optimizer suggests the same trial this many times, the experiment is terminated.

  • skip_duplicate – Whether to skip duplicate configurations instead of re-evaluating them.

  • analysis – Optionally, the previous analysis to integrate.
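To illustrate the points_to_evaluate format described above (hypothetical parameter names and values; each entry is a complete configuration dict):

```python
# Two known-good configurations to run before the Bayesian process
# starts suggesting its own points (hypothetical values).
points_to_evaluate = [
    {"width": 10.0, "height": 0.0},
    {"width": 15.0, "height": -20.0},
]

# Each dict must assign a value to every parameter in the search space.
for point in points_to_evaluate:
    assert set(point) == {"width", "height"}
```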

Tune automatically converts search spaces to BayesOptSearch’s format:

from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

config = {
    "width": tune.uniform(0, 20),
    "height": tune.uniform(-100, 100),
}

bayesopt = BayesOptSearch(metric="mean_loss", mode="min")
tuner = tune.Tuner(
    trainable,  # placeholder for your trainable function or class
    tune_config=tune.TuneConfig(search_alg=bayesopt),
    param_space=config,
)
tuner.fit()

If you would like to pass the search space manually, the code would look like this:

from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

space = {
    'width': (0, 20),
    'height': (-100, 100),
}

bayesopt = BayesOptSearch(space, metric="mean_loss", mode="min")
# `trainable` is a placeholder for your trainable function or class
tuner = tune.Tuner(trainable, tune_config=tune.TuneConfig(search_alg=bayesopt))
tuner.fit()
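The manual space format maps each parameter name to a (lower, upper) bound tuple; the bounds can be sanity-checked with a sketch like this (hypothetical helper, not part of the library):

```python
def validate_bayesopt_space(space):
    # Each value must be a (low, high) pair with low < high
    # (hypothetical helper, not part of the library).
    for name, bounds in space.items():
        low, high = bounds
        if not low < high:
            raise ValueError(f"invalid bounds for {name!r}: {bounds}")
    return True

print(validate_bayesopt_space({"width": (0, 20), "height": (-100, 100)}))  # True
```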


Methods
add_evaluated_point(parameters, value[, ...])

Pass results from a point that has been evaluated separately.

add_evaluated_trials(trials_or_analysis, metric)

Pass results from trials that have been evaluated separately.

on_trial_complete(trial_id[, result, error])

Notification for the completion of a trial.

on_trial_result(trial_id, result)

Optional notification for result during training.


register_analysis(analysis)

Integrate the given analysis into the gaussian process.


restore(checkpoint_path)

Restores the current optimizer state.


restore_from_dir(checkpoint_dir)

Restores the state of a searcher from a given checkpoint_dir.


save(checkpoint_path)

Stores the current optimizer state.

save_to_dir(checkpoint_dir[, session_str])

Automatically saves the given searcher to the checkpoint_dir.


set_max_concurrency(max_concurrent)

Sets the maximum number of concurrent trials this searcher can run.


suggest(trial_id)

Returns a new point to be explored by the black box function.



Attributes

metric

The training result objective value attribute.


mode

Specifies if minimizing or maximizing the metric.