ray.tune.search.bayesopt.BayesOptSearch#

class ray.tune.search.bayesopt.BayesOptSearch(space: Optional[Dict] = None, metric: Optional[str] = None, mode: Optional[str] = None, points_to_evaluate: Optional[List[Dict]] = None, utility_kwargs: Optional[Dict] = None, random_state: int = 42, random_search_steps: int = 10, verbose: int = 0, patience: int = 5, skip_duplicate: bool = True, analysis: Optional[ExperimentAnalysis] = None)[source]#

Bases: ray.tune.search.searcher.Searcher

Uses fmfn/BayesianOptimization to optimize hyperparameters.

fmfn/BayesianOptimization is a library for Bayesian Optimization. More info can be found here: https://github.com/fmfn/BayesianOptimization.

This searcher will automatically filter out any NaN, inf or -inf results.

You will need to install fmfn/BayesianOptimization via the following:

pip install bayesian-optimization

Initializing this search algorithm with a space requires that it’s in the BayesianOptimization search space format. Otherwise, you should instead pass in a Tune search space into Tuner(param_space=...), and the search space will be automatically converted for you.

See this BayesianOptimization example notebook for an example.

Parameters
  • space – Continuous search space. Parameters will be sampled from this space which will be used to run trials.

  • metric – The training result objective value attribute. If None but a mode was passed, the anonymous metric _metric will be used per default.

  • mode – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

  • points_to_evaluate – Initial parameter suggestions to be run first. This is for when you already have some good parameters you want to run first to help the algorithm make better suggestions for future parameters. Needs to be a list of dicts containing the configurations.

  • utility_kwargs – Parameters to define the utility function. The default value is a dictionary with three keys: kind: "ucb" (Upper Confidence Bound), kappa: 2.576, xi: 0.0.

  • random_state – Used to initialize BayesOpt.

  • random_search_steps – Number of initial random searches. This is necessary to avoid initial local overfitting of the Bayesian process.

  • analysis – Optionally, the previous analysis to integrate.

  • verbose – Sets verbosity level for BayesOpt packages.

Tune automatically converts search spaces to BayesOptSearch’s format:

from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

config = {
    "width": tune.uniform(0, 20),
    "height": tune.uniform(-100, 100)
}

bayesopt = BayesOptSearch(metric="mean_loss", mode="min")
tuner = tune.Tuner(
    my_func,
    tune_config=tune.TuneConfig(
        search_alg=bayesopt,
    ),
    param_space=config,
)
tuner.fit()

If you would like to pass the search space manually, the code would look like this:

from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

space = {
    'width': (0, 20),
    'height': (-100, 100),
}
bayesopt = BayesOptSearch(space, metric="mean_loss", mode="min")
tuner = tune.Tuner(
    my_func,
    tune_config=tune.TuneConfig(
        search_alg=bayesopt,
    ),
)
tuner.fit()
set_search_properties(metric: Optional[str], mode: Optional[str], config: Dict, **spec) bool[source]#

Pass search properties to searcher.

This method acts as an alternative to instantiating search algorithms with their own specific search spaces. Instead they can accept a Tune config through this method. A searcher should return True if setting the config was successful, or False if it was unsuccessful, e.g. when the search space has already been set.

Parameters
  • metric – Metric to optimize

  • mode – One of {min, max}. Direction to optimize.

  • config – Tune config dict.

  • **spec – Any kwargs for forward compatibility. Info like Experiment.PUBLIC_KEYS is provided through here.

suggest(trial_id: str) Optional[Dict][source]#

Return new point to be explored by black box function.

Parameters

trial_id – Id of the trial. This is a short alphanumerical string.

Returns

Either a dictionary describing the new point to explore or None, when no new point is to be explored for the time being.

register_analysis(analysis: ExperimentAnalysis)[source]#

Integrate the given analysis into the Gaussian process.

Parameters

analysis – Optionally, the previous analysis to integrate.

on_trial_complete(trial_id: str, result: Optional[Dict] = None, error: bool = False)[source]#

Notification for the completion of a trial.

Parameters
  • trial_id – Id of the trial. This is a short alphanumerical string.

  • result – Dictionary of results. May be None when an error occurs.

  • error – Boolean representing a previous error state. The result should be None when error is True.

save(checkpoint_path: str)[source]#

Store the current optimizer state.

restore(checkpoint_path: str)[source]#

Restore the current optimizer state.