ray.tune.search.bayesopt.BayesOptSearch
- class ray.tune.search.bayesopt.BayesOptSearch(space: Dict | None = None, metric: str | None = None, mode: str | None = None, points_to_evaluate: List[Dict] | None = None, utility_kwargs: Dict | None = None, random_state: int = 42, random_search_steps: int = 10, verbose: int = 0, patience: int = 5, skip_duplicate: bool = True, analysis: ExperimentAnalysis | None = None)
Bases: Searcher
Uses fmfn/BayesianOptimization to optimize hyperparameters.
fmfn/BayesianOptimization is a library for Bayesian optimization. More info can be found here: https://github.com/fmfn/BayesianOptimization.
This searcher will automatically filter out any NaN, inf or -inf results.
You will need to install fmfn/BayesianOptimization via the following:
pip install bayesian-optimization
Initializing this search algorithm with a space requires that it's in the BayesianOptimization search space format. Otherwise, you should instead pass in a Tune search space into Tuner(param_space=...), and the search space will be automatically converted for you.
See this BayesianOptimization example notebook for an example.
- Parameters:
space – Continuous search space. Parameters will be sampled from this space which will be used to run trials.
metric – The training result objective value attribute. If None but a mode was passed, the anonymous metric _metric will be used per default.
mode – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.
points_to_evaluate – Initial parameter suggestions to be run first. This is for when you already have some good parameters you want to run first to help the algorithm make better suggestions for future parameters. Needs to be a list of dicts containing the configurations.
utility_kwargs – Parameters to define the utility function. The default value is a dictionary with three keys: kind: ucb (Upper Confidence Bound), kappa: 2.576, xi: 0.0. See the sketch after this parameter list for how to override these.
random_state – Used to initialize BayesOpt.
random_search_steps – Number of initial random searches. This is necessary to avoid initial local overfitting of the Bayesian process.
verbose – Sets verbosity level for the BayesOpt package.
patience – If set, the experiment is terminated early once the same trial has been repeated this many times.
skip_duplicate – Whether to skip duplicate configurations.
analysis – Optionally, the previous analysis to integrate.
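As a minimal sketch of how some of these arguments fit together (the configuration values and the width/height parameter names are illustrative, matching the examples below):
from ray.tune.search.bayesopt import BayesOptSearch

bayesopt = BayesOptSearch(
    metric="mean_loss",
    mode="min",
    # Known-good configurations to evaluate before the Gaussian process takes over.
    points_to_evaluate=[
        {"width": 10.0, "height": 0.0},
        {"width": 15.0, "height": -20.0},
    ],
    # Acquisition function settings; these are the defaults spelled out explicitly.
    utility_kwargs={"kind": "ucb", "kappa": 2.576, "xi": 0.0},
    # Number of initial random suggestions before model-based suggestions begin.
    random_search_steps=10,
)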
Tune automatically converts search spaces to BayesOptSearch’s format:
from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

config = {
    "width": tune.uniform(0, 20),
    "height": tune.uniform(-100, 100)
}

bayesopt = BayesOptSearch(metric="mean_loss", mode="min")
tuner = tune.Tuner(
    my_func,
    tune_config=tune.TuneConfig(
        search_alg=bayesopt,
    ),
    param_space=config,
)
tuner.fit()
If you would like to pass the search space manually, the code would look like this:
from ray import tune
from ray.tune.search.bayesopt import BayesOptSearch

space = {
    'width': (0, 20),
    'height': (-100, 100),
}

bayesopt = BayesOptSearch(space, metric="mean_loss", mode="min")
tuner = tune.Tuner(
    my_func,
    tune_config=tune.TuneConfig(
        search_alg=bayesopt,
    ),
)
tuner.fit()
Methods
add_evaluated_point – Pass results from a point that has been evaluated separately.
add_evaluated_trials – Pass results from trials that have been evaluated separately.
on_trial_complete – Notification for the completion of a trial.
on_trial_result – Optional notification for result during training.
register_analysis – Integrate the given analysis into the Gaussian process.
restore – Restoring current optimizer state.
restore_from_dir – Restores the state of a searcher from a given checkpoint_dir.
save – Storing current optimizer state.
save_to_dir – Automatically saves the given searcher to the checkpoint_dir.
set_max_concurrency – Set max concurrent trials this searcher can run.
suggest – Return new point to be explored by black box function.
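As a sketch of how save and restore are typically paired (the checkpoint path is a placeholder; my_func and config are the same as in the examples above):
searcher = BayesOptSearch(metric="mean_loss", mode="min")
tuner = tune.Tuner(
    my_func,
    tune_config=tune.TuneConfig(search_alg=searcher),
    param_space=config,
)
tuner.fit()

# Persist the optimizer state...
searcher.save("./bayesopt-checkpoint.pkl")

# ...and load it into a fresh searcher later.
new_searcher = BayesOptSearch(metric="mean_loss", mode="min")
new_searcher.restore("./bayesopt-checkpoint.pkl")
suggest and on_trial_complete can also be driven by hand when evaluations happen outside of Tune. A sketch, where the trial ID and the quadratic objective are placeholders:
searcher = BayesOptSearch(
    space={"width": (0, 20), "height": (-100, 100)},
    metric="mean_loss",
    mode="min",
)
config = searcher.suggest("trial_1")  # new point from the optimizer
loss = (config["width"] - 10.0) ** 2  # placeholder black-box evaluation
searcher.on_trial_complete("trial_1", result={"mean_loss": loss})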
Attributes
metric – The training result objective value attribute.
mode – Specifies if minimizing or maximizing the metric.