Tune Search Algorithms

Tune provides various hyperparameter search algorithms to efficiently optimize your model. You can use these search algorithms in combination with different trial schedulers. By default, Tune implicitly uses the Variant Generation algorithm to create trials.

You can utilize these search algorithms as follows:

tune.run(my_function, search_alg=SearchAlgorithm(...))
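
For reference, the default Variant Generation approach declares the search space directly in the config passed to tune.run. Below is a minimal sketch under that assumption (my_function is the placeholder trainable from the snippet above):

import random

from ray import tune

# With the default variant generation, the search space is declared
# directly in the config dict passed to tune.run.
tune.run(
    my_function,
    config={
        "lr": tune.grid_search([0.001, 0.01, 0.1]),
        "momentum": tune.sample_from(lambda spec: random.uniform(0.1, 0.9)),
    },
    num_samples=2)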

Currently, Tune offers the following search algorithms (and library integrations):

Repeated Evaluations

Use ray.tune.suggest.Repeater to average over multiple evaluations of the same hyperparameter configuration. This is useful in cases where the evaluated training procedure has high variance (e.g., in reinforcement learning).

Repeater takes in a repeat parameter and a search_alg. The search_alg suggests new configurations to try, and the Repeater runs repeat trials of each configuration. It then averages the search_alg.metric from the final results of each repeated trial.

See the API documentation (Repeater) for more details.

from ray import tune
from ray.tune.suggest import Repeater
from ray.tune.suggest.bayesopt import BayesOptSearch

search_alg = BayesOptSearch(...)
re_search_alg = Repeater(search_alg, repeat=10)

# Repeat 2 samples 10 times each.
tune.run(trainable, num_samples=20, search_alg=re_search_alg)

Note

This does not apply to grid search and random search.

Warning

It is recommended not to use Repeater with a TrialScheduler, as early termination can negatively affect the average reported metric.

HyperOpt Search (Tree-structured Parzen Estimators)

HyperOptSearch is a SearchAlgorithm backed by HyperOpt that performs sequential model-based hyperparameter optimization. Note that this class does not extend ray.tune.suggest.BasicVariantGenerator, so you will not be able to use Tune’s default variant generation/search space declaration when using HyperOptSearch.

In order to use this search algorithm, you will need to install HyperOpt via the following command:

$ pip install --upgrade git+https://github.com/hyperopt/hyperopt.git

This algorithm requires using the HyperOpt search space specification. You can use HyperOptSearch as follows:

tune.run(... , search_alg=HyperOptSearch(hyperopt_space, ... ))

An example of this can be found in hyperopt_example.py.

class ray.tune.suggest.hyperopt.HyperOptSearch(space, metric='episode_reward_mean', mode='max', points_to_evaluate=None, n_initial_points=20, random_state_seed=None, gamma=0.25, max_concurrent=None, use_early_stopped_trials=None)[source]

Bases: ray.tune.suggest.suggestion.Searcher

A wrapper around HyperOpt to provide trial suggestions.

Requires HyperOpt to be installed from source. Uses the Tree-structured Parzen Estimators algorithm, although it can be trivially extended to support any algorithm HyperOpt supports. Externally added trials will not be tracked by HyperOpt. Trials of the current run can be saved using the save method, and trials of a previous run can be loaded using the restore method, enabling a warm-start feature.

Parameters
  • space (dict) – HyperOpt configuration. Parameters will be sampled from this configuration and will be used to override parameters generated in the variant generation process.

  • metric (str) – The training result objective value attribute.

  • mode (str) – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

  • points_to_evaluate (list) – Initial parameter suggestions to be run first. This is useful when you already have some good parameters you want HyperOpt to run first, to help the TPE algorithm make better suggestions for future parameters. Needs to be a list of dicts of HyperOpt-named variables. Choice variables should be indicated by their index in the list (see example).

  • n_initial_points (int) – number of random evaluations of the objective function before starting to approximate it with Tree-structured Parzen Estimators. Defaults to 20.

  • random_state_seed (int, array_like, None) – seed for reproducible results. Defaults to None.

  • gamma (float in range (0,1)) – parameter governing the tree parzen estimators suggestion algorithm. Defaults to 0.25.

  • max_concurrent – Deprecated.

  • use_early_stopped_trials – Deprecated.

from hyperopt import hp

from ray.tune.suggest.hyperopt import HyperOptSearch

space = {
    'width': hp.uniform('width', 0, 20),
    'height': hp.uniform('height', -100, 100),
    'activation': hp.choice("activation", ["relu", "tanh"])
}
current_best_params = [{
    'width': 10,
    'height': 0,
    'activation': 0, # The index of "relu"
}]
algo = HyperOptSearch(
    space, metric="mean_loss", mode="min",
    points_to_evaluate=current_best_params)
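
The configured searcher is then passed to tune.run like any other search algorithm. Below is a minimal sketch, assuming a hypothetical trainable easy_objective that reports a mean_loss result:

from ray import tune

def easy_objective(config, reporter):
    # Hypothetical trainable: reports a "mean_loss" value for the
    # searcher to minimize.
    loss = (config["height"] - 14) ** 2 - abs(config["width"] - 3)
    reporter(mean_loss=loss)

tune.run(easy_objective, search_alg=algo, num_samples=10)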

BOHB

Tip

This implementation is still experimental. Please report issues on https://github.com/ray-project/ray/issues/. Thanks!

BOHB (Bayesian Optimization HyperBand) is a SearchAlgorithm that is backed by HpBandSter to perform sequential model-based hyperparameter optimization in conjunction with HyperBand. Note that this class does not extend ray.tune.suggest.BasicVariantGenerator, so you will not be able to use Tune’s default variant generation/search space declaration when using BOHB.

Importantly, BOHB is intended to be paired with a specific scheduler class: HyperBandForBOHB.

This algorithm requires using the ConfigSpace search space specification. In order to use this search algorithm, you will need to install HpBandSter and ConfigSpace:

$ pip install hpbandster ConfigSpace

You can use TuneBOHB in conjunction with HyperBandForBOHB as follows:

# BOHB uses ConfigSpace for its hyperparameter search space
import ConfigSpace as CS

from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.suggest.bohb import TuneBOHB

config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter("height", lower=10, upper=100))
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter("width", lower=0, upper=100))

experiment_metrics = dict(metric="episode_reward_mean", mode="max")
bohb_hyperband = HyperBandForBOHB(
    time_attr="training_iteration", max_t=100, **experiment_metrics)
bohb_search = TuneBOHB(
    config_space, max_concurrent=4, **experiment_metrics)

tune.run(MyTrainableClass,
    name="bohb_test",
    scheduler=bohb_hyperband,
    search_alg=bohb_search,
    num_samples=5)

Take a look at an example here. See the BOHB paper for more details.

class ray.tune.suggest.bohb.TuneBOHB(space, bohb_config=None, max_concurrent=10, metric='neg_mean_loss', mode='max')[source]

Bases: ray.tune.suggest.suggestion.Searcher

BOHB suggestion component.

Requires HpBandSter and ConfigSpace to be installed. You can install HpBandSter and ConfigSpace with: pip install hpbandster ConfigSpace.

This should be used in conjunction with HyperBandForBOHB.

Parameters
  • space (ConfigurationSpace) – Continuous ConfigSpace search space. Parameters will be sampled from this space and used to run trials.

  • bohb_config (dict) – configuration for HpBandSter BOHB algorithm

  • max_concurrent (int) – Number of maximum concurrent trials. Defaults to 10.

  • metric (str) – The training result objective value attribute.

  • mode (str) – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

Example:

import ConfigSpace as CS

from ray import tune
from ray.tune.schedulers import HyperBandForBOHB
from ray.tune.suggest.bohb import TuneBOHB

config_space = CS.ConfigurationSpace()
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter('width', lower=0, upper=20))
config_space.add_hyperparameter(
    CS.UniformFloatHyperparameter('height', lower=-100, upper=100))
config_space.add_hyperparameter(
    CS.CategoricalHyperparameter(
        name='activation', choices=['relu', 'tanh']))

algo = TuneBOHB(
    config_space, max_concurrent=4, metric='mean_loss', mode='min')
bohb = HyperBandForBOHB(
    time_attr='training_iteration',
    metric='mean_loss',
    mode='min',
    max_t=100)
tune.run(MyTrainableClass, scheduler=bohb, search_alg=algo)

Contributing a New Algorithm

If you are interested in implementing or contributing a new Search Algorithm, the API is straightforward:

class ray.tune.suggest.SearchAlgorithm[source]

Interface of an event handler API for hyperparameter search.

Unlike TrialSchedulers, SearchAlgorithms will not have the ability to modify the execution (i.e., stop and pause trials).

Trials added manually (i.e., via the Client API) will also notify this class upon new events, so custom search algorithms should maintain a list of the trial IDs generated by this class.

See also: ray.tune.suggest.BasicVariantGenerator.

add_configurations(experiments)[source]

Tracks given experiment specifications.

Parameters

experiments (Experiment | list | dict) – Experiments to run.

next_trials()[source]

Provides Trial objects to be queued into the TrialRunner.

Returns

Returns a list of trials.

Return type

trials (list)

on_trial_result(trial_id, result)[source]

Called on each intermediate result returned by a trial.

This will only be called when the trial is in the RUNNING state.

Parameters
  • trial_id – Identifier for the trial.

  • result (dict) – Intermediate training result reported by the trial.

on_trial_complete(trial_id, result=None, error=False)[source]

Notification for the completion of a trial.

Parameters
  • trial_id – Identifier for the trial.

  • result (dict) – Defaults to None. A dict will be provided with this notification when the trial is in the RUNNING state AND either completes naturally or by manual termination.

  • error (bool) – Defaults to False. True if the trial is in the RUNNING state and errors.

is_finished()[source]

Returns True if no trials left to be queued into TrialRunner.

Can return True before all trials have finished executing.

set_finished()[source]

Marks the search algorithm as finished.
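
For illustration, here is a minimal skeleton of this interface. It is a sketch only: it tracks state and leaves next_trials as a stub, whereas a complete implementation would construct Trial objects from the tracked experiment specifications (see BasicVariantGenerator for a full example).

from ray.tune.suggest import SearchAlgorithm

class SkeletonSearchAlg(SearchAlgorithm):
    """Hypothetical skeleton showing the event-handler hooks."""

    def __init__(self):
        self._experiments = []
        self._completed = set()
        self._finished = False

    def add_configurations(self, experiments):
        # Track the experiment specifications that trials will be
        # generated from.
        self._experiments.append(experiments)

    def next_trials(self):
        # A real implementation returns a list of Trial objects built
        # from the tracked experiments; this stub queues nothing.
        return []

    def on_trial_result(self, trial_id, result):
        pass  # observe intermediate results if the algorithm needs them

    def on_trial_complete(self, trial_id, result=None, error=False):
        self._completed.add(trial_id)

    def is_finished(self):
        return self._finished

    def set_finished(self):
        self._finished = True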

Model-Based Suggestion Algorithms

Oftentimes, hyperparameter search algorithms are model-based and may be quite simple to implement. For this, one can extend the following abstract class and implement on_trial_complete and suggest.

class ray.tune.suggest.Searcher(metric='episode_reward_mean', mode='max', max_concurrent=None, use_early_stopped_trials=None)[source]

Bases: object

Abstract class for wrapping suggesting algorithms.

Custom algorithms can extend this class easily by overriding the suggest method to provide generated parameters for the trials.

Any subclass that implements __init__ must also call the constructor of this class: super(Subclass, self).__init__(...).

To track suggestions and their corresponding evaluations, the method suggest will be passed a trial_id, which will be used in subsequent notifications.

Parameters
  • metric (str) – The training result objective value attribute.

  • mode (str) – One of {min, max}. Determines whether objective is minimizing or maximizing the metric attribute.

from ray import tune
from ray.tune.suggest import Searcher

class ExampleSearch(Searcher):
    def __init__(self, metric="mean_loss", mode="min", **kwargs):
        super(ExampleSearch, self).__init__(
            metric=metric, mode=mode, **kwargs)
        self.optimizer = Optimizer()  # placeholder for your own optimizer
        self.configurations = {}

    def suggest(self, trial_id):
        configuration = self.optimizer.query()
        self.configurations[trial_id] = configuration
        return configuration

    def on_trial_complete(self, trial_id, result, **kwargs):
        configuration = self.configurations[trial_id]
        if result and self.metric in result:
            self.optimizer.update(configuration, result[self.metric])

tune.run(trainable_function, search_alg=ExampleSearch())