ray.tune.search.Repeater#
- class ray.tune.search.Repeater(searcher: ray.tune.search.searcher.Searcher, repeat: int = 1, set_index: bool = True)[source]#
Bases: ray.tune.search.searcher.Searcher
A wrapper algorithm for repeating trials of the same parameters.
Set tune.TuneConfig(num_samples=…) to be a multiple of repeat. For example, set num_samples=15 if you intend to obtain 3 search algorithm suggestions and repeat each suggestion 5 times. Any leftover trials (num_samples mod repeat) will be ignored.
It is recommended that you do not run an early-stopping TrialScheduler simultaneously.
- Parameters
searcher – Searcher object that the Repeater will optimize. Note that the Searcher will only see 1 trial among multiple repeated trials. The result/metric passed to the Searcher upon trial completion will be averaged among all repeats.
repeat – Number of times to generate a trial with a repeated configuration. Defaults to 1.
set_index – Sets a tune.search.repeater.TRIAL_INDEX in the Trainable/Function config, corresponding to the index of the repeated trial. This can be used for seeds (see the sketch after the example below). Defaults to True.
Example:
from ray import tune
from ray.tune.search import Repeater
from ray.tune.search.bayesopt import BayesOptSearch

search_alg = BayesOptSearch(...)
re_search_alg = Repeater(search_alg, repeat=10)

# Repeat 2 samples 10 times each.
tuner = tune.Tuner(
    trainable,
    tune_config=tune.TuneConfig(
        search_alg=re_search_alg,
        num_samples=20,
    ),
)
tuner.fit()
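Because set_index injects TRIAL_INDEX into each repeated trial's config, a trainable can derive a per-repeat random seed from it. Below is a minimal sketch; the function trainable, the "x" parameter, and the "loss" metric are hypothetical, not part of this API:

import numpy as np
from ray.tune.search.repeater import TRIAL_INDEX

def trainable(config):
    # Each repeat of the same suggestion receives a distinct TRIAL_INDEX
    # (0, 1, ..., repeat - 1), usable here as a per-repeat seed.
    rng = np.random.default_rng(config[TRIAL_INDEX])
    # Hypothetical noisy objective around the suggested value of "x".
    return {"loss": config["x"] + rng.normal(scale=0.1)}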
PublicAPI: This API is stable across Ray releases.
- suggest(trial_id: str) → Optional[Dict][source]#
Queries the algorithm to retrieve the next set of parameters.
- Parameters
trial_id – Trial ID used for subsequent notifications.
- Returns
- Configuration for a trial, if possible. If FINISHED is returned, Tune will be notified that no more suggestions/configurations will be provided. If None is returned, Tune will skip querying the searcher for this step.
- Return type
dict | FINISHED | None
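The snippet below is a minimal standalone sketch of this behavior; the ToySearcher and its trial IDs are hypothetical, not part of Ray. Consecutive calls within a group of repeat trials return the same underlying configuration, differing only in the injected TRIAL_INDEX:

from typing import Dict, Optional
from ray.tune.search import Repeater, Searcher

class ToySearcher(Searcher):
    """Hypothetical searcher that suggests an incrementing parameter."""

    def __init__(self):
        super().__init__(metric="loss", mode="min")
        self._count = 0

    def suggest(self, trial_id: str) -> Optional[Dict]:
        self._count += 1
        return {"x": self._count}

    def on_trial_complete(self, trial_id: str, result: Optional[Dict] = None, **kwargs):
        pass

repeater = Repeater(ToySearcher(), repeat=3)
# The wrapped searcher is queried only once for these three trials;
# all of them share {"x": 1}, plus a distinct TRIAL_INDEX.
configs = [repeater.suggest(f"trial_{i}") for i in range(3)]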
- on_trial_complete(trial_id: str, result: Optional[Dict] = None, **kwargs)[source]#
Stores the score for and keeps track of a completed trial.
Stores the metric of a trial as nan if any of the following conditions are met:
- result is empty or not provided.
- result is provided but no metric was provided.
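Continuing the sketch above, the wrapped searcher sees a single completion per group, with the metric averaged over all repeats (the loss values here are hypothetical):

# Complete all three repeats of the group. Only once the last one finishes
# does the wrapped ToySearcher receive on_trial_complete, with the metric
# averaged over the repeats: loss = (0.2 + 0.4 + 0.6) / 3 = 0.4.
for i, loss in enumerate([0.2, 0.4, 0.6]):
    repeater.on_trial_complete(f"trial_{i}", result={"loss": loss})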
- save(checkpoint_path: str)[source]#
Save state to path for this search algorithm.
- Parameters
checkpoint_path – File where the search algorithm state is saved. This path should be used later when restoring from file.
Example:
search_alg = Searcher(...)
tuner = tune.Tuner(
    cost,
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5
    ),
    param_space=config
)
results = tuner.fit()
search_alg.save("./my_favorite_path.pkl")
Changed in version 0.8.7: Save is automatically called by Tuner().fit(). You can use Tuner().restore() to restore from an experiment directory such as /ray_results/trainable.
- restore(checkpoint_path: str)[source]#
Restore state for this search algorithm.
- Parameters
checkpoint_path – File where the search algorithm state is saved. This path should be the same as the one provided to "save".
Example:
search_alg.save("./my_favorite_path.pkl")

search_alg2 = Searcher(...)
search_alg2 = ConcurrencyLimiter(search_alg2, 1)
search_alg2.restore(checkpoint_path)

tuner = tune.Tuner(
    cost,
    tune_config=tune.TuneConfig(
        search_alg=search_alg2,
        num_samples=5
    ),
)
tuner.fit()
- set_search_properties(metric: Optional[str], mode: Optional[str], config: Dict, **spec) → bool[source]#
Pass search properties to searcher.
This method acts as an alternative to instantiating search algorithms with their own specific search spaces. Instead they can accept a Tune config through this method. A searcher should return True if setting the config was successful, or False if it was unsuccessful, e.g. when the search space has already been set.
- Parameters
metric – Metric to optimize.
mode – One of ["min", "max"]. Direction to optimize.
config – Tune config dict.
**spec – Any kwargs for forward compatibility. Info like Experiment.PUBLIC_KEYS is provided through here.
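In practice Tune calls this method itself: metric and mode passed to tune.TuneConfig are forwarded to the searcher before the run starts. A minimal sketch, assuming the hypothetical trainable and "loss" metric from the class-level example above:

from ray import tune
from ray.tune.search import Repeater
from ray.tune.search.hyperopt import HyperOptSearch

# No metric/mode given to the searcher here; Tune forwards the TuneConfig
# values through set_search_properties, and the search space is derived
# from param_space.
tuner = tune.Tuner(
    trainable,
    tune_config=tune.TuneConfig(
        search_alg=Repeater(HyperOptSearch(), repeat=3),
        metric="loss",
        mode="min",
        num_samples=6,  # a multiple of repeat
    ),
    param_space={"x": tune.uniform(0.0, 1.0)},
)
results = tuner.fit()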