ray.tune.search.sigopt.SigOptSearch#
- class ray.tune.search.sigopt.SigOptSearch(space: Optional[List[Dict]] = None, name: str = 'Default Tune Experiment', max_concurrent: int = 1, connection: None = None, experiment_id: Optional[str] = None, observation_budget: Optional[int] = None, project: Optional[str] = None, metric: Optional[Union[str, List[str]]] = 'episode_reward_mean', mode: Optional[Union[str, List[str]]] = 'max', points_to_evaluate: Optional[List[Dict]] = None, **kwargs)[source]#
Bases: ray.tune.search.searcher.Searcher
A wrapper around SigOpt to provide trial suggestions.
You must install SigOpt and have a SigOpt API key to use this module. Store the API token as an environment variable SIGOPT_KEY as follows:

```shell
pip install -U sigopt
export SIGOPT_KEY= ...
```
You will need to use the SigOpt experiment and space specification.
This searcher manages its own concurrency. If this Searcher is used in a ConcurrencyLimiter, the max_concurrent value passed to it will override the value passed here.
- Parameters
space – SigOpt configuration. Parameters will be sampled from this configuration and will be used to override parameters generated in the variant generation process. Not used if existing experiment_id is given
name – Name of experiment. Required by SigOpt.
max_concurrent – Number of maximum concurrent trials supported based on the user’s SigOpt plan. Defaults to 1. If this Searcher is used in a ConcurrencyLimiter, the max_concurrent value passed to it will override the value passed here.
connection – An existing connection to SigOpt.
experiment_id – Optional, if given will connect to an existing experiment. This allows for a more interactive experience with SigOpt, such as prior beliefs and constraints.
observation_budget – Optional, can improve SigOpt performance.
project – Optional. Project name to assign this experiment to; SigOpt can group experiments by project.
metric (str or list(str)) – If str then the training result objective value attribute. If list(str) then a list of metrics that can be optimized together. SigOpt currently supports up to 2 metrics.
mode – If experiment_id is given, this field is ignored. If str, must be one of {min, max}. If list, must be comprised of {min, max, obs}. Determines whether the objective is to minimize or maximize the metric attribute. If metric is a list, mode must be a list of the same length.
Example:

```python
space = [
    {
        'name': 'width',
        'type': 'int',
        'bounds': {'min': 0, 'max': 20},
    },
    {
        'name': 'height',
        'type': 'int',
        'bounds': {'min': -100, 'max': 100},
    },
]

algo = SigOptSearch(
    space, name="SigOpt Example Experiment",
    metric="mean_loss", mode="min")
```

Example:

```python
space = [
    {
        'name': 'width',
        'type': 'int',
        'bounds': {'min': 0, 'max': 20},
    },
    {
        'name': 'height',
        'type': 'int',
        'bounds': {'min': -100, 'max': 100},
    },
]

algo = SigOptSearch(
    space, name="SigOpt Multi Objective Example Experiment",
    metric=["average", "std"], mode=["max", "min"])
```
- set_search_properties(metric: Optional[str], mode: Optional[str], config: Dict, **spec) bool [source]#
Pass search properties to searcher.
This method acts as an alternative to instantiating search algorithms with their own specific search spaces. Instead they can accept a Tune config through this method. A searcher should return True if setting the config was successful, or False if it was unsuccessful, e.g. when the search space has already been set.
- Parameters
metric – Metric to optimize
mode – One of [“min”, “max”]. Direction to optimize.
config – Tune config dict.
**spec – Any kwargs for forward compatibility. Info like Experiment.PUBLIC_KEYS is provided through here.
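The return contract above can be illustrated with a minimal sketch. This is not the SigOpt implementation; SketchSearcher and its attributes are hypothetical, standing in for any searcher that refuses a second search space:

```python
class SketchSearcher:
    """Hypothetical searcher illustrating the set_search_properties contract."""

    def __init__(self):
        self._config = None
        self._metric = None
        self._mode = None

    def set_search_properties(self, metric, mode, config, **spec):
        if self._config is not None:
            # The search space has already been set: report failure.
            return False
        self._config = config
        self._metric = metric
        self._mode = mode
        return True


searcher = SketchSearcher()
print(searcher.set_search_properties("mean_loss", "min", {"width": 10}))  # True
print(searcher.set_search_properties("mean_loss", "min", {"width": 5}))   # False
```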
- set_max_concurrency(max_concurrent: int) bool [source]#
Set max concurrent trials this searcher can run.
This method will be called on the wrapped searcher by the ConcurrencyLimiter. It is intended to allow searchers that have custom, internal logic for handling max concurrent trials to inherit the value passed to ConcurrencyLimiter. If this method returns False, it signifies that no special logic for handling this case is present in the searcher.
- Parameters
max_concurrent – Number of maximum concurrent trials.
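To sketch how this hand-off works (class names here are illustrative, not Ray's actual implementation): a searcher that manages its own concurrency, as SigOptSearch does, returns True and adopts the new value, so the wrapper only enforces the limit itself when the searcher returns False:

```python
class SelfLimitingSearcher:
    """Hypothetical searcher that handles concurrency internally."""

    def __init__(self, max_concurrent=1):
        self.max_concurrent = max_concurrent

    def set_max_concurrency(self, max_concurrent):
        # Adopt the wrapper's value and report that we handle it ourselves.
        self.max_concurrent = max_concurrent
        return True


class SketchLimiter:
    """Hypothetical stand-in for a ConcurrencyLimiter-style wrapper."""

    def __init__(self, searcher, max_concurrent):
        self.searcher = searcher
        # If the searcher took over the limit, the wrapper need not enforce it.
        self.enforce_limit = not searcher.set_max_concurrency(max_concurrent)


limiter = SketchLimiter(SelfLimitingSearcher(), max_concurrent=4)
print(limiter.searcher.max_concurrent)  # 4
print(limiter.enforce_limit)            # False
```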
- suggest(trial_id: str)[source]#
Queries the algorithm to retrieve the next set of parameters.
- Parameters
trial_id – Trial ID used for subsequent notifications.
- Returns
- Configuration for a trial, if possible.
If FINISHED is returned, Tune will be notified that no more suggestions/configurations will be provided. If None is returned, Tune will skip the querying of the searcher for this step.
- Return type
dict | FINISHED | None
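The return contract can be sketched with a toy searcher. BudgetedSearcher and the FINISHED string below are illustrative only; Tune's real sentinel is an attribute of the Searcher class:

```python
FINISHED = "FINISHED"  # illustrative sentinel, not Tune's actual constant


class BudgetedSearcher:
    """Toy searcher showing the suggest() return contract."""

    def __init__(self, budget):
        self.budget = budget
        self.issued = 0

    def suggest(self, trial_id):
        if self.issued >= self.budget:
            # Signal that no more suggestions will ever be provided.
            return FINISHED
        self.issued += 1
        # Otherwise return a configuration dict for this trial.
        return {"width": self.issued * 5}


searcher = BudgetedSearcher(budget=2)
print(searcher.suggest("trial_1"))  # {'width': 5}
print(searcher.suggest("trial_2"))  # {'width': 10}
print(searcher.suggest("trial_3"))  # FINISHED
```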
- on_trial_complete(trial_id: str, result: Optional[Dict] = None, error: bool = False)[source]#
Notification for the completion of a trial. Creates a SigOpt Observation object for the trial.
If a trial fails, it is reported as a failed Observation, telling the optimizer that the Suggestion led to a metric failure, which updates the feasible region and improves parameter recommendations.
- static serialize_metric(metrics: List[str], modes: List[str])[source]#
Converts metrics to the SigOpt metric format described at https://app.sigopt.com/docs/objects/metric.
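A rough sketch of the kind of conversion this performs. The field names ("name", "objective", "strategy") follow the SigOpt metric object; the exact mapping, in particular how "obs" metrics are stored, is an assumption, not the actual implementation:

```python
def serialize_metric_sketch(metrics, modes):
    """Convert metric names and modes into SigOpt-style metric dicts.

    Assumption: "max"/"min" map to maximize/minimize objectives, and
    "obs" metrics are stored but not optimized.
    """
    serialized = []
    for name, mode in zip(metrics, modes):
        entry = {"name": name}
        if mode == "max":
            entry["objective"] = "maximize"
        elif mode == "min":
            entry["objective"] = "minimize"
        else:  # "obs": observed/stored only, not optimized
            entry["strategy"] = "store"
        serialized.append(entry)
    return serialized


print(serialize_metric_sketch(["average", "std"], ["max", "min"]))
# [{'name': 'average', 'objective': 'maximize'},
#  {'name': 'std', 'objective': 'minimize'}]
```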
- serialize_result(result: Dict)[source]#
Converts experiment results to the SigOpt metric evaluation format described at https://app.sigopt.com/docs/objects/metric_evaluation.
- save(checkpoint_path: str)[source]#
Save state to path for this search algorithm.
- Parameters
checkpoint_path – File where the search algorithm state is saved. This path should be used later when restoring from file.
Example:

```python
search_alg = Searcher(...)
tuner = tune.Tuner(
    cost,
    tune_config=tune.TuneConfig(
        search_alg=search_alg,
        num_samples=5
    ),
    param_space=config
)
results = tuner.fit()
search_alg.save("./my_favorite_path.pkl")
```
Changed in version 0.8.7: Save is automatically called by Tuner().fit(). You can use Tuner().restore() to restore from an experiment directory such as /ray_results/trainable.
- restore(checkpoint_path: str)[source]#
Restore state for this search algorithm.
- Parameters
checkpoint_path – File where the search algorithm state is saved. This path should be the same as the one provided to “save”.
Example:

```python
search_alg.save("./my_favorite_path.pkl")

search_alg2 = Searcher(...)
search_alg2 = ConcurrencyLimiter(search_alg2, 1)
search_alg2.restore(checkpoint_path)
tuner = tune.Tuner(
    cost,
    tune_config=tune.TuneConfig(
        search_alg=search_alg2,
        num_samples=5
    ),
)
tuner.fit()
```