class ray.tune.schedulers.AsyncHyperBandScheduler(time_attr: str = 'training_iteration', metric: str | None = None, mode: str | None = None, max_t: int = 100, grace_period: int = 1, reduction_factor: float = 4, brackets: int = 1, stop_last_trials: bool = True)[source]#

Bases: FIFOScheduler

Implements the Async Successive Halving Algorithm (ASHA).

This should provide similar theoretical performance to HyperBand but avoid the straggler issues that HyperBand faces. One implementation detail: when multiple brackets are used, trials are allocated to a bracket randomly, weighted by a softmax probability.

See https://arxiv.org/abs/1810.05934
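The softmax-weighted bracket allocation mentioned above can be sketched as follows. This is a simplified illustration, not Ray's internal code; `choose_bracket` and the choice of rung counts as softmax logits are assumptions about the general scheme:

```python
import math
import random

def choose_bracket(rung_counts, rng=random):
    """Pick a bracket index with probability softmax(rung_counts).

    Hypothetical helper: each bracket is weighted by a softmax over a
    per-bracket statistic (here, its number of rungs), so allocation is
    random but biased toward larger weights.
    """
    m = max(rung_counts)  # subtract the max for numerical stability
    weights = [math.exp(c - m) for c in rung_counts]
    return rng.choices(range(len(rung_counts)), weights=weights, k=1)[0]
```

With a single bracket the choice is deterministic; with several, brackets with larger weights are sampled proportionally more often.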

  • time_attr – A training result attr to use for comparing time. Note that you can pass in something non-temporal such as training_iteration as a measure of progress, the only requirement is that the attribute should increase monotonically.

  • metric – The training result objective value attribute. Stopping procedures will use this attribute. If None but a mode was passed, ray.tune.result.DEFAULT_METRIC will be used by default.

  • mode – One of {min, max}. Determines whether the objective is to minimize or maximize the metric attribute.

  • max_t – Max time units per trial. Trials will be stopped after max_t time units (determined by time_attr) have passed.

  • grace_period – Only stop trials that are at least this old in time. The units are the same as the attribute named by time_attr.

  • reduction_factor – Used to set halving rate and amount. This is simply a unit-less scalar.

  • brackets – Number of brackets. Each bracket has a different halving rate, specified by the reduction factor.

  • stop_last_trials – Whether to terminate the trials after reaching max_t. Defaults to True.
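To make the interaction of grace_period, reduction_factor, and max_t concrete, here is a sketch of how ASHA-style rung milestones follow from them. This is an illustrative computation under the standard ASHA rule, not Ray's internal implementation; `rung_milestones` is a hypothetical helper:

```python
def rung_milestones(max_t=100, grace_period=1, reduction_factor=4):
    """Return the time_attr values at which trials are evaluated.

    Under the ASHA rule, rungs sit at grace_period * reduction_factor**k
    for k = 0, 1, ..., capped at max_t. At each rung, roughly only the
    top 1/reduction_factor of trials are allowed to continue.
    """
    milestones = []
    t = grace_period
    while t <= max_t:
        milestones.append(t)
        t *= reduction_factor
    return milestones
```

With the defaults shown in the signature above, evaluation points land at 1, 4, 16, and 64 training iterations, and at each point about a quarter of the trials survive.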

set_search_properties(metric: str | None, mode: str | None, **spec) bool[source]#

Pass search properties to scheduler.

This method acts as an alternative to instantiating schedulers that react to metrics with their own metric and mode parameters.

  • metric – Metric to optimize

  • mode – One of [“min”, “max”]. Direction to optimize.

  • **spec – Any kwargs for forward compatibility. Info like Experiment.PUBLIC_KEYS is provided through here.

on_trial_add(tune_controller: TuneController, trial: Trial)[source]#

Called when a new trial is added to the trial runner.

on_trial_result(tune_controller: TuneController, trial: Trial, result: Dict) str[source]#

Called on each intermediate result returned by a trial.

At this point, the trial scheduler can make a decision by returning one of CONTINUE, PAUSE, or STOP. This will only be called when the trial is in the RUNNING state.
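A minimal sketch of the kind of decision made on each intermediate result (heavily simplified; the real scheduler tracks per-rung results across the whole bracket, and `on_result_sketch`, `rung_t`, and `cutoff_fraction` are hypothetical names — only the CONTINUE/STOP string decisions come from the TrialScheduler interface):

```python
CONTINUE, STOP = "CONTINUE", "STOP"

def on_result_sketch(t, value, rung_t, rung_values,
                     cutoff_fraction=0.25, mode="max"):
    """Decide whether a trial continues past a rung milestone.

    If the trial has not yet reached the milestone rung_t, it continues.
    Otherwise its result is recorded at the rung and compared against
    the other recorded results; only roughly the top cutoff_fraction
    (i.e. 1 / reduction_factor) of trials continue.
    """
    if t < rung_t:
        return CONTINUE  # milestone not reached yet
    rung_values.append(value)
    ranked = sorted(rung_values, reverse=(mode == "max"))
    cutoff_index = max(1, int(len(ranked) * cutoff_fraction))
    cutoff = ranked[cutoff_index - 1]
    better = value >= cutoff if mode == "max" else value <= cutoff
    return CONTINUE if better else STOP
```

A trial below the milestone always continues; once it reports at the milestone, a poor result relative to its peers gets it stopped early, which is the source of ASHA's speedup over synchronous halving.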

on_trial_complete(tune_controller: TuneController, trial: Trial, result: Dict)[source]#

Notification for the completion of a trial.

This will only be called when the trial is in the RUNNING state and either completes naturally or by manual termination.

on_trial_remove(tune_controller: TuneController, trial: Trial)[source]#

Called to remove a trial.

This is called when the trial is in PAUSED or PENDING state. Otherwise, call on_trial_complete.

debug_string() str[source]#

Returns a human readable message for printing to the console.

save(checkpoint_path: str)[source]#

Save trial scheduler to a checkpoint.

restore(checkpoint_path: str)[source]#

Restore trial scheduler from checkpoint.
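The save/restore pair round-trips the scheduler's internal state through a checkpoint file. A minimal sketch of the pattern (the use of pickle here is an assumption about the mechanism, and `SchedulerStub` is a hypothetical stand-in; the actual checkpoint format is an implementation detail):

```python
import pickle

class SchedulerStub:
    """Hypothetical stand-in for a trial scheduler with checkpointable state."""

    def __init__(self):
        self.num_stopped = 0  # example piece of mutable scheduler state

    def save(self, checkpoint_path):
        # Serialize the full instance state to the checkpoint file.
        with open(checkpoint_path, "wb") as f:
            pickle.dump(self.__dict__, f)

    def restore(self, checkpoint_path):
        # Load the saved state back into this instance.
        with open(checkpoint_path, "rb") as f:
            self.__dict__.update(pickle.load(f))
```

Calling save on a running scheduler and restore on a fresh instance lets an experiment resume with the same bracket and rung bookkeeping it had at checkpoint time.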