ray.tune.Experiment
- class ray.tune.Experiment(name: str, run: Union[str, Callable, Type], *, stop: Optional[Union[Mapping, ray.tune.stopper.stopper.Stopper, Callable[[str, Mapping], bool]]] = None, time_budget_s: Optional[Union[int, float, datetime.timedelta]] = None, config: Optional[Dict[str, Any]] = None, resources_per_trial: Union[None, Mapping[str, Union[float, int, Mapping]], PlacementGroupFactory] = None, num_samples: int = 1, local_dir: Optional[str] = None, _experiment_checkpoint_dir: Optional[str] = None, sync_config: Optional[Union[ray.tune.syncer.SyncConfig, dict]] = None, checkpoint_config: Optional[Union[ray.air.config.CheckpointConfig, dict]] = None, trial_name_creator: Optional[Callable[[Trial], str]] = None, trial_dirname_creator: Optional[Callable[[Trial], str]] = None, log_to_file: bool = False, export_formats: Optional[Sequence] = None, max_failures: int = 0, restore: Optional[str] = None)
Bases: object
Tracks experiment specifications.
Implicitly registers the Trainable if needed. The arguments here take the same meaning as the arguments defined in tune.py:run.
```python
experiment_spec = Experiment(
    "my_experiment_name",
    my_func,
    stop={"mean_accuracy": 100},
    config={
        "alpha": tune.grid_search([0.2, 0.4, 0.6]),
        "beta": tune.grid_search([1, 2]),
    },
    resources_per_trial={"cpu": 1, "gpu": 0},
    num_samples=10,
    local_dir="~/ray_results",
    checkpoint_freq=10,
    max_failures=2,
)
```
- Parameters
TODO (xwjiang) – Add the whole list.
_experiment_checkpoint_dir – Internal use only. If present, this is used as the root directory for the experiment checkpoint. If not present, the directory path is deduced from the trainable name instead.
DeveloperAPI: This API may change across minor Ray releases.
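For context, a minimal runnable sketch of passing an Experiment to tune.run; my_func here is an assumed toy trainable that reports mean_accuracy through ray.air.session:

```python
from ray import tune
from ray.air import session
from ray.tune import Experiment

def my_func(config):
    # Toy trainable: report a metric derived from the sampled hyperparameters.
    session.report({"mean_accuracy": config["alpha"] * config["beta"]})

experiment_spec = Experiment(
    "my_experiment_name",
    my_func,
    stop={"training_iteration": 1},
    config={
        "alpha": tune.grid_search([0.2, 0.4]),
        "beta": tune.grid_search([1, 2]),
    },
)

# tune.run accepts an Experiment (or a list of Experiments) as its first argument.
analysis = tune.run(experiment_spec)
```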
- classmethod from_json(name: str, spec: dict)
Generates an Experiment object from JSON.
- Parameters
name – Name of Experiment.
spec – JSON configuration of experiment.
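A hedged sketch, assuming the spec dict mirrors the constructor arguments with the trainable supplied under a "run" key:

```python
from ray.tune import Experiment

def my_func(config):
    ...  # placeholder trainable

spec = {
    "run": my_func,                      # trainable object or registered name
    "stop": {"training_iteration": 5},
    "config": {"lr": 0.01},
    "num_samples": 2,
}
experiment = Experiment.from_json("my_experiment_name", spec)
```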
- classmethod get_trainable_name(run_object: Union[str, Callable, Type])
Get Trainable name.
- Parameters
run_object – Trainable to run. If a string, it is assumed to be an identifier and is returned unmodified. Otherwise, a string corresponding to the run_object's name is derived.
- Returns
A string representing the trainable identifier.
- Raises
TuneError – if the run_object passed in is invalid.
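A hedged illustration with a plain function; the exact name derived for non-string trainables may vary with how they are defined:

```python
from ray.tune import Experiment

def train_fn(config):
    ...  # placeholder trainable

# Strings are treated as already-registered identifiers and returned unchanged.
assert Experiment.get_trainable_name("my_trainable") == "my_trainable"

# For a function or class, a name is derived from the object itself.
print(Experiment.get_trainable_name(train_fn))  # typically "train_fn"
```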
- classmethod register_if_needed(run_object: Union[str, Callable, Type])
Registers Trainable or Function at runtime.
Assumes the Trainable is already registered if run_object is a string. Does not inspect the interface of the given run_object.
- Parameters
run_object – Trainable to run. If a string, it is assumed to be an already-registered identifier and is not modified. Otherwise, the run_object is registered and a string corresponding to its name is returned.
- Returns
A string representing the trainable identifier.
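A hedged sketch of runtime registration; the returned identifier can then stand in for the function itself:

```python
from ray.tune import Experiment

def train_fn(config):
    ...  # placeholder trainable

# Registers the function with Tune's trainable registry and returns its identifier.
trainable_id = Experiment.register_if_needed(train_fn)
print(trainable_id)  # typically "train_fn"

# Strings are assumed to be registered already and are returned as-is.
assert Experiment.register_if_needed(trainable_id) == trainable_id
```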
- classmethod get_experiment_checkpoint_dir(run_obj: Union[str, Callable, Type], local_dir: Optional[str] = None, name: Optional[str] = None)
Get experiment checkpoint dir without setting up an experiment.
This is only used internally for better support of Tuner API.
- Parameters
run_obj – Trainable to run.
local_dir – The local directory where experiment results are stored.
name – The name of the experiment specified by the user.
- Returns
Checkpoint directory for experiment.
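A hedged example of resolving the directory up front; the exact path layout is an implementation detail and may differ across Ray versions:

```python
from ray.tune import Experiment

def train_fn(config):
    ...  # placeholder trainable

path = Experiment.get_experiment_checkpoint_dir(
    train_fn, local_dir="~/ray_results", name="my_experiment_name"
)
print(path)  # e.g. "/home/<user>/ray_results/my_experiment_name"
```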
- property run_identifier
Returns a string representing the trainable identifier.
- property public_spec: Dict[str, Any]
Returns the spec dict with only the public-facing keys.
Intended to be used for passing information to callbacks, Searchers and Schedulers.
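A hedged sketch of reading both properties from a constructed Experiment; the exact set of public keys is version-dependent:

```python
from ray import tune
from ray.tune import Experiment

def train_fn(config):
    ...  # placeholder trainable

experiment = Experiment(
    "my_experiment_name",
    train_fn,
    config={"lr": tune.uniform(1e-3, 1e-1)},
    num_samples=2,
)

print(experiment.run_identifier)      # string identifier of the registered trainable
print(list(experiment.public_spec))   # public-facing subset of the full spec dict
```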