ray.tune.Trainable.save_checkpoint

Trainable.save_checkpoint(checkpoint_dir: str) -> Dict | None

Subclasses should override this to implement save().

Warning

Do not rely on absolute paths in the implementation of Trainable.save_checkpoint and Trainable.load_checkpoint.

Use validate_save_restore to catch Trainable.save_checkpoint / Trainable.load_checkpoint errors before execution.

>>> from ray.tune.utils import validate_save_restore
>>> MyTrainableClass = ... 
>>> validate_save_restore(MyTrainableClass) 

Added in version 0.8.7.

Parameters:

checkpoint_dir – The directory where the checkpoint files must be stored. In a Tune run, if the trial is paused, the provided path may be a temporary directory that is later moved.

Returns:

A dict or None. If dict, the return value will be automatically serialized by Tune. In that case, Trainable.load_checkpoint() will receive the dict upon restore.
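
For the dict-based contract described above, a sketch might look like the following (the class name and dict keys are illustrative): the dict returned from save_checkpoint is serialized by Tune and handed back to Trainable.load_checkpoint() on restore, so no files need to be written manually.

from ray import tune


class DictCheckpointTrainable(tune.Trainable):
    def setup(self, config):
        self.iteration_count = 0

    def step(self):
        self.iteration_count += 1
        return {"iterations": self.iteration_count}

    def save_checkpoint(self, checkpoint_dir):
        # The returned dict is serialized by Tune and is exactly what
        # load_checkpoint() receives upon restore.
        return {"iteration_count": self.iteration_count}

    def load_checkpoint(self, checkpoint):
        # `checkpoint` is the dict returned by save_checkpoint above.
        self.iteration_count = checkpoint["iteration_count"]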

Example

>>> trainable, trainable1, trainable2 = ... 
>>> print(trainable1.save_checkpoint("/tmp/checkpoint_1")) 
"/tmp/checkpoint_1"
>>> print(trainable2.save_checkpoint("/tmp/checkpoint_2")) 
{"some": "data"}
>>> trainable.save_checkpoint("/tmp/bad_example") 
"/tmp/NEW_CHECKPOINT_PATH/my_checkpoint_file" # This will error.