Trainable.save_checkpoint(checkpoint_dir: str) -> Optional[Union[str, Dict]]

Subclasses should override this to implement save().


Do not rely on absolute paths in the implementation of Trainable.save_checkpoint and Trainable.load_checkpoint.
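To illustrate why absolute paths are unsafe, the sketch below (a hypothetical stand-in class, written without importing ray so it stays self-contained) resolves everything relative to the paths Tune passes in, and simulates Tune moving a paused trial's checkpoint directory before restore:

```python
import json
import os
import shutil
import tempfile

# Hypothetical stand-in for a Trainable subclass; only the two checkpoint
# methods are sketched, and `PathCheckpointTrainable` is not a real Ray class.
class PathCheckpointTrainable:
    def __init__(self):
        self.state = {"step": 0}

    def save_checkpoint(self, checkpoint_dir):
        # Write state into the directory Tune provides and return a path
        # prefixed by checkpoint_dir -- never a path remembered from elsewhere.
        path = os.path.join(checkpoint_dir, "state.json")
        with open(path, "w") as f:
            json.dump(self.state, f)
        return path

    def load_checkpoint(self, checkpoint_path):
        # Read from the path passed in, because Tune may have moved the
        # checkpoint since save_checkpoint returned.
        with open(checkpoint_path) as f:
            self.state = json.load(f)

# Simulate Tune moving a paused trial's checkpoint to a new location.
src = tempfile.mkdtemp()
t = PathCheckpointTrainable()
t.state["step"] = 7
t.save_checkpoint(src)

dst = tempfile.mkdtemp()
shutil.rmtree(dst)
shutil.move(src, dst)  # the original absolute path `src` is now gone

restored = PathCheckpointTrainable()
restored.load_checkpoint(os.path.join(dst, "state.json"))
print(restored.state["step"])  # prints 7
```

Because `load_checkpoint` opens only the path it receives, the restore succeeds even though the directory returned by `save_checkpoint` no longer exists.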

Use validate_save_restore to catch Trainable.save_checkpoint/Trainable.load_checkpoint errors before execution.

>>> from ray.tune.utils import validate_save_restore
>>> MyTrainableClass = ... 
>>> validate_save_restore(MyTrainableClass) 
>>> validate_save_restore( 
...     MyTrainableClass, use_object_store=True)

New in version 0.8.7.


Parameters:
checkpoint_dir – The directory where the checkpoint file must be stored. In a Tune run, if the trial is paused, the provided path may be temporary, and the checkpoint may later be moved.


Returns:
A dict or string. If a string, the return value is expected to be prefixed by checkpoint_dir. If a dict, the return value will be automatically serialized by Tune. In both cases, the return value is exactly what will be passed to Trainable.load_checkpoint() upon restore.


>>> trainable, trainable1, trainable2 = ...
>>> print(trainable1.save_checkpoint("/tmp/checkpoint_1"))
"/tmp/checkpoint_1/my_checkpoint_file"
>>> print(trainable2.save_checkpoint("/tmp/checkpoint_2"))
{"some": "data"}
>>> trainable.save_checkpoint("/tmp/bad_example")
"/tmp/NEW_CHECKPOINT_PATH/my_checkpoint_file" # Errors: not prefixed by checkpoint_dir.