ray.train.rl.RLCheckpoint
- class ray.train.rl.RLCheckpoint(local_path: Optional[Union[str, os.PathLike]] = None, data_dict: Optional[dict] = None, uri: Optional[str] = None)
  Bases: ray.air.checkpoint.Checkpoint

  A Checkpoint with RLlib-specific functionality. Create this from a generic Checkpoint by calling RLCheckpoint.from_checkpoint(ckpt).

  PublicAPI (alpha): This API is in alpha and may change before becoming stable.
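
  A minimal sketch of converting a generic Checkpoint into an RLCheckpoint, as described above. The checkpoint directory path is hypothetical; in practice the generic Checkpoint would typically come from an earlier RLlib or Ray Train run:

      from ray.air.checkpoint import Checkpoint
      from ray.train.rl import RLCheckpoint

      # Hypothetical directory produced by an earlier RLlib training run.
      ckpt_dir = "/tmp/rllib_checkpoint"

      # Load a generic Checkpoint from the directory, then convert it
      # to an RLCheckpoint to access RLlib-specific functionality.
      generic_ckpt = Checkpoint.from_directory(ckpt_dir)
      rl_ckpt = RLCheckpoint.from_checkpoint(generic_ckpt)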
- get_policy(env: Optional[Any] = None) -> ray.rllib.policy.policy.Policy
Retrieve the policy stored in this checkpoint.
- Parameters
env – Optional environment to instantiate the trainer with. If not given, it is parsed from the saved trainer configuration.
- Returns
The policy stored in this checkpoint.
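
  A minimal usage sketch for get_policy(), assuming rl_ckpt is the RLCheckpoint created in the earlier example and that the stored policy's observation space defines a shape; the zero-filled observation is a placeholder for illustration only:

      import numpy as np

      # Restore the Policy object saved in the checkpoint.
      policy = rl_ckpt.get_policy()

      # Placeholder observation matching the policy's observation space.
      obs = np.zeros(policy.observation_space.shape, dtype=np.float32)

      # compute_single_action returns (action, state_outs, extra_info).
      action, _, _ = policy.compute_single_action(obs)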