ray.train.backend.Backend#
- class ray.train.backend.Backend(*args, **kwargs)[source]#
Singleton for the distributed communication backend.
- share_cuda_visible_devices#
If True, each worker process will have CUDA_VISIBLE_DEVICES set as the visible device IDs of all workers on the same node for this training instance. If False, each worker will have CUDA_VISIBLE_DEVICES set to the device IDs allocated by Ray for that worker.
- Type:
bool
DeveloperAPI: This API may change across minor Ray releases.
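For illustration only, a minimal sketch of a custom backend (all names here are made up) that opts into CUDA_VISIBLE_DEVICES sharing and is paired with a BackendConfig subclass whose backend_cls points at it, the same pattern the built-in framework configs follow:

```python
from dataclasses import dataclass

from ray.train.backend import Backend, BackendConfig


class MyCommBackend(Backend):
    # Hypothetical backend: workers on the same node see each other's GPU IDs.
    share_cuda_visible_devices = True


@dataclass
class MyCommConfig(BackendConfig):
    # Illustrative option, not part of the base class.
    timeout_s: int = 30

    @property
    def backend_cls(self):
        # Ray Train instantiates this class as the communication backend.
        return MyCommBackend
```

Such a config would then be passed wherever a trainer accepts a backend_config argument.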
- on_start(worker_group: WorkerGroup, backend_config: BackendConfig)[source]#
Logic for starting this backend.
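A hedged sketch of an on_start override, assuming WorkerGroup.execute(fn, *args) runs the callable on every worker as the built-in backends use it; the backend name, helper, and environment variable are made up:

```python
import os

from ray.train.backend import Backend


def _set_master_addr(addr: str) -> None:
    # Runs inside each worker process before the training function starts.
    os.environ["MY_BACKEND_MASTER_ADDR"] = addr  # hypothetical variable


class MyCommBackend(Backend):
    def on_start(self, worker_group, backend_config):
        # Assumption: WorkerGroup.execute(fn, *args) runs fn on every worker.
        worker_group.execute(_set_master_addr, "10.0.0.1")
```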
- on_shutdown(worker_group: WorkerGroup, backend_config: BackendConfig)[source]#
Logic for shutting down the backend.
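The shutdown hook usually mirrors on_start and releases whatever it set up; a minimal sketch under the same WorkerGroup.execute assumption, with made-up names:

```python
import os

from ray.train.backend import Backend


def _teardown() -> None:
    # Hypothetical per-worker cleanup mirroring what on_start configured.
    os.environ.pop("MY_BACKEND_MASTER_ADDR", None)


class MyCommBackend(Backend):
    def on_shutdown(self, worker_group, backend_config):
        # Assumption: WorkerGroup.execute(fn) runs fn on every worker.
        worker_group.execute(_teardown)
```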
- on_training_start(worker_group: WorkerGroup, backend_config: BackendConfig)[source]#
Logic run right before training starts.
Session API is available at this point.
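Because the session is already initialized, code dispatched to the workers from this hook can, under that assumption, query the per-worker context via ray.train.get_context(); a sketch with a made-up helper and environment variable:

```python
import os

import ray.train
from ray.train.backend import Backend


def _record_rank() -> None:
    # The training session exists here, so the worker context is queryable.
    rank = ray.train.get_context().get_world_rank()
    os.environ["MY_BACKEND_RANK"] = str(rank)  # hypothetical variable


class MyCommBackend(Backend):
    def on_training_start(self, worker_group, backend_config):
        # Assumption: WorkerGroup.execute(fn) runs fn on every worker.
        worker_group.execute(_record_rank)
```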