ray.rllib.algorithms.algorithm.Algorithm.env_runner_group
- Algorithm.env_runner_group: EnvRunnerGroup | None = None
The EnvRunnerGroup of the Algorithm. An EnvRunnerGroup is composed of a single local EnvRunner (see self.env_runner), serving as the reference copy of the models to be trained, and optionally one or more remote EnvRunners used to generate training samples from the RL environment in parallel. The EnvRunnerGroup is fault-tolerant and elastic: it tracks the health state of all managed remote EnvRunner actors. As a result, the Algorithm should never access the underlying actor handles directly. Instead, always access them through the group's foreach APIs, which address the underlying EnvRunners by their assigned IDs.
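For illustration, a minimal sketch of querying all EnvRunners through the group's foreach API instead of touching the actor handles directly. Method and config names such as foreach_env_runner and env_runners(num_env_runners=...) reflect recent Ray releases; older versions expose the same functionality under foreach_worker and rollouts(num_rollout_workers=...).

```python
from ray.rllib.algorithms.ppo import PPOConfig

# Build a small PPO Algorithm with two remote EnvRunners (plus the local one).
algo = (
    PPOConfig()
    .environment("CartPole-v1")
    .env_runners(num_env_runners=2)
    .build()
)

# Access the EnvRunners only through the group's foreach API, never through
# the raw actor handles; the group skips unhealthy remote actors for you.
env_runner_classes = algo.env_runner_group.foreach_env_runner(
    lambda env_runner: type(env_runner).__name__
)
print(env_runner_classes)

algo.stop()
```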