Learner.configure_optimizers() → None

Configures, creates, and registers the optimizers for this Learner.

Optimizers are responsible for updating the model’s parameters during training, based on the computed gradients.

Normally, you should not override this method. Instead, custom algorithms that require specific optimizers should override the self.configure_optimizers_for_module(module_id=..) method and register the optimizers they need for the given module_id there.

You can register an optimizer for any RLModule within self.module (or for the ALL_MODULES ID) by calling self.register_optimizer() and passing: the module_id, an optimizer_name (only needed if you register more than one optimizer for a given module), the optimizer instance itself, a list of all the optimizer’s parameters (the ones it should update), and an optional learning rate or learning rate schedule setting.
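The control flow described above can be illustrated with a simplified, self-contained sketch. This is not the actual RLlib implementation: the class, the stand-in "optimizer" strings, and the parameter lists are illustrative placeholders, and only the shape of the API (configure_optimizers() looping over modules, a per-module hook calling register_optimizer(), and optimizers being filed under a (module_id, optimizer_name) key) mirrors the behavior documented here.

```python
# Simplified sketch (NOT the real RLlib Learner) of the registration
# bookkeeping described above. All names below are illustrative stand-ins.

ALL_MODULES = "__all_modules__"  # stand-in for RLlib's ALL_MODULES constant


class SketchLearner:
    def __init__(self, module_ids):
        # Stand-ins for the RLModules held under self.module.
        self.module = {mid: object() for mid in module_ids}
        # (module_id, optimizer_name) -> (optimizer, params, lr_or_lr_schedule)
        self._optimizers = {}

    def register_optimizer(self, *, module_id=ALL_MODULES,
                           optimizer_name="default", optimizer, params,
                           lr_or_lr_schedule=None):
        # The optimizer_name only matters when a module has more than one
        # optimizer; it disambiguates them under the same module_id.
        self._optimizers[(module_id, optimizer_name)] = (
            optimizer, params, lr_or_lr_schedule
        )

    def configure_optimizers_for_module(self, module_id):
        # Custom algorithms would override this hook. Here we register one
        # placeholder "optimizer" per module.
        params = [f"{module_id}.weight", f"{module_id}.bias"]  # stand-ins
        self.register_optimizer(
            module_id=module_id,
            optimizer=f"adam_for_{module_id}",  # stand-in optimizer instance
            params=params,
            lr_or_lr_schedule=1e-3,
        )

    def configure_optimizers(self):
        # Called once during build(): delegate to the per-module hook for
        # each RLModule so it can register the optimizers it needs.
        for module_id in self.module:
            self.configure_optimizers_for_module(module_id)


learner = SketchLearner(["policy_a", "policy_b"])
learner.configure_optimizers()
```

After configure_optimizers() runs, the sketch holds one optimizer per module, keyed by (module_id, "default"); registering a second optimizer for the same module would simply use a different optimizer_name.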

This method is called once during building (self.build()).