Using Weights & Biases with Tune

Weights & Biases (Wandb) is a tool for experiment tracking, model optimization, and dataset versioning. It is very popular in the machine learning and data science community for its superb visualization tools.

Ray Tune currently offers two lightweight integrations for Weights & Biases. One is the WandbLogger, which automatically logs metrics reported to Tune to the Wandb API.

The other one is the @wandb_mixin decorator, which can be used with the function API. It automatically initializes the Wandb API with Tune’s training information. You can then use the Wandb API as you normally would, e.g. calling wandb.log() to log your training process.

Please see here for a full example.
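At a glance, the two integrations hook in at different places: the WandbLogger is added to the loggers argument of tune.run(), while @wandb_mixin wraps the training function so that you can call wandb.log() yourself. The following minimal sketch (the project name, API key file path, and training functions are illustrative placeholders) contrasts the two approaches; the full API reference and examples follow below.

import wandb
from ray import tune
from ray.tune.logger import DEFAULT_LOGGERS
from ray.tune.integration.wandb import WandbLogger, wandb_mixin

# Option 1: WandbLogger -- everything passed to tune.report() is sent to Wandb.
def train_with_logger(config):
    tune.report(loss=config["a"])

tune.run(
    train_with_logger,
    config={
        "a": tune.uniform(0, 1),
        "wandb": {"project": "my_project", "api_key_file": "/path/to/file"}
    },
    loggers=DEFAULT_LOGGERS + (WandbLogger,))

# Option 2: @wandb_mixin -- call the Wandb API directly inside the function.
@wandb_mixin
def train_with_mixin(config):
    wandb.log({"loss": config["a"]})
    tune.report(loss=config["a"])

tune.run(
    train_with_mixin,
    config={
        "a": tune.uniform(0, 1),
        "wandb": {"project": "my_project", "api_key_file": "/path/to/file"}
    })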

class ray.tune.integration.wandb.WandbLogger(config, logdir, trial=None)[source]

Weights and biases (https://www.wandb.com/) is a tool for experiment tracking, model optimization, and dataset versioning. This Ray Tune Logger sends metrics to Wandb for automatic tracking and visualization.

Wandb configuration is done by passing a wandb key to the config parameter of tune.run() (see example below).

The wandb config key can be optionally included in the logger_config subkey of config to be compatible with RLlib trainables (see second example below).

The content of the wandb config entry is passed to wandb.init() as keyword arguments. The exceptions are the following settings, which are used to configure the WandbLogger itself:

Parameters
  • api_key_file (str) – Path to file containing the Wandb API KEY. This file only needs to be present on the node running the Tune script if using the WandbLogger.

  • api_key (str) – Wandb API Key. Alternative to setting api_key_file.

  • excludes (list) – List of metrics that should be excluded from the log.

  • log_config (bool) – Boolean indicating if the config parameter of the results dict should be logged. This makes sense if parameters will change during training, e.g. with PopulationBasedTraining. Defaults to False.

Wandb’s group, run_id and run_name are automatically selected by Tune, but can be overwritten by filling out the respective configuration values.

Please see here for all other valid configuration settings: https://docs.wandb.com/library/init
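For instance, a wandb entry like the sketch below (the project name, group name, tags, and excluded metric are illustrative placeholders) keeps the logger-specific keys for the WandbLogger itself and forwards all remaining keys, including group, to wandb.init():

config={
    # ... search space ...
    "wandb": {
        # consumed by the WandbLogger itself
        "api_key_file": "/path/to/file",
        "excludes": ["time_this_iter_s"],
        "log_config": False,
        # everything else is passed on to wandb.init()
        "project": "Optimization_Project",
        "group": "experiment_1",
        "tags": ["tune", "baseline"]
    }
}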

Example:

from ray import tune
from ray.tune.logger import DEFAULT_LOGGERS
from ray.tune.integration.wandb import WandbLogger
tune.run(
    train_fn,
    config={
        # define search space here
        "parameter_1": tune.choice([1, 2, 3]),
        "parameter_2": tune.choice([4, 5, 6]),
        # wandb configuration
        "wandb": {
            "project": "Optimization_Project",
            "api_key_file": "/path/to/file",
            "log_config": True
        }
    },
    loggers=DEFAULT_LOGGERS + (WandbLogger, ))

Example for RLlib:

from ray import tune
from ray.tune.integration.wandb import WandbLogger

tune.run(
    "PPO",
    config={
        "env": "CartPole-v0",
        "logger_config": {
            "wandb": {
                "project": "PPO",
                "api_key_file": "~/.wandb_api_key"
            }
        }
    },
    loggers=[WandbLogger])

ray.tune.integration.wandb.wandb_mixin(func: Callable)[source]

Weights and biases (https://www.wandb.com/) is a tool for experiment tracking, model optimization, and dataset versioning. This Ray Tune Trainable mixin helps initialize the Wandb API for use with the Trainable class or with the @wandb_mixin decorator for the function API.

For basic usage, just prepend your training function with the @wandb_mixin decorator:

import wandb
from ray.tune.integration.wandb import wandb_mixin

@wandb_mixin
def train_fn(config):
    # log any metrics you like, e.g.:
    wandb.log({"metric": 1})

Wandb configuration is done by passing a wandb key to the config parameter of tune.run() (see example below).

The content of the wandb config entry is passed to wandb.init() as keyword arguments. The exceptions are the following settings, which are used to configure the WandbTrainableMixin itself:

Parameters
  • api_key_file (str) – Path to file containing the Wandb API KEY. This file must be on all nodes if using the wandb_mixin.

  • api_key (str) – Wandb API Key. Alternative to setting api_key_file.

Wandb’s group, run_id and run_name are automatically selected by Tune, but can be overwritten by filling out the respective configuration values.

Please see here for all other valid configuration settings: https://docs.wandb.com/library/init

Example:

import wandb
from ray import tune
from ray.tune.integration.wandb import wandb_mixin

@wandb_mixin
def train_fn(config):
    for i in range(10):
        loss = self.config["a"] + self.config["b"]
        wandb.log({"loss": loss})
    tune.report(loss=loss, done=True)

tune.run(
    train_fn,
    config={
        # define search space here
        "a": tune.choice([1, 2, 3]),
        "b": tune.choice([4, 5, 6]),
        # wandb configuration
        "wandb": {
            "project": "Optimization_Project",
            "api_key_file": "/path/to/file"
        }
    })
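
As noted above, the mixin can also be used with the class-based Trainable API. A minimal sketch, assuming WandbTrainableMixin can be imported directly from ray.tune.integration.wandb and is listed before tune.Trainable in the base classes (the class name and stopping criterion are placeholders):

import wandb
from ray import tune
from ray.tune.integration.wandb import WandbTrainableMixin

class WandbTrainable(WandbTrainableMixin, tune.Trainable):
    def step(self):
        # the mixin has already called wandb.init() using the "wandb" config entry
        loss = self.config["a"] + self.config["b"]
        wandb.log({"loss": loss})
        return {"loss": loss}

tune.run(
    WandbTrainable,
    stop={"training_iteration": 10},
    config={
        "a": tune.choice([1, 2, 3]),
        "b": tune.choice([4, 5, 6]),
        "wandb": {
            "project": "Optimization_Project",
            "api_key_file": "/path/to/file"
        }
    })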