ray.tune.with_parameters#

ray.tune.with_parameters(trainable: Union[Type[Trainable], Callable], **kwargs)[source]#

Wrapper for trainables to pass arbitrarily large data objects.

This wrapper function will store all passed parameters in the Ray object store and retrieve them when calling the function. It can thus be used to pass arbitrary data, even datasets, to Tune trainables.

This can also be used as an alternative to functools.partial to pass default arguments to trainables.

When used with the function API, the trainable function is called with the passed parameters as keyword arguments. When used with the class API, the Trainable.setup() method is called with the respective kwargs.

If the data already exists in the object store (i.e., the values are ObjectRef instances), using tune.with_parameters() is not necessary. You can instead pass the object refs to the training function via the config, or bind them with Python partials.
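As a plain-Python illustration of the partial alternative (no Ray required; `train` and `data` here are toy stand-ins for a real trainable and dataset), functools.partial binds the data at definition time. Note that, unlike tune.with_parameters, the bound data travels inside the function closure rather than the Ray object store, so this is only advisable for small objects or pre-existing object refs:

```python
import functools

def train(config, data=None):
    # Toy stand-in for a Tune trainable: just consume the bound data.
    total = 0
    for sample in data:
        total += sample
    return total

data = [1, 2, 3]

# Bind `data` ahead of time; Tune would then call the resulting
# function with only the `config` argument.
bound_train = functools.partial(train, data=data)
result = bound_train({})
```
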

Parameters
  • trainable – Trainable to wrap.

  • **kwargs – Parameters to store in the object store.

Function API example:

from ray import tune
from ray.air import session
from ray.tune import Tuner

def train(config, data=None):
    for sample in data:
        loss = update_model(sample)
        session.report({"loss": loss})

data = HugeDataset(download=True)

tuner = Tuner(
    tune.with_parameters(train, data=data),
    # ...
)
tuner.fit()

Class API example:

from ray import tune
from ray.tune import Tuner

class MyTrainable(tune.Trainable):
    def setup(self, config, data=None):
        self.data = data
        self.iter = iter(self.data)
        self.next_sample = next(self.iter)

    def step(self):
        loss = update_model(self.next_sample)
        try:
            self.next_sample = next(self.iter)
        except StopIteration:
            return {"loss": loss, "done": True}
        return {"loss": loss}

data = HugeDataset(download=True)

tuner = Tuner(
    tune.with_parameters(MyTrainable, data=data),
    # ...
)
tuner.fit()

Note

When restoring a Tune experiment, you need to re-specify the trainable wrapped with tune.with_parameters. The reasoning behind this is as follows:

1. tune.with_parameters stores parameters in the object store and attaches object references to the trainable, but the objects they point to may not exist anymore upon restoring in a new Ray cluster.

2. The attached objects could be arbitrarily large, so Tune does not save the object data along with the trainable.

To restore, Tune allows the trainable to be re-specified in Tuner.restore(path, trainable=...). Continuing from the previous examples, here’s an example of restoration:

from ray import tune
from ray.tune import Tuner

data = HugeDataset(download=True)

tuner = Tuner.restore(
    "/path/to/experiment/",
    trainable=tune.with_parameters(MyTrainable, data=data),
    # ...
)

PublicAPI (beta): This API is in beta and may change before becoming stable.