Tutorials, User Guides, Examples

In this section, you can find material on how to use Tune and its various features. If any of the materials are out of date or broken, or if you'd like to add an example to this page, feel free to raise an issue on our GitHub repository.

Tutorials

Take a look at any of the tutorials below to get started with Tune.

User Guides

These pages will demonstrate the various features and configurations of Tune.

Colab Exercises

Learn how to use Tune in your browser with the following Colab-based exercises.

Exercise Description                                                    Library    Colab Link
Basics of using Tune.                                                   TF/Keras   Tune Tutorial
Using Search algorithms and Trial Schedulers to optimize your model.    PyTorch    Tune Tutorial
Using Population-Based Training (PBT).                                  PyTorch    Tune Tutorial

Tutorial source files can be found here.
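
If you have never used Tune before, the following sketch shows the shape of the function-based API that the first exercise covers. It is a minimal illustration, not code from the exercise: the objective function and search-space values are made up, and the reporting call follows the older tune.report(**kwargs) API, which has changed in recent Ray releases.

```python
from ray import tune

def objective(config):
    # Hypothetical objective: compute a score from the sampled hyperparameters
    # and report it back to Tune once per "training" step.
    for step in range(10):
        score = config["width"] * step - config["height"]
        tune.report(mean_score=score)  # older Tune reporting API

analysis = tune.run(
    objective,
    config={
        "width": tune.uniform(0.1, 1.0),   # search space for "width"
        "height": tune.uniform(-10, 10),   # search space for "height"
    },
    num_samples=5,  # number of trials sampled from the search space
)
print("Best config:", analysis.get_best_config(metric="mean_score", mode="max"))
```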

Tune Examples

If any example is broken, or if you'd like to add an example to this page, feel free to raise an issue on our GitHub repository.

General Examples

  • async_hyperband_example: Example of using a Trainable class with AsyncHyperBandScheduler (a minimal sketch of this pattern follows this list).

  • hyperband_example: Example of using a Trainable class with HyperBandScheduler. Also uses the Experiment class API for specifying the experiment configuration.

  • pbt_example: Example of using a Trainable class with PopulationBasedTraining scheduler.

  • pbt_ppo_example: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.

  • logging_example: Example of custom loggers and custom trial directory naming.
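
As a rough guide to what the scheduler examples above look like in code, here is a minimal sketch, not taken from any of the linked files, of a Trainable class run under AsyncHyperBandScheduler, with an alternative PopulationBasedTraining configuration. The toy MyTrainable class and its hyperparameters are made up for illustration, and exact constructor arguments can vary between Ray versions.

```python
import random

from ray import tune
from ray.tune.schedulers import AsyncHyperBandScheduler, PopulationBasedTraining

class MyTrainable(tune.Trainable):
    """Toy Trainable: the reported score grows at a rate set by the sampled lr."""

    def setup(self, config):
        self.lr = config["lr"]
        self.score = 0.0

    def step(self):
        self.score += self.lr  # stand-in for one epoch of real training
        return {"mean_score": self.score}

# Aggressively early-stops underperforming trials.
asha = AsyncHyperBandScheduler(metric="mean_score", mode="max", max_t=20)

# Alternative: mutate hyperparameters of a population of trials during training.
pbt = PopulationBasedTraining(
    time_attr="training_iteration",
    metric="mean_score",
    mode="max",
    perturbation_interval=5,
    hyperparam_mutations={"lr": lambda: random.uniform(1e-4, 1e-1)},
)

tune.run(
    MyTrainable,
    config={"lr": tune.loguniform(1e-4, 1e-1)},
    scheduler=asha,  # or scheduler=pbt
    num_samples=8,
    stop={"training_iteration": 20},
)
```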

Search Algorithm Examples

TensorFlow/Keras Examples

PyTorch Examples

  • mnist_pytorch: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert a script that relies on argparse to use Tune (see the sketch after this list).

  • mnist_pytorch_trainable: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end.
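
To illustrate the argparse-to-Tune conversion mentioned above, here is a minimal sketch. Everything in it is a stand-in (a one-layer model and random tensors instead of MNIST); the point is that command-line flags such as --lr become entries in the config dict, and per-epoch metrics are reported to Tune instead of printed.

```python
import torch
import torch.nn as nn
from ray import tune

def train_mnist(config):
    # Stand-ins for the real MNIST data loader and model in the linked example.
    model = nn.Linear(784, 10)
    optimizer = torch.optim.SGD(
        model.parameters(), lr=config["lr"], momentum=config["momentum"])
    loss_fn = nn.CrossEntropyLoss()
    for epoch in range(10):
        x = torch.randn(64, 784)          # fake batch in place of MNIST images
        y = torch.randint(0, 10, (64,))   # fake labels
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        tune.report(mean_loss=loss.item())  # replaces the argparse script's print/log

tune.run(
    train_mnist,
    config={
        "lr": tune.loguniform(1e-4, 1e-1),   # replaces the --lr flag
        "momentum": tune.uniform(0.1, 0.9),  # replaces the --momentum flag
    },
    num_samples=4,
)
```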

XGBoost Example

  • xgboost_example: Trains a basic XGBoost model with Tune, using the function-based API and an XGBoost callback.
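
A rough sketch of this pattern, assuming a hand-rolled XGBoost callback rather than the specific callback used in the linked example: each boosting round's evaluation metric is forwarded to Tune. The dataset, parameter choices, and the TuneReportXGBCallback class are illustrative only.

```python
import xgboost as xgb
from ray import tune
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

class TuneReportXGBCallback(xgb.callback.TrainingCallback):
    """Hypothetical callback: report each round's eval logloss to Tune."""

    def after_iteration(self, model, epoch, evals_log):
        tune.report(mean_logloss=evals_log["eval"]["logloss"][-1])
        return False  # returning False lets training continue

def train_xgboost(config):
    data, labels = load_breast_cancer(return_X_y=True)
    train_x, test_x, train_y, test_y = train_test_split(data, labels, test_size=0.25)
    dtrain = xgb.DMatrix(train_x, label=train_y)
    dtest = xgb.DMatrix(test_x, label=test_y)
    xgb.train(
        {"objective": "binary:logistic", "eval_metric": "logloss",
         "max_depth": config["max_depth"], "eta": config["eta"]},
        dtrain,
        num_boost_round=10,
        evals=[(dtest, "eval")],
        callbacks=[TuneReportXGBCallback()],
    )

tune.run(
    train_xgboost,
    config={"max_depth": tune.randint(2, 8), "eta": tune.loguniform(1e-3, 0.3)},
    num_samples=4,
)
```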

LightGBM Example

  • lightgbm_example: Trains a basic LightGBM model with Tune, using the function-based API and a LightGBM callback.
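
The LightGBM case looks much the same. This sketch uses a plain callback function (LightGBM callbacks receive a CallbackEnv) rather than the specific callback used in the linked example, and also shows retrieving the best configuration from the returned analysis object. Dataset and parameter choices are illustrative.

```python
import lightgbm as lgb
from ray import tune
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split

def tune_report_callback(env):
    # Hand-rolled callback: env.evaluation_result_list holds tuples of
    # (dataset_name, metric_name, value, is_higher_better).
    _, _, binary_error, _ = env.evaluation_result_list[0]
    tune.report(binary_error=binary_error)

def train_lightgbm(config):
    data, labels = load_breast_cancer(return_X_y=True)
    train_x, test_x, train_y, test_y = train_test_split(data, labels, test_size=0.25)
    train_set = lgb.Dataset(train_x, label=train_y)
    test_set = lgb.Dataset(test_x, label=test_y, reference=train_set)
    lgb.train(
        {"objective": "binary", "metric": "binary_error",
         "learning_rate": config["learning_rate"], "num_leaves": config["num_leaves"]},
        train_set,
        num_boost_round=10,
        valid_sets=[test_set],
        callbacks=[tune_report_callback],
    )

analysis = tune.run(
    train_lightgbm,
    config={"learning_rate": tune.loguniform(1e-3, 0.3), "num_leaves": tune.randint(8, 64)},
    num_samples=4,
    metric="binary_error",
    mode="min",
)
print("Best hyperparameters:", analysis.best_config)
```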

Contributed Examples

  • pbt_tune_cifar10_with_keras: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.

  • genetic_example: Optimizes the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.

  • tune_cifar10_gluon: An MXNet Gluon example that uses Tune with the function-based API on the CIFAR-10 dataset.