If any example is broken, or if you’d like to add an example to this page, feel free to raise an issue on our GitHub repository.
Check out the Tune tutorials page for guides on how to use Tune with your preferred machine learning library.
async_hyperband_example: Example of using a Trainable class with AsyncHyperBandScheduler (see the sketch after this list).
hyperband_example: Example of using a Trainable class with HyperBandScheduler. Also demonstrates the Experiment class API for specifying the experiment configuration, as well as the AsyncHyperBandScheduler.
pbt_example: Example of using a Trainable class with PopulationBasedTraining scheduler.
pbt_function: Example of using the function API with a PopulationBasedTraining scheduler.
pbt_ppo_example: Example of optimizing a distributed RLlib algorithm (PPO) with the PopulationBasedTraining scheduler.
logging_example: Example of custom loggers and custom trial directory naming.
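For orientation, here is a minimal sketch of the Trainable-plus-scheduler pattern these examples share. The objective, metric name, and search space are invented for illustration, and the hook names setup/step apply to recent Ray versions (older releases use _setup/_train):

    from ray import tune
    from ray.tune.schedulers import AsyncHyperBandScheduler

    class MyTrainable(tune.Trainable):
        def setup(self, config):
            # Placeholder state; a real example would build a model here.
            self.lr = config["lr"]
            self.score = 0.0

        def step(self):
            # Toy objective standing in for one epoch of training.
            self.score += self.lr
            return {"mean_accuracy": self.score}

    scheduler = AsyncHyperBandScheduler(metric="mean_accuracy", mode="max")
    tune.run(
        MyTrainable,
        config={"lr": tune.grid_search([0.01, 0.1])},
        scheduler=scheduler,
        stop={"training_iteration": 10},
    )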
Search Algorithm Examples
hyperopt_example: Optimizes a basic function using the function-based API and HyperOptSearch (a SearchAlgorithm wrapper for HyperOpt TPE).
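As a rough illustration of that wrapper, here is a hedged sketch of wiring HyperOptSearch into tune.run. The toy objective and search space are invented for this snippet:

    from hyperopt import hp
    from ray import tune
    from ray.tune.suggest.hyperopt import HyperOptSearch

    def objective(config):
        # Toy quadratic standing in for a real training function.
        tune.report(loss=(config["x"] - 2) ** 2)

    space = {"x": hp.uniform("x", -10, 10)}
    algo = HyperOptSearch(space, metric="loss", mode="min")
    tune.run(objective, search_alg=algo, num_samples=20)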
tune_mnist_keras: Converts the Keras MNIST example to use Tune with the function-based API and a Keras callback (see the sketch below). Also shows how to easily convert a script that relies on argparse to use Tune.
pbt_memnn_example: Example of training a Memory NN on bAbI with Keras using PBT.
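The Keras integration in tune_mnist_keras boils down to reporting metrics from a callback. A minimal sketch, assuming a TF2-style Keras setup; the model, metric names, and search space here are illustrative rather than the example's exact code:

    import tensorflow as tf
    from ray import tune

    class TuneReporter(tf.keras.callbacks.Callback):
        # Forward Keras epoch metrics to Tune.
        def on_epoch_end(self, epoch, logs=None):
            tune.report(mean_accuracy=logs["accuracy"])

    def train_mnist(config):
        (x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
        x_train = x_train / 255.0
        model = tf.keras.Sequential([
            tf.keras.layers.Flatten(input_shape=(28, 28)),
            tf.keras.layers.Dense(config["hidden"], activation="relu"),
            tf.keras.layers.Dense(10, activation="softmax"),
        ])
        model.compile(
            optimizer=tf.keras.optimizers.SGD(config["lr"]),
            loss="sparse_categorical_crossentropy",
            metrics=["accuracy"],
        )
        model.fit(x_train, y_train, epochs=5, verbose=0,
                  callbacks=[TuneReporter()])

    tune.run(
        train_mnist,
        config={
            "lr": tune.loguniform(1e-4, 1e-1),
            "hidden": tune.choice([32, 64, 128]),
        },
    )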
tf_mnist_example: Converts the Advanced TensorFlow 2.0 MNIST example to use Tune with the Trainable API. Uses tf.function. Original code from TensorFlow: https://www.tensorflow.org/tutorials/quickstart/advanced
mnist_pytorch: Converts the PyTorch MNIST example to use Tune with the function-based API. Also shows how to easily convert a script that relies on argparse to use Tune.
mnist_pytorch_trainable: Converts the PyTorch MNIST example to use Tune with the Trainable API. Also uses the HyperBandScheduler and checkpoints the model at the end (see the checkpointing sketch below).
ddp_mnist_torch: An example showing how to use DistributedDataParallel with Ray Tune. This enables both distributed training and distributed hyperparameter tuning.
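For the checkpointing mentioned in mnist_pytorch_trainable, the Trainable API provides save and restore hooks. A minimal sketch with an invented toy model; the hook names save_checkpoint/load_checkpoint apply to recent Ray versions (older releases use _save/_restore):

    import os
    import torch
    from ray import tune

    class TorchTrainable(tune.Trainable):
        def setup(self, config):
            self.model = torch.nn.Linear(1, 1)
            self.optimizer = torch.optim.SGD(
                self.model.parameters(), lr=config["lr"])

        def step(self):
            # Toy objective: fit y = 2x on random batches.
            x = torch.randn(32, 1)
            loss = torch.nn.functional.mse_loss(self.model(x), 2 * x)
            self.optimizer.zero_grad()
            loss.backward()
            self.optimizer.step()
            return {"loss": loss.item()}

        def save_checkpoint(self, tmp_checkpoint_dir):
            torch.save(self.model.state_dict(),
                       os.path.join(tmp_checkpoint_dir, "model.pt"))
            return tmp_checkpoint_dir

        def load_checkpoint(self, tmp_checkpoint_dir):
            self.model.load_state_dict(
                torch.load(os.path.join(tmp_checkpoint_dir, "model.pt")))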
lightgbm_example: Trains a basic LightGBM model with Tune, using the function-based API and a LightGBM callback.
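The LightGBM callback amounts to relaying LightGBM's per-iteration evaluation results to Tune. A hedged sketch; the synthetic dataset and metric below are placeholders, not the example's actual code:

    import lightgbm as lgb
    from ray import tune
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    def report_to_tune(env):
        # env.evaluation_result_list holds tuples of
        # (dataset name, metric name, value, is_higher_better).
        results = {f"{name}_{metric}": value
                   for name, metric, value, _ in env.evaluation_result_list}
        tune.report(**results)

    def train_lightgbm(config):
        X, y = make_classification(n_samples=1000)
        X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.2)
        lgb.train(
            {"objective": "binary", "metric": "binary_error",
             "learning_rate": config["lr"]},
            lgb.Dataset(X_tr, label=y_tr),
            valid_sets=[lgb.Dataset(X_va, label=y_va)],
            valid_names=["eval"],
            callbacks=[report_to_tune],
        )

    tune.run(train_lightgbm, config={"lr": tune.loguniform(1e-3, 1e-1)})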
🤗 Huggingface Transformers Example
pbt_transformers_example: Fine-tunes a Huggingface transformer with Tune's PopulationBasedTraining scheduler.
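The key ingredient PBT adds is the mutation schedule. A sketch of a PopulationBasedTraining configuration, with hyperparameter names and ranges invented for illustration:

    import random
    from ray.tune.schedulers import PopulationBasedTraining

    pbt = PopulationBasedTraining(
        time_attr="training_iteration",
        metric="eval_loss",
        mode="min",
        perturbation_interval=2,  # perturb every 2 reported iterations
        hyperparam_mutations={
            # Resample rules applied when a trial explores new values.
            "learning_rate": lambda: random.uniform(1e-5, 1e-3),
            "weight_decay": lambda: random.uniform(0.0, 0.3),
        },
    )
    # Pass scheduler=pbt to tune.run alongside the training function.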
pbt_tune_cifar10_with_keras: A contributed example of tuning a Keras model on CIFAR10 with the PopulationBasedTraining scheduler.
genetic_example: Optimizes the Michalewicz function using the contributed GeneticSearch algorithm with AsyncHyperBandScheduler.
tune_cifar10_gluon: MXNet Gluon example using Tune with the function-based API on the CIFAR-10 dataset.
Open Source Projects using Tune
Here are some of the popular open source repositories and research projects that leverage Tune. Feel free to submit a pull request to add a project (or to request the removal of one!).
Softlearning: Softlearning is a reinforcement learning framework for training maximum entropy policies in continuous domains. Includes the official implementation of the Soft Actor-Critic algorithm.
Population Based Augmentation: Population Based Augmentation (PBA) is an algorithm that quickly and efficiently learns data augmentation functions for neural network training. PBA matches state-of-the-art results on CIFAR with one thousand times less compute.
Fast AutoAugment by Kakao: Fast AutoAugment (Accepted at NeurIPS 2019) learns augmentation policies using a more efficient search strategy based on density matching.
Allentune: Hyperparameter Search for AllenNLP from AllenAI.