Tune: A Research Platform for Distributed Model Selection and Training

Richard Liaw, Eric Liang, Robert Nishihara, Philipp Moritz, Joseph E. Gonzalez, Ion Stoica
2018
6 references

Abstract

Modern machine learning algorithms are increasingly computationally demanding, requiring specialized hardware and distributed computation to achieve high performance in a reasonable time frame. Many hyperparameter search algorithms have been proposed for improving the efficiency of model selection; however, their adaptation to the distributed compute environment is often ad hoc. We propose Tune, a unified framework for model selection and training that provides a narrow-waist interface between training scripts and search algorithms. We show that this interface meets the requirements for a broad range of hyperparameter search algorithms, allows straightforward scaling of search to large clusters, and simplifies algorithm implementation. We demonstrate the implementation of several state-of-the-art hyperparameter search algorithms in Tune. Tune is available at http://ray.readthedocs.io/en/latest/tune.html.
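The "narrow-waist interface" described in the abstract decouples the training script from the search algorithm: the script only reports metrics through a callback, so any searcher can drive it. The following is a minimal self-contained sketch of that idea, not Tune's actual API; the function names, the `report` callback, and the toy `train` objective are all illustrative assumptions.

```python
import random

def train(config, report):
    # Illustrative training loop: a stand-in "accuracy" that improves
    # as config["lr"] approaches a value the searcher does not know (0.1),
    # reported step by step through the narrow-waist callback.
    for step in range(3):
        accuracy = 1.0 - abs(config["lr"] - 0.1) - 0.01 * (2 - step)
        report(step=step, accuracy=accuracy)

def random_search(trainable, space, num_samples, seed=0):
    # A simple searcher: it only interacts with the trainable through
    # the report callback, so it is interchangeable with other searchers.
    rng = random.Random(seed)
    best = {"accuracy": float("-inf"), "config": None}
    for _ in range(num_samples):
        config = {"lr": rng.uniform(*space["lr"])}
        results = []
        trainable(config, lambda **metrics: results.append(metrics))
        final = results[-1]
        if final["accuracy"] > best["accuracy"]:
            best = {"accuracy": final["accuracy"], "config": config}
    return best

best = random_search(train, {"lr": (0.001, 1.0)}, num_samples=20)
print(best["config"], round(best["accuracy"], 3))
```

Because the searcher never touches the training internals, swapping random search for a bandit- or population-based algorithm changes only `random_search`, which is the separation of concerns the paper's interface provides.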


Code References

ray-project/ray (4 files)

doc/source/ray-overview/getting-started.md (1 reference)

- [Tune paper](https://arxiv.org/abs/1807.05118)

doc/source/tune/index.rst (2 references)

- If Tune helps you in your academic research, you are encouraged to cite `our paper <https://arxiv.org/abs/1807.05118>`__.
- journal={arXiv preprint arXiv:1807.05118},

python/ray/tune/README.rst (2 references)

- If Tune helps you in your academic research, you are encouraged to cite `our paper <https://arxiv.org/abs/1807.05118>`__. Here is an example bibtex:
- journal={arXiv preprint arXiv:1807.05118},

README.rst (1 reference)

- .. _`Tune paper`: https://arxiv.org/abs/1807.05118