What is Ray?¶

Ray is a fast and simple framework for building and running distributed applications.
Ray accomplishes this mission by:
- Providing simple primitives for building and running distributed applications.
- Enabling end users to parallelize single-machine code, with little to zero code changes (see the sketch after this list).
- Including a large ecosystem of applications, libraries, and tools on top of the core Ray to enable complex applications.
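To make the "little to zero code changes" point concrete, here is a minimal sketch (not from the original page) comparing a plain serial loop with its Ray-parallelized counterpart; the function name and sleep duration are illustrative.

import time
import ray

def slow_square(x):
    # Ordinary single-machine Python: four calls take about four seconds.
    time.sleep(1)
    return x * x

serial_results = [slow_square(i) for i in range(4)]

ray.init()

@ray.remote
def slow_square_task(x):
    # Same body; the decorator is the only change needed to run it as a Ray task.
    time.sleep(1)
    return x * x

# The four tasks run concurrently, so this takes roughly one second.
parallel_results = ray.get([slow_square_task.remote(i) for i in range(4)])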
Ray Core provides the simple primitives for application building. On top of Ray Core are several libraries for solving problems in machine learning, along with a number of community-contributed libraries.
Getting Started with Ray¶
Check out A Gentle Introduction to Ray to learn more about Ray and its ecosystem of libraries that enable things like distributed hyperparameter tuning, reinforcement learning, and distributed training.
Ray uses Tasks (functions) and Actors (classes) to let you parallelize your Python code:
# First, run `pip install ray`.
import ray

ray.init()

@ray.remote
def f(x):
    return x * x

futures = [f.remote(i) for i in range(4)]
print(ray.get(futures))  # [0, 1, 4, 9]
@ray.remote
class Counter(object):
    def __init__(self):
        self.n = 0

    def increment(self):
        self.n += 1

    def read(self):
        return self.n

counters = [Counter.remote() for _ in range(4)]
[c.increment.remote() for c in counters]
futures = [c.read.remote() for c in counters]
print(ray.get(futures))  # [1, 1, 1, 1]
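Tasks and actors also compose: an object reference returned by one `.remote()` call can be passed directly to another task, and Ray resolves it to its value before that task runs. A minimal sketch building on the `f` task above (the `add` task is illustrative, not part of the original example):

@ray.remote
def add(a, b):
    # ObjectRef arguments are resolved to their values before the task executes.
    return a + b

x_ref = f.remote(2)   # ObjectRef that will hold 4
y_ref = f.remote(3)   # ObjectRef that will hold 9
print(ray.get(add.remote(x_ref, y_ref)))  # 13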
You can also get started by visiting our Tutorials. For the latest wheels (nightlies), see the installation page.
Getting Involved¶
Ray is more than a framework for distributed applications; it is also an active community of developers, researchers, and folks who love machine learning. Here's a list of tips for getting involved with the Ray community:
- Join our community Slack to discuss Ray! The community is extremely active in helping people succeed in building their Ray applications.
- Star and follow us on GitHub.
- Join our Meetup Group to connect with others in the community!
- Use the [ray] tag on StackOverflow to ask and answer questions about Ray usage.
- Subscribe to ray-dev@googlegroups.com to join development discussions.
- Follow us and spread the word on Twitter!

If you're interested in contributing to Ray, visit our page on Getting Involved to read about the contribution process and see what you can work on!
More Information¶
Here are some talks, papers, and press coverage involving Ray and its libraries. Please raise an issue if any of the below links are broken, or if you’d like to add your own talk!
Blog and Press¶
- Modern Parallel and Distributed Python: A Quick Tutorial on Ray
- Implementing A Parameter Server in 15 Lines of Python with Ray
- RayOnSpark: Running Emerging AI Applications on Big Data Clusters with Ray and Analytics Zoo
- [Tune] Tune: a Python library for fast hyperparameter tuning at any scale
- [RLlib] New Library Targets High Speed Reinforcement Learning
Talks (Videos)¶
Slides¶
Academic Papers¶