ray.serve.batch(func: ray.serve.batching.F) → ray.serve.batching.G[source]#
ray.serve.batch(max_batch_size: Optional[int] = 10, batch_wait_timeout_s: Optional[float] = 0.0) → Callable[[ray.serve.batching.F], ray.serve.batching.G]

Converts a function to asynchronously handle batches.

The function can be a standalone function or a class method. In both cases, it must be declared with async def, take a list of objects as its sole argument, and return a list of the same length as a result.

When invoked, the caller passes a single object. Calls are batched and executed asynchronously once there is a batch of max_batch_size objects or batch_wait_timeout_s has elapsed, whichever occurs first.

Example:

>>> from ray import serve
>>> @serve.batch(max_batch_size=50, batch_wait_timeout_s=0.5)  # doctest: +SKIP
... async def handle_batch(batch: List[str]):  # doctest: +SKIP
...     return [s.lower() for s in batch]  # doctest: +SKIP

>>> async def handle_single(s: str): 
...     # Returns s.lower().
...     return await handle_batch(s) 
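To make the size-or-timeout semantics concrete, here is a hypothetical pure-asyncio sketch of the batching pattern described above. toy_batch is an illustrative stand-in, not Ray's implementation: callers submit single items, and the wrapped function is invoked on an accumulated list once max_batch_size items are pending or batch_wait_timeout_s has elapsed.

```python
import asyncio
from typing import Awaitable, Callable, List


def toy_batch(max_batch_size: int = 10, batch_wait_timeout_s: float = 0.0):
    """Hypothetical stand-in for ray.serve.batch (not Ray's implementation):
    callers pass single items; the wrapped function receives lists."""
    def decorator(func: Callable[[list], Awaitable[list]]):
        pending: list = []  # (item, Future) pairs waiting to be flushed
        timer: list = []    # at most one scheduled timeout handle

        def flush() -> None:
            if timer:
                timer.pop().cancel()  # no-op if the timer already fired
            if not pending:
                return
            batch, items = pending[:], [it for it, _ in pending]
            pending.clear()

            async def run() -> None:
                # Run the underlying function on the whole batch, then
                # hand each caller its own result.
                results = await func(items)
                for (_, fut), res in zip(batch, results):
                    fut.set_result(res)

            asyncio.get_running_loop().create_task(run())

        async def wrapper(item):
            fut = asyncio.get_running_loop().create_future()
            pending.append((item, fut))
            if len(pending) >= max_batch_size:
                flush()  # full batch: execute immediately
            elif not timer:
                # Partial batch: flush once the timeout elapses.
                timer.append(asyncio.get_running_loop().call_later(
                    batch_wait_timeout_s, flush))
            return await fut

        return wrapper
    return decorator


@toy_batch(max_batch_size=2, batch_wait_timeout_s=0.05)
async def handle_batch(batch: List[str]) -> List[str]:
    return [s.lower() for s in batch]


async def main() -> List[str]:
    # Three concurrent single-item calls: the first two form a full
    # batch; the third is flushed by the timeout.
    return await asyncio.gather(
        handle_batch("A"), handle_batch("B"), handle_batch("C"))
```

Running asyncio.run(main()) yields ['a', 'b', 'c']: each caller awaits only its own result even though the underlying function ran twice, once on a full batch and once on a timed-out partial batch.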
Parameters:
  • max_batch_size – The maximum batch size that will be executed in one call to the underlying function.

  • batch_wait_timeout_s – The maximum duration to wait for max_batch_size elements before running the underlying function.

PublicAPI (beta): This API is in beta and may change before becoming stable.