Batching & pre-dispatch mechanism

Is there a batching and pre-dispatch mechanism in Ray similar to the one provided by joblib.Parallel?

Copying the relevant joblib.Parallel parameter descriptions here for convenience:

pre_dispatch: {'all', integer, or expression, as in '3*n_jobs'}
The number of batches (of tasks) to be pre-dispatched. Default is '2*n_jobs'. When batch_size="auto" this is a reasonable default and the workers should never starve.
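
Ray doesn't expose a pre_dispatch knob of its own, but as far as I can tell the same effect can be approximated by capping the number of in-flight tasks with ray.wait. A minimal sketch, where `work` and `MAX_IN_FLIGHT` are illustrative names rather than anything Ray provides:

```python
import ray

ray.init()

@ray.remote
def work(x):
    # Stand-in for a real task
    return x * x

MAX_IN_FLIGHT = 8  # plays the role of pre_dispatch: cap on dispatched-but-unfinished tasks
in_flight, results = [], []
for x in range(100):
    if len(in_flight) >= MAX_IN_FLIGHT:
        # Block until one task finishes before dispatching the next
        done, in_flight = ray.wait(in_flight, num_returns=1)
        results.extend(ray.get(done))
    in_flight.append(work.remote(x))
results.extend(ray.get(in_flight))
```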

batch_size: int or 'auto', default: 'auto'
The number of atomic tasks to dispatch at once to each worker. When individual evaluations are very fast, dispatching calls to workers can be slower than sequential computation because of the overhead. Batching fast computations together can mitigate this. The 'auto' strategy keeps track of the time it takes for a batch to complete, and dynamically adjusts the batch size to keep the time on the order of half a second, using a heuristic. The initial batch size is 1. batch_size="auto" with backend="threading" will dispatch batches of a single task at a time, as the threading backend has very little overhead and using a larger batch size has not proved to bring any gain in that case.
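
In the absence of a built-in equivalent, fast computations can be batched by hand in Ray: chunk the inputs and dispatch each chunk as one remote task so the per-task overhead is amortized. A minimal sketch, with `run_batch` and `BATCH_SIZE` as illustrative names and a fixed chunk size rather than the dynamic tuning batch_size="auto" performs:

```python
import ray

ray.init()

@ray.remote
def run_batch(items):
    # One remote call processes a whole chunk, amortizing per-task overhead
    return [x * x for x in items]

BATCH_SIZE = 1_000  # illustrative fixed size; joblib's 'auto' adjusts this dynamically
items = list(range(100_000))
chunks = [items[i:i + BATCH_SIZE] for i in range(0, len(items), BATCH_SIZE)]
results = [y for chunk in ray.get([run_batch.remote(c) for c in chunks]) for y in chunk]
```

For comparison, both knobs are plain constructor arguments in joblib:

```python
from joblib import Parallel, delayed

results = Parallel(n_jobs=4, batch_size="auto", pre_dispatch="2*n_jobs")(
    delayed(pow)(i, 2) for i in range(10_000)
)
```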