How to "bucket" batch requests on Serve?

Hello everyone!

I’d like to know if there’s a way to “bucket” batch requests on serve. Something along the lines of the following snippet:

@serve.batch(bucket_by="batch_param")
async def batch_process(self, samples: list[np.ndarray], batch_param: float):
    batch = build_batch(samples)
    return process_batch(batch, batch_param)

async def process(self, sample: np.ndarray, param: float):
    return await batch_process(sample, param)

Here, batches would be bucketed by `param`; that is, requests carrying different `param` values would be queued into separate batches, and each batch would only ever contain samples sharing the same `param`.

Thanks!

Hi @amiasato! There’s no first-class way to do this on Serve, but I’m curious to hear more about your use case. Could you submit a feature request on the GitHub repo, so we can discuss it there and consider adding it?
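In the meantime, one workaround is to maintain a separate batch queue per bucket key yourself, outside of `@serve.batch`. Below is a minimal pure-asyncio sketch of that idea — it uses no Serve APIs, and the names `BucketedBatcher`, `submit`, and `process_batch` are hypothetical, chosen just for illustration:

```python
import asyncio


class BucketedBatcher:
    """Queues requests per bucket key; each bucket is flushed as its own batch."""

    def __init__(self, process_batch, max_batch_size=4, timeout_s=0.01):
        self.process_batch = process_batch  # called with (samples, bucket_key)
        self.max_batch_size = max_batch_size
        self.timeout_s = timeout_s
        self.queues = {}   # bucket key -> asyncio.Queue of (sample, future)
        self.workers = {}  # bucket key -> background flusher task

    async def submit(self, sample, param):
        # Lazily create one queue + flusher task per distinct param value.
        if param not in self.queues:
            self.queues[param] = asyncio.Queue()
            self.workers[param] = asyncio.create_task(self._flush_loop(param))
        fut = asyncio.get_running_loop().create_future()
        await self.queues[param].put((sample, fut))
        return await fut

    async def _flush_loop(self, param):
        queue = self.queues[param]
        while True:
            # Block until at least one request is waiting in this bucket.
            batch = [await queue.get()]
            # Opportunistically gather more, up to max size or timeout.
            try:
                while len(batch) < self.max_batch_size:
                    batch.append(await asyncio.wait_for(queue.get(), self.timeout_s))
            except asyncio.TimeoutError:
                pass
            samples = [s for s, _ in batch]
            results = self.process_batch(samples, param)
            for (_, fut), result in zip(batch, results):
                fut.set_result(result)
```

Callers `await submit(sample, param)` concurrently, and requests with the same `param` get grouped into one batch while different `param` values never mix. Inside a Serve deployment you could hold one such batcher per replica; the trade-off versus a built-in `bucket_by` is that `max_batch_size` and the timeout then apply per bucket rather than globally.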