Object list can't be used in 'await' expression

I'm running Ray 2.9.1 and trying to test micro-batching locally. Here is a sample taken from the documentation:

from fastapi import FastAPI
from transformers import pipeline

from ray import serve

app = FastAPI()


@serve.deployment
@serve.ingress(app)
class SentimentAnalysis:
    def __init__(self):
        self._classifier = pipeline("sentiment-analysis")

    @serve.batch(max_batch_size=10, batch_wait_timeout_s=0.1)
    async def classify_batched(self, batched_inputs):
        print("Got batch size:", len(batched_inputs))
        results = self._classifier(batched_inputs)
        return [result["label"] for result in results]

    @app.get("/classify")
    async def classify(self, input_text: str) -> str:
        return await self.classify_batched(input_text)


batched_deployment = SentimentAnalysis.bind()
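
For context, here is how I deploy and query it locally (the file name app.py is just my local choice):

# Start the deployment (assumes the code above is saved as app.py)
serve run app:batched_deployment

# Send a request to the FastAPI route
curl "http://localhost:8000/classify?input_text=hello"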

This gives me the following error:

(ProxyActor pid=93832)   File "/Users/fed/Projects/AstrumU/ray-testing/.venv/lib/python3.9/site-packages/ray/serve/batching.py", line 498, in batch_wrapper
(ProxyActor pid=93832)     return await enqueue_request(args, kwargs)
(ProxyActor pid=93832)   File "/Users/fed/Projects/AstrumU/ray-testing/.venv/lib/python3.9/site-packages/ray/serve/batching.py", line 228, in _process_batches
(ProxyActor pid=93832)     results = await func_future
(ProxyActor pid=93832) TypeError: object list can't be used in 'await' expression

Any ideas what I'm missing?

@XBeg9 Welcome to the Ray community, and thanks for your first post.

Did you try the pattern documented here: Dynamic Request Batching — Ray 2.9.1?

cc: @Gene @cindy_zhang

Hi, yes, that's the page I referenced. The problem turned out to be the async: the function decorated with @serve.batch must be declared async def in any case, even if there is no await inside its body (on the batch side).
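
For anyone hitting the same error, here is a minimal before/after sketch (assuming, as in my local code, the batched method had been written without async; the docs snippet above already has it):

    # Broken: @serve.batch awaits the wrapped callable's result, so a plain
    # def returns a list and await-ing it raises
    # "object list can't be used in 'await' expression".
    @serve.batch(max_batch_size=10, batch_wait_timeout_s=0.1)
    def classify_batched(self, batched_inputs):
        results = self._classifier(batched_inputs)
        return [result["label"] for result in results]

    # Fixed: declare it async def, even though the body contains no await.
    @serve.batch(max_batch_size=10, batch_wait_timeout_s=0.1)
    async def classify_batched(self, batched_inputs):
        results = self._classifier(batched_inputs)
        return [result["label"] for result in results]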

@XBeg9 Excellent. I will close this issue. Thanks for taking Ray for a spin and reporting issues.