How severely does this issue affect your experience of using Ray?
- High: It blocks me from completing my task.
I’m trying to host an ML model with Ray Serve + FastAPI, and it works fine up until I try to add the Elastic APM middleware:
from ray import serve
from fastapi import FastAPI
from elasticapm.contrib.starlette import ElasticAPM, make_apm_client

app = FastAPI(
    title="My ML API Server",
)

# ===========
# newly added
apm_config = {
    "SERVICE_NAME": "My ML API Server",
}
apm = make_apm_client(config=apm_config)
app.add_middleware(ElasticAPM, client=apm)
# ===========

@serve.deployment(
    name="MLAPI",
    num_replicas=1,
    ray_actor_options={"num_cpus": 1, "num_gpus": 0},
)
@serve.ingress(app)
class MLDeployment:
    ....
The error is:
File "~/Documents/Projects/xxx/./app/server.py", line 46, in <module>
class MLDeployment:
File "~/miniconda3/envs/env/lib/python3.10/site-packages/ray/serve/api.py", line 225, in decorator
frozen_app = cloudpickle.loads(cloudpickle.dumps(app))
File "~/miniconda3/envs/env/lib/python3.10/site-packages/ray/cloudpickle/cloudpickle_fast.py", line 88, in dumps
cp.dump(obj)
File "~/miniconda3/envs/env/lib/python3.10/site-packages/ray/cloudpickle/cloudpickle_fast.py", line 733, in dump
return Pickler.dump(self, obj)
TypeError: cannot pickle '_thread.lock' object
I initially started from this doc: Starlette/FastAPI Support | APM Python Agent Reference [6.x] | Elastic, but got an error saying TypeError: ElasticAPM.__init__() missing 1 required positional argument: 'client', so I created an APM client explicitly and passed it in, which resulted in the cloudpickle error trace above.
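If it helps narrow things down, here is a minimal sketch of a reproduction with no Serve deployment involved; it suggests the unpicklable lock comes from the APM client held in the app's middleware stack rather than from Ray itself (the service name below is just a placeholder):

from fastapi import FastAPI
from elasticapm.contrib.starlette import ElasticAPM, make_apm_client
from ray import cloudpickle  # same pickler serve.ingress uses in the trace above

app = FastAPI()
cloudpickle.dumps(app)  # a bare FastAPI app pickles fine, matching my working setup

apm = make_apm_client(config={"SERVICE_NAME": "repro"})  # placeholder config
app.add_middleware(ElasticAPM, client=apm)
cloudpickle.dumps(app)  # raises TypeError: cannot pickle '_thread.lock' object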
Any help overcoming this issue is appreciated!