Ray client: Can't pickle custom serialized object

Ray 1.9.2 and Python 3.7.12

Using Ray's recommended custom serialization to serialize an object. It works in single-node standalone mode but throws a can't-pickle exception in cluster mode.

I tried to run a job via Ray Client against an existing Ray cluster in Kubernetes.
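For context, the client-side setup looks roughly like the minimal sketch below; the cluster address and the MyObject class are placeholders, not the actual code.

```python
import ray


class MyObject:
    """Placeholder for the object that needs the custom serializer."""


# Connect to the existing Kubernetes cluster through Ray Client
# (the address is a placeholder for the real head-node service).
ray.init("ray://<head-node-address>:10001")

# ray.put is where the client-side serialization happens and where
# the "can't pickle Context objects" error is raised.
obj_ref = ray.put(MyObject())
```

The call fails on the client with: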

ERROR worker.py:409 -- Failed to deserialize b"\x80\x05\x95Q\x00\x00\x00\x00\x00\x00\x00\x8c\x08builtins\x94\x8c\tTypeError\x94\x93\x94\x8c0can't pickle Context objects\x94\x85\x94R\x94."
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/ray/util/client/worker.py", line 407, in _put_pickled
    raise cloudpickle.loads(resp.error)
TypeError: can't pickle Context objects

Any suggestions to resolve this issue? Thanks.

Hi @mmuru,

My post was hidden but reinstated today, and I had exactly the same problem: [Dask on Ray] Parallelizing Rasa's DaskGraphRunner - Problem with serializing SQLAlchemy objects.

I needed to write custom serializers that skip the DB connections, because they are not serializable.

You probably also want to check serializability with the inspect_serializability function (Serialization — Ray v1.9.2).
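For what it's worth, here is a minimal sketch of using inspect_serializability to pinpoint the unpicklable attribute; the Holder class and its sqlite3 connection are made-up stand-ins for the real object:

```python
import sqlite3

from ray.util import inspect_serializability


class Holder:
    """Hypothetical object holding an unpicklable DB connection."""

    def __init__(self):
        self.conn = sqlite3.connect(":memory:")  # connections can't be pickled


# Prints a tree of the object's attributes and flags the ones that fail to pickle.
inspect_serializability(Holder(), name="Holder")
```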

Hi @toza-mimoza, thanks for your prompt response and direction. Yes, I had already checked serializability with inspect_serializability, and I implemented custom serialization using option 1 (defining __reduce__ on the class; a sketch of that pattern is below). I noticed the problem in the driver code when it goes through the Ray client, whereas the same custom serializer code works if I run it directly on the Ray head node. My understanding is that the Ray driver runs the serialization helper and the object then gets stored via ray.put; instead, the Ray client throws the error above.
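For illustration, a minimal sketch of that option-1 pattern, assuming a made-up Holder class whose DB connection can be rebuilt from its path on deserialization:

```python
import sqlite3


class Holder:
    """Hypothetical wrapper around an unpicklable DB connection."""

    def __init__(self, db_path=":memory:"):
        self.db_path = db_path
        self.conn = sqlite3.connect(db_path)  # not picklable

    def __reduce__(self):
        # Serialize only the picklable state; the connection is recreated
        # from db_path when the object is deserialized.
        return (self.__class__, (self.db_path,))
```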
I will try custom serialization option 2 (registering an external serializer) and see if it works; a sketch of that approach follows.
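A rough sketch of option 2, registering a serializer/deserializer pair externally via ray.util.register_serializer; the Holder class is again just an illustrative placeholder:

```python
import sqlite3

import ray
from ray.util import register_serializer


class Holder:
    """Hypothetical wrapper around an unpicklable DB connection."""

    def __init__(self, db_path=":memory:"):
        self.db_path = db_path
        self.conn = sqlite3.connect(db_path)


ray.init()

# Option 2: register the (de)serializers without modifying the class itself.
register_serializer(
    Holder,
    serializer=lambda h: h.db_path,          # keep only the picklable state
    deserializer=lambda path: Holder(path),  # rebuild the connection on load
)

ref = ray.put(Holder())
print(ray.get(ref).db_path)
```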
