I'm using Ray's recommended custom serialization to serialize an object. It works in single-node standalone mode but throws a can't-pickle exception in cluster mode.
I tried to run a job via Ray Client against an existing Ray cluster on Kubernetes, and got:
ERROR worker.py:409 -- Failed to deserialize b"\x80\x05\x95Q\x00\x00\x00\x00\x00\x00\x00\x8c\x08builtins\x94\x8c\tTypeError\x94\x93\x94\x8c0can't pickle Context objects\x94\x85\x94R\x94."
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/ray/util/client/worker.py", line 407, in _put_pickled
raise cloudpickle.loads(resp.error)
TypeError: can't pickle Context objects
Hi @toza-mimoza, thanks for your prompt response and direction. Yes, I've already checked serializability with inspect_serializability. I implemented custom serialization using option 1, with __reduce__. I noticed the problem in the driver code submitted via Ray Client; the same custom serializer code works if I run it directly on the Ray head node. My understanding is that the driver runs the serialization helper and the object is then stored via ray.put; when going through Ray Client, however, it throws the error above.
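For reference, here is a minimal sketch of what I mean by option 1. The Connection class and its fields are placeholders standing in for my actual object, not real code from my job; the idea is that __reduce__ tells pickle how to rebuild the object from picklable arguments instead of trying to pickle the unpicklable internals directly.

```python
import pickle
import threading

# Hypothetical stand-in for an object that holds unpicklable state
# (e.g. a lock, a socket, or an SSL context).
class Connection:
    def __init__(self, host):
        self.host = host
        self._lock = threading.Lock()  # a _thread.lock cannot be pickled

    # Option 1: __reduce__ returns (callable, args) so pickle
    # reconstructs the object by calling Connection(host) again,
    # never touching the unpicklable _lock.
    def __reduce__(self):
        return (Connection, (self.host,))

conn = Connection("db.example.com")
restored = pickle.loads(pickle.dumps(conn))
assert restored.host == "db.example.com"
```

Without the __reduce__ method, pickle.dumps(conn) would raise a TypeError on the lock, which is the same failure shape as the error above.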
I will try custom serialization option 2 and see if it works.