Object sharing between separate clients

Hello! If two clients are running as separate processes and one puts
data into the object store, is there a way to use the ObjectRef
from one to access the objects in the store from the other?


import json
import pickle

import ray

@ray.remote
def serialize(value):
    c = ray.put(value)
    print('returned value is %s' % c)
    return c

@ray.remote
def deserialize(value):
    print('value is %s' % value)
    print('type of value is %s' % type(value))
    return ray.get(value)

Process A

data_string = '{"1001": 301.27, "1002": 433.21, "1003": 502.22}'
order_data_dict = json.loads(data_string)
ret_ref = serialize.remote(order_data_dict)
(pid=xyz) returned value is ObjectRef(some_id)

Inside the serialize function, ray.put returns ObjectRef(some_id), but the
value returned to the caller by serialize.remote is a different ref.


The ref is then pickled to send across processes:

ret_p = pickle.dumps(ret_ref)

Process B

ret_p is passed to the process

result = pickle.loads(ret_p)
val = deserialize.remote(result)  # fails

The expected result would be:

{'1001': 301.27, '1002': 433.21, '1003': 502.22}

It seems the ClientObjectRef is specific to the client. Is there a way to get
the underlying object ref so that multiple clients can access the object in
the store, or is there a better way to do this?

Hey @mrrobby ! What if you try using a named actor to act as a key value store?

Thanks @rliaw! I did something like this and it seems to work:

@ray.remote
class RayStore(object):
    def __init__(self):
        self.value = None

    def get(self):
        return self.value

    def put(self, value):
        self.value = value

If lots of processes access an object set this way, could the actor calls add extra overhead, or could this instead save memory by avoiding unnecessary copies of the object?

Hmm, maybe you could do a v = ray.put(self.value); return v inside def get(self). That would allow the object ref to be cached/re-used.

This seems to work. I’ll try this for some of the starting stuff and then we can profile as we move through to see what’s happening. Thank you!
