Memory leakage in object store memory

How severely does this issue affect your experience of using Ray?

  • High: It blocks me from completing my task.

In my project, to pass images from one actor to another, I put them into the object store via ray.put() and pass the reference to the other actor. I have noticed that, by default, the image data is not deleted from the object store even after all of the reference handles are deleted. So I tried to delete it manually via ray.internal.free(). With this, it works fine under a single-camera load. But when I feed data to the actor from multiple cameras, even ray.internal.free() stops working and memory spilling starts.

I am getting the following messages:

(raylet) Spilled 12733 MiB, 2153 objects, write throughput 850 MiB/s.
Local object store memory usage:

(global lru) capacity: 9824364134
(global lru) used: 0.510488%
(global lru) num objects: 49
(global lru) num evictions: 1074
(global lru) bytes evicted: 5115151614

Can anyone explain what exactly num_evictions represents? And how can I trace back the memory leak in my code?

Hi @shyampatel, do you have a simple script for your workload so we can give it a try?

Can anyone explain what exactly num_evictions represents

It means objects were evicted due to memory pressure. You can find more details here: Ray v2 Architecture - Google Docs

Thanks for your answer.

I could not reproduce this with a simple script. As I said, it behaves strangely when I put more load (multiple camera inputs) on my pipeline.

Evicted from the object store and stored in storage memory, right?

Evicted from the object store and stored in storage memory, right?

Not really. You might have a copy of your object in the object store that is not actively being used. In that case, if memory pressure is high, it will be evicted, meaning it is removed from memory on the local node. When it is needed again, it will be copied over from a remote node.


@yic Can you check the thread: Memory leakage with ray.put()? The two are related.