How severely does this issue affect your experience of using Ray?
- High: It blocks me from completing my task.
I am seeing a memory leak in my pipeline. To analyse it, I am using memray (a Python module) in live mode. During the analysis I found a method call whose allocation count keeps increasing, but memray shows no Location for it; see the yellow rectangle in the attached picture.
Can anyone help me find the location of that method? Please let me know if I can provide any further information.
Maybe these are native calls (C++ calls). Can you run memray run --native
? It may reveal the ???.
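A sketch of what that invocation might look like, assuming the pipeline's entry point is a script (the script and output file names here are hypothetical):

```shell
# Capture native (C/C++) stack frames alongside Python frames;
# records are written to output.bin instead of streamed live.
memray run --native -o output.bin your_script.py

# Afterwards, render the capture to inspect where allocations originate.
memray flamegraph output.bin
```

Note that `--native` resolves native frames at report-generation time, which is why it pairs naturally with file-based capture rather than live mode.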
Thanks @Ruiyang_Wang for your support. I have tried the --native
option and it indeed helps in resolving the ???
. The issue is that it does not give stats (memory, allocations, etc.) with this option enabled.
I am running live mode remotely via the following code lines:

```python
import memray

# Stream allocation records to port 6121 on the local machine
destination = memray.SocketDestination(server_port=6121, address="127.0.0.1")
# Entered manually (instead of a `with` block) so the tracker stays active
# for the lifetime of the process
memray.Tracker(destination=destination, trace_python_allocators=True, native_traces=True).__enter__()
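For reference, my understanding is that with a SocketDestination on port 6121, the live TUI is attached from another terminal on the same machine like this (assuming the default memray CLI):

```shell
# Connect the memray live viewer to the port the Tracker is serving on
memray live 6121
```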
I guess it’s not supported in live mode. Can you confirm this? It would also be helpful if you could suggest another method to capture those native calls or to find the memory leak.
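As one alternative while the native/live question is open, the standard-library tracemalloc can compare snapshots and report which source lines grew between them. A minimal sketch (the `leaky` function is a stand-in for the real pipeline step); note that tracemalloc only sees allocations made through Python's allocator, so raw C++ allocations remain invisible to it:

```python
import tracemalloc

# Keep up to 10 stack frames per allocation for more useful tracebacks
tracemalloc.start(10)

def leaky():
    # Placeholder for the real workload suspected of leaking
    return [object() for _ in range(10_000)]

before = tracemalloc.take_snapshot()
held = [leaky() for _ in range(5)]  # keep references so memory stays allocated
after = tracemalloc.take_snapshot()

# Print the five source lines whose allocations grew the most
for stat in after.compare_to(before, "lineno")[:5]:
    print(stat)
```

Running the comparison periodically inside a long-lived process can show whether a particular line's allocation count grows without bound.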
Can you try Ray's built-in memray support? In the Ray Dashboard (localhost:8265), find your worker process and click “Memory Profiling”.
@Ruiyang_Wang I am using Ray 2.8.0
, which does not have the mentioned feature.
@shyampatel can you please upgrade to the latest Ray?
Hi @Sam_Chan, I am working on upgrading the pipeline to ray 2.34
and Python 3.12
. Meanwhile, we were able to reproduce the memory leak via a script with ray==2.8.0
and Python 3.8
. Once the analysis is done, I will create another thread for it.
@Sam_Chan @Ruiyang_Wang I have created a separate thread (Memory leakage with ray.put()) to address the memory leak issue. One thing I can relate back to this thread is that ray.put(object)
is causing the continuous increase in the allocation count of ???
.