Configuring object spilling to another folder: still full!

My Ray TensorFlow model-building job reboots my machine frequently, which obviously isn’t ideal :rofl:

Running on macOS.

So I am trying to pinpoint what the problem might be.

I noticed in the logs, I have many instances of this:

(raylet) [2024-04-03 11:07:00,175 E 46257 21929133] (raylet) file_system_monitor.cc:111: /tmp/ray/session_2024-04-03_11-06-48_291817_46247 is over 95% full, available space: 190828212224; capacity: 3996329328640. Object creation will fail if spilling is required.

After a bit of digging, I managed to relocate my spill folder like this:

import json
import os

import ray

script_dir = os.path.dirname(os.path.abspath(__file__))
storage_path = os.path.join(script_dir, "storage_path")
os.makedirs(storage_path, exist_ok=True)

ray.init(
    _system_config={
        "object_spilling_config": json.dumps(
            {"type": "filesystem", "params": {"directory_path": storage_path}},
        )
    },
)

The message now reflects the new folder, but it is still reporting “over 95% full”, which is not even approaching true.

What might be the issue?
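Side note: while digging I also came across what looks like a tunable raylet setting, `local_fs_capacity_threshold`, which seems to control the 95% cutoff. I have not verified it against this exact Ray version, so treat this as a sketch rather than a confirmed workaround:

```python
# Assumption: Ray's _system_config exposes local_fs_capacity_threshold,
# the fraction of the disk that may fill before object creation fails.
# Unverified against the Ray version in this thread.
ray.init(
    _system_config={
        "local_fs_capacity_threshold": 0.99,  # warn/fail at 99% instead of 95%
    },
)
```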

Hi @Mark_Norgate, how do you know it’s not even approaching true? How did you check the available space and total capacity of your spill directory?
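For what it’s worth, the two byte counts in the warning you pasted already answer part of that: taking the numbers straight from the log line, only about 4.8% of the volume is free, so the monitor’s “over 95% full” claim is internally consistent. A quick check:

```python
# Figures copied verbatim from the raylet warning in the original post.
available_bytes = 190_828_212_224   # "available space"
capacity_bytes = 3_996_329_328_640  # "capacity"

free_fraction = available_bytes / capacity_bytes
print(f"free: {free_fraction:.1%}, used: {1 - free_fraction:.1%}")
# prints "free: 4.8%, used: 95.2%"
```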

Well, it’s a local folder… I know how much space is available on my machine. I don’t recall ever configuring a limit on a folder.

How is “free space” calculated? Certainly I have 20% of my SSD free.
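One plausible explanation (an assumption about the monitor’s internals, not something the logs confirm): disk monitors typically query the `statvfs` family of OS APIs, which Python’s `shutil.disk_usage` wraps, and those report on the whole filesystem that *contains* a path, not the folder itself. So relocating the spill directory to another path on the same disk changes nothing. On macOS/APFS there is an extra wrinkle: space held by purgeable data (snapshots, caches) is not counted as available by these APIs, so they can report far less free space than Finder shows. You can compare directly:

```python
import shutil

# Any two paths on the same volume report identical totals, because
# disk_usage() asks the OS about the whole filesystem, not the folder.
for path in ("/", "/tmp"):
    usage = shutil.disk_usage(path)
    used_pct = usage.used / usage.total * 100
    print(f"{path}: {used_pct:.1f}% used, {usage.free / 1e9:.0f} GB free")
```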