How severely does this issue affect your experience of using Ray?

  • None: Just asking a question out of curiosity
  • Low: It annoys or frustrates me for a moment.
  • Medium: It causes significant difficulty in completing my task, but I can work around it.
  • High: It blocks me from completing my task.


Hi, now I'm stuck with this error. My function is about 262 MiB and my images are about 1 MB. The problem is that when I `ray.put(image)`, it doesn't work. I think the size of the function is the problem. So what should I do? I can't separate my function.

Hi @potato,

Could you paste the error you saw? Also, why is your function so large? Does its closure capture some large objects?

```python
def for_mrz(cropped_img):
    """ model configuration """

    image_tensors = to_tensor(cropped_img)
    image_tensors = image_tensors.unsqueeze(0)

    # predict
    with torch.no_grad():
        batch_size = image_tensors.size(0)
        image = image_tensors.to(device)
        # For max length prediction
        length_for_pred = torch.IntTensor([opt_mrz.batch_max_length] * batch_size).to(device)
        text_for_pred = torch.LongTensor(batch_size, opt_mrz.batch_max_length + 1).fill_(0).to(device)

        preds = model_mrz(image, text_for_pred, is_train=False)
        # select max probability (greedy decoding), then decode index to character
        _, preds_index = preds.max(2)
        preds_str = converter.decode(preds_index, length_for_pred)
        pred = preds_str[0]

        preds_prob = F.softmax(preds, dim=2)
        preds_max_prob, _ = preds_prob.max(dim=2)

        pred_EOS = pred.find('[s]')
        pred = pred[:pred_EOS]  # prune after "end of sentence" token ([s])

        pred_max_prob = preds_max_prob[0][:pred_EOS]

        # calculate confidence score (= product of pred_max_prob)
        confidence_score = pred_max_prob.cumprod(dim=0)[-1]
        confidence_score = round(confidence_score.item(), 3)
```

The above is my function, and I get this error message:

ValueError: The remote function main.for_mrz is too large (262 MiB > FUNCTION_SIZE_ERROR_THRESHOLD=95 MiB). Check that its definition is not implicitly capturing a large array or other object in scope. Tip: use ray.put() to put large objects in the Ray object store.

I found that the model is what's large, so I made the model a parameter and passed it as for_mrz(cropped_img, model). But that raises this error:

RuntimeError: Attempting to deserialize object on CUDA device 0 but torch.cuda.device_count() is 0. Please use torch.load with map_location to map your storages to an existing device. [repeated 19x across cluster]

So what should I do now?