aiohttp.web_exceptions.HTTPRequestEntityTooLarge

```
RuntimeError: Request failed with status code 500: Traceback (most recent call last):
  File "/home/ray/anaconda3/lib/python3.7/site-packages/ray/dashboard/modules/job/job_head.py", line 269, in upload_package
    upload_package_to_gcs(package_uri, await req.read())
  File "/home/ray/anaconda3/lib/python3.7/site-packages/aiohttp/web_request.py", line 655, in read
    max_size=self._client_max_size, actual_size=body_size
aiohttp.web_exceptions.HTTPRequestEntityTooLarge: Request Entity Too Large
```

What is the max size for the request? There is a 40 MB CSV file that I currently copy into the working directory, which the Python code later uses to retrieve training data. Is that an anti-pattern? I guess the best approach is to put the files in a cloud bucket so they don't need to be copied from the process where the job is submitted.
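(For context on the limit itself: the check in the traceback is aiohttp's per-request body cap, which a server sets when its web application is created. A minimal sketch of the mechanism, not necessarily Ray's actual configuration:)

```python
from aiohttp import web

# aiohttp raises HTTPRequestEntityTooLarge whenever a request body
# exceeds client_max_size; the library default is 1 MiB (1024**2 bytes),
# and a server opts into larger uploads by passing a bigger value.
app = web.Application(client_max_size=100 * 1024**2)  # hypothetical 100 MiB cap
```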

For large files, a bucket is recommended.
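A minimal sketch of that pattern, assuming an S3 bucket with boto3 and pandas available on the cluster (the bucket, key, and function names are hypothetical):

```python
import boto3
import pandas as pd

def load_training_data() -> pd.DataFrame:
    # Fetch the 40 MB CSV from object storage at runtime instead of
    # shipping it inside the job submission's working_dir upload.
    s3 = boto3.client("s3")
    s3.download_file("my-training-bucket", "data/train.csv", "/tmp/train.csv")
    return pd.read_csv("/tmp/train.csv")
```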

Besides that, @architkulkarni @cade, I think runtime environments support up to 250 MB, so jobs should support that as well.

This size should be fine for job submission. Can you share a script that reproduces the problem, @Y_C? I want to see if there's anything special causing this.
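(Roughly, a submission in this shape, via the Python client with a working_dir containing the large file, is what would hit the upload path in the traceback above; the address and entrypoint here are hypothetical:)

```python
from ray.job_submission import JobSubmissionClient

# The working_dir is zipped and uploaded to the head node over HTTP,
# so a directory holding the large CSV drives up the request size.
client = JobSubmissionClient("http://127.0.0.1:8265")
job_id = client.submit_job(
    entrypoint="python train.py",
    runtime_env={"working_dir": "./"},
)
print(job_id)
```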

I think it's because the total directory size exceeds 100 MB (it says so here in the docs); when I removed some files, it worked.
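An alternative to deleting files is to leave them out of the upload with the runtime_env `excludes` field (gitignore-style patterns) and fetch them from a bucket as suggested above; a sketch with hypothetical patterns:

```python
runtime_env = {
    "working_dir": "./",
    # Keep large data files out of the zipped upload;
    # patterns follow .gitignore syntax.
    "excludes": ["*.csv", "data/"],
}
```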
