RuntimeError: Request failed with status code 500: Traceback (most recent call last):
File "/home/ray/anaconda3/lib/python3.7/site-packages/ray/dashboard/modules/job/job_head.py", line 269, in upload_package
upload_package_to_gcs(package_uri, await req.read())
File "/home/ray/anaconda3/lib/python3.7/site-packages/aiohttp/web_request.py", line 655, in read
aiohttp.web_exceptions.HTTPRequestEntityTooLarge: Request Entity Too Large
What is the maximum request size? I'm currently copying a 40 MB CSV file that the Python code later uses to retrieve training data. Is this an anti-pattern? I suspect the best approach is to put the files in a cloud bucket so they don't need to be copied from the process that submits the job.