I have a Django app on Cloud Run and I'd like to create an endpoint that will be called by another Python script. This endpoint should save files to Google Cloud Storage. The files are up to 800 MB.
When I try to do this I receive: 413 Request Entity Too Large.
From digging around the internet I understood that I should upload the file in chunks, but there is something I do not understand.
From this issue: https://github.com/django/daphne/issues/126 I understand that Daphne is now able to receive a large request body. So I thought that, even when receiving a big file, Django would manage to chunk it and process it piece by piece.
Is there any way to do what I want other than chunking the file manually?
For now I added this to my settings:
GS_BLOB_CHUNK_SIZE = 524288
DATA_UPLOAD_MAX_MEMORY_SIZE = 26214400
FILE_UPLOAD_MAX_MEMORY_SIZE = 26214400
and I simply use generics.ListCreateAPIView with the default file upload handlers.
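Roughly, the view looks like this (UploadedFile and UploadedFileSerializer are placeholder names):

```python
# views.py - rough sketch of the current setup; model and serializer
# names are placeholders.
from rest_framework import generics

from .models import UploadedFile
from .serializers import UploadedFileSerializer


class FileUploadView(generics.ListCreateAPIView):
    queryset = UploadedFile.objects.all()
    serializer_class = UploadedFileSerializer
```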
Generally, a 413 error means a size limit on the request has been exceeded. For Cloud Run, the request size limit is 32 MB. According to the documentation, the recommended way of uploading large files is to provide a signed URL for the Cloud Storage bucket, since signed URLs can be used for resumable uploads:
Resumable uploads are the recommended method for uploading large files, because you do not have to restart them from the beginning if there is a network failure while the upload is underway.
You can generate a signed URL from your backend and then upload the file from your client-side script directly to Cloud Storage, bypassing the Cloud Run request size limit. There appear to be other related questions in which Django servers on Cloud Run hit upload limits, and the use of signed URLs is recommended to deal with these cases.
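As a rough sketch (assuming the google-cloud-storage client library, a GS_BUCKET_NAME setting, and a hypothetical SignedUploadURLView endpoint), the backend can sign a URL that starts a resumable upload:

```python
# views.py - a minimal sketch, not a drop-in solution; the endpoint,
# setting, and "uploads/" prefix are assumptions.
import datetime

from django.conf import settings
from google.cloud import storage
from rest_framework.response import Response
from rest_framework.views import APIView


class SignedUploadURLView(APIView):
    def post(self, request):
        filename = request.data["filename"]

        client = storage.Client()
        bucket = client.bucket(settings.GS_BUCKET_NAME)
        blob = bucket.blob(f"uploads/{filename}")

        # Sign a POST request that initiates a resumable upload session.
        # Note: signing requires credentials that can sign (a service
        # account key, or the IAM signBlob flow when running on Cloud Run).
        url = blob.generate_signed_url(
            version="v4",
            expiration=datetime.timedelta(minutes=15),
            method="POST",
            headers={"x-goog-resumable": "start"},
        )
        return Response({"upload_url": url})
```

The calling Python script then initiates the resumable session against Cloud Storage and uploads the file directly, so the 800 MB payload never passes through Cloud Run (service URL and endpoint path below are placeholders):

```python
# upload_client.py - hypothetical calling script using the requests library.
import requests

# 1) Ask the Django backend for a signed URL.
resp = requests.post(
    "https://my-service-xxxx.a.run.app/signed-upload-url/",
    json={"filename": "big_file.bin"},
)
upload_url = resp.json()["upload_url"]

# 2) Initiate the resumable upload session; Cloud Storage returns the
#    session URI in the Location header.
session = requests.post(upload_url, headers={"x-goog-resumable": "start"})
session.raise_for_status()
session_uri = session.headers["Location"]

# 3) Stream the file to the session URI (a single PUT here; it can also
#    be sent in chunks and resumed after a failure).
with open("big_file.bin", "rb") as f:
    requests.put(session_uri, data=f).raise_for_status()
```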