I have a Django web application deployed in production behind Caddy. Caddy acts as a reverse proxy in front of Daphne, which in turn serves my Django app. However, when I try to upload a 5 MB file through the Django admin in production, I get a 413 (Request Entity Too Large) error. In debug mode, when I run only Django (without Caddy or Daphne), I do not get this error. Does anyone have any ideas? Here are my Caddyfile and related files:
0.0.0.0:2015
on startup daphne peptidedb.asgi:application &
header / {
    -Server
    # be sure to plan & test before enabling
    # Strict-Transport-Security "max-age=63072000; includeSubDomains; preload"
    Referrer-Policy "same-origin"
    X-XSS-Protection "1; mode=block"
    X-Content-Type-Options "nosniff"
    # customize for your app
    #Content-Security-Policy "connect-src 'self'; default-src 'none'; font-src 'self'; form-action 'self'; frame-ancestors 'none'; img-src data: 'self'; object-src 'self'; style-src 'self'; script-src 'self';"
    X-Frame-Options "DENY"
}
proxy / localhost:8000 {
    transparent
    websocket
    except /static
}
limits 750000000
log / stdout "{combined}"
errors stdout
asgi.py
import os
from channels.routing import get_default_application
import django
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "peptidedb.settings")
django.setup()
application = get_default_application()
wsgi.py
import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "peptidedb.settings")
application = get_wsgi_application()
It looks like that, when the Django app is deployed with Channels, Daphne, and Caddy, this setting in settings.py takes effect:
DATA_UPLOAD_MAX_MEMORY_SIZE = 1024 # value in bytes
I had to add this setting to my settings file, with a value larger than the files I upload (the default is 2,621,440 bytes, i.e. 2.5 MB, which my 5 MB upload apparently exceeded under Daphne), and then the larger file upload works. The weird part is that I did not need this setting when the app was deployed with only Django in debug mode. I wonder whether my app, when running inside the Docker container, is unable (permissions? size?) to write/stream the big file to disk, which is Django's normal behavior for large uploads instead of keeping them in memory.
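For anyone hitting the same thing, here is a minimal sketch of the upload-related settings I am talking about; the 10 MB figures are illustrative assumptions, not the exact values from my settings file:

# settings.py (sketch; the 10 MB values are just examples)

# Maximum request body size (in bytes) Django will parse before rejecting
# the request; the default is 2621440 (2.5 MB).
DATA_UPLOAD_MAX_MEMORY_SIZE = 10 * 1024 * 1024

# Uploaded files larger than this (in bytes) are streamed to a temporary
# file on disk instead of being kept in memory; the default is also 2.5 MB.
FILE_UPLOAD_MAX_MEMORY_SIZE = 10 * 1024 * 1024

# Optional: where those temporary files are written. If unset, Django uses
# the system temp directory, which must be writable inside the container.
# FILE_UPLOAD_TEMP_DIR = "/tmp"

And if the disk-streaming theory is right (this is just my guess, not something I have verified), a quick way to test it from inside the container is to check whether the temp directory Django would use is writable:

import os
import tempfile

# TemporaryFileUploadHandler writes to FILE_UPLOAD_TEMP_DIR if it is set,
# otherwise to the system temp directory reported here.
tmp = tempfile.gettempdir()
print(tmp, "writable:", os.access(tmp, os.W_OK))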