I'm deploying a Django app with gunicorn, nginx and supervisor.
I currently run the background workers using celery:
$ python manage.py celery worker
This is my gunicorn configuration:
#!/bin/bash
NAME="hello_app" # Name of the application
DJANGODIR=/webapps/hello_django/hello # Django project directory
SOCKFILE=/webapps/hello_django/run/gunicorn.sock # we will communicate using this unix socket
USER=hello # the user to run as
GROUP=webapps # the group to run as
NUM_WORKERS=3 # how many worker processes should Gunicorn spawn
DJANGO_SETTINGS_MODULE=hello.settings # which settings file should Django use
DJANGO_WSGI_MODULE=hello.wsgi # WSGI module name
echo "Starting $NAME as `whoami`"
# Activate the virtual environment
cd $DJANGODIR
source ../bin/activate
export DJANGO_SETTINGS_MODULE=$DJANGO_SETTINGS_MODULE
export PYTHONPATH=$DJANGODIR:$PYTHONPATH
# Create the run directory if it doesn't exist
RUNDIR=$(dirname $SOCKFILE)
test -d $RUNDIR || mkdir -p $RUNDIR
# Start your Django Unicorn (Gunicorn)
# Programs meant to be run under supervisor should not daemonize themselves (do not use --daemon)
exec ../bin/gunicorn ${DJANGO_WSGI_MODULE}:application \
  --name $NAME \
  --workers $NUM_WORKERS \
  --user=$USER --group=$GROUP \
  --log-level=debug \
  --bind=unix:$SOCKFILE
Is there a way to run the celery background workers under gunicorn as well? Or are gunicorn and celery two entirely different things?
Celery and gunicorn are different things: celery is an asynchronous task queue, and gunicorn is a WSGI HTTP server. You can't run celery workers under gunicorn; each runs as its own process. You can, however, run both in the background (celeryd is the traditional way to daemonize celery), pointing each at your Django project.
A common way to manage both is with supervisor, which keeps them running after you log out of the server and restarts them if they crash. The celery GitHub repo has some sample scripts for running celery under supervisor.
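For example, a minimal supervisor setup for both processes might look like the sketch below. The program names, the config path, the logs directory, and the gunicorn_start script name are illustrative assumptions; the virtualenv and project paths are taken from your script above.

; /etc/supervisor/conf.d/hello.conf  (illustrative path and names)
[program:hello_gunicorn]
; assumes you saved the script above as /webapps/hello_django/bin/gunicorn_start
command = /webapps/hello_django/bin/gunicorn_start
user = hello
stdout_logfile = /webapps/hello_django/logs/gunicorn.log   ; assumed logs dir
redirect_stderr = true

[program:hello_celery]
; the same django-celery worker you run by hand, via the virtualenv's python
command = /webapps/hello_django/bin/python /webapps/hello_django/hello/manage.py celery worker --loglevel=info
directory = /webapps/hello_django/hello
user = hello
stdout_logfile = /webapps/hello_django/logs/celery.log     ; assumed logs dir
redirect_stderr = true
stopwaitsecs = 60    ; give running tasks time to finish on shutdown

Then tell supervisor about the new config (update starts newly added programs by default) and verify both are running:

$ sudo supervisorctl reread
$ sudo supervisorctl update
$ sudo supervisorctl status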