
Why can't Django Channels Daphne use multi-threading to process request concurrently?


I am aware of Python's GIL, and that threading in Python isn't as easy as spawning a goroutine in Go. However, it seems to me that Ruby was able to pull it off with Puma and Unicorn to achieve concurrency with multi-threading. My question is actually two-fold; my experience is limited to Django Channels' Daphne.

  1. Besides Daphne, what other choices of web server are multi-threaded, like Puma and Unicorn in Rails?

  2. From Daphne's documentation, I learned that parallelism is achieved by spawning new processes (workers):

    Because the work of running consumers is decoupled from the work of talking to HTTP, WebSocket and other client connections, you need to run a cluster of “worker servers” to do all the processing. Each server is single-threaded, so it’s recommended you run around one or two per core on each machine; it’s safe to run as many concurrent workers on the same machine as you like, as they don’t open any ports (all they do is talk to the channel backend).

As stated, each worker is single-threaded. When it makes a blocking I/O call, the worker is blocked completely. My question is: why can't Daphne spawn multiple threads to handle requests? When one thread is blocked on I/O, e.g. a database access, the CPU switches to another thread until the first one is unblocked. Similarly, Node.js is single-threaded but does concurrency really well through non-blocking I/O. Why is it difficult to achieve the same feat in Python (besides the fact that it lacks a good event loop)?
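For reference, CPython threads do already overlap during blocking I/O, because the GIL is released while a thread waits. A small sketch, using `time.sleep` as a stand-in for a blocking database call:

```python
import threading
import time

def fetch(name, results):
    # time.sleep stands in for blocking I/O (e.g. a database query);
    # the GIL is released while a thread waits, so other threads run
    time.sleep(0.2)
    results.append(name)

results = []
start = time.monotonic()
threads = [threading.Thread(target=fetch, args=(f"req-{i}", results))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
elapsed = time.monotonic() - start

# Four 0.2s waits overlap instead of summing to 0.8s
print(f"{len(results)} requests finished in {elapsed:.2f}s")
```

So the GIL alone doesn't rule out a threaded server; the question is why Daphne's design doesn't use one.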


Solution

  • Right now, uvicorn is the only production-ready alternative to Daphne that supports multi-processing.

    $ pip install uvicorn
    
    $ uvicorn avilpage.asgi --workers 4
    

    This starts the server with 4 worker processes.
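    Here `avilpage.asgi` is the project's ASGI module, but any ASGI callable works the same way. As a stand-alone illustration (the file name `app.py` is just an example), a minimal raw ASGI application that uvicorn can serve, with no Django involved, looks roughly like this:

    ```python
    # app.py -- minimal ASGI application; run with:
    #   uvicorn app:application --workers 4
    async def application(scope, receive, send):
        # Only handle plain HTTP requests in this sketch
        assert scope["type"] == "http"
        await send({
            "type": "http.response.start",
            "status": 200,
            "headers": [(b"content-type", b"text/plain")],
        })
        await send({
            "type": "http.response.body",
            "body": b"Hello, ASGI!",
        })
    ```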

    Since daphne/uvicorn use asyncio for multitasking, I'd say multi-threading doesn't make sense: the event loop already interleaves requests whenever one of them is waiting on I/O, without the overhead of threads.
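    To illustrate the point (a minimal sketch, not Daphne's or uvicorn's internals): a single-threaded asyncio event loop keeps many requests in flight at once, because a coroutine suspended on I/O yields control to the others.

    ```python
    import asyncio

    async def handler(name):
        # asyncio.sleep stands in for awaiting a socket or database;
        # while one coroutine is suspended, the event loop runs the rest
        await asyncio.sleep(0.2)
        return name

    async def main():
        # 100 "requests" complete in roughly 0.2s total, on one thread
        return await asyncio.gather(*(handler(f"req-{i}") for i in range(100)))

    results = asyncio.run(main())
    print(len(results))  # → 100
    ```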