Short question version: what am I doing wrong in my Daphne config, or my Consumer code, or my client code?
channels==1.1.8
daphne==1.3.0
Django==1.11.7
Details below:
I am trying to keep a persistent WebSocket connection open using Django Channels and the Daphne interface server. I am launching Daphne with mostly default arguments: daphne -b 0.0.0.0 -p 8000 my_app.asgi:channel_layer.
I am seeing the connections close after some idle time in the browser, shortly over 20 seconds. The CloseEvent sent with the disconnect has a code value of 1006 (Abnormal Closure), no reason set, and wasClean set to false. This should mean the server closed the connection without sending an explicit close frame.
The Daphne CLI has --ping-interval and --ping-timeout flags with default values of 20 and 30 seconds, respectively. These are documented as "The number of seconds a WebSocket must be idle before a keepalive ping is sent" for the former, and "The number of seconds before a WebSocket is closed if no response to a keepalive ping" for the latter. I read this as: Daphne waits until a WebSocket has been idle for 20 seconds before sending a ping, and closes the WebSocket if no response arrives within 30 seconds of that ping. What I am seeing instead is connections being closed after 20 seconds of idle time. (Across three attempts with the defaults, the connections closed after 20081 ms, 20026 ms, and 20032 ms.)
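If I understand the documented defaults correctly, the launch command above should be equivalent to spelling those two flags out explicitly (shown here only to make the default values visible):

daphne -b 0.0.0.0 -p 8000 --ping-interval 20 --ping-timeout 30 my_app.asgi:channel_layer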
If I change the server to launch with daphne -b 0.0.0.0 -p 8000 --ping-interval 10 --ping-timeout 60 my_app.asgi:channel_layer, the connections still close after around 20 seconds of idle time. (Across three attempts with the updated ping values, the connections closed after 19892 ms, 20011 ms, and 19956 ms.)
Code below:
consumer.py:
import logging

from channels import Group
from channels.generic.websockets import JsonWebsocketConsumer

from my_app import utilities

logger = logging.getLogger(__name__)


class DemoConsumer(JsonWebsocketConsumer):
    """
    Consumer echoes the incoming message to all connected WebSockets,
    and attaches the username to the outgoing message.
    """
    channel_session = True
    http_user_and_session = True

    @classmethod
    def decode_json(cls, text):
        return utilities.JSONDecoder.loads(text)

    @classmethod
    def encode_json(cls, content):
        return utilities.JSONEncoder.dumps(content)

    def connection_groups(self, **kwargs):
        return ['demo']

    def connect(self, message, **kwargs):
        super(DemoConsumer, self).connect(message, **kwargs)
        logger.info('Connected to DemoConsumer')

    def disconnect(self, message, **kwargs):
        super(DemoConsumer, self).disconnect(message, **kwargs)
        logger.info('Disconnected from DemoConsumer')

    def receive(self, content, **kwargs):
        super(DemoConsumer, self).receive(content, **kwargs)
        content['user'] = self.message.user.username
        # echo back content to all groups
        for group in self.connection_groups():
            self.group_send(group, content)
routing.py:
from channels.routing import route

from . import consumers

channel_routing = [
    consumers.DemoConsumer.as_route(path=r'^/demo/'),
]
demo.js:
// Tracks the cursor and sends position via a WebSocket
// Listens for updated cursor positions and moves an icon to that location
$(function () {
    var socket = new WebSocket('ws://' + window.location.host + '/demo/');
    var icon;
    var moveTimer = null;
    var position = {x: null, y: null};
    var openTime = null;
    var lastTime = null;

    function sendPosition() {
        if (socket.readyState === socket.OPEN) {
            console.log('Sending ' + position.x + ', ' + position.y);
            socket.send(JSON.stringify(position));
            lastTime = Date.now();
        } else {
            console.log('Socket is closed');
        }
        // sending at most 20Hz
        setTimeout(function () { moveTimer = null; }, 50);
    }

    socket.onopen = function (e) {
        var box = $('#websocket_box');
        icon = $('<div class="pointer_icon"></div>').insertAfter(box);
        box.on('mousemove', function (me) {
            // some browsers will generate these events much closer together;
            // rather than overwhelm the server, batch them up and send at a reasonable rate
            if (moveTimer === null) {
                moveTimer = setTimeout(sendPosition, 0);
            }
            position.x = me.offsetX;
            position.y = me.offsetY;
        });
        openTime = lastTime = Date.now();
    };

    socket.onclose = function (e) {
        console.log("!!! CLOSING !!! " + e.code + " " + e.reason + " --" + e.wasClean);
        console.log('Time since open: ' + (Date.now() - openTime) + 'ms');
        console.log('Time since last: ' + (Date.now() - lastTime) + 'ms');
        icon.remove();
    };

    socket.onmessage = function (e) {
        var msg, box_offset;
        console.log(e);
        msg = JSON.parse(e.data);
        box_offset = $('#websocket_box').offset();
        if (msg && Number.isFinite(msg.x) && Number.isFinite(msg.y)) {
            console.log((msg.x + box_offset.left) + ', ' + (msg.y + box_offset.top));
            icon.offset({
                left: msg.x + box_offset.left,
                top: msg.y + box_offset.top
            }).text(msg.user || '');
        }
    };
});
asgi.py:
import os
from channels.asgi import get_channel_layer
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "my_project.settings")
channel_layer = get_channel_layer()
settings.py:
CHANNEL_LAYERS = {
    'default': {
        'BACKEND': 'asgi_redis.RedisChannelLayer',
        'ROUTING': 'main.routing.channel_routing',
        'CONFIG': {
            'hosts': [
                'redis://redis:6379/2',
            ],
            'symmetric_encryption_keys': [
                SECRET_KEY,
            ],
        },
    },
}
The underlying problem turned out to be the nginx proxy in front of the interface server. The proxy was configured with proxy_read_timeout 20s;. Even if keepalive pings were being generated by the server, they were not being counted toward the upstream read timeout. Increasing this timeout to a larger value allows the WebSocket to stay open longer. I kept proxy_connect_timeout and proxy_send_timeout at 20s.
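For anyone else hitting the same symptom, here is a minimal sketch of the relevant nginx location block. The location path, upstream address, and the 3600s value are placeholders for my setup rather than recommendations; the key change is raising proxy_read_timeout above Daphne's ping interval so the proxy stops cutting off idle WebSockets.

# nginx reverse-proxy sketch for Daphne WebSockets (path and upstream are placeholders)
location /demo/ {
    proxy_pass http://127.0.0.1:8000;

    # required for the WebSocket upgrade to pass through nginx
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";
    proxy_set_header Host $host;

    # left at their original values
    proxy_connect_timeout 20s;
    proxy_send_timeout 20s;

    # the fix: allow more than 20s of idle time between frames from the
    # upstream (Daphne) before nginx closes the connection
    proxy_read_timeout 3600s;
}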