So, I've started using Tornado for my asynchronous socket server and everything looked fine until I discovered the strange read_bytes(num_bytes) method.
Because I have to read UTF strings written by Java's OutputStream, I had to re-write a "parser" in Python, and this is what the code looks like right now:
def read_utf(self, callback):
    def _utf_length(data):
        self.stream.read_bytes(data, _read_utf)

    def _read_utf(data):
        callback(struct.unpack('>H', data)[0])

    self.stream.read_bytes(2, _utf_length)
But... it doesn't work. This is what the traceback looks like:
Traceback (most recent call last):
  File "C:\Python27\lib\site-packages\tornado\ioloop.py", line 600, in _run_callback
    ret = callback()
  File "C:\Python27\lib\site-packages\tornado\stack_context.py", line 275, in null_wrapper
    return fn(*args, **kwargs)
  File "C:\Python27\lib\site-packages\tornado\iostream.py", line 554, in wrapper
    return callback(*args)
  File "C:\Python27\lib\site-packages\tornado\stack_context.py", line 275, in null_wrapper
    return fn(*args, **kwargs)
  File "..\streams.py", line 57, in _utf_length
    self.stream.read_bytes(data, _read_utf)
  File "C:\Python27\lib\site-packages\tornado\iostream.py", line 312, in read_bytes
    assert isinstance(num_bytes, numbers.Integral)
AssertionError
I tried to use self.stream.read_bytes(int(data), _read_utf), but that didn't work either, because the string itself is "empty".
What can I do at this point?
You have to use struct.unpack on the data you receive in _utf_length. _read_utf then gets the real data, which you pass to your own callback:
def read_utf(self, callback):
    def _utf_length(data):
        length = struct.unpack('>H', data)[0]
        self.stream.read_bytes(length, _read_utf)

    def _read_utf(data):
        callback(data)

    self.stream.read_bytes(2, _utf_length)
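To see why the unpack step matters: the two bytes on the wire are a big-endian unsigned short, and struct.unpack turns them into the plain integer that read_bytes requires (passing the raw bytes is what triggered your AssertionError). A minimal illustration, with a made-up prefix:

```python
import struct

# Hypothetical two-byte, big-endian length prefix, as the Java side
# would send ahead of a 5-byte string
prefix = b'\x00\x05'

# '>H' = big-endian unsigned 16-bit integer
length = struct.unpack('>H', prefix)[0]
print(length)  # 5
```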
Also consider writing this as a coroutine; it can be easier to follow than a series of callbacks:
@tornado.gen.coroutine
def read_utf(self):
    length_data = yield self.stream.read_bytes(2)
    length = struct.unpack('>H', length_data)[0]
    data = yield self.stream.read_bytes(length)
    raise tornado.gen.Return(data)
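You can also test the framing logic without a live socket by building a frame the same way Java's writeUTF does (a two-byte big-endian length prefix followed by the string bytes) and parsing it back. A sketch under that assumption — note that writeUTF actually emits Java's "modified UTF-8", which differs from standard UTF-8 for NUL and supplementary characters, so this only matches exactly for ordinary text:

```python
import struct

def make_frame(text):
    # UTF-8-encode the string and prepend the two-byte
    # big-endian length prefix
    payload = text.encode('utf-8')
    return struct.pack('>H', len(payload)) + payload

def parse_frame(frame):
    # Read the length prefix, then slice out that many payload bytes
    (length,) = struct.unpack('>H', frame[:2])
    return frame[2:2 + length].decode('utf-8')

frame = make_frame('hello')
print(parse_frame(frame))  # hello
```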