I'm using an external process which writes a short line of output for each chunk of data it processes. I would like to react to each of these lines without any additional delay. However, it seems that .outReceived() of ProcessProtocol is buffered. The docs state:
.outReceived(data): This is called with data that was received from the process' stdout pipe. Pipes tend to provide data in larger chunks than sockets (one kilobyte is a common buffer size), so you may not experience the "random dribs and drabs" behavior typical of network sockets, but regardless you should be prepared to deal if you don't get all your data in a single call. To do it properly, outReceived ought to simply accumulate the data and put off doing anything with it until the process has finished.
The result is that I get the output in one chunk after the whole processing is done. How can I force ProcessProtocol not to buffer stdout?
The buffering is happening in the producer process, not in the consumer. The standard C library makes stdout line-buffered only when it is connected to a terminal; otherwise it is fully buffered. This is what causes the producer process to emit data in large chunks rather than line by line when it is not connected to a terminal.

Use the stdbuf utility to force the producer process' stdout to be line-buffered.
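A minimal sketch of that approach, assuming a hypothetical producer command wrapped in stdbuf -oL; the protocol still accumulates data and reacts to each complete line, since even a line-buffered pipe may deliver several lines in one read:

```python
import os

from twisted.internet import protocol, reactor


class LineReactingProtocol(protocol.ProcessProtocol):
    """React to each complete line of the child's stdout as it arrives."""

    def __init__(self):
        self.buffer = b""

    def outReceived(self, data):
        # Several lines may arrive in one chunk, and the last line may be
        # incomplete: split on newlines and keep the trailing partial line
        # in the buffer for the next call.
        self.buffer += data
        *lines, self.buffer = self.buffer.split(b"\n")
        for line in lines:
            self.handleLine(line)

    def handleLine(self, line):
        print("got line:", line.decode("utf-8", "replace"))

    def processEnded(self, reason):
        reactor.stop()


# "producer" and its argument are placeholders for the real command.
argv = ["stdbuf", "-oL", "producer", "--some-arg"]
reactor.spawnProcess(LineReactingProtocol(), argv[0], argv,
                     env=os.environ, usePTY=False)
reactor.run()
```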
If the producer process is a Python script, use the -u Python interpreter switch to turn off buffering of the standard streams entirely. The stdbuf utility is the better option, though.
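A similar sketch for a Python producer, assuming a hypothetical script name producer.py; the -u switch makes the child write to stdout unbuffered, so its output reaches outReceived as soon as it is produced:

```python
import os
import sys

from twisted.internet import protocol, reactor


class PrintingProtocol(protocol.ProcessProtocol):
    def outReceived(self, data):
        # With -u the child does not buffer stdout, so output shows up
        # here as soon as the script writes it.
        sys.stdout.write("received: " + data.decode("utf-8", "replace"))

    def processEnded(self, reason):
        reactor.stop()


argv = [sys.executable, "-u", "producer.py"]  # "producer.py" is a placeholder
reactor.spawnProcess(PrintingProtocol(), argv[0], argv,
                     env=os.environ, usePTY=False)
reactor.run()
```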