I have an issue in Node.js which, as far as I can tell, is related to async behaviour, but I am not able to handle it and I haven't found a solution to my specific problem.
I am currently implementing a server which polls data from another server every x seconds. I then use the data chunks of the foreign server's response, which is a JSON string, to receive the necessary live data, and then parse and save it to my MongoDB. The problem is that the chunk is sometimes pretty long, since the server sometimes transmits many lines.
Therefore my tool already works whenever the chunk is not too big, but sometimes it doesn't. After logging the chunk I noticed that in these cases the chunk is logged twice.
For example, if res.data looks like this: [1,"123"][9,"234"] (in reality it is of course much bigger), the log shows:
chunk: [1,"123 chunk "][9,"234"]
And this breaks my response variable, which then ends up as response: "][9,"234"].
Here is the important part of the code:
function pollUra() {
    var response = '';
    var deferred = Q.defer();

    // Send a request to URAv2
    var req = http.request(options, function (res) {
        res.setEncoding('utf-8');
        res.on('data', function (chunk) {
            // We need to reset response to '' here, because if there is too much data,
            // response concatenates chunk after chunk, while the chunk itself already arrives concatenated from its separate parts:
            // chunk1 -> chunk1+chunk2 -> chunk1+chunk2+chunk3
            // response = chunk1 -> chunk1+chunk1+chunk2 -> chunk1+chunk1+chunk2+chunk1+chunk2+chunk3...
            console.log('chunk: ' + chunk);
            response = '';
            response += chunk;
        });
        res.on('end', function () {
            var parsedResponseArray = parser.parseQuery(response);
            ...
        }
I thought that the solution described in the comment fixed the issue, because it then seemed to work most of the time, but now it looks like it was just luck that the chunk didn't get big enough to trigger the failure for a longer while.
My wish was simply to catch the chunk after it has been sent completely, but I am somehow not able to find the solution, since I thought res.on('end') is only called after the data has been sent completely.
What am I missing?
Thanks in advance!
Remove the line response = ''; inside your res.on('data', ...) handler.
Otherwise the code looks good; the problem is that you reinitialize the response variable, so the previously saved data is erased each time a new chunk arrives.
Quoting you:
For example, if res.data looks like this: [1,"123"][9,"234"] (in reality it is of course much bigger), the log shows:
chunk: [1,"123 chunk "][9,"234"]
Notice that chunk 1 + chunk 2 = [1,"123"][9,"234"]
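In other words, judging from the logged output, the body arrives split roughly like this (the exact split point is my guess from your log):

var chunk1 = '[1,"123';       // first 'data' event
var chunk2 = '"][9,"234"]';   // second 'data' event
console.log(chunk1 + chunk2); // [1,"123"][9,"234"], the complete body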
So the code should stay the one you already have, just without resetting the response variable:
// Send a request to URAv2
var req = http.request(options, function (res) {
    res.setEncoding('utf-8');
    response = ''; // Reset the variable in case it is not empty
    res.on('data', function (chunk) {
        // 'data' can fire any number of times; keep appending so nothing gets lost.
        console.log('chunk: ' + chunk);
        response += chunk;
    });
    res.on('end', function () {
        // 'end' fires once the whole body has arrived, so it is safe to parse here.
        var parsedResponseArray = parser.parseQuery(response);
        ...
    }
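For what it's worth, an equivalent variation of the same pattern is to collect the chunks in an array and join them once in the 'end' handler, so you never hold a half-built string. A minimal sketch, assuming the same options, parser and Q-based flow from your question:

var http = require('http');
var Q = require('q');

// options and parser are assumed to be the same objects you already use.
function pollUra() {
    var deferred = Q.defer();
    var chunks = [];

    var req = http.request(options, function (res) {
        res.setEncoding('utf-8');

        // 'data' may fire any number of times; just collect the pieces.
        res.on('data', function (chunk) {
            chunks.push(chunk);
        });

        // 'end' fires once, after the whole body has been received.
        res.on('end', function () {
            var response = chunks.join('');
            deferred.resolve(parser.parseQuery(response));
        });
    });

    req.on('error', function (err) {
        deferred.reject(err);
    });

    req.end();
    return deferred.promise;
}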