I am writing some sort of proxy and I am trying to forward everything to the "client-server":
import * as http from 'node:http';

// Forward the incoming headers, but drop the original content-length,
// since the body being forwarded may differ from the incoming one.
const headers = req.headers;
delete headers['content-length'];

const body = '...maybe some content...';
const chunks: Buffer[] = [];
const request = http
  .request(
    url,
    {
      method: req.method,
      headers,
    },
    // Collect the response body as it streams in.
    (msg) => {
      msg.on('data', (data) => chunks.push(data));
    },
  )
  .on('error', (error) => {
    reject(error);
  })
  .on('close', () => {
    // Connection finished; hand back the buffered response body.
    resolve(Buffer.concat(chunks).toString('utf8'));
  });
// Forward the body, then finish the request.
request.write(body, (error) => {
  if (error) {
    reject(error);
  }
});
request.end();
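For reference, this snippet lives inside the Promise executor of a small forwarding helper, roughly like this (simplified; the forward name and the plain Node types are just how I sketch it here):

import * as http from 'node:http';

// Simplified surrounding context: req is the incoming request being proxied,
// url points at the target "client-server".
function forward(req: http.IncomingMessage, url: string): Promise<string> {
  return new Promise<string>((resolve, reject) => {
    // ...the request code from above goes here...
  });
}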
All in all it works as I expect, but if I send any data (a body) in a GET request, I get an Error: socket hang up.
Node's documentation says this most likely means the connection was closed before a response was received.
The client server accepts a body in a GET request and does not refuse the connection or anything like that (tested via Postman).
What is happening in Node? Why does the connection close prematurely?
The missing Content-Length header on the request was causing the premature connection termination.
The Node docs state: "Sending a 'Content-Length' header will disable the default chunked encoding."
I was removing the header from the incoming request so that Node would handle it automatically.
It turns out that Node only handles the body framing automatically (via the default chunked encoding mentioned above) for POST/PATCH/PUT requests. For other methods such as GET, even when .write() is used properly, neither a Content-Length nor chunked encoding is applied, so the server cannot tell where the body ends, and the connection is closed prematurely.
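To see this on the wire, here is a quick throwaway sketch (not part of the proxy; the port and the body text are arbitrary) that points a GET and a POST with the same body at a bare TCP server and dumps the raw bytes Node actually sends:

import * as net from 'node:net';
import * as http from 'node:http';

// Bare TCP "server" that only logs whatever bytes arrive; it never answers
// with a real HTTP response.
const server = net.createServer((socket) => {
  socket.on('data', (data) => process.stdout.write(data));
  // Close after a moment so the clients fail, much like the error above.
  setTimeout(() => socket.destroy(), 500);
});

server.listen(8080, () => {
  for (const method of ['GET', 'POST']) {
    const request = http.request({ host: 'localhost', port: 8080, method });
    request.on('error', () => { /* expected: no response ever comes back */ });
    request.write('some body');
    request.end();
  }
  setTimeout(() => server.close(), 1000);
});

The logged POST goes out with Transfer-Encoding: chunked framing, while the GET carries the body with no length information at all, which leaves the receiving server no way to know where the request ends.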
To fix this, compute the length of the body on your own and send it in the headers:
import * as http from 'node:http';

const body = '...maybe some content...';
const chunks: Buffer[] = [];
const request = http
  .request(
    url,
    {
      method: req.method,
      headers: {
        ...req.headers,
        // Explicitly set the byte length of the forwarded body so the
        // server knows where the body ends, regardless of the method.
        'content-length': Buffer.byteLength(body),
      },
    },
    // Collect the response body as it streams in.
    (msg) => {
      msg.on('data', (data) => chunks.push(data));
    },
  )
  .on('error', (error) => {
    reject(error);
  })
  .on('close', () => {
    // Connection finished; hand back the buffered response body.
    resolve(Buffer.concat(chunks).toString('utf8'));
  });
// Forward the body, then finish the request.
request.write(body, (error) => {
  if (error) {
    reject(error);
  }
});
request.end();
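A small variation worth considering (not required for the fix above; the outgoingHeaders helper name is just illustrative): only set the header when there is actually a body to forward, and remember that Buffer.byteLength counts bytes rather than characters, which is what the header needs for multi-byte UTF-8 content:

import { IncomingHttpHeaders, OutgoingHttpHeaders } from 'node:http';

// Build the outgoing headers from the incoming ones: drop any stale
// content-length and only set a fresh one when a body is actually forwarded.
function outgoingHeaders(
  incoming: IncomingHttpHeaders,
  body: string,
): OutgoingHttpHeaders {
  const headers: OutgoingHttpHeaders = { ...incoming };
  delete headers['content-length'];
  if (body.length > 0) {
    // Bytes, not characters: Buffer.byteLength('ä') is 2, while 'ä'.length is 1.
    headers['content-length'] = Buffer.byteLength(body);
  }
  return headers;
}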