curl -v -F "file=@bigfile.zip" http://server/handler
I know "bigfile.zip" will be split into several parts and sent to the server part by part, which might take a long time. So how could I read the first part before the last part is sent? If that's impossible with Apache/Nginx + PHP/Python, what about building another HTTP server with node.js?
What I want is a "long request" (the counterpart of "long polling"): the browser can write to the server immediately over an existing long request without creating a new one.
Any suggestion?
=================
Connection: Keep-Alive?
Here is example code for a long-lived HTTP POST request in node.js:
var http = require('http');

var server = http.createServer(function(req, res) {
  if (req.method == 'POST' && req.url == '/handler') {
    req.on('data', function(data) {
      // I'm getting chunks of data in here!
    });
    req.on('end', function() {
      res.writeHead(200, 'OK');
      res.end('Got your file\n');
    });
  } else {
    res.writeHead(404, 'Not Found');
    res.end();
  }
});

server.listen(80);
Of course, this is the most basic example, and file uploads over HTTP are somewhat more complicated, which is why using something like formidable can be useful.
With node, once you start getting data you can begin sending it somewhere else to be processed, even while the rest of the data is still coming in. Usually you would use streams for this.
Here is an example of how to do it: http://debuggable.com/posts/streaming-file-uploads-with-node-js:4ac094b2-b6c8-4a7f-bd07-28accbdd56cb