I am new to Node and to programming in general, and I have been really struggling with this...
I want to take an https response, resize it with graphicsmagick and send it to my Amazon S3 bucket.
It appears that the https res is an IncomingMessage object (I can't find much documentation about it) and the stdout from graphicsmagick is a Socket.
The weird thing is that I can pipe both of these to a writeStream with a local path, and both res and stdout produce a nice new resized image.
And I can even send res to S3 (using knox) and it works.
But stdout doesn't want to go to S3 :-/
Any help would be appreciated!
https.get(JSON.parse(queryResponse).data.url, function (res) {
  var headers = {
      'Content-Length': res.headers['content-length']
    , 'Content-Type': res.headers['content-type']
  }
  graphicsmagick(res)
    .resize('50', '50')
    .stream(function (err, stdout, stderr) {
      var req = S3Client.putStream(stdout, 'new_resized.jpg', headers, function (err, res) {
      })
      req.end()
    })
})
knox – for connecting to S3 – https://github.com/LearnBoost/knox
graphicsmagick – for image manipulation – https://github.com/aheckmann/gm
The problem was that Amazon needs to know the content length beforehand (thanks DarkGlass): graphicsmagick's stdout has no known length until the resize finishes, so you can't set the Content-Length header for a straight stream upload.
However, since my images are relatively small, I found buffering preferable to MultiPartUpload.
My solution:
https.get(JSON.parse(queryResponse).data.url, function (res) {
  graphicsmagick(res)
    .resize('50', '50')
    .stream(function (err, stdout, stderr) {
      var chunks = []
      stdout.on('data', function (data) { // buffer the resized image in memory
        chunks.push(data)
      })
      stdout.on('close', function () {
        var image = Buffer.concat(chunks) // now the total length is known
        var req = S3Client.put('new-file-name', {
            'Content-Length': image.length
          , 'Content-Type': res.headers['content-type']
        })
        req.on('response', function (res) { // 'response' callback from S3
          if (200 == res.statusCode)
            console.log('it worked')
        })
        req.end(image) // send the file contents and end the request
      })
    })
})
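For what it's worth, knox also has a putBuffer method that computes Content-Length from the buffer for you, so the manual header can be dropped. A minimal sketch under that assumption, reusing the image buffer and the outer res from the solution above:

// Sketch only: assumes `image` is the Buffer assembled above and
// `res` is the original https response (for its content-type header).
S3Client.putBuffer(image, '/new-file-name', {
  'Content-Type': res.headers['content-type'] // knox sets Content-Length itself
}, function (err, s3res) {
  if (err) return console.error(err)
  if (200 == s3res.statusCode) console.log('it worked')
})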