Here is my attempt to convert an SVG string to a PNG buffer using Node and the ImageMagick convert tool. The PNG buffer is then used to draw an image in a PDF using pdfkit.
Tl;dr: I have a large SVG string that needs to reach a child process "whole" (i.e. not chunked). How do I do so?
This is an example that works for small files.
var child_process = require('child_process');
var pdfDocument = require('pdfkit');

var convert = child_process.spawn("convert", ["svg:", "png:-"]),
    svgsrc = '<svg><rect height="100" width="100" style="fill:red;"/></svg>';

convert.stdout.on('data', function(data) {
    console.log(data.toString('base64'));
    var doc = new pdfDocument();
    doc.image(data);
});

convert.stdin.write(svgsrc);
convert.stdin.end();
This works when the svg string is 'small' (like the one provided in the example) -- I'm not sure where the cut-off from small to large is.
However, when attempting to use a larger svg string (something you might generate using D3) like this [ large string ], I run into:
Error: Incomplete or corrupt PNG file
So my question is: how do I ensure that the convert child process reads the entire stream before processing it?
A few things are known:
The png buffer is indeed incomplete. I used a diff tool to check the base64 string generated by the app against the base64 produced by an online svg-to-png converter. The non-corrupted string is much larger than the corrupted string (sorry I haven't been more specific with file size). That is, the convert tool seems to not be reading the entire source before converting.
The source svg string is not corrupted (as evidenced by the fact that the gist rendered it).
When used on the command line, the convert tool correctly generates a
png file from an svg "stream" with cat large_svg.svg | convert svg: png:-
So this is not an issue with the convert tool.
This led me down a rabbit hole of looking at node's buffer sizes for writable and readable streams, but to no avail. Maybe someone has worked with larger streams in node and can help get this to work.
As @mscdex pointed out, I had to wait for the process to finish before attempting downstream work. All that was needed was to wait for the end event on the convert.stdout stream and concatenate buffers on the data events.
// allocate a buffer of size 0
var graph = Buffer.alloc(0);

// on each data event, concatenate the incoming chunk onto `graph`
convert.stdout.on('data', function(data) {
    graph = Buffer.concat([graph, data]);
});

convert.stdout.on('end', function() {
    // `graph` now holds the complete png
    // ... draw on pdf
});
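For reference, the "draw on pdf" step inside that end handler might look something like the sketch below. The output path out.pdf and the image position are assumptions on my part; pdfkit's doc.image does accept a png buffer directly.

var fs = require('fs');
var pdfDocument = require('pdfkit');

convert.stdout.on('end', function() {
    // `graph` now holds the complete png produced by convert
    var doc = new pdfDocument();
    doc.pipe(fs.createWriteStream('out.pdf')); // assumed output path
    doc.image(graph, 0, 0);                    // pdfkit can take a png buffer
    doc.end();
});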
EDIT:
Here is a more efficient version of the above, where we use @mscdex's suggestion to do the concatenation in the end callback, keeping a running total of the chunk sizes so that Buffer.concat knows how much to allocate when joining the chunks.
// collect the chunks in an array and track their total size
var graph = [];
var totalsize = 0;

convert.stdout.on('data', function(data) {
    graph.push(data);
    totalsize += data.length;
});

convert.stdout.on('end', function() {
    var image = Buffer.concat(graph, totalsize);
    // ... draw on pdf
});
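Putting the pieces together, the whole conversion can be wrapped in a small helper. To be clear, the svgToPng name, the stderr/error handling, and the usage at the bottom are additions of my own, sketched on top of the same spawn arguments used above:

var child_process = require('child_process');
var fs = require('fs');
var pdfDocument = require('pdfkit');

// convert an svg string to a png buffer and hand it to `callback(err, png)`
function svgToPng(svgsrc, callback) {
    var convert = child_process.spawn('convert', ['svg:', 'png:-']);
    var chunks = [];
    var totalsize = 0;

    convert.stdout.on('data', function(data) {
        chunks.push(data);
        totalsize += data.length;
    });

    convert.stdout.on('end', function() {
        callback(null, Buffer.concat(chunks, totalsize));
    });

    // surface convert's errors instead of silently getting a bad buffer
    convert.stderr.on('data', function(err) {
        console.error(err.toString());
    });
    convert.on('error', callback);

    convert.stdin.write(svgsrc);
    convert.stdin.end();
}

// usage: convert the svg, then draw the resulting png into a pdf
svgToPng('<svg><rect height="100" width="100" style="fill:red;"/></svg>', function(err, png) {
    if (err) throw err;
    var doc = new pdfDocument();
    doc.pipe(fs.createWriteStream('out.pdf')); // assumed output path
    doc.image(png, 0, 0);
    doc.end();
});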