We have a Ruby instance that sends a message to a Node instance via RabbitMQ (Bunny and amqplib) like below:
{ :type => data, :data => msg }.to_bson.to_s
This works well, but the messages are sometimes long and we send them across data centers, so zlib compression would help a lot.
Doing something like this in the Ruby sender:
encoded_data = Zlib::Deflate.deflate(msg).force_encoding(msg.encoding)
and then reading it inside Node:
data = zlib.inflateSync(encoded_data)
fails with:
"\x9C" from ASCII-8BIT to UTF-8
Is what I'm trying to do possible?
I am not a Ruby dev, so I will write the Ruby part in more or less pseudocode.
Ruby code (run online at https://repl.it/BoRD/0)
require 'json'
require 'zlib'
car = {:make => "bmw", :year => "2003"}
car_str = car.to_json
puts "car_str", car_str
car_byte = Zlib::Deflate.deflate(car_str)
# If you try to `puts car_byte`, it crashes with:
# Encoding::UndefinedConversionError: "\x9C" from ASCII-8BIT to UTF-8
#   (repl):14:in `puts'
car_str_dec = Zlib::Inflate.inflate(car_byte)
puts "car_str_dec", car_str_dec
# You can check that the decoded message is the same as the source.
# somehow send `car_byte`, the encoded bytes to RabbitMQ.
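The key point above is that `Zlib::Deflate.deflate` returns a binary string (ASCII-8BIT), and calling `force_encoding` on it (as in the question) corrupts nothing but makes Ruby treat invalid UTF-8 bytes as text, which is what triggers the conversion error. A minimal sketch of the round trip, leaving the compressed bytes untouched (pure zlib, no RabbitMQ involved):

```ruby
require 'json'
require 'zlib'

car = { :make => "bmw", :year => "2003" }

# Deflate the JSON string; the result is a binary string (ASCII-8BIT).
compressed = Zlib::Deflate.deflate(car.to_json)
puts compressed.encoding   # => ASCII-8BIT

# Do NOT call force_encoding on it -- the bytes are not valid UTF-8.
# Inflating the untouched binary string restores the original JSON.
restored = Zlib::Inflate.inflate(compressed)
puts restored == car.to_json   # => true
```

When publishing, hand that binary string to Bunny as-is; RabbitMQ payloads are opaque byte sequences, so no encoding conversion is needed on the way out.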
Node code
var zlib = require('zlib');

// Somehow get the message from RabbitMQ.
var data = '...';

zlib.inflate(data, function (err, buffer) {
  if (err) {
    // Handle the error.
  } else {
    // If the source didn't have any encoding,
    // there is no need to specify one here.
    console.log(buffer.toString());
  }
});
I also suggest sticking with the async functions in Node instead of their sync alternatives.