I build a ZIP file using the following code:
def compress_batch(directory_path)
  zip_file_path = File.join(File.expand_path("..", directory_path), SecureRandom.hex(10))
  Zip::File.open(zip_file_path, Zip::File::CREATE) do |zip_file|
    (Dir.entries(directory_path) - %w(. ..)).each do |file_name|
      zip_file.add(file_name, File.join(directory_path, file_name))
    end
  end
  result = File.binread(zip_file_path) # reads and closes in one call, avoiding a leaked file handle
  File.unlink(zip_file_path)
  result
end
I store that ZIP file in memory:
@result = Payoff::DataFeed::Compress::ZipCompress.new.compress_batch(source_path)
I put it into a hash:
options = {
  data: @result
}
Then I submit it to my Sidekiq worker using perform_async:
DeliveryWorker.perform_async(options)
and get the following error:
[DEBUG] Starting store to: { "destination" => "sftp", "path" => "INBOUND/20191009.zip" }
Encoding::UndefinedConversionError: "\xBA" from ASCII-8BIT to UTF-8
from ruby/2.3.0/gems/activesupport-4.2.10/lib/active_support/core_ext/object/json.rb:34:in `encode'
However, if I use .new.perform instead of .perform_async, bypassing Sidekiq, it works fine:
DeliveryWorker.new.perform(options)
My best guess is that something about my encoding blows up when the job is serialized to Sidekiq / Redis. How should I have encoded it? Do I need to change how my ZIP file is created? Or can I convert the encoding when submitting to Sidekiq?
Sidekiq serializes arguments as JSON. You are trying to stuff binary data into JSON, which only supports UTF-8 strings. You will need to Base64-encode the data if you wish to pass it through Redis.
require 'base64'
encoded = Base64.encode64(filedata)
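A minimal round-trip sketch of the idea (the JSON step mimics what Sidekiq does when pushing the job to Redis; decoding inside the worker's perform is assumed):

```ruby
require 'base64'
require 'json'

# Sample binary bytes, including the 0xBA byte from the error message.
# .b forces ASCII-8BIT encoding, just like data read with File.binread.
filedata = "\x50\x4B\x03\x04\xBA".b

# Base64 turns the bytes into a pure-ASCII string that JSON can carry.
encoded = Base64.strict_encode64(filedata)

# Serializing the encoded string succeeds, where the raw bytes would
# raise Encoding::UndefinedConversionError.
payload = JSON.generate("data" => encoded)

# Inside the worker, reverse the transformation to recover the ZIP bytes:
decoded = Base64.strict_decode64(JSON.parse(payload)["data"])
decoded == filedata  # => true
```

strict_encode64 is used here instead of encode64 because it omits the line feeds encode64 inserts every 60 characters; either works as long as you decode with the matching method.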