I have written a very basic encryption module that I am using to encrypt some values in a .yaml file.
require 'openssl'
require 'base64'
cipher = OpenSSL::Cipher::AES.new(256, :CBC)
cipher.key = ENV['KEY']
cipher.iv = ENV['IV']
cipher.encrypt
encrypted = cipher.update(ARGV[0]) + cipher.final
puts Base64.encode64(encrypted).gsub(/\n/, '')
If I run the encryptor like so, I get one value:
rvm use jruby-9.0.5.0
jruby encryptor.rb 'password'
4cP7jptj5Z14c2KoXdNf+g==
If I run the encryptor like this, I receive a different value:
rvm use ruby-2.2.0
ruby encryptor.rb 'password'
y5ZdDfAGRmK1wQy2e4EOIA==
I'm relatively new to encryption, so this may be a loaded (or simple) question, but why does my module return two different values depending on which interpreter I'm using?
EDIT: The key is 32 bytes and I have changed the IV to be 16 bytes, but the results still differ between interpreters.
EXAMPLE KEYS:
key = 1DR337Z5C5CBD94643L9772F96C546AC
iv = 2BR367Z5R5CFD949
With MRI's OpenSSL bindings, when you call cipher.encrypt you reset the key and the IV (or at least change the internal state of the cipher object in some way). There is a Ruby bug report about this: https://bugs.ruby-lang.org/issues/8720. The JRuby OpenSSL emulation layer doesn't seem to have this behaviour, and gives the same result wherever you place the call to cipher.encrypt.
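You can see the state change directly by encrypting the same plaintext with both orderings on MRI (a minimal sketch; the 'A'/'B' key and IV below are placeholders, not the example values from the question):

require 'openssl'
require 'base64'

key = 'A' * 32   # 32-byte placeholder key for AES-256
iv  = 'B' * 16   # 16-byte placeholder IV

# Ordering from the question: key/IV first, then encrypt
c1 = OpenSSL::Cipher::AES.new(256, :CBC)
c1.key = key
c1.iv  = iv
c1.encrypt
first = c1.update('password') + c1.final

# Recommended ordering: encrypt first, then key/IV
c2 = OpenSSL::Cipher::AES.new(256, :CBC)
c2.encrypt
c2.key = key
c2.iv  = iv
second = c2.update('password') + c2.final

puts Base64.strict_encode64(first)
puts Base64.strict_encode64(second)

On MRI (at least on versions affected by the bug above) the two outputs differ, because the first cipher's key and IV are clobbered by the later encrypt call; on JRuby they should be identical.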
The workaround / fix is to ensure the call to cipher.encrypt occurs before you set the key and IV:
cipher.encrypt
cipher.key = ENV['KEY']
cipher.iv = ENV['IV']
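Putting it together, the corrected script might look like this (a sketch based on the code in the question; strict_encode64 is used instead of encode64 plus gsub simply to avoid the trailing newline):

require 'openssl'
require 'base64'

cipher = OpenSSL::Cipher::AES.new(256, :CBC)
cipher.encrypt            # set the direction before the key material
cipher.key = ENV['KEY']   # expected to be 32 bytes for AES-256
cipher.iv  = ENV['IV']    # expected to be 16 bytes

encrypted = cipher.update(ARGV[0]) + cipher.final
puts Base64.strict_encode64(encrypted)

The same ordering applies when decrypting: call cipher.decrypt before assigning the key and IV.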