In JavaScript, doing this:
var numbers = new Array(1042147201, -1682263442, -1463053899, 1834416100)
sjcl.codec.base64.fromBits(numbers)
Return "Ph3ngZu6sm6oy5G1bVb35A==", but doing this in C#:
var numbers = new[] { 1042147201, -1682263442, -1463053899, 1834416100 };
var byteNumbers = new byte[numbers.Length * sizeof(int)];
Buffer.BlockCopy(numbers, 0, byteNumbers, 0, byteNumbers.Length);
Convert.ToBase64String(byteNumbers);
Return "gecdPm6yupu1kcuo5PdWbQ=="
Why is the result different and what do I have to do to get the same result like in JavaScript?
Looking at the output of the two pieces of code, the issue is the endianness of the ints:
1834416100 -> 6D 56 F7 E4
Ph3ngZu6sm6oy5G1bVb35A== -> 3E 1D E7 81 9B BA B2 6E A8 CB 91 B5 6D 56 F7 E4
gecdPm6yupu1kcuo5PdWbQ== -> 81 E7 1D 3E 6E B2 BA 9B B5 91 CB A8 E4 F7 56 6D
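You can see the little-endian layout directly by dumping the bytes that Buffer.BlockCopy produces (a quick illustrative check, using the same array as in the question):

var numbers = new[] { 1042147201, -1682263442, -1463053899, 1834416100 };
var byteNumbers = new byte[numbers.Length * sizeof(int)];
Buffer.BlockCopy(numbers, 0, byteNumbers, 0, byteNumbers.Length);
// Prints 81 E7 1D 3E 6E B2 BA 9B B5 91 CB A8 E4 F7 56 6D on a little-endian machine
Console.WriteLine(BitConverter.ToString(byteNumbers).Replace("-", " "));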
Possible fix: reverse the bytes of each integer when adding it to the array, as shown in the BitConverter class documentation:
int value = 12345678;
byte[] bytes = BitConverter.GetBytes(value);
// GetBytes returns the machine's native byte order,
// so flip to big-endian on little-endian systems
if (BitConverter.IsLittleEndian)
    Array.Reverse(bytes);
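Putting it together, here is a minimal sketch (assuming SJCL serializes each 32-bit word big-endian, which the byte dumps above indicate) that reverses each int before copying, so the C# output matches the JavaScript string:

var numbers = new[] { 1042147201, -1682263442, -1463053899, 1834416100 };
var byteNumbers = new byte[numbers.Length * sizeof(int)];

for (int i = 0; i < numbers.Length; i++)
{
    byte[] bytes = BitConverter.GetBytes(numbers[i]);
    // Flip each 32-bit word to big-endian so the byte layout matches SJCL's
    if (BitConverter.IsLittleEndian)
        Array.Reverse(bytes);
    Buffer.BlockCopy(bytes, 0, byteNumbers, i * sizeof(int), sizeof(int));
}

// Prints Ph3ngZu6sm6oy5G1bVb35A==, matching sjcl.codec.base64.fromBits
Console.WriteLine(Convert.ToBase64String(byteNumbers));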