Why do these two lines of code deliver the same result? (Only the third character differs.)
Convert.FromBase64String("aaa=")
Convert.FromBase64String("aab=")
The result is the same for both lines: hex [0x69, 0xA6] / decimal [105, 166].
Other tests delivered this insight: round-tripping normalizes the string to "aaY=":
Convert.ToBase64String(Convert.FromBase64String("aaa=")) // returns "aaY="
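The same round-trip can be reproduced outside .NET; here is a sketch using Python's standard `base64` module in place of `Convert`:

```python
import base64

# Decode the non-canonical string, then re-encode the resulting bytes.
decoded = base64.b64decode("aaa=")            # bytes 0x69, 0xA6
print(base64.b64encode(decoded).decode())     # prints "aaY=", the canonical form
```

Re-encoding always produces the canonical spelling, which is why the round-trip does not give back the original string.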
Why do I get the same result in the first two examples? What am I missing here?
Base64 carries 6 bits of data per character; 3 characters give you 18 bits, but two bytes only need 16. That leaves 2 bits of noise. Any change in those noise bits does not matter, and the change you're seeing is only in the noise bits.
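The bit arithmetic can be made explicit with a small sketch (a hand-rolled decoder for the two-byte case, written for illustration, not .NET's actual implementation):

```python
# Standard Base64 alphabet: index of a character is its 6-bit value.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/"

def decode_two_bytes(chars):
    """Decode 3 Base64 characters (the 'xxx=' case) into 2 bytes."""
    # 3 chars * 6 bits = 18 bits, accumulated into one integer.
    n = 0
    for c in chars:
        n = (n << 6) | ALPHABET.index(c)
    n >>= 2                       # the bottom 2 "noise" bits are discarded here
    return [n >> 8, n & 0xFF]     # split the remaining 16 bits into two bytes

print(decode_two_bytes("aaa"))    # [105, 166]
print(decode_two_bytes("aab"))    # [105, 166] -- 'a' and 'b' differ only in the dropped bits
```

'a' is 26 (011010) and 'b' is 27 (011011); they differ only in the last bit, which falls into the 2 bits thrown away by the final shift, so both strings decode identically.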