Converting from Base64 returns the same result for different strings


Why do these two lines of code deliver the same result? (Only the third character differs.)

Convert.FromBase64String("aaa=") 
Convert.FromBase64String("aab=")

The result is the same for both lines: hex [0x69, 0xA6] / decimal [105, 166]

Other tests yielded these insights:

  • converting back gives yet another string, aaY=: Convert.ToBase64String(Convert.FromBase64String("aaa="))
  • "aac=" decodes differently: [105, 167]
  • "aaaa" and "aaab" are decoded to different sequences

Why do I get the same result in the first two examples? What am I missing here?
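
For reference, a minimal snippet that reproduces all of the observations above (BitConverter.ToString is just one convenient way to print the bytes as hex):

using System;

class Program
{
    static void Main()
    {
        // Decode and print each test string as hex.
        foreach (var s in new[] { "aaa=", "aab=", "aac=", "aaaa", "aaab" })
        {
            byte[] bytes = Convert.FromBase64String(s);
            Console.WriteLine($"{s} -> {BitConverter.ToString(bytes)}");
        }

        // Round-trip: decode "aaa=" and re-encode it.
        Console.WriteLine(Convert.ToBase64String(Convert.FromBase64String("aaa=")));
        // Output:
        // aaa= -> 69-A6
        // aab= -> 69-A6
        // aac= -> 69-A7
        // aaaa -> 69-A6-9A
        // aaab -> 69-A6-9B
        // aaY=
    }
}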


Solution

  • Base-64 carries 6 bits of data per character; 3 characters give you 18 bits, but two bytes only need 16. That leaves 2 bits of noise, and any change in those noise bits does not matter: the decoder simply discards them. The difference between your first two strings lives entirely in the noise bits, which is also why re-encoding normalizes both to aaY= (the canonical form with the noise bits zeroed). The sketch below makes the bit layout concrete.
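
To see this, here is a minimal sketch that rebuilds the bit stream by hand (Base64Bits and Alphabet are just illustrative names; the table is the standard base-64 alphabet):

using System;

class Base64Bits
{
    const string Alphabet =
        "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

    static void Main()
    {
        foreach (var s in new[] { "aaa=", "aab=", "aac=" })
        {
            // Concatenate the 6-bit value of each non-padding character.
            string bits = "";
            foreach (char c in s.TrimEnd('='))
                bits += Convert.ToString(Alphabet.IndexOf(c), 2).PadLeft(6, '0');

            // 3 characters = 18 bits; only the first 16 become bytes.
            // The trailing 2 bits are dropped by the decoder.
            Console.WriteLine($"{s}: {bits.Substring(0, 16)} | dropped: {bits.Substring(16)}");
        }
        // Output:
        // aaa=: 0110100110100110 | dropped: 10
        // aab=: 0110100110100110 | dropped: 11
        // aac=: 0110100110100111 | dropped: 00
    }
}

Note how aaa= and aab= differ only in the dropped bits, while aac= flips the 16th bit of the payload itself, so it decodes to [105, 167] instead.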