Sometimes I use `varbinary` instead of `varchar`, especially when I need trailing whitespace to count in `=` comparisons and am not interested in using the `LIKE` operator for that purpose.
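For example, here is roughly the behavior I am relying on (the table name and data are made up; this assumes a PAD SPACE collation such as `utf8mb4_unicode_ci`, under which `varchar` ignores trailing spaces on `=`):

```sql
CREATE TABLE demo (
  vc VARCHAR(10),
  vb VARBINARY(10)
);
INSERT INTO demo VALUES ('abc ', 'abc ');

SELECT vc = 'abc' AS vc_eq,  -- 1: the PAD SPACE collation ignores the trailing space
       vb = 'abc' AS vb_eq   -- 0: varbinary compares raw bytes, so the space counts
FROM demo;
```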
When selecting, I can specify which character set to use when decoding, like so:

`select convert(myvarbinary using utf8) from mytable`

But most of the time I just `select myvarbinary from mytable` and it works just fine.
My question is: in the latter case, when I do not specify a character set, where does the default character set come from? For `varchar` it is set on a per-table or even per-column basis, but not so for `varbinary`.
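As an aside, the per-column setting I mean is visible in `information_schema` (the table name here is hypothetical):

```sql
SELECT COLUMN_NAME, DATA_TYPE, CHARACTER_SET_NAME, COLLATION_NAME
FROM information_schema.COLUMNS
WHERE TABLE_NAME = 'mytable';
-- varchar columns report a charset such as utf8mb4;
-- varbinary columns report NULL, since they have none.
```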
`VARBINARY` has no `CHARACTER SET`. All comparisons are done bit by bit. Such comparisons are somewhat like `COLLATE ..._bin`.
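A minimal sketch of that similarity (the literals are made up; this assumes a utf8mb4 connection with a case-insensitive default collation):

```sql
SELECT 'abc' = 'ABC'                                 AS ci_cmp,   -- 1: a _ci collation folds case
       CAST('abc' AS BINARY) = CAST('ABC' AS BINARY) AS bin_cmp,  -- 0: raw bytes differ
       'abc' COLLATE utf8mb4_bin = 'ABC'             AS bin_coll; -- 0: _bin compares code points
```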
For most applications it is best to do everything in `CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci`.
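Applied at table-definition time, that advice looks something like this (the table and column names are illustrative):

```sql
CREATE TABLE mytable (
  myvarchar VARCHAR(255)
    CHARACTER SET utf8mb4
    COLLATE utf8mb4_unicode_ci
);
```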
The only reason I can think of for `convert(myvarbinary using utf8)` is if you have various encodings in a single column. That sounds like a nightmare.
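If you are stuck with such a column, decoding it means tracking each row's encoding somewhere; the `enc` column below is hypothetical:

```sql
-- Decode rows stored as latin1:
SELECT CONVERT(myvarbinary USING latin1) FROM mytable WHERE enc = 'latin1';
-- Decode rows stored as utf8:
SELECT CONVERT(myvarbinary USING utf8)   FROM mytable WHERE enc = 'utf8';
```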