Is there any performance benefit to creating an index on a binary column instead of a string column? Which of the two will be faster? I know varchar will be slow compared to numeric values. Since binary values are presumably read and compared directly, I expect them to be fast, probably the fastest.
EDIT:
The actual use case is a GUID column: I could use CHECKSUM or HASHBYTES to avoid string comparison.
I know we can't create an index on a MAX column. This is only about varchar(25), varbinary(25), and int; varchar(max) is not relevant here.
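One way to sketch the CHECKSUM idea, assuming a hypothetical `dbo.Items` table with the GUID stored as text (column names and sizes here are illustrative, not from the question): index a persisted int checksum and seek on it, then re-check the original string to weed out hash collisions.

```sql
-- Hypothetical sketch: index an int CHECKSUM instead of the varchar GUID itself
CREATE TABLE dbo.Items
(
    GuidText     varchar(36) NOT NULL,
    GuidChecksum AS CHECKSUM(GuidText) PERSISTED   -- computed, persisted, indexable
);

CREATE INDEX IX_Items_GuidChecksum ON dbo.Items (GuidChecksum);

-- Seek on the narrow int key first; re-compare the string because
-- CHECKSUM can collide (different strings, same checksum)
SELECT GuidText
FROM dbo.Items
WHERE GuidChecksum = CHECKSUM('3F2504E0-4F89-11D3-9A0C-0305E82C3301')
  AND GuidText     = '3F2504E0-4F89-11D3-9A0C-0305E82C3301';
```

HASHBYTES could be substituted for CHECKSUM in the same pattern when a lower collision rate matters more than key width.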
It depends on the size of the column, but for two columns of the same size the varbinary will typically be faster. The other factor involved here is the collation used for the column. The default collation in Sql Server is not case sensitive, meaning that for comparison purposes SOME RANDOM KEY, some random key, and every permutation thereof are all the same value. The database must therefore do extra work when comparing and sorting those keys to know what goes where and which values match which: it's no longer a straight byte-for-byte comparison.
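You can see the difference directly by forcing a binary collation on one side of the comparison (the collation name below is one common choice, not something from the question):

```sql
-- Under a case-insensitive default collation these compare as equal
SELECT CASE WHEN 'SOME RANDOM KEY' = 'some random key'
            THEN 'equal' ELSE 'different' END;   -- 'equal'

-- Forcing a binary collation makes it a byte-for-byte comparison
SELECT CASE WHEN 'SOME RANDOM KEY' = 'some random key' COLLATE Latin1_General_BIN2
            THEN 'equal' ELSE 'different' END;   -- 'different'
```

A varbinary key behaves like the second case by nature, which is where its comparison-cost advantage over a case-insensitive varchar comes from.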