How exactly does the MaxLengthAttribute measure the length of a string? Is it based on bytes in UTF-16, since C# strings are UTF-16? So if my SQL Server database uses a UTF-8 collation, could there be situations where SQL Server and C# have different opinions about whether a string's length is, e.g., <= 100?
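To make the scenario concrete, here is a minimal sketch of what I mean; the entity, the table, and the collation name are only placeholders:

```csharp
using System.ComponentModel.DataAnnotations;

public class Customer
{
    // Validated (and, with EF, used for the column size) as a maximum length of 100
    [MaxLength(100)]
    public string Name { get; set; } = "";
}

// Hypothetical matching column on a SQL Server database with a UTF-8 collation:
// CREATE TABLE Customers
// (
//     Name varchar(100) COLLATE Latin1_General_100_CI_AS_SC_UTF8
// );
```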
"Is it based on bytes in UTF-16, since C# strings are UTF-16?"
No, it's the number of characters in the string object, i.e. the number returned by the Length property of the string.
For example, a string with a value of "abc" has a Length of 3 and is represented by 3 bytes in UTF-8 and 6 bytes in UTF-16. A string with a value of "åäö" also has a Length of 3, despite being represented by 6 bytes in both UTF-8 and UTF-16.
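You can verify those numbers directly; a small sketch:

```csharp
using System;
using System.Text;

class LengthDemo
{
    static void Main()
    {
        foreach (var s in new[] { "abc", "åäö" })
        {
            Console.WriteLine(
                $"\"{s}\": Length = {s.Length}, " +
                $"UTF-8 bytes = {Encoding.UTF8.GetByteCount(s)}, " +
                $"UTF-16 bytes = {Encoding.Unicode.GetByteCount(s)}");
        }
        // Output:
        // "abc": Length = 3, UTF-8 bytes = 3, UTF-16 bytes = 6
        // "åäö": Length = 3, UTF-8 bytes = 6, UTF-16 bytes = 6
    }
}
```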
So it refers to the number of char elements in the string, i.e. the string's length.
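Assuming the attribute from System.ComponentModel.DataAnnotations is evaluated directly, the same behavior can be observed with MaxLengthAttribute.IsValid, which only looks at Length and not at any byte count:

```csharp
using System;
using System.ComponentModel.DataAnnotations;

class MaxLengthDemo
{
    static void Main()
    {
        var max3 = new MaxLengthAttribute(3);

        Console.WriteLine(max3.IsValid("abc"));   // True  - Length is 3
        Console.WriteLine(max3.IsValid("åäö"));   // True  - Length is 3, byte count is irrelevant
        Console.WriteLine(max3.IsValid("abcd"));  // False - Length is 4
    }
}
```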