In security-related papers, a string is often described as an «X-bit string», e.g.:
88cf3e49-e28e-4c0e-b95f-6a68a785a89d
This is a 128-bit value formatted as 32 hexadecimal digits separated by hyphens.
This string is a 36-character UUID (32 hex digits plus 4 hyphens). I always assumed that a string's length in bits depends on the encoding used. So how can I know that this string is a 128-bit string?
How exactly are these bits counted?
Apart from the hyphens, every character in that string is a hexadecimal digit, and each hex digit takes 4 bits to represent: 32 * 4 = 128 bits.
(Note: the string is 36 characters long, but only 32 of those are hex digits; the 4 hyphens are just separators and carry no value.)
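As a quick illustration of that arithmetic (a Python sketch, not part of the original answer):

```python
import uuid

s = "88cf3e49-e28e-4c0e-b95f-6a68a785a89d"

# Strip the 4 hyphens; what remains are the 32 hex digits.
hex_digits = s.replace("-", "")
print(len(hex_digits))               # 32
print(len(hex_digits) * 4)           # 32 * 4 = 128 bits

# The standard library agrees: a UUID decodes to 16 bytes = 128 bits.
print(len(uuid.UUID(s).bytes) * 8)   # 128
```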
The string itself, if you're talking about the text representation you've shown, is, as you say, dependent on the encoding. In UTF-8, for example, every one of the 36 characters is ASCII and takes one byte, so the string is 36 * 8 = 288 bits long.
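To make the encoding dependence concrete (again a Python sketch; the second encoding is just an arbitrary example):

```python
s = "88cf3e49-e28e-4c0e-b95f-6a68a785a89d"

# The size of the textual representation depends on the chosen encoding.
print(len(s.encode("utf-8")) * 8)      # 36 bytes -> 288 bits
print(len(s.encode("utf-16-le")) * 8)  # 72 bytes -> 576 bits
```

Either way, the 128-bit figure refers to the value the string encodes, not to the bytes of its textual representation.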