Tags: ios, swift, uint16, unichar

How to set the value of (initialize) a unichar/UInt16 in hexadecimal in Swift


I want to set the value of a constant UInt16 to a hexadecimal value.

I know that the way to set a Swift Character is

let myValue: Character = "\u{1820}"

so I tried

let myValue: UInt16 = "\u{1820}"        // error: a string literal is not a UInt16
let myValue: unichar = "\u{1820}"       // error: same problem (unichar is UInt16)
let myValue: UInt16 = "\u{1820}".utf16  // error: utf16 is a String.UTF16View, not a UInt16
let myValue: unichar = "\u{1820}".utf16 // error: same problem
let myValue = "\u{1820}"                // compiles, but the inferred type is String

but none of these worked.
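
For what it's worth, the .utf16 attempts fail because utf16 returns a String.UTF16View (a collection of UTF-16 code units), not a single UInt16. A minimal sketch of how one could pull a single code unit out of a string, assuming it contains just one BMP character (the variable names are my own):

let scalarString = "\u{1820}"
let codeUnit: UInt16? = scalarString.utf16.first // Optional(6176), i.e. 0x1820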

When searching for the answer, I mostly found SO questions about converting from NSString or other types.

This is surely a silly question for experienced Objective-C programmers, but I had a hard time finding the answer. I finally did find it, so I am sharing my question and answer to add a few keywords for others who may search for the same thing in the future.

Solution

  • If you had read The Basics section of the Swift documentation a little more carefully, you would have found it.

    Integer literals can be written as:

    • A decimal number, with no prefix
    • A binary number, with a 0b prefix
    • An octal number, with a 0o prefix
    • A hexadecimal number, with a 0x prefix

    All of these integer literals have a decimal value of 17:

    let decimalInteger = 17  
    let binaryInteger = 0b10001       // 17 in binary notation  
    let octalInteger = 0o21           // 17 in octal notation  
    let hexadecimalInteger = 0x11     // 17 in hexadecimal notation
    

    So you would do

    let myValue: UInt16 = 0x1820
    

    or

    let myValue: unichar = 0x1820 // unichar is a type alias for UInt16
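
    As a quick sanity check (a minimal sketch; the scalar and character names are my own), you can round-trip the value back into a Character to confirm that 0x1820 really is MONGOLIAN LETTER A:

    let myValue: unichar = 0x1820
    if let scalar = UnicodeScalar(myValue) { // failable: values in the surrogate range are rejected
        print(Character(scalar))             // prints ᠠ
    }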