I'm getting introduced to Objective-C and have a mild understanding of enum types.
Here is a piece of sample code used in the tutorial I'm following:
UIFont *bodyFont = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
UIFontDescriptor *existingDescriptor = [bodyFont fontDescriptor];
UIFontDescriptorSymbolicTraits traits = existingDescriptor.symbolicTraits;
traits |= UIFontDescriptorTraitBold;
UIFontDescriptor *newDescriptor = [existingDescriptor fontDescriptorWithSymbolicTraits:traits];
UIFont *boldBodyFont = [UIFont fontWithDescriptor:newDescriptor size:0];
From what I understand, bodyFont is set using a class method of UIFont, then existingDescriptor is created by extracting the font descriptor from bodyFont. The existing UIFontDescriptorSymbolicTraits are then extracted from that descriptor and stored in traits.
I do not understand what follows after that (traits |= UIFontDescriptorTraitBold;). From googling, I understand that it is a bitwise operation combined with an assignment, but I'm not sure why it has to be done this way. On to my next question.
From the API for UIFontDescriptor.h (https://developer.apple.com/library/ios/documentation/uikit/reference/UIFontDescriptor_Class/Reference/Reference.html#//apple_ref/doc/c_ref/UIFontDescriptorSymbolicTraits)
typedef enum : uint32_t {
/* Typeface info (lower 16 bits of UIFontDescriptorSymbolicTraits) */
UIFontDescriptorTraitItalic = 1u << 0,
UIFontDescriptorTraitBold = 1u << 1,
UIFontDescriptorTraitExpanded = 1u << 5,
UIFontDescriptorTraitCondensed = 1u << 6,
UIFontDescriptorTraitMonoSpace = 1u << 10,
UIFontDescriptorTraitVertical = 1u << 11,
UIFontDescriptorTraitUIOptimized = 1u << 12,
UIFontDescriptorTraitTightLeading = 1u << 15,
UIFontDescriptorTraitLooseLeading = 1u << 16,
/* Font appearance info (upper 16 bits of UIFontDescriptorSymbolicTraits) */
UIFontDescriptorClassMask = 0xF0000000,
UIFontDescriptorClassUnknown = 0u << 28,
UIFontDescriptorClassOldStyleSerifs = 1u << 28,
UIFontDescriptorClassTransitionalSerifs = 2u << 28,
UIFontDescriptorClassModernSerifs = 3u << 28,
UIFontDescriptorClassClarendonSerifs = 4u << 28,
UIFontDescriptorClassSlabSerifs = 5u << 28,
UIFontDescriptorClassFreeformSerifs = 7u << 28,
UIFontDescriptorClassSansSerif = 8u << 28,
UIFontDescriptorClassOrnamentals = 9u << 28,
UIFontDescriptorClassScripts = 10u << 28,
UIFontDescriptorClassSymbolic = 12u << 28
} UIFontDescriptorSymbolicTraits;
What is the meaning of the notation enum : uint32_t? I know how enum is used, and I gather that uint32_t means an unsigned 32-bit integer (though I'm not sure how it differs from a normal unsigned int).
Another question: why are the values written as shifted bits instead of just normal integers? And why do some values skip bits or numbers (e.g. UIFontDescriptorClassSlabSerifs at 5u << 28 is followed by UIFontDescriptorClassFreeformSerifs at 7u << 28, and UIFontDescriptorTraitBold at 1u << 1 is followed by UIFontDescriptorTraitExpanded at 1u << 5)?
Please let me know if my questions need further explanation.
The : uint32_t specifies the size of the storage that's used for variables of this type. uint32_t means that regardless of architecture, you have exactly 32 bits of information. It's unsigned because bit twiddling on signed integers can produce unexpected results.
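For instance, here's a minimal standalone sketch (the MyTraits type is made up for illustration, not part of UIKit) showing that such an enum always occupies exactly 32 bits:

#import <Foundation/Foundation.h>

// Hypothetical flags type; the ": uint32_t" fixes its underlying storage width.
typedef enum : uint32_t {
    MyTraitItalic = 1u << 0,
    MyTraitBold   = 1u << 1,
} MyTraits;

int main(void) {
    @autoreleasepool {
        MyTraits t = MyTraitBold;
        // Prints 4 (bytes), i.e. 32 bits, on any architecture.
        NSLog(@"sizeof(MyTraits) = %zu bytes", sizeof(t));
    }
    return 0;
}

A plain unsigned int, by contrast, is only guaranteed to be at least 16 bits wide; its actual size depends on the platform.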
The values are specified this way to make it clear that they are being used as composable flags; a value stored in a variable of this type contains many pieces of information. It's a lot easier to read and write 1u << 5 or 1u << 6 than to translate from decimal in your brain. The skipped bits are either to allow for future expansion or to group related flags, again for readability.
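As a rough illustration using the constants from the header you quoted, the shift amount is literally the bit position, and the class constants simply occupy the top four bits:

#include <stdint.h>
#include <stdio.h>

int main(void) {
    // Typeface traits: the number after << is the bit position.
    uint32_t expanded  = 1u << 5;     // decimal 32,   bit 5
    uint32_t condensed = 1u << 6;     // decimal 64,   bit 6
    uint32_t monoSpace = 1u << 10;    // decimal 1024, bit 10

    // Class constants live in the top 4 bits (28-31), so they are written as
    // small numbers shifted left by 28; the mask isolates that nibble again.
    uint32_t slabSerifs = 5u << 28;
    uint32_t classMask  = 0xF0000000u;
    uint32_t whichClass = (slabSerifs & classMask) >> 28;  // back to 5

    printf("expanded=%u condensed=%u monoSpace=%u class=%u\n",
           expanded, condensed, monoSpace, whichClass);
    return 0;
}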
The |= operator is not a comparison, it's an assignment. It's similar to +=, which adds the right-hand operand to the left-hand one and stores the result back in the latter. In this case, it does a bitwise OR, which sets the bits specified on the right-hand side. This is how you add flags to a bitmask.
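Tying it back to your snippet: the |= line reads the current traits, ORs in the bold bit, and writes the result back, leaving every flag that was already set untouched. Here's a sketch of the same code with that step spelled out (same UIKit calls as in your question):

#import <UIKit/UIKit.h>

UIFont *BoldBodyFont(void) {
    UIFont *bodyFont = [UIFont preferredFontForTextStyle:UIFontTextStyleBody];
    UIFontDescriptor *existingDescriptor = [bodyFont fontDescriptor];
    UIFontDescriptorSymbolicTraits traits = existingDescriptor.symbolicTraits;

    traits |= UIFontDescriptorTraitBold;            // shorthand ...
    // traits = traits | UIFontDescriptorTraitBold; // ... for this

    // OR only turns bits on, so anything the body font already had
    // (italic, condensed, the class bits, ...) is preserved.
    UIFontDescriptor *newDescriptor =
        [existingDescriptor fontDescriptorWithSymbolicTraits:traits];
    return [UIFont fontWithDescriptor:newDescriptor size:0.0]; // 0 keeps the descriptor's size
}

If you later wanted to remove the flag again, you'd clear that one bit with traits &= ~UIFontDescriptorTraitBold;.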