I have this code for color conversion:
// takes @"#123456"
+ (UIColor *)colorWithHexString:(NSString *)str {
    const char *cStr = [str cStringUsingEncoding:NSASCIIStringEncoding];
    long x = strtol(cStr+1, NULL, 16);
    return [UIColor colorWithHex:(UInt32)x];
}

// takes 0x123456
+ (UIColor *)colorWithHex:(UInt32)col {
    unsigned char r, g, b, a;
    b = col & 0xFF;
    g = (col >> 8) & 0xFF;
    r = (col >> 16) & 0xFF;
    a = (col >> 24) & 0xFF;
    if (a == 0) {
        a = 255;
    }
    return [UIColor colorWithRed:(float)r/255.0f green:(float)g/255.0f blue:(float)b/255.0f alpha:(float)a/255.0f];
}
The issue is that when I test this code on the simulator it returns the right color, but when I test it on a device (iPad mini 1) the color returned is

UIDeviceRGBColorSpace 1 1 1 0.0156863

which is white!
The iPad mini 1 has a 32-bit CPU, so the type long is a 32-bit signed integer. The number 0xffdd6858 overflows the 32-bit signed integer range, so strtol returns the maximum value 0x7fffffff instead, which decodes as a translucent white.
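For reference, this is how the byte extraction in colorWithHex: decodes that clamped value (a hypothetical walkthrough, assuming strtol did clamp the result to LONG_MAX on the 32-bit device):

UInt32 col = 0x7FFFFFFF;              // value returned by strtol on overflow (LONG_MAX on 32-bit)
unsigned char b = col & 0xFF;         // 0xFF -> blue  = 255
unsigned char g = (col >> 8) & 0xFF;  // 0xFF -> green = 255
unsigned char r = (col >> 16) & 0xFF; // 0xFF -> red   = 255
unsigned char a = (col >> 24) & 0xFF; // 0x7F -> alpha = 127, i.e. a semi-transparent white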
To fix this, use strtoul to ensure we have at least an unsigned 32-bit integer:
unsigned long x = strtoul(cStr+1, NULL, 16);
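For example, the parsing method with that one-line change applied (a minimal sketch, otherwise identical to your original code):

// takes @"#123456" or @"#ffdd6858"
+ (UIColor *)colorWithHexString:(NSString *)str {
    const char *cStr = [str cStringUsingEncoding:NSASCIIStringEncoding];
    unsigned long x = strtoul(cStr+1, NULL, 16); // unsigned parse, no clamping to LONG_MAX
    return [UIColor colorWithHex:(UInt32)x];
}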
The code is fine on the simulator because your computer is a 64-bit CPU, where long is a 64-bit signed integer, so the strtol call doesn't overflow and keeps the actual value.
Instead of writing your own method, consider reusing existing libraries like mRs-/HexColors or tinymind/UIColor-HexRGB.