In an iOS app, I need to provide image filters based on image size (width/height), similar to the "Large, Medium, Small" options in Google image search. Opening each image and reading its dimensions while building the list would be very performance intensive. Is there a way to get this information without opening the image itself?
Damien DeVille answered the question below. Based on his suggestion, I am now using the following code:
    NSURL *imageURL = [NSURL fileURLWithPath:imagePath];
    if (imageURL == nil)
        return;

    // Create an image source; this does not decode the image data.
    CGImageSourceRef imageSourceRef = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
    if (imageSourceRef == NULL)
        return;

    // Copy the properties of the primary image (index 0).
    CFDictionaryRef props = CGImageSourceCopyPropertiesAtIndex(imageSourceRef, 0, NULL);
    CFRelease(imageSourceRef);
    if (props == NULL)
        return;

    NSLog(@"%@", (__bridge NSDictionary *)props);
    CFRelease(props);
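For completeness, a minimal sketch of pulling the dimensions out of props (placed just before the CFRelease(props) line above), assuming the standard kCGImagePropertyPixelWidth and kCGImagePropertyPixelHeight keys are present:

    // Sketch: read the pixel dimensions from props before releasing it.
    NSDictionary *properties = (__bridge NSDictionary *)props;
    NSNumber *pixelWidth = properties[(__bridge NSString *)kCGImagePropertyPixelWidth];
    NSNumber *pixelHeight = properties[(__bridge NSString *)kCGImagePropertyPixelHeight];
    NSLog(@"Image is %@ x %@ pixels", pixelWidth, pixelHeight);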
You can use ImageIO to achieve that.
If you have the image URL (or create one from a file path with +fileURLWithPath: on NSURL), you can then create an image source with CGImageSourceCreateWithURL (you will have to bridge-cast the URL to CFURLRef).
Once you have the image source, you can get a CFDictionaryRef of the image's properties (which you can again bridge-cast to NSDictionary) by calling CGImageSourceCopyPropertiesAtIndex. What you get is a dictionary with plenty of properties about the image, including the pixel height and pixel width (under the kCGImagePropertyPixelHeight and kCGImagePropertyPixelWidth keys).
You can pass 0 as the index. The index exists because a single file can contain several embedded images (such as a thumbnail, or multiple frames as in a GIF).
Note that by using an image source, the full image does not have to be loaded into memory, but you can still access its properties.
Make sure you import <ImageIO/ImageIO.h> and add the ImageIO framework to your project.
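Putting the pieces together, here is a minimal self-contained sketch (the helper name imageSizeAtPath is my own, not part of ImageIO) that returns the pixel dimensions without decoding the image:

    #import <Foundation/Foundation.h>
    #import <CoreGraphics/CoreGraphics.h>
    #import <ImageIO/ImageIO.h>

    // Returns the pixel dimensions of the image at `path`, or CGSizeZero
    // if the file cannot be read as an image.
    static CGSize imageSizeAtPath(NSString *path)
    {
        NSURL *imageURL = [NSURL fileURLWithPath:path];
        if (imageURL == nil)
            return CGSizeZero;

        CGImageSourceRef source = CGImageSourceCreateWithURL((__bridge CFURLRef)imageURL, NULL);
        if (source == NULL)
            return CGSizeZero;

        // Index 0 is the primary image in the file.
        NSDictionary *properties =
            (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(source, 0, NULL);
        CFRelease(source);
        if (properties == nil)
            return CGSizeZero;

        NSNumber *width = properties[(__bridge NSString *)kCGImagePropertyPixelWidth];
        NSNumber *height = properties[(__bridge NSString *)kCGImagePropertyPixelHeight];
        return CGSizeMake(width.doubleValue, height.doubleValue);
    }

You could call this once per file while building your list and bucket the results into your size categories.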