I am creating an image processing app that performs two image analysis functions: one reads the RGB data of the image and the other reads the EXIF data. I am taking a photo with the front camera and then saving it to the Documents folder. To grab the RGB values I load the image like this:
NSString *jpgPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.jpg"];
UIImage *image = [UIImage imageWithContentsOfFile:jpgPath];
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(image.CGImage));
const UInt8* data = CFDataGetBytePtr(pixelData);
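For reference, individual pixel values can then be read from that buffer roughly like this. This is only a sketch: it assumes a 32-bit-per-pixel layout, and the actual channel order needs to be confirmed with CGImageGetBitmapInfo:
// Sketch: read the pixel at (x, y), assuming 32 bits per pixel; verify the layout first.
size_t bytesPerRow = CGImageGetBytesPerRow(image.CGImage);
size_t bytesPerPixel = CGImageGetBitsPerPixel(image.CGImage) / 8;
size_t x = 0, y = 0; // coordinates of the pixel of interest
const UInt8 *pixel = data + (y * bytesPerRow) + (x * bytesPerPixel);
UInt8 red   = pixel[0];  // channel order depends on CGImageGetBitmapInfo(image.CGImage)
UInt8 green = pixel[1];
UInt8 blue  = pixel[2];
CFRelease(pixelData);    // CGDataProviderCopyData returns a +1 reference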
Reading the pixels works as expected and I can get the pixel data. My issue is gathering the EXIF data. I read the image in the same manner as for the RGB data, but all of the EXIF values come back as NULL.
NSString *EXIFPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.jpg"];
NSURL *url = [NSURL fileURLWithPath:EXIFPath];
CGImageSourceRef sourceRef = CGImageSourceCreateWithURL((__bridge CFURLRef)url, NULL);
NSDictionary *immutableMetadata = (__bridge NSDictionary *) CGImageSourceCopyPropertiesAtIndex(sourceRef,0,NULL);
NSDictionary *exifDic = [immutableMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
NSNumber *ExifApertureValue = [exifDic objectForKey:(NSString*)kCGImagePropertyExifApertureValue];
NSNumber *ExifShutterSpeed = [exifDic objectForKey:(NSString*)kCGImagePropertyExifShutterSpeedValue];
NSLog(@"ExifApertureValue : %@ \n",ExifApertureValue);
NSLog(@"ExifShutterSpeed : %@ \n",ExifShutterSpeed);
If I change the first line of code to read an image preloaded in the app bundle, like this:
NSString *aPath = [[NSBundle mainBundle] pathForResource:@"IMG_1406" ofType:@"JPG"];
It works. The problem is that I cannot preload the images; they must be taken live from the camera. Any suggestions are greatly appreciated. Thank you.
How does that file Test.jpg get written to the Documents directory? Is it written with UIImageJPEGRepresentation? If so, the EXIF data will be lost. Make sure you store the JPEG source for any images you need the metadata for.
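How you keep the EXIF depends on how you capture. As one example, if you're using AVFoundation, you can write the camera's own JPEG bytes straight to disk instead of re-encoding a UIImage. A minimal sketch, assuming a configured AVCaptureStillImageOutput named stillImageOutput attached to a running session:
// Minimal sketch: save the camera's original JPEG, which still contains its EXIF block.
// Assumes #import <AVFoundation/AVFoundation.h> and a running capture session.
AVCaptureConnection *connection = [stillImageOutput connectionWithMediaType:AVMediaTypeVideo];
[stillImageOutput captureStillImageAsynchronouslyFromConnection:connection
                                               completionHandler:^(CMSampleBufferRef sampleBuffer, NSError *error) {
    if (sampleBuffer == NULL) {
        return;
    }
    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentationForJPEGSampleBuffer:sampleBuffer];
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/Test.jpg"];
    [jpegData writeToFile:path atomically:YES];
}];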
Regardless of what's happening, it will help to log the full immutableMetadata and exifDic objects as soon as you retrieve them.
NSDictionary *immutableMetadata = (__bridge NSDictionary *) CGImageSourceCopyPropertiesAtIndex(sourceRef,0,NULL);
NSLog(@"immutableMetadata = %@", immutableMetadata);
NSDictionary *exifDic = [immutableMetadata objectForKey:(NSString *)kCGImagePropertyExifDictionary];
NSLog(@"exifDic");
If your exifDic log only contains these three values, it's been saved by a function that didn't care about preserving the EXIF headers.
exifDic = {
    ColorSpace = 1;
    PixelXDimension = 1200;
    PixelYDimension = 1600;
}
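If you ever do need to re-encode the image yourself, ImageIO can carry the existing metadata across to the new file. A rough sketch, assuming sourceRef is an image source opened on the original JPEG and outputURL is a hypothetical destination file URL:
// Rough sketch: write a JPEG while copying the source image's metadata across.
// Needs ImageIO.framework and MobileCoreServices.framework (for kUTTypeJPEG).
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)outputURL, kUTTypeJPEG, 1, NULL);
// Copies the image and its properties (including EXIF) straight from the source.
CGImageDestinationAddImageFromSource(destination, sourceRef, 0, NULL);
CGImageDestinationFinalize(destination);
CFRelease(destination);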
Two other things in your code work, but could be better:
(1) There's no guarantee that the Documents directory will be a subdirectory of NSHomeDirectory(). The reliable way to get this Documents location is as follows:
NSArray *documentDirectories = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentDirectory = [documentDirectories objectAtIndex:0];
NSString *imagePath = [documentDirectory stringByAppendingPathComponent:@"Test.jpg"];
(2) You're currently loading the image bytes from storage twice: once to get the pixels and once to get the metadata. Load them into an NSData object and you only have to read the file once. Keep that NSData object and you can save the image later without losing any detail. (This will consume memory equal to the file size, so only keep it if you need it.)
NSData *imageData = [NSData dataWithContentsOfFile:imagePath];
UIImage *image = [UIImage imageWithData:imageData];
// Do things involving image pixels...
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((__bridge CFDataRef) imageData, NULL);
// Do things involving image metadata...
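One way the metadata step could continue from that same source, reusing the keys from the question (__bridge_transfer hands ownership of the copied dictionary to ARC):
// Read the EXIF dictionary from the shared image source, then release the source.
NSDictionary *properties = (__bridge_transfer NSDictionary *)CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);
NSDictionary *exif = [properties objectForKey:(NSString *)kCGImagePropertyExifDictionary];
NSLog(@"Aperture: %@", [exif objectForKey:(NSString *)kCGImagePropertyExifApertureValue]);
CFRelease(sourceRef);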