Tags: ios, swift, multithreading, grand-central-dispatch, exif

PHContentEditingInputRequestOptions for EXIF blocks the main thread


I am reading each image's EXIF metadata in order to get the LensMake value.

This is the only way I found to do this:

let lstAssets: PHFetchResult<PHAsset> = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: nil)
lstAssets.enumerateObjects({ object, index, stop in
    let options = PHContentEditingInputRequestOptions()
    object.requestContentEditingInput(with: options) { (contentEditingInput: PHContentEditingInput?, _) -> Void in
        let fullImage: CIImage = CIImage(contentsOf: contentEditingInput!.fullSizeImageURL!)!
        if let exif = fullImage.properties["{Exif}"] { ...

The problem is that the completion closure of requestContentEditingInput runs on the main thread and blocks anything coming from the UI (for example, an IBAction for a button). So if lstAssets.count is 100 and I make 100 requests in enumerateObjects, all the callbacks have to finish before anything in the UI can execute.

I tried, without success, wrapping the enumerateObjects call in DispatchQueue.global(qos: .background).async. The same happens if I move the callback code to a background thread.

Curiously, this behaviour is different if I use PHImageManager.default() with requestImageData for the images, but the problem is that I cannot get the EXIF data that way.

How do I unblock the queue? Is there a better way to get EXIF?

I am running Swift 3 in Xcode 8.2.

Update: weird GCD behaviour: running a requestImageData call inside enumerateObjects, right before the PHContentEditingInputRequestOptions request, unblocks the main thread; but when navigating to the next screen via a tap, the UI still waits for all the callbacks to finish.


Solution

  • If you only need to read the EXIF metadata, you can use PHImageManager, which plays more nicely with background queues. You can change your code to something like this:

    let operationQueue = OperationQueue()
    
    let lstAssets: PHFetchResult<PHAsset> = PHAsset.fetchAssets(with: PHAssetMediaType.image, options: nil)
    operationQueue.addOperation {
        self.readPhotosMetadata(result: lstAssets, operationQueue: operationQueue)
    }
    
    func readPhotosMetadata(result: PHFetchResult<PHAsset>, operationQueue: OperationQueue) {
        let imageManager = PHImageManager.default()
        result.enumerateObjects({ object, index, stop in
            let options = PHImageRequestOptions()
            options.isNetworkAccessAllowed = true
            options.isSynchronous = false
            imageManager.requestImageData(for: object, options: options, resultHandler: { (imageData, dataUTI, orientation, info) in
                operationQueue.addOperation {
                    guard let data = imageData,
                        let metadata = type(of: self).fetchPhotoMetadata(data: data) else {
                            print("metadata not found for \(object)")
                            return
                    }
                    print(metadata)
                }
            })
        })
    }
    
    static func fetchPhotoMetadata(data: Data) -> [String: Any]? {
        guard let selectedImageSourceRef = CGImageSourceCreateWithData(data as CFData, nil),
            let imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(selectedImageSourceRef, 0, nil) as? [String: Any] else {
            return nil
        }
        return imagePropertiesDictionary
    }
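
    The original goal was LensMake, which lives inside the nested "{Exif}" sub-dictionary of the properties returned by fetchPhotoMetadata. A minimal sketch of digging it out, using a mock metadata dictionary in place of a real CGImageSource result (the key names follow ImageIO's EXIF dictionary conventions; the sample values are made up):

    ```swift
    import Foundation

    // Sketch: extracting LensMake from an ImageIO-style properties dictionary.
    // The metadata here is mocked; in practice it comes from fetchPhotoMetadata(data:).
    let metadata: [String: Any] = [
        "{Exif}": [
            "LensMake": "Apple",
            "LensModel": "iPhone back camera 3.99mm f/1.8"
        ] as [String: Any]
    ]

    if let exif = metadata["{Exif}"] as? [String: Any],
       let lensMake = exif["LensMake"] as? String {
        print(lensMake)  // prints "Apple"
    }
    ```

    Remember that the operation queue's callbacks run on a background thread, so any UI update with the extracted value still needs to be dispatched back to the main queue.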