
Swift base64 image encoding with EXIF data


I am currently using base64 encoding to convert and send multiple images in a JSON file from my Swift app to my API using:

let imageData = image.jpegData(compressionQuality: 1.0)! // jpegData(compressionQuality:) returns Data?
let sSideL = imageData.base64EncodedString(options: .lineLength64Characters)

While extending my API, I would now like to use the rich EXIF data provided by most smartphones, such as lens information, field of view, or the device model. Most important for my current purpose is the "Image Model" tag, in order to identify the device that took the picture.

I noticed that some EXIF data survives in the base64 data coming through my API, but it is limited to very basic information such as the orientation. Also, when I print the base64 string directly in Xcode and analyze it, it contains very little EXIF information. Technically it should be possible: when I convert the same image with an online base64 converter and analyze the resulting string, I can see EXIF information such as "Image Model", etc.

Is there a way to convert my UIImage to a base64 string while keeping all EXIF details?

The API represents the main part of my system, so I would like to keep it as simple as possible and not add additional upload parameters.

EDIT: Here is my code to capture the UIImage:

extension CameraController: AVCapturePhotoCaptureDelegate {


    public func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                        resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {

        if let error = error {
            // ERROR
        }
        else if let buffer = photoSampleBuffer,
            let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil),
            let image = UIImage(data: data) {

            // SEND IMAGE TO SERVER
        }
        else {
            // UNKNOWN ERROR
        }
    }
}

Solution

  • You can use the newer (iOS 11+) delegate method:

        public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            if let error = error {
                // ERROR
            } else if let data = photo.fileDataRepresentation() {
                // SEND IMAGE DATA TO SERVER
            } else {
                // UNKNOWN ERROR
            }
        }

    or the method you are using:

        public func photoOutput(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhoto photoSampleBuffer: CMSampleBuffer?, previewPhoto previewPhotoSampleBuffer: CMSampleBuffer?,
                                resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Swift.Error?) {

            if let error = error {
                // ERROR
            } else if let buffer = photoSampleBuffer,
                let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer, previewPhotoSampleBuffer: nil) {
                // SEND IMAGE DATA TO SERVER
            } else {
                // UNKNOWN ERROR
            }
        }

    Like leo-dabus mentioned, you need to send the image data to the server, because that data contains the metadata. If you first create a UIImage and convert it back to data, the metadata is lost.
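
    As a minimal sketch of how this fits together (the sendToServer call is a hypothetical placeholder, not part of the original code), the data from fileDataRepresentation() can be base64-encoded directly, skipping the UIImage round trip, so the EXIF block survives:

        import AVFoundation
        import Foundation

        extension CameraController: AVCapturePhotoCaptureDelegate {

            public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
                guard error == nil, let data = photo.fileDataRepresentation() else {
                    // ERROR
                    return
                }

                // Encode the raw container bytes; the EXIF/TIFF metadata
                // (including the "Image Model" tag) is part of these bytes.
                let base64String = data.base64EncodedString(options: .lineLength64Characters)

                // sendToServer(base64String) // hypothetical upload helper
            }
        }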
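
    To verify on the client that the metadata actually survived, the data can be inspected with ImageIO before uploading (a sketch; logMetadata is an illustrative name, and the device model is stored in the TIFF sub-dictionary rather than the EXIF one):

        import Foundation
        import ImageIO

        func logMetadata(from data: Data) {
            guard let source = CGImageSourceCreateWithData(data as CFData, nil),
                let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] else {
                return
            }
            // The "Image Model" tag corresponds to kCGImagePropertyTIFFModel.
            if let tiff = properties[kCGImagePropertyTIFFDictionary] as? [CFString: Any] {
                print("Image Model:", tiff[kCGImagePropertyTIFFModel] ?? "missing")
            }
            if let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] {
                print("EXIF keys:", exif.keys)
            }
        }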