I am using Alamofire to upload multiple files at the same time to Open Asset using their REST API, and I am able to get this to work; however, most of the EXIF data is being stripped out. Unfortunately, the EXIF data is a must, as we need the ability to mine out the GPS tags and a few other things through various web clients.
After doing some research, I found the issue is that I'm using UIImageJPEGRepresentation to convert the photos to NSData (which is what Alamofire expects, or a fileURL, which I don't think would work for me?). I am also using the BSImagePicker library to allow the user to take/select multiple photos, which returns an array of PHAssets that then get converted to NSData. Here is my function to do this (where collectedImages is a global dictionary):
func compressPhotos(assets: [PHAsset]) -> Void {
    for asset in assets {
        let filename = self.getOriginalFilename(asset)
        let assetImage = self.getAssetPhoto(asset)
        let compressedImage = UIImageJPEGRepresentation(assetImage, 0.5)! // bye bye metadata :(
        collectedImages[filename] = compressedImage
        print("compressed image: \(filename)")
    }
}
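For reference, the two helpers used above are simple Photos-framework lookups; simplified sketches (not the exact implementations) would look roughly like this:

func getOriginalFilename(asset: PHAsset) -> String {
    // PHAssetResource exposes the original filename on iOS 9+
    if let resource = PHAssetResource.assetResourcesForAsset(asset).first {
        return resource.originalFilename
    }
    return NSUUID().UUIDString + ".jpg"
}

func getAssetPhoto(asset: PHAsset) -> UIImage {
    // Synchronously request the full-size UIImage for the asset
    let options = PHImageRequestOptions()
    options.synchronous = true
    var image = UIImage()
    PHImageManager.defaultManager().requestImageForAsset(asset,
        targetSize: PHImageManagerMaximumSize,
        contentMode: .AspectFit,
        options: options) { result, _ in
            if let result = result {
                image = result
            }
    }
    return image
}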
I think I could retain the EXIF data if I could use the full path from the PHAsset to the image stored locally on the phone, but Alamofire does not appear to support that. I'm hoping I'm wrong about that. Here is my uploader:
func uploadPhotos(projectId: String, categoryId: String, data: [String: NSData], completionHandler: (AnyObject?, NSError?) -> ()) {
    var jsonBody = [AnyObject]() // lazy
    Alamofire.upload(
        .POST,
        self.url + "/Files",
        multipartFormData: { multipartFormData in
            for (filename, img) in data {
                jsonBody.append(["project_id": projectId, "category_id": categoryId, "original_filename": filename])
                multipartFormData.appendBodyPart(data: img, name: "file", fileName: filename, mimeType: "image/jpeg")
                print("img size: \(img.length)")
            }
            let jsonData = jsonToNSData(jsonBody)
            print("_jsonBody: \(jsonBody)")
            multipartFormData.appendBodyPart(data: jsonData!, name: "_jsonBody")
            print("multipart: \(multipartFormData)")
        },
        encodingCompletion: { encodingResult in
            switch encodingResult {
            case .Success(let upload, _, _):
                upload.responseJSON { response in
                    debugPrint(response)
                    switch response.result {
                    case .Success(let value):
                        completionHandler(value as? NSArray, nil)
                    case .Failure(let error):
                        completionHandler(nil, error)
                    }
                }
            case .Failure(let encodingError):
                print(encodingError)
            }
        }
    )
}
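For completeness, jsonToNSData is essentially a thin wrapper around NSJSONSerialization; a minimal version (a sketch, not the exact code) would be something like this:

func jsonToNSData(json: AnyObject) -> NSData? {
    // Serialize the metadata array into NSData for the "_jsonBody" multipart part
    do {
        return try NSJSONSerialization.dataWithJSONObject(json, options: [])
    } catch {
        print("json serialization error: \(error)")
        return nil
    }
}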
So my question is: how can I upload multiple photos (while passing in other parameters too) while maintaining all the EXIF data? Alamofire is an awesome library and I would like to use it here, but I'm not married to it for the upload process if I can't keep the EXIF data.
I am not sure why I was having issues with fullSizeImageURL, but it did lead me down the right path, as I was able to get this to work by getting the image as NSData from the file path like this:
asset.requestContentEditingInputWithOptions(PHContentEditingInputRequestOptions()) { (input, _) in
    let fileURL = input!.fullSizeImageURL?.filePathURL
    let data = NSData(contentsOfFile: fileURL!.path!)!
    // data now holds the original file bytes, EXIF included
}
And then I just passed that data into the Alamofire upload as the data argument. This maintained all the original photo metadata.
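A rough sketch of how the collection step could be rewritten around that idea, replacing compressPhotos above (an untested sketch, not my exact code; the dispatch group is just one way to wait for all the asynchronous callbacks before kicking off the upload):

func collectPhotoData(assets: [PHAsset], completion: () -> Void) {
    let group = dispatch_group_create()
    for asset in assets {
        let filename = self.getOriginalFilename(asset)
        dispatch_group_enter(group)
        // Read the original file bytes from disk so the EXIF/GPS tags survive,
        // instead of re-encoding with UIImageJPEGRepresentation
        asset.requestContentEditingInputWithOptions(PHContentEditingInputRequestOptions()) { (input, _) in
            if let path = input?.fullSizeImageURL?.filePathURL?.path,
               data = NSData(contentsOfFile: path) {
                collectedImages[filename] = data
            }
            dispatch_group_leave(group)
        }
    }
    // Call completion (e.g. kick off uploadPhotos) once every asset has been collected
    dispatch_group_notify(group, dispatch_get_main_queue(), completion)
}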