I'm using the Azure Face Recognition API in an iPhone app. It's working just fine when I take pictures with the back camera but when I use the front-facing one, the API fails to detect faces.
I tried transferring the (front-facing) photo to my laptop and dragging it into the test area in the documentation, and there the face was detected without a problem.
This leads me to believe that there may be some metadata or flags specific to front-facing photos that confuse the API, and that those get stripped when the image is uploaded through a browser?
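One way to check that theory is to log the image's orientation before uploading; this is just a quick sanity check, assuming photo is the UIImage coming back from the camera:

// Front-camera shots typically report something other than .up,
// i.e. the pixel data is unrotated and only the orientation flag differs.
print(photo.imageOrientation == .up ? "pixels already upright" : "relies on EXIF orientation tag")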
UPDATE
Here's how I'm uploading the file using Alamofire:
let data = UIImageJPEGRepresentation(photo, 0.5)!   // JPEG-encode at 50% quality (force-unwrapped for brevity)
let url = "https://.../detect"
let octetHeaders = ["Content-Type": "application/octet-stream", "Ocp-Apim-Subscription-Key": "..."]
Alamofire.upload(data, to: url, method: .post, headers: octetHeaders)
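For completeness, chaining a response handler onto that same call is how I check whether anything was detected (a sketch, assuming Alamofire 4's JSON response serializer and that the detect endpoint returns a JSON array of faces):

Alamofire.upload(data, to: url, method: .post, headers: octetHeaders)
    .responseJSON { response in
        // The detect endpoint returns a JSON array; an empty array means no faces were found.
        if let faces = response.result.value as? [[String: Any]] {
            print("Detected \(faces.count) face(s)")
        }
    }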
Thanks!
Xuan Hu was right in the comments. It turns out the iPhone doesn't rotate the image data itself; it just sets an EXIF orientation tag.
Hard-rotating the photo before uploading made it all work:
func normalizeImageRotation(_ image: UIImage) -> UIImage {
    // Already upright: nothing to do.
    if image.imageOrientation == .up { return image }

    // Redraw the image into a new context. Drawing honors the orientation flag,
    // so the resulting bitmap has physically rotated pixels and an .up orientation.
    UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
    image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()

    return normalizedImage
}
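For reference, this is roughly how it slots into the upload code from the question (a sketch, with photo, url and octetHeaders as above):

let uprightPhoto = normalizeImageRotation(photo)
let data = UIImageJPEGRepresentation(uprightPhoto, 0.5)!   // pixels are now physically rotated, no EXIF flag needed
Alamofire.upload(data, to: url, method: .post, headers: octetHeaders)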