I have an issue while using GoogleMobileVision for iOS.
With UIImagePickerController set up like this:
UIImagePickerController* picker = [[UIImagePickerController alloc] init];
picker.delegate = self;
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
[self presentViewController:picker animated:YES completion:^{
    self.faceImageView.layer.sublayers = nil; // drawing and re-drawing some lines...
}];
And the detector configured like this:
[super viewDidLoad];
NSDictionary* options = @{
    GMVDetectorFaceLandmarkType : @(GMVDetectorFaceLandmarkAll),
    GMVDetectorFaceClassificationType : @(GMVDetectorFaceClassificationAll),
    GMVDetectorFaceTrackingEnabled : @(NO),
    //GMVDetectorFaceMode : @(GMVDetectorFaceAccurateMode) // Accurate mode detects the face, but with the wrong orientation; fast mode can't detect faces at all!
};
self.faceDetector = [GMVDetector detectorOfType:GMVDetectorTypeFace options:options];
But if I use picker.allowsEditing = YES;
everything works perfectly!
Question: is the reason the image size? With picker.allowsEditing = YES;
the returned image is 750x750 on an iPhone 6s, while with the default value of picker.allowsEditing it is 1932x2576.
Xcode 8.1, iPhone 6s, iOS 10.1.1, GoogleMobileVision 1.0.4
Just normalize the image's orientation by using this UIImage extension:
extension UIImage {
    // Redraws the image into a new context so its orientation becomes .up
    // and the pixel data matches what is displayed on screen.
    func imageByNormalizingOrientation() -> UIImage {
        if imageOrientation == .up {
            return self
        }
        UIGraphicsBeginImageContextWithOptions(size, false, scale)
        draw(in: CGRect(origin: .zero, size: size))
        let normalizedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return normalizedImage ?? self
    }
}
and then pass the normalized image to features(in:options:).
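For example, this is roughly how the picked image could be normalized and handed to the detector (a minimal Swift 3-era sketch matching the question's setup; the faceDetector property refers to the GMVDetector configured in the question, and the cast to GMVFaceFeature is an assumption about the returned feature type):

// Sketch of the UIImagePickerControllerDelegate callback (Swift 3 / iOS 10 API names).
func imagePickerController(_ picker: UIImagePickerController,
                           didFinishPickingMediaWithInfo info: [String : Any]) {
    picker.dismiss(animated: true, completion: nil)
    guard let picked = info[UIImagePickerControllerOriginalImage] as? UIImage else { return }

    // Normalize the orientation before running detection.
    let normalized = picked.imageByNormalizingOrientation()

    // Run face detection on the normalized image.
    let faces = faceDetector.features(in: normalized, options: nil) as? [GMVFaceFeature] ?? []
    print("Detected \(faces.count) face(s)")
}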