Tags: iphone, objective-c, ios, face-detection

Face Detection issue using CIDetector


I'm working on an app in which I have to detect the left eye, right eye, and mouth positions. I have an imageView on my self.view, and the imageView contains a face image; now I want to get the coordinates of both eyes and the mouth. I have seen 2-3 sample codes for this, but they are all roughly the same: in every one, you have to invert the view to make the coordinates match, which I don't want to do because my view has other controls on it. One more thing: they all use

UIImageView *imageView = [[UIImageView alloc]initWithImage:[UIImage imageNamed:@"image.png"]];

but my imageView already has a frame, so I can't init it with an image. When I do so, I find that the faceFeature's eye and mouth coordinates are wrong.

I started my code from this sample code, but in it, too, the view's Y coordinate gets inverted.

Can anyone help me detect the positions of the eyes and mouth in the UIImageView's image without inverting my self.view?

Please let me know if my question is not clear enough.


Solution

  • Worked it out! I edited the class to have a faceContainer which holds all of the face objects (the mouth and eyes); then only this container is rotated, and that's all. Obviously this is very crude, but it does work. Here is a link: http://www.jonathanlking.com/download/AppDelegate.m. Replace the app delegate from the sample code with it.

    -- OLD POST --

    Have a look at this Apple documentation, and at slide 42 onwards from this Apple talk. You should also watch the talk, as it includes a demo of what you are trying to achieve; it's called "Using Core Image on iOS & Mac OS X" and is here.
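
    The underlying issue is that Core Image reports feature positions with the origin at the bottom-left, while UIKit's origin is at the top-left. Instead of inverting the whole view (or a container), each reported point can be flipped individually. Below is a minimal sketch of that arithmetic in plain C, with simple structs standing in for CGPoint/CGRect; in the actual app the same math would apply to properties like faceFeature.leftEyePosition and faceFeature.bounds (the struct and function names here are illustrative, not from the sample code).

    ```c
    #include <stdio.h>

    /* Stand-ins for CGPoint and CGRect. */
    typedef struct { double x, y; } Point;
    typedef struct { double x, y, width, height; } Rect;

    /* Flip a Core Image point into UIKit coordinates,
     * given the height of the image it was detected in. */
    Point flip_point(Point p, double imageHeight) {
        Point out = { p.x, imageHeight - p.y };
        return out;
    }

    /* Flip a Core Image rect (e.g. a face's bounds) into UIKit
     * coordinates; the rect's own height must also be accounted for,
     * because the origin moves from bottom-left to top-left. */
    Rect flip_rect(Rect r, double imageHeight) {
        Rect out = { r.x, imageHeight - r.y - r.height, r.width, r.height };
        return out;
    }

    int main(void) {
        /* Example: a 480-pt-tall image; an eye reported at (100, 300). */
        Point eye  = flip_point((Point){100, 300}, 480);
        Rect  face = flip_rect((Rect){80, 120, 200, 240}, 480);
        printf("eye:  (%.0f, %.0f)\n", eye.x, eye.y);
        printf("face: (%.0f, %.0f, %.0f x %.0f)\n",
               face.x, face.y, face.width, face.height);
        return 0;
    }
    ```

    With this per-point conversion, only the detected coordinates are transformed, so the rest of the view hierarchy (and any other controls on self.view) is left untouched. If the image is scaled inside the UIImageView, the points would additionally need to be scaled by the view-to-image ratio.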