I am following a facial recognition app example from the Pro iOS 5 Augmented Reality book. I even downloaded the book's source code and ran it as-is, and the problem persists with the author's code. Here is the problem: the app crashes on the assignment of the array returned by featuresInImage: when a CIDetector configured for face detection is given a CIImage built from a CGImage. From logging, it seems this method is called very many times. I am using cocos2d + Chipmunk, so this all runs inside a CCScene. Note that the crash is an EXC_BAD_ACCESS (code=1, address=0x4499923c).
Help please?
- (void)facialRecognitionRequest:(UIImage *)image {
    //NSLog(@"Image is: %f by %f", image.size.width, image.size.height);
    if (!isProcessingRequest) {
        isProcessingRequest = YES;
        //NSLog(@"Detecting Faces");
        NSArray* arr = [detector featuresInImage:[CIImage imageWithCGImage:[image CGImage]]]; // CRASHES HERE
        if ([arr count] > 0) {
            //NSLog(@"Faces found.");
            for (int i = 0; i < 1; i++) { //< [arr count]; i++) {
                CIFaceFeature *feature = [arr objectAtIndex:i];
                double xPosition = (feature.leftEyePosition.x + feature.rightEyePosition.x+feature.mouthPosition.x)/(3*image.size.width);
                double yPosition = (feature.leftEyePosition.y + feature.rightEyePosition.y+feature.mouthPosition.y)/(3*image.size.height);
                double dist = sqrt(pow((feature.leftEyePosition.x - feature.rightEyePosition.x),2)+pow((feature.leftEyePosition.y - feature.rightEyePosition.y),2))/image.size.width;
                yPosition += dist;
                CGSize size = [[CCDirector sharedDirector] winSize];
                pumpkin.opacity = 255;
                pumpkin.scale = 5*(size.width*dist)/256.0;
                //int randomPumpkin = ((arc4random() % 10) + 5);
                [pumpkin setDisplayFrame:[[CCSpriteFrameCache sharedSpriteFrameCache] spriteFrameByName:[NSString stringWithFormat:@"pumpkin%d.png", pumpkin_count + 4]]];
                CCMoveTo *moveAction = [CCMoveTo actionWithDuration:0 position:ccp((size.width * (xPosition)), (size.height * ((yPosition))))];
                [pumpkin runAction:moveAction];
            }
        } else {
            pumpkin.opacity = 0;
        }
    }
    isProcessingRequest = NO;
}
Assigning the CIDetector:
- (id)init {
    if (self = [super init]) {
        // ....... other stuff here
        NSDictionary *detectorOptions = [NSDictionary dictionaryWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
        self.detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions]; // CIDetector instance named detector is my property
    }
    return self;
}
I tried:
CGImageRef theCGImage = [image CGImage];
NSLog(@"theCGImage: %@", theCGImage);
CIImage *theCIImage = [CIImage imageWithCGImage:theCGImage];
NSLog(@"theCIImage: %@", theCIImage);
NSArray* arr = [detector featuresInImage:theCIImage];
NSLog(@"arr: %@", arr);
Here are the results:
2012-04-15 19:08:25.136 Ch8[981:609f] tmpCGImage: <CGImage 0x1f689c00>
2012-04-15 19:08:25.143 Ch8[981:609f] tmpCIImage: <CIImage: 0x1f687970 extent [0 0 480 360]>
2012-04-15 19:08:25.282 Ch8[981:609f] arr: (
"<CIFaceFeatureInternal: 0x1f58e080>"
)
I also tried enabling NSZombies, but still no luck... any ideas?
In answer to the comment (not the entire question; posted as an answer just for the formatting):
"how would i do that do i just write if statements to see if they are non-zero then log it?"
Instead of:
NSArray* arr = [detector featuresInImage:[CIImage imageWithCGImage:[image CGImage]]];
break it up into three statements:
CGImageRef theCGImage = [image CGImage];
NSLog(@"theCGImage: %@", theCGImage);
CIImage *theCIImage = [CIImage imageWithCGImage:theCGImage];
NSLog(@"theCIImage: %@", theCIImage);
NSArray* arr = [detector featuresInImage:theCIImage];
NSLog(@"arr: %@", arr);
That way the offending statement can be pinned down. This is a general debugging technique, and it is not a bad way to write the code in any case.
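To tie that back to the "if statements" part of the comment: once the call is split up, you can guard each step and only log when something is actually missing. A minimal sketch using the same names as above (the NULL check and early return are my additions for illustration, not the book's code):
- (void)facialRecognitionRequest:(UIImage *)image {
    CGImageRef theCGImage = [image CGImage];
    if (theCGImage == NULL) {
        // Nothing to hand to Core Image; bail out rather than crash.
        NSLog(@"UIImage has no CGImage backing it");
        return;
    }
    CIImage *theCIImage = [CIImage imageWithCGImage:theCGImage];
    NSArray *arr = [detector featuresInImage:theCIImage];
    NSLog(@"found %lu feature(s)", (unsigned long)[arr count]);
    // ... position the sprite from the features as before ...
}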
The NSLog statements are not really necessary; you can instead put a breakpoint on the first statement and single-step through, inspecting each value as you go.
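For example, at each step you can print the intermediate values from the debugger console rather than logging them (this assumes Xcode's lldb or gdb console; po prints an object's description):
(lldb) po theCGImage
(lldb) po theCIImage
(lldb) po arr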
For crashes due to premature releases, use NSZombies. It can be enabled in Xcode under "Edit Scheme", on the "Diagnostics" tab; be sure to turn it off when running on the device.
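If zombies do finger one of the objects involved here (the UIImage, its backing CGImage, or the detector itself), one thing worth double-checking, purely a guess from the code shown, is that the detector property really retains the autoreleased CIDetector; messaging a deallocated detector on a later frame would produce exactly this kind of EXC_BAD_ACCESS. Under manual reference counting that would look something like:
// In the @interface (use strong instead of retain under ARC):
@property (nonatomic, retain) CIDetector *detector;

// In init, as in the question:
self.detector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

// And balance it in dealloc:
- (void)dealloc {
    [detector release];
    [super dealloc];
}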