ios, iphone, image, avcapturesession, avcapturedevice

Front Flash Brightness iPhone


I have been working on a project for a while now, and there is one remaining issue I haven't been able to figure out.

In the application, when taking a front-facing picture, I would like the front (screen) flash to actually make the picture brighter.

I am using a custom, full-screen AVCaptureSession camera. Here is the code; it does make the screen flash, but the picture isn't any brighter.

//Here is the code for a front flash on the picture button press. It does flash, just doesn't help.
UIWindow* wnd = [UIApplication sharedApplication].keyWindow;
UIView *v = [[UIView alloc] initWithFrame: CGRectMake(0, 0, wnd.frame.size.width, wnd.frame.size.height)];
[wnd addSubview: v];
v.backgroundColor = [UIColor whiteColor];
[UIView beginAnimations: nil context: nil];
[UIView setAnimationDuration: 1.0];
v.alpha = 0.0f;
[UIView commitAnimations];

//imageView is just the view that the camera's image fills.
imageView.hidden = NO;
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for(AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection) {
        break;
    }
}

[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *thePicture  = [UIImage imageWithData:imageData];
        self.imageView.image = thePicture;
        //After the picture is on the screen, make sure the buttons are shown/hidden as they should be.
        saveButtonOutlet.hidden = NO;
        saveButtonOutlet.enabled = YES;
        diaryEntryOutlet.hidden = YES;
        diaryEntryOutlet.enabled = NO;
    }

}];

}

Solution

  • You need to set the screen to white before the image is captured, wait for the capture to complete and then remove the white screen in the completion block.

    You should also dispatch the capture after a short delay to ensure the screen has turned white -

    UIWindow* wnd = [UIApplication sharedApplication].keyWindow;
    UIView *v = [[UIView alloc] initWithFrame: CGRectMake(0, 0, wnd.frame.size.width, wnd.frame.size.height)];
    [wnd addSubview: v];
    v.backgroundColor = [UIColor whiteColor];
    
    dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.1 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
        [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer != NULL) {
                imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                UIImage *thePicture  = [UIImage imageWithData:imageData];
    
                dispatch_async(dispatch_get_main_queue(), ^{
                    self.imageView.image = thePicture;
                    [v removeFromSuperview];
                });
            }
            // After the picture is on screen, update the buttons as well - on the main queue,
            // since the completion handler is not guaranteed to run there.
            dispatch_async(dispatch_get_main_queue(), ^{
                saveButtonOutlet.hidden = NO;
                saveButtonOutlet.enabled = YES;
                diaryEntryOutlet.hidden = YES;
                diaryEntryOutlet.enabled = NO;
            });
        }];
    });
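
    One additional tweak worth trying: while the white view is visible, raise the screen brightness to maximum and restore it afterwards, so the screen "flash" throws more light on the subject. The sketch below only builds on the code above; previousBrightness is a hypothetical local variable introduced here, and UIScreen's brightness property is the only API used.

    // Remember the current brightness and max it out just before showing the white view.
    // previousBrightness is a hypothetical variable introduced for this sketch.
    CGFloat previousBrightness = [UIScreen mainScreen].brightness;
    [UIScreen mainScreen].brightness = 1.0;

    // ...add the white view and dispatch the capture exactly as above...

    // In the main-queue block that removes the white view, restore the old brightness:
    dispatch_async(dispatch_get_main_queue(), ^{
        [UIScreen mainScreen].brightness = previousBrightness;
        [v removeFromSuperview];
    });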