
How can I get a UIImage from GPUImage camera?


I know I can use the still camera's capturePhotoAsImageProcessedUpToFilter method, but it plays a shutter click sound, and since I still have some processing to do, I don't want the sound to go off yet.

I tried using a filter's imageFromCurrentFramebuffer method, but it always returns nil.

CGRect frame = CGRectMake(0, 0, self.view.frame.size.height, self.view.frame.size.width);
_backgroundImageView = [[GPUImageView alloc] initWithFrame:frame];
[self.view insertSubview:_backgroundImageView atIndex:0];

_camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetHigh cameraPosition:AVCaptureDevicePositionBack];
_camera.outputImageOrientation = UIInterfaceOrientationLandscapeRight;
filter = [[GPUImageBrightnessFilter alloc] init];
filter.brightness = 0;

[_camera addTarget:filter];

[filter addTarget:_backgroundImageView];
[_camera startCameraCapture];
// processTimer is to periodically do processing on what's being shown on screen.
processTimer = [NSTimer timerWithTimeInterval:10 target:self selector:@selector(processImage) userInfo:nil repeats:NO];
[[NSRunLoop mainRunLoop] addTimer:processTimer forMode:NSRunLoopCommonModes];


-(void) processImage {
    [processTimer invalidate];

    UIImage *testImage = [filter imageFromCurrentFramebuffer]; // always nil

    [_camera capturePhotoAsImageProcessedUpToFilter:filter withCompletionHandler:^(UIImage *processedImage, NSError *error) {

         // additional processing here ... don't want shutter sound yet.

    }];
}

Solution

  • OK... I finally found the answer: I need to call useNextFrameForImageCapture on the filter before calling imageFromCurrentFramebuffer. It still occasionally returns nil, but it gets me what I need.
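
For reference, a minimal sketch of the timer callback with that change applied, reusing the filter and processTimer ivars from the question. As I understand GPUImage's framebuffer caching, useNextFrameForImageCapture tells the filter to keep its next rendered framebuffer around for capture; the nil check reflects the note above that the capture can still occasionally miss a frame.

-(void) processImage {
    [processTimer invalidate];

    // Ask the filter to hold on to its next rendered framebuffer. Without this
    // call the framebuffer has already been recycled by the time
    // imageFromCurrentFramebuffer runs, which is why it was returning nil.
    [filter useNextFrameForImageCapture];

    // Grab the retained frame as a UIImage. As noted above, this can still
    // come back nil occasionally, so guard before doing more work.
    UIImage *testImage = [filter imageFromCurrentFramebuffer];
    if (testImage != nil) {
        // ... additional processing here, with no shutter sound ...
    }
}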