Tags: ios, multithreading, grand-central-dispatch, core-image, dispatch-async

dispatch_barrier_async doesn't wait for CoreImage to finish with processing


Before I ask my question, I should say I have read a lot about this and tried many approaches, but none has worked. I run dozens of Core Image processing tasks on a concurrent queue, and I need to wait for them to finish (using dispatch_barrier_async) so that only then do I do my final render and move on to the next view controller. Ironically, dispatch_barrier_async doesn't wait for my concurrent queue to finish. Why is that? Is it because I'm doing the Core Image processing on the wrong queue?

//Here is my concurrent queue.

dispatch_queue_t concurrentQueue = 
dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_BACKGROUND, 0);

I use it for processing effects; here is one as an example:

-(void)setupEffects{
    //One of my effects as an example; it renders a preview of the effect.

case Effect4:{
              dispatch_async(concurrentQueue, ^{
                  //BG
                  self.firstCIFilter = [CIFilter filterWithName:@"CIHexagonalPixellate"
                                            withInputParameters:@{@"inputImage": [self getFirstCIImage],@"inputScale":@26}];
                  self.lastSelectedInputImgforBG =[self applyCIEffectWithCrop];

                  //FG
                  self.firstCIFilter = [CIFilter filterWithName:@"CIPhotoEffectProcess"
                                            withInputParameters:@{@"inputImage":[self getFirstCIImage]}];
                  self.fgImgWithEffect = [self applyCIEffect];

                  dispatch_async(dispatch_get_main_queue(), ^{
                      self.lastSelectedInputImgforFG= [self cropAndFadeAndRenderFGImage];
                      [self saveEffect];
                      [self loadEffectsWithIndex:effectIndex];
                  });
              });
 }

//Once user is done, it renders the image once again
-(UIImage *)applyCIEffectWithCrop{
    __weak typeof(self) weakSelf = self;
    @autoreleasepool{
        weakSelf.firstCIContext =nil;
        weakSelf.firstResultCIImage=nil;
        weakSelf.croppingCIImage=nil;

        weakSelf.firstCIContext = [CIContext contextWithOptions:nil];
        weakSelf.firstResultCIImage = [weakSelf.firstCIFilter valueForKey:kCIOutputImageKey];
        weakSelf.croppingCIImage=[weakSelf.firstResultCIImage imageByCroppingToRect:CGRectMake(0,0, weakSelf.affineClampImage1.size.width*scale , weakSelf.affineClampImage1.size.height*scale)];
        return  [UIImage imageFromCIImage:weakSelf.croppingCIImage scale:1.0 orientation:weakSelf.scaledDownInputImage.imageOrientation cropped:YES withFirstCIImage:[weakSelf getFirstCIImage]];
    }
}

And then for my final render, this method needs to wait for setupEffects to finish before performing the segue, but it doesn't:

- (void)doneButtonAction {
    _finalRender =YES;
    CGFloat max=MAX(self.originalSizeInputImage.size.width,self.originalSizeInputImage.size.height);
    if (max<=1700){
        //Do nothing for Final Render
        self.scaledDownInputImage= self.originalSizeInputImage;
    }else{
        CGSize scaledDownSize = [self getScalingSizeForFinalRenderForImage: self.originalSizeInputImage];
        self.scaledDownInputImage = [self scaleThisImage:self.originalSizeInputImage scaledToFillSize:scaledDownSize];
    }
    imageRect = AVMakeRectWithAspectRatioInsideRect(self.scaledDownInputImage.size, self.viewWithLoadedImages.bounds);

    //Preparation for a high quality render with the high resolution input image.
    self.affineClampImage1 = [self affineClampImage];
    self.selectionCropAndBlurredImage = [self croppedFGtoGetBlurred];
    [self.imgData appendData:UIImagePNGRepresentation(self.scaledDownInputImage)];
    [self.effectClass getimageWithImageData:self.imgData];

    if (_effectMode) {
        //Applying effects again for the high resolution input image.
        [self setupEffects];
    }else{
        [self setupFilters];
    }

    dispatch_async(concurrentQueue, ^{
        //Rendering the high quality Images in full resolution here.
        CGRect frame = CGRectMake(0.0, 0.0,
                                  self.lastSelectedInputImgforBG.size.width  *self.lastSelectedInputImgforBG.scale,
                                  self.lastSelectedInputImgforBG.size.height *self.lastSelectedInputImgforBG.scale);
        UIGraphicsBeginImageContextWithOptions(frame.size, NO, 1.0);
        // Draw transparent images on top of each other
        [self.lastSelectedInputImgforBG drawInRect:frame];
        [self.lastSelectedInputImgforFG drawInRect:frame];
        self.tempImage=nil;
        self.tempImage = UIGraphicsGetImageFromCurrentImageContext();        
        UIGraphicsEndImageContext();
    });

    dispatch_barrier_async(concurrentQueue, ^{
        //Get the full-resolution rendered image and go to the next
        //view controller once setupEffects and the render have
        //finished... except the barrier doesn't actually wait for them.
        self.finalHightqualityRenderedImage = self.tempImage;        
        [self performSegueWithIdentifier:@"showShareVC" sender:self];
    });
}

I should mention that my code works without problems when I don't use the concurrent queue, but of course that blocks the UI until it's done, which is not my goal. Your help will be truly appreciated.


Solution

  • I think the explanation is at the bottom of the dispatch_barrier_async documentation:

    The queue you specify should be a concurrent queue that you create yourself using the dispatch_queue_create function. If the queue you pass to this function is a serial queue or one of the global concurrent queues, this function behaves like the dispatch_async function.

    So instead of grabbing a global queue with DISPATCH_QUEUE_PRIORITY_BACKGROUND as in your first line of code, create concurrentQueue yourself using dispatch_queue_create with the DISPATCH_QUEUE_CONCURRENT attribute.
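
    A minimal sketch of that fix (the queue label and the contents of the blocks are illustrative, not from the original code):

    ```objectivec
    // Barriers only take effect on private concurrent queues created
    // with dispatch_queue_create; on global queues they degrade to a
    // plain dispatch_async.
    dispatch_queue_t concurrentQueue =
        dispatch_queue_create("com.example.effects", DISPATCH_QUEUE_CONCURRENT);

    // Submit the effect-processing work as before.
    dispatch_async(concurrentQueue, ^{
        // ... Core Image processing ...
    });

    // This block starts only after every block submitted to
    // concurrentQueue before it has finished.
    dispatch_barrier_async(concurrentQueue, ^{
        // ... final full-resolution render ...
        dispatch_async(dispatch_get_main_queue(), ^{
            // UI work (segues, etc.) belongs on the main queue.
            [self performSegueWithIdentifier:@"showShareVC" sender:self];
        });
    });
    ```

    One caveat: a barrier only orders work submitted to that same queue. The nested dispatch_async to the main queue inside setupEffects is not covered by the barrier, so any work hopped onto another queue can still complete after the barrier block runs.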