
Core Image CIPerspectiveTransform Filter: How to use CIVectors?


I am having a very hard time finding any documentation online that clearly explains how to implement Core Image's CIPerspectiveTransform filter properly. In particular, when setting CIVector values for inputTopLeft, inputTopRight, inputBottomRight, and inputBottomLeft, what are these vectors doing to the image? (I.e., what is the math behind how these vectors warp my image?)

Currently this is the code I am using. It doesn't crash, but it doesn't show an image:

CIImage *myCIImage = [[CIImage alloc] initWithImage:self.image];
CIContext *context = [CIContext contextWithOptions:nil];
CIFilter *filter = [CIFilter filterWithName:@"CIPerspectiveTransform"
                              keysAndValues:@"inputImage", myCIImage,
                                            @"inputTopLeft", [CIVector vectorWithX:118 Y:484],
                                            @"inputTopRight", [CIVector vectorWithX:646 Y:507],
                                            @"inputBottomRight", [CIVector vectorWithX:548 Y:140],
                                            @"inputBottomLeft", [CIVector vectorWithX:155 Y:153],
                                            nil];
CIImage *outputImage = [filter outputImage];
CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
UIImage *transformedImage = [UIImage imageWithCGImage:cgimg];
[self setImage:transformedImage];
CGImageRelease(cgimg);

Other things to note that might be important:

  • My UIImageView (75 pts × 115 pts) is already initialized via awakeFromNib and already has an image (151 px × 235 px) associated with it.

  • The above code runs in the UIImageView's - (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event method. The hope is that I can adjust the perspective of the image based on screen coordinates, so it looks like the image is moving in 3D space.

  • This code is for an iPhone app.

Again, the question I think I am asking is what the various parameter vectors do, but I may be asking the wrong question.

The following post is very similar, but it asks why the image disappears rather than how to use CIVectors with CIPerspectiveTransform. It has also received very little traction, perhaps because it is too general: How I can use CIPerspectiveTransform filter


Solution

  • As I commented on the linked question, CIPerspectiveTransform is not available in the iOS implementation of Core Image as of iOS 5.1. That is most likely why you and the other asker aren't seeing any image: [CIFilter filterWithName:] returns nil for filter names it doesn't recognize, so your CIFilter was nil and so was its output.
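    A quick way to confirm this at runtime is to check whether the filter name is actually registered before relying on it. This is a defensive sketch (the logging is illustrative):

    ```objectivec
    // +filterWithName: returns nil for filter names that aren't
    // registered on this OS version.
    CIFilter *perspective = [CIFilter filterWithName:@"CIPerspectiveTransform"];
    if (perspective == nil) {
        NSLog(@"CIPerspectiveTransform is not available on this device.");
        // List the geometry filters that *are* registered:
        NSArray *names = [CIFilter filterNamesInCategory:kCICategoryGeometryAdjustment];
        NSLog(@"Available geometry filters: %@", names);
    }
    ```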

    If you just want to implement a form of perspective on an image, there are two fast ways of doing this on iOS, as I describe in this answer. One is simply to apply the right kind of CATransform3D to the layer of a UIImageView, but this only affects how the view is displayed, not the image itself.
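    For the CATransform3D route, a minimal sketch looks like this. The rotation angle and the m34 perspective divisor of -1/500 are illustrative values, and self.imageView stands in for whatever UIImageView you are displaying:

    ```objectivec
    // Apply a perspective-looking transform to the layer of a UIImageView.
    // This changes only how the view is drawn; the underlying image is untouched.
    CATransform3D transform = CATransform3DIdentity;
    transform.m34 = -1.0 / 500.0; // perspective divisor; smaller magnitude = subtler effect
    transform = CATransform3DRotate(transform, M_PI_4, 0.0, 1.0, 0.0); // rotate about the y axis
    self.imageView.layer.transform = transform;
    ```

    To animate the effect from touchesMoved:, you could derive the rotation angle from the touch's x offset instead of the fixed M_PI_4 used here.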

    The second way is to manipulate the image with an appropriate 3D transformation matrix in OpenGL ES. As I indicate in the above-linked answer, I have an open source framework that wraps all of this, and its FilterShowcase sample has an example of applying perspective to incoming video. You can easily swap the video input for your image and grab an image back after the perspective effect is applied.