I noticed that Instagram has a square camera window, roughly 300x300. I'm trying to use GPUImage to get the same camera size, so I wrote this:
// define a square view
primaryView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, 300, 300)];
//define a still camera
stillCamera = [[GPUImageStillCamera alloc]
initWithSessionPreset:AVCaptureSessionPreset640x480
cameraPosition:AVCaptureDevicePositionFront];
//make it portrait
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
//define a filter
filter = [[GPUImageFilter alloc] init];
//force the output to 300*300
[filter forceProcessingAtSize:((GPUImageView*)self.primaryView).sizeInPixels];
//set things up
[stillCamera addTarget: filter];
[filter addTarget: primaryView];
[stillCamera startCameraCapture];
Now I do get a square view, but the image looks flattened; it's completely distorted. I think something is wrong with the aspect ratio, so I tried setting the fill mode of the primary view like this:
[primaryView setFillMode:kGPUImageFillModePreserveAspectRatioAndFill];
then this:
[primaryView setFillMode:kGPUImageFillModePreserveAspectRatio];
then this:
[primaryView setFillMode:kGPUImageFillModeStretch];
None of these work. What am I missing? Any help would be appreciated.
-forceProcessingAtSize: causes that filter to output an image of exactly the size you provide, ignoring aspect ratio. That squishes your image into a 300x300 square.
If you want to preserve the aspect ratio of your image, you'd want to use -forceProcessingAtSizeRespectingAspectRatio: instead.
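As a minimal sketch, assuming the same filter and primaryView from the question, only the forced-size call changes:

// Scale the output to fit within 300x300 while keeping its aspect ratio,
// which letterboxes rather than stretches the frame.
[filter forceProcessingAtSizeRespectingAspectRatio:((GPUImageView *)self.primaryView).sizeInPixels];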
However, that will letterbox your image. To replicate what Instagram does, you probably want to instead cut a square region out of the center of the image. For that, use a crop filter with a square aspect ratio.
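A rough sketch of that approach, assuming the 640x480 session preset from the question rotated to portrait (so frames arrive as 480x640): crop the full width and the centered three quarters of the height to get a 480x480 square, then feed that to the square view:

// The crop region is in normalized (0..1) coordinates of the portrait frame.
// Full width, 75% of the height, offset 12.5% from the top -> centered square.
GPUImageCropFilter *cropFilter = [[GPUImageCropFilter alloc]
    initWithCropRegion:CGRectMake(0.0, 0.125, 1.0, 0.75)];

[stillCamera addTarget:cropFilter];
[cropFilter addTarget:primaryView];
[stillCamera startCameraCapture];

With a square image feeding a square GPUImageView, the fill mode no longer matters, since the aspect ratios already match.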