I have added this code to my project. It works fine: it creates and shows an instance of ZBarReaderViewController from the current view.
However, I would like to define a custom region of my current view controller and show ZBarReaderViewController inside that region while still showing my "previous/other" view. The code below shows the view controller in full-screen mode.
In Interface Builder I can only add UIViews to an existing view controller, so I am unable to associate the custom view region with ZBarReaderViewController.
The only thing I can do is associate it with a ZBarReaderView instance, but as ZBarReaderViewController is closed source (I can only see the header files in the ZBar reader project I am using), I am unable to modify its behaviour.
How can I solve this?
- (IBAction)startScanning:(id)sender {
NSLog(@"Scanning..");
resultTextView.text = @"Scanning..";
ZBarReaderViewController *codeReader = [ZBarReaderViewController new];
codeReader.readerDelegate=self;
codeReader.supportedOrientationsMask = ZBarOrientationMaskAll;
ZBarImageScanner *scanner = codeReader.scanner;
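// Disable the rarely used I2/5 (Interleaved 2 of 5) symbology to improve performance, as suggested in the ZBar examples.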
[scanner setSymbology: ZBAR_I25 config: ZBAR_CFG_ENABLE to: 0];
[self presentViewController:codeReader animated:YES completion:nil];
}
So here is an example of a scanner view controller. I used a storyboard to create the view, but you can also do it programmatically or with a regular nib.
First, create your view (let's say in a storyboard) and place a UIView inside it, where you would like the scanner to be shown.
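If you prefer to skip the storyboard, you can create and add that preview view in code instead; a minimal sketch (the frame values are just placeholder numbers for whatever region you want, and it assumes the same viewPreview property used in the listing below) could go in viewDidLoad:

UIView *previewView = [[UIView alloc] initWithFrame:CGRectMake(20.0, 80.0, 280.0, 280.0)]; // placeholder region
previewView.backgroundColor = [UIColor blackColor];
[self.view addSubview:previewView];
self.viewPreview = previewView; // takes the place of the storyboard outlet connection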
Now, let's take a look at the view controller (please see the comments inside it):
#import <AVFoundation/AVFoundation.h>
#import "ScannerViewController.h"
@interface ScannerViewController () <AVCaptureMetadataOutputObjectsDelegate>
// UI
@property (weak, nonatomic) IBOutlet UIView *viewPreview; // Connect it to the view you created in the storyboard, for the scanner preview
// Video
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *videoPreviewLayer;
@property (nonatomic, strong) AVAudioPlayer *audioPlayer;
@property (nonatomic, strong) AVCaptureSession *flashLightSession;
@property (nonatomic) BOOL isReading;
@end
@implementation ScannerViewController
- (void)viewDidLoad
{
[super viewDidLoad];
// Initially make the captureSession object nil.
_captureSession = nil;
// Set the initial value of the flag to NO.
_isReading = NO;
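// (viewDidLoad is also a reasonable place to set up _audioPlayer for the beep; see the sketch after the listing.)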
}
- (void)didReceiveMemoryWarning
{
[super didReceiveMemoryWarning];
}
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[self startStopReading:nil];
}
- (IBAction)startStopReading:(id)sender
{
if (!_isReading) {
[self startReading];
}
else {
// In this case the app is currently reading a QR code and it should stop doing so.
[self stopReading];
}
// Give the flag the exact opposite of its current value.
_isReading = !_isReading;
}
#pragma mark - Private
- (BOOL)startReading
{
NSError *error;
// Get an instance of the AVCaptureDevice class to initialize a device object, using video as the media type.
AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
// Get an instance of the AVCaptureDeviceInput class using the previous device object.
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
if (!input) {
// If any error occurs, simply log its description and do not continue.
NSLog(@"%@", [error localizedDescription]);
return NO;
}
// Initialize the captureSession object.
_captureSession = [[AVCaptureSession alloc] init];
// Set the input device on the capture session.
[_captureSession addInput:input];
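// (In a real app you may want to check [_captureSession canAddInput:input] before adding it.)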
// Initialize an AVCaptureMetadataOutput object and set it as the output device of the capture session.
AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
[_captureSession addOutput:captureMetadataOutput];
// Create a new serial dispatch queue.
dispatch_queue_t dispatchQueue;
dispatchQueue = dispatch_queue_create("myQueue", NULL);
[captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
[captureMetadataOutput setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]]; // Add all the types you need, currently it is just QR code
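// Other available types include AVMetadataObjectTypeEAN13Code, AVMetadataObjectTypeCode128Code and AVMetadataObjectTypePDF417Code.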
// Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
_videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
[_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
[_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
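// Note: if viewPreview can change size later (rotation, Auto Layout), update this frame again, e.g. in viewDidLayoutSubviews.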
[_viewPreview.layer addSublayer:_videoPreviewLayer];
// Start video capture.
[_captureSession startRunning];
return YES;
}
- (void)stopReading
{
// Stop video capture and make the capture session object nil.
[_captureSession stopRunning];
_captureSession = nil;
// Remove the video preview layer from the viewPreview view's layer.
[_videoPreviewLayer removeFromSuperlayer];
}
#pragma mark - AVCaptureMetadataOutputObjectsDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
// Check that the metadataObjects array is not nil and contains at least one object.
if (metadataObjects != nil && [metadataObjects count] > 0) {
[self performSelectorOnMainThread:@selector(stopReading) withObject:nil waitUntilDone:NO];
_isReading = NO;
// If the audio player is not nil, then play the sound effect.
if (_audioPlayer) {
[_audioPlayer play];
}
// This was my result, but you can search the metadataObjects array for whatever you need exactly.
NSString *code = [(AVMetadataMachineReadableCodeObject *)[metadataObjects objectAtIndex:0] stringValue];
NSLog(@"Scanned code: %@", code);
}
}
@end
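One detail the listing leaves out is that _audioPlayer is played in the delegate callback but never created anywhere, so the beep would silently be skipped. A minimal sketch of how you might initialize it (for example in viewDidLoad), assuming a hypothetical beep.mp3 bundled with the app, would be:

NSString *soundPath = [[NSBundle mainBundle] pathForResource:@"beep" ofType:@"mp3"]; // hypothetical bundled sound file
if (soundPath) {
    NSURL *soundURL = [NSURL fileURLWithPath:soundPath];
    NSError *audioError = nil;
    _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:&audioError];
    if (!audioError) {
        [_audioPlayer prepareToPlay];
    } else {
        NSLog(@"Could not load the beep sound: %@", [audioError localizedDescription]);
    }
}

Also remember that the delegate callback runs on the dispatch queue created in startReading, so any UI update with the scanned string (setting a label, for instance) should be dispatched back to the main queue.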