Tags: ios, xamarin.ios, avfoundation, qr-code, ios-camera

iOS: Camera does not recognize QR Code on first load of view controller


I have a small app that reads QR codes for login and, as an alternative, lets the user type the code by hand and log in. The app starts and goes directly to the login view. When I try to scan a QR code there, it does not work: the delegate is never called / the event is never raised.

I adapted the approach from Larry O'Brien: http://www.knowing.net/index.php/2013/10/09/natively-recognize-barcodesqr-codes-in-ios-7-with-xamarin-ios/

I created my own ScannerView class for this purpose:

using System;
using AVFoundation;
using CoreFoundation;
using UIKit;

public sealed partial class ScannerView : UIView
{
    private readonly AVCaptureVideoPreviewLayer _layer;
    public AVCaptureSession Session { get; }
    private readonly AVCaptureMetadataOutput _metadataOutput;

    public event EventHandler<AVMetadataMachineReadableCodeObject> MetadataFound = delegate { };
    public ScannerView (IntPtr handle) : base (handle)
    {
        Session = new AVCaptureSession();
        var camera = AVCaptureDevice.DefaultDeviceWithMediaType(AVMediaType.Video);
        var input = AVCaptureDeviceInput.FromDevice(camera);
        Session.AddInput(input);

        //Add the metadata output channel
        _metadataOutput = new AVCaptureMetadataOutput {RectOfInterest = Bounds};
        var metadataDelegate = new MetadataOutputDelegate();
        var dispatchQueue = new DispatchQueue("scannerQueue");
        _metadataOutput.SetDelegate(metadataDelegate, dispatchQueue);
        Session.AddOutput(_metadataOutput);

        _layer = new AVCaptureVideoPreviewLayer(Session)
        {
            MasksToBounds = true,
            VideoGravity = AVLayerVideoGravity.ResizeAspectFill,
            Frame = Bounds
        };

        Layer.AddSublayer(_layer);

        // Hand event over to subscriber
        metadataDelegate.MetadataFound += (s, e) => MetadataFound(s, e);
    }

    public override void LayoutSubviews()
    {
        base.LayoutSubviews();
        _layer.Frame = Bounds;
        _metadataOutput.RectOfInterest = Bounds;
    }

    public void SetMetadataType(AVMetadataObjectType type)
    {
        //Confusing! *After* adding to session, tell output what to recognize...
        _metadataOutput.MetadataObjectTypes = type;
    }
}
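
For reference, the MetadataOutputDelegate used above is not shown here; it comes from the linked article. A minimal sketch of what such a delegate could look like (the names and details below are assumptions, the actual implementation may differ):

    public class MetadataOutputDelegate : AVCaptureMetadataOutputObjectsDelegate
    {
        // Raised for every machine-readable code reported by the capture output.
        public event EventHandler<AVMetadataMachineReadableCodeObject> MetadataFound = delegate { };

        public override void DidOutputMetadataObjects(AVCaptureMetadataOutput captureOutput,
            AVMetadataObject[] metadataObjects, AVCaptureConnection connection)
        {
            foreach (var metadata in metadataObjects)
            {
                // Forward only machine-readable codes (QR, EAN13, ...) to subscribers.
                var readable = metadata as AVMetadataMachineReadableCodeObject;
                if (readable != null)
                    MetadataFound(this, readable);
            }
        }
    }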

And in my LoginView I do the following:

    public override void ViewWillAppear(bool animated)
    {
        base.ViewWillAppear(animated);
        // Manipulate navigation stack
        NavigationController.SetViewControllers(
            NavigationController.ViewControllers.Where(
                viewController => viewController is LoginView).ToArray(), false);

        ScannerView.MetadataFound += (s, e) =>
        {
            Console.WriteLine($"Found: [{e.Type.ToString()}] {e.StringValue}");
            LoginViewModel.BarCode = e.StringValue;
            if (LoginViewModel.DoneCommand.CanExecute())
            {
                ScannerView.Session.StopRunning();
                LoginViewModel.DoneCommand.Execute();
            }
        };
    }

    public override void ViewDidAppear(bool animated)
    {
        base.ViewDidAppear(animated);
        ScannerView.Session.StartRunning();
        ScannerView.SetMetadataType(AVMetadataObjectType.QRCode | AVMetadataObjectType.EAN13Code);
    }

The funny thing is that this works once I have logged in with the manual input and logged out again, so that I'm back on the same screen (possibly not the same one but a new instance of it, as the GC may destroy the view once it is removed from the navigation stack?).

I have put the ScannerView as a subview of the LoginView in the storyboard. For navigation I use MVVMCross (just for info).

So: what am I doing wrong? What do I need to do to make it work on the first load? (I got it to work once with the same code... maybe it is a timing issue?)


Solution

  • Obviously this is a timing issue. I solved it by adding a "Tap to scan" paradigm. When tapping I execute the following code:

        public override void TouchesBegan(NSSet touches, UIEvent evt)
        {
            base.TouchesBegan(touches, evt);
            Console.WriteLine($"Current types to scan: {this.MetadataOutput.MetadataObjectTypes}");
            this.SetMetadataType(this.MetadataObjectType);
            Console.WriteLine($"New types to scan: {this.MetadataOutput.MetadataObjectTypes}");
        }

        public void SetMetadataType(AVMetadataObjectType type)
        {
            //Confusing! *After* adding to session, tell output what to recognize...
            this.Session.BeginConfiguration();
            this.MetadataOutput.MetadataObjectTypes = type;
            this.Session.CommitConfiguration();
        }

    Where MetadataObjectType is set beforehand to the codes we are looking for. That solves the problem - scanning now works every time. I think the magic part is the Begin-/CommitConfiguration pair, as this also works if I do not use the tap-to-scan paradigm (see the sketch below).
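
    Based on that observation, a minimal sketch of the first-load fix without the tap-to-scan gesture (assuming the revised SetMetadataType above is a member of ScannerView) would keep the original ViewDidAppear unchanged:

        public override void ViewDidAppear(bool animated)
        {
            base.ViewDidAppear(animated);
            ScannerView.Session.StartRunning();
            // The Begin-/CommitConfiguration pair inside SetMetadataType is what makes
            // the metadata types take effect reliably on the first appearance.
            ScannerView.SetMetadataType(AVMetadataObjectType.QRCode | AVMetadataObjectType.EAN13Code);
        }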