ios, avfoundation, barcode-scanner, swift4

AVFoundation, DataMatrix, Swift 4


Ugh. My struggle with changes in Swift 4 AVFoundation continues.

I have a DataMatrix ("QR"-style) code that I'm trying to read.

It reads just fine when compiled with Swift 3, but with the Swift 4 changes to the code it is not picked up.

Also note that Apple's provided example, which is supposed to work with Swift 4, does not read the DataMatrix code either.

When I print out the available types, DataMatrix is among them:

print("types available:\n \(metadataOutput.availableMetadataObjectTypes)")

yields:

types available:
[__ObjC.AVMetadataObject.ObjectType(_rawValue: face), ... 
__ObjC.AVMetadataObject.ObjectType(_rawValue: org.iso.DataMatrix), ...

However, didOutput metadataObjects: is never called when I run the code. It DOES get called for every other type, however.

Additionally, adding the type explicitly:

metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.dataMatrix]

does nothing different.

Anyone have experience with scanning DataMatrix in Swift 4?

Code:

import UIKit
import AVFoundation

class ViewController: UIViewController,AVCaptureMetadataOutputObjectsDelegate {

var videoCaptureDevice: AVCaptureDevice = AVCaptureDevice.default(for: AVMediaType.video)!
var output = AVCaptureMetadataOutput()
var previewLayer: AVCaptureVideoPreviewLayer?

var captureSession = AVCaptureSession()
var code: String?

var scannedCode = UILabel()

override func viewDidLoad() {
    super.viewDidLoad()

    self.setupCamera()
    self.addLabelForDisplayingCode()
}

private func setupCamera() {

    // Avoid force-unwrapping the input; bail out if the device input cannot be created or added.
    guard let input = try? AVCaptureDeviceInput(device: videoCaptureDevice),
        self.captureSession.canAddInput(input) else {
        print("Could not add video input")
        return
    }
    self.captureSession.addInput(input)

    self.previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

    if let videoPreviewLayer = self.previewLayer {
        videoPreviewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
        videoPreviewLayer.frame = self.view.bounds
        view.layer.addSublayer(videoPreviewLayer)
    }

    let metadataOutput = AVCaptureMetadataOutput()
    if self.captureSession.canAddOutput(metadataOutput) {
        self.captureSession.addOutput(metadataOutput)

        metadataOutput.setMetadataObjectsDelegate(self, queue: DispatchQueue.main)

        print("types available:\n \(metadataOutput.availableMetadataObjectTypes)")

        metadataOutput.metadataObjectTypes = metadataOutput.availableMetadataObjectTypes
//            metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.dataMatrix]
    } else {
        print("Could not add metadata output")
    }
}

private func addLabelForDisplayingCode() {
    view.addSubview(scannedCode)
    scannedCode.translatesAutoresizingMaskIntoConstraints = false
    scannedCode.bottomAnchor.constraint(equalTo: view.bottomAnchor, constant: -20.0).isActive = true
    scannedCode.leadingAnchor.constraint(equalTo: view.leadingAnchor, constant: 20.0).isActive = true
    scannedCode.trailingAnchor.constraint(equalTo: view.trailingAnchor, constant: -20.0).isActive = true
    scannedCode.heightAnchor.constraint(equalToConstant: 50).isActive = true
    scannedCode.font = UIFont.preferredFont(forTextStyle: .title2)
    scannedCode.backgroundColor = UIColor.black.withAlphaComponent(0.5)
    scannedCode.textAlignment = .center
    scannedCode.textColor = UIColor.white
    scannedCode.text = "Scanning...."
}

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)

    if (captureSession.isRunning == false) {
        captureSession.startRunning();
    }
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)

    if (captureSession.isRunning == true) {
        captureSession.stopRunning();
    }
}

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) {
    print(metadataObjects)
    for metadata in metadataObjects {
        // Use a conditional cast instead of force-casting; other metadata types (e.g. faces) would crash an as! cast.
        guard let readableObject = metadata as? AVMetadataMachineReadableCodeObject else { continue }
        scannedCode.text = readableObject.stringValue
    }
}
}

Thanks so much.


Solution

  • You have to make sure the connection is not mirrored. A DataMatrix code must be read in its original, non-mirrored orientation.

    https://developer.apple.com/documentation/avfoundation/avcaptureconnection/1389172-isvideomirrored
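
    A minimal sketch of what that could look like in the question's `setupCamera()`, after the preview layer is created (assuming the mirroring comes from the preview layer's connection; `isVideoMirrored` can only be set once automatic adjustment is turned off):

    if let connection = self.previewLayer?.connection,
        connection.isVideoMirroringSupported {
        // Must disable automatic adjustment before setting isVideoMirrored,
        // otherwise AVFoundation raises an exception.
        connection.automaticallyAdjustsVideoMirroring = false
        connection.isVideoMirrored = false
    }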