
Face Recognition iPhone X Swift


I've done some experiments with ARFaceAnchor to recognize expressions such as blinking eyes and so on. I'm sure the FaceAnchor is set correctly, because I can see its coordinates in the debugger, but it doesn't seem to recognize any of the expressions I set...

Attached you will find the ViewController; the Expressions are in a separate class.

Any ideas? Thank you!

//  ViewController.swift
//

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var sceneView: ARSCNView!

    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()

        self.sceneView.scene = SCNScene()
        self.sceneView.rendersContinuously = true

        // Configure our ARKit tracking session for facial recognition
        let config = ARFaceTrackingConfiguration()
        config.worldAlignment = .gravity
        session.delegate = self
        session.run(config, options: [])
    }

    // AR Session

    var currentFaceAnchor: ARFaceAnchor?
    var currentFrame: ARFrame?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        self.currentFrame = frame
        DispatchQueue.main.async {

        }
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let faceAnchor = anchors.first as? ARFaceAnchor else { return }
        self.currentFaceAnchor = faceAnchor
        print("Face", faceAnchor)

    }

    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {

    }

    var expressionsToUse: [Expression] = [SmileExpression(), EyebrowsRaisedExpression(), EyeBlinkLeftExpression(), EyeBlinkRightExpression(), JawOpenExpression(), LookLeftExpression(), LookRightExpression()] //All the expressions
    var currentExpression: Expression? = nil {
        didSet {
            if currentExpression != nil {
                self.currentExpressionShownAt = Date()
            } else {
                self.currentExpressionShownAt = nil
            }
        }
    }
    var currentExpressionShownAt: Date? = nil



}

Solution

  • The reason no Expressions are being detected is that you aren't actually doing anything with them, apart from adding them to the expressionsToUse array.

    Each Expression has three functions, which you aren't currently using:

     func name() -> String {}
     func isExpressing(from: ARFaceAnchor) -> Bool {}
     func isDoingWrongExpression(from: ARFaceAnchor) -> Bool {}
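
    For reference, here is a minimal sketch of what one such Expression might look like, assuming Expression is a protocol and each implementation reads ARKit's blendShapes coefficients (the SmileExpression body, the blend-shape keys used, and the 0.5 threshold are illustrative assumptions, not the asker's actual code):

     // Hypothetical protocol matching the three functions above.
     protocol Expression {
         func name() -> String
         func isExpressing(from: ARFaceAnchor) -> Bool
         func isDoingWrongExpression(from: ARFaceAnchor) -> Bool
     }

     // Illustrative only: blend-shape coefficients range from 0.0 (neutral)
     // to 1.0 (fully expressed).
     class SmileExpression: Expression {
         func name() -> String { return "Smile" }

         func isExpressing(from anchor: ARFaceAnchor) -> Bool {
             guard let left = anchor.blendShapes[.mouthSmileLeft],
                   let right = anchor.blendShapes[.mouthSmileRight] else { return false }
             // 0.5 is an assumed threshold; tune it for your use case.
             return left.floatValue > 0.5 && right.floatValue > 0.5
         }

         func isDoingWrongExpression(from anchor: ARFaceAnchor) -> Bool {
             // Treat an open jaw as the "wrong" expression for a smile (assumption).
             guard let jawOpen = anchor.blendShapes[.jawOpen] else { return false }
             return jawOpen.floatValue > 0.5
         }
     }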
    

    Since you want to detect the Expressions, you need to hook these functions into the following delegate method:

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) { }
    

    As such, something like this should point you in the right direction:

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    
        //1. Check To See We Have A Valid ARFaceAnchor
        guard let faceAnchor = anchors.first as? ARFaceAnchor else { return }
    
        self.currentFaceAnchor = faceAnchor
    
        //2. Loop Through Each Of The Expressions & Determine Which One Is Being Used
        expressionsToUse.forEach { (possibleExpression) in

            //a. If The User Is Doing A Particular Expression Then Assign It To The currentExpression Variable
            if possibleExpression.isExpressing(from: faceAnchor) {

                currentExpression = possibleExpression

                print("""
                      Current Detected Expression = \(possibleExpression.name())
                      It Was Detected On \(currentExpressionShownAt!)
                      """)

            } else if possibleExpression.isDoingWrongExpression(from: faceAnchor) {

                print("Incorrect Detected Expression = \(possibleExpression.name())")
            }
        }
    
    }
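
    One caveat: ARSession delegate methods are not guaranteed to be called on the main thread, so any UI updates driven by a detected expression should be dispatched explicitly. A sketch of the detection branch (expressionLabel is a hypothetical UILabel outlet, not from the original code):

     if possibleExpression.isExpressing(from: faceAnchor) {
         currentExpression = possibleExpression
         DispatchQueue.main.async {
             // Hop to the main thread before touching UIKit.
             self.expressionLabel.text = possibleExpression.name()
         }
     }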
    

    Hope it helps...