Delegate not working

Hello everybody, I am new to DeepAR. I wrote this simple code, using the MVVM design pattern, that tries to print the values of some face landmarks. Could anybody help me understand why `faceTracked` is never called (its print never appears in the terminal), even though the `model.deepAR.delegate = self` line runs (I do see that print) and my face is clearly visible when I run the app?

import SwiftUI
import DeepAR

struct ContentView: View {
    // @StateObject keeps the view model alive across view updates;
    // an @ObservedObject created inline can be recreated by SwiftUI
    @StateObject var arViewModel = DeepARViewModel()
    
    var body: some View {
        DeepARViewContainer(arViewModel: arViewModel)
            .edgesIgnoringSafeArea(.all)
    }
}

struct DeepARViewContainer: UIViewRepresentable {
    var arViewModel: DeepARViewModel

    func makeUIView(context: Context) -> UIView {
        arViewModel.startSessionDelegate()
        return arViewModel.arView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Update the view if needed
    }
}


struct DeepARModel {
    var deepAR: DeepAR!
    private var cameraController: CameraController!
    var arView: UIView!
    var leftMouthLandmark: Float = 0
    var rightMouthLandmark: Float = 0

    init() {
        // Initialize and configure the DeepAR engine, camera, and AR view
        deepAR = DeepAR()
        deepAR.setLicenseKey("my_key")
        cameraController = CameraController()
        cameraController.deepAR = deepAR
        arView = deepAR.createARView(withFrame: UIScreen.main.bounds)
        cameraController.startCamera()
    }

    mutating func update(multiFaceData: MultiFaceData) {
        self.leftMouthLandmark = Float(multiFaceData.faceData.1.landmarks.55)
        self.rightMouthLandmark = Float(multiFaceData.faceData.1.landmarks.49)
        print(self.leftMouthLandmark)
        print(self.rightMouthLandmark)
    }
}


class DeepARViewModel: UIViewController, ObservableObject {
    @Published private var model: DeepARModel = DeepARModel()

    var arView: UIView {
        model.arView
    }

    func startSessionDelegate() {
        print("Session delegate on")
        model.deepAR.delegate = self
    }
}
    

extension DeepARViewModel: DeepARDelegate {
    func faceTracked(_ faceData: MultiFaceData) {
        print("In the face Tracked")
        model.update(multiFaceData: faceData)
    }
}

Update: I still have the same issue, but the reason seems to be that face tracking is not active unless something (like a face filter) triggers it. So I enabled face tracking as soon as the DeepAR engine is initialized, and I did that before creating any ARView, as suggested in the documentation. This is the updated init() of DeepARModel.

    init() {
        // Initialize the DeepAR engine and enable face tracking
        // before creating the AR view
        deepAR = DeepAR()
        deepAR.setLicenseKey("my_key")
        let faceTrackingInitParameters = FaceTrackingInitParameters(initializeEngineWithFaceTracking: true, initializeFaceTrackingSynchronously: true)
        deepAR.setFaceTrackingInitParameters(faceTrackingInitParameters)

        cameraController = CameraController()
        cameraController.deepAR = deepAR
        arView = deepAR.createARView(withFrame: UIScreen.main.bounds)
        cameraController.startCamera()
        print("Is face visible: ", deepAR.faceVisible)
    }

This doesn’t work: it prints “Is face visible: false”, even though my face is clearly visible when I try the app on my phone.

emptyeffect.deepar.zip (656 Bytes)
Hi,

Yes, face tracking isn’t run on a frame if no active effect uses it. The workaround is simple: just call switch effect with the effect I’ve attached. It makes no visual change; it simply forces face tracking to be active.
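For reference, loading the attached effect might look roughly like this sketch, assuming emptyeffect.deepar has been added to the app bundle (the slot name "empty" here is arbitrary, not something the SDK requires):

```swift
// Sketch: load the attached empty effect to keep face tracking active.
// Assumes emptyeffect.deepar is bundled with the app; the slot name
// "empty" is an arbitrary label for this effect slot.
if let path = Bundle.main.path(forResource: "emptyeffect", ofType: "deepar") {
    deepAR.switchEffect(withSlot: "empty", path: path)
}
```

A natural place for this call is right after the AR view is created in init(), so tracking is forced on from the first frame.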

Thank you very much Jelena, it worked. A quick follow-up question: I am trying to detect a smile before the filter is applied. I tried to do that using landmarks, but even though I managed to get the X, Y coordinates of the mouth corners, they vary a lot depending on things like distance from the camera, so it’s really hard to hard-code the conditions for a smile based on those values. As an alternative, I could use an empty emotion filter to activate emotion detection and apply a threshold on ‘happiness’. Is it possible to create an empty emotion filter? Do you think it is the best solution? Thanks!
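As an aside on the distance problem: raw landmark coordinates can be made roughly scale-invariant by dividing one facial distance by another. A minimal sketch of that idea (the Point type, the landmark choice, and the 0.7 threshold are illustrative assumptions, not DeepAR API):

```swift
import Foundation

// Simple 2D point so the sketch doesn't depend on any framework type;
// DeepAR landmark values would be converted into these.
struct Point { var x: Double; var y: Double }

// Scale-invariant smile heuristic: mouth width divided by inter-eye
// distance. Both lengths grow and shrink together as the face moves
// toward or away from the camera, so the ratio stays roughly constant.
func smileRatio(leftMouth: Point, rightMouth: Point,
                leftEye: Point, rightEye: Point) -> Double {
    let mouthWidth = hypot(rightMouth.x - leftMouth.x, rightMouth.y - leftMouth.y)
    let eyeDistance = hypot(rightEye.x - leftEye.x, rightEye.y - leftEye.y)
    guard eyeDistance > 0 else { return 0 }
    return mouthWidth / eyeDistance
}

// The 0.7 threshold is a guess and would need tuning on real faces.
func isSmiling(ratio: Double, threshold: Double = 0.7) -> Bool {
    ratio > threshold
}
```

A fixed threshold on this ratio is more workable than thresholds on raw coordinates, though the emotion-detection route is likely more robust across faces.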

Yes, loading an invisible emotion effect would be the solution. Which effect do you use for emotions?

Disabling the mesh renderers should be enough

Thanks again Jelena! I am currently trying to build one. Do you happen to have any emotion filter off-the-shelf to provide?

InvisibleEmotions.deepar.zip (1.9 KB)

I made this one; it should work.

Legend! Thank you so much!
