“Process frame called but DeepAR not initialized!” error keeps coming up

RootEncoder-iOS-master 5.zip (296.1 KB)

This is my Swift project file.

Can you check it? When I press the startStream button, the “Process frame called but DeepAR not initialized!” error shows up.

import Foundation
import DeepAR

class DeepARShared {
    static let shared = DeepARShared()

    #if targetEnvironment(simulator)
    var deepAR: DeepAR?
    #else
    var deepAR: DeepAR
    #endif
    var initialized: Bool = false
  
    var processingFrame: Bool = false

    private var taskQueue: DispatchQueue
    private var task: DispatchWorkItem?

    private init() {
        #if targetEnvironment(simulator)
        #else
        self.deepAR = DeepAR()
        #endif
        self.taskQueue = DispatchQueue(label: "com.example.taskQueue")
    }

    func initialize() {
      #if targetEnvironment(simulator)
      #else
        self.deepAR = DeepAR()
        self.deepAR.setLicenseKey("myLicenseKey")
        self.deepAR.changeLiveMode(false)
        self.deepAR.enableAudioProcessing(false)
      #endif
    }

    func startDeepAR() {
      print("KaiCameraView DeepARShared startDeepAR call")
      #if targetEnvironment(simulator)
      #else
        // Configure the off-screen rendering surface and load the initial effect.
        self.deepAR.initializeOffscreen(withWidth: 720, height: 1280)
        self.deepAR.switchEffect(withSlot: "mask", path: "Split_View_Look.deepar")
      #endif
    }

    func shutdown() {
      print("KaiCameraView DeepARShared shutdown call")
      #if targetEnvironment(simulator)
      #else
        self.deepAR.switchEffect(withSlot: "mask", path: "Pixel_Hearts.deepar")
      #endif
    }
}
extension CameraManager: DeepARDelegate {
    public func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
        
        print("Sample image Foramt frameAvailable")
        thread.async {
            //TODO render OpenGlView/MetalView using CMSampleBuffer to draw preview and filters
            if (!self.onPreview) {
                self.callback.getYUVData(from: sampleBuffer)
            }
        }
    }
    
    public func didInitialize() {
        print("Sample image didInitialize")
    }
    
    public func onError(withCode code: ARErrorType, error: String!) {
        print("Sample image error \(error) \(code.rawValue)")
    }
    
}

I simply set up the configuration using self.deepAR.initializeOffscreen(withWidth: 720, height: 1280) and planned to process the sampleBuffer coming from the default camera through DeepARShared.shared.deepAR.enqueueCameraFrame(sampleBuffer, mirror: true), or alternatively to extract the image buffer myself and call processFrame, logging a failure when the sample video buffer cannot be obtained:

guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
    print("vDeepAR Fail to Get Sample Video Buffer!")
    return
}
DeepARShared.shared.deepAR.processFrame(imageBuffer, mirror: true)

Then, the intention was to receive and render the buffer at:

public func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
    
    print("Sample image Format frameAvailable")
    thread.async {
        //TODO render OpenGlView/MetalView using CMSampleBuffer to draw preview and filters
        if (!self.onPreview) {
            self.callback.getYUVData(from: sampleBuffer)
        }
    }
}

Following the documentation, I also attempted using startCapture, but I am still facing the same error. What is going wrong?
“Process frame called but DeepAR not initialized!”

The important thing is that with SDK version 3.x everything works fine when I run it. Has the way the SDK is used changed with the newer version?

Hi, we have been working on some possible race conditions around initialisation since then, so that may be the reason for the change. It looks like you aren't waiting for the didInitialize callback (the one where the “Sample image didInitialize” log is printed) before you start sending frames.

You can add a variable that blocks frames from being sent while it is false.

That should resolve your issue. Let us know if it persists.

Can you show me an example?
I'm sorry, I understand what you said, but I don't know exactly what I have to do.

you have this line

var initialized: Bool = false

you should set it to true in

public func didInitialize()

and then add a check (if) that it is true before you call

DeepARShared.shared.deepAR.processFrame(imageBuffer, mirror: true)
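
A minimal sketch of what that could look like, assuming the frames are forwarded from the same place as the processFrame snippet above (forwardFrame below is just a placeholder name for whatever method in your project currently sends camera frames to DeepAR):

public func didInitialize() {
    print("Sample image didInitialize")
    // DeepAR is now ready, so frames may be sent from this point on.
    DeepARShared.shared.initialized = true
}

// Placeholder: call this from wherever you currently forward camera frames.
func forwardFrame(_ sampleBuffer: CMSampleBuffer) {
    // Drop frames until didInitialize has fired; sending them earlier is what
    // triggers "Process frame called but DeepAR not initialized!".
    guard DeepARShared.shared.initialized else { return }

    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
        print("vDeepAR Fail to Get Sample Video Buffer!")
        return
    }
    DeepARShared.shared.deepAR.processFrame(imageBuffer, mirror: true)
}

You may also want to set initialized back to false when you shut DeepAR down, so no frames are sent while it re-initialises.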