iOS: Computer vision only mode and processFrame

Hello, I have a problem when I launch my iOS program.
I’m new to Swift programming, so sorry if my question seems simple, but I can’t find anything that solves this problem.

When I try to execute the processFrame function after initializing DeepAR, I get this error:
Thread 1: EXC_BAD_ACCESS (code=1, address=0x17d)

To initialize DeepAR, I first create a new DeepAR instance, then set my license key, assign the delegate, and call initialize().

The frame that I pass to processFrame exists (it is not nil), DeepAR is in vision-only mode (isVisionOnly() returns true), and visionInitialized also returns true.

The error is not triggered when I use DeepAR in offscreen mode. I guess I’m missing something?

    @objc public func initDeepAR(imageWidth: Int, imageHeight: Int) {
        // deepARInstance is a DeepAR instance created beforehand,
        // e.g. let deepARInstance = DeepAR()
        deepARInstance.setLicenseKey("***")
        deepARInstance.delegate = deeparDelegate
        deepARInstance.initialize()
    }

    @objc public func processFrame(pointer: UnsafeMutableRawPointer, cameraFrontal: Bool) {
        // createPixelBufferFromTexturePointer is my own helper that wraps
        // the texture pointer in a CVPixelBuffer.
        let buffer = createPixelBufferFromTexturePointer(texturePointer: pointer)

        print("is vision only:", deepARInstance.isVisionOnly(), "|", deepARInstance.visionInitialized)
        print("buffer exists:", buffer != nil)

        deepARInstance.processFrame(buffer, mirror: true)
    }

Thanks

Hi.

What is the use case? Do you have a view that should display DeepAR, or is it in fact offscreen?

If it’s offscreen, then offscreen rendering is what should be set up; if you have an AR view, you should create and set it:

    // Create a full-screen ARView and insert it into the view hierarchy.
    CGRect arViewRect = CGRectMake(0, 0, [UIScreen mainScreen].bounds.size.width, [UIScreen mainScreen].bounds.size.height);
    self.arview = (ARView*)[self.deepAR createARViewWithFrame:arViewRect];
    [self.view insertSubview:self.arview atIndex:0];
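
If the rendering really should stay offscreen, the offscreen path has to be initialized instead of a view. Here is a minimal Swift sketch of that path, assuming the SDK’s initializeOffscreen(withWidth:height:) and startCapture(withOutputWidth:outputHeight:subframe:) methods; the 1280x720 resolution is just a placeholder, not a value from your post:

    import CoreMedia
    import UIKit
    import DeepAR

    final class OffscreenDeepAR: NSObject {
        let deepAR = DeepAR()

        func setUp() {
            deepAR.setLicenseKey("***")
            deepAR.delegate = self

            // Render offscreen instead of into an ARView. Once startCapture
            // is called, processed frames arrive through the
            // frameAvailable(_:) delegate callback below.
            deepAR.initializeOffscreen(withWidth: 1280, height: 720)
            deepAR.startCapture(withOutputWidth: 1280, outputHeight: 720,
                                subframe: CGRect(x: 0, y: 0, width: 1, height: 1))
        }
    }

    extension OffscreenDeepAR: DeepARDelegate {
        // Receives each processed frame while capture is running.
        func frameAvailable(_ sampleBuffer: CMSampleBuffer!) { }
    }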

Hi,

I’m trying to get the FaceData struct from my camera stream. The camera stream images are sent through the processFrame function, and I guess I can get the FaceData struct in the faceTracked delegate callback.

You can have an ARView instead of the camera view to process in real time, and yes, you can get face data in that delegate. One note: to save resources, this callback is not called when there is no active effect that tracks the face. I’ve attached an effect that makes no visual changes but forces face tracking to stay on (see the sketch after the attachment).

emptyeffect.deepar.zip (656 Bytes)
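
Here is a hedged Swift sketch of the delegate side. Note the caveats: the exact faceTracked signature differs between SDK versions (older releases pass a single FaceData, newer ones a MultiFaceData), and MyViewController / deepARInstance are placeholder names, not names from your project:

    import DeepAR

    extension MyViewController: DeepARDelegate {
        func didInitialize() {
            // Load the attached effect so face tracking stays active; without
            // an active face-tracking effect, faceTracked(_:) is never called.
            if let path = Bundle.main.path(forResource: "emptyeffect", ofType: "deepar") {
                deepARInstance.switchEffect(withSlot: "effect", path: path)
            }
        }

        // Called once per processed frame while face tracking is active.
        // Depending on the SDK version this parameter may be FaceData instead.
        func faceTracked(_ faceData: MultiFaceData) {
            // Tracking results (landmarks, pose, visibility, ...) are read here.
            print("faceTracked fired")
        }
    }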