DeepARDelegate frameAvailable: The sampleBuffer is transparent

I want to use a DeepAR effect with a custom camera, and I'm also using an ARView to display the camera feed on screen. Everything works well there.
The problem appears when I try to get the sampleBuffer (with the effect applied) from DeepARDelegate's frameAvailable(_ sampleBuffer: CMSampleBuffer!) so I can publish the stream over RTMP. Beforehand, I call deepAR.startCapture.
When I debugged it, I found that the sampleBuffer is fully transparent.

In CustomCameraController:

    func captureOutput(_ captureOutput: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        if let deepAR = _deepAR {
            // Feed video frames to DeepAR; anything else is treated as audio.
            if connection == _videoConnection {
                deepAR.enqueueCameraFrame(sampleBuffer, mirror: _mirrorCamera)
            } else {
                deepAR.enqueueAudioSample(sampleBuffer)
            }
        } else if let arview = _arview {
            if connection == _videoConnection {
                arview.enqueueCameraFrame(sampleBuffer, mirror: _mirrorCamera)
            } else {
                arview.enqueueAudioSample(sampleBuffer)
            }
        }
    }
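
For reference, _videoConnection isn't shown above; a minimal sketch of how it would typically be obtained while configuring the capture session (standard AVFoundation with hypothetical names; session is assumed to be the existing AVCaptureSession and self the sample buffer delegate):

    import AVFoundation

    // Hypothetical setup: grab the video connection when adding the data output.
    let videoOutput = AVCaptureVideoDataOutput()
    videoOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.video.queue"))
    if session.canAddOutput(videoOutput) {
        session.addOutput(videoOutput)
    }
    _videoConnection = videoOutput.connection(with: .video)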

In DeepARDelegate:

    func didInitialize() {
        print("\ndidInitialize")
        self.deepar.startCapture(withOutputWidthAndFormat: 640,
                                 outputHeight: 480,
                                 subframe: CGRect(x: 0, y: 0, width: 640, height: 480),
                                 outputImageFormat: OutputFormat(rawValue: kCVPixelFormatType_32BGRA))
        setupStream()
    }
    func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
        guard streaming else { return }

        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            let width = CVPixelBufferGetWidth(pixelBuffer)
            let height = CVPixelBufferGetHeight(pixelBuffer)
            print("Width: \(width) pixels, Height: \(height) pixels")
        }

        let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
        let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer)
        let mediaType = CMFormatDescriptionGetMediaType(formatDescription!)
        let format = CVPixelBufferGetPixelFormatType(imageBuffer!)

        // Debug check: every frame coming out of DeepAR is fully transparent.
        let transparent = isFullTransparent(sampleBuffer)
        print("Is fully transparent: \(transparent)")

        if format == kCVPixelFormatType_32BGRA {
            print("Sample buffer includes an alpha channel")
        } else {
            print("Sample buffer does not include an alpha channel")
        }

        rtmpStream.appendSampleBuffer(sampleBuffer, withType: mediaType == kCMMediaType_Video ? .video : .audio)
        DispatchQueue.main.async {
            self.setPreviewImage(sampleBuffer)
        }
    }
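
For reference, isFullTransparent isn't a DeepAR API; it's a debugging helper along these lines (a minimal sketch that checks the alpha byte of every 32BGRA pixel):

    import CoreMedia
    import CoreVideo

    // Minimal sketch: returns true when every pixel of a 32BGRA buffer has alpha == 0.
    func isFullTransparent(_ sampleBuffer: CMSampleBuffer) -> Bool {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return false }
        CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

        guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return false }
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let width = CVPixelBufferGetWidth(pixelBuffer)
        let height = CVPixelBufferGetHeight(pixelBuffer)

        for row in 0..<height {
            let rowBytes = base.advanced(by: row * bytesPerRow).assumingMemoryBound(to: UInt8.self)
            for col in 0..<width {
                // BGRA layout: the alpha component is the fourth byte of each pixel.
                if rowBytes[col * 4 + 3] != 0 { return false }
            }
        }
        return true
    }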

If I send the sampleBuffer from the captureOutput method in CustomCameraController directly, the stream works. But when I use the sampleBuffer from frameAvailable(_ sampleBuffer: CMSampleBuffer!), every frame is fully transparent.

Please help me fix it ☹️

Hi, the startCapture subframe parameter uses a normalized [0-1] range, not pixels. Here is a snippet from one of our video call examples:

    self.deepAr.startCapture(withOutputWidth: 720, outputHeight: 1280, subframe: CGRect(x: 0.0, y: 0.0, width: 1.0, height: 1.0))
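
Applied to the call in the question, that would presumably become (keeping the withOutputWidthAndFormat overload and BGRA output format from the original post):

    self.deepar.startCapture(withOutputWidthAndFormat: 640,
                             outputHeight: 480,
                             subframe: CGRect(x: 0.0, y: 0.0, width: 1.0, height: 1.0),
                             outputImageFormat: OutputFormat(rawValue: kCVPixelFormatType_32BGRA))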

Hope this helps with your issue.
