Every three minutes, DeepAR shuts down

Every three minutes, DeepAR shuts down for no apparent reason. Can you tell me why this might be happening? I have already turned off the iPhone’s auto screen-off feature. I suspected it might be because I’m not touching the screen or changing the face effects, but even after performing various actions such as rotating the screen and changing effects, DeepAR still shuts down automatically after three minutes, and the function

func didFinishShutdown() {
    print("deepAR is done")
}

gets executed, ending the session.

Is this happening because I am using a free license?

Additionally, when recording a video with the call below, DeepAR does not shut down, even after three minutes.

self.deepAR.startVideoRecording(withOutputWidth: width, outputHeight: height)

IMG_7015-1.mov.zip (5.9 MB)

I’ve tested both versions 5.4.3 and 5.4.4 of DeepAR.
Also, the sample code (https://github.com/DeepARSDK/quickstart-ios-swifta) works fine, but I don’t see any meaningful difference between my code and the sample code.

Additionally, I am using SwiftUI, so I wrap the deepARView in a UIViewRepresentable for rendering. Could this be causing any issues? Even though I never call self.deepAR.shutdown(), the shutdown process starts automatically and didFinishShutdown is triggered as soon as three minutes pass. I cannot determine the exact cause of the shutdown because I don’t have access to the library’s internal code. I have also tested with the iPhone’s screen auto-off feature disabled, so that should not be affecting it.

I will post additional code. If you need the complete code, please email me, and I will provide it.

Essentially, the view stored in a property of DeepARViewController is rendered on screen by MainTrackView, which accesses it through MainTrackViewModel and wraps it in a UIViewRepresentable.

Here is the code:

//
//  MainTrack.swift
//  SwiftUIWithUIKit
//
//  Created by 온석태 on 2023/10/19.
//

import SwiftUI


struct MainTrackView: View {
    @EnvironmentObject var navigationController: NavigationController
    @ObservedObject var mainTrackViewModel = MainTrackViewModel()
    
    var body: some View {
        VStack {
            DeepARViewWrapper(deepARView: mainTrackViewModel.deepARViewController.deepArView!)
        }.frame(maxWidth: .infinity, maxHeight: .infinity)
            .background(Color.yellow)
            .ignoresSafeArea(.all, edges: .all)
            .overlay(
                VStack {
                    HStack {
                        // Back button
                        Button(action: {
                            navigationController.pop()
                        }) {
                            Image(systemName: "chevron.backward.2")
                                .foregroundColor(.black)
                        }
                        
                        Spacer()
                        
                        // Camera direction toggle button
                        Button(action: {
                            mainTrackViewModel.switchCameraDirection()
                        }) {
                            Image(systemName: mainTrackViewModel.cameraDirection == CameraDirection.Front ?
                                  "arrow.triangle.2.circlepath.camera" : "arrow.triangle.2.circlepath.camera.fill")
                            .foregroundColor(.white)
                        }
                    }
                    .font(.system(size:25))
                    .padding()
                    
                    Spacer()
                    
                    HStack {
                        Spacer()
                        
                        // Photo mode button
                        Button(action: {mainTrackViewModel.switchCaptureMode(captureMode: CaptureMode.Photo)}) {
                            Text("Photo")
                                .foregroundColor(.white)
                        }
                        .padding(10)
                        .background(
                            RoundedRectangle(cornerRadius: 15) // Apply rounded corners
                                .fill(mainTrackViewModel.captureMode  == CaptureMode.Photo ? Color.blue : Color.clear)
                        )
                        
                        Spacer()
                        
                        // Video mode button
                        Button(action: {mainTrackViewModel.switchCaptureMode(captureMode: CaptureMode.Video)}) {
                            Text("Video")
                                .foregroundColor(.white)
                        }
                        .padding(10)
                        .background(
                            RoundedRectangle(cornerRadius: 15) // Apply rounded corners
                                .fill(mainTrackViewModel.captureMode  == CaptureMode.Video ? Color.blue : Color.clear)
                        )
                        
                        Spacer()
                        
                        // Audio mode button
                        Button(action: {mainTrackViewModel.switchCaptureMode(captureMode: CaptureMode.Audio)}) {
                            Text("Audio")
                                .foregroundColor(.white)
                        }
                    
                        .padding(10)
                        .background(
                            RoundedRectangle(cornerRadius: 15) // Apply rounded corners
                                .fill(mainTrackViewModel.captureMode  == CaptureMode.Audio ? Color.blue : Color.clear)
                        )
                        Spacer()
                        
                        // Live stream mode button
                        Button(action: {mainTrackViewModel.switchCaptureMode(captureMode: CaptureMode.LiveStream)}) {
                            Text("LiveStream")
                                .foregroundColor(.white)
                        }
                    
                        .padding(10)
                        .background(
                            RoundedRectangle(cornerRadius: 15) // Apply rounded corners
                                .fill(mainTrackViewModel.captureMode  == CaptureMode.LiveStream ? Color.blue : Color.clear)
                        )
                    }
                    
                    Spacer().frame(maxHeight: 40)
                    HStack {
                        // Previous effect button
                        Button(action: {
                            if (mainTrackViewModel.currentDeepAREffectIndex != 0) {
                                mainTrackViewModel.priviousDeepAREffect()
                            }
                        }) {
                            Image(systemName: mainTrackViewModel.currentDeepAREffectIndex != 0 ?
                                  "arrowtriangle.backward.fill": "arrowtriangle.backward")
                        }
                        
                        Spacer()
                        
                        // Record button
                        Button(action: {
                            mainTrackViewModel.onPressedRecordBtn()
                        }) {
                            Image(systemName: mainTrackViewModel.isRecording || mainTrackViewModel.isStreaming ?
                                  "record.circle" : "record.circle.fill")
                            .resizable()
                            .frame(width: 50.0, height: 50.0)
                            .foregroundColor(mainTrackViewModel.isRecording || mainTrackViewModel.isStreaming ? .red : .white)
                        }
                        
                        Spacer()
                        
                        // Next effect button
                        Button(action: {
                            if (mainTrackViewModel.currentDeepAREffectIndex < mainTrackViewModel.deepAREffectList.count) {
                                mainTrackViewModel.nextDeepAREffect()
                            }
                        }) {
                            Image(systemName: mainTrackViewModel.currentDeepAREffectIndex < mainTrackViewModel.deepAREffectList.count ?
                                  "arrowtriangle.right.fill": "arrowtriangle.right")
                        }
                    }
                    .font(.system(size:25))
                    .padding(.horizontal,40)
                    
                    HStack {
                        Spacer()
                        // Go to MasteringView button
                        Button(action: {
                            navigationController.push(MasteringView(), animated: true)
                        }) {
                            Image(systemName: "arrow.right.square.fill")
                                .resizable()
                                .frame(width: 30.0, height: 30.0)
                                .foregroundColor(.white)
                        }
                    }
                    .padding(.horizontal,40)
                    .padding(.bottom, 10)
                    .padding(.top, 20)
                }
            )
            .onAppear{
                print("메인트랙")
            }
    }
}


struct DeepARViewWrapper: UIViewRepresentable {
    
    var deepARView: UIView  // Assume this is your DeepAR view
    
    func makeUIView(context: Context) -> UIView {
        return deepARView
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        // Perform any updates to the UIView if necessary
    }
}

//
//  MainTrackViewModel.swift
//  SwiftUIWithUIKit
//
//  Created by 온석태 on 2023/10/20.
//

import Foundation
import rtmp
import RootEncoder

class MainTrackViewModel: ObservableObject, ConnectCheckerRtmp {
    @Published var isFlashOn:Bool = false
    @Published var isSilentModeOn:Bool = false
    @Published var captureMode:CaptureMode = CaptureMode.Photo
    @Published var isRecording:Bool = false
    @Published var isStreaming:Bool = false
    @Published var currentDeepAREffectIndex: Int = 0
    @Published var cameraDirection = CameraDirection.Front
    var rtmpEndpoint:String = "rtmp://192.168.0.28:1935/live/vid2"
    var cameraBase: CameraBase!
    
    // Audio engine manager
    var audioEngineMananger: AudioEngineMananger = AudioEngineMananger.audioEngineMananger
    
    var deepARViewController: DeepARViewController = DeepARViewController.deepARViewController
    
    var deepAREffectList: [String?] {
        return DeepAREffects.allCases.map { $0.rawValue.path }
    }
    
    init () {
        let effectPath = self.deepAREffectList[self.currentDeepAREffectIndex]
        self.cameraBase = CameraBase(connectCheckerRtmp: self)
        deepARViewController.setupDeepARAndCamera(cameraBase: self.cameraBase)
    }
    
    deinit {
        deepARViewController.showDownDeepAR()
    }

    
    
    func switchCameraDirection() {
        print("[MainTrackViewModel] - switchCameraDirection 카메라 방향 변경")
        self.cameraDirection = self.cameraDirection == CameraDirection.Front ? CameraDirection.Back : CameraDirection.Front
        deepARViewController.switchCameraDirection()
    }
    
    func capturePhoto() {
        print("[MainTrackViewModel] - capturePhoto 사진 촬영")
    }
    
    func priviousDeepAREffect() {
        print("[MainTrackViewModel] - priviousDeepAREffect 이전 효과")
        if (self.currentDeepAREffectIndex > 0) {
            self.currentDeepAREffectIndex -= 1;
            if let effectPath = self.deepAREffectList[self.currentDeepAREffectIndex] {
                deepARViewController.setDeepAREffectByPath(slot: DeepARSlot.effect, path: effectPath)
            }
        }
    }
    
    func nextDeepAREffect() {
        print("[MainTrackViewModel] - nextDeepAREffect 다음 효과")
        if (self.currentDeepAREffectIndex <= self.deepAREffectList.count) {
            self.currentDeepAREffectIndex += 1;
            if let effectPath = self.deepAREffectList[self.currentDeepAREffectIndex] {
                deepARViewController.setDeepAREffectByPath(slot: DeepARSlot.effect, path: effectPath)
            }
        }
    }
    
    func switchCaptureMode (captureMode: CaptureMode) {
        print("[MainTrackViewModel] - switchCaptureMode 촬영 모드 변경 \(self.captureMode) -> \(captureMode)")
        if self.captureMode == captureMode {return}
        
        self.captureMode = captureMode
    }
    
    func onPressedRecordBtn () {
        print("[MainTrackViewModel] - onPressedRecordBtn 녹화버튼 클릭")
        switch (self.captureMode) {
        case CaptureMode.Photo:
            self.takePhoto()
            break
        case CaptureMode.Video:
            if (self.isRecording) {
                self.stopViedoRecording()
            } else {
                self.startVideoRecording()
            }
            break
        case CaptureMode.Audio:
            if (self.isRecording) {
                self.stopAudioRecording()
            } else {
                self.startAudioRecording()
            }
        case CaptureMode.LiveStream:
            if (self.isStreaming) {
                self.isStreaming = false
                self.audioEngineMananger.startAudioEngine()
                self.deepARViewController.stopStream()
            } else {
                self.isStreaming = true
                self.audioEngineMananger.stopAudioEngine()
                self.deepARViewController.startStream(endpoint: self.rtmpEndpoint)
            }
            break
        case .VideoCall:
            break
        }
    }
    
    func takePhoto () {
        print("[MainTrackViewModel] - takePhoto 사진 촬영")
        self.deepARViewController.takeScreenShot()
    }
    
    func startVideoRecording () {
        print("[MainTrackViewModel] - startVideoRecording 비디오 촬영 시작")
        self.isRecording = true
        self.deepARViewController.startViedoRecording()
    }
    
    func stopViedoRecording () {
        print("[MainTrackViewModel] - stopViedoRecording 비디오 촬영 종료")
        self.isRecording = false
        self.deepARViewController.stopViedeoRecording()
        
    }
    
    func startAudioRecording () {
        print("[MainTrackViewModel] - startAudioRecording 오디오 녹음 시작")
        self.isRecording = true
        
    }
    
    func stopAudioRecording () {
        print("[MainTrackViewModel] - stopAudioRecording 오디오 녹음 종료")
        self.isRecording = false
    }
    
    
    
    
    // RTMP connection state callbacks
    
    func onConnectionSuccessRtmp() {
        print("onConnectionSuccessRtmp")
    }
    
    func onConnectionFailedRtmp(reason: String) {
        print("onConnectionFailedRtmp", reason)
        self.isStreaming = false
    }
    
    func onNewBitrateRtmp(bitrate: UInt64) {
        print("onNewBitrateRtmp", bitrate)
    }
    
    func onDisconnectRtmp() {
        print("onDisconnectRtmp")
        self.isStreaming = false
    }
    
    func onAuthErrorRtmp() {
        print("onAuthErrorRtmp")
        self.isStreaming = false
    }
    
    func onAuthSuccessRtmp() {
        print("onAuthSuccessRtmp")
        self.isStreaming = false
    }
    
}

extension String {
    var path: String? {
        return Bundle.main.path(forResource: self, ofType: nil)
    }
}

//
//  CameraManager.swift
//  SwiftUIWithUIKit
//
//  Created by 온석태 on 2023/10/21.
//

import Foundation
import DeepAR
import AVKit
import AVFoundation
import SwiftUI
import RootEncoder

//import rtsp
import rtmp

enum CaptureMode {
    case Photo
    case Video
    case Audio
    case LiveStream
    case VideoCall
}

enum CameraDirection: Int {
    case Front = 1
    case Back = 2
}

enum DeepAREffects: String, CaseIterable {
    case viking_helmet = "viking_helmet.deepar"
    case MakeupLook = "MakeupLook.deepar"
    case Split_View_Look = "Split_View_Look.deepar"
    case Emotions_Exaggerator = "Emotions_Exaggerator.deepar"
    case Emotion_Meter = "Emotion_Meter.deepar"
    case Stallone = "Stallone.deepar"
    case flower_face = "flower_face.deepar"
    case galaxy_background = "galaxy_background.deepar"
    case Humanoid = "Humanoid.deepar"
    case Neon_Devil_Horns = "Neon_Devil_Horns.deepar"
    case Ping_Pong = "Ping_Pong.deepar"
    case Pixel_Hearts = "Pixel_Hearts.deepar"
    case Snail = "Snail.deepar"
    case Hope = "Hope.deepar"
    case Vendetta_Mask = "Vendetta_Mask.deepar"
    case Fire_Effect = "Fire_Effect.deepar"
    case burning_effect = "burning_effect.deepar"
    case Elephant_Trunk = "Elephant_Trunk.deepar"
}

enum DeepARSlot: String {
    case effect = "effect"
}

class DeepARViewController: UIViewController, DeepARDelegate {
    static let deepARViewController = DeepARViewController()
    
    var videoPlayerController: VideoPlayerViewController = VideoPlayerViewController.videoPlayerViewController
    
    var deepAR: DeepAR!
    var cameraController: CameraController!
    var deepArView: UIView?
    
    let defaultSlot: DeepARSlot = DeepARSlot.effect
    let defaultDeepAREffect:String = "viking_helmet.deepar"
    
    var outputPath:String?
    
    var cameraBase: CameraBase!
    override func viewDidLoad() {
        super.viewDidLoad()
        self.getProjectFolder()
        
        // Set the DeepAR video recording output folder
        deepAR.setVideoRecordingOutputPath(self.outputPath)
        
        // Register for device orientation change notifications
        NotificationCenter.default.addObserver(self, selector: #selector(orientationDidChange), name: UIDevice.orientationDidChangeNotification, object: nil)
    }
    
    deinit {
        // Remove the orientation change observer
        NotificationCenter.default.removeObserver(self, name: UIDevice.orientationDidChangeNotification, object: nil)
    }

    
    // Orientation change handler
    @objc func orientationDidChange() {
           print("[DeepARViewController] - orientationDidChange 화면 전환됨")
            guard let orientation = UIApplication.shared.windows.first?.windowScene?.interfaceOrientation else { return }
            switch orientation {
            case .landscapeLeft:
                cameraController.videoOrientation = .landscapeLeft
                break
            case .landscapeRight:
                cameraController.videoOrientation = .landscapeRight
                break
            case .portrait:
                cameraController.videoOrientation = .portrait
                break
            case .portraitUpsideDown:
                cameraController.videoOrientation = .portraitUpsideDown
            default:
                break
            }
    }
    
    // Callback invoked for every frame DeepAR produces
    func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
        cameraBase.getYUVData(from: sampleBuffer)
    }
    
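    // Converts an ARGB sample buffer from DeepAR into a 420f (NV12) YUV pixel buffer via Core Image.
    // Note: this helper does not appear to be called anywhere in the posted code.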
    func convertARGBToYUV(sampleBuffer: CMSampleBuffer) -> CVPixelBuffer? {
        guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else {
          return nil
        }
        // pixel format is ARGB
    //    guard CVPixelBufferGetPixelFormatType(imageBuffer) == kCVPixelFormatType_32ARGB else {
    //      return nil
    //    }
    //
        guard CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) == kCVReturnSuccess else {
          return nil
        }
        defer {
          CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)
        }
        let width = CVPixelBufferGetWidth(imageBuffer)
        let height = CVPixelBufferGetHeight(imageBuffer)
        let ciImage = CIImage(cvPixelBuffer: imageBuffer)
        // Create a CIContext for rendering
        let ciContext = CIContext()
        // Create an empty CVPixelBuffer for YUV data
        var yuvPixelBuffer: CVPixelBuffer?
        CVPixelBufferCreate(nil, width, height, kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, nil, &yuvPixelBuffer)
        // Render the CIImage into the YUV CVPixelBuffer
        ciContext.render(ciImage, to: yuvPixelBuffer!)
        print("Pixel Format: \(CVPixelBufferGetPixelFormatType(yuvPixelBuffer!))")
        return yuvPixelBuffer
      }
    
    
    func getProjectFolder () {
        let fileManager = FileManager.default
        let tmpDirectoryURL = FileManager.default.temporaryDirectory
        
        let mySpecialFolderURL = tmpDirectoryURL.appendingPathComponent("DeepAR")
        
        // Check whether the folder already exists
        if !fileManager.fileExists(atPath: mySpecialFolderURL.path) {
            do {
                // Create the folder if it does not exist
                try fileManager.createDirectory(at: mySpecialFolderURL, withIntermediateDirectories: true, attributes: nil)
                print("Successfully created mySpecialFolder!")
            } catch {
                print("Error creating directory: \(error.localizedDescription)")
            }
            self.outputPath = mySpecialFolderURL.path
        } else {
            print("mySpecialFolder already exists!")
            self.outputPath = mySpecialFolderURL.path
        }
        
        print("[DeepARViewCOntroller] - getProjectFolder 프로젝트 아웃풋 경로  = \(self.outputPath)")
    }
    
    
    // Shut down DeepAR when the camera is no longer needed
    func showDownDeepAR () {
        self.deepAR.shutdown()
    }
    
    // Set up DeepAR and the camera (called on first entry)
    func setupDeepARAndCamera(cameraBase: CameraBase) {
        self.cameraBase = cameraBase
        self.deepAR = DeepAR()
        
        self.deepAR.delegate = self
        self.deepAR.setLicenseKey("LicenseKey")
        
        cameraController = CameraController()
        cameraController.deepAR = self.deepAR
        self.deepAR.videoRecordingWarmupEnabled = false;
        
        let arView = self.deepAR.createARView(withFrame: self.view.bounds)
        arView?.translatesAutoresizingMaskIntoConstraints = false
        self.view.addSubview(arView!)
        arView?.leftAnchor.constraint(equalTo: self.view.leftAnchor, constant: 0).isActive = true
        arView?.rightAnchor.constraint(equalTo: self.view.rightAnchor, constant: 0).isActive = true
        arView?.topAnchor.constraint(equalTo: self.view.topAnchor, constant: 0).isActive = true
        arView?.bottomAnchor.constraint(equalTo: self.view.bottomAnchor, constant: 0).isActive = true
        
        self.deepArView = arView
        
        cameraController.startCamera(withAudio: false)
        if let defaultEffect:String = self.defaultDeepAREffect.path {
            self.setDeepAREffectByPath(slot: self.defaultSlot, path: defaultEffect)
        }
    }
    
    // Switch the DeepAR effect for a given slot
    func setDeepAREffectByPath(slot: DeepARSlot, path: String) {
        self.deepAR.switchEffect(withSlot: slot.rawValue, path: path)
    }
    
    // Toggle the camera direction
    func switchCameraDirection () {
        self.cameraController.position = cameraController.position == .back ? .front : .back
    }
    
    // Take a photo
    func takeScreenShot () {
        self.deepAR.takeScreenshot()
    }
    
    func startStream (endpoint: String) {
        // deepAR.startCapture disables DeepAR's built-in video recording, so capture is only enabled while live streaming
        deepARStartCapture()
        if (cameraBase.prepareVideo() && cameraBase.prepareAudio()) {
            cameraBase.startStream(endpoint: endpoint)
        }
    }
    
    func stopStream () {
        // Stop capturing when the live stream ends so DeepAR's built-in video recording works again
        deepARStopCapture()
        cameraBase.stopStream()
    }
    
    // Callback after a screenshot has been taken
    func didTakeScreenshot(_ screenshot: UIImage!) {
        UIImageWriteToSavedPhotosAlbum(screenshot, nil, nil, nil)
        
        let imageView = UIImageView(image: screenshot)
    }
    
    func startViedoRecording () {
        let width: Int32 = Int32(self.deepAR.renderingResolution.width)
        let height: Int32 =  Int32(self.deepAR.renderingResolution.height)
        
        if(self.deepAR.videoRecordingWarmupEnabled) {
            self.deepAR.resumeVideoRecording()
        } else {
            if(self.deepAR.videoRecordingWarmupEnabled) {
                NSLog("Can't change video recording settings when video recording warmap enabled")
                return
            }
            let videoQuality = 0.1
            let bitrate =  1250000
            let videoSettings:[AnyHashable : AnyObject] = [:]
            
            // Lower-quality video settings
//            let videoSettings:[AnyHashable : AnyObject] = [
//                         AVVideoQualityKey : (videoQuality as AnyObject),
//                         AVVideoAverageBitRateKey : (bitrate as AnyObject)
//                     ]

            let frame = CGRect(x: 0, y: 0, width: 1, height: 1)
            
            let dateFormatter = DateFormatter()
            dateFormatter.dateFormat = "yyyyMMddHHmmssSSS"
            let dateString = dateFormatter.string(from: Date())
            let fileName = "deepAR_video_\(dateString)"
            
            // File name for the recorded video
            self.deepAR.setVideoRecordingOutputName(fileName)
            self.deepAR.enableAudioProcessing(false)
            
            // Start recording (without audio)
            self.deepAR.startVideoRecording(withOutputWidth: width, outputHeight: height, subframe: frame, videoCompressionProperties: videoSettings, recordAudio: false)
            
            // The call below also records audio
            //self.deepAR.startVideoRecording(withOutputWidth: width, outputHeight: height)
        }
    }
    
    func didInitialize() {
        print("didInitialize")
        // To have frameAvailable called for every frame, deepAR.startCapture must be executed exactly here, inside didInitialize (see the documentation)
//        deepAR.startCapture(withOutputWidthAndFormat: 720, outputHeight: 1280, subframe: CGRect(x: 0, y: 0, width: 1, height: 1), outputImageFormat: OutputFormat.RGBA)
        //deepAR.startCapture(withOutputWidth: 720, outputHeight: 1280, subframe: CGRect(x: 0, y: 0, width: 1, height: 1))
    }
    
    func deepARStartCapture () {
        deepAR.startCapture(withOutputWidthAndFormat: 720, outputHeight: 1280, subframe: CGRect(x: 0, y: 0, width: 1, height: 1), outputImageFormat: OutputFormat.RGBA)
    }
    
    func deepARStopCapture () {
        deepAR.stopCapture()
    }
    
    func didFinishShutdown () {
        print("deepAR is done")
    }
    func stopViedeoRecording () {
        self.deepAR.finishVideoRecording()
    }
    
    func didFinishPreparingForVideoRecording() {
        NSLog("didFinishPreparingForVideoRecording!!!!!")
    }
    
    // Called when video recording starts
    func didStartVideoRecording() {
        AudioEngineMananger.audioEngineMananger.startAudioRecord()
        NSLog("didStartVideoRecording!!!!!")
    }
    
    
    // Callback when video recording finishes
    func didFinishVideoRecording(_ videoFilePath: String!) {
        // Stop the audio engine recording
        let itemId = AudioEngineMananger.audioEngineMananger.stopAudioRecord()
        
        // Add the video to the video playlist
        let currentPosition = BSETransport.getPosition()
        let duration = BSEAudioItem.getLength(itemID: itemId)
        videoPlayerController.appendVideo(path: videoFilePath, position: currentPosition, duration: duration)
        
        print("didFinishVideoRecording!!!!!", videoFilePath)
        UISaveVideoAtPathToSavedPhotosAlbum(videoFilePath, self, #selector(video(_:didFinishSavingWithError:contextInfo:)), nil)
    }
    
    // Save the recorded video to the photo album
    @objc func video(_ videoPath: String, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        if let error = error {
            print("Error saving
[IMG_7015-1.mov.zip|attachment](upload://vmrHv9WUxqtv6P9uwVWKkULVoie.zip) (5.9 MB)
 video: \(error.localizedDescription)")
        } else {
            print("Successfully saved video to photo album.")
        }
    }
    
    // Add the necessary DeepAR delegate methods here
}

Hi, are you using the free licence? The free licence has a 3-minute timeout before it shuts down; any paid licence doesn’t have that limit.
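
If you want to confirm on your side that the timeout is the cause rather than something in the SwiftUI wrapping, one minimal sketch is to track whether the shutdown was requested by your own code. The didRequestShutdown flag below is hypothetical (not part of the DeepAR API) and would be dropped into the posted DeepARViewController:

var didRequestShutdown = false

// Called by the app itself when it wants to end the session.
func showDownDeepAR() {
    didRequestShutdown = true
    self.deepAR.shutdown()
}

func didFinishShutdown() {
    if didRequestShutdown {
        print("deepAR is done (shutdown requested by the app)")
    } else {
        // Shutdown arrived without a call to shutdown() — with a free licence
        // this is most likely the 3-minute limit; a paid key passed to
        // setLicenseKey(_:) should remove it.
        print("deepAR shut down unexpectedly - check the licence")
    }
    didRequestShutdown = false
}

If the unexpected branch is the one that fires at the 3-minute mark, the licence is almost certainly the cause rather than the UIViewRepresentable wrapper.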