Help with a problem using DeepAR with IVS for live streaming

This is what I tried:


Hello, I am currently testing a live-streaming demo app that uses DeepAR and IVS. I am using an iOS device, specifically an iPhone 12.

Below is my Podfile:

# Uncomment the next line to define a global platform for your project
platform :ios, '9.0'

target 'quickstart-ios-swift' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!

  # Pods for quickstart-ios-swift
  pod 'AmazonIVSBroadcast', '~> 1.12.0'
  pod 'DeepAR'

end

I initially wanted to use the latest version of AmazonIVSBroadcast without specifying the version, but I encountered an error:

2023-10-27 15:36:20.516995+0900 quickstart-ios-swift[664:66109] [AudioConverter] CodecConverter.cpp:1030 Encoder client can't handle 256-byte packet!

Due to this error, I had to use a specific version.

The problem is that, although video and audio are transmitted to IVS correctly, when I run the following code:

func setupIVS() {
    do {
        // Create the broadcast session with the standard portrait preset
        // and the front-camera device descriptor.
        let broadcastSession = try IVSBroadcastSession(
            configuration: IVSPresets.configurations().standardPortrait(),
            descriptors: IVSPresets.devices().frontCamera(),
            delegate: self)

        // Create a custom image source and attach it to a slot;
        // it is meant to receive the frames processed by DeepAR.
        let customImageSource = broadcastSession.createImageSource(withName: "customSourceName")
        broadcastSession.attach(customImageSource, toSlotWithName: "customSourceSlot")
        self.customImageSource = customImageSource

        self.broadcastSession = broadcastSession

        // Start streaming to the IVS ingest endpoint.
        let url = URL(string: kIngestServer)
        let key = kStreamKey
        try broadcastSession.start(with: url!, streamKey: key)
    } catch {
        assertionFailure("IVSBroadcastSession failed to initialize.")
    }
}

The phone's screen freezes. It seems like the thread might be blocked, because while the screen is frozen the live stream continues normally when I check IVS. How can I resolve this? Any help would be appreciated.

Replying for future reference:

When setting up the session, the device descriptors should be set to microphone() or nil, rather than frontCamera(), if a custom image source is used for the camera feed.
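For future readers, here is a minimal sketch of what the corrected setup might look like. The class name BroadcastCoordinator and the kIngestServer/kStreamKey values are illustrative placeholders, and it assumes that DeepAR delivers processed frames through the DeepARDelegate frameAvailable(_:) callback, which are then pushed into the IVS custom image source via onSampleBuffer(_:); adjust the details to your own project.

import Foundation
import CoreMedia
import AmazonIVSBroadcast
import DeepAR

// Hypothetical coordinator object; names are illustrative, not from the question.
final class BroadcastCoordinator: NSObject, DeepARDelegate {

    // Placeholders standing in for the question's kIngestServer / kStreamKey.
    private let kIngestServer = "rtmps://<your-ingest-endpoint>:443/app/"
    private let kStreamKey = "<your-stream-key>"

    private var broadcastSession: IVSBroadcastSession?
    private var customImageSource: IVSCustomImageSource?

    func setupIVS() {
        do {
            // Use microphone() (or nil) instead of frontCamera(): the camera is
            // already driven by DeepAR, and its frames arrive through the custom
            // image source, so the session should not attach its own camera.
            let session = try IVSBroadcastSession(
                configuration: IVSPresets.configurations().standardPortrait(),
                descriptors: IVSPresets.devices().microphone(),
                delegate: nil) // or your IVSBroadcastSession.Delegate

            // Create the custom image source and attach it to a slot.
            let source = session.createImageSource(withName: "customSourceName")
            session.attach(source, toSlotWithName: "customSourceSlot") { error in
                if let error = error {
                    print("Failed to attach custom image source: \(error)")
                }
            }

            self.customImageSource = source
            self.broadcastSession = session

            // Start streaming to the IVS ingest endpoint.
            guard let url = URL(string: kIngestServer) else { return }
            try session.start(with: url, streamKey: kStreamKey)
        } catch {
            assertionFailure("IVSBroadcastSession failed to initialize: \(error)")
        }
    }

    // DeepARDelegate callback (assumed signature): forward each processed frame
    // from DeepAR into the IVS custom image source.
    func frameAvailable(_ sampleBuffer: CMSampleBuffer!) {
        guard let sampleBuffer = sampleBuffer else { return }
        customImageSource?.onSampleBuffer(sampleBuffer)
    }
}

The key change from the code in the question is the descriptors argument: the session no longer attaches frontCamera() while DeepAR is already using the camera.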