Hi there,
I’m building an application using react-native lib. However, I can’t find a way to pass my own media stream or use the output stream for my purposes. Is it possible or not? I saw a way for web SDK, but couldn’t find a way for mobile app.
Hi, the React Native plugin is a third-party demo, so it doesn’t have all of the functions implemented, but the native SDKs do provide a way to achieve this — you can find some examples on GitHub.
Check out the Offscreen Rendering guides: rendered frames are delivered through the frameAvailable callback, and from there you can feed them into whatever pipeline you need.
Hi! Yes, it is possible to work with media streams in React Native, but it depends on the library you’re using. If you’re referring to WebRTC (e.g., react-native-webrtc), you can pass and manipulate media streams using getUserMedia() for capturing audio/video and RTCPeerConnection for handling streams. Unlike the web SDK, React Native requires native module support, so ensure you have the right permissions set up and that your implementation aligns with the platform’s constraints. If you need more specific guidance, could you share which library you’re using and your exact use case? I’d be happy to help!
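To make that concrete, here is a minimal sketch of capturing a local stream with react-native-webrtc. The actual `mediaDevices.getUserMedia` and `RTCPeerConnection` calls need the library's native module, so they appear as comments; the constraints-building helper and its default resolution values are just illustrative choices.

```javascript
// Minimal sketch of local capture with react-native-webrtc.
// In a real app you would import from the library:
//   import { mediaDevices, RTCPeerConnection } from 'react-native-webrtc';

// Build getUserMedia constraints for the mobile camera; facing is either
// 'user' (front camera) or 'environment' (back camera).
function buildConstraints(facing) {
  return {
    audio: true,
    video: { facingMode: facing, width: 640, height: 480 },
  };
}

// With the native module available, you would capture the stream and add
// its tracks to a peer connection:
//   const stream = await mediaDevices.getUserMedia(buildConstraints('user'));
//   const pc = new RTCPeerConnection({ iceServers: [] });
//   stream.getTracks().forEach((track) => pc.addTrack(track, stream));

console.log(JSON.stringify(buildConstraints('user')));
```

Remember that on both platforms the camera and microphone permissions (Info.plist keys on iOS, AndroidManifest entries on Android) must be declared before getUserMedia() will succeed.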