I have successfully played remote video on screen in real time using webrtc-ios and SwiftUI. Now I also want to record that video in real time. I know how to convert an RTCVideoFrame into a CMSampleBuffer, and I know I have to use AVAssetWriter to save it, but I am not sure where I can extract the RTCVideoFrame in real time.
Here is my code:
struct VideoView: UIViewRepresentable {
    let videoTrack: RTCVideoTrack?
    @Binding var refreshVideoTrack: Bool

    // RTCNSGLVideoView
    // RTCMTLNSVideoView
    func makeUIView(context: Context) -> RTCEAGLVideoView {
        let view = RTCEAGLVideoView(frame: .zero)
        view.contentMode = .scaleAspectFill
        return view
    }

    func updateUIView(_ view: RTCEAGLVideoView, context: Context) {
        if refreshVideoTrack {
            videoTrack?.add(view)
            refreshVideoTrack = false
        }
    }
}
VideoView(videoTrack: homeViewModel.remoteVideoTrack,
          refreshVideoTrack: Binding<Bool>(
              get: { homeViewModel.refreshRemoteVideoTrack },
              set: { homeViewModel.refreshRemoteVideoTrack = $0 }))

I get remoteVideoTrack via RTCPeerConnectionDelegate:
func peerConnection(_ peerConnection: RTCPeerConnection, didAdd stream: RTCMediaStream) {
    dLog("")
    remoteVideoTrack = stream.videoTracks.first
    remoteVideoTrack?.isEnabled = true
    refreshRemoteVideoTrack = true
}

Posted on 2021-11-05 09:40:44
To get each RTCVideoFrame, you can create an intermediate RTCVideoRenderer:

class FrameRenderer: NSObject, RTCVideoRenderer {
    func setSize(_ size: CGSize) {}
    func renderFrame(_ frame: RTCVideoFrame?) {}
}

Then attach it to the track (a track can have multiple renderers, so the on-screen view keeps working):

videoTrack?.add(FrameRenderer())

https://stackoverflow.com/questions/69847842