I am using WebRTC for a video call between two iPhone devices. The WebRTC peer connection is established successfully, and the video stream should be displayed in SwiftUI. The WebRTC object that holds the video, an RTCEAGLVideoView, needs to be rendered in SwiftUI. In my model class I declared the object as:

var remoteVideoView: RTCEAGLVideoView? {
    willSet {
        objectWillChange.send()
    }
}

In the SwiftUI view, remoteVideoView should be rendered:

VStack {
    // show remoteVideoView here
}

Which kind of object should I use to render this video view?
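For context, a property like the one above usually lives in an ObservableObject so SwiftUI is notified before the view changes. A minimal sketch, assuming the GoogleWebRTC framework is available (the class name VideoCallModel is hypothetical):

```swift
import SwiftUI
import WebRTC  // provides RTCEAGLVideoView

// Hypothetical model class; only the observed video view is shown.
final class VideoCallModel: ObservableObject {
    var remoteVideoView: RTCEAGLVideoView? {
        willSet {
            objectWillChange.send()  // tell SwiftUI a change is coming
        }
    }
}
```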
Posted on 2020-02-06 18:53:25
First, you need to create an RTCMediaStream that contains the audio track and the video track:

@property RTCMediaStream * _Nullable mediaStream;

Convert the stream fetched from the media server (via its streamId) into an RTCMediaStream, then render it in the RTCEAGLVideoView with the following code:

if (mediaStream.videoTracks.count > 0) {
    RTCVideoTrack *videoTrack = [self.mediaStream.videoTracks objectAtIndex:0];
    [videoTrack addRenderer:remoteView];
}

Also, create an IBOutlet for the RTCEAGLVideoView, like this:

@property (weak, nonatomic) IBOutlet RTCEAGLVideoView *remoteView;

Now add a view in your view controller, assign RTCEAGLVideoView as that view's class, and connect the videoView outlet.

If you are not using storyboards, use the code below:

RTCEAGLVideoView *remoteView = [[RTCEAGLVideoView alloc] initWithFrame:self.frame]; // pass a CGRect frame here
remoteView.delegate = self;
[yourView addSubview:remoteView];

Now you can see your video.
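Since the question is about Swift, here is a rough Swift translation of the Objective-C steps above, assuming the GoogleWebRTC Swift bindings (the function name attachRemoteVideo and the containerView parameter are placeholders):

```swift
import UIKit
import WebRTC

// Attach the first remote video track of a media stream to a new
// RTCEAGLVideoView and add that view to a container view.
func attachRemoteVideo(from mediaStream: RTCMediaStream, in containerView: UIView) {
    let remoteView = RTCEAGLVideoView(frame: containerView.bounds)
    containerView.addSubview(remoteView)

    // RTCEAGLVideoView conforms to RTCVideoRenderer, so it can be
    // added directly to the track as a renderer.
    if let videoTrack = mediaStream.videoTracks.first {
        videoTrack.add(remoteView)
    }
}
```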
Add the RTCEAGLVideoView inside the VStack() below. Note that UIViewRepresentable requires makeUIView(context:) and updateUIView(_:context:), and the struct needs a frame property for the preview call to compile:

struct RemoteView: UIViewRepresentable {
    let frame: CGRect

    func makeUIView(context: Context) -> RTCEAGLVideoView {
        // create the RTCEAGLVideoView with the given frame here
        RTCEAGLVideoView(frame: frame)
    }

    func updateUIView(_ uiView: RTCEAGLVideoView, context: Context) {}
}

struct RemoteView_Preview: PreviewProvider {
    static var previews: some View {
        RemoteView(frame: .zero)
    }
}

Use it inside a VStack like this:

VStack {
    RemoteView(frame: .zero)
    VStack {
        Text("")
    }
}

Posted on 2020-02-07 16:23:33
struct RemoteView: UIViewRepresentable {
    @Binding var video: VideoCall
    @Binding var remoteView: RTCEAGLVideoView

    func makeUIView(context: Context) -> RTCEAGLVideoView {
        self.remoteView = self.video.remoteVideoView!
        self.remoteView.frame = CGRect(x: 20, y: 20, width: 200, height: 300)
        return self.remoteView
    }

    func updateUIView(_ uiView: RTCEAGLVideoView, context: UIViewRepresentableContext<RemoteView>) {}
}

I call it from another SwiftUI struct:

VStack {
    RemoteView()
}

But I get a compiler error.
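The compiler error is most likely "missing arguments for parameters": RemoteView declares two @Binding properties, so its memberwise initializer requires both, and RemoteView() supplies neither. Since this representable never writes back to a SwiftUI source of truth, one fix is to drop the bindings and pass the view in directly. A sketch, assuming the model has already created remoteVideoView:

```swift
import SwiftUI
import WebRTC

struct RemoteView: UIViewRepresentable {
    // Plain stored property: RemoteView only reads the view,
    // so no @Binding is needed.
    let remoteView: RTCEAGLVideoView

    func makeUIView(context: Context) -> RTCEAGLVideoView {
        remoteView.frame = CGRect(x: 20, y: 20, width: 200, height: 300)
        return remoteView
    }

    func updateUIView(_ uiView: RTCEAGLVideoView, context: Context) {}
}

// Call site: supply the argument the memberwise initializer requires.
// VStack {
//     RemoteView(remoteView: video.remoteVideoView!)
// }
```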
https://stackoverflow.com/questions/60087973