I'm already using WebRTC. I want to record the local video stream to a file. I'd appreciate any hint you can give me.
Thanks for reading.
@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    setContentView(R.layout.activity_main);
    //sigConnect("http://10.54.36.19:8000/");
    sigConnect("http://unwebrtc.herokuapp.com/");
    initWebRTC();
    Log.i(TAG, "VideoCapturerAndroid.getDeviceCount() = " + VideoCapturerAndroid.getDeviceCount());
    String nameOfFrontFacingDevice = VideoCapturerAndroid.getNameOfFrontFacingDevice();
    String nameOfBackFacingDevice = VideoCapturerAndroid.getNameOfBackFacingDevice();
    Log.i(TAG, "VideoCapturerAndroid.getNameOfFrontFacingDevice() = " + nameOfFrontFacingDevice);
    Log.i(TAG, "VideoCapturerAndroid.getNameOfBackFacingDevice() = " + nameOfBackFacingDevice);
    VideoCapturerAndroid capturer = VideoCapturerAndroid.create(nameOfFrontFacingDevice);
    MediaConstraints videoConstraints = new MediaConstraints();
    VideoSource videoSource = peerConnectionFactory.createVideoSource(capturer, videoConstraints);
    localVideoTrack = peerConnectionFactory.createVideoTrack(VIDEO_TRACK_ID, videoSource);
    glview = (GLSurfaceView) findViewById(R.id.glview);
    VideoRendererGui.setView(glview, null);
    try {
        rendereRemote = VideoRendererGui.createGui(0, 0, 100, 100, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        rendereLocal = VideoRendererGui.createGui(72, 72, 25, 25, VideoRendererGui.ScalingType.SCALE_ASPECT_FILL, true);
        localVideoTrack.addRenderer(rendereLocal);
    } catch (Exception e) {
        e.printStackTrace();
    }
    mediaStream = peerConnectionFactory.createLocalMediaStream(LOCAL_MEDIA_STREAM_ID);
    mediaStream.addTrack(localVideoTrack);
}

Posted on 2016-11-04 14:01:48
The libjingle library uses a GLSurfaceView to render the video. You could try saving the video frames from that view using the FFmpeg library. Not sure about audio, though.
Posted on 2018-11-12 19:13:32
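If you go the read-pixels route this answer suggests, one detail to watch: OpenGL's `glReadPixels` fills the buffer bottom-up (the GL origin is the bottom-left corner), while FFmpeg and video encoders expect top-down rows. A minimal sketch of the row flip for an RGBA readback buffer; the class and method names here are hypothetical, not part of any library:

```java
public final class GlFrameUtil {
    // glReadPixels returns rows bottom-up; encoders and image files
    // expect top-down rows, so flip the frame vertically row by row.
    public static byte[] flipRowsVertically(byte[] rgba, int width, int height) {
        final int rowBytes = width * 4;  // 4 bytes per RGBA pixel
        byte[] out = new byte[rgba.length];
        for (int row = 0; row < height; row++) {
            System.arraycopy(rgba, row * rowBytes,
                             out, (height - 1 - row) * rowBytes, rowBytes);
        }
        return out;
    }
}
```

Each flipped RGBA frame would still need a pixel-format conversion (typically to YUV) before being fed to the encoder, and audio would have to be captured separately, as the answer notes.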
You have to create a video container, such as mp4, and manually encode and write each raw frame. In addition, recent WebRTC versions also provide a channel for recording audio from the microphone; the audio samples should be encoded and muxed as well.
For accessing the raw video frames of the remote and local peers, see:
https://stackoverflow.com/questions/40416166
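As a rough illustration of the "encode each raw frame manually" step: the I420 frames WebRTC hands out carry a separate row stride per plane, while an encoder such as MediaCodec typically wants the planes tightly packed into one buffer of `width * height * 3 / 2` bytes. A sketch of that packing step, assuming I420 input; the class name is hypothetical:

```java
public final class I420Packer {
    // Pack three I420 planes (each with its own row stride) into one
    // tightly packed buffer: full-resolution Y, then quarter-resolution
    // U and V. Output size is width * height * 3 / 2 bytes.
    public static byte[] pack(byte[] y, int yStride,
                              byte[] u, int uStride,
                              byte[] v, int vStride,
                              int width, int height) {
        byte[] out = new byte[width * height * 3 / 2];
        int pos = 0;
        for (int row = 0; row < height; row++) {       // Y plane
            System.arraycopy(y, row * yStride, out, pos, width);
            pos += width;
        }
        int chromaWidth = width / 2;
        int chromaHeight = height / 2;
        for (int row = 0; row < chromaHeight; row++) { // U plane
            System.arraycopy(u, row * uStride, out, pos, chromaWidth);
            pos += chromaWidth;
        }
        for (int row = 0; row < chromaHeight; row++) { // V plane
            System.arraycopy(v, row * vStride, out, pos, chromaWidth);
            pos += chromaWidth;
        }
        return out;
    }
}
```

The packed buffer would then be queued into the encoder's input buffer and the encoded output written to a muxer such as MediaMuxer; presentation timestamps and the color format negotiated with the codec still have to be handled on top of this.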