
Libstreaming errors (decoder buffer not big enough, decoder did not decode anything)

Stack Overflow user
Asked on 2014-06-10 04:17:39
4 answers · 4K views · Votes: 4

I am trying to use the libstreaming library from here: https://github.com/fyhertz/libstreaming

I started by following example2 from here: https://github.com/fyhertz/libstreaming-examples

I am trying to use this streaming library on a Galaxy Nexus.

If I use a small resolution (new VideoQuality(128,96,20,500000)), I get an error that the decoder did not decode anything:

06-09 19:59:31.531: D/libEGL(8198): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:59:31.539: D/libEGL(8198): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:59:31.632: D/OpenGLRenderer(8198): Enabling debug mode 0
06-09 19:59:33.773: D/MainActivity(8198): Start
06-09 19:59:33.773: D/MainActivity(8198): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{420285e0 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:59:33.789: I/dalvikvm(8198): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:59:33.789: W/dalvikvm(8198): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:59:33.789: D/dalvikvm(8198): VFY: replacing opcode 0x6e at 0x005e
06-09 19:59:33.789: I/MediaStream(8198): Phone supports the MediaCoded API
06-09 19:59:33.843: D/dalvikvm(8198): GC_CONCURRENT freed 65K, 2% free 9075K/9168K, paused 4ms+2ms, total 34ms
06-09 19:59:33.843: D/dalvikvm(8198): WAIT_FOR_CONCURRENT_GC blocked 15ms
06-09 19:59:34.750: V/VideoQuality(8198): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:59:34.750: V/VideoQuality(8198): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:59:35.140: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.171: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.179: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.211: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.242: W/ACodec(8198): Use baseline profile instead of 8 for AVC recording
06-09 19:59:35.242: I/ACodec(8198): setupVideoEncoder succeeded
06-09 19:59:35.515: I/OMXClient(8198): Using client-side OMX mux.
06-09 19:59:35.515: E/OMXNodeInstance(8198): OMX_GetExtensionIndex failed
06-09 19:59:36.359: D/dalvikvm(8198): GC_CONCURRENT freed 156K, 3% free 9356K/9552K, paused 4ms+5ms, total 25ms
06-09 19:59:38.531: W/System.err(8198): java.lang.RuntimeException: The decoder did not decode anything.
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:799)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:59:38.539: W/System.err(8198):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:59:38.539: W/System.err(8198):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:59:38.546: W/System.err(8198):     at android.os.Looper.loop(Looper.java:137)
06-09 19:59:38.546: W/System.err(8198):     at android.os.HandlerThread.run(HandlerThread.java:60)

If I try a larger resolution (new VideoQuality(640,480,20,500000)), it reports that the decoder input buffer is not big enough:

06-09 19:51:51.054: D/libEGL(8096): loaded /vendor/lib/egl/libEGL_POWERVR_SGX540_120.so
06-09 19:51:51.062: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv1_CM_POWERVR_SGX540_120.so
06-09 19:51:51.070: D/libEGL(8096): loaded /vendor/lib/egl/libGLESv2_POWERVR_SGX540_120.so
06-09 19:51:51.164: D/OpenGLRenderer(8096): Enabling debug mode 0
06-09 19:51:53.054: D/MainActivity(8096): Start
06-09 19:51:53.054: D/MainActivity(8096): Found mSurfaceView: net.majorkernelpanic.streaming.gl.SurfaceView{42031b00 V.E..... ........ 32,32-688,910 #7f080001 app:id/surface}
06-09 19:51:53.062: I/dalvikvm(8096): Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2
06-09 19:51:53.062: W/dalvikvm(8096): VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;
06-09 19:51:53.062: D/dalvikvm(8096): VFY: replacing opcode 0x6e at 0x005e
06-09 19:51:53.070: I/MediaStream(8096): Phone supports the MediaCoded API
06-09 19:51:53.132: D/dalvikvm(8096): GC_CONCURRENT freed 103K, 2% free 9038K/9168K, paused 4ms+3ms, total 42ms
06-09 19:51:53.132: D/dalvikvm(8096): WAIT_FOR_CONCURRENT_GC blocked 28ms
06-09 19:51:54.039: V/VideoQuality(8096): Supported resolutions: 1920x1080, 1280x720, 960x720, 800x480, 720x576, 720x480, 768x576, 640x480, 320x240, 352x288, 240x160, 176x144, 128x96
06-09 19:51:54.039: V/VideoQuality(8096): Supported frame rates: 15-15fps, 15-30fps, 24-30fps
06-09 19:51:54.468: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.500: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.515: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.554: D/dalvikvm(8096): GC_FOR_ALLOC freed 106K, 2% free 9210K/9344K, paused 18ms, total 18ms
06-09 19:51:54.554: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.458MB for 460816-byte allocation
06-09 19:51:54.578: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 9660K/9796K, paused 22ms, total 22ms
06-09 19:51:54.593: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 9660K/9796K, paused 3ms+2ms, total 20ms
06-09 19:51:54.656: D/dalvikvm(8096): GC_FOR_ALLOC freed <1K, 2% free 9660K/9796K, paused 13ms, total 13ms
06-09 19:51:54.656: I/dalvikvm-heap(8096): Grow heap (frag case) to 9.897MB for 460816-byte allocation
06-09 19:51:54.671: D/dalvikvm(8096): GC_FOR_ALLOC freed 0K, 2% free 10110K/10248K, paused 16ms, total 16ms
06-09 19:51:54.679: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:54.687: D/dalvikvm(8096): GC_CONCURRENT freed <1K, 2% free 10110K/10248K, paused 2ms+1ms, total 13ms
06-09 19:51:54.703: W/ACodec(8096): Use baseline profile instead of 8 for AVC recording
06-09 19:51:54.703: I/ACodec(8096): setupVideoEncoder succeeded
06-09 19:51:55.257: D/dalvikvm(8096): GC_CONCURRENT freed 2K, 1% free 10501K/10576K, paused 4ms+2ms, total 32ms
06-09 19:51:55.359: I/OMXClient(8096): Using client-side OMX mux.
06-09 19:51:55.359: E/OMXNodeInstance(8096): OMX_GetExtensionIndex failed
06-09 19:51:56.187: W/System.err(8096): java.lang.IllegalStateException: The decoder input buffer is not big enough (nal=91280, capacity=65536).
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.check(EncoderDebugger.java:838)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.decode(EncoderDebugger.java:753)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:246)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.hw.EncoderDebugger.debug(EncoderDebugger.java:115)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testMediaCodecAPI(H264Stream.java:132)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.testH264(H264Stream.java:119)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.video.H264Stream.configure(H264Stream.java:111)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session.syncConfigure(Session.java:395)
06-09 19:51:56.187: W/System.err(8096):     at net.majorkernelpanic.streaming.Session$3.run(Session.java:371)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.handleCallback(Handler.java:725)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Handler.dispatchMessage(Handler.java:92)
06-09 19:51:56.187: W/System.err(8096):     at android.os.Looper.loop(Looper.java:137)
06-09 19:51:56.187: W/System.err(8096):     at android.os.HandlerThread.run(HandlerThread.java:60)

I have tried dozens of different combinations of resolution, frame rate, and bitrate. Everything I try ends in either "The decoder did not decode anything" or "The decoder input buffer is not big enough".
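For context on the second error: judging by the message in the log, the library is comparing the size of an encoded NAL unit against the capacity of the ByteBuffer the decoder offers for input. The check can be re-created in plain Java (a hypothetical sketch, not libstreaming's actual code, using the exact sizes from the log above):

```java
import java.nio.ByteBuffer;

public class BufferCheck {
    // Hypothetical re-creation of the size check implied by the log:
    // the encoded NAL unit must fit in the decoder's input buffer.
    static void checkCapacity(ByteBuffer inputBuffer, int nalLength) {
        if (nalLength > inputBuffer.capacity()) {
            throw new IllegalStateException(
                "The decoder input buffer is not big enough (nal=" + nalLength
                + ", capacity=" + inputBuffer.capacity() + ").");
        }
    }

    public static void main(String[] args) {
        ByteBuffer buf = ByteBuffer.allocate(65536); // capacity from the log
        try {
            checkCapacity(buf, 91280); // NAL size from the log
        } catch (IllegalStateException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

A 640x480 keyframe at this bitrate can easily exceed 64 KiB, which is why the larger resolutions trip this check while 128x96 fails later in the test instead.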

Has anyone been able to use this library out of the box? What is the cause of these errors, and what is the fix? If my search results are any indication, I seem to be the only person in the world with this problem. Thanks for any insight!

Here is the code from my MainActivity.java:

package com.cornet.cornetspydroid2;

import net.majorkernelpanic.streaming.Session;
import net.majorkernelpanic.streaming.SessionBuilder;
import net.majorkernelpanic.streaming.audio.AudioQuality;
import net.majorkernelpanic.streaming.gl.SurfaceView;
import net.majorkernelpanic.streaming.video.VideoQuality;
import android.app.Activity;
import android.app.Fragment;
import android.content.pm.ActivityInfo;
import android.os.Bundle;
import android.util.Log;
import android.view.LayoutInflater;
import android.view.Menu;
import android.view.MenuItem;
import android.view.SurfaceHolder;
import android.view.View;
import android.view.ViewGroup;
import android.view.WindowManager;

public class MainActivity extends Activity implements Session.Callback, SurfaceHolder.Callback {

    private static final String TAG = "MainActivity";

    private static final String ip = "10.3.1.204";
    private static final VideoQuality VIDEO_QUALITY = new VideoQuality(128,96,20,500000);

    private Session mSession;
    private SurfaceView mSurfaceView;

    @Override
    protected void onCreate(Bundle savedInstanceState) {

        super.onCreate(savedInstanceState);

        if (savedInstanceState == null) {
            getFragmentManager().beginTransaction().add(R.id.container, new PlaceholderFragment()).commit();
        }

        setContentView(R.layout.activity_main);

        setRequestedOrientation(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
        getWindow().addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);

    }

    public void start(View view) {

        if (mSession != null && mSession.isStreaming()) {
            Log.d(TAG, "Already streaming!");
            return;
        }

        Log.d(TAG, "Start");

        mSurfaceView = (SurfaceView)findViewById(R.id.surface);

        mSession = SessionBuilder.getInstance()
            .setCallback(this)
            .setSurfaceView(mSurfaceView)
            .setPreviewOrientation(90)
            .setContext(getApplicationContext())
            .setAudioEncoder(SessionBuilder.AUDIO_NONE)
            .setAudioQuality(new AudioQuality(16000, 32000))
            .setVideoEncoder(SessionBuilder.VIDEO_H264)
            .setVideoQuality(VIDEO_QUALITY)
            .setDestination(ip)
        .build();

        mSurfaceView.getHolder().addCallback(this);

        if (!mSession.isStreaming()) {
            mSession.configure();
        }

    }

    public void stop(View view) {

        Log.d(TAG, "Stop");

        if (mSession != null) {
            mSession.stop();
        }

        if (mSurfaceView != null) {
            mSurfaceView.getHolder().removeCallback(this);
        }

    }

    @Override
    public void onDestroy() {

        super.onDestroy();

        if (mSession != null) {
            mSession.release();
        }

    }

    @Override
    public void onPreviewStarted() {
        Log.d(TAG,"Preview started.");
    }

    @Override
    public void onSessionConfigured() {
        Log.d(TAG,"Preview configured.");
        // Once the stream is configured, you can get an SDP-formatted session description
        // that you can send to the receiver of the stream.
        // For example, to receive the stream in VLC, store the session description in a .sdp file
        // and open it with VLC while streaming.
        Log.d(TAG, mSession.getSessionDescription());
        mSession.start();
    }

    @Override
    public void onSessionStarted() {
        Log.d(TAG,"Session started.");
    }

    @Override
    public void onBitrareUpdate(long bitrate) {
        Log.d(TAG,"Bitrate: "+bitrate);
    }

    @Override
    public void onSessionError(int message, int streamType, Exception e) {
        if (e != null) {
            Log.e(TAG, e.getMessage(), e);
        }
    }

    @Override
    public void onSessionStopped() {
        Log.d(TAG,"Session stopped.");
    }

    @Override
    public void surfaceCreated(SurfaceHolder holder) {
        mSession.startPreview();
    }

    @Override
    public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) {
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder holder) {
        mSession.stop();
    }

    @Override
    public boolean onCreateOptionsMenu(Menu menu) {

        // Inflate the menu; this adds items to the action bar if it is present.
        getMenuInflater().inflate(R.menu.main, menu);
        return true;
    }

    @Override
    public boolean onOptionsItemSelected(MenuItem item) {
        // Handle action bar item clicks here. The action bar will
        // automatically handle clicks on the Home/Up button, so long
        // as you specify a parent activity in AndroidManifest.xml.
        int id = item.getItemId();
        if (id == R.id.action_settings) {
            return true;
        }
        return super.onOptionsItemSelected(item);
    }

    /**
     * A placeholder fragment containing a simple view.
     */
    public static class PlaceholderFragment extends Fragment {

        public PlaceholderFragment() {
        }

        @Override
        public View onCreateView(LayoutInflater inflater, ViewGroup container,
                Bundle savedInstanceState) {
            View rootView = inflater.inflate(R.layout.fragment_main, container,
                    false);
            return rootView;
        }
    }

}

Update: the MediaStream class in this library has a static initializer that looks up a class named "android.media.MediaCodec". When I force it to use sSuggestedMode = MODE_MEDIARECORDER_API instead of MediaCodec, no error occurs regardless of which resolution I choose, and Wireshark sees packets coming from the phone. For some reason, though, VLC cannot play the stream (udp/h264://@10.3.1.204:16420). This seems to indicate that the resolution I am choosing is not the problem; at least not directly.
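On the VLC point: libstreaming sends the video as RTP over UDP rather than as a raw H264 elementary stream, so a udp/h264:// URL will not demux it. Opening a .sdp file built from mSession.getSessionDescription() is the usual approach instead. A hand-written sketch of what such a file looks like (the addresses, port, and payload type here are illustrative assumptions — use the actual output of getSessionDescription()):

```
v=0
o=- 0 0 IN IP4 10.3.1.204
s=libstreaming session
c=IN IP4 10.3.1.204
t=0 0
m=video 16420 RTP/AVP 96
a=rtpmap:96 H264/90000
```

Saving this as stream.sdp and opening it in VLC tells the player which port to listen on and how to interpret the RTP payload.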

The error occurs in the Session.syncConfigure() call (it never even reaches Session.start()). It configures the audio stream successfully, but the Stream.configure() call for the video stream fails. The syncConfigure() call eventually reaches H264Stream.testMediaCodecAPI(), which calls EncoderDebugger.debug(). The debug() method is what throws the two original errors: the input buffer is not big enough, or the decoder did not decode anything.

Something that may shed light (included in the original logs above): on startup I always seem to get a debug error from the "dalvikvm" tag: "Could not find method android.media.MediaCodec.createInputSurface, referenced from method net.majorkernelpanic.streaming.video.VideoStream.encodeWithMediaCodecMethod2". Immediately after that log entry there is a warning, also from the "dalvikvm" tag: "VFY: unable to resolve virtual method 377: Landroid/media/MediaCodec;.createInputSurface ()Landroid/view/Surface;". Could this be related? Why can it find the MediaCodec class from the Class.forName() call in MediaStream, yet later warn that it cannot find a documented method of MediaCodec (createInputSurface)? My AndroidManifest.xml files (in both the main project and the library project) specify minSdkVersion 16 and targetSdkVersion 19. The MediaCodec class was added in API 16, so I should not be getting these warnings. Does this indicate that I have misconfigured something? Are these warnings related to my problem?
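The reason Class.forName() succeeds while the method lookup warns is that class resolution and method resolution are separate steps: the MediaCodec class exists on 4.2, but a method that was added in a later API level simply is not there at runtime. The same distinction can be demonstrated with plain reflection on a stand-in class (java.lang.String here, looking up a method name it obviously does not have):

```java
public class MethodResolution {
    public static void main(String[] args) throws Exception {
        // Step 1: class lookup succeeds -- the class itself exists.
        Class<?> c = Class.forName("java.lang.String");
        System.out.println("class found: " + c.getName());

        // Step 2: method lookup fails -- this particular method does not
        // exist on the class, just as createInputSurface() is absent from
        // MediaCodec on older Android builds even though the class is there.
        try {
            c.getMethod("createInputSurface");
        } catch (NoSuchMethodException e) {
            System.out.println("method missing: createInputSurface");
        }
    }
}
```

This is also why the VFY warning is harmless at startup: the VM only patches the unresolvable call site, and nothing breaks unless that code path is actually executed.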


4 Answers

Stack Overflow user

Answered on 2014-06-19 00:13:58

Actually, createInputSurface() is not a mandatory function for using the library; it is only needed when the MODE_MEDIACODEC_API_2 mode is used. Libstreaming should work on Android 4.1 and 4.2 as long as the MediaCodec API works correctly on the phone.

Explanation:

When libstreaming is used on Android 4.1 or 4.2, you do indeed see the VM complain in the logs that createInputSurface() does not exist. This does not crash the application, because the method is never actually called (unless you somehow force MODE_MEDIACODEC_API_2).

Now let me explain how the "decoder input buffer is not big enough" error can happen.

When libstreaming is used with MODE_MEDIACODEC_API at a resolution that has never been used on the user's phone before, it first tries to find out whether at least one of the encoders reachable through the MediaCodec API works properly at that resolution. To do that, it tries to encode and then decode a simple video with every encoder and decoder available on the phone. The error you mention occurs when a decoder fails to decode the H264 stream produced by an encoder.

If the test finishes without finding a working encoder/decoder pair, the phone is considered not to support that resolution, and libstreaming then tries to fall back to the MODE_MEDIARECORDER_API mode.
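The search described in the two paragraphs above can be sketched like this (a simplified stand-in, not EncoderDebugger's real code — the codec names and the worksTogether check are made up for illustration; the real check encodes a test frame and compares the decoded result):

```java
import java.util.Arrays;
import java.util.List;

public class PairSearch {
    // Try every encoder/decoder pair at the requested resolution until one
    // pair round-trips a test frame correctly; throw if none does.
    static String findWorkingPair(List<String> encoders, List<String> decoders) {
        for (String enc : encoders) {
            for (String dec : decoders) {
                if (worksTogether(enc, dec)) return enc + "+" + dec;
            }
        }
        // This corresponds to the "decoder did not decode anything" outcome.
        throw new RuntimeException("The decoder did not decode anything.");
    }

    // Stand-in predicate: pretend only the software (OMX.google) pair works.
    static boolean worksTogether(String enc, String dec) {
        return enc.contains("google") && dec.contains("google");
    }

    public static void main(String[] args) {
        List<String> encs = Arrays.asList(
            "OMX.TI.DUCATI1.VIDEO.H264E", "OMX.google.h264.encoder");
        List<String> decs = Arrays.asList(
            "OMX.TI.DUCATI1.VIDEO.DECODER", "OMX.google.h264.decoder");
        System.out.println(findWorkingPair(encs, decs));
    }
}
```

On a phone where every pair fails at a given resolution, the exception propagates up and triggers the fallback to MODE_MEDIARECORDER_API.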

To find out what those "modes" actually are, read the project's documentation, where I explain all of this at greater length.

Some important things to know about the test:

The result of the test is stored in a SharedPreference, so if you want it to run again you can clear the app's cache, or set the DEBUG boolean to true in EncoderDebugger.java. The test will also run again if the Android version changes (after an update, for example).

The reason this test exists is that the MediaCodec API is buggy as hell, as you probably already know if you have tried to use it.

(The test itself is implemented in the EncoderDebugger class, which you can look at on GitHub.)

What is happening on the OP's phone?

His phone does not pass the test with Android 4.2, but passes it with Android 4.3. The MediaCodec API was apparently patched on the phone in the meantime.

There may still be a way to improve the test so that it works on a Galaxy Nexus running Android 4.2, though. For example, only the following color formats are currently supported:

  • COLOR_FormatYUV420SemiPlanar
  • COLOR_FormatYUV420PackedSemiPlanar
  • COLOR_TI_FormatYUV420PackedSemiPlanar
  • COLOR_FormatYUV420Planar
  • COLOR_FormatYUV420PackedPlanar

(Disclaimer: I wrote the lib.)

Votes: 4

Stack Overflow user

Answered on 2014-06-11 16:02:42

There have been reports of problems with the supported video resolutions on the Galaxy Nexus. I never tried 128x96, and I no longer have access to the phone to check it. I did try 320x240, and it was broken on this device. 640x480 does work, but it may not be happy at 20 FPS. I suggest you try 15 FPS:

private static final VideoQuality VIDEO_QUALITY = new VideoQuality(640, 480, 15, 500000);
Votes: 1

Stack Overflow user

Answered on 2014-06-14 04:23:21

Finally found the problem! My Galaxy Nexus is running Android 4.2.2. Although the MediaCodec class exists in 4.2.2, the createInputSurface() method was not added until 4.3 (API 18). I tested with a device running 4.3 and it works fine.

Do not use this library with 4.1.x or 4.2.x devices unless you force sSuggestedMode = MODE_MEDIARECORDER_API in the MediaStream static initializer.

Votes: 1
Page content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/24128279
