Question:

Android: How to decode video with MediaCodec (encode without writing a file)

米项禹
2023-03-14

The overall flow is: capture video data from the camera, encode it, then decode it and display it on a SurfaceView.
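For reference, the encoder feeding this pipeline might be set up roughly as follows. This is a minimal sketch, not code from the question: MIME_TYPE, mEncoder, mEncoderInputSurface and the chosen resolution/bitrate are assumptions.

    // Sketch of an H.264 encoder whose input Surface receives the camera frames.
    private static final String MIME_TYPE = "video/avc";

    private MediaCodec mEncoder;
    private Surface mEncoderInputSurface;   // the camera renders into this

    private void prepareEncoder(int width, int height) throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
        format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
                MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
        format.setInteger(MediaFormat.KEY_BIT_RATE, 2000000);
        format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

        mEncoder = MediaCodec.createEncoderByType(MIME_TYPE);
        mEncoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mEncoderInputSurface = mEncoder.createInputSurface();
        mEncoder.start();
    }

The question's drainEncoder() is below.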

    /**
     * Drains all pending output from the encoder, and adds it to the circular buffer.
     * <p>
     */
    public void drainEncoder() {
        final int TIMEOUT_USEC = 0;     // no timeout -- check for buffers, bail if none

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();
        while (true) {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                break;
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Should happen before receiving buffers, and should only happen once.
                // The MediaFormat contains the csd-0 and csd-1 keys, which we'll need
                // for MediaMuxer.  It's unclear what else MediaMuxer might want, so
                // rather than extract the codec-specific data and reconstruct a new
                // MediaFormat later, we just grab it here and keep it around.

                mEncodedFormat = mEncoder.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + mEncodedFormat);
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                        encoderStatus);
                // let's ignore it
            } else {
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // The codec config data was pulled out when we got the
                    // INFO_OUTPUT_FORMAT_CHANGED status.  The MediaMuxer won't accept
                    // a single big blob -- it wants separate csd-0/csd-1 chunks --
                    // so simply saving this off won't work.

                    if (VERBOSE) Log.d(TAG, "ignoring BUFFER_FLAG_CODEC_CONFIG");
                    mBufferInfo.size = 0;
                }

                if (mBufferInfo.size != 0) {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    encodedData.position(mBufferInfo.offset);
                    encodedData.limit(mBufferInfo.offset + mBufferInfo.size);

                    mEncBuffer.add(encodedData, mBufferInfo.flags,
                            mBufferInfo.presentationTimeUs);

                    if (VERBOSE) {
                        Log.d(TAG, "sent " + mBufferInfo.size + " bytes to muxer, ts=" +
                                mBufferInfo.presentationTimeUs);
                    }

                }


                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                    // Codec config info.  Only expected on first packet.  One way to
                    // handle this is to manually stuff the data into the MediaFormat
                    // and pass that to configure().  We do that here to exercise the API.
                    assertFalse(decoderConfigured);
                    try {
                        decoder = MediaCodec.createDecoderByType(MIME_TYPE);
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                    MediaFormat format =
                            MediaFormat.createVideoFormat(MIME_TYPE, 80, 80);
                    format.setByteBuffer("csd-0", encodedData);
                    decoder.configure(format, mSurfaceView.getHolder().getSurface(),
                            null, 0);
                    decoder.start();
                    decoderInputBuffers = decoder.getInputBuffers();
                    decoderOutputBuffers = decoder.getOutputBuffers();
                    decoderConfigured = true;
                    if (VERBOSE)
                        Log.d(TAG, "decoder configured (" + mBufferInfo.size + "bytes)");
                } else {
                    // Get a decoder input buffer, blocking until it's available.
                    assertTrue(decoderConfigured);
                    int inputBufIndex = decoder.dequeueInputBuffer(-1);
                    ByteBuffer inputBuf = decoderInputBuffers[inputBufIndex];
                    inputBuf.clear();
                    inputBuf.put(encodedData);
                    decoder.queueInputBuffer(inputBufIndex, 0, mBufferInfo.size,
                            mBufferInfo.presentationTimeUs, mBufferInfo.flags);
                    // encoderDone = (mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0;
                    // if (VERBOSE) Log.d(TAG, "passed" + mBufferInfo.size + "bytes to decoder"
                    //        + (encoderDone ? "(EOS)" : ""));
                }


                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.w(TAG, "reached end of stream unexpectedly");
                    break;      // out of while
                }
            }

        }

        // Check for output from the decoder.  We want to do this on every loop to avoid
        // the possibility of stalling the pipeline.  We use a short timeout to avoid
        // burning CPU if the decoder is hard at work but the next frame isn't quite ready.
        //
        // If we're decoding to a Surface, we'll get notified here as usual but the
        // ByteBuffer references will be null.  The data is sent to Surface instead.
        if (decoderConfigured) {
            int decoderStatus = decoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                if (VERBOSE) Log.d(TAG, "no output from decoder available");
            } else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // The storage associated with the direct ByteBuffer may already be unmapped,
                // so attempting to access data through the old output buffer array could
                // lead to a native crash.
                if (VERBOSE) Log.d(TAG, "decoder output buffers changed");
                decoderOutputBuffers = decoder.getOutputBuffers();
            } else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // this happens before the first frame is returned
                decoderOutputFormat = decoder.getOutputFormat();
                if (VERBOSE) Log.d(TAG, "decoder output format changed:" +
                        decoderOutputFormat);
            } else if (decoderStatus < 0) {
                fail("unexpected result from deocder.dequeueOutputBuffer:" + decoderStatus);
            } else {  // decoderStatus >= 0
                if (!toSurface) {
                    ByteBuffer outputFrame = decoderOutputBuffers[decoderStatus];
                    outputFrame.position(mBufferInfo.offset);
                    outputFrame.limit(mBufferInfo.offset + mBufferInfo.size);
                    rawSize += mBufferInfo.size;
                    if (mBufferInfo.size == 0) {
                        if (VERBOSE) Log.d(TAG, "got empty frame");
                    } else {
                        // if (VERBOSE) Log.d(TAG, "decoded, checking frame" + checkIndex);
                        // assertEquals("Wrong time stamp", computePresentationTime(checkIndex),
                        //        mBufferInfo.presentationTimeUs);
                        // if (!checkFrame(checkIndex++, decoderOutputFormat, outputFrame)) {
                        //    badFrames++;
                        // }
                    }
                    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        if (VERBOSE) Log.d(TAG, "output EOS");
                        outputDone = true;
                    }
                    decoder.releaseOutputBuffer(decoderStatus, false /*render*/);
                } else {
                    if (VERBOSE) Log.d(TAG, "surface decoder given buffer" + decoderStatus +
                            "(size=" + mBufferInfo.size + ")");
                    rawSize += mBufferInfo.size;
                    if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                        if (VERBOSE) Log.d(TAG, "output EOS");
                        outputDone = true;
                    }
                    boolean doRender = (mBufferInfo.size != 0);
                    // As soon as we call releaseOutputBuffer, the buffer will be forwarded
                    // to SurfaceTexture to convert to a texture.  The API doesn't guarantee
                    // that the texture will be available before the call returns, so we
                    // need to wait for the onFrameAvailable callback to fire.
                    decoder.releaseOutputBuffer(decoderStatus, doRender);
                    // if (doRender) {
                    //    if (VERBOSE) Log.d(TAG, "awaiting frame" + checkIndex);
                    //    assertEquals("Wrong time stamp", computePresentationTime(checkIndex),
                    //            info.presentationTimeUs);
                    //    outputSurface.awaitNewImage();
                    //    outputSurface.drawImage();
                    //    if (!checkSurfaceFrame(checkIndex++)) {
                    //        badFrames++;
                    //    }
                }
            }
        }
    }

1 Answer

钱展
2023-03-14

The problem is solved:
1. Configure the decoder with csd-0 and csd-1.
2. The data previously passed to the decoder was incorrect.
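As the comments in the question's code already note, csd-0 and csd-1 are also available directly from the MediaFormat returned at INFO_OUTPUT_FORMAT_CHANGED, so an alternative to splitting SPS/PPS by hand is roughly the sketch below (it assumes the mEncodedFormat saved in the question, the decoder, MIME_TYPE and mSurfaceView fields, and a known width/height):

    // Sketch: configure the decoder from the encoder's output format instead of
    // parsing the SPS/PPS manually.
    MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, width, height);
    format.setByteBuffer("csd-0", mEncodedFormat.getByteBuffer("csd-0"));   // SPS
    format.setByteBuffer("csd-1", mEncodedFormat.getByteBuffer("csd-1"));   // PPS

    decoder = MediaCodec.createDecoderByType(MIME_TYPE);
    decoder.configure(format, mSurfaceView.getHolder().getSurface(), null, 0);
    decoder.start();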

Here is the new code:

    public void drainEncoder() {
        final int TIMEOUT_USEC = 0;     // no timeout -- check for buffers, bail if none
        int pos = 0;

        ByteBuffer[] encoderOutputBuffers = mEncoder.getOutputBuffers();

        while (true) {
            int encoderStatus = mEncoder.dequeueOutputBuffer(mBufferInfo, TIMEOUT_USEC);
            if (encoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                // no output available yet
                break;
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                // not expected for an encoder
                encoderOutputBuffers = mEncoder.getOutputBuffers();
            } else if (encoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                // Should happen before receiving buffers, and should only happen once.
                // The MediaFormat contains the csd-0 and csd-1 keys, which we'll need
                // for MediaMuxer.  It's unclear what else MediaMuxer might want, so
                // rather than extract the codec-specific data and reconstruct a new
                // MediaFormat later, we just grab it here and keep it around.

                mEncodedFormat = mEncoder.getOutputFormat();
                Log.d(TAG, "encoder output format changed: " + mEncodedFormat);
            } else if (encoderStatus < 0) {
                Log.w(TAG, "unexpected result from encoder.dequeueOutputBuffer: " +
                        encoderStatus);
                // let's ignore it
            } else { // >= 0
                ByteBuffer encodedData = encoderOutputBuffers[encoderStatus];
                if (encodedData == null) {
                    throw new RuntimeException("encoderOutputBuffer " + encoderStatus +
                            " was null");
                }

                // Copy the encoded bytes out of the codec buffer, honoring the
                // offset/size reported in mBufferInfo.
                encodedData.position(mBufferInfo.offset);
                encodedData.limit(mBufferInfo.offset + mBufferInfo.size);
                byte[] outData = new byte[mBufferInfo.size];
                encodedData.get(outData);

                if (arrInfo != null) {
                    // SPS/PPS already captured: buffer this frame's data for the decoder.
                    System.arraycopy(outData, 0, arrOutput, pos, outData.length);
                    pos += outData.length;
                } else {
                    // First buffer: if it starts with an Annex-B start code, treat it as
                    // the codec config (SPS + PPS) and split it into arrSps/arrPps.
                    ByteBuffer spsPpsBuffer = ByteBuffer.wrap(outData);
                    if (spsPpsBuffer.getInt() == 0x00000001) {
                        arrInfo = new byte[outData.length];
                        System.arraycopy(outData, 0, arrInfo, 0, outData.length);

                        findSpsAndPps(arrInfo);
                    }
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {


                    MediaFormat format = MediaFormat.createVideoFormat(MIME_TYPE, 120, 120);

                    format.setByteBuffer("csd-0", ByteBuffer.wrap(arrSps));
                    format.setByteBuffer("csd-1", ByteBuffer.wrap(arrPps));

                    decoder.configure(format, mSurfaceView.getHolder().getSurface(), null, 0);
                    decoder.start();

                    decoderConfigured = true;

                } else {
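                    // Regular encoded frame: hand the buffered encoded bytes to the decoder.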


                    ByteBuffer[] decoderInputBuffers = decoder.getInputBuffers();

                    int inputBufferIndex = decoder.dequeueInputBuffer(-1);
                    if (inputBufferIndex >= 0) {
                        ByteBuffer inputBuffer = decoderInputBuffers[inputBufferIndex];
                        inputBuffer.clear();
                        inputBuffer.put(arrOutput, 0, pos);
                        // queueInputBuffer expects the timestamp in microseconds, so reuse
                        // the encoder's presentation time instead of wall-clock millis.
                        decoder.queueInputBuffer(inputBufferIndex, 0, pos,
                                mBufferInfo.presentationTimeUs, 0);
                    }
                }

                mEncoder.releaseOutputBuffer(encoderStatus, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                    Log.w(TAG, "reached end of stream unexpectedly");
                    break;      // out of while
                }
            }

            if (decoderConfigured) {

                int decoderStatus = decoder.dequeueOutputBuffer(mBufferInfo, 10000);
                Log.i("decoderStatus", "decoderStatus =====" + decoderStatus);
                if (decoderStatus == MediaCodec.INFO_TRY_AGAIN_LATER) {
                    if (VERBOSE) Log.d(TAG, "no output from decoder available");
                } else if (decoderStatus == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                    if (VERBOSE) Log.d(TAG, "decoder output buffers changed");
                } else if (decoderStatus == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                    MediaFormat newFormat = decoder.getOutputFormat();
                    if (VERBOSE) Log.d(TAG, "decoder output format changed: " + newFormat);
                } else if (decoderStatus < 0) {
                    throw new RuntimeException(
                            "unexpected result from decoder.dequeueOutputBuffer: " +
                                    decoderStatus);
                } else { // decoderStatus >= 0
                    if (VERBOSE) Log.d(TAG, "surface decoder given buffer " + decoderStatus +
                            " (size=" + mBufferInfo.size + ")");

                    boolean doRender = (mBufferInfo.size != 0);
                    decoder.releaseOutputBuffer(decoderStatus, doRender);
                    // decoder.flush();    // reset decoder state 
                }
            }

        }
    }
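The answer relies on a findSpsAndPps() helper that is not shown. A possible implementation, assuming the codec-config buffer is Annex-B data (SPS followed by PPS, each prefixed with a 0x00000001 start code) and that arrSps/arrPps are the byte-array fields used above, could look like this sketch:

    // Sketch: split the codec-config buffer into arrSps (start code + SPS) and
    // arrPps (start code + PPS).
    private void findSpsAndPps(byte[] config) {
        int ppsStart = -1;
        // Find the start code of the second NAL unit (the PPS).
        for (int i = 4; i + 3 < config.length; i++) {
            if (config[i] == 0 && config[i + 1] == 0
                    && config[i + 2] == 0 && config[i + 3] == 1) {
                ppsStart = i;
                break;
            }
        }
        if (ppsStart < 0) {
            throw new IllegalArgumentException("PPS start code not found");
        }
        arrSps = new byte[ppsStart];
        System.arraycopy(config, 0, arrSps, 0, ppsStart);
        arrPps = new byte[config.length - ppsStart];
        System.arraycopy(config, ppsStart, arrPps, 0, arrPps.length);
    }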