android: SurfaceTexture, camera frame wait timed out


I'm trying to use MediaCodec and MediaMuxer, and I've run into some trouble.

Here is the error from logcat:

12-13 11:59:58.238: E/AndroidRuntime(23218): FATAL EXCEPTION: main
12-13 11:59:58.238: E/AndroidRuntime(23218): java.lang.RuntimeException: Unable to resume activity {com.brendon.cameratompeg/com.brendon.cameratompeg.CameraToMpeg}: java.lang.IllegalStateException: Can't stop due to wrong state.
12-13 11:59:58.238: E/AndroidRuntime(23218):    at android.app.ActivityThread.performResumeActivity(ActivityThread.java:2918)

The code fails at "mStManager.awaitNewImage();", which is called from onResume(). Logcat says "Camera frame wait timed out".
mStManager is an instance of the SurfaceTextureManager class, and the "Camera frame wait timed out" message comes from its awaitNewImage() function. I've added that class to my post below.

Part of my code looks like this (the onCreate and onResume functions):

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        // arbitrary but popular values
        int encWidth = 640;
        int encHeight = 480;
        int encBitRate = 6000000;      // Mbps
        Log.d(TAG, MIME_TYPE + " output " + encWidth + "x" + encHeight + " @" + encBitRate);
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_camera_to_mpeg);
        prepareCamera(encWidth, encHeight);
        prepareEncoder(encWidth, encHeight, encBitRate);
        mInputSurface.makeCurrent();
        prepareSurfaceTexture();
        mCamera.startPreview();
    }

    @Override
    public void onResume() {
        try {
            long startWhen = System.nanoTime();
            long desiredEnd = startWhen + DURATION_SEC * 1000000000L;
            SurfaceTexture st = mStManager.getSurfaceTexture();
            int frameCount = 0;
            while (System.nanoTime() < desiredEnd) {
                // Feed any pending encoder output into the muxer.
                drainEncoder(false);

                // Switch up the colors every 15 frames.  Besides demonstrating the use of
                // fragment shaders for video editing, this provides a visual indication of
                // the frame rate: if the camera is capturing at 15fps, the colors will change
                // once per second.
                if ((frameCount % 15) == 0) {
                    String fragmentShader = null;
                    if ((frameCount & 0x01) != 0) {
                        fragmentShader = SWAPPED_FRAGMENT_SHADER;
                    }
                    mStManager.changeFragmentShader(fragmentShader);
                }
                frameCount++;

                // Acquire a new frame of input, and render it to the Surface.  If we had a
                // GLSurfaceView we could switch EGL contexts and call drawImage() a second
                // time to render it on screen.  The texture can be shared between contexts by
                // passing the GLSurfaceView's EGLContext as eglCreateContext()'s share_context
                // argument.
                mStManager.awaitNewImage();
                mStManager.drawImage();

                // Set the presentation time stamp from the SurfaceTexture's time stamp.  This
                // will be used by MediaMuxer to set the PTS in the video.
                if (VERBOSE) {
                    Log.d(TAG, "present: " +
                            ((st.getTimestamp() - startWhen) / 1000000.0) + "ms");
                }
                mInputSurface.setPresentationTime(st.getTimestamp());

                // Submit it to the encoder.  The eglSwapBuffers call will block if the input
                // is full, which would be bad if it stayed full until we dequeued an output
                // buffer (which we can't do, since we're stuck here).  So long as we fully drain
                // the encoder before supplying additional input, the system guarantees that we
                // can supply another frame without blocking.
                if (VERBOSE) Log.d(TAG, "sending frame to encoder");
                mInputSurface.swapBuffers();
            }
            // send end-of-stream to encoder, and drain remaining output
            drainEncoder(true);
        } catch (Exception e) {
            Log.d(TAG, e.getMessage());
            // release everything we grabbed
            releaseCamera();
            releaseEncoder();
            releaseSurfaceTexture();
        }
    }

The class in my code that the error relates to:

    private static class SurfaceTextureManager
            implements SurfaceTexture.OnFrameAvailableListener {
        private SurfaceTexture mSurfaceTexture;
        private CameraToMpeg.STextureRender mTextureRender;
        private Object mFrameSyncObject = new Object();     // guards mFrameAvailable
        private boolean mFrameAvailable;

        /**
         * Creates instances of TextureRender and SurfaceTexture.
         */
        public SurfaceTextureManager() {
            mTextureRender = new CameraToMpeg.STextureRender();
            mTextureRender.surfaceCreated();
            if (VERBOSE) Log.d(TAG, "textureID=" + mTextureRender.getTextureID());
            mSurfaceTexture = new SurfaceTexture(mTextureRender.getTextureID());

            // This doesn't work if this object is created on the thread that CTS started for
            // these test cases.
            //
            // The CTS-created thread has a Looper, and the SurfaceTexture constructor will
            // create a Handler that uses it.  The "frame available" message is delivered
            // there, but since we're not a Looper-based thread we'll never see it.  For
            // this to do anything useful, OutputSurface must be created on a thread without
            // a Looper, so that SurfaceTexture uses the main application Looper instead.
            //
            // Java language note: passing "this" out of a constructor is generally unwise,
            // but we should be able to get away with it here.
            mSurfaceTexture.setOnFrameAvailableListener(this);
        }

        public void release() {
            // this causes a bunch of warnings that appear harmless but might confuse someone:
            //  W BufferQueue: [unnamed-3997-2] cancelBuffer: BufferQueue has been abandoned!
            //mSurfaceTexture.release();
            mTextureRender = null;
            mSurfaceTexture = null;
        }

        /**
         * Returns the SurfaceTexture.
         */
        public SurfaceTexture getSurfaceTexture() {
            return mSurfaceTexture;
        }

        /**
         * Replaces the fragment shader.
         */
        public void changeFragmentShader(String fragmentShader) {
            mTextureRender.changeFragmentShader(fragmentShader);
        }

        /**
         * Latches the next buffer into the texture.  Must be called from the thread that created
         * the OutputSurface object.
         */
        public void awaitNewImage() {
            final int TIMEOUT_MS = 2500;
            synchronized (mFrameSyncObject) {
                while (!mFrameAvailable) {
                    try {
                        // Wait for onFrameAvailable() to signal us.  Use a timeout to avoid
                        // stalling the test if it doesn't arrive.
                        mFrameSyncObject.wait(TIMEOUT_MS);
                        if (!mFrameAvailable) {
                            // TODO: if "spurious wakeup", continue while loop
                            throw new RuntimeException("Camera frame wait timed out");
                        }
                    } catch (InterruptedException ie) {
                        // shouldn't happen
                        throw new RuntimeException(ie);
                    }
                }
                mFrameAvailable = false;
            }
            // Latch the data.
            mTextureRender.checkGlError("before updateTexImage");
            mSurfaceTexture.updateTexImage();
        }

        /**
         * Draws the data from SurfaceTexture onto the current EGL surface.
         */
        public void drawImage() {
            mTextureRender.drawFrame(mSurfaceTexture);
        }

        @Override
        public void onFrameAvailable(SurfaceTexture st) {
            if (VERBOSE) Log.d(TAG, "new frame available");
            synchronized (mFrameSyncObject) {
                if (mFrameAvailable) {
                    throw new RuntimeException("mFrameAvailable already set, frame could be dropped");
                }
                mFrameAvailable = true;
                mFrameSyncObject.notifyAll();
            }
        }
    }
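As an aside, the awaitNewImage() / onFrameAvailable() pair above is a standard guarded-wait handshake, and the timeout path that produces "Camera frame wait timed out" can be exercised without any Android classes. Here is a minimal, Android-free sketch of the same pattern (the class and method names are hypothetical, for illustration only):

```java
// Minimal sketch of the guarded-wait handshake used by
// SurfaceTextureManager.awaitNewImage() / onFrameAvailable().
// FrameWaiter and its method names are hypothetical.
public class FrameWaiter {
    private final Object frameSyncObject = new Object(); // guards frameAvailable
    private boolean frameAvailable;

    /** Producer side: equivalent of onFrameAvailable(). */
    public void signalFrame() {
        synchronized (frameSyncObject) {
            frameAvailable = true;
            frameSyncObject.notifyAll();
        }
    }

    /**
     * Consumer side: equivalent of awaitNewImage().  Returns true if a frame
     * arrived within timeoutMs, false on timeout.  Re-checks the flag in a
     * loop against a deadline so a spurious wakeup does not end the wait early.
     */
    public boolean awaitFrame(long timeoutMs) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMs;
        synchronized (frameSyncObject) {
            while (!frameAvailable) {
                long remaining = deadline - System.currentTimeMillis();
                if (remaining <= 0) {
                    return false; // the "Camera frame wait timed out" case
                }
                frameSyncObject.wait(remaining);
            }
            frameAvailable = false; // consume the frame
            return true;
        }
    }

    public static void main(String[] args) throws Exception {
        FrameWaiter waiter = new FrameWaiter();
        // Producer thread signals a "frame" after 50 ms.
        new Thread(() -> {
            try { Thread.sleep(50); } catch (InterruptedException ignored) {}
            waiter.signalFrame();
        }).start();
        System.out.println("first wait: " + waiter.awaitFrame(2000));
        System.out.println("second wait: " + waiter.awaitFrame(100));
    }
}
```

The first wait succeeds because the producer signals within the timeout; the second times out because nothing signals again, which is exactly what happens to awaitNewImage() when the "frame available" message is delivered to a different thread's Handler.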

Does anyone have any ideas? Thanks!

Solution:

I've run into this problem too. The cause is that your code is running on a thread that has a Looper; you have to make sure it runs on a thread without one. Otherwise, SurfaceTexture.OnFrameAvailableListener delivers the "frame available" message to the Handler on that Looper thread instead of waking your waiting thread, and you get stuck until the timeout fires.

Bigflake's example shows you the details:

/**
 * Wraps testEditVideo, running it in a new thread.  Required because of the way
 * SurfaceTexture.OnFrameAvailableListener works when the current thread has a Looper
 * configured.
 */
private static class VideoEditWrapper implements Runnable {
    private Throwable mThrowable;
    private DecodeEditEncodeTest mTest;

    private VideoEditWrapper(DecodeEditEncodeTest test) {
        mTest = test;
    }

    @Override
    public void run() {
        try {
            mTest.videoEditTest();
        } catch (Throwable th) {
            mThrowable = th;
        }
    }

    /** Entry point. */
    public static void runTest(DecodeEditEncodeTest obj) throws Throwable {
        VideoEditWrapper wrapper = new VideoEditWrapper(obj);
        Thread th = new Thread(wrapper, "codec test");
        th.start();
        th.join();
        if (wrapper.mThrowable != null) {
            throw wrapper.mThrowable;
        }
    }
}
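The wrapper works because the new thread never calls Looper.prepare(), so SurfaceTexture's "frame available" callback is not routed to a Handler on that thread, and any Throwable from the worker is re-thrown to the caller after join(). The thread-and-rethrow mechanic itself is plain Java; here is a minimal sketch (NoLooperRunner is a hypothetical helper name; on Android the Runnable body would be the encode loop currently sitting in onResume()):

```java
// Plain-Java sketch of the run-on-a-fresh-thread-and-rethrow pattern used by
// VideoEditWrapper above.  NoLooperRunner is a hypothetical name.
public class NoLooperRunner {
    /**
     * Runs task on a fresh thread (which has no Looper), waits for it to
     * finish, and rethrows anything it threw on the calling thread.
     */
    public static void runOnPlainThread(Runnable task) throws Throwable {
        final Throwable[] thrown = new Throwable[1];
        Thread t = new Thread(() -> {
            try {
                task.run();
            } catch (Throwable th) {
                thrown[0] = th; // capture so the caller sees the failure
            }
        }, "codec test");
        t.start();
        t.join();
        if (thrown[0] != null) {
            throw thrown[0];
        }
    }

    public static void main(String[] args) throws Throwable {
        runOnPlainThread(() ->
                System.out.println("ran on: " + Thread.currentThread().getName()));
    }
}
```

Moving the capture loop out of onResume() and into such a thread also keeps the long-running encode work off the UI thread, which an Activity lifecycle callback should never block on anyway.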