Android WebRTC: recording the video of a stream from another peer


Overview: I'm developing a WebRTC video-call Android application, and it works fine. I need to record the video of the other peer (remoteVideoStream) and of my own stream (localVideoStream), and convert the recording into some saveable format such as mp4 (or any other format). I've searched for this but couldn't figure out how to get it done.

I've read about VideoFileRenderer and tried adding it to my code to save the video, but I couldn't use it either: it has no method to call such as record() or save(), although it does have a method called release() which appears to be what finishes saving the video. If anyone has any idea, here is the class:

```java
@JNINamespace("webrtc::jni")
public class VideoFileRenderer implements Callbacks, VideoSink {
  private static final String TAG = "VideoFileRenderer";
  private final HandlerThread renderThread;
  private final Handler renderThreadHandler;
  private final FileOutputStream videoOutFile;
  private final String outputFileName;
  private final int outputFileWidth;
  private final int outputFileHeight;
  private final int outputFrameSize;
  private final ByteBuffer outputFrameBuffer;
  private EglBase eglBase;
  private YuvConverter yuvConverter;
  private ArrayList<ByteBuffer> rawFrames = new ArrayList<>();

  public VideoFileRenderer(String outputFile, int outputFileWidth, int outputFileHeight,
      final Context sharedContext) throws IOException {
    if (outputFileWidth % 2 != 1 && outputFileHeight % 2 != 1) {
      this.outputFileName = outputFile;
      this.outputFileWidth = outputFileWidth;
      this.outputFileHeight = outputFileHeight;
      this.outputFrameSize = outputFileWidth * outputFileHeight * 3 / 2;
      this.outputFrameBuffer = ByteBuffer.allocateDirect(this.outputFrameSize);
      this.videoOutFile = new FileOutputStream(outputFile);
      this.videoOutFile.write(
          ("YUV4MPEG2 C420 W" + outputFileWidth + " H" + outputFileHeight + " Ip F30:1 A1:1\n")
              .getBytes(Charset.forName("US-ASCII")));
      this.renderThread = new HandlerThread("VideoFileRenderer");
      this.renderThread.start();
      this.renderThreadHandler = new Handler(this.renderThread.getLooper());
      ThreadUtils.invokeAtFrontUninterruptibly(this.renderThreadHandler, new Runnable() {
        public void run() {
          VideoFileRenderer.this.eglBase = EglBase.create(sharedContext, EglBase.CONFIG_PIXEL_BUFFER);
          VideoFileRenderer.this.eglBase.createDummyPbufferSurface();
          VideoFileRenderer.this.eglBase.makeCurrent();
          VideoFileRenderer.this.yuvConverter = new YuvConverter();
        }
      });
    } else {
      throw new IllegalArgumentException("Does not support uneven width or height");
    }
  }

  public void renderFrame(I420Frame i420Frame) {
    VideoFrame frame = i420Frame.toVideoFrame();
    this.onFrame(frame);
    frame.release();
  }

  public void onFrame(VideoFrame frame) {
    frame.retain();
    this.renderThreadHandler.post(() -> {
      this.renderFrameOnRenderThread(frame);
    });
  }

  private void renderFrameOnRenderThread(VideoFrame frame) {
    Buffer buffer = frame.getBuffer();
    int targetWidth = frame.getRotation() % 180 == 0 ? this.outputFileWidth : this.outputFileHeight;
    int targetHeight = frame.getRotation() % 180 == 0 ? this.outputFileHeight : this.outputFileWidth;
    float frameAspectRatio = (float) buffer.getWidth() / (float) buffer.getHeight();
    float fileAspectRatio = (float) targetWidth / (float) targetHeight;
    int cropWidth = buffer.getWidth();
    int cropHeight = buffer.getHeight();
    if (fileAspectRatio > frameAspectRatio) {
      cropHeight = (int) ((float) cropHeight * (frameAspectRatio / fileAspectRatio));
    } else {
      cropWidth = (int) ((float) cropWidth * (fileAspectRatio / frameAspectRatio));
    }
    int cropX = (buffer.getWidth() - cropWidth) / 2;
    int cropY = (buffer.getHeight() - cropHeight) / 2;
    Buffer scaledBuffer =
        buffer.cropAndScale(cropX, cropY, cropWidth, cropHeight, targetWidth, targetHeight);
    frame.release();
    I420Buffer i420 = scaledBuffer.toI420();
    scaledBuffer.release();
    ByteBuffer byteBuffer = JniCommon.nativeAllocateByteBuffer(this.outputFrameSize);
    YuvHelper.I420Rotate(i420.getDataY(), i420.getStrideY(), i420.getDataU(), i420.getStrideU(),
        i420.getDataV(), i420.getStrideV(), byteBuffer, i420.getWidth(), i420.getHeight(),
        frame.getRotation());
    i420.release();
    byteBuffer.rewind();
    this.rawFrames.add(byteBuffer);
  }

  public void release() {
    CountDownLatch cleanupBarrier = new CountDownLatch(1);
    this.renderThreadHandler.post(() -> {
      this.yuvConverter.release();
      this.eglBase.release();
      this.renderThread.quit();
      cleanupBarrier.countDown();
    });
    ThreadUtils.awaitUninterruptibly(cleanupBarrier);
    try {
      Iterator var2 = this.rawFrames.iterator();
      while (var2.hasNext()) {
        ByteBuffer buffer = (ByteBuffer) var2.next();
        this.videoOutFile.write("FRAME\n".getBytes(Charset.forName("US-ASCII")));
        byte[] data = new byte[this.outputFrameSize];
        buffer.get(data);
        this.videoOutFile.write(data);
        JniCommon.nativeFreeByteBuffer(buffer);
      }
      this.videoOutFile.close();
      Logging.d("VideoFileRenderer", "Video written to disk as " + this.outputFileName
          + ". Number frames are " + this.rawFrames.size() + " and the dimension of the frames are "
          + this.outputFileWidth + "x" + this.outputFileHeight + ".");
    } catch (IOException var5) {
      Logging.e("VideoFileRenderer", "Error writing video to disk", var5);
    }
  }
}
```

I can't find any method in it that would help.
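For context, a class like the one above would normally be wired up as a sink on a VideoTrack. A minimal sketch of that wiring, assuming a `remoteVideoTrack` and an `eglBase` already obtained during call setup (the output path here is hypothetical), might look like:

```java
// Sketch only: VideoFileRenderer above writes raw I420 frames in Y4M format,
// not an encoded mp4, and buffers every frame in memory until release().
VideoFileRenderer renderer = new VideoFileRenderer(
    "/sdcard/remote_video.y4m",   // hypothetical output path
    640, 480,                     // width/height must be even
    eglBase.getEglBaseContext()); // shared EGL context from the call setup

remoteVideoTrack.addSink(renderer); // frames now flow into the renderer

// ... later, when recording should stop:
remoteVideoTrack.removeSink(renderer);
renderer.release(); // flushes all buffered frames to the .y4m file
```

Note that even with this wiring, the result is an uncompressed .y4m file, which is exactly the limitation the answer below addresses.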

Solution: The VideoFileRenderer class only demonstrates how to access the decoded raw video frames of the remote/local peer.
It does not record a valid video file.
You have to implement the logic of encoding and muxing the raw video frames into a container, such as mp4, yourself.

The main flow looks like this:

> Switch to the latest WebRTC version (currently v.0.0.25331)
> Create a video container; see, for example, the MediaMuxer class from the Android SDK
> Implement the VideoSink interface to obtain raw frames from a particular video source; see, for example, the ProxyVideoSink class in apprtc/CallActivity.java
> Encode every frame with MediaCodec and write it into the video container
> Finalize the muxer
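The encode-and-mux steps above could be sketched roughly as follows. This is a simplified, untested outline, not a working implementation: per-frame buffer handling, presentation timestamps, and error paths are omitted, and the output path is hypothetical.

```java
// Hypothetical outline: encoding raw I420 frames into an mp4 with MediaCodec + MediaMuxer.
MediaFormat format = MediaFormat.createVideoFormat("video/avc", width, height);
format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
    MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);
format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();

MediaMuxer muxer = new MediaMuxer("/sdcard/call.mp4",
    MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
int trackIndex = -1;

// For each raw frame delivered through your VideoSink implementation:
//   1. queue the I420 bytes into an encoder input buffer,
//   2. drain the encoder's output buffers,
//   3. on MediaCodec.INFO_OUTPUT_FORMAT_CHANGED, call
//      trackIndex = muxer.addTrack(encoder.getOutputFormat()) and muxer.start(),
//   4. write each encoded sample with muxer.writeSampleData(trackIndex, buffer, bufferInfo).

// When the call ends:
encoder.stop();
encoder.release();
muxer.stop();
muxer.release();
```

The key design point is that the muxer track can only be added after the encoder reports its actual output format, which happens on the first drain, not at configure time.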

Original article: http://outofmemory.cn/web/1136052.html
