How to use AVSampleBufferDisplayLayer in iOS 8 for RTP H264 Streams with GStreamer?


Overview: Now that programmers have access to the hardware H.264 decoder in iOS 8, I want to use it. There is a good introduction to direct access to video encoding and decoding from WWDC 2014; you can take a look here.

Based on Case 1, I started developing an application that should be able to receive an H264-RTP-UDP stream from GStreamer, sink it into an "appsink" element to get direct access to the NAL units, and then perform the conversion to create CMSampleBuffers that my AVSampleBufferDisplayLayer can display.

The interesting code is the following:

//
//  GStreamerBackend.m
//

#import "GStreamerBackend.h"

NSString * const naluTypesStrings[] = {
    @"Unspecified (non-VCL)",
    @"Coded slice of a non-IDR picture (VCL)",
    @"Coded slice data partition A (VCL)",
    @"Coded slice data partition B (VCL)",
    @"Coded slice data partition C (VCL)",
    @"Coded slice of an IDR picture (VCL)",
    @"Supplemental enhancement information (SEI) (non-VCL)",
    @"Sequence parameter set (non-VCL)",
    @"Picture parameter set (non-VCL)",
    @"Access unit delimiter (non-VCL)",
    @"End of sequence (non-VCL)",
    @"End of stream (non-VCL)",
    @"Filler data (non-VCL)",
    @"Sequence parameter set extension (non-VCL)",
    @"Prefix NAL unit (non-VCL)",
    @"Subset sequence parameter set (non-VCL)",
    @"Reserved (non-VCL)",
    @"Coded slice of an auxiliary coded picture without partitioning (non-VCL)",
    @"Coded slice extension (non-VCL)",
    @"Coded slice extension for depth view components (non-VCL)",
    @"Unspecified (non-VCL)",
};

static GstFlowReturn new_sample(GstAppSink *sink, gpointer user_data)
{
    GStreamerBackend *backend = (__bridge GStreamerBackend *)(user_data);
    GstSample *sample = gst_app_sink_pull_sample(sink);
    GstBuffer *buffer = gst_sample_get_buffer(sample);
    GstMemory *memory = gst_buffer_get_all_memory(buffer);

    GstMapInfo info;
    gst_memory_map(memory, &info, GST_MAP_READ);

    int startCodeIndex = 0;
    for (int i = 0; i < 5; i++) {
        if (info.data[i] == 0x01) {
            startCodeIndex = i;
            break;
        }
    }
    int nalu_type = ((uint8_t)info.data[startCodeIndex + 1] & 0x1F);
    NSLog(@"NALU with Type \"%@\" received.", naluTypesStrings[nalu_type]);

    if (backend.searchForSPSAndPPS) {
        if (nalu_type == 7)
            backend.spsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length:info.size - 4];
        if (nalu_type == 8)
            backend.ppsData = [NSData dataWithBytes:&(info.data[startCodeIndex + 1]) length:info.size - 4];
        if (backend.spsData != nil && backend.ppsData != nil) {
            const uint8_t* const parameterSetPointers[2] = { (const uint8_t*)[backend.spsData bytes], (const uint8_t*)[backend.ppsData bytes] };
            const size_t parameterSetSizes[2] = { [backend.spsData length], [backend.ppsData length] };
            CMVideoFormatDescriptionRef videoFormatDescr;
            OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(kCFAllocatorDefault, 2, parameterSetPointers, parameterSetSizes, 4, &videoFormatDescr);
            [backend setVideoFormatDescr:videoFormatDescr];
            [backend setSearchForSPSAndPPS:false];
            NSLog(@"Found all data for CMVideoFormatDescription. Creation: %@.", (status == noErr) ? @"successfully." : @"failed.");
        }
    }

    if (nalu_type == 1 || nalu_type == 5) {
        CMBlockBufferRef videoBlock = NULL;
        OSStatus status = CMBlockBufferCreateWithMemoryBlock(NULL, info.data, info.size, kCFAllocatorNull, NULL, 0, info.size, 0, &videoBlock);
        NSLog(@"BlockBufferCreation: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");

        const uint8_t sourceBytes[] = {(uint8_t)(info.size >> 24), (uint8_t)(info.size >> 16), (uint8_t)(info.size >> 8), (uint8_t)info.size};
        status = CMBlockBufferReplaceDataBytes(sourceBytes, videoBlock, 0, 4);
        NSLog(@"BlockBufferReplace: %@", (status == kCMBlockBufferNoErr) ? @"successfully." : @"failed.");

        CMSampleBufferRef sbRef = NULL;
        const size_t sampleSizeArray[] = {info.size};
        status = CMSampleBufferCreate(kCFAllocatorDefault, videoBlock, true, NULL, NULL, backend.videoFormatDescr, 1, 0, NULL, 1, sampleSizeArray, &sbRef);
        NSLog(@"SampleBufferCreate: %@", (status == noErr) ? @"successfully." : @"failed.");

        CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sbRef, YES);
        CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
        CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);

        NSLog(@"Error: %@, Status: %@", backend.displayLayer.error, (backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusUnknown) ? @"unknown" : ((backend.displayLayer.status == AVQueuedSampleBufferRenderingStatusRendering) ? @"rendering" : @"failed"));

        dispatch_async(dispatch_get_main_queue(), ^{
            [backend.displayLayer enqueueSampleBuffer:sbRef];
            [backend.displayLayer setNeedsDisplay];
        });
    }

    gst_memory_unmap(memory, &info);
    gst_memory_unref(memory);
    gst_buffer_unref(buffer);

    return GST_FLOW_OK;
}

@implementation GStreamerBackend

- (instancetype)init
{
    if (self = [super init]) {
        self.searchForSPSAndPPS = true;
        self.ppsData = nil;
        self.spsData = nil;
        self.displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
        self.displayLayer.bounds = CGRectMake(0, 0, 300, 300);
        self.displayLayer.backgroundColor = [UIColor blackColor].CGColor;
        self.displayLayer.position = CGPointMake(500, 500);
        self.queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
        dispatch_async(self.queue, ^{
            [self app_function];
        });
    }
    return self;
}

- (void)start
{
    if (gst_element_set_state(self.pipeline, GST_STATE_PLAYING) == GST_STATE_CHANGE_FAILURE) {
        NSLog(@"Failed to set pipeline to playing");
    }
}

- (void)app_function
{
    GstElement *udpsrc, *rtphdepay, *capsfilter;
    GMainContext *context; /* GLib context used to run the main loop */
    GMainLoop *main_loop;  /* GLib main loop */

    context = g_main_context_new();
    g_main_context_push_thread_default(context);

    g_set_application_name("appsink");

    self.pipeline = gst_pipeline_new("testpipe");

    udpsrc = gst_element_factory_make("udpsrc", "udpsrc");
    GstCaps *caps = gst_caps_new_simple("application/x-rtp",
                                        "media", G_TYPE_STRING, "video",
                                        "clock-rate", G_TYPE_INT, 90000,
                                        "encoding-name", G_TYPE_STRING, "H264",
                                        NULL);
    g_object_set(udpsrc, "caps", caps, "port", 5000, NULL);
    gst_caps_unref(caps);

    rtphdepay = gst_element_factory_make("rtph264depay", "rtph264depay");

    capsfilter = gst_element_factory_make("capsfilter", "capsfilter");
    caps = gst_caps_new_simple("video/x-h264",
                               "streamformat", G_TYPE_STRING, "byte-stream",
                               "alignment", G_TYPE_STRING, "nal",
                               NULL);
    g_object_set(capsfilter, "caps", caps, NULL);

    self.appsink = gst_element_factory_make("appsink", "appsink");

    gst_bin_add_many(GST_BIN(self.pipeline), udpsrc, rtphdepay, capsfilter, self.appsink, NULL);

    if (!gst_element_link_many(udpsrc, rtphdepay, capsfilter, self.appsink, NULL)) {
        NSLog(@"Cannot link gstreamer elements");
        exit(1);
    }

    if (gst_element_set_state(self.pipeline, GST_STATE_READY) != GST_STATE_CHANGE_SUCCESS)
        NSLog(@"Could not change to ready");

    GstAppSinkCallbacks callbacks = { NULL, NULL, new_sample };
    gst_app_sink_set_callbacks(GST_APP_SINK(self.appsink), &callbacks, (__bridge gpointer)(self), NULL);

    main_loop = g_main_loop_new(context, FALSE);
    g_main_loop_run(main_loop);

    /* Free resources */
    g_main_loop_unref(main_loop);
    main_loop = NULL;
    g_main_context_pop_thread_default(context);
    g_main_context_unref(context);
    gst_element_set_state(GST_ELEMENT(self.pipeline), GST_STATE_NULL);
    gst_object_unref(GST_OBJECT(self.pipeline));
}

@end

What I get when running the app and starting to stream to the iOS device:

NALU with Type "Sequence parameter set (non-VCL)" received.
NALU with Type "Picture parameter set (non-VCL)" received.
Found all data for CMVideoFormatDescription. Creation: successfully.
NALU with Type "Coded slice of an IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status: unknown
NALU with Type "Coded slice of a non-IDR picture (VCL)" received.
BlockBufferCreation: successfully.
BlockBufferReplace: successfully.
SampleBufferCreate: successfully.
Error: (null), Status: rendering
[...] (repetition of the last 5 lines)

So it seems the decoding works as it should, but my problem is that I cannot see anything in my AVSampleBufferDisplayLayer.
It might be a problem with kCMSampleAttachmentKey_DisplayImmediately, but I have set it up as I was told here (see the 'important' note).

Every idea is welcome ;)

Solution: Got it working now. The length of each NALU does not include the length header itself, so I had to subtract 4 from my info.size before using it for my sourceBytes.

Source: http://outofmemory.cn/web/1110198.html
