iOS – Extracting H264 from a CMBlockBuffer


Overview: I am using Apple's VideoToolbox (iOS) to compress raw frames captured by the device camera.

My callback is invoked with a CMSampleBufferRef object that contains a CMBlockBuffer.

The CMBlockBuffer object contains the H264 elementary stream, but I haven't found any way to obtain a pointer to it.

When I print the CMSampleBufferRef object to the console I get:

(lldb) po blockBufferRef
CMBlockBuffer 0x1701193e0 totalDataLength: 4264 retainCount: 1 allocator: 0x1957c2c80 subBlockCapacity: 2
 [0] 4264 bytes @ offset 128 Buffer Reference:
    CMBlockBuffer 0x170119350 totalDataLength: 4632 retainCount: 1 allocator: 0x1957c2c80 subBlockCapacity: 2
     [0] 4632 bytes @ offset 0 Memory Block 0x10295c000, 4632 bytes (custom V=0 A=0x0 F=0x18498bb44 R=0x0)

It seems that the CMBlockBuffer object I managed to get a pointer to is just referencing another CMBlockBufferRef (of 4632 bytes) that I cannot access.

Can anyone post how to access the H264 elementary stream?

Thanks!

Solution: I've been struggling with this for a while now, and have finally figured everything out.

The function CMBlockBufferGetDataPointer gives you access to all the data you need, but there are a few not-so-obvious things you must do to convert it to an elementary stream.

AVCC vs. Annex B format

The data in a CMBlockBuffer is stored in AVCC format, while elementary streams typically follow the Annex B specification (here is an excellent overview of the two formats). In the AVCC format, the first 4 bytes contain the length of the NAL unit (another word for an H264 packet). You need to replace this header with the 4-byte start code 0x00 0x00 0x00 0x01, which serves as the separator between NAL units in an Annex B elementary stream (the 3-byte version 0x00 0x00 0x01 also works fine).
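To make the header replacement concrete, here is a minimal sketch in plain C (not Apple's API; `avcc_to_annexb_single` is a hypothetical helper) that overwrites the 4-byte AVCC length prefix of a buffer holding a single NAL unit with the Annex B start code:

```c
#include <stdint.h>
#include <stddef.h>

// Overwrite the 4-byte AVCC length header of a buffer that holds
// exactly one NAL unit with the 4-byte Annex B start code, in place.
// Returns 0 on success, -1 if the buffer is too small to hold a header.
static int avcc_to_annexb_single(uint8_t *buf, size_t len) {
    if (len < 4) return -1;
    static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
    for (size_t i = 0; i < 4; i++) buf[i] = startCode[i];
    return 0;
}
```

Since the start code and the length header are both 4 bytes, the replacement can be done in place without resizing the buffer; with the 3-byte start code you would instead have to copy the payload into a new buffer.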

Multiple NAL units in a single CMBlockBuffer

The next not-very-obvious thing is that a single CMBlockBuffer will sometimes contain multiple NAL units. Apple seems to add an additional NAL unit (SEI) containing metadata to every I-frame NAL unit (also called IDR). This is probably why you are seeing multiple buffers in a single CMBlockBuffer object. However, the CMBlockBufferGetDataPointer function gives you a single pointer with access to all the data. That said, the presence of multiple NAL units complicates the conversion of the AVCC headers. Now you actually have to read the length value contained in the AVCC header to find the next NAL unit, and continue converting headers until you have reached the end of the buffer.
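The hopping logic can be sketched in plain C as follows (a standalone illustration, not CoreMedia code; `count_avcc_nal_units` is a hypothetical helper that only counts the units rather than converting them):

```c
#include <stdint.h>
#include <stddef.h>

// Walk an AVCC buffer and count the NAL units it contains by hopping
// from one 4-byte big-endian length header to the next.
// Returns the number of NAL units, or -1 if a header runs past the end.
static int count_avcc_nal_units(const uint8_t *buf, size_t len) {
    int count = 0;
    size_t offset = 0;
    while (offset + 4 <= len) {
        // Read the 32-bit big-endian NAL unit length from the header
        uint32_t nalLength = ((uint32_t)buf[offset] << 24) |
                             ((uint32_t)buf[offset + 1] << 16) |
                             ((uint32_t)buf[offset + 2] << 8) |
                             (uint32_t)buf[offset + 3];
        if (offset + 4 + nalLength > len) return -1; // truncated buffer
        // Skip the header plus the payload to reach the next NAL unit
        offset += 4 + nalLength;
        count++;
    }
    return count;
}
```

The conversion loop in the full code example below has exactly this shape, except that at each hop it also appends a start code plus the payload to the output stream.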

Big-endian vs. little-endian

The next not-very-obvious thing is that the AVCC header is stored in big-endian format, while iOS is natively little-endian. So when you read the length value contained in an AVCC header, pass it to the CFSwapInt32BigToHost function first.
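An equivalent, endianness-independent way to read the header is to assemble the value byte by byte, sketched here in plain C (assuming, as is the case on iOS, that the host is little-endian, this gives the same result as memcpy-ing the 4 bytes into a uint32_t and passing it through CFSwapInt32BigToHost):

```c
#include <stdint.h>

// Read a 32-bit big-endian value byte by byte, independent of the
// host's endianness. The first byte is the most significant.
static uint32_t read_be32(const uint8_t *p) {
    return ((uint32_t)p[0] << 24) |
           ((uint32_t)p[1] << 16) |
           ((uint32_t)p[2] << 8)  |
           (uint32_t)p[3];
}
```

For example, the header bytes 0x00 0x00 0x01 0x2C denote a NAL unit length of 300, not the 0x2C010000 you would get by reinterpreting the bytes directly on a little-endian host.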

SPS and PPS NAL units

The final not-so-obvious thing is that the data inside the CMBlockBuffer does not contain the parameter NAL units SPS and PPS, which hold configuration parameters for the decoder such as profile, level, resolution, and frame rate. These are stored as metadata in the sample buffer's format description and can be accessed via the function CMVideoFormatDescriptionGetH264ParameterSetAtIndex. Note that you have to add the start code to these NAL units before sending them. SPS and PPS NAL units do not have to be sent with every new frame. A decoder only needs to read them once, but it is common to resend them periodically, for example before every new I-frame NAL unit.

Code example

Below is a code example that takes all of these things into account.

static void videoFrameFinishedEncoding(void *outputCallbackRefCon,
                                       void *sourceFrameRefCon,
                                       OSStatus status,
                                       VTEncodeInfoFlags infoFlags,
                                       CMSampleBufferRef sampleBuffer) {
    // Check if there were any errors encoding
    if (status != noErr) {
        NSLog(@"Error encoding video, err=%lld", (int64_t)status);
        return;
    }

    // In this example we will use a NSMutableData object to store the
    // elementary stream.
    NSMutableData *elementaryStream = [NSMutableData data];

    // Find out if the sample buffer contains an I-Frame.
    // If so we will write the SPS and PPS NAL units to the elementary stream.
    BOOL isIFrame = NO;
    CFArrayRef attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, 0);
    if (CFArrayGetCount(attachmentsArray)) {
        CFBooleanRef notSync;
        CFDictionaryRef dict = CFArrayGetValueAtIndex(attachmentsArray, 0);
        BOOL keyExists = CFDictionaryGetValueIfPresent(dict,
                                                       kCMSampleAttachmentKey_NotSync,
                                                       (const void **)&notSync);
        // An I-Frame is a sync frame
        isIFrame = !keyExists || !CFBooleanGetValue(notSync);
    }

    // This is the start code that we will write to
    // the elementary stream before every NAL unit
    static const size_t startCodeLength = 4;
    static const uint8_t startCode[] = {0x00, 0x00, 0x00, 0x01};

    // Write the SPS and PPS NAL units to the elementary stream before every I-Frame
    if (isIFrame) {
        CMFormatDescriptionRef description = CMSampleBufferGetFormatDescription(sampleBuffer);

        // Find out how many parameter sets there are
        size_t numberOfParameterSets;
        CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description,
                                                           0, NULL, NULL,
                                                           &numberOfParameterSets,
                                                           NULL);

        // Write each parameter set to the elementary stream
        for (int i = 0; i < numberOfParameterSets; i++) {
            const uint8_t *parameterSetPointer;
            size_t parameterSetLength;
            CMVideoFormatDescriptionGetH264ParameterSetAtIndex(description,
                                                               i,
                                                               &parameterSetPointer,
                                                               &parameterSetLength,
                                                               NULL, NULL);

            // Write the parameter set to the elementary stream
            [elementaryStream appendBytes:startCode length:startCodeLength];
            [elementaryStream appendBytes:parameterSetPointer length:parameterSetLength];
        }
    }

    // Get a pointer to the raw AVCC NAL unit data in the sample buffer
    size_t blockBufferLength;
    uint8_t *bufferDataPointer = NULL;
    CMBlockBufferGetDataPointer(CMSampleBufferGetDataBuffer(sampleBuffer),
                                0,
                                NULL,
                                &blockBufferLength,
                                (char **)&bufferDataPointer);

    // Loop through all the NAL units in the block buffer
    // and write them to the elementary stream with
    // start codes instead of AVCC length headers
    size_t bufferOffset = 0;
    static const int AVCCHeaderLength = 4;
    while (bufferOffset < blockBufferLength - AVCCHeaderLength) {
        // Read the NAL unit length
        uint32_t NALUnitLength = 0;
        memcpy(&NALUnitLength, bufferDataPointer + bufferOffset, AVCCHeaderLength);

        // Convert the length value from big-endian to little-endian
        NALUnitLength = CFSwapInt32BigToHost(NALUnitLength);

        // Write start code to the elementary stream
        [elementaryStream appendBytes:startCode length:startCodeLength];

        // Write the NAL unit without the AVCC length header to the elementary stream
        [elementaryStream appendBytes:bufferDataPointer + bufferOffset + AVCCHeaderLength
                               length:NALUnitLength];

        // Move to the next NAL unit in the block buffer
        bufferOffset += AVCCHeaderLength + NALUnitLength;
    }
}

Original source: http://outofmemory.cn/web/1112141.html
