ios – Why does AVSampleBufferDisplayLayer stop displaying CMSampleBuffers taken from the AVCaptureVideoDataOutput delegate?


Overview: I want to display some CMSampleBuffers with an AVSampleBufferDisplayLayer, but it freezes after showing the first sample.

I get the sample buffers from the AVCaptureVideoDataOutput sample buffer delegate:

-(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CFRetain(sampleBuffer);
    [self imageToBuffer:sampleBuffer];
    CFRelease(sampleBuffer);
}

and put them into a vector:

-(void)imageToBuffer:(CMSampleBufferRef)source
{
    // buffers is defined as: std::vector<CMSampleBufferRef> buffers;
    CMSampleBufferRef newRef;
    CMSampleBufferCreateCopy(kCFAllocatorDefault, source, &newRef);
    buffers.push_back(newRef);
}
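Note that CMSampleBufferCreateCopy hands back a +1-retained buffer, so the vector owns every copy. A minimal cleanup sketch (the releaseBuffers name is assumed; it is not in the question's code):

-(void)releaseBuffers
{
    // Each buffer was retained by CMSampleBufferCreateCopy, so release them
    // once playback is finished to avoid leaking the recorded frames.
    for (CMSampleBufferRef buf : buffers) {
        CFRelease(buf);
    }
    buffers.clear();
}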

Then I try to display them through an AVSampleBufferDisplayLayer (in another ViewController):

AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.bounds = self.view.bounds;
displayLayer.position = CGPointMake(CGRectGetMidX(self.displayOnMe.bounds), CGRectGetMidY(self.displayOnMe.bounds));
displayLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
displayLayer.backgroundColor = [[UIColor greenColor] CGColor];
[self.view.layer addSublayer:displayLayer];
self.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;

dispatch_queue_t queue = dispatch_queue_create("My queue", DISPATCH_QUEUE_SERIAL);

[displayLayer setNeedsDisplay];

[displayLayer requestMediaDataWhenReadyOnQueue:queue
                                    usingBlock:^{
                                        while ([displayLayer isReadyForMoreMediaData]) {
                                            if (samplesKey < buffers.size()) {
                                                CMSampleBufferRef buf = buffers[samplesKey];
                                                [displayLayer enqueueSampleBuffer:buf];
                                                samplesKey++;
                                            } else {
                                                [displayLayer stopRequestingMediaData];
                                                break;
                                            }
                                        }
                                    }];

But it shows the first sample and then freezes, doing nothing else.

My video data output is set up as follows:

// set up our output
self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
dispatch_queue_t queue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
[_videoDataOutput setSampleBufferDelegate:self queue:queue];
[_videoDataOutput setVideoSettings:[NSDictionary dictionaryWithObjectsAndKeys:
                                       [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                                       (id)kCVPixelBufferPixelFormatTypeKey,
                                       nil]];
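For reference, a minimal sketch of how this output might be attached to a capture session; the self.captureSession property is assumed and does not appear in the question:

// Minimal sketch, assuming an existing AVCaptureSession stored in self.captureSession.
if ([self.captureSession canAddOutput:_videoDataOutput]) {
    [self.captureSession addOutput:_videoDataOutput];
}
// Keeping late frames (instead of dropping them) can matter when every buffer is copied and replayed later.
[_videoDataOutput setAlwaysDiscardsLateVideoFrames:NO];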
Solution: I ran into this problem in the same context, trying to take the output from AVCaptureVideoDataOutput and display it in an AVSampleBufferDisplayLayer.

If your frames come out in display order, the fix is very simple: set the display-immediately flag on the CMSampleBufferRef.

Take the sample buffer returned by the delegate, and then...

CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
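As a rough illustration, the attachment could be set right before each enqueue, reusing the buffers vector and displayLayer from the question; the bounds check on the attachments array is an added precaution, not part of the original answer:

CMSampleBufferRef buf = buffers[samplesKey];
CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(buf, YES);
if (attachments && CFArrayGetCount(attachments) > 0) {
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    // Tell the layer to render this frame as soon as it is enqueued, ignoring timestamps.
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
}
[displayLayer enqueueSampleBuffer:buf];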

If your frames come out in encoder order (not display order), then the timestamps on the CMSampleBuffers need to be zero-biased and restamped so that the first frame's timestamp is equal to time 0.

double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));

// ptsStart is equal to the first frame's presentationTimeStamp, so playback starts from time 0.
CMTime presentationTimeStamp = CMTimeMake((pts - ptsStart) * 1000000, 1000000);

CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);
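ptsStart itself is not defined in the answer; one possible way to initialize it from the first buffer that arrives (the static variable is assumed) would be:

// Minimal sketch: let the first frame define time 0.
static double ptsStart = -1.0;
double pts = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
if (ptsStart < 0) {
    ptsStart = pts;
}
CMTime presentationTimeStamp = CMTimeMake((pts - ptsStart) * 1000000, 1000000);
CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);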

Update:

I ran into a situation where some videos still did not play back smoothly with the zero-bias approach, so I investigated further. The correct answer seems to be to use the PTS of the first frame you intend to play.

My answer is posted under the question below, but I will repeat it here.

Set rate at which AVSampleBufferDisplayLayer renders sample buffers

The timebase needs to be set to the presentation timestamp (PTS) of the first frame you intend to decode. I had been indexing the PTS of the first frame to 0 by subtracting the initial PTS from all subsequent PTS values and setting the timebase to 0, and for whatever reason that did not work with certain videos.

You want something like this (called before decoding starts):

CMTimebaseRef controlTimebase;
CMTimebaseCreateWithMasterClock(CFAllocatorGetDefault(), CMClockGetHostTimeClock(), &controlTimebase);

displayLayer.controlTimebase = controlTimebase;

// Set the timebase to the initial pts here
CMTimebaseSetTime(displayLayer.controlTimebase, CMTimeMake(ptsInitial, 1));
CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);
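Because CMTimeMake(ptsInitial, 1) only has a one-second timescale, an alternative (an assumption, not part of the original answer) is to seed the timebase with the first buffer's full-precision CMTime; firstSampleBuffer is a hypothetical reference to the first buffer you plan to display:

// Minimal sketch: take the timebase start time straight from the first buffer.
CMTime firstPTS = CMSampleBufferGetPresentationTimeStamp(firstSampleBuffer);
CMTimebaseSetTime(displayLayer.controlTimebase, firstPTS);
CMTimebaseSetRate(displayLayer.controlTimebase, 1.0);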

Set the PTS of the CMSampleBuffer...

CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, presentationTimeStamp);

And perhaps make sure the display-immediately flag is not set...

CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanFalse);

This is covered briefly in WWDC 2014 Session 513.
