iOS: captureOutput:didOutputSampleBuffer:fromConnection: is not being called


Overview: I want to extract frames from the live feed of an AVCaptureSession, and I am using Apple's AVCam sample as a test case. Here is the link to AVCam:

https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html

I found that captureOutput:didOutputSampleBuffer:fromConnection: is never called, and I would like to know why, or what I am doing wrong.

Here is what I did:

(1) I made AVCamViewController the delegate:

```objc
@interface AVCamViewController () <AVCaptureFileOutputRecordingDelegate, AVCaptureVideoDataOutputSampleBufferDelegate>
```

(2) I created an AVCaptureVideoDataOutput object and added it to the session:

```objc
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
```

(3) I added the delegate method and tested it by logging a string:

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"I am called");
}
```
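As an aside, once the delegate does fire, the frames can be read through Core Video. This is a minimal sketch of my own (not part of the original post or AVCam), assuming the output's videoSettings request a BGRA pixel format so the buffer has a single addressable plane:

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreVideo/CoreVideo.h>

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    // Get the pixel buffer backing this video frame.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address before touching the pixel data.
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    void *base    = CVPixelBufferGetBaseAddress(pixelBuffer);
    NSLog(@"Got a %zux%zu frame at %p", width, height, base);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
}
```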

The test app works, but captureOutput:didOutputSampleBuffer:fromConnection: is not called.

(4) I read on SO that the session variable from AVCaptureSession *session = [[AVCaptureSession alloc] init]; being local to viewDidLoad is a possible reason the delegate is not called, so I made it an instance variable of the AVCamViewController class, but the delegate is still not called.

Here is the viewDidLoad method I am testing (taken from AVCam); I added the AVCaptureVideoDataOutput at the end of the method:

```objc
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the AVCaptureSession
    session = [[AVCaptureSession alloc] init];
    [self setSession:session];

    // Setup the preview view
    [[self previewView] setSession:session];

    // Check for device authorization
    [self checkDeviceAuthorizationStatus];

    // In general it is not safe to mutate an AVCaptureSession or any of its inputs, outputs, or connections from multiple threads at the same time.
    // Why not do all of this on the main queue?
    // -[AVCaptureSession startRunning] is a blocking call which can take a long time. We dispatch session setup to the sessionQueue so that the main queue isn't blocked (which keeps the UI responsive).
    dispatch_queue_t sessionQueue = dispatch_queue_create("session queue", DISPATCH_QUEUE_SERIAL);
    [self setSessionQueue:sessionQueue];

    dispatch_async(sessionQueue, ^{
        [self setBackgroundRecordingID:UIBackgroundTaskInvalid];

        NSError *error = nil;

        AVCaptureDevice *videoDevice = [AVCamViewController deviceWithMediaType:AVMediaTypeVideo preferringPosition:AVCaptureDevicePositionBack];
        AVCaptureDeviceInput *videoDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:videoDeviceInput])
        {
            [session addInput:videoDeviceInput];
            [self setVideoDeviceInput:videoDeviceInput];

            dispatch_async(dispatch_get_main_queue(), ^{
                // Why are we dispatching this to the main queue?
                // Because AVCaptureVideoPreviewLayer is the backing layer for AVCamPreviewView and UIView can only be manipulated on main thread.
                // Note: As an exception to the above rule, it is not necessary to serialize video orientation changes on the AVCaptureVideoPreviewLayer's connection with other session manipulation.
                [[(AVCaptureVideoPreviewLayer *)[[self previewView] layer] connection] setVideoOrientation:(AVCaptureVideoOrientation)[self interfaceOrientation]];
            });
        }

        AVCaptureDevice *audioDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
        AVCaptureDeviceInput *audioDeviceInput = [AVCaptureDeviceInput deviceInputWithDevice:audioDevice error:&error];

        if (error)
        {
            NSLog(@"%@", error);
        }

        if ([session canAddInput:audioDeviceInput])
        {
            [session addInput:audioDeviceInput];
        }

        AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
        if ([session canAddOutput:movieFileOutput])
        {
            [session addOutput:movieFileOutput];
            AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
            if ([connection isVideoStabilizationSupported])
                [connection setEnablesVideoStabilizationWhenAvailable:YES];
            [self setMovieFileOutput:movieFileOutput];
        }

        AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
        if ([session canAddOutput:stillImageOutput])
        {
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            [session addOutput:stillImageOutput];
            [self setStillImageOutput:stillImageOutput];
        }

        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];
        if ([session canAddOutput:videoDataOutput])
        {
            NSLog(@"Yes I can add it");
            [session addOutput:videoDataOutput];
        }
    });
}

- (void)viewWillAppear:(BOOL)animated
{
    dispatch_async([self sessionQueue], ^{
        [self addObserver:self forKeyPath:@"sessionRunningAndDeviceAuthorized" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:SessionRunningAndDeviceAuthorizedContext];
        [self addObserver:self forKeyPath:@"stillImageOutput.capturingStillImage" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:CapturingStillImageContext];
        [self addObserver:self forKeyPath:@"movieFileOutput.recording" options:(NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew) context:RecordingContext];
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(subjectAreaDidChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:[[self videoDeviceInput] device]];

        __weak AVCamViewController *weakSelf = self;
        [self setRuntimeErrorHandlingObserver:[[NSNotificationCenter defaultCenter] addObserverForName:AVCaptureSessionRuntimeErrorNotification object:[self session] queue:nil usingBlock:^(NSNotification *note) {
            AVCamViewController *strongSelf = weakSelf;
            dispatch_async([strongSelf sessionQueue], ^{
                // Manually restarting the session since it must have been stopped due to an error.
                [[strongSelf session] startRunning];
                [[strongSelf recordButton] setTitle:NSLocalizedString(@"Record", @"Recording button record title") forState:UIControlStateNormal];
            });
        }]];

        [[self session] startRunning];
    });
}
```

Can anyone tell me why this happens and suggest how to fix it?

Solution

I have done a lot of experimenting, and I think I may have the answer. I have similar but different code that was written from scratch rather than copied from Apple's sample (which is a bit old now).

I think the culprit is this section...

```objc
AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieFileOutput])
{
    [session addOutput:movieFileOutput];
    AVCaptureConnection *connection = [movieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([connection isVideoStabilizationSupported])
        [connection setEnablesVideoStabilizationWhenAvailable:YES];
    [self setMovieFileOutput:movieFileOutput];
}
```

From my experiments, this is what causes your problem. In my code, when this section is present, captureOutput:didOutputSampleBuffer:fromConnection: is not called. I think the video system EITHER delivers you a series of sample buffers OR records a compressed, optimized movie file to disk, but not both (at least on iOS). I suppose that makes sense / is not surprising, but I have not seen it documented anywhere!
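In concrete terms, the workaround implied above amounts to leaving the AVCaptureMovieFileOutput out of the session whenever per-frame sample buffers are wanted. A minimal sketch (my own, reusing the variable names from the question; the BGRA videoSettings line is an assumption, not something the original post sets):

```objc
// Configure only the video-data output; do NOT also add the
// AVCaptureMovieFileOutput to this session, or the delegate
// callback may never fire.
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

// Assumption: request BGRA frames so the callback gets a single-plane buffer.
videoDataOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };

// Drop frames that arrive while the delegate is still busy.
[videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];

// Deliver sample buffers on the session's serial queue.
[videoDataOutput setSampleBufferDelegate:self queue:sessionQueue];

if ([session canAddOutput:videoDataOutput])
{
    [session addOutput:videoDataOutput];
}
// (Intentionally no [session addOutput:movieFileOutput] here.)
```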

Also, at one point I seemed to get an error and/or the buffer callback when I turned the microphone on. Again this is undocumented; the error was -11800 (unknown error). But I could not always reproduce that.
