objective-c – AVAssetWriter sometimes fails with status AVAssetWriterStatusFailed. Seems random


Overview

I am writing an MP4 video file with an AVAssetWriter, using an AVAssetWriterInputPixelBufferAdaptor.

The source is a video from a UIImagePickerController, either freshly captured from the camera or taken from the asset library. Quality right now is UIImagePickerControllerQualityTypeMedium.

Sometimes the writer fails. Its status is AVAssetWriterStatusFailed, and the AVAssetWriter object's error property is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210), NSUnderlyingError=0x4dd8e0 "The operation couldn't be completed. (OSStatus error -536870210.)", NSLocalizedDescription=The operation could not be completed}
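For reference, this is roughly how such a failure can be inspected at runtime. A minimal sketch, assuming a writer variable like the AVAssetWriter in the code further down; the logging itself is illustrative and not part of the original program:

if (writer.status == AVAssetWriterStatusFailed) {
    NSError *error = writer.error;
    // The OSStatus (here -536870210) travels in the underlying error.
    NSError *underlying = error.userInfo[NSUnderlyingErrorKey];
    NSLog(@"Writer failed. Domain: %@, code: %ld, underlying: %@",
          error.domain, (long)error.code, underlying);
}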

The error occurs in roughly 20% of the runs. It seems to fail more often on an iPhone 4/4S than on an iPhone 5.

It also occurs more frequently when the source video quality is higher:
With UIImagePickerControllerQualityTypeLow the error does not occur very often.
With UIImagePickerControllerQualityTypeHigh the error occurs more frequently.
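For context, this is roughly how the picker would be configured for the quality settings above. A hypothetical sketch; the delegate wiring and presentation are omitted, and none of this comes from the original code:

#import <MobileCoreServices/MobileCoreServices.h> // for kUTTypeMovie

UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.mediaTypes = @[(NSString *)kUTTypeMovie];
// The question uses Medium; Low fails less often, High more often.
picker.videoQuality = UIImagePickerControllerQualityTypeMedium;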

I have also noticed something else:
It seems to come in waves. When it fails, the following runs will often fail too, even if I delete the app and reinstall it. That leaves me wondering: does my program leak some memory, and could that memory somehow stay alive even after the app is killed (is that even possible)?

Here is the code I use to render the video:

- (void)writeVideo {
    offlineRenderingInProgress = YES;

    /* --- Writer Setup --- */

    [locationQueue cancelAllOperations];
    [self stopWithoutRewinding];

    NSError *writerError = nil;
    BOOL succes;
    succes = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];
    // DLog(@"Url: %@, succes: %i, error: %@", self.outputURL, succes, fileError);

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
    //writer.shouldOptimizeForNetworkUse = NO;

    if (writerError) {
        DLog(@"Writer error: %@", writerError);
        return;
    }

    float bitsPerPixel;
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
    int numPixels = dimensions.width * dimensions.height;
    int bitsPerSecond;

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
    if (numPixels < (640 * 480))
        bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
    else
        bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.

    // Note: bitsPerSecond is computed here but not used in the settings below.
    bitsPerSecond = numPixels * bitsPerPixel;

    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey,
                                              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
                                              [NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
                                               nil], AVVideoCompressionPropertiesKey,
                                              nil];

    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
    writerVideoInput.transform = movie.preferredTransform;
    writerVideoInput.expectsMediaDataInRealTime = YES;
    [writer addInput:writerVideoInput];

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
                                                           [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey,
                                                           nil];

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
                                                                                          sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    BOOL couldStart = [writer startWriting];

    if (!couldStart) {
        DLog(@"Could not start AVAssetWriter!");
        abort = YES;
        [locationQueue cancelAllOperations];
        return;
    }

    [self configureFilters];

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    if (!self.canEdit) {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
    } else {
        [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNoVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
    }

    CMTime startOffset = reader.timeRange.start;
    DLog(@"startOffset: %llu", startOffset.value);

    [self.thumbnailEditView removeFromSuperview];
    //    self.thumbnailEditView = nil;

    [glLayer removeFromSuperlayer];
    glLayer = nil;

    [playerView removeFromSuperview];
    playerView = nil;

    glContext = nil;

    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{

        @try {
            BOOL didWriteSomething = NO;

            DLog(@"Preparing to write...");

            while ([writerVideoInput isReadyForMoreMediaData]) {

                if (abort) {
                    NSLog(@"Abort == YES");
                    [locationQueue cancelAllOperations];
                    [writerVideoInput markAsFinished];
                    videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                }

                if (writer.status == AVAssetWriterStatusFailed) {
                    DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error);
                    [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"];
                    [[NSUserDefaults standardUserDefaults] synchronize];
                    abort = YES;
                    [locationQueue cancelAllOperations];
                    videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                    return;
                    DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]); // unreachable after the return above
                }

                DLog(@"Writing started...");

                CMSampleBufferRef buffer = nil;

                if (reader.status != AVAssetReaderStatusUnknown) {

                    if (reader.status == AVAssetReaderStatusReading) {
                        buffer = [readerVideoOutput copyNextSampleBuffer];
                        if (didWriteSomething == NO) {
                            DLog(@"Copying sample buffers...");
                        }
                    }

                    if (!buffer) {
                        // No more sample buffers: close the input and finish the file.
                        [writerVideoInput markAsFinished];
                        DLog(@"Finished...");
                        CGColorSpaceRelease(colorSpace);
                        [self offlineRenderingDidFinish];

                        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                            [writer finishWriting];
                            if (writer.error != nil) {
                                DLog(@"Error: %@", writer.error);
                            } else {
                                DLog(@"Succes!");
                            }

                            if (writer.status == AVAssetWriterStatusCompleted) {
                                videoConvertCompletionBlock(YES, nil);
                            } else {
                                abort = YES;
                                videoConvertCompletionBlock(NO, writer.error.localizedDescription);
                            }
                        });

                        return;
                    }

                    didWriteSomething = YES;
                }
                else {
                    DLog(@"Still waiting...");
                    // Reader just needs a moment to get ready...
                    continue;
                }

                CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);

                if (pixelBuffer == NULL) {
                    DLog(@"Pixelbuffer == NULL");
                    continue;
                }

                //DLog(@"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));
                //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];

                CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];
                CIImage *outputImage = [self filteredImageWithImage:ciimage];

                CVPixelBufferRef outPixelBuffer = NULL;
                CVReturn status;

                CFDictionaryRef empty; // empty value for attr value.
                CFMutableDictionaryRef attrs;
                empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
                                           NULL,
                                           NULL,
                                           0,
                                           &kCFTypeDictionaryKeyCallBacks,
                                           &kCFTypeDictionaryValueCallBacks);

                attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
                                                  1,
                                                  &kCFTypeDictionaryKeyCallBacks,
                                                  &kCFTypeDictionaryValueCallBacks);

                CFDictionarySetValue(attrs, kCVPixelBufferIOSurfacePropertiesKey, empty);
                CFDictionarySetValue(attrs, kCVPixelBufferCGImageCompatibilityKey, (__bridge const void *)([NSNumber numberWithBool:YES]));
                CFDictionarySetValue(attrs, kCVPixelBufferCGBitmapContextCompatibilityKey, (__bridge const void *)([NSNumber numberWithBool:YES]));

                status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);

                //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));

                if (status != kCVReturnSuccess) {
                    DLog(@"Couldn't allocate output pixelBufferRef!");
                    continue;
                }

                [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];

                CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
                CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
                CMTime duration = reader.timeRange.duration;
                if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
                    duration = movie.duration;
                }
                CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);

                float durationFloat = (float)durationConverted.value;
                float progress = ((float)currentTime.value) / durationFloat;

                //DLog(@"duration : %f, progress: %f", durationFloat, progress);

                [self updateOfflineRenderProgress:progress];

                if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
                    [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
                } else {
                    continue;
                }

                if (writer.status == AVAssetWriterStatusWriting) {
                    DLog(@"Writer.status: AVAssetWriterStatusWriting");
                }

                CFRelease(buffer);
                CVPixelBufferRelease(outPixelBuffer);
            }
        }
        @catch (NSException *exception) {
            DLog(@"Catching exception: %@", exception);
        }
    }];
}
Solution

OK, I think I solved it myself. The bad guy was this line:
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ...

The global queue I was passing is a concurrent queue. This allows a new callback to be made before the previous one has finished. The asset writer is not designed to be written to from more than one thread at a time.

Creating and using a new serial queue seems to remedy the problem:

// Create a serial queue so callbacks cannot overlap
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{ ...
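For comparison, here is what this pull-model pattern looks like on a serial queue, reduced to its skeleton. A minimal sketch: writer and writerInput are assumed to be a started AVAssetWriter and its input, and copyNextSampleBufferFromSource() is a hypothetical stand-in for whatever produces sample buffers; none of it is from the original code:

dispatch_queue_t assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);

[writerInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{
    // On a serial queue this block can never run concurrently with itself,
    // so the writer input is only ever fed from one thread at a time.
    while (writerInput.readyForMoreMediaData) {
        CMSampleBufferRef sampleBuffer = copyNextSampleBufferFromSource(); // hypothetical source
        if (sampleBuffer == NULL) {
            // Source exhausted: close the input and finalize the file.
            [writerInput markAsFinished];
            [writer finishWriting];
            break;
        }
        [writerInput appendSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }
    // When the input is full, the loop exits and AVFoundation invokes
    // this block again later, on the same serial queue.
}];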