ios – How to capture individual video frames in real time while recording on iPhone


I am trying to measure the saturation of a selected color in real time, like this:

Idea http://i62.tinypic.com/2af9zia.png

I followed Apple's this guide. I updated the code to use ARC, and of course my view controller is an AVCaptureVideoDataOutputSampleBufferDelegate, but I don't know how to actually start capturing the data, as in starting the camera to get some actual input.

Here is my code:

#import "ViewController.h"

@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation ViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib
    [self setupCaptureSession];
}

- (void)didReceiveMemoryWarning
{
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

// Create and configure a capture session and start it running
- (void)setupCaptureSession
{
    NSError *error = nil;

    // Create the session
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Configure the session to produce lower resolution video frames, if your
    // processing algorithm can cope. We'll specify medium quality for the
    // chosen device.
    session.sessionPreset = AVCaptureSessionPresetMedium;

    // Find a suitable AVCaptureDevice
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    // Create a device input with the device and add it to the session.
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // Handle the error appropriately.
    }
    [session addInput:input];

    // Create a VideoDataOutput and add it to the session
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    [session addOutput:output];

    // Configure your output.
    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];

    // Specify the pixel format
    output.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];

    // Start the session running to start the flow of data
    [self startCapturingWithSession:session];

    // Assign session to an ivar.
    [self setSession:session];
}

- (void)startCapturingWithSession:(AVCaptureSession *)captureSession
{
    //----- DISPLAY THE PREVIEW LAYER -----
    // Display it full screen under our view controller's existing controls
    NSLog(@"display the preview layer");
    CGRect layerRect = [[[self view] layer] bounds];
    [self.previewLayer setBounds:layerRect];
    [self.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                               CGRectGetMidY(layerRect))];
    //[[[self view] layer] addSublayer:[[self CaptureManager] self.previewLayer]];
    // We use this instead so it goes on a layer behind our UI controls (avoids
    // having to manually bring each control to the front):
    UIView *cameraView = [[UIView alloc] init];
    [[self view] addSubview:cameraView];
    [self.view sendSubviewToBack:cameraView];
    [[cameraView layer] addSublayer:self.previewLayer];

    //----- START THE CAPTURE SESSION RUNNING -----
    [captureSession startRunning];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Create a UIImage from the sample buffer data
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}

// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address and the number of bytes per row for the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
                                                 colorSpace,
                                                 kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}

@end
Solution

This did it for me; it was all about setting up the video preview layer:

#import "ViewController.h"

@interface ViewController ()
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end

@implementation ViewController

// viewDidLoad, didReceiveMemoryWarning, and setupCaptureSession are the same
// as in the question: the session is created with the medium preset, the
// device input and the AVCaptureVideoDataOutput (32BGRA, delegate on a serial
// queue) are added, then startCapturingWithSession: is called.

- (void)startCapturingWithSession:(AVCaptureSession *)captureSession
{
    NSLog(@"Adding video preview layer");
    [self setPreviewLayer:[[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession]];
    [self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

    //----- DISPLAY THE PREVIEW LAYER -----
    // Display it full screen under our view controller's existing controls
    NSLog(@"display the preview layer");
    CGRect layerRect = [[[self view] layer] bounds];
    [self.previewLayer setBounds:layerRect];
    [self.previewLayer setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                               CGRectGetMidY(layerRect))];
    //[[[self view] layer] addSublayer:[[self CaptureManager] self.previewLayer]];
    // We use this instead so it goes on a layer behind our UI controls (avoids
    // having to manually bring each control to the front):
    UIView *cameraView = [[UIView alloc] init];
    [[self view] addSubview:cameraView];
    [self.view sendSubviewToBack:cameraView];
    [[cameraView layer] addSublayer:self.previewLayer];

    //----- START THE CAPTURE SESSION RUNNING -----
    [captureSession startRunning];
}

// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Fix the orientation, then create a UIImage from the sample buffer data
    [connection setVideoOrientation:AVCaptureVideoOrientationLandscapeLeft];
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
}

// imageFromSampleBuffer: is unchanged from the question: lock the pixel
// buffer, draw it into a CGBitmapContext, make a CGImage, unlock, release the
// context and color space, and return the resulting UIImage.

@end
Original article: http://outofmemory.cn/web/1074899.html
