How to get video output from didOutputSampleBuffer properly?

I am trying to get every frame from the front camera, following this page: https://developer.apple.com/library/ios/qa/qa1702/_index.html

My code is below:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    NSAssert([self checkDeviceAuthorizationStatus], @"authorization failed");

    self.sessionQueue = dispatch_queue_create(SESSION_QUEUE_LABEL, DISPATCH_QUEUE_SERIAL);
    dispatch_async(self.sessionQueue, ^{
        NSAssert([self findCamera:YES], @"get camera failed");
        NSAssert([self attachCameraToCaptureSession], @"get input failed");
        NSAssert([self setupVideoOutput], @"get output failed");
    });
}


- (BOOL) findCamera : (BOOL)useFrontCamera {
    AVCaptureDevice *camera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in devices) {
        if (useFrontCamera && AVCaptureDevicePositionFront == [device position]) {
            camera = device;
        } else if (!useFrontCamera && AVCaptureDevicePositionBack == [device position]) {
            camera = device;
        }
    }

    if (nil != camera) {
        if ([camera lockForConfiguration:nil]) {
            [camera setActiveVideoMinFrameDuration:CMTimeMake(1, 30)]; // shortest frame duration => at most 30 fps
            [camera setActiveVideoMaxFrameDuration:CMTimeMake(1, 10)]; // longest frame duration => at least 10 fps
            [camera unlockForConfiguration];
        }
        self.camera = camera;
    }
    return (nil != self.camera);
}

- (BOOL) attachCameraToCaptureSession {
    NSAssert(nil != self.camera, @"no camera");
    NSAssert(nil != self.captureSession, @"no session");

    self.cameraInput = nil;
    NSError *error = nil;
    self.cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:self.camera error:&error];

    if (nil != error) {
        NSLog(@"attach camera to session error: %@", error);
        return NO;
    }

    if ([self.captureSession canAddInput:self.cameraInput]) {
        [self.captureSession addInput:self.cameraInput];
    } else {
        return NO;
    }

    return YES;
}

- (BOOL)setupVideoOutput {
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.captureQueue = dispatch_queue_create(CAPTURE_QUEUE_LABEL, DISPATCH_QUEUE_SERIAL);
    [self.videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
    self.videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self.videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
        return YES;
    }
    return NO;
}

Then I try to get frames in didOutputSampleBuffer, but the UIImage is always nil:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == self.videoOutput) {
        NSLog(@"ok");
        dispatch_async(self.sessionQueue, ^{
            if (sampleBuffer) {
                UIImage *image = [ViewController imageFromSampleBuffer:sampleBuffer];
                NSLog(@"%@", image);
            }
        });
    } else {
        NSLog(@"not ok");
    }
}

The imageFromSampleBuffer: method is the same as in the link I posted at the beginning.
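For readers who don't want to follow the link, the QA1702 implementation is roughly the following; it assumes the output delivers kCVPixelFormatType_32BGRA frames, as requested in setupVideoOutput above:

```objc
// From Apple Technical Q&A QA1702 (paraphrased): create a UIImage from a BGRA
// sample buffer. Requires the data output's videoSettings to request
// kCVPixelFormatType_32BGRA.
+ (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace,
        kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}
```

Note that if the sample buffer has already been released, CMSampleBufferGetImageBuffer returns NULL, CGBitmapContextCreate then returns a NULL context, and CGBitmapContextCreateImage logs exactly the "invalid context 0x0" error below.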

Also, imageFromSampleBuffer: always logs this error:

    CGBitmapContextCreateImage: invalid context 0x0. This is a serious error. This application, or a library it uses, is using an invalid context and is thereby contributing to an overall degradation of system stability and reliability. This notice is a courtesy: please fix this problem. It will become a fatal error in an upcoming update.

Can anyone tell me why? Thanks!

My phone is an iPhone 5s running iOS 8.1.

Use this method to get each frame from a video URL... hope this helps:

-(void)generateThumbImage:(NSURL *)url
{
    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;

    for (int i = 0; i < 5; i++)
    {
        CMTime time = CMTimeMakeWithSeconds(i, 30);
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
        UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);  // CGImageRef won't be released by ARC

        [arrayImages addObject:thumbnail];
    }

    _imageView1.image = [arrayImages objectAtIndex:0];
    _imageView2.image = [arrayImages objectAtIndex:1];
    _imageView3.image = [arrayImages objectAtIndex:2];
    _imageView4.image = [arrayImages objectAtIndex:3];
    _imageView5.image = [arrayImages objectAtIndex:4];
    NSLog(@"Image array: %@", arrayImages);
}
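Note that copyCGImageAtTime:actualTime:error: is synchronous and blocks the calling thread once per frame. For several thumbnails, AVAssetImageGenerator's asynchronous API may be preferable; a sketch, assuming the same `url` parameter and `arrayImages` ivar as above:

```objc
// Sketch: asynchronous thumbnail generation with AVAssetImageGenerator.
AVAsset *asset = [AVAsset assetWithURL:url];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;

NSMutableArray *times = [NSMutableArray array];
for (int i = 0; i < 5; i++) {
    [times addObject:[NSValue valueWithCMTime:CMTimeMakeWithSeconds(i, 30)]];
}

[generator generateCGImagesAsynchronouslyForTimes:times
                                completionHandler:^(CMTime requestedTime,
                                                    CGImageRef imageRef,
                                                    CMTime actualTime,
                                                    AVAssetImageGeneratorResult result,
                                                    NSError *error) {
    if (result == AVAssetImageGeneratorSucceeded) {
        UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
        // The handler runs on a background queue; hop to main before touching UIKit.
        dispatch_async(dispatch_get_main_queue(), ^{
            [arrayImages addObject:thumbnail];
        });
    }
}];
```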

You cannot hand the sampleBuffer to another queue with dispatch_async like that. The buffer may already have been released by the time imageFromSampleBuffer uses it. You must choose one of two approaches:

  1. Use the buffer on the same queue (thread), as in the sample you mentioned, i.e. synchronously inside the delegate callback.
  2. Retain (or copy) it for further use. A good example can be found here: https://developer.apple.com/library/ios/samplecode/RosyWriter/Introduction/Intro.html (see the captureOutput:... method in RosyWriterVideoProcessor.m).
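
A minimal sketch of option 2, reusing the names from the question: CFRetain the buffer before dispatching, and CFRelease it when the background work is done. Be aware that holding on to too many sample buffers will starve the capture pipeline, so release them promptly.

```objc
// Option 2: retain the sample buffer so it outlives the delegate callback.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput != self.videoOutput) {
        return;
    }
    CFRetain(sampleBuffer);  // keep the buffer alive past the end of this method
    dispatch_async(self.sessionQueue, ^{
        UIImage *image = [ViewController imageFromSampleBuffer:sampleBuffer];
        NSLog(@"%@", image);
        CFRelease(sampleBuffer);  // balance the CFRetain above
    });
}
```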