Sample Buffer Delegate Swift 2 For Real Time Video Filter

I am trying to create a light-intensity reader in Swift, using the camera on an iPhone. The idea is that it takes the intensity component of all the pixels and averages them to give me a single value. I don't need a preview from the camera. I have been piecing together several tutorials to try to get this working, and so far have come up with the code below. camDeviceSetup() runs on viewDidLoad, and cameraSetup() runs on a button press.

I run into an error on the line that starts with "videoDeviceOutput!.setSampleBufferDelegate"; it says it cannot convert a value of type FirstViewController (my view controller) to the expected argument type.

let captureSession = AVCaptureSession()
// If we find a device we'll store it here for later use
var captureDevice : AVCaptureDevice?
var videoDeviceOutput: AVCaptureVideoDataOutput?
// AVCaptureVideoPreviewLayer is a subclass of CALayer that you use to display video as it is being captured by an input device.
var previewLayer = AVCaptureVideoPreviewLayer()

func camDeviceSetup() {
    captureSession.sessionPreset = AVCaptureSessionPreset640x480
    let devices = AVCaptureDevice.devices()
    for device in devices {
        // Make sure this particular device supports video
        if (device.hasMediaType(AVMediaTypeVideo)) {
            // Finally check the position and confirm we've got the back camera
            if(device.position == AVCaptureDevicePosition.Back) {
                captureDevice = device as? AVCaptureDevice
            }
        }
    }
    if let captureDevice = captureDevice {
        do {
            captureSession.addInput(try AVCaptureDeviceInput(device: captureDevice))
        } catch let error as NSError {
            print("error: \(error.localizedDescription)")
        }
    }
}

func cameraSetup() {
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.bounds
    view.layer.addSublayer(previewLayer)

    videoDeviceOutput = AVCaptureVideoDataOutput()
    videoDeviceOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey:Int(kCVPixelFormatType_420YpCbCr8BiPlanarFullRange)]
    videoDeviceOutput!.alwaysDiscardsLateVideoFrames = true

// This is the line that errors out, and I'm not sure why
    videoDeviceOutput!.setSampleBufferDelegate(self, queue: dispatch_queue_create("VideoBuffer", DISPATCH_QUEUE_SERIAL))

    if captureSession.canAddOutput(videoDeviceOutput) {
        captureSession.addOutput(videoDeviceOutput)
    }

    captureSession.startRunning() 
}

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    // Once the delegate is correctly set, my algorithm for finding light intensity goes here

}
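
Once the delegate is wired up, the averaging could be done along the lines of the sketch below. This is my own illustration, not code from the original post; it assumes import AVFoundation and the kCVPixelFormatType_420YpCbCr8BiPlanarFullRange setting from cameraSetup(). In that bi-planar format, plane 0 of the pixel buffer holds the 8-bit luma (Y) values, so averaging that plane yields one intensity value per frame.

func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

    CVPixelBufferLockBaseAddress(pixelBuffer, 0)

    // Plane 0 of a bi-planar 4:2:0 buffer is the luma (Y) plane.
    let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0)
    let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0)
    let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    let luma = UnsafePointer<UInt8>(CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0))

    // Sum every luma byte, skipping any row padding past `width`.
    var total: UInt64 = 0
    for row in 0..<height {
        for col in 0..<width {
            total += UInt64(luma[row * bytesPerRow + col])
        }
    }
    let averageIntensity = Double(total) / Double(width * height)

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0)

    // Illustrative output; this runs on the "VideoBuffer" serial queue.
    print("average intensity: \(averageIntensity)")
}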

The problem on that line came down to the fact that I had not declared AVCaptureVideoDataOutputSampleBufferDelegate in the class declaration at the top of my ViewController.
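
In other words, the view controller itself has to adopt the protocol before self can be passed to setSampleBufferDelegate. A minimal sketch of the fix, using the FirstViewController name from the error message:

class FirstViewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    // ... the capture session properties and methods shown above ...
}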