AVCaptureVideoDataOutputSampleBufferDelegate.CaptureOutput not called

I currently have a framework I developed myself (MySDK), and an iOS application (MyApp) that uses MySDK.

Inside MySDK, I have a class (Scanner) that processes images from the video output of the device's camera.

Here is a sample of my code:

Scanner.swift

import UIKit
import AVFoundation

class Scanner: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {

    var captureDevice : AVCaptureDevice?
    var captureOutput : AVCaptureVideoDataOutput?
    var previewLayer : AVCaptureVideoPreviewLayer?
    var captureSession : AVCaptureSession?

    var rootViewController : UIViewController?

    func scanImage(viewController: UIViewController)
    {
        NSLog("%@", "scanning begins!")

        if (captureSession == nil) { captureSession = AVCaptureSession() }

        rootViewController = viewController

        captureSession!.sessionPreset = AVCaptureSessionPresetHigh

        let devices = AVCaptureDevice.devices()

        // Pick the back-facing video camera.
        for device in devices {
            if (device.hasMediaType(AVMediaTypeVideo)) {
                if(device.position == AVCaptureDevicePosition.Back) {
                    captureDevice = device as? AVCaptureDevice
                }
            }
        }

        if (captureDevice != nil) {
            NSLog("%@", "beginning session!")

            beginSession()
        }
    }

    func beginSession()
    {
        if (captureSession == nil) { captureSession = AVCaptureSession() }
        if (captureOutput == nil) { captureOutput = AVCaptureVideoDataOutput() }

        // Deliver BGRA sample buffers to this object on a background serial queue.
        let queue = dispatch_queue_create("myQueue", DISPATCH_QUEUE_SERIAL)

        captureOutput!.setSampleBufferDelegate(self, queue: queue)
        captureOutput!.videoSettings = [kCVPixelBufferPixelFormatTypeKey as NSString: Int(kCVPixelFormatType_32BGRA)]

        captureSession!.addInput(try! AVCaptureDeviceInput(device: captureDevice!))
        captureSession!.addOutput(captureOutput!)

        // Show the camera preview in the host view controller's view.
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
        previewLayer!.frame = rootViewController!.view.layer.frame

        rootViewController!.view.layer.addSublayer(previewLayer!)

        captureSession!.startRunning()
    }

    // Called once per captured video frame.
    func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBufferRef!, fromConnection connection: AVCaptureConnection!)
    {
        NSLog("%@", "captured!")
    }
}

In MyApp, I have a ViewController that implements an IBAction, in which the Scanner class is initialized and the scanImage function is triggered.

MyApp.m:

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    Scanner * scanner = [[Scanner alloc] init];

    [scanner scanImage:self];
}

The camera view appears inside the application, but the captureOutput function is never triggered, and the console contains only the following two lines:

2016-03-07 11:11:45.860 myapp[1236:337377] scanning begins!
2016-03-07 11:11:45.984 myapp[1236:337377] beginning session!

Creating a standalone application and embedding the code from Scanner.swift directly in the ViewController works fine; the captureOutput function is triggered correctly.

Does anyone know what I am doing wrong?

After much trial and error, I finally found the solution to the problem.

Apparently, I was creating the Scanner object only as a local variable, not as a class variable. Since nothing held a strong reference to it, ARC deallocated it as soon as the IBAction returned, and with it the sample buffer delegate; the preview kept appearing, most likely because the view retained the preview layer (which in turn retains the session).

Once the Scanner object was created as a class variable, the delegate method captureOutput was triggered correctly.
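For illustration, here is a minimal sketch of that fix applied to MyApp.m, assuming the containing class is named ViewController and using a hypothetical scanner property (neither name is from the original code):

@interface ViewController ()
// A strong property keeps the Scanner (and its delegate registration)
// alive after the IBAction returns.
@property (strong, nonatomic) Scanner *scanner;
@end

@implementation ViewController

- (IBAction)btnScanImage_TouchDown:(id)sender
{
    if (self.scanner == nil) {
        self.scanner = [[Scanner alloc] init];
    }
    [self.scanner scanImage:self];
}

@end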