captureOutput not being called

I have been stuck on this problem for far too long.

I am trying to capture macOS webcam data and run a CIDetector on the frames the webcam outputs.

I know that I need to:

- create an AVCaptureSession and add the webcam as an input
- add an AVCaptureVideoDataOutput and set its sample buffer delegate
- start the session

For some reason, after calling .setSampleBufferDelegate(...) (and, of course, after calling .startRunning() on the AVCaptureSession instance), my AVCaptureVideoDataOutputSampleBufferDelegate's captureOutput is never called.

I have found lots of people online with this problem, but I could not find any solution.

It seems to me that it has something to do with the DispatchQueue.

MyDelegate.swift:

import AVFoundation
import CoreImage

class MyDelegate: NSObject {

    var context: CIContext?
    var detector: CIDetector?

    override init() {
        // Create the CIContext and the face detector once, up front
        context = CIContext()
        detector = CIDetector(ofType: CIDetectorTypeFace, context: context, options: nil)
        print("set up!")
    }
}
extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("success?")
        let pixelBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
        let image: CIImage = CIImage(cvPixelBuffer: pixelBuffer)
        let features: [CIFeature] = detector!.features(in: image)
        for feature in features {
            print(feature.type)
            print(feature.bounds)
        }
    }

    func captureOutput(_ output: AVCaptureOutput, didDrop sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
        print("fail?")
    }
}

ViewController.swift:

var captureSession: AVCaptureSession
var captureDevice: AVCaptureDevice?
var previewLayer: AVCaptureVideoPreviewLayer?

var vdo: AVCaptureVideoDataOutput

var videoDataOutputQueue: DispatchQueue

override func viewDidLoad() {
    super.viewDidLoad()

    camera.layer = CALayer()

    // Do any additional setup after loading the view, typically from a nib.
    captureSession.sessionPreset = AVCaptureSessionPresetLow

    // Get all audio and video devices on this machine
    let devices = AVCaptureDevice.devices()

    // Find the FaceTime HD camera object
    for device in devices! {
        print(device)

        // Camera object found and assign it to captureDevice
        if ((device as AnyObject).hasMediaType(AVMediaTypeVideo)) {
            print(device)
            captureDevice = device as? AVCaptureDevice
        }
    }

    if captureDevice != nil {
        do {
            try captureSession.addInput(AVCaptureDeviceInput(device: captureDevice!))
            // vdo : AVCaptureVideoDataOutput;
            vdo.videoSettings = [kCVPixelBufferPixelFormatTypeKey as AnyHashable: NSNumber(value: kCVPixelFormatType_32BGRA)]

            try captureDevice!.lockForConfiguration()
            captureDevice!.activeVideoMinFrameDuration = CMTimeMake(1, 30)
            captureDevice!.unlockForConfiguration()

            videoDataOutputQueue.sync {
                vdo.setSampleBufferDelegate(
                    MyDelegate,
                    queue: videoDataOutputQueue
                )
                vdo.alwaysDiscardsLateVideoFrames = true
                captureSession.addOutput(vdo)
                captureSession.startRunning()
            }
        } catch {
            print(AVCaptureSessionErrorKey.description)
        }
    }
}

All of the necessary AVFoundation-related variables used in viewDidLoad are instantiated in the ViewController's init(). I omitted that code for clarity.

Any ideas?

Thanks, SO!

Kovek

Edit: fixed setting the delegate from self to MyDelegate.

This is how I initialize videoDataOutputQueue:

    videoDataOutputQueue = DispatchQueue(
        label: "VideoDataOutputQueue"
    )
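Note that setSampleBufferDelegate(_:queue:) expects a conforming object instance rather than the MyDelegate type itself, and the queue must be serial, which DispatchQueue(label:) gives you by default. A minimal sketch of that wiring, assuming the delegate is kept in a stored property so it is not deallocated early (the property name is illustrative):

    // Stored property: keeps the delegate alive for the session's lifetime;
    // `sampleBufferDelegate` is an illustrative name.
    let sampleBufferDelegate = MyDelegate()

    // ... later, e.g. in viewDidLoad ...
    videoDataOutputQueue = DispatchQueue(label: "VideoDataOutputQueue") // serial by default
    vdo.setSampleBufferDelegate(sampleBufferDelegate, queue: videoDataOutputQueue)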

You made a mistake in declaring the required sample buffer delegate method:

captureOutput(_:didOutputSampleBuffer:from:).

Please check and make sure it is:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!)

PS: Note how the method's parameters are declared. They all have '!', which means they are implicitly unwrapped optionals.
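For comparison, in Swift 4 and later the same callback drops the implicitly unwrapped optionals and uses the didOutput label instead of didOutputSampleBuffer; a minimal sketch of the modern conformance, reusing MyDelegate from the question:

    import AVFoundation

    extension MyDelegate: AVCaptureVideoDataOutputSampleBufferDelegate {
        // Swift 4+ signature: non-optional parameters, `didOutput` label.
        // A signature that matches neither version exactly is simply never called.
        func captureOutput(_ output: AVCaptureOutput,
                           didOutput sampleBuffer: CMSampleBuffer,
                           from connection: AVCaptureConnection) {
            print("frame received")
        }
    }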

I had a similar problem: in my case, the issue was that in Swift 4 you have to implement the following method:

func metadataOutput(_ output: AVCaptureMetadataOutput, didOutput metadataObjects: [AVMetadataObject], from connection: AVCaptureConnection) 

instead of:

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [Any]!, from connection: AVCaptureConnection!)

Hope this helps.

EDIT

This method must be implemented by your AVCaptureMetadataOutputObjectsDelegate (for example, your view controller). To start a QR code capture session, you can try something like this:

    captureSession = AVCaptureSession()

    let videoCaptureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    var videoInput: AVCaptureDeviceInput? = nil

    do {
        if let v = videoCaptureDevice {
            videoInput = try AVCaptureDeviceInput(device: v)
        } else {
            print("Error: can't find videoCaptureDevice")
        }

    } catch {
        let ac = UIAlertController(title: "Error", message: error.localizedDescription, preferredStyle: .alert)
        ac.addAction(UIAlertAction(title: "Ok", style: .default))
        present(ac, animated: true)
        return
    }

    if let videoInput = videoInput {
        if captureSession.canAddInput(videoInput) {
            captureSession.addInput(videoInput)
        } else {
            // Show error
            return
        }
    } else {
        // Show error
        return
    }

    let metadataOutput = AVCaptureMetadataOutput()

    if captureSession.canAddOutput(metadataOutput) {
        captureSession.addOutput(metadataOutput)

        metadataOutput.setMetadataObjectsDelegate(/* YOUR DELEGATE */, queue: DispatchQueue.main)
        metadataOutput.metadataObjectTypes = [AVMetadataObject.ObjectType.qr, AVMetadataObject.ObjectType.code128]
    } else {
        // Show error
        return
    }

    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.layer.bounds
    previewLayer.videoGravity = AVLayerVideoGravity.resizeAspectFill
    view.layer.addSublayer(previewLayer)

    captureSession.startRunning()
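For completeness, a minimal sketch of the delegate side, assuming the view controller itself (ViewController) was passed as the delegate above:

    import AVFoundation

    extension ViewController: AVCaptureMetadataOutputObjectsDelegate {
        func metadataOutput(_ output: AVCaptureMetadataOutput,
                            didOutput metadataObjects: [AVMetadataObject],
                            from connection: AVCaptureConnection) {
            // Pull the decoded string out of the first machine-readable code, if any
            guard let readable = metadataObjects.first as? AVMetadataMachineReadableCodeObject,
                  let stringValue = readable.stringValue else { return }
            print("Scanned code: \(stringValue)")
        }
    }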

In my case, the delegate method was not being called because an AVCaptureMovieFileOutput was added to the session before the AVCaptureVideoDataOutput. I guess only one video-related output can be added to a session. Adding only the AVCaptureVideoDataOutput solved the problem.
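If you want the conflict to be visible instead of the delegate silently never firing, you can guard each addOutput(_:) with canAddOutput(_:); whether the session actually rejects this particular combination is an assumption worth verifying on your setup. A minimal sketch:

    import AVFoundation

    let session = AVCaptureSession()
    let movieFileOutput = AVCaptureMovieFileOutput()
    let videoDataOutput = AVCaptureVideoDataOutput()

    if session.canAddOutput(movieFileOutput) {
        session.addOutput(movieFileOutput)
    }

    // If the session cannot accept a second video output, this branch
    // logs the conflict rather than failing silently.
    if session.canAddOutput(videoDataOutput) {
        session.addOutput(videoDataOutput)
    } else {
        print("Could not add AVCaptureVideoDataOutput - check for conflicting outputs")
    }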