Initializer error in Camera App for Xcode in Swift

I'm building an app similar to a camera app in Xcode 10.1 using Swift. To do that I imported AVFoundation, and I'm almost done with my code. However, on this line of code

     let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)

which sits inside this block of code

    func beginSession () {
        do {
            let captureDeviceInput = try AVCaptureDeviceInput(device: captureDevice!)

            captureSession.addInput(captureDeviceInput)
        } catch {
            print(error.localizedDescription)
        }

        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
            self.previewLayer = self.previewLayer
            self.view.layer.addSublayer(self.previewLayer)
            self.previewLayer.frame = self.view.layer.frame
            captureSession.startRunning()

            let dataOutput = AVCaptureVideoDataOutput()
            dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)] as [String : Any]

            dataOutput.alwaysDiscardsLateVideoFrames = true

            if captureSession.canAddOutput(dataOutput) {
                captureSession.addOutput(dataOutput)
            }

I get the error "Cannot invoke initializer for type 'AVCaptureVideoPreviewLayer' with an argument list of type '(session: AVCaptureSession, () -> ())'".

I don't really understand what this means or how to fix it, since I'm still fairly new to programming.

Where do you initialize captureSession?

Try something like this in your UIViewController:

    var captureSession = AVCaptureSession()
    var videoPreviewLayer: AVCaptureVideoPreviewLayer?

    override func viewDidLoad() {
        super.viewDidLoad()

        beginSession()
    }

    func beginSession() {

        // Get an instance of the AVCaptureDevice class to initialize a device object and provide the video as the media type parameter.
        if let captureDevice = AVCaptureDevice.default(for: AVMediaType.video) {

            do {
                // Get an instance of the AVCaptureDeviceInput class using the previous device object.
                let input = try AVCaptureDeviceInput(device: captureDevice)

                // Set the input device on the capture session.
                captureSession.addInput(input)

                // Initialize an AVCaptureVideoDataOutput object and set it as the output of the capture session.
                let dataOutput = AVCaptureVideoDataOutput()
                dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as NSString): NSNumber(value: kCVPixelFormatType_32BGRA)] as [String : Any]

                dataOutput.alwaysDiscardsLateVideoFrames = true

                if captureSession.canAddOutput(dataOutput) {
                    captureSession.addOutput(dataOutput)
                }

                // Initialize the video preview layer and add it as a sublayer to the viewPreview view's layer.
                videoPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
                videoPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
                videoPreviewLayer?.frame = self.view.layer.bounds // It may be better to set up a UIView outlet instead of using self.view
                self.view.layer.addSublayer(videoPreviewLayer!) 

                // Start video capture.
                captureSession.startRunning()

            } catch {
                // If any error occurs, simply print it out and don't continue any more.
                print(error)
                return
            }
        }
    }
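
As for the error message itself: in your original code, the { that follows AVCaptureVideoPreviewLayer(session: captureSession) is parsed by Swift as a trailing closure, so the compiler looks for an initializer that takes both a session and a closure (that is the '(session: AVCaptureSession, () -> ())' in the message), and no such initializer exists. A minimal sketch of the difference, reusing the captureSession property from the code above:

    // What the compiler saw: the brace after the call becomes a trailing closure,
    // so it looks for an initializer taking (session:, () -> ()) and finds none.
    // let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) { ... }

    // The initializer only takes the session; the rest of the setup goes in
    // separate statements, as in beginSession() above.
    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = view.layer.bounds
    view.layer.addSublayer(previewLayer)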

Hope this helps!
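
One more note: the dataOutput in the code above is added to the session but doesn't deliver frames anywhere yet. If you later want to process the video frames yourself, you would also set a sample buffer delegate on it, roughly like this (a sketch, assuming your view controller conforms to AVCaptureVideoDataOutputSampleBufferDelegate; the queue label is just an example name):

    // Inside beginSession(), after configuring dataOutput:
    dataOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "videoQueue"))

    // In the view controller, this delegate method is called for each captured frame:
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Process the CMSampleBuffer here.
    }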