SWIFT 3: Capture photo with AVCapturePhotoOutput (Need another set of eyes to look over code, why isn't this working?)

I have a custom camera with AVCapturePhotoCaptureDelegate added to the class, and I'm using the following code to capture a still image:

Outlets, variables, and constants

@IBOutlet weak var cameraPreview: UIView!
@IBOutlet weak var takePhotoPreview: UIImageView!

private var cameraView: AVCaptureVideoPreviewLayer!
private var camera: AVCaptureDevice!
private var cameraInput: AVCaptureDeviceInput!
private var cameraOutput: AVCapturePhotoOutput!
private var photoSampleBuffer: CMSampleBuffer?
private var previewPhotoSampleBuffer: CMSampleBuffer?
private var photoData: Data? = nil

private let cameraSession = AVCaptureSession()
private let photoOutput = AVCapturePhotoOutput()

Setting up the camera session

private func createCamera() {
    cameraSession.beginConfiguration()
    cameraSession.sessionPreset = AVCaptureSessionPresetPhoto
    cameraSession.automaticallyConfiguresCaptureDeviceForWideColor = true

    // Add Camera Input
    if let defaultCamera = AVCaptureDeviceDiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: AVMediaTypeVideo, position: .back).devices {
        camera = defaultCamera.first
        do {
            let cameraInput = try AVCaptureDeviceInput(device: camera)
            if cameraSession.canAddInput(cameraInput) {
                cameraSession.addInput(cameraInput)
                print("Camera input added to the session")
            }
        } catch { print("Could not add camera input to the camera session") }
    }

    // Add Camera View Input
    if let cameraView = AVCaptureVideoPreviewLayer(session: cameraSession) {
        cameraView.frame = cameraPreview.bounds
        cameraView.videoGravity = AVLayerVideoGravityResizeAspectFill
        cameraView.cornerRadius = 12.0
        cameraPreview.layer.addSublayer(cameraView)
        print("Camera view created for the camera session")
    } else { print("Could not create camera preview") }

    // Add Photo Output
    let cameraPhotoOutput = AVCapturePhotoOutput()
    if cameraSession.canAddOutput(cameraPhotoOutput) {
        cameraSession.addOutput(cameraPhotoOutput)
        cameraPhotoOutput.isHighResolutionCaptureEnabled = true
        print("Camera output added to the camera session")
    } else {
        print("Could not add camera photo output to the camera session")
        cameraSession.commitConfiguration()
        return
    }

    cameraSession.commitConfiguration()

    cameraSession.startRunning()
}

Capture button

@IBOutlet weak var cameraShutter: UIButton!
@IBAction func cameraShutter(_ sender: UIButton) {
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.flashMode = .on
    photoSettings.isHighResolutionPhotoEnabled = true
    photoSettings.isAutoStillImageStabilizationEnabled = true
    if photoSettings.availablePreviewPhotoPixelFormatTypes.count > 0 {
        photoSettings.previewPhotoFormat = [ kCVPixelBufferPixelFormatTypeKey as String : photoSettings.availablePreviewPhotoPixelFormatTypes.first!]
    }
    cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)
}

Photo capture delegate function

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {
    if let photoSampleBuffer = photoSampleBuffer {
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoSampleBuffer, previewPhotoSampleBuffer: previewPhotoSampleBuffer)
        let photoDataProvider = CGDataProvider(data: photoData as! CFData)
        let cgImagePhotoRef = CGImage(jpegDataProviderSource: photoDataProvider!, decode: nil, shouldInterpolate: true, intent: .absoluteColorimetric)
        let newPhoto = UIImage(cgImage: cgImagePhotoRef!, scale: 1.0, orientation: UIImageOrientation.right)
        self.takePhotoPreview.image = newPhoto
        self.takePhotoPreview.isHidden = false
    } else {
        print("Error capturing photo: \(error)")
        return
    }
}

OK, so here's the thing: I set a breakpoint at cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self), and after stepping into that line I get the following error message:

Error message

fatal error: unexpectedly found nil while unwrapping an Optional value

The code above comes straight from Apple's sample documentation "AVCam", plus input from SO Q&As (, , and others repeating those answers). My end goal is to capture the image and immediately push both the image and the user to a new ViewController to edit/post/save; for now, though, I'm using a UIImageView to confirm the capture... and it isn't working in the first place.

So, what's going on with this implementation??? It has been driving me crazy for days.

Swift 3, Xcode 8

Try changing

cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self) 

into

cameraOutput.capturePhoto(with: photoSettings, delegate: self as AVCapturePhotoCaptureDelegate)

OK, got it. El Tomato was looking at the right problem child, but that wasn't the right prescription. My createCamera() function is declared private, which of course makes its contents invisible outside its body. So when I called the correct AVCapturePhotoOutput(), the buffer feed that the capturePhoto() call executes against didn't exist... throwing the error described above.

So that means the line:

cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)

is correct; it just had the wrong setup behind it. To get it executing correctly I...

  • Changed my private let photoOutput = AVCapturePhotoOutput() constant to private let cameraPhotoOutput = AVCapturePhotoOutput()
  • and referenced that constant directly inside private func createCamera() (see the sketch below)

Image capture executed perfectly, immediately.
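
For anyone hitting the same wall, here is a minimal sketch of the corrected wiring, assuming the camera-input and preview-layer setup stay exactly as posted in the question:

private let cameraSession = AVCaptureSession()
private let cameraPhotoOutput = AVCapturePhotoOutput()   // one shared instance for the whole class

private func createCamera() {
    cameraSession.beginConfiguration()
    cameraSession.sessionPreset = AVCaptureSessionPresetPhoto

    // ... add the camera input and the preview layer exactly as in the question ...

    // Add the class-level constant to the session instead of a local AVCapturePhotoOutput()
    if cameraSession.canAddOutput(cameraPhotoOutput) {
        cameraSession.addOutput(cameraPhotoOutput)
        cameraPhotoOutput.isHighResolutionCaptureEnabled = true
    }

    cameraSession.commitConfiguration()
    cameraSession.startRunning()
}

@IBAction func cameraShutter(_ sender: UIButton) {
    let photoSettings = AVCapturePhotoSettings()
    photoSettings.flashMode = .on
    photoSettings.isHighResolutionPhotoEnabled = true
    // Same instance the session is feeding, so capturePhoto has a live connection
    cameraPhotoOutput.capturePhoto(with: photoSettings, delegate: self)
}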

I also tried swapping the cameraPhotoOutput = AVCapturePhotoOutput() constant for the cameraOutput: AVCapturePhotoOutput! variable, and that simply recreated the error.

In case you're interested: the cgImage creation stays the same inside the func capture(_ : capture... delegate function. Within its scope I also determine the capture device's position, change the image orientation for the front camera, and send the photo, on the main queue, to the var photoContent: UIImage? variable on ReviewViewController.
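
A rough sketch of that tail end (the "ReviewVC" storyboard identifier, the use of a navigation controller, and the .leftMirrored choice for the front camera are placeholders for my own setup):

// Inside func capture(... didFinishProcessingPhotoSampleBuffer ...), after cgImagePhotoRef is created:
let orientation: UIImageOrientation = (camera.position == .front) ? .leftMirrored : .right
let newPhoto = UIImage(cgImage: cgImagePhotoRef!, scale: 1.0, orientation: orientation)

DispatchQueue.main.async {
    // "ReviewVC" is a placeholder storyboard identifier
    if let reviewVC = self.storyboard?.instantiateViewController(withIdentifier: "ReviewVC") as? ReviewViewController {
        reviewVC.photoContent = newPhoto
        self.navigationController?.pushViewController(reviewVC, animated: true)
    }
}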

Hopefully my mental mistake helps someone else :-)