How to capture a picture with AVCaptureSession in Swift?

I have a UIViewController in which I use an AVCaptureSession to show the camera feed, and it works nicely and fast. I placed a UIButton object on top of this camera view and added an IBAction for the button.

This is how it looks now:

Now I want to take a picture of the current camera view when the user taps the button:

@IBAction func takePicture(sender: AnyObject) {
    // omg, what to do?!
}

I have no idea how to do this. I imagine it might be something like this:

let captureSession = AVCaptureSession()
var myDearPicture = captureSession.takePicture() as UIImage // something like it?

The full controller code is linked here: https://gist.github.com/rodrigoalvesvieira/392d683435ee29305059, in case it helps.

AVCaptureSession example

import UIKit
import AVFoundation
class ViewController: UIViewController {
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()
    var error: NSError?
    override func viewDidLoad() {
        super.viewDidLoad()
        let devices = AVCaptureDevice.devices().filter { $0.hasMediaType(AVMediaTypeVideo) && $0.position == AVCaptureDevicePosition.Back }
        if let captureDevice = devices.first as? AVCaptureDevice  {

            captureSession.addInput(AVCaptureDeviceInput(device: captureDevice, error: &error))
            captureSession.sessionPreset = AVCaptureSessionPresetPhoto
            captureSession.startRunning()
            stillImageOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]
            if captureSession.canAddOutput(stillImageOutput) {
                captureSession.addOutput(stillImageOutput)
            }
            if let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession) {
                previewLayer.bounds = view.bounds
                previewLayer.position = CGPointMake(view.bounds.midX, view.bounds.midY)
                previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
                let cameraPreview = UIView(frame: CGRectMake(0.0, 0.0, view.bounds.size.width, view.bounds.size.height))
                cameraPreview.layer.addSublayer(previewLayer)
                cameraPreview.addGestureRecognizer(UITapGestureRecognizer(target: self, action:"saveToCamera:"))
                view.addSubview(cameraPreview)
            }
        }
    }
    func saveToCamera(sender: UITapGestureRecognizer) {
        if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
            stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
                (imageDataSampleBuffer, error) -> Void in
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
                if let image = UIImage(data: imageData) {
                    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
                }
            }
        }
    }
    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
    }
}
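
If you prefer to trigger the capture from the UIButton in the question rather than a tap gesture, the same still-image path can be reused from the IBAction. A minimal sketch, assuming the stillImageOutput property from the example above:

@IBAction func takePicture(sender: AnyObject) {
    // Same capture path as saveToCamera, but driven by the button
    if let videoConnection = stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageDataSampleBuffer, error) -> Void in
            if imageDataSampleBuffer == nil {
                return
            }
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            if let image = UIImage(data: imageData) {
                // Use the UIImage however you like, e.g. save it to the photo album
                UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
            }
        }
    }
}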

UIImagePickerController example

import UIKit

class ViewController: UIViewController, UINavigationControllerDelegate, UIImagePickerControllerDelegate {
    let imagePicker = UIImagePickerController()
    @IBOutlet weak var imageViewer: UIImageView!
    override func viewDidLoad() {
        super.viewDidLoad()
        // Do any additional setup after loading the view, typically from a nib.
    }
    override func didReceiveMemoryWarning() {
        super.didReceiveMemoryWarning()
        // Dispose of any resources that can be recreated.
    }
    func imagePickerController(picker: UIImagePickerController, didFinishPickingImage image: UIImage!, editingInfo: [NSObject : AnyObject]!) {
        dismissViewControllerAnimated(true, completion: nil)
        imageViewer.image = image
    }
    @IBAction func presentImagePicker(sender: AnyObject) {

        if UIImagePickerController.isCameraDeviceAvailable(UIImagePickerControllerCameraDevice.Front) {

            imagePicker.delegate = self
            imagePicker.sourceType = UIImagePickerControllerSourceType.Camera
            presentViewController(imagePicker, animated: true, completion: nil)

        }
    }
}

Since AVCaptureStillImageOutput is deprecated, I created another Swift example that uses AVCaptureSession with AVCapturePhotoOutput in iOS 10. Check this out.
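
For reference, a rough sketch of an AVCapturePhotoOutput version could look like the code below. The class name is illustrative, and the delegate callback shown is the iOS 11+ AVCapturePhoto form; the iOS 10 sample-buffer callback works the same way.

import UIKit
import AVFoundation

class PhotoCaptureViewController: UIViewController, AVCapturePhotoCaptureDelegate {
    let captureSession = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()

    override func viewDidLoad() {
        super.viewDidLoad()
        captureSession.sessionPreset = .photo
        // Back camera as input
        if let device = AVCaptureDevice.default(for: .video),
            let input = try? AVCaptureDeviceInput(device: device),
            captureSession.canAddInput(input) {
            captureSession.addInput(input)
        }
        // Photo output instead of the deprecated AVCaptureStillImageOutput
        if captureSession.canAddOutput(photoOutput) {
            captureSession.addOutput(photoOutput)
        }
        // Preview layer filling the view
        let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.frame = view.bounds
        previewLayer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(previewLayer)
        captureSession.startRunning()
    }

    @IBAction func takePicture(_ sender: Any) {
        // Each capture needs its own settings object
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
            let data = photo.fileDataRepresentation(),
            let image = UIImage(data: data) else { return }
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}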