iOS: Swift: Video Screen Capture
What is the screen device called in iOS/Swift?
When I print the devices I get
(
"<AVCaptureFigVideoDevice: 0x134d0f210 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>",
"<AVCaptureFigVideoDevice: 0x134e0af80 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>",
"<AVCaptureFigAudioDevice: 0x174265440 [iPad Microphone][com.apple.avfoundation.avcapturedevice.built-in_audio:0]>"
)
Where is the screen ID?
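(A listing like the one above typically comes from enumerating the capture devices, e.g. with something along these lines:)
// AVCaptureDevice enumeration on an iPad returns only cameras and the microphone;
// iOS does not expose a screen device in this list.
let devices = AVCaptureDevice.devices()
println(devices)   // println in the Swift of that era; print(_:) in later Swift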
There is too much outdated Objective-C code out there, and Swift is a moving target. I am looking for a Swift solution to capture video from my iPad's screen and audio from the built-in microphone. Audio will be a separate question.
Here is a screen grabber for OS X:
https://github.com/kennyledet/SwiftCap
// AVCaptureSession holds inputs and outputs for real-time capture
let mSession = AVCaptureSession()
let mScreenCapOutput = AVCaptureMovieFileOutput()
var mOutputPath = ""
// Just capture main display for now
let mMainDisplayId = CGMainDisplayID()
But I cannot find a CGMainDisplayID-style display ID for the iPad anywhere in the documentation...
Here is the typical Swift solution for the camera:
https://github.com/bradley/iOSSwiftSimpleAVCamera
but it has too many errors to compile against iOS 8.1 or 8.2, and it cannot grab video from the camera either.
func addVideoOutput() {
var rgbOutputSettings: NSDictionary = NSDictionary(object: Int(CInt(kCIFormatRGBA8)), forKey: kCVPixelBufferPixelFormatTypeKey)
self.videoDeviceOutput = AVCaptureVideoDataOutput()
self.videoDeviceOutput.alwaysDiscardsLateVideoFrames = true
self.videoDeviceOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
if self.session.canAddOutput(self.videoDeviceOutput) {
self.session.addOutput(self.videoDeviceOutput)
}
}
Apple gives this Objective-C solution:
/*
* Create video connection
*/
AVCaptureDeviceInput *videoIn = [[AVCaptureDeviceInput alloc] initWithDevice:[self videoDeviceWithPosition:AVCaptureDevicePositionBack] error:nil];
if ([_captureSession canAddInput:videoIn])
[_captureSession addInput:videoIn];
AVCaptureVideoDataOutput *videoOut = [[AVCaptureVideoDataOutput alloc] init];
[videoOut setAlwaysDiscardsLateVideoFrames:YES];
[videoOut setVideoSettings:@{(id)kCVPixelBufferPixelFormatTypeKey : [NSNumber numberWithInt:kCVPixelFormatType_32BGRA]}];
dispatch_queue_t videoCaptureQueue = dispatch_queue_create("Video Capture Queue", DISPATCH_QUEUE_SERIAL);
[videoOut setSampleBufferDelegate:self queue:videoCaptureQueue];
if ([_captureSession canAddOutput:videoOut])
[_captureSession addOutput:videoOut];
_videoConnection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
self.videoOrientation = _videoConnection.videoOrientation;
if([self.session canSetSessionPreset:AVCaptureSessionPreset640x480])
[self.session setSessionPreset:AVCaptureSessionPreset640x480]; // Lower video resolution to decrease recorded movie size
return YES;
}
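For reference, a rough Swift equivalent of that Objective-C snippet might look like the sketch below, written in Swift 3 syntax. It is untested, and setupVideoConnection(on:delegate:) is an illustrative helper name rather than anything from Apple's sample:
import AVFoundation

// Rough Swift 3 equivalent of Apple's Objective-C snippet above (sketch only).
func setupVideoConnection(on session: AVCaptureSession,
                          delegate: AVCaptureVideoDataOutputSampleBufferDelegate) -> Bool {
    // Apple's sample picks the back camera via a helper; the default video device is used here.
    guard let videoDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeVideo),
          let videoIn = try? AVCaptureDeviceInput(device: videoDevice) else {
        return false
    }
    if session.canAddInput(videoIn) {
        session.addInput(videoIn)
    }

    let videoOut = AVCaptureVideoDataOutput()
    videoOut.alwaysDiscardsLateVideoFrames = true
    // Same 32BGRA pixel format as the Objective-C version.
    videoOut.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
    let videoCaptureQueue = DispatchQueue(label: "Video Capture Queue")
    videoOut.setSampleBufferDelegate(delegate, queue: videoCaptureQueue)
    if session.canAddOutput(videoOut) {
        session.addOutput(videoOut)
    }

    // Lower the video resolution to decrease the recorded movie size.
    if session.canSetSessionPreset(AVCaptureSessionPreset640x480) {
        session.sessionPreset = AVCaptureSessionPreset640x480
    }
    return true
}
Note that this still records from the camera, not the screen: AVFoundation on iOS does not expose a screen capture device the way CGMainDisplayID() does on OS X.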
This should be easy.....???
Here is a working copy of iOSSwiftSimpleAVCamera in Swift. It does not completely solve your problem, but it is a starting point for anyone else looking at this thread. Some of the error checking has been removed from this code, so be aware that it will only run on an actual device, not in the simulator.
AppDelegate
//
// AppDelegate.swift
// iOSSwiftSimpleAVCamera
//
// Created by Bradley Griffith on 7/1/14.
// Copyright (c) 2014 Bradley Griffith. All rights reserved.
//
import UIKit
import CoreData
@UIApplicationMain
class AppDelegate: UIResponder, UIApplicationDelegate {
var window: UIWindow?
func applicationDidFinishLaunching(application: UIApplication) {
}
func applicationWillResignActive(application: UIApplication) {
// Sent when the application is about to move from active to inactive state. This can occur for certain types of temporary interruptions (such as an incoming phone call or SMS message) or when the user quits the application and it begins the transition to the background state.
// Use this method to pause ongoing tasks, disable timers, and throttle down OpenGL ES frame rates. Games should use this method to pause the game.
}
func applicationDidEnterBackground(application: UIApplication) {
// Use this method to release shared resources, save user data, invalidate timers, and store enough application state information to restore your application to its current state in case it is terminated later.
// If your application supports background execution, this method is called instead of applicationWillTerminate: when the user quits.
}
func applicationWillEnterForeground(application: UIApplication) {
// Called as part of the transition from the background to the inactive state; here you can undo many of the changes made on entering the background.
}
func applicationDidBecomeActive(application: UIApplication) {
// Restart any tasks that were paused (or not yet started) while the application was inactive. If the application was previously in the background, optionally refresh the user interface.
}
func applicationWillTerminate(application: UIApplication) {
// Called when the application is about to terminate. Save data if appropriate. See also applicationDidEnterBackground:.
// Saves changes in the application's managed object context before the application terminates.
self.saveContext()
}
func saveContext () {
var error: NSError? = nil
let managedObjectContext = self.managedObjectContext
if managedObjectContext.hasChanges && !managedObjectContext.save(&error) {
// Replace this implementation with code to handle the error appropriately.
// abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development.
//println("Unresolved error \(error), \(error.userInfo)")
abort()
// &&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&}
}
}
// MARK: - Core Data stack
// Returns the managed object context for the application.
// If the context doesn't already exist, it is created and bound to the persistent store coordinator for the application.
var managedObjectContext: NSManagedObjectContext {
if _managedObjectContext == nil {
let coordinator = self.persistentStoreCoordinator
_managedObjectContext = NSManagedObjectContext()
_managedObjectContext!.persistentStoreCoordinator = coordinator
}
return _managedObjectContext!
}
var _managedObjectContext: NSManagedObjectContext? = nil
// Returns the managed object model for the application.
// If the model doesn't already exist, it is created from the application's model.
var managedObjectModel: NSManagedObjectModel {
if _managedObjectModel == nil {
let modelURL = NSBundle.mainBundle().URLForResource("iOSSwiftSimpleAVCamera", withExtension: "momd")
_managedObjectModel = NSManagedObjectModel(contentsOfURL: modelURL!)
}
return _managedObjectModel!
}
var _managedObjectModel: NSManagedObjectModel? = nil
// Returns the persistent store coordinator for the application.
// If the coordinator doesn't already exist, it is created and the application's store added to it.
var persistentStoreCoordinator: NSPersistentStoreCoordinator {
if _persistentStoreCoordinator == nil {
let storeURL = self.applicationDocumentsDirectory.URLByAppendingPathComponent("iOSSwiftSimpleAVCamera.sqlite")
var error: NSError? = nil
_persistentStoreCoordinator = NSPersistentStoreCoordinator(managedObjectModel: self.managedObjectModel)
if _persistentStoreCoordinator!.addPersistentStoreWithType(NSSQLiteStoreType, configuration: nil, URL: storeURL, options: nil, error: &error) == nil {
/*
Replace this implementation with code to handle the error appropriately.
abort() causes the application to generate a crash log and terminate. You should not use this function in a shipping application, although it may be useful during development.
Typical reasons for an error here include:
* The persistent store is not accessible;
* The schema for the persistent store is incompatible with current managed object model.
Check the error message to determine what the actual problem was.
If the persistent store is not accessible, there is typically something wrong with the file path. Often, a file URL is pointing into the application's resources directory instead of a writeable directory.
If you encounter schema incompatibility errors during development, you can reduce their frequency by:
* Simply deleting the existing store:
NSFileManager.defaultManager().removeItemAtURL(storeURL, error: nil)
* Performing automatic lightweight migration by passing the following dictionary as the options parameter:
[NSMigratePersistentStoresAutomaticallyOption: true, NSInferMappingModelAutomaticallyOption: true]
Lightweight migration will only work for a limited set of schema changes; consult "Core Data Model Versioning and Data Migration Programming Guide" for details.
*/
//println("Unresolved error \(error), \(error.userInfo)")
abort()
}
}
return _persistentStoreCoordinator!
}
var _persistentStoreCoordinator: NSPersistentStoreCoordinator? = nil
// MARK: - Application's Documents directory
// Returns the URL to the application's Documents directory.
var applicationDocumentsDirectory: NSURL {
let urls = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)
return urls[urls.endIndex-1] as! NSURL
}
}
CameraSessionController
//
// CameraSessionController.swift
// iOSSwiftSimpleAVCamera
//
// Created by Bradley Griffith on 7/1/14.
// Copyright (c) 2014 Bradley Griffith. All rights reserved.
//
import UIKit
import AVFoundation
import CoreMedia
import CoreImage
@objc protocol CameraSessionControllerDelegate {
optional func cameraSessionDidOutputSampleBuffer(sampleBuffer: CMSampleBuffer!)
}
class CameraSessionController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
var session: AVCaptureSession!
var sessionQueue: dispatch_queue_t!
var videoDeviceInput: AVCaptureDeviceInput!
var videoDeviceOutput: AVCaptureVideoDataOutput!
var stillImageOutput: AVCaptureStillImageOutput!
var runtimeErrorHandlingObserver: AnyObject?
var sessionDelegate: CameraSessionControllerDelegate?
/* Class Methods
------------------------------------------*/
class func deviceWithMediaType(mediaType: NSString, position: AVCaptureDevicePosition) -> AVCaptureDevice {
var devices: NSArray = AVCaptureDevice.devicesWithMediaType(mediaType as String)
var captureDevice: AVCaptureDevice = devices.firstObject as! AVCaptureDevice
for object:AnyObject in devices {
let device = object as! AVCaptureDevice
if (device.position == position) {
captureDevice = device
break
}
}
return captureDevice
}
/* Lifecycle
------------------------------------------*/
override init() {
super.init();
self.session = AVCaptureSession()
self.authorizeCamera();
self.sessionQueue = dispatch_queue_create("CameraSessionController Session", DISPATCH_QUEUE_SERIAL)
dispatch_async(self.sessionQueue, {
self.session.beginConfiguration()
self.addVideoInput()
self.addVideoOutput()
self.addStillImageOutput()
self.session.commitConfiguration()
})
}
/* Instance Methods
------------------------------------------*/
func authorizeCamera() {
AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo, completionHandler: {
(granted: Bool) -> Void in
// If permission hasn't been granted, notify the user.
if !granted {
dispatch_async(dispatch_get_main_queue(), {
UIAlertView(
title: "Could not use camera!",
message: "This application does not have permission to use camera. Please update your privacy settings.",
delegate: self,
cancelButtonTitle: "OK").show()
})
}
});
}
func addVideoInput() -> Bool {
var success: Bool = false
var error: NSError?
var videoDevice: AVCaptureDevice = CameraSessionController.deviceWithMediaType(AVMediaTypeVideo, position: AVCaptureDevicePosition.Back)
self.videoDeviceInput = AVCaptureDeviceInput.deviceInputWithDevice(videoDevice, error: &error) as! AVCaptureDeviceInput;
if error == nil {
if self.session.canAddInput(self.videoDeviceInput) {
self.session.addInput(self.videoDeviceInput)
success = true
}
}
return success
}
func addVideoOutput() {
//&&&&&&&&&&&&&&&&&&&&&var rgbOutputSettings: NSDictionary = NSDictionary(object: Int(CInt(kCIFormatRGBA8)), forKey: kCVPixelBufferPixelFormatTypeKey)
self.videoDeviceOutput = AVCaptureVideoDataOutput()
self.videoDeviceOutput.alwaysDiscardsLateVideoFrames = true
self.videoDeviceOutput.setSampleBufferDelegate(self, queue: self.sessionQueue)
if self.session.canAddOutput(self.videoDeviceOutput) {
self.session.addOutput(self.videoDeviceOutput)
}
}
func addStillImageOutput() {
self.stillImageOutput = AVCaptureStillImageOutput()
self.stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
if self.session.canAddOutput(self.stillImageOutput) {
self.session.addOutput(self.stillImageOutput)
}
}
func startCamera() {
dispatch_async(self.sessionQueue, {
// Observe runtime errors from the capture session itself (not the dispatch queue).
weak var weakSelf: CameraSessionController? = self
self.runtimeErrorHandlingObserver = NSNotificationCenter.defaultCenter().addObserverForName(AVCaptureSessionRuntimeErrorNotification, object: self.session, queue: nil, usingBlock: {
(note: NSNotification!) -> Void in
let strongSelf: CameraSessionController = weakSelf!
dispatch_async(strongSelf.sessionQueue, {
strongSelf.session.startRunning()
})
})
self.session.startRunning()
})
}
func teardownCamera() {
dispatch_async(self.sessionQueue, {
self.session.stopRunning()
NSNotificationCenter.defaultCenter().removeObserver(self.runtimeErrorHandlingObserver!)
})
}
func focusAndExposeAtPoint(point: CGPoint) {
dispatch_async(self.sessionQueue, {
var device: AVCaptureDevice = self.videoDeviceInput.device
var error: NSError?
if device.lockForConfiguration(&error) {
if device.focusPointOfInterestSupported && device.isFocusModeSupported(AVCaptureFocusMode.AutoFocus) {
device.focusPointOfInterest = point
device.focusMode = AVCaptureFocusMode.AutoFocus
}
if device.exposurePointOfInterestSupported && device.isExposureModeSupported(AVCaptureExposureMode.AutoExpose) {
device.exposurePointOfInterest = point
device.exposureMode = AVCaptureExposureMode.AutoExpose
}
device.unlockForConfiguration()
}
else {
// TODO: Log error.
}
})
}
func captureImage(completion: ((image: UIImage?, error: NSError?) -> Void)?) {
// Nothing to do without a completion handler or a configured still image output.
if completion == nil || self.stillImageOutput == nil {
return
}
dispatch_async(self.sessionQueue, {
self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo), completionHandler: {
(imageDataSampleBuffer: CMSampleBuffer?, error: NSError?) -> Void in
if error != nil || imageDataSampleBuffer == nil {
completion!(image: nil, error: error)
}
else {
let imageData: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer!)
let image: UIImage? = UIImage(data: imageData)
completion!(image: image, error: nil)
}
})
})
}
/* AVCaptureVideoDataOutput Delegate
------------------------------------------*/
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
self.sessionDelegate?.cameraSessionDidOutputSampleBuffer?(sampleBuffer)
}
}
CameraViewController
//
// CameraViewController.swift
// iOSSwiftSimpleAVCamera
//
// Created by Bradley Griffith on 7/1/14.
// Copyright (c) 2014 Bradley Griffith. All rights reserved.
//
import UIKit
import CoreMedia
import AVFoundation
class CameraViewController: UIViewController, CameraSessionControllerDelegate {
var cameraSessionController: CameraSessionController!
var previewLayer: AVCaptureVideoPreviewLayer!
/* Lifecycle
------------------------------------------*/
override func viewDidLoad() {
super.viewDidLoad()
self.cameraSessionController = CameraSessionController()
self.cameraSessionController.sessionDelegate = self
self.setupPreviewLayer()
}
override func viewWillAppear(animated: Bool) {
super.viewWillAppear(animated)
self.cameraSessionController.startCamera()
}
override func viewWillDisappear(animated: Bool) {
super.viewWillDisappear(animated)
self.cameraSessionController.teardownCamera()
}
/* Instance Methods
------------------------------------------*/
func setupPreviewLayer() {
var minSize = min(self.view.bounds.size.width, self.view.bounds.size.height)
var bounds: CGRect = CGRectMake(0.0, 0.0, minSize, minSize)
self.previewLayer = AVCaptureVideoPreviewLayer(session: self.cameraSessionController.session)
self.previewLayer.bounds = bounds
self.previewLayer.position = CGPointMake(CGRectGetMidX(self.view.bounds), CGRectGetMidY(self.view.bounds))
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
self.view.layer.addSublayer(self.previewLayer)
}
func cameraSessionDidOutputSampleBuffer(sampleBuffer: CMSampleBuffer!) {
// Any frame processing could be done here.
}
}
If you want to capture the screen and save the result, another option is to take screenshots first and then convert the array of images into a video. It is not very efficient from a performance point of view, and you probably will not get 30-60 fps, but if 5-20 fps is fine for you, you may want to look at this example for swift3.
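A minimal sketch of that screenshot-then-encode idea, in Swift 3 syntax. The helper names snapshot(of:) and writeVideo(from:size:fps:to:) are made up for illustration, error handling is kept to a minimum, and this is a sketch rather than a drop-in implementation:
import UIKit
import AVFoundation

// Grab one frame of a view as a UIImage (the "screenshot" part).
func snapshot(of view: UIView) -> UIImage? {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, true, 0)
    view.drawHierarchy(in: view.bounds, afterScreenUpdates: false)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return image
}

// Convert an array of equally sized images into an H.264 movie at a fixed frame rate.
func writeVideo(from images: [UIImage], size: CGSize, fps: Int32, to url: URL) throws {
    let writer = try AVAssetWriter(outputURL: url, fileType: AVFileTypeQuickTimeMovie)
    let settings: [String: Any] = [AVVideoCodecKey: AVVideoCodecH264,
                                   AVVideoWidthKey: Int(size.width),
                                   AVVideoHeightKey: Int(size.height)]
    let input = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: settings)
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32ARGB)])
    writer.add(input)
    writer.startWriting()
    writer.startSession(atSourceTime: kCMTimeZero)

    for (index, image) in images.enumerated() {
        // A real implementation would use requestMediaDataWhenReady instead of polling.
        while !input.isReadyForMoreMediaData { Thread.sleep(forTimeInterval: 0.01) }

        var pixelBufferOut: CVPixelBuffer?
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(size.width), Int(size.height),
                                         kCVPixelFormatType_32ARGB, nil, &pixelBufferOut)
        guard status == kCVReturnSuccess, let buffer = pixelBufferOut, let cgImage = image.cgImage else { continue }

        // Draw the screenshot into the pixel buffer.
        CVPixelBufferLockBaseAddress(buffer, [])
        let context = CGContext(data: CVPixelBufferGetBaseAddress(buffer),
                                width: Int(size.width), height: Int(size.height),
                                bitsPerComponent: 8,
                                bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                                space: CGColorSpaceCreateDeviceRGB(),
                                bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        context?.draw(cgImage, in: CGRect(origin: .zero, size: size))
        CVPixelBufferUnlockBaseAddress(buffer, [])

        // One frame every 1/fps seconds.
        adaptor.append(buffer, withPresentationTime: CMTimeMake(Int64(index), fps))
    }

    input.markAsFinished()
    writer.finishWriting { /* finishes asynchronously; hook completion here */ }
}
Collecting snapshots from an NSTimer or CADisplayLink at whatever rate the device can sustain, and then handing the array to writeVideo(from:size:fps:to:), is what keeps this approach in the 5-20 fps range mentioned above.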