Switch between front and back camera while recording a video
I have created a custom camera, and I'm trying to add a feature that lets the user switch between the front and back camera while recording a video. My current approach is to stop the recording and start a new one when they switch cameras, but that cuts off a small part of the video and I'm not sure why. How can I make it work like Snapchat, so that the full video is captured and nothing is cut off when they switch cameras? Here is my code so far:
@objc func switchCameraInput() {
    self.captureSession.beginConfiguration()

    var existingConnection: AVCaptureDeviceInput!
    for connection in self.captureSession.inputs {
        let input = connection as! AVCaptureDeviceInput
        if input.device.hasMediaType(AVMediaType.video) {
            existingConnection = input
        }
    }
    self.captureSession.removeInput(existingConnection)
    turnFlashOff()

    var newCamera: AVCaptureDevice!
    if let oldCamera = existingConnection {
        if oldCamera.device.position == .back {
            newCamera = self.cameraWithPosition(position: .front)
        } else {
            newCamera = self.cameraWithPosition(position: .back)
        }
    }

    var newInput: AVCaptureDeviceInput!
    do {
        newInput = try AVCaptureDeviceInput(device: newCamera)
        self.captureSession.addInput(newInput)
    } catch {
        ProgressHUD.showError(error.localizedDescription)
    }
    self.captureSession.commitConfiguration()

    // This is where I handle switching while recording
    if self.movieFileOutput.isRecording {
        hasSwappedCamera = true
        turnFlashOff()
        //self.movieFileOutput.stopRecording()
        self.movieFileOutput.connection(with: AVMediaType.video)?.videoOrientation = self.videoOrientation()
        self.movieFileOutput.maxRecordedDuration = self.maxRecordedDuration()
        self.movieFileOutput.startRecording(to: URL(fileURLWithPath: self.videoFileLocation()), recordingDelegate: self)
        turnOnFlash()
    }
}
Because I think this Objective-C code will help answer your question, and you prefer Swift, I have "translated" all of it below.

Note that I have not compiled this, and I know a few things won't compile out of the box. Enum values such as AVMediaTypeVideo are usually just .video in Swift. Also, I'm fairly sure the answer has some incorrect code, mainly where the isFrontRecording and isBackRecording booleans are set back to false. I think those resets should happen inside the completionHandler, but as mentioned, I haven't compiled it, so take it with a grain of salt. I've included all of the code from that question (Objective-C) along with my quick-and-dirty translation to Swift.

I hope this helps, though :)
Objective-C:
/* Front camera settings */
@property bool isFrontRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputBack;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputBack;
@property (strong, nonatomic) AVCaptureSession *sessionBack;
/* Back camera settings */
@property bool isBackRecording;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInputFront;
@property (strong, nonatomic) AVCaptureStillImageOutput *imageOutputFront;
@property (strong, nonatomic) AVCaptureSession *sessionFront;
Swift:
/* Front camera settings */
var isFrontRecording = false
var videoInputBack: AVCaptureDeviceInput!
var imageOutputBack: AVCaptureStillImageOutput!
var sessionBack: AVCaptureSession!

/* Back camera settings */
var isBackRecording = false
var videoInputFront: AVCaptureDeviceInput!
var imageOutputFront: AVCaptureStillImageOutput!
var sessionFront: AVCaptureSession!
Objective-C
- (void)viewDidLoad {
    [super viewDidLoad];

    [self setupBackAVCapture];

    self.isFrontRecording = NO;
    self.isBackRecording = NO;
}

- (void)setupBackAVCapture
{
    NSError *error = nil;

    self.sessionBack = [[AVCaptureSession alloc] init];
    self.sessionBack.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    self.videoInputBack = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionBack addInput:self.videoInputBack];

    self.imageOutputBack = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionBack addOutput:self.imageOutputBack];
}
Swift:
override func viewDidLoad() {
    super.viewDidLoad()

    setupBackAVCapture()

    isFrontRecording = false
    isBackRecording = false
}

func setupBackAVCapture() {
    sessionBack = AVCaptureSession()
    sessionBack.sessionPreset = .photo

    guard let camera = AVCaptureDevice.default(for: .video),
          let input = try? AVCaptureDeviceInput(device: camera) else {
        return
    }

    videoInputBack = input
    sessionBack.addInput(videoInputBack)

    imageOutputBack = AVCaptureStillImageOutput()
    sessionBack.addOutput(imageOutputBack)
}
Objective-C:
- (IBAction)buttonCapture:(id)sender {
    [self takeBackPhoto];
}

- (void)takeBackPhoto
{
    [self.sessionBack startRunning];

    if (!self.isFrontRecording) {
        self.isFrontRecording = YES;

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

        AVCaptureConnection *videoConnection = [self.imageOutputBack connectionWithMediaType:AVMediaTypeVideo];
        if (videoConnection == nil) {
            return;
        }

        [self.imageOutputBack
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
             [self.imageView setImage:image];
             [self.sessionBack stopRunning];

             // Set up front camera setting and capture photo.
             [self setupFrontAVCapture];
             [self takeFrontPhoto];
         }];

        self.isFrontRecording = NO;
    }
}
Swift:
@IBAction func buttonCapture(_ sender: Any) {
    takeBackPhoto()
}

func takeBackPhoto() {
    sessionBack.startRunning()

    if !isFrontRecording {
        isFrontRecording = true

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

        guard let videoConnection = imageOutputBack.connection(with: .video) else {
            return
        }

        imageOutputBack.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
            guard let imageDataSampleBuffer = imageDataSampleBuffer,
                  let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
                  let image = UIImage(data: imageData) else {
                return
            }

            UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
            self.imageView.image = image
            self.sessionBack.stopRunning()

            // Set up front camera setting and capture photo.
            self.setupFrontAVCapture()
            self.takeFrontPhoto()
        }

        isFrontRecording = false
    }
}
Objective-C
- (void)setupFrontAVCapture
{
    NSError *error = nil;

    self.sessionFront = [[AVCaptureSession alloc] init];
    self.sessionFront.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    camera = [self cameraWithPosition:AVCaptureDevicePositionFront];

    self.videoInputFront = [[AVCaptureDeviceInput alloc] initWithDevice:camera error:&error];
    [self.sessionFront addInput:self.videoInputFront];

    self.imageOutputFront = [[AVCaptureStillImageOutput alloc] init];
    [self.sessionFront addOutput:self.imageOutputFront];
}

- (void)takeFrontPhoto
{
    [self.sessionFront startRunning];

    if (!self.isBackRecording) {
        self.isBackRecording = YES;

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);

        AVCaptureConnection *videoConnection = [self.imageOutputFront connectionWithMediaType:AVMediaTypeVideo];
        if (videoConnection == nil) {
            return;
        }

        [self.imageOutputFront
         captureStillImageAsynchronouslyFromConnection:videoConnection
         completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
             if (imageDataSampleBuffer == NULL) {
                 return;
             }

             NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
             UIImage *image = [[UIImage alloc] initWithData:imageData];

             UIImageWriteToSavedPhotosAlbum(image, self, nil, nil);
             [self.imageViewBack setImage:image];
             [self.sessionFront stopRunning];
         }];

        self.isBackRecording = NO;
    }
}
Swift:
func setupFrontAVCapture() {
    sessionFront = AVCaptureSession()
    sessionFront.sessionPreset = .photo

    guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front),
          let input = try? AVCaptureDeviceInput(device: camera) else {
        return
    }

    videoInputFront = input
    sessionFront.addInput(videoInputFront)

    imageOutputFront = AVCaptureStillImageOutput()
    sessionFront.addOutput(imageOutputFront)
}

func takeFrontPhoto() {
    sessionFront.startRunning()

    if !isBackRecording {
        isBackRecording = true

        AudioServicesPlaySystemSound(kSystemSoundID_Vibrate)

        guard let videoConnection = imageOutputFront.connection(with: .video) else {
            return
        }

        imageOutputFront.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
            guard let imageDataSampleBuffer = imageDataSampleBuffer,
                  let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
                  let image = UIImage(data: imageData) else {
                return
            }

            UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
            self.imageViewBack.image = image
            self.sessionFront.stopRunning()
        }

        isBackRecording = false
    }
}
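For what it's worth, my guess at what that boolean fix would look like in takeBackPhoto (my own untested assumption, not part of the code above) is to reset the flag inside the closure rather than right after scheduling the capture:

imageOutputBack.captureStillImageAsynchronously(from: videoConnection) { imageDataSampleBuffer, error in
    // Assumed fix: reset the flag once the asynchronous capture has finished
    // (even on the early-return path), not immediately after scheduling it.
    defer { self.isFrontRecording = false }

    guard let imageDataSampleBuffer = imageDataSampleBuffer,
          let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer),
          let image = UIImage(data: imageData) else {
        return
    }

    UIImageWriteToSavedPhotosAlbum(image, self, nil, nil)
    self.imageView.image = image
    self.sessionBack.stopRunning()

    self.setupFrontAVCapture()
    self.takeFrontPhoto()
}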
Good luck getting the conversion working in your project!
For this exact same problem I found a proper solution: instead of switching the input on a single capture session with a single output, you have to create a session for each input (camera) and then switch the output between them.
You can find more details here:
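As a rough sketch of that idea (my own illustration, untested; the class and property names such as DualSessionSwitcher, backSession, frontSession, and movieOutput are assumptions, not code from the linked answer, and preview layers, audio, and error handling are omitted):

import AVFoundation

// Minimal sketch of the "one session per camera" idea described above.
// All names here are my own placeholders, not code from the linked answer.
final class DualSessionSwitcher {
    let backSession = AVCaptureSession()
    let frontSession = AVCaptureSession()
    let movieOutput = AVCaptureMovieFileOutput()

    func configure() throws {
        // Each session owns exactly one camera input.
        if let back = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) {
            backSession.addInput(try AVCaptureDeviceInput(device: back))
        }
        if let front = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .front) {
            frontSession.addInput(try AVCaptureDeviceInput(device: front))
        }
        // The shared movie output starts out attached to the back session.
        if backSession.canAddOutput(movieOutput) {
            backSession.addOutput(movieOutput)
        }
    }

    // When the user flips the camera, move the shared output to the other
    // session instead of removing and re-adding inputs on a single session.
    func moveOutput(from old: AVCaptureSession, to new: AVCaptureSession) {
        old.beginConfiguration()
        old.removeOutput(movieOutput)
        old.commitConfiguration()

        new.beginConfiguration()
        if new.canAddOutput(movieOutput) {
            new.addOutput(movieOutput)
        }
        new.commitConfiguration()
    }
}

The point is that each camera keeps its own fully configured session, so a switch only moves the output instead of tearing down and rebuilding the input mid-recording.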