Audio Not Recording in AVCaptureSession
I have an app that starts capturing video and audio as soon as the view loads and, when recording finishes, saves the movie to the app's Documents folder as well as to the iPad's camera roll. I have set up and added both the audio and the video inputs to the session, but when I go to watch the saved video there is no audio. Can anyone spot anything in my code that would point to the problem?
Update: No error message is ever shown. However, I have found a common factor: audio is recorded, but only if the recording is 10 seconds or shorter. Once it reaches 11 seconds, no audio is recorded.
The NSLog shows
Finished with error: (null)
-(void)viewWillAppear:(BOOL)animated {
    NSDate *today = [NSDate date];
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init];
    [dateFormatter setDateFormat:@"MMM d hh:mm:ss a"];
    // display in 12HR/24HR (i.e. 11:25PM or 23:25) format according to User Settings
    NSString *currentTime = [dateFormatter stringFromDate:today];

    NSError *error4 = nil;
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryAmbient error:&error4];

    OSStatus propertySetError = 0;
    UInt32 allowMixing = true;
    propertySetError |= AudioSessionSetProperty(kAudioSessionProperty_OtherMixableAudioShouldDuck, sizeof(allowMixing), &allowMixing);

    // Activate the audio session
    error4 = nil;
    if (![audioSession setActive:YES error:&error4]) {
        NSLog(@"AVAudioSession setActive:YES failed: %@", [error4 localizedDescription]);
    }

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectoryPath = [paths objectAtIndex:0];

    session = [[AVCaptureSession alloc] init];
    [session beginConfiguration];
    session.sessionPreset = AVCaptureSessionPresetMedium;
    self.navigationController.navigationBarHidden = YES;

    NSError *error = nil;
    AVCaptureDevice *audioCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    NSError *error2 = nil;
    AVCaptureDeviceInput *audioInput = [AVCaptureDeviceInput deviceInputWithDevice:audioCaptureDevice error:&error2];

    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionBack;
    // find the front facing camera
    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];

    AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];

    NSString *archives = [documentsDirectoryPath stringByAppendingPathComponent:@"archives"];
    NSString *editedfilename = [[@"ComeOnDown" lastPathComponent] stringByDeletingPathExtension];
    NSString *datestring = [[editedfilename stringByAppendingString:@" "] stringByAppendingString:currentTime];
    NSLog(@"%@", datestring);
    NSString *outputpathofmovie = [[archives stringByAppendingPathComponent:datestring] stringByAppendingString:@".mp4"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputpathofmovie];

    [session addInput:audioInput];
    [session addInput:deviceInput];
    [session addOutput:movieFileOutput];
    [session commitConfiguration];
    [session startRunning];

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    previewLayer.connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight;
    CALayer *rootLayer = [vImagePreview layer];
    [rootLayer setMasksToBounds:YES];
    [previewLayer setFrame:[rootLayer bounds]];
    [rootLayer addSublayer:previewLayer];

    [movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];
    //session = nil;

    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
    }

    [super viewWillAppear:YES];
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections {
}

-(void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished with error: %@", error);
}

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error {
    //finished
    NSLog(@"Finished");
    NSString *proud = [[NSString alloc] initWithString:[outputFileURL path]];
    UISaveVideoAtPathToSavedPhotosAlbum(proud, self, @selector(video:didFinishSavingWithError:contextInfo:), (__bridge void *)(proud));
}
The answer was movieFileOutput.movieFragmentInterval = kCMTimeInvalid;
Apparently it defaults to 10 seconds, and anything recorded past that point has no audio. Referenced from AVCaptureSession audio doesn't work for long videos.
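In the question's Objective-C code that is a one-line addition to the AVCaptureMovieFileOutput setup; a minimal sketch, reusing the question's movieFileOutput variable:

AVCaptureMovieFileOutput *movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
// Disable movie fragment writing. With the default fragment interval (about 10 seconds),
// recordings that run past the first fragment can end up without audio.
movieFileOutput.movieFragmentInterval = kCMTimeInvalid;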
Once I figured out the problem was time-related, it was easy to find the answer.
videoFileOutput.movieFragmentInterval = kCMTimeInvalid
fixed it for me.
However, I had accidentally set movieFragmentInterval after calling startRecordingToOutputFileURL. After a painful hour I realized my mistake. For newbies like me: note the (seemingly obvious) order of these calls.
videoFileOutput.movieFragmentInterval = kCMTimeInvalid
videoFileOutput.startRecordingToOutputFileURL(filePath, recordingDelegate: recordingDelegate)
Swift 4.2
movieFileOutput.movieFragmentInterval = CMTime.invalid
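For anyone following along in Objective-C (the question's language), the same ordering, sketched with the question's movieFileOutput and outputURL variables:

// Set the fragment interval first...
movieFileOutput.movieFragmentInterval = kCMTimeInvalid;
// ...then start recording. As noted above, setting it after
// startRecordingToOutputFileURL did not take effect.
[movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];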