Composing multiple videos causes hang
I'm working on an app that combines multiple video clips recorded by the user. The clips are recorded on the camera and overlaid with another video, and the recorded clips are then combined into one long clip. The length of each clip is determined by the overlay video file.
I'm using AVAssetExportSession and exportAsynchronouslyWithCompletionHandler. The strange thing is that this works for some clips and not for others. The real problem is that the exporter doesn't report any error or failure; it just sits at zero progress and never calls the completion handler.
I don't even know where to start looking for the problem. Here is the function I use to stitch the clips together:
- (void) setupAndStitchVideos:(NSMutableArray*)videoData
{
    // Filepath to where the final generated video is stored
    NSURL * exportUrl = nil;
    // Contains information about a single asset/track
    NSDictionary * assetOptions = nil;
    AVURLAsset * currVideoAsset = nil;
    AVURLAsset * currAudioAsset = nil;
    AVAssetTrack * currVideoTrack = nil;
    AVAssetTrack * currAudioTrack = nil;
    // Contains all tracks and time ranges used to build the final composition
    NSMutableArray * allVideoTracks = nil;
    NSMutableArray * allVideoRanges = nil;
    NSMutableArray * allAudioTracks = nil;
    NSMutableArray * allAudioRanges = nil;
    AVMutableCompositionTrack * videoTracks = nil;
    AVMutableCompositionTrack * audioTracks = nil;
    // Misc time values used when calculating a clips start time and total length
    float animationLength = 0.0f;
    float clipLength = 0.0f;
    float startTime = 0.0f;
    CMTime clipStart = kCMTimeZero;
    CMTime clipDuration = kCMTimeZero;
    CMTimeRange currRange = kCMTimeRangeZero;
    // The final composition to be generated and exported
    AVMutableComposition * finalComposition = nil;

    // Cancel any already active exports
    if (m_activeExport)
    {
        [m_activeExport cancelExport];
        m_activeExport = nil;
    }

    // Initialize and setup all composition related member variables
    allVideoTracks = [[NSMutableArray alloc] init];
    allAudioTracks = [[NSMutableArray alloc] init];
    allVideoRanges = [[NSMutableArray alloc] init];
    allAudioRanges = [[NSMutableArray alloc] init];
    exportUrl = [NSURL fileURLWithPath:[MobveoAnimation getMergeDestination]];
    finalComposition = [AVMutableComposition composition];
    videoTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    audioTracks = [finalComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    assetOptions = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    animationLength = m_animation.videoDuration;

    // Define all of the audio and video tracks that will be used in the composition
    for (NSDictionary * currData in videoData)
    {
        currVideoAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL] options:assetOptions];
        currAudioAsset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_AUDIO_URL] options:assetOptions];
        currVideoTrack = [[currVideoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        NSArray *audioTracks = [currAudioAsset tracksWithMediaType:AVMediaTypeAudio];
        if ( audioTracks != nil && audioTracks.count > 0 )
        {
            currAudioTrack = audioTracks[0];
        }
        else
        {
            currAudioTrack = nil;
        }

        clipLength = animationLength * [(NSNumber*)[currData objectForKey:KEY_STITCH_LENGTH_PERCENTAGE] floatValue];
        clipStart = CMTimeMakeWithSeconds(startTime, currVideoAsset.duration.timescale);
        clipDuration = CMTimeMakeWithSeconds(clipLength, currVideoAsset.duration.timescale);

        NSLog(@"Clip length: %.2f", clipLength);
        NSLog(@"Clip Start: %lld", clipStart.value);
        NSLog(@"Clip duration: %lld", clipDuration.value);

        currRange = CMTimeRangeMake(clipStart, clipDuration);

        [allVideoTracks addObject:currVideoTrack];
        if ( currAudioTrack != nil )
        {
            [allAudioTracks addObject:currAudioTrack];
            [allAudioRanges addObject:[NSValue valueWithCMTimeRange:currRange]];
        }
        [allVideoRanges addObject:[NSValue valueWithCMTimeRange:currRange]];

        startTime += clipLength;
    }

    [videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:nil];
    if ( allAudioTracks.count > 0 )
    {
        [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:nil];
    }
    for ( int i = 0; i < allVideoTracks.count - allAudioTracks.count; ++i )
    {
        CMTimeRange curRange = [allVideoRanges[i] CMTimeRangeValue];
        [audioTracks insertEmptyTimeRange:curRange];
    }

    // Delete any previous exported video files that may already exist
    [[NSFileManager defaultManager] removeItemAtURL:exportUrl error:nil];

    // Begin the composition generation and export process!
    m_activeExport = [[AVAssetExportSession alloc] initWithAsset:finalComposition presetName:AVAssetExportPreset1280x720];
    [m_activeExport setOutputFileType:AVFileTypeQuickTimeMovie];
    [m_activeExport setOutputURL:exportUrl];
    NSLog(@"Exporting async");
    [m_activeExport exportAsynchronouslyWithCompletionHandler:^(void)
    {
        NSLog(@"Export complete");
        // Cancel the update timer
        [m_updateTimer invalidate];
        m_updateTimer = nil;
        // Dismiss the displayed dialog
        [m_displayedDialog hide:TRUE];
        m_displayedDialog = nil;
        // Re-enable touch events
        [[UIApplication sharedApplication] endIgnoringInteractionEvents];
        // Report the success/failure result
        switch (m_activeExport.status)
        {
            case AVAssetExportSessionStatusFailed:
                [self performSelectorOnMainThread:@selector(videoStitchingFailed:) withObject:m_activeExport.error waitUntilDone:FALSE];
                break;
            case AVAssetExportSessionStatusCompleted:
                [self performSelectorOnMainThread:@selector(videoStitchingComplete:) withObject:m_activeExport.outputURL waitUntilDone:FALSE];
                break;
        }
        // Clear our reference to the completed export
        m_activeExport = nil;
    }];
}
EDIT:
Thanks to Josh's comment I noticed there were a few error parameters I wasn't using. In the case where it now fails, I get the wonderfully helpful "Operation could not be completed" error when inserting the time ranges of the video tracks:
NSError *videoError = nil;
[videoTracks insertTimeRanges:allVideoRanges ofTracks:allVideoTracks atTime:kCMTimeZero error:&videoError];
if ( videoError != nil )
{
    NSLog(@"Error adding video track: %@", videoError);
}
Output:
Error adding video track: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17426dd00 {NSUnderlyingError=0x174040cc0 "The operation couldn’t be completed. (OSStatus error -12780.)", NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed}
It is worth noting, however, that nowhere in the entire codebase is urlWithString used instead of fileUrlWithPath, so that is not the problem here.
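On the same theme, the audio insertion a few lines later in the function above also passes error:nil. A small sketch of surfacing that error as well (reusing the audioTracks, allAudioTracks, and allAudioRanges variables from the question's function; nothing new is introduced) might look like this:

// Sketch: capture the error from the audio insertion too instead of passing nil,
// so a failure on the audio side is not silently ignored either.
NSError *audioError = nil;
if ( allAudioTracks.count > 0 )
{
    [audioTracks insertTimeRanges:allAudioRanges ofTracks:allAudioTracks atTime:kCMTimeZero error:&audioError];
}
if ( audioError != nil )
{
    NSLog(@"Error adding audio track: %@", audioError);
}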
Judging from your for in enumeration over the videoData array after you initialize the composition member variables, it looks like you are blocking the calling thread. Although you are allowed to access each AVAssetTrack instance, the values for its keys are not always immediately available and are being loaded synchronously.
Instead, try registering for change notifications using the AVSynchronousKeyValueLoading protocol. Apple's documentation should help you sort out the problem and get you on your way!
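A minimal sketch of that idea, deferring the stitching until each asset's keys have loaded via loadValuesAsynchronouslyForKeys:completionHandler: and statusOfValueForKey:error:. It assumes it runs inside the question's loop (currData, assetOptions, and KEY_STITCH_VIDEO_URL come from there), and buildCompositionWithLoadedAssets: is a hypothetical method that would hold the stitching code once every asset reports its keys as loaded:

// Sketch only: ask each asset to load the "tracks" and "duration" keys up front
// instead of touching them synchronously inside the stitching loop.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[currData objectForKey:KEY_STITCH_VIDEO_URL]
                                        options:assetOptions];
[asset loadValuesAsynchronouslyForKeys:@[@"tracks", @"duration"] completionHandler:^{
    NSError *loadError = nil;
    AVKeyValueStatus status = [asset statusOfValueForKey:@"tracks" error:&loadError];
    if (status == AVKeyValueStatusLoaded)
    {
        // tracksWithMediaType: is now safe to call without blocking.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self buildCompositionWithLoadedAssets:@[asset]]; // hypothetical helper
        });
    }
    else
    {
        NSLog(@"Failed to load asset keys: %@", loadError);
    }
}];

Loading the keys up front also means that a clip whose media cannot be read shows up as a load error here, rather than as an opaque failure later when the time ranges are inserted.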
Here are a few Apple recommendations for AVFoundation that I've rounded up:
Hopefully this does the trick! Good luck, and let me know if you have any further questions/problems.