Video merging in the background on iOS
Task: merge the flyer image into the flyer video.
Scenario:
- Create the flyer [add emoji, images/text, etc.]
- Create the video
Case 1
- Press the back button [the user goes back to the app's list-of-flyers screen]; during this we merge flyerSnapShoot into flyerVideo, and it works perfectly.
- Go to the phone's photo gallery, where we can see the updated video.
Case 2
- Press the iPhone home button; I do the same thing as above, but I get the following error.
FAIL = Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo=0x17266d40 {NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x172b3920 "The operation couldn’t be completed. (OSStatus error -16980.)", NSLocalizedFailureReason=An unknown error occurred (-16980)}
Code:
- (void)modifyVideo:(NSURL *)src destination:(NSURL *)dest crop:(CGRect)crop
              scale:(CGFloat)scale overlay:(UIImage *)image
         completion:(void (^)(NSInteger, NSError *))callback {
    // Get a pointer to the asset
    AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:src options:nil];
    // Make an instance of AVMutableComposition so that we can edit this asset:
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    // Add tracks to this composition
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    // Audio track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // The image video is always 30 seconds, so we use that unless the background video is shorter.
    CMTime inTime = CMTimeMake(MAX_VIDEO_LENGTH * VIDEOFRAME, VIDEOFRAME);
    if (CMTimeCompare(firstAsset.duration, inTime) < 0) {
        inTime = firstAsset.duration;
    }
    // Add to the video track. Default to the identity transform so that
    // `transform` is never read uninitialized when the asset has no video track.
    NSArray *videos = [firstAsset tracksWithMediaType:AVMediaTypeVideo];
    CGAffineTransform transform = CGAffineTransformIdentity;
    if (videos.count > 0) {
        AVAssetTrack *track = videos[0];
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:track atTime:kCMTimeZero error:nil];
        transform = track.preferredTransform;
        videoTrack.preferredTransform = transform;
    }
    // Add the audio track.
    NSArray *audios = [firstAsset tracksWithMediaType:AVMediaTypeAudio];
    if (audios.count > 0) {
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, inTime) ofTrack:audios[0] atTime:kCMTimeZero error:nil];
    }
    NSLog(@"Natural size: %.2f x %.2f", videoTrack.naturalSize.width, videoTrack.naturalSize.height);
    // Set the mix composition size.
    mixComposition.naturalSize = crop.size;
    // Set up the composition parameters.
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, VIDEOFRAME);
    videoComposition.renderSize = crop.size;
    videoComposition.renderScale = 1.0;
    // Pass-through parameters for animation.
    AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, inTime);
    // Layer instructions
    AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    // Set the transform to maintain orientation
    if (scale != 1.0) {
        CGAffineTransform scaleTransform = CGAffineTransformMakeScale(scale, scale);
        CGAffineTransform translateTransform = CGAffineTransformTranslate(CGAffineTransformIdentity,
                                                                          -crop.origin.x,
                                                                          -crop.origin.y);
        transform = CGAffineTransformConcat(transform, scaleTransform);
        transform = CGAffineTransformConcat(transform, translateTransform);
    }
    [passThroughLayer setTransform:transform atTime:kCMTimeZero];
    passThroughInstruction.layerInstructions = @[passThroughLayer];
    videoComposition.instructions = @[passThroughInstruction];
    // If an image is given, then put that in the animation.
    if (image != nil) {
        // Layer that merges the video and image
        CALayer *parentLayer = [CALayer layer];
        parentLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height);
        // Layer that renders the video.
        CALayer *videoLayer = [CALayer layer];
        videoLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height);
        [parentLayer addSublayer:videoLayer];
        // Layer that renders the flyer image.
        CALayer *imageLayer = [CALayer layer];
        imageLayer.frame = CGRectMake(0, 0, crop.size.width, crop.size.height);
        imageLayer.contents = (id)image.CGImage;
        [imageLayer setMasksToBounds:YES];
        [parentLayer addSublayer:imageLayer];
        // Set up the animation tool
        videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    }
    // Now export the movie
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exportSession.videoComposition = videoComposition;
    // Export to the destination URL
    exportSession.outputURL = dest;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        callback(exportSession.status, exportSession.error);
    }];
}
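For context, the goingToBg method referenced below would invoke this helper roughly as follows. This is only a sketch: the constants, file names, and crop rectangle are assumptions for illustration, not the project's actual values.

#define MAX_VIDEO_LENGTH 30  // seconds; assumed from the "always 30 seconds" comment above
#define VIDEOFRAME 30        // frames per second; an assumed value

- (void)goingToBg {
    // Hypothetical source/destination URLs, for illustration only.
    NSURL *docs = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                         inDomains:NSUserDomainMask].firstObject;
    NSURL *src  = [docs URLByAppendingPathComponent:@"flyerVideo.mov"];
    NSURL *dest = [docs URLByAppendingPathComponent:@"flyerMerged.mov"];
    UIImage *flyerSnapShoot = [UIImage imageNamed:@"flyerSnapShoot"];
    [self modifyVideo:src
          destination:dest
                 crop:CGRectMake(0, 0, 640, 640)
                scale:1.0
              overlay:flyerSnapShoot
           completion:^(NSInteger status, NSError *error) {
               if (status == AVAssetExportSessionStatusCompleted) {
                   NSLog(@"Export finished: %@", dest);
               } else {
                   NSLog(@"Export failed: %@", error);
               }
           }];
}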
I call this function from AppDelegate.m:
- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        // Clean up any unfinished task business by marking where you
        // stopped or ending the task outright.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];
    // Start the long-running task and return immediately.
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Do the work associated with the task, preferably in chunks.
        [self goingToBg];
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    });
    NSLog(@"backgroundTimeRemaining: %f", [[UIApplication sharedApplication] backgroundTimeRemaining]);
}
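Worth noting about this pattern: endBackgroundTask: runs as soon as goingToBg returns, but exportAsynchronouslyWithCompletionHandler: returns immediately, so the background task can end while the export is still in flight. A sketch of keeping the task alive until the export's completion handler fires (goingToBgWithCompletion: is a hypothetical variant, not from the original code):

- (void)applicationDidEnterBackground:(UIApplication *)application
{
    bgTask = [application beginBackgroundTaskWithName:@"MyTask" expirationHandler:^{
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];
    // Hypothetical variant of goingToBg that calls back once the async export finishes.
    [self goingToBgWithCompletion:^{
        // End the task only after the export's completion handler has run.
        [application endBackgroundTask:bgTask];
        bgTask = UIBackgroundTaskInvalid;
    }];
}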
I have done a lot of R&D on this issue but have not found a solution.
I want to share a few links, hoping they help the Stack community if anyone hits the same problem [requirement].
Link 1: AVExportSession to run in background
Quote related to the issue [copied from Link 1 above]:
Sadly, since AVAssetExportSession uses the gpu to do some of it's work, it cannot run in the background if you are using an AVVideoComposition.
Link 2: Starting AVAssetExportSession in the Background
Quote related to the issue [copied from Link 2 above]:
You can start AVAssetExportSession in background. The only limitations in AVFoundation to performing work in the background, are using AVVideoCompositions or AVMutableVideoCompositions. AVVideoCompositions are using the GPU, and the GPU cannot be used in the background
URL(s) for background tasks:
Stack question
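Based on the quotes above, one plausible workaround (a sketch under those assumptions, not something from the linked posts) is to defer any export that uses an AVVideoComposition until the app is active again, since the GPU is unavailable in the background:

// Sketch only: `pendingExport` is an assumed copied-block property, not from the original code.
- (void)startOrDeferExport:(void (^)(void))exportBlock {
    if ([UIApplication sharedApplication].applicationState == UIApplicationStateActive) {
        exportBlock();                    // GPU is available while active: run now.
    } else {
        self.pendingExport = exportBlock; // Remember it for the next foreground.
    }
}

- (void)applicationDidBecomeActive:(UIApplication *)application {
    if (self.pendingExport) {
        self.pendingExport();             // Resume the deferred export.
        self.pendingExport = nil;
    }
}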
Late to the party, but if you update the "Background Modes" setting in your project's capabilities to include Audio, it will allow the export.
This is meant for playing music in the background.
It worked for me.
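For reference, the "Background Modes" capability the answer refers to maps to the UIBackgroundModes key in Info.plist; with the audio mode enabled the entry looks like this (the effect on exports is per the answer above, not documented behavior):

<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>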