Build AVMutableComposition from AVURLAssets in loop

I'm working on an app that needs to concatenate a set of videos recorded with the camera. Ultimately I'll have an array of URLs to work with, but I can't figure out how to correctly concatenate two movie assets. Here's some standalone code:

- (void)buildComposition {
    NSString *path1 = [[NSBundle mainBundle] pathForResource:@"IMG_1049" ofType:@"MOV"];
    NSString *path2 = [[NSBundle mainBundle] pathForResource:@"IMG_1431" ofType:@"MOV"];
    NSURL *url1 = [NSURL fileURLWithPath:path1];
    NSURL *url2 = [NSURL fileURLWithPath:path2];
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableVideoCompositionInstruction *compositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    NSMutableArray *layerInstructions = [NSMutableArray array];
    CGSize renderSize = CGSizeZero;
    NSUInteger count = 0;
    for (NSURL *url in @[url1, url2]) {
        NSDictionary *options = @{ AVURLAssetPreferPreciseDurationAndTimingKey: @(YES) };
        AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:url options:options];
        CMTimeRange editRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(1.0, 600));
        NSError *error = nil;
        CMTime insertionTime = composition.duration;
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = videoTracks.firstObject;
        AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
        [videoCompositionTrack insertTimeRange:editRange ofTrack:videoTrack atTime:insertionTime error:&error];
        if (count == 0) {
            AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            CGAffineTransform scale = CGAffineTransformMakeScale(0.6, 0.6);
            [layerInstruction setTransform:CGAffineTransformConcat(videoTrack.preferredTransform, scale) atTime:kCMTimeZero];
            [layerInstructions addObject:layerInstruction];
        }
        else {
            AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
            CGAffineTransform scale = CGAffineTransformMakeScale(0.9, 0.9);
            [layerInstruction setTransform:CGAffineTransformConcat(videoTrack.preferredTransform, scale) atTime:kCMTimeZero];
            [layerInstructions addObject:layerInstruction];
        }
        // set the render size
        // Note: CGRectMakeWithCGSize and CGSizeUnion are not standard Core Graphics
        // functions, so inline equivalents are used here.
        CGRect transformed = CGRectApplyAffineTransform((CGRect){ CGPointZero, videoTrack.naturalSize }, videoTrack.preferredTransform);
        renderSize = CGSizeMake(MAX(renderSize.width, transformed.size.width), MAX(renderSize.height, transformed.size.height));

        NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
        AVAssetTrack *audioTrack = audioTracks.firstObject;
        AVMutableCompositionTrack *audioCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioCompositionTrack insertTimeRange:editRange ofTrack:audioTrack atTime:insertionTime error:&error];
        ++count;
    }
    // set the composition instructions
    compositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);
    compositionInstruction.layerInstructions = layerInstructions;
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoCompositionWithPropertiesOfAsset:composition];
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.instructions = @[compositionInstruction];
    videoComposition.renderSize = renderSize;
    // export the composition
    NSTimeInterval time = [NSDate timeIntervalSinceReferenceDate];
    NSString *filename = [[NSString stringWithFormat:@"video-export-%f", time] stringByAppendingPathExtension:@"mov"];
    NSString *pathTo = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/%@", filename]];
    NSURL *fileUrl = [NSURL fileURLWithPath:pathTo];
    AVAssetExportSession *assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    assetExport.videoComposition = videoComposition;
    assetExport.outputFileType = AVFileTypeQuickTimeMovie;
    assetExport.shouldOptimizeForNetworkUse = YES;
    assetExport.outputURL = fileUrl;

    [assetExport exportAsynchronouslyWithCompletionHandler:^{
        switch (assetExport.status) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"\n\nFailed: %@\n\n", assetExport.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"\n\nCancelled: %@\n\n", assetExport.error);
                break;
            default:
                NSLog(@"\n\nExported: %@\n\n", fileUrl);
                break;
        }
    }];
}

I want the first video to play at 60% scale for 1 second, and then the second video to play at 90% scale for 1 second.

What actually happens is that both videos play simultaneously at the start, one at 60% and one at 90%. After 1 second, the video goes black but the audio keeps playing correctly.

Any ideas? Thanks!

Figured it out, for anyone curious: in my layer instructions, I was mistakenly building them from the AVURLAsset's videoTrack instead of the AVMutableComposition's compositionTrack!

This line:

AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

should be:

AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
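
For reference, the corrected loop body looks roughly like this (a sketch assuming the same variables as the code above; the only substantive change is which track is passed to the layer instruction):

    // Insert the source track into a new composition track, as before.
    AVMutableCompositionTrack *videoCompositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoCompositionTrack insertTimeRange:editRange ofTrack:videoTrack atTime:insertionTime error:&error];

    // Build the layer instruction from the COMPOSITION's track,
    // not from the source asset's track.
    CGFloat scaleFactor = (count == 0) ? 0.6 : 0.9;
    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];
    CGAffineTransform scale = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
    [layerInstruction setTransform:CGAffineTransformConcat(videoTrack.preferredTransform, scale) atTime:kCMTimeZero];
    [layerInstructions addObject:layerInstruction];

Each layer instruction now refers to a track that actually exists in the composition, so the video compositor can match rendered frames to the right instruction instead of showing both sources at once and then going black.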