AVFoundation - combining videos: only the first one is displayed

I am trying a different approach to merging videos: I am creating a new track for each transformation.

The problem with this code is that the first video is displayed and all the others are black.

The audio overlay is correct for the entire clip. It looks like the videos are not being brought into the composition, because the file size is 5 MB when it should be around 25 MB. The 5 MB size corresponds to the first clip plus the audio track. All of the AVAssets seem to be valid, and the files do exist on the file system. Here is the code:

- (void)mergeVideos:(NSMutableArray *)assets withCompletion:(void (^)(NSString *))completion {


    CGSize size = CGSizeZero;

    int tracknumber = 1;
    int32_t commontimescale = 600;
    CMTime time = kCMTimeZero;

    AVMutableComposition *mutableComposition = [AVMutableComposition composition];
    NSMutableArray *instructions = [[NSMutableArray alloc] init];

    for (NSURL *assetUrl in assets) {

        AVAsset *asset = [AVAsset assetWithURL:assetUrl];

        NSLog(@"Number of tracks: %lu  Incremental track number %i", (unsigned long)[[asset tracks] count], tracknumber);

        // make sure the timescales are correct for these tracks
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale, kCMTimeRoundingMethod_QuickTime);

        AVMutableCompositionTrack *videoCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                           preferredTrackID:kCMPersistentTrackID_Invalid];

        AVAssetTrack *assetTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;

        NSLog(@"Running time: value = %lld  timescale = %d", time.value, time.timescale);
        NSLog(@"Asset length: value = %lld  timescale = %d", asset.duration.value, asset.duration.timescale);
        NSLog(@"Converted Scale: value = %lld  timescale = %d", cliptime.value, cliptime.timescale);

        NSError *error;

        [videoCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, time)];
        [videoCompositionTrack insertTimeRange:CMTimeRangeMake(time, cliptime)
                                       ofTrack:assetTrack
                                        atTime:time
                                         error:&error];
        if (error) {
            NSLog(@"Error - %@", error.debugDescription);
        }

        // this flips the video temporarily for the front facing camera
        AVMutableVideoCompositionLayerInstruction *inst = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompositionTrack];

        // set the flipping transform on the correct tracks
        if ((tracknumber == 2) || (tracknumber == 4) || (tracknumber == 6) || (tracknumber == 8) || (tracknumber == 10)) {
            CGAffineTransform transform = CGAffineTransformMakeRotation(M_PI);
            [inst setTransform:transform atTime:time];
        } else {
            CGAffineTransform transform = assetTrack.preferredTransform;
            [inst setTransform:transform atTime:time];
        }

        // don't block the other videos with your black - needs to be the incremental time
        [inst setOpacity:0.0 atTime:time];

        // add the instructions to the overall array
        [instructions addObject:inst];

        // increment the total time after we use it for this iteration
        time = CMTimeAdd(time, cliptime);

        if (CGSizeEqualToSize(size, CGSizeZero)) {
            size = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject.naturalSize;
        }

        // increment the track counter
        tracknumber++;
    }

    AVMutableVideoCompositionInstruction *mainVideoCompositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, time);

    mainVideoCompositionInstruction.layerInstructions = instructions;

    // bring all of the video together in the main composition
    AVMutableVideoComposition *mainVideoComposition = [AVMutableVideoComposition videoComposition];
    mainVideoComposition.instructions = [NSArray arrayWithObject:mainVideoCompositionInstruction];

    // setup the audio
    AVMutableCompositionTrack *audioCompositionTrack = [mutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                       preferredTrackID:kCMPersistentTrackID_Invalid];


    // Grab the path, make sure to add it to your project!
    NSURL *soundURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"bink-bink-lexus-3" ofType:@"aif"]];
    AVURLAsset *soundAsset = [AVURLAsset assetWithURL:soundURL];

    NSError *error;

    // add audio to the entire track
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, mutableComposition.duration)
                                   ofTrack:[soundAsset tracksWithMediaType:AVMediaTypeAudio][0]
                                    atTime:kCMTimeZero
                                     error:&error];

    // Set the frame duration to an appropriate value (i.e. 30 frames per second for video).
    //    mainVideoComposition.frameDuration = CMTimeMake(1, 30);
    mainVideoComposition.renderSize = size;

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths firstObject];
    int number = arc4random_uniform(10000);
    self.outputFile = [documentsDirectory stringByAppendingFormat:@"/export_%i.mov",number];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mutableComposition
                                                                      presetName:AVAssetExportPreset1280x720];

    exporter.outputURL = [NSURL fileURLWithPath:self.outputFile];
    //Set the output file type
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;


    dispatch_group_t group = dispatch_group_create();


    dispatch_group_enter(group);

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_group_leave(group);

    }];

    dispatch_group_notify(group, dispatch_get_main_queue(), ^{

        NSLog(@"Export File (Final) - %@", self.outputFile);
        completion(self.outputFile);

    });

}

Your problem is that by using multiple AVMutableCompositionTracks and inserting a time range at a time after kCMTimeZero, you are causing the media of every subsequent track to appear in the composition at kCMTimeZero. You need to use insertEmptyTimeRange: if you want to pursue this route. It moves the media of that particular track forward in time by the duration of the empty range you insert.
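
For illustration, a minimal sketch of that per-track pattern, reusing the question's variable names (mutableComposition, time, cliptime, assetTrack). The guard just skips the zero-length empty range for the first clip. Note also that the first argument to insertTimeRange:ofTrack:atTime:error: is the range within the *source* asset track, so it should start at kCMTimeZero, not at the running composition time:

    // Sketch: one track per clip, padded with an empty range up to the running time.
    AVMutableCompositionTrack *videoCompositionTrack =
        [mutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                        preferredTrackID:kCMPersistentTrackID_Invalid];

    // Push this track's media forward by the total duration of the clips before it.
    if (CMTIME_COMPARE_INLINE(time, >, kCMTimeZero)) {
        [videoCompositionTrack insertEmptyTimeRange:CMTimeRangeMake(kCMTimeZero, time)];
    }

    NSError *error = nil;
    // The source range starts at kCMTimeZero in the asset, not at the running time.
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                                   ofTrack:assetTrack
                                    atTime:time
                                     error:&error];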

Or, the far simpler way is to use a single AVMutableCompositionTrack.

See this post:

That post shows how to use a single track instead of multiple tracks.
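
For anyone who can't reach that post, here is a minimal sketch of the single-track idea under the same assumptions as the question (an assets array of NSURLs and the commontimescale of 600). Clips are simply appended end-to-end on one track, so no empty ranges or opacity juggling are needed:

    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (NSURL *assetUrl in assets) {
        AVAsset *asset = [AVAsset assetWithURL:assetUrl];
        AVAssetTrack *clipTrack = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        CMTime cliptime = CMTimeConvertScale(asset.duration, commontimescale,
                                             kCMTimeRoundingMethod_QuickTime);

        NSError *error = nil;
        // Append each clip at the current end of the single track.
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, cliptime)
                            ofTrack:clipTrack
                             atTime:cursor
                              error:&error];
        cursor = CMTimeAdd(cursor, cliptime);
    }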

I'm adding this answer for any wanderer like me who couldn't figure it out.

You need to move this line of code

    time = CMTimeAdd(time, cliptime);

so that it comes before this:

// don't block the other videos with your black - needs to be the incremental time
    [inst setOpacity:0.0 atTime:time];
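
With that line moved, the tail of the loop body reads as follows; each clip's layer now goes transparent at the clip's end time instead of its start time, so it no longer blacks itself out for its whole duration:

    // advance the running time to the end of this clip first...
    time = CMTimeAdd(time, cliptime);

    // ...so the layer turns transparent when the clip ends, not when it starts
    [inst setOpacity:0.0 atTime:time];

    // add the instructions to the overall array
    [instructions addObject:inst];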