AVFoundation concatenating videos

I'm tearing my hair out over this.

I'm trying to concatenate videos, but the result isn't what I want. Specifically, the videos have different orientations, and I'm trying to use layer instructions to straighten them out. Alas, no matter what I try, nothing works...

I've read every tutorial and tried Apple's APLCompositionDebugView sample (where things basically look fine), but to no avail... I'm about ready to give up on the whole thing...

Here is my code:

self.videoComposition = [AVMutableVideoComposition videoComposition];
self.videoComposition.renderSize = CGSizeMake(480, 320);
self.videoComposition.frameDuration = CMTimeMake(1, 30);

NSMutableArray *videoCompositionInstructions = [[NSMutableArray alloc] init];

AVMutableComposition *theMutableComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionVideoTrack = [theMutableComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioTrack = [theMutableComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableVideoCompositionLayerInstruction *passThroughLayer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

CMTime titleDuration = CMTimeMakeWithSeconds(5, 600);

CMTimeRange titleRange = CMTimeRangeMake(kCMTimeZero, titleDuration);

[compositionVideoTrack insertEmptyTimeRange:titleRange];
[compositionAudioTrack insertEmptyTimeRange:titleRange];

CMTime insertPoint = [[transitionTimes lastObject] CMTimeValue];

CMTime totalTime = CMTimeMakeWithSeconds(5, 600);

for(NSDictionary *clip in collection[@"clips"]){

    NSString *movieName = [NSString stringWithFormat:@"collection_%li/recording_%li_%li.MOV", (long)editCollection, (long)editCollection, (long)[clip[@"clipID"] longValue]];

    NSURL *assetUrl = [NSURL fileURLWithPath:[usefulStuff pathForFile: movieName]];

    AVURLAsset *videoAsset = [AVURLAsset URLAssetWithURL:assetUrl options:nil];

    totalTime = CMTimeAdd(totalTime, videoAsset.duration);

    AVAssetTrack *clipVideoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    AVAssetTrack *clipAudioTrack = [[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    NSError *error;

    [compositionVideoTrack insertTimeRange:clipVideoTrack.timeRange ofTrack:clipVideoTrack atTime:insertPoint error:&error];    // Add video
    [compositionAudioTrack insertTimeRange:clipAudioTrack.timeRange ofTrack:clipAudioTrack atTime:insertPoint error:&error];    // Add audio (use the audio track's own time range)

    [passThroughLayer setTransform:clipVideoTrack.preferredTransform atTime:insertPoint]; // This should supposedly set the right orientation for the video starting at this time...

    insertPoint = CMTimeAdd(insertPoint, videoAsset.duration);

}

AVMutableVideoCompositionInstruction *passThroughInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
passThroughInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime);

NSLog(@"Total time b %f", CMTimeGetSeconds(totalTime));

passThroughInstruction.layerInstructions = [NSArray arrayWithObject:passThroughLayer];
[videoCompositionInstructions addObject:passThroughInstruction];

self.videoComposition.instructions = videoCompositionInstructions;

Heyyy! :)

I don't see anything in the code you posted that handles the orientation problem. You need to apply transforms in the layerInstruction to bring everything into the same orientation.

Try something like this:

AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

videoComposition.frameDuration = CMTimeMake(1,30);

videoComposition.renderScale = 1.0;

AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionVideoTrack];

// Get only paths the user selected
NSMutableArray *array = [NSMutableArray array];
for(NSString* string in videoPathArray){
    if(![string isEqualToString:@""]){
        [array addObject:string];
    }
}
self.videoPathArray = array;
float time = 0;

for (int i = 0; i<self.videoPathArray.count; i++) {

    AVURLAsset *sourceAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:[videoPathArray objectAtIndex:i]] options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];

    NSError *error = nil;

    BOOL ok = NO;
    AVAssetTrack *sourceVideoTrack = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    CGSize temp = CGSizeApplyAffineTransform(sourceVideoTrack.naturalSize, sourceVideoTrack.preferredTransform);
    CGSize size = CGSizeMake(fabs(temp.width), fabs(temp.height));
    CGAffineTransform transform = sourceVideoTrack.preferredTransform;

    videoComposition.renderSize = sourceVideoTrack.naturalSize;
    if (size.width > size.height) {

        [layerInstruction setTransform:transform atTime:CMTimeMakeWithSeconds(time, 30)];
    } else {
        float s = size.width/size.height;

        CGAffineTransform scaled = CGAffineTransformConcat(transform, CGAffineTransformMakeScale(s, s));

        float x = (size.height - size.width*s)/2;

        CGAffineTransform centered = CGAffineTransformConcat(scaled, CGAffineTransformMakeTranslation(x, 0));

        [layerInstruction setTransform:centered atTime:CMTimeMakeWithSeconds(time, 30)];
    }
    ok = [compositionVideoTrack insertTimeRange:sourceVideoTrack.timeRange ofTrack:sourceVideoTrack atTime:[composition duration] error:&error];

    if (!ok) {
            // Deal with the error.
        NSLog(@"something went wrong");
    }

    NSLog(@"\n source asset duration is %f \n source vid track timerange is %f %f \n composition duration is %f \n composition vid track time range is %f %f",CMTimeGetSeconds([sourceAsset duration]), CMTimeGetSeconds(sourceVideoTrack.timeRange.start),CMTimeGetSeconds(sourceVideoTrack.timeRange.duration),CMTimeGetSeconds([composition duration]), CMTimeGetSeconds(compositionVideoTrack.timeRange.start),CMTimeGetSeconds(compositionVideoTrack.timeRange.duration));

    time += CMTimeGetSeconds(sourceVideoTrack.timeRange.duration);

}

instruction.layerInstructions = [NSArray arrayWithObject:layerInstruction];
instruction.timeRange = compositionVideoTrack.timeRange;


videoComposition.instructions = [NSArray arrayWithObject:instruction];

I took the code from here, since it works for me every time. All you have to do is adjust the resulting transforms to your requirements.
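Neither snippet above shows the export step, which is where the videoComposition actually takes effect. As a sketch (the `composition` and `videoComposition` variables are the ones built in the code above; the output URL and preset are assumptions you'd adapt to your app):

```objc
// Sketch: export the concatenated composition, applying the video composition
// (which carries the orientation transforms) during export.
NSURL *outputURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"merged.mov"]];

AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:composition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = outputURL;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.videoComposition = videoComposition; // without this, the layer instructions are ignored

[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export finished: %@", outputURL);
    } else {
        NSLog(@"Export failed: %@", exporter.error);
    }
}];
```

Forgetting to assign `videoComposition` to the export session is a common reason the transforms appear to "do nothing".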