Trouble applying scaleTimeRange on multiple videos in an AVMutableComposition video

I'm trying to merge videos together with scaleTimeRanges (to make them slow motion or sped up); however, it isn't working as intended. Only the first video gets the time-range effect... not all of them.

The work is done in the merge-videos function; it's fairly simple... but I'm not sure why the time-range scaling applies to the first video and not to the ones after it...

Here is a test project with my current code: https://github.com/meyesyesme/creationMergeProj

Here is the merge function I use, with the time-range scaling currently commented out (you can uncomment it to see that it doesn't work):

func mergeVideosTestSQ(arrayVideos:[VideoSegment], completion:@escaping (URL?, Error?) -> ()) {


let mixComposition = AVMutableComposition()


var instructions: [AVMutableVideoCompositionLayerInstruction] = []
var insertTime = CMTime(seconds: 0, preferredTimescale: 1)

print(arrayVideos, "<- arrayVideos")
/// for each URL add the video and audio tracks and their duration to the composition
for videoSegment in arrayVideos {
    
    let sourceAsset = AVAsset(url: videoSegment.videoURL!)
    
    let frameRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: sourceAsset.duration)
    
    guard
        let nthVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)),
        let nthAudioTrack = mixComposition.addMutableTrack(withMediaType: .audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)), //0 used to be kCMPersistentTrackID_Invalid
        let assetVideoTrack = sourceAsset.tracks(withMediaType: .video).first
    else {
        print("didnt work")
        return
    }
    
    var assetAudioTrack: AVAssetTrack?
    assetAudioTrack = sourceAsset.tracks(withMediaType: .audio).first
    print(assetAudioTrack, ",-- assetAudioTrack???", assetAudioTrack?.asset, "<-- hes", sourceAsset)
    
    do {
        
        try nthVideoTrack.insertTimeRange(frameRange, of: assetVideoTrack, at: insertTime)
        try nthAudioTrack.insertTimeRange(frameRange, of: assetAudioTrack!, at: insertTime)
        
        //MY CURRENT SPEED ATTEMPT:
        let newDuration = CMTimeMultiplyByFloat64(frameRange.duration, multiplier: videoSegment.videoSpeed)
        nthVideoTrack.scaleTimeRange(frameRange, toDuration: newDuration)
        nthAudioTrack.scaleTimeRange(frameRange, toDuration: newDuration)
        
        print(insertTime.value, "<-- fiji, newdur --->", newDuration.value, "sourceasset duration--->", sourceAsset.duration.value, "frameRange.duration -->", frameRange.duration.value)
        
        //instructions:
        let nthInstruction = ViewController.videoCompositionInstruction(nthVideoTrack, asset: sourceAsset)
        nthInstruction.setOpacity(0.0, at: CMTimeAdd(insertTime, newDuration)) //sourceasset.duration
        
        instructions.append(nthInstruction)
        insertTime = insertTime + newDuration //sourceAsset.duration
        
        
    } catch {
        DispatchQueue.main.async {
            print("didnt wor2k")
        }
    }
    
}


let mainInstruction = AVMutableVideoCompositionInstruction()
mainInstruction.timeRange = CMTimeRange(start: CMTime(seconds: 0, preferredTimescale: 1), duration: insertTime)

mainInstruction.layerInstructions = instructions

let mainComposition = AVMutableVideoComposition()
mainComposition.instructions = [mainInstruction]
mainComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
mainComposition.renderSize = CGSize(width: 1080, height: 1920)

let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")

//below to clear the previous video from the temp folder before exporting the new one...
let fileManager = FileManager()
try? fileManager.removeItem(at: outputFileURL)

print("<now will export:  ")


/// try to start an export session and set the path and file type
if let exportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality) { //DOES NOT WORK WITH AVAssetExportPresetPassthrough
    exportSession.outputFileType = .mov
    exportSession.outputURL = outputFileURL
    exportSession.videoComposition = mainComposition
    exportSession.shouldOptimizeForNetworkUse = true
    
    /// try to export the file and handle the status cases
    exportSession.exportAsynchronously {
        if let url = exportSession.outputURL{
            completion(url, nil)
        }
        if let error = exportSession.error {
            completion(nil, error)
        }
    }
    
}

}

You'll see this behavior: the first video works fine, but the following ones don't, and there are also problems with setting the opacity, etc... I've tried different combinations, and this is the closest I've gotten.

I've been stuck on this for a while now!

  1. After you scale a video, the duration of the composition is recalculated, so you need to append the next part according to that change. Replace

     insertTime = insertTime + duration

     with

     insertTime = insertTime + newDuration

  2. You also need to update the setOpacity at: value; I'd suggest moving that line after the insertTime update and using the updated value there, to avoid duplicating the work (see the sketch after this list).

  3. When you apply the scaling, it applies to the new composition, so you need to use a range relative to the composition:

     let currentRange = CMTimeRange(start: insertTime, duration: frameRange.duration)
     nthVideoTrack.scaleTimeRange(currentRange, toDuration: newDuration)
     nthAudioTrack.scaleTimeRange(currentRange, toDuration: newDuration)
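
Putting the three points together, the inside of your do block would look roughly like this (an untested sketch using the same variable names as your code):

    try nthVideoTrack.insertTimeRange(frameRange, of: assetVideoTrack, at: insertTime)
    try nthAudioTrack.insertTimeRange(frameRange, of: assetAudioTrack!, at: insertTime)

    // scale the range you just inserted, expressed in composition time (point 3)
    let newDuration = CMTimeMultiplyByFloat64(frameRange.duration, multiplier: videoSegment.videoSpeed)
    let currentRange = CMTimeRange(start: insertTime, duration: frameRange.duration)
    nthVideoTrack.scaleTimeRange(currentRange, toDuration: newDuration)
    nthAudioTrack.scaleTimeRange(currentRange, toDuration: newDuration)

    let nthInstruction = ViewController.videoCompositionInstruction(nthVideoTrack, asset: sourceAsset)
    instructions.append(nthInstruction)

    // advance by the scaled duration (point 1), then hide this segment at its new end time (point 2)
    insertTime = insertTime + newDuration
    nthInstruction.setOpacity(0.0, at: insertTime)

That way every segment after the first is scaled over the range it actually occupies in the composition, and the opacity switch lands at the scaled end of each segment instead of the unscaled one.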