AVFoundation exporting orientation wrong

I am trying to combine an image and a video. I have them merging and exporting, but the result comes out rotated sideways.

Sorry for the bulk code paste. I have seen answers about applying a transform to compositionVideoTrack.preferredTransform, but that does nothing. Adding it to the AVMutableVideoCompositionInstruction does nothing either.

I feel like this is where things start to go wrong, right here:

// I feel like this loading here is the problem
        let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]

        // because it makes our parentLayer and videoLayer sizes wrong
        let videoSize       = videoTrack.naturalSize

        // this is returning 1920x1080, so it is rotating the video
        print("\(videoSize.width) , \(videoSize.height)")

So by this point our frame sizes are wrong for the rest of the method. Now when we try to create the overlay image layer, its frame is incorrect:

    let aLayer = CALayer()
    aLayer.contents = UIImage(named: "OverlayTestImageOverlay")?.CGImage
    aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
    aLayer.opacity = 1
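The naturalSize above is the pre-rotation size; the rotation lives in the track's preferredTransform. The idea of recovering the display size from the transform's a/b components can be sketched as plain math (a hypothetical helper of my own, written in current Swift, not an AVFoundation API):

```swift
import Foundation

// Sketch (not from the original post): naturalSize ignores rotation
// metadata, so recover the display size from the rotation encoded in
// the preferredTransform's a and b components.
func displaySize(naturalWidth: Double, naturalHeight: Double,
                 a: Double, b: Double) -> (width: Double, height: Double) {
    // the rotation angle is recoverable from the matrix's first column
    let rotation = atan2(b, a)
    let quarterTurns = Int(round(rotation / (Double.pi / 2)))
    // an odd number of quarter turns swaps width and height
    if quarterTurns % 2 != 0 {
        return (naturalHeight, naturalWidth)
    }
    return (naturalWidth, naturalHeight)
}
```

With a 90° preferredTransform (a = 0, b = 1), a 1920x1080 naturalSize comes back as 1080x1920, which is the size the layers and renderSize actually need.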

Here is my full method:

  func combineImageVid() {

        let path = NSBundle.mainBundle().pathForResource("SampleMovie", ofType:"MOV")
        let fileURL = NSURL(fileURLWithPath: path!)

        let videoAsset = AVURLAsset(URL: fileURL)
        let mixComposition = AVMutableComposition()

        let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)

        var clipVideoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)

        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), ofTrack: clipVideoTrack[0], atTime: kCMTimeZero)
        }
        catch _ {
            print("failed to insertTimeRange")
        }


        compositionVideoTrack.preferredTransform = videoAsset.preferredTransform

        // I feel like this loading here is the problem
        let videoTrack = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]

        // because it makes our parentLayer and videoLayer sizes wrong
        let videoSize       = videoTrack.naturalSize

        // this is returning 1920x1080, so it is rotating the video
        print("\(videoSize.width) , \(videoSize.height)")

        let aLayer = CALayer()
        aLayer.contents = UIImage(named: "OverlayTestImageOverlay")?.CGImage
        aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)
        aLayer.opacity = 1


        let parentLayer     = CALayer()
        let videoLayer      = CALayer()

        parentLayer.frame   = CGRectMake(0, 0, videoSize.width, videoSize.height)
        videoLayer.frame    = CGRectMake(0, 0, videoSize.width, videoSize.height)

        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(aLayer)


        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = videoSize
        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, inLayer: parentLayer)

        let instruction = AVMutableVideoCompositionInstruction()

        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)

        let mixVideoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]
        mixVideoTrack.preferredTransform = CGAffineTransformMakeRotation(CGFloat(M_PI * 90.0 / 180))

        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mixVideoTrack)
        instruction.layerInstructions = [layerInstruction]
        videoComp.instructions = [instruction]


        //  create new file to receive data
        let dirPaths = NSSearchPathForDirectoriesInDomains(.DocumentDirectory, .UserDomainMask, true)
        let docsDir: AnyObject = dirPaths[0]
        let movieFilePath = docsDir.stringByAppendingPathComponent("result.mov")
        let movieDestinationUrl = NSURL(fileURLWithPath: movieFilePath)

        do {
            try NSFileManager.defaultManager().removeItemAtPath(movieFilePath)
        }
        catch _ {}


        // use AVAssetExportSession to export video
        let assetExport = AVAssetExportSession(asset: mixComposition, presetName:AVAssetExportPresetHighestQuality)
        assetExport?.videoComposition = videoComp
        assetExport!.outputFileType = AVFileTypeQuickTimeMovie
        assetExport!.outputURL = movieDestinationUrl
        assetExport!.exportAsynchronouslyWithCompletionHandler({
            switch assetExport!.status{
            case  AVAssetExportSessionStatus.Failed:
                print("failed \(assetExport!.error)")
            case AVAssetExportSessionStatus.Cancelled:
                print("cancelled \(assetExport!.error)")
            default:
                print("Movie complete")


                // play video
                NSOperationQueue.mainQueue().addOperationWithBlock({ () -> Void in
                    print(movieDestinationUrl)
                })
            }
        })
    }

This is what my export looks like:


I tried adding these two methods in order to rotate the video:

class func videoCompositionInstructionForTrack(track: AVCompositionTrack, asset: AVAsset) -> AVMutableVideoCompositionLayerInstruction {

    let instruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)

    let assetTrack = asset.tracksWithMediaType(AVMediaTypeVideo)[0]

    let transform = assetTrack.preferredTransform
    let assetInfo = orientationFromTransform(transform)
    var scaleToFitRatio = UIScreen.mainScreen().bounds.width / assetTrack.naturalSize.width

    if assetInfo.isPortrait {

        scaleToFitRatio = UIScreen.mainScreen().bounds.width / assetTrack.naturalSize.height
        let scaleFactor = CGAffineTransformMakeScale(scaleToFitRatio, scaleToFitRatio)
        instruction.setTransform(CGAffineTransformConcat(assetTrack.preferredTransform, scaleFactor),
            atTime: kCMTimeZero)
    } else {

        let scaleFactor = CGAffineTransformMakeScale(scaleToFitRatio, scaleToFitRatio)
        var concat = CGAffineTransformConcat(CGAffineTransformConcat(assetTrack.preferredTransform, scaleFactor), CGAffineTransformMakeTranslation(0, UIScreen.mainScreen().bounds.width / 2))
        if assetInfo.orientation == .Down {
            let fixUpsideDown = CGAffineTransformMakeRotation(CGFloat(M_PI))
            let windowBounds = UIScreen.mainScreen().bounds
            let yFix = assetTrack.naturalSize.height + windowBounds.height
            let centerFix = CGAffineTransformMakeTranslation(assetTrack.naturalSize.width, yFix)
            concat = CGAffineTransformConcat(CGAffineTransformConcat(fixUpsideDown, centerFix), scaleFactor)
        }
        instruction.setTransform(concat, atTime: kCMTimeZero)
    }

    return instruction
}

class func orientationFromTransform(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool) {
    var assetOrientation = UIImageOrientation.Up
    var isPortrait = false
    if transform.a == 0 && transform.b == 1.0 && transform.c == -1.0 && transform.d == 0 {
        assetOrientation = .Right
        isPortrait = true
    } else if transform.a == 0 && transform.b == -1.0 && transform.c == 1.0 && transform.d == 0 {
        assetOrientation = .Left
        isPortrait = true
    } else if transform.a == 1.0 && transform.b == 0 && transform.c == 0 && transform.d == 1.0 {
        assetOrientation = .Up
    } else if transform.a == -1.0 && transform.b == 0 && transform.c == 0 && transform.d == -1.0 {
        assetOrientation = .Down
    }
    return (assetOrientation, isPortrait)
}

I updated my combineImageVid() method by adding this:

let instruction = AVMutableVideoCompositionInstruction()

instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)

let mixVideoTrack = mixComposition.tracksWithMediaType(AVMediaTypeVideo)[0]

//let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: mixVideoTrack)
//layerInstruction.setTransform(videoAsset.preferredTransform, atTime: kCMTimeZero)

let layerInstruction = videoCompositionInstructionForTrack(compositionVideoTrack, asset: videoAsset)

That gives me this output:

So I am getting closer, but I feel that because the track is initially loaded the wrong way, I need to fix the problem there. Also, I don't know why that huge black box is there now. I thought maybe my image layer was taking up the bounds of the loaded video asset here:

aLayer.frame = CGRectMake(0, 0, videoSize.width, videoSize.height)

However, changing that to some small width/height made no difference. I then thought about adding a crop rect to get rid of the black square, but that didn't work either :(
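One plausible cause of the black region (an assumption on my part, not confirmed in the thread): the renderSize is still the landscape 1920x1080 while the rotated video only occupies a 1080x1920 portrait rect, so only part of the render canvas is covered and the remainder renders black. A toy calculation of the covered area:

```swift
import Foundation

// Toy illustration (not from the thread): how much of a render canvas
// a video rect of a different aspect can cover when both are anchored
// at the origin. The uncovered remainder would render as black.
func coveredFraction(renderWidth: Double, renderHeight: Double,
                     videoWidth: Double, videoHeight: Double) -> Double {
    let overlapWidth = min(renderWidth, videoWidth)
    let overlapHeight = min(renderHeight, videoHeight)
    return (overlapWidth * overlapHeight) / (renderWidth * renderHeight)
}
```

For a 1920x1080 canvas and a 1080x1920 video, only a 1080x1080 square overlaps, i.e. just over half the canvas, which matches a large black band in the output.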


Following Allen's suggestion of not using these two methods:

class func videoCompositionInstructionForTrack(track: AVCompositionTrack, asset: AVAsset) -> AVMutableVideoCompositionLayerInstruction

class func orientationFromTransform(transform: CGAffineTransform) -> (orientation: UIImageOrientation, isPortrait: Bool) 

but updating my original method to look like this:

videoLayer.frame    = CGRectMake(0, 0, videoSize.height, videoSize.width) // notice the switched width and height
...
videoComp.renderSize = CGSizeMake(videoSize.height, videoSize.width) // this makes the final video portrait
...
layerInstruction.setTransform(videoTrack.preferredTransform, atTime: kCMTimeZero) // important: this lets the composition know you want to rotate the original video in the output

We are getting really close, but now the problem seems to be with setting renderSize. If I change it to anything other than the landscape size, I get this:

Here is Apple's Technical Q&A on the subject:

https://developer.apple.com/library/ios/qa/qa1744/_index.html

If your original video was shot in portrait mode on iOS, its natural size will still be landscape, but it ships with rotation metadata in the mov file. To rotate your video, you need to make the following changes to your first snippet of code:

videoLayer.frame    = CGRectMake(0, 0, videoSize.height, videoSize.width) // notice the switched width and height
...
videoComp.renderSize = CGSizeMake(videoSize.height, videoSize.width) // this makes the final video portrait
...
layerInstruction.setTransform(videoTrack.preferredTransform, atTime: kCMTimeZero) // important: this lets the composition know you want to rotate the original video in the output

Yeah, you are really close!

You should probably check the videoTrack's preferredTransform so you can give it an accurate renderSize and transform:

CGAffineTransform transform = assetVideoTrack.preferredTransform;
CGFloat rotation = [self rotationWithTransform:transform];

// if the track has been rotated
if (rotation != 0)
{
    // and the rotation is not a full 360°
    if (fabs(rotation - M_PI * 2) >= valueOfError) {

        CGFloat m = rotation / M_PI;
        CGAffineTransform t1;
        // rotation is 90° or 270°
        if (fabs(m - 1/2.0) < valueOfError || fabs(m - 3/2.0) < valueOfError) {
            self.mutableVideoComposition.renderSize = CGSizeMake(assetVideoTrack.naturalSize.height, assetVideoTrack.naturalSize.width);
            t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.height, 0);
        }
        // rotation is 180°
        if (fabs(m - 1.0) < valueOfError) {
            t1 = CGAffineTransformMakeTranslation(assetVideoTrack.naturalSize.width, assetVideoTrack.naturalSize.height);
        }
        CGAffineTransform t2 = CGAffineTransformRotate(t1, rotation);
        [passThroughLayer setTransform:t2 atTime:kCMTimeZero];
    }
}

// extract the rotation angle (in radians) from the transform
- (CGFloat)rotationWithTransform:(CGAffineTransform)t
{
    return atan2f(t.b, t.a);
}
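For readers working in Swift, the core of the rotation check above can be sketched as pure math (my own adaptation in current Swift, not the answerer's code; valueOfError becomes a small tolerance, and the function names are made up):

```swift
import Foundation

// Swift sketch of the Objective-C rotation check above (an adaptation,
// not the answerer's code). Given the transform's a/b components and
// the track's natural size, returns the render size plus the
// translation to prepend before applying the rotation.
func renderParameters(a: Double, b: Double,
                      naturalWidth: Double, naturalHeight: Double)
    -> (renderSize: (Double, Double), translation: (Double, Double)) {
    let tolerance = 0.01               // plays the role of valueOfError
    let rotation = atan2(b, a)         // same math as rotationWithTransform:
    let m = abs(rotation) / Double.pi
    if abs(m - 0.5) < tolerance || abs(m - 1.5) < tolerance {
        // 90° or 270°: portrait render size, shift right by the new width
        return ((naturalHeight, naturalWidth), (naturalHeight, 0))
    }
    if abs(m - 1.0) < tolerance {
        // 180°: size unchanged, shift by both dimensions
        return ((naturalWidth, naturalHeight), (naturalWidth, naturalHeight))
    }
    // no rotation: identity translation
    return ((naturalWidth, naturalHeight), (0, 0))
}
```

Note that the Objective-C version leaves t1 uninitialized for angles outside the handled cases; this sketch just falls through to an identity translation instead.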