How to create an anAudioSampleBuffer for CMSampleBufferGetFormatDescription in iOS Swift

I have been working on video compression in iOS Swift, following this SO answer. It works fine until I change the file format in this line of code to .mp4:

    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov)

I need the output in the .mp4 file format for a reason. When I make that change, the app crashes and gives me this error:

2020-04-27 18:20:52.573614+0500 BrightCaster[7847:1513728] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriter addInput:] In order to perform passthrough to file type public.mpeg-4, please provide a format hint in the AVAssetWriterInput initializer'
*** First throw call stack:
(0x1b331d5f0 0x1b303fbcc 0x1bd53b2b0 0x102383c0c 0x102382164 0x1021897cc 0x1b6ca73bc 0x1b6caba7c 0x1b6daec94 0x1b7835080 0x1b7834d30 0x1e9d077b4 0x1b786a764 0x1b783eb68 0x1b783f070 0x1e9d468f4 0x1b783f1c0 0x1e9d468f4 0x1b9e21d9c 0x105173730 0x105181710 0x1b329b748 0x1b329661c 0x1b3295c34 0x1bd3df38c 0x1b73c822c 0x10230f8a0 0x1b311d800)
libc++abi.dylib: terminating with uncaught exception of type NSException

So I searched SO and found this question related to my problem. But now the problem is that when I try to add its answer to my function, it gives me the error that anAudioSampleBuffer is not defined. Since I'm new to the audio/video domain, I can't understand why I get this error or how to solve it. Below is the snippet from that answer that I added to my function.

    //setup audio writer
    //let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer)
    //let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)

The commented-out part doesn't work. Any help would be appreciated, thanks.

The whole conversion function is below:

func convertVideoToLowQuailtyWithInputURL(inputURL: URL, outputURL: URL, completion: @escaping (Bool , _ url: String) -> Void) {

    let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
    let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
        AVVideoWidthKey : Int(videoSize.width) as AnyObject,
        AVVideoHeightKey : Int(videoSize.height) as AnyObject
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov) // for now it's converting to .mov, I think; changing this to .mp4 causes the crash above
    videoWriter.add(videoWriterInput)



    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader: AVAssetReader!

    do {
        videoReader = try AVAssetReader(asset: videoAsset)
    }
    catch {
        print("video reader error: \(error)")
        completion(false, "")
        return // without returning here, videoReader stays nil and the add(_:) call below crashes
    }
    videoReader.add(videoReaderOutput)


    //setup audio writer
    //let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer) // this line gives me the error that anAudioSampleBuffer is not defined, which I don't understand
    //let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)
    videoWriter.startWriting()



    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSession(atSourceTime: CMTime.zero)
    let processingQueue = DispatchQueue(label: "processingQueue1")
    videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
        while videoWriterInput.isReadyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .reading && sampleBuffer != nil {
                videoWriterInput.append(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSession(atSourceTime: CMTime.zero)
                    let processingQueue = DispatchQueue(label: "processingQueue2")
                    audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                        while audioWriterInput.isReadyForMoreMediaData {
                            let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .reading && sampleBuffer != nil {
                                audioWriterInput.append(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .completed {
                                    videoWriter.finishWriting(completionHandler: {() -> Void in
                                        completion(true, "\(videoWriter.outputURL)")
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}

You can output to mp4 and pass the audio through (no transcoding) by providing a format hint:

let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: audioTrack.formatDescriptions[0] as! CMFormatDescription)

Note the new location of the audioTrack definition.
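
In case it helps, here is a sketch of how that change slots into your function (only the relevant lines; everything else stays as you had it):

    // Write to .mp4 instead of .mov.
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mp4)

    // Define the audio track before creating the audio writer input,
    // so its format description can be passed as the source format hint.
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
                                              outputSettings: nil,
                                              sourceFormatHint: audioTrack.formatDescriptions[0] as! CMFormatDescription)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)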

I imagine both Apple's .mov and .mp4 file implementations need to know the compressed audio format being written to the file, but I guess .mov can infer that information after initialization, whereas .mp4 cannot. Probably another AVFoundation Surprise!

In your case I saw that it would be annoying to rework the code to get the audio format from the first sample buffer, but then I remembered that the format is available from the input audio track.
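
For completeness, the sample-buffer route from the linked answer would look roughly like this (a sketch, assuming videoAsset and audioTrack as in your function); the awkward part is that the buffer you read to learn the format still has to end up in the output:

    // Sketch: derive the format hint from the first audio sample buffer instead of the track.
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)
    audioReader.startReading()

    if let anAudioSampleBuffer = audioReaderOutput.copyNextSampleBuffer(),
       let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer) {
        let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio,
                                                  outputSettings: nil,
                                                  sourceFormatHint: formatDesc)
        // Note: this first buffer has already been consumed from the reader, so it
        // would also need to be appended (or the reader restarted) before copying
        // the rest of the audio, which is why the track-level hint above is simpler.
    }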