Applying a CIFilter to a Video File and Saving it

Is there any fast, and as-lightweight-as-possible, way to apply a CIFilter to a video? Before it gets mentioned: yes, I have already looked at GPUImage. It looks like very powerful magic code, but it's really overkill for what I'm trying to do.

Basically, I want to

  1. Take a video file, say stored at /tmp/myVideoFile.mp4
  2. Apply a CIFilter to this video file
  3. Save the new video file to a different (or the same) location, say /tmp/anotherVideoFile.mp4

I've already been able to apply a CIFilter to a video as it plays, very easily and quickly, using AVPlayerItemVideoOutput:
let player = AVPlayer(playerItem: AVPlayerItem(asset: video))
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
player.currentItem?.addOutput(output)
player.play()

let displayLink = CADisplayLink(target: self, selector: #selector(self.displayLinkDidRefresh(_:)))
displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSRunLoopCommonModes)

func displayLinkDidRefresh(link: CADisplayLink){
    let itemTime = output.itemTimeForHostTime(CACurrentMediaTime())
    if output.hasNewPixelBufferForItemTime(itemTime){
        if let pixelBuffer = output.copyPixelBufferForItemTime(itemTime, itemTimeForDisplay: nil){
            let image = CIImage(CVPixelBuffer: pixelBuffer)
            // apply filters to image
            // display image
        }
    }
}
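
To fill in those two comments, here's a minimal sketch of the filter-and-display step, in the same Swift 2 style. CISepiaTone, its intensity, and the context and imageView properties are placeholder assumptions, not part of the original snippet:

// Placeholder filter choice; context and imageView are assumed properties
let filtered = image.imageByApplyingFilter("CISepiaTone",
    withInputParameters: [kCIInputIntensityKey: 0.8])

// Reuse one CIContext stored as a property (creating one per frame is expensive)
let cgImage = context.createCGImage(filtered, fromRect: filtered.extent)
imageView.image = UIImage(CGImage: cgImage)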

This works great, but I've been having a ton of trouble finding even the tiniest bit of detail on how to apply a filter to a video file that's already saved. There is the option of basically just doing what I did above, using an AVPlayer, playing the video, and grabbing the pixel buffer from every frame as it plays, but that won't work for background video processing. I don't think users would appreciate having to wait as long as their video runs just to have a filter applied to it.

In way-oversimplified code, I'm looking for something like this:

var newVideo = AVMutableAsset() // We'll just pretend like this is a thing

var originalVideo = AVAsset(url: NSURL(string: "/example/location.mp4")!)
originalVideo.getAllFrames(){(pixelBuffer: CVPixelBuffer) -> Void in // also pretend
    let image = CIImage(CVPixelBuffer: pixelBuffer)
        .imageByApplyingFilter("Filter", withInputParameters: [:])

    newVideo.addFrame(image) // still pretending
}

newVideo.exportTo(url: NSURL(string: "/this/isAnother/example.mp4")!)

Is there any way to quickly (again, not involving GPUImage, and ideally working on iOS 7) apply a filter to a video file and then save it? Something that would take a saved video, load it into an AVAsset, apply a CIFilter, and then save the new video to a different location.

In iOS 9 / OS X 10.11 / tvOS, there's a convenient method for applying CIFilters to video. It works on an AVVideoComposition, so you can use it both for playback and for file-to-file import/export. See AVVideoComposition.init(asset:applyingCIFiltersWithHandler:) for the method docs.

There's also an example in Apple's Core Image Programming Guide:

let filter = CIFilter(name: "CIGaussianBlur")!
let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in

    // Clamp to avoid blurring transparent pixels at the image edges
    let source = request.sourceImage.clampingToExtent()
    filter.setValue(source, forKey: kCIInputImageKey)

    // Vary filter parameters based on video timing
    let seconds = CMTimeGetSeconds(request.compositionTime)
    filter.setValue(seconds * 10.0, forKey: kCIInputRadiusKey)

    // Crop the blurred output to the bounds of the original image
    let output = filter.outputImage!.cropping(to: request.sourceImage.extent)

    // Provide the filter output to the composition
    request.finish(with: output, context: nil)
})

That part sets up the composition. Once you've done that, you can either play it by assigning it to an AVPlayer, or write it to a file with AVAssetExportSession. Since you're after the latter, here's an example of that:

let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPreset1920x1080)!
export.outputFileType = AVFileTypeQuickTimeMovie
export.outputURL = outURL
export.videoComposition = composition

export.exportAsynchronously(completionHandler: /*...*/)
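
And since the same composition also drives playback, here's the corresponding sketch for the playback case mentioned above, reusing the asset and composition from the earlier snippets:

// Assign the filtered composition to a player item and play it directly
let item = AVPlayerItem(asset: asset)
item.videoComposition = composition
let player = AVPlayer(playerItem: item)
player.play()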

There's more about this in the WWDC15 session on Core Image, starting around 20 minutes in.


If you want a solution that works on earlier OS versions, it's a bit more complicated.

Aside: Think about how far back you really need to support. As of August 15, 2016, 87% of devices are on iOS 9.0 or later, and 97% are on iOS 8.0 or later. Going to a lot of effort to support a small slice of your potential customer base—and it'll get even smaller by the time you get your project done and ready to deploy—might not be worth the cost.

There are a couple of ways to go about this. Either way, you'll be getting CVPixelBuffers representing the source frames, creating CIImages from them, applying filters, and rendering out new CVPixelBuffers.

  1. Use AVAssetReader and AVAssetWriter to read and write pixel buffers. There are examples of how to do this in the Export chapter of Apple's AVFoundation Programming Guide (that covers the reading and writing parts; you still need to do the filtering in between). A rough sketch of that pipeline follows this list.

  2. Use an AVVideoComposition with a custom compositor class. Your custom compositor is given AVAsynchronousVideoCompositionRequest objects that provide access to pixel buffers, plus a way for you to provide processed pixel buffers. Apple has a sample code project called AVCustomEdit that shows how to do this (again, just the part about getting and returning sample buffers; you'd want to process with Core Image instead of using their GL renderers). A skeleton of such a compositor appears at the end of this answer.
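
For option 1, here's a rough sketch of that reader/writer loop, with Core Image doing the middle step. This is not Apple's sample code: the CISepiaTone filter, the BGRA pixel format, and the single-video-track, no-audio handling are all my assumptions, and real code needs error handling and AVAssetWriterInput.requestMediaDataWhenReady(on:using:) instead of the crude polling shown here.

import AVFoundation
import CoreImage

// Sketch: read frames from inURL, filter them, and write them to outURL
func filterVideo(from inURL: URL, to outURL: URL) throws {
    let asset = AVAsset(url: inURL)
    let videoTrack = asset.tracks(withMediaType: AVMediaTypeVideo).first!

    // Reader vends decompressed BGRA pixel buffers from the source file
    let reader = try AVAssetReader(asset: asset)
    let readerOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    reader.add(readerOutput)

    // Writer re-encodes pixel buffers; the adaptor supplies a buffer pool
    let writer = try AVAssetWriter(outputURL: outURL, fileType: AVFileTypeQuickTimeMovie)
    let writerInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoWidthKey: videoTrack.naturalSize.width,
        AVVideoHeightKey: videoTrack.naturalSize.height])
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: writerInput,
        sourcePixelBufferAttributes:
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
    writer.add(writerInput)

    reader.startReading()
    writer.startWriting()
    writer.startSession(atSourceTime: kCMTimeZero)

    let context = CIContext()
    let filter = CIFilter(name: "CISepiaTone")! // placeholder filter choice

    // The filtering happens here, between the read and the write
    while let sample = readerOutput.copyNextSampleBuffer(),
          let inputBuffer = CMSampleBufferGetImageBuffer(sample) {
        let time = CMSampleBufferGetPresentationTimeStamp(sample)
        filter.setValue(CIImage(cvPixelBuffer: inputBuffer), forKey: kCIInputImageKey)

        var outputBuffer: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(nil, adaptor.pixelBufferPool!, &outputBuffer)
        context.render(filter.outputImage!, to: outputBuffer!)

        while !writerInput.isReadyForMoreMediaData { usleep(1000) } // crude backpressure
        adaptor.append(outputBuffer!, withPresentationTime: time)
    }

    writerInput.markAsFinished()
    writer.finishWriting { /* notify whoever is waiting on outURL */ }
}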

Of those two, the AVVideoComposition route is more flexible, because you can use a composition both for playback and for export.
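
For option 2, here's a skeleton of what such a compositor can look like. Again a sketch, not the AVCustomEdit code: the single-track assumption and the hard-coded CISepiaTone are mine; a real compositor would read its filter parameters from the composition instructions.

import AVFoundation
import CoreImage

// Skeleton compositor: assumes one video track, hard-codes CISepiaTone
class FilterCompositor: NSObject, AVVideoCompositing {
    private let ciContext = CIContext()
    private let filter = CIFilter(name: "CISepiaTone")!

    var sourcePixelBufferAttributes: [String : Any]? {
        return [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }
    var requiredPixelBufferAttributesForRenderContext: [String : Any] {
        return [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    }

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Nothing to cache in this sketch
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let source = request.sourceFrame(byTrackID: trackID),
              let destination = request.renderContext.newPixelBuffer() else {
            request.finish(with: NSError(domain: "FilterCompositor", code: -1, userInfo: nil))
            return
        }
        // Filter the source frame and render into the context's pixel buffer
        filter.setValue(CIImage(cvPixelBuffer: source), forKey: kCIInputImageKey)
        ciContext.render(filter.outputImage!, to: destination)
        request.finish(withComposedVideoFrame: destination)
    }
}

You'd assign that class to the customVideoCompositorClass property of an AVMutableVideoComposition, then play or export the composition just like in the iOS 9 example above.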