Custom AVVideoCompositing class not working as expected
I'm attempting to apply a CIFilter to an AVAsset and then save the asset with the filter applied. The way I'm doing this is by using an AVAssetExportSession with its videoComposition set to an AVMutableVideoComposition object that uses a custom AVVideoCompositing class.
I'm also setting the instructions of my AVMutableVideoComposition object to a custom composition instruction class (conforming to AVMutableVideoCompositionInstruction). This class is passed a track ID, along with a few other unimportant variables.
Unfortunately, I've run into a problem - the startVideoCompositionRequest: function in my custom video compositor class (conforming to AVVideoCompositing) is not being called correctly.
When I set the passthroughTrackID variable of my custom instruction class to the track ID, the startVideoCompositionRequest(request) function in my AVVideoCompositing class is never called.
However, when I don't set the passthroughTrackID variable of my custom instruction class, startVideoCompositionRequest(request) is called, but not correctly - printing request.sourceTrackIDs results in an empty array, and request.sourceFrameByTrackID(trackID) returns nil.
Something interesting I've found is that the cancelAllPendingVideoCompositionRequests: function is always called twice when attempting to export the video with the filter. It is either called once before startVideoCompositionRequest: and once after, or called twice in a row in the case where startVideoCompositionRequest: is never called.
I've created three classes for exporting the video with the filter. Here's the utility class, which essentially just contains an export function and calls all of the required code:
class VideoFilterExport{
    let asset: AVAsset
    init(asset: AVAsset){
        self.asset = asset
    }

    func export(toURL url: NSURL, callback: (url: NSURL?) -> Void){
        guard let track: AVAssetTrack = self.asset.tracksWithMediaType(AVMediaTypeVideo).first else{callback(url: nil); return}

        let composition = AVMutableComposition()
        let compositionTrack = composition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
        do{
            try compositionTrack.insertTimeRange(track.timeRange, ofTrack: track, atTime: kCMTimeZero)
        }
        catch _{callback(url: nil); return}

        let videoComposition = AVMutableVideoComposition(propertiesOfAsset: composition)
        videoComposition.customVideoCompositorClass = VideoFilterCompositor.self
        videoComposition.frameDuration = CMTimeMake(1, 30)
        videoComposition.renderSize = compositionTrack.naturalSize

        let instruction = VideoFilterCompositionInstruction(trackID: compositionTrack.trackID)
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.asset.duration)
        videoComposition.instructions = [instruction]

        let session: AVAssetExportSession = AVAssetExportSession(asset: composition, presetName: AVAssetExportPresetMediumQuality)!
        session.videoComposition = videoComposition
        session.outputURL = url
        session.outputFileType = AVFileTypeMPEG4
        session.exportAsynchronouslyWithCompletionHandler(){
            callback(url: url)
        }
    }
}
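For context, here is a quick usage sketch of the utility above. The asset variable and output path are placeholders of mine, not part of the original post:

// Hypothetical call site - assumes `videoAsset` is an AVAsset loaded elsewhere.
let exporter = VideoFilterExport(asset: videoAsset)
let outputURL = NSURL(fileURLWithPath: NSTemporaryDirectory() + "filtered.mp4")
exporter.export(toURL: outputURL){ url in
    // url is nil if the source track could not be read or inserted into the composition
    print("Export finished: \(url)")
}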
Here are the other two classes - I'll put them both in one code block to keep this post shorter:
// Video Filter Composition Instruction Class - from what I gather,
// AVVideoCompositionInstruction is used only to pass values to
// the AVVideoCompositing class
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: ImageFilterGroup
    let context: CIContext

    // When I leave this line as-is, startVideoCompositionRequest: isn't called.
    // When commented out, startVideoCompositionRequest(request) is called, but there
    // are no valid CVPixelBuffers provided by request.sourceFrameByTrackID(below value)
    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return []}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: ImageFilterGroup, context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        //self.timeRange = timeRange
        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }
}
// My custom AVVideoCompositing class. This is where the problem lies -
// although I don't know if this is the root of the problem
class VideoFilterCompositor : NSObject, AVVideoCompositing{
    var requiredPixelBufferAttributesForRenderContext: [String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA), // The video is in 32 BGRA
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]
    var sourcePixelBufferAttributes: [String : AnyObject]? = [
        kCVPixelBufferPixelFormatTypeKey as String : NSNumber(unsignedInt: kCVPixelFormatType_32BGRA),
        kCVPixelBufferOpenGLESCompatibilityKey as String : NSNumber(bool: true),
        kCVPixelBufferOpenGLCompatibilityKey as String : NSNumber(bool: true)
    ]

    let renderQueue = dispatch_queue_create("co.getblix.videofiltercompositor.renderingqueue", DISPATCH_QUEUE_SERIAL)

    override init(){
        super.init()
    }

    func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest){
        // This code block is never executed when the
        // passthroughTrackID variable is in the above class
        autoreleasepool(){
            dispatch_async(self.renderQueue){
                guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
                    request.finishWithError(NSError(domain: "getblix.co", code: 760, userInfo: nil))
                    return
                }
                guard let pixels = request.sourceFrameByTrackID(instruction.passthroughTrackID) else{
                    // This code block is executed when I comment out the
                    // passthroughTrackID variable in the above class
                    request.finishWithError(NSError(domain: "getblix.co", code: 761, userInfo: nil))
                    return
                }

                // I have not been able to get the code to reach this point
                // This function is either not called, or the guard
                // statement above executes
                let image = CIImage(CVPixelBuffer: pixels)
                let filtered: CIImage = //apply the filter here

                let width = CVPixelBufferGetWidth(pixels)
                let height = CVPixelBufferGetHeight(pixels)
                let format = CVPixelBufferGetPixelFormatType(pixels)

                var newBuffer: CVPixelBuffer?
                CVPixelBufferCreate(kCFAllocatorDefault, width, height, format, nil, &newBuffer)

                if let buffer = newBuffer{
                    instruction.context.render(filtered, toCVPixelBuffer: buffer)
                    request.finishWithComposedVideoFrame(buffer)
                }
                else{
                    request.finishWithComposedVideoFrame(pixels)
                }
            }
        }
    }

    func renderContextChanged(newRenderContext: AVVideoCompositionRenderContext){
        // I don't have any code in this block
    }

    // This is interesting - this is called twice,
    // Once before startVideoCompositionRequest is called,
    // And once after. In the case when startVideoCompositionRequest
    // Is not called, this is simply called twice in a row
    func cancelAllPendingVideoCompositionRequests(){
        dispatch_barrier_async(self.renderQueue){
            print("Cancelled")
        }
    }
}
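For reference, here is a rough sketch of what the elided "apply the filter here" step could look like. It assumes a plain array of CIFilter rather than the ImageFilterGroup type used above (whose API isn't shown in this post), so treat it as illustrative only:

// Chains CIFilters over a source image - assumes every filter accepts kCIInputImageKey.
func applyFilters(filters: [CIFilter], to image: CIImage) -> CIImage{
    var output = image
    for filter in filters{
        filter.setValue(output, forKey: kCIInputImageKey)
        if let result = filter.outputImage{
            output = result
        }
    }
    return output
}

// Inside startVideoCompositionRequest, the elided line could then become something like:
// let filtered: CIImage = applyFilters(filterArray, to: image)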
I've been looking at Apple's AVCustomEdit sample project for a lot of guidance on this, but I can't find anything in it that explains why this is happening.
How can I get the request.sourceFrameByTrackID: function to be called correctly, and provide a valid CVPixelBuffer for each frame?
All of the code for this utility is on GitHub
It turns out that the requiredSourceTrackIDs variable in the custom AVVideoCompositionInstruction class (VideoFilterCompositionInstruction in the question) has to be set to an array containing the track IDs:
override var requiredSourceTrackIDs: [NSValue]{
    get{
        return [
            NSNumber(value: Int(self.trackID))
        ]
    }
}
So the final custom composition instruction class is:
class VideoFilterCompositionInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    let filters: [CIFilter]
    let context: CIContext

    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return [NSNumber(value: Int(self.trackID))]}}
    override var containsTweening: Bool{get{return false}}

    init(trackID: CMPersistentTrackID, filters: [CIFilter], context: CIContext){
        self.trackID = trackID
        self.filters = filters
        self.context = context

        super.init()

        self.enablePostProcessing = true
    }

    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}
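With the track ID now listed in requiredSourceTrackIDs, the compositor's frame lookup succeeds. As a minimal sketch (not the full compositor from the question), the relevant part of startVideoCompositionRequest looks along these lines; the compositor in the question reads the ID through instruction.passthroughTrackID, which still works here because it returns the same track ID:

func startVideoCompositionRequest(request: AVAsynchronousVideoCompositionRequest){
    guard let instruction = request.videoCompositionInstruction as? VideoFilterCompositionInstruction else{
        request.finishWithError(NSError(domain: "getblix.co", code: 760, userInfo: nil))
        return
    }
    guard let pixels = request.sourceFrameByTrackID(instruction.trackID) else{
        request.finishWithError(NSError(domain: "getblix.co", code: 761, userInfo: nil))
        return
    }
    // `pixels` is now a valid CVPixelBuffer for this frame; filter it and call
    // request.finishWithComposedVideoFrame(...) as in the compositor shown in the question.
}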
All of the code for this utility is also on GitHub
As you've noted, having passthroughTrackID return the track you want to filter isn't the right approach - you need to return that track from requiredSourceTrackIDs instead. (And it looks like once you do that, it doesn't matter whether you also return it from passthroughTrackID.) To answer the remaining question of why it works this way...
The docs for passthroughTrackID and requiredSourceTrackIDs certainly aren't Apple's clearest writing ever. (File a bug about it and they might improve.) But if you look closely at the description of the former, there's a hint (emphasis added)...
If for the duration of the instruction, the video composition result is one of the source frames, this property returns the corresponding track ID. The compositor won't be run for the duration of the instruction and the proper source frame is used instead.
So, you use passthroughTrackID only when you're making an instruction class that passes a single track through without processing.
If you plan to perform any image processing, even if it's just to a single track with no compositing, specify that track in requiredSourceTrackIDs instead.
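To make the distinction concrete, here is a sketch with hypothetical class names of mine: one instruction that passes a track through untouched (the compositor is skipped), and one that hands the track to the compositor for processing:

// Passthrough: the compositor is never invoked for this time range;
// frames from `trackID` are used as-is.
class PassthroughInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    override var passthroughTrackID: CMPersistentTrackID{get{return self.trackID}}
    override var requiredSourceTrackIDs: [NSValue]{get{return []}}

    init(trackID: CMPersistentTrackID){
        self.trackID = trackID
        super.init()
    }
    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}

// Processing: the compositor runs, and request.sourceFrameByTrackID(trackID)
// returns a valid CVPixelBuffer, because the track is listed here.
class ProcessingInstruction : AVMutableVideoCompositionInstruction{
    let trackID: CMPersistentTrackID
    override var requiredSourceTrackIDs: [NSValue]{get{return [NSNumber(value: Int(self.trackID))]}}

    init(trackID: CMPersistentTrackID){
        self.trackID = trackID
        super.init()
    }
    required init?(coder aDecoder: NSCoder){
        fatalError("init(coder:) has not been implemented")
    }
}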