ARKit / SpriteKit - set pixelBufferAttributes to SKVideoNode or make transparent pixels in video (chroma-key effect) another way
My goal is to present a 2D animated character in a real-world environment using ARKit. The animated character is part of a video, shown in the following snapshot of the video:
Displaying the video itself works without any problem, using this code:
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }
    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)
    return videoNode
}
The result of this code appears in the following screenshot of the app, as expected:
But as you can see, the character's background isn't very nice, so I need to make it disappear in order to create the illusion that the character is actually standing on the horizontal plane.
I am trying to achieve this by applying a chroma-key effect to the video.
- For those unfamiliar with chroma key: it is the name of the "green screen effect" sometimes seen on TV, which makes a color transparent.
My approach to the chroma-key effect is to create a custom filter based on the "CIColorCube" CIFilter, and then apply that filter to the video using an AVVideoComposition.
First, the code for creating the filter:
func RGBtoHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    var h: CGFloat = 0
    var s: CGFloat = 0
    var v: CGFloat = 0
    let col = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: 1.0)
    col.getHue(&h, saturation: &s, brightness: &v, alpha: nil)
    return (Float(h), Float(s), Float(v))
}
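A quick sanity check of this helper (my own illustration, not part of the original post): pure green sits at 120° on the color wheel, which getHue reports as a 0-1 fraction:

let hsv = RGBtoHSV(r: 0.0, g: 1.0, b: 0.0)
// hsv.h == 0.333... (120 / 360), hsv.s == 1.0, hsv.v == 1.0
// This is why the answer below calls colorCubeFilterForChromaKey(hueAngle: 120) for a green screen.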
func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {
    let hueRange: Float = 20 // degrees size pie shape that we want to replace
    let minHueAngle: Float = (hueAngle - hueRange / 2.0) / 360
    let maxHueAngle: Float = (hueAngle + hueRange / 2.0) / 360
    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var rgb: [Float] = [0, 0, 0]
    var hsv: (h: Float, s: Float, v: Float)
    var offset = 0
    for z in 0 ..< size {
        rgb[2] = Float(z) / Float(size) // blue value
        for y in 0 ..< size {
            rgb[1] = Float(y) / Float(size) // green value
            for x in 0 ..< size {
                rgb[0] = Float(x) / Float(size) // red value
                hsv = RGBtoHSV(r: rgb[0], g: rgb[1], b: rgb[2])
                // TODO: Check if hsv.s > 0.5 is really necessary
                let alpha: Float = (hsv.h > minHueAngle && hsv.h < maxHueAngle && hsv.s > 0.5) ? 0 : 1.0
                cubeData[offset] = rgb[0] * alpha
                cubeData[offset + 1] = rgb[1] * alpha
                cubeData[offset + 2] = rgb[2] * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let b = cubeData.withUnsafeBufferPointer { Data(buffer: $0) }
    let data = b as NSData
    let colorCube = CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data
    ])
    return colorCube!
}
Then the code that applies the filter to the video, by modifying the func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? function I wrote earlier:
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }
    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let source = request.sourceImage
        filter.setValue(source, forKey: kCIInputImageKey)
        let output = filter.outputImage
        request.finish(with: output!, context: nil)
    })
    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)
    return videoNode
}
This code should give every pixel of each video frame alpha = 0.0 whenever the pixel's color falls within the background's hue range.
But instead of transparent pixels, I am getting black pixels, as you can see in this image:
Now, although this is not the desired effect, I am not surprised, since I know this is how iOS displays videos with an alpha channel.
But here is the real problem - when displaying a regular video with an AVPlayer, there is an option to add an AVPlayerLayer to the view and set pixelBufferAttributes on it to let the player layer know we are using a transparent pixel buffer, like this:
let playerLayer = AVPlayerLayer(player: player)
playerLayer.bounds = view.bounds
playerLayer.position = view.center
playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
view.layer.addSublayer(playerLayer)
This code gives us a video with a transparent background (good!) but with a fixed size and position (not good...), as you can see in this screenshot:
I would like to achieve the same effect, but on an SKVideoNode rather than an AVPlayerLayer. However, I cannot find any way to set pixelBufferAttributes on an SKVideoNode, and setting up a player layer does not achieve the expected effect with ARKit, since it is fixed in position and size.
Is there any solution to my problem, or another technique that achieves the same desired effect?
The solution is quite simple!
All you need to do is add the video as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of to the video itself (no AVVideoComposition needed).
Here is the code I used:
func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)
    // Make the video background transparent using an SKEffectNode, since chroma-key doesn't work on video
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)
    return effectNode
}
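Note that this snippet calls a videoNodeWith(resourceName:ofType:) helper that the answer does not show. Here is a minimal sketch of what it presumably does, assuming it simply wraps the same AVPlayer setup from the question:

func videoNodeWith(resourceName: String, ofType type: String) -> SKVideoNode {
    // Assumed helper: builds an SKVideoNode the same way the question's code does.
    let urlString = Bundle.main.path(forResource: resourceName, ofType: type)!
    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)
    return SKVideoNode(avPlayer: player)
}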
And here is the desired result:
Thank you! I had the same problem plus the [AR/Scene/Sprite]Kit mix. But I would suggest using this algorithm instead - it gives better results:
...
let r: [Float] = removeChromaKeyColor(r: rgb[0], g: rgb[1], b: rgb[2])
cubeData[offset] = r[0]
cubeData[offset + 1] = r[1]
cubeData[offset + 2] = r[2]
cubeData[offset + 3] = r[3]
offset += 4
...
func removeChromaKeyColor(r: Float, g: Float, b: Float) -> [Float] {
    let threshold: Float = 0.1
    let refColor: [Float] = [0, 1.0, 0, 1.0] // chroma key color
    // http://www.shaderslab.com/demo-40---video-in-video-with-green-chromakey.html
    let val = ceil(saturate(g - r - threshold)) * ceil(saturate(g - b - threshold))
    var result = lerp(a: [r, g, b, 0.0], b: refColor, w: val)
    result[3] = fabs(1.0 - result[3])
    return result
}

func saturate(_ x: Float) -> Float {
    return max(0, min(1, x))
}

func ceil(_ v: Float) -> Float {
    return -floor(-v)
}

func lerp(a: [Float], b: [Float], w: Float) -> [Float] {
    return [a[0] + w * (b[0] - a[0]), a[1] + w * (b[1] - a[1]), a[2] + w * (b[2] - a[2]), a[3] + w * (b[3] - a[3])]
}
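To see how these helpers plug into the cube-building code from the answer, here is a sketch (my own assembly, not from the comment, assuming the same 64-point cube layout as colorCubeFilterForChromaKey above) that replaces the HSV-based alpha test with removeChromaKeyColor:

func colorCubeFilterForGreenChromaKey() -> CIFilter {
    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var offset = 0
    for z in 0 ..< size {
        let blue = Float(z) / Float(size)
        for y in 0 ..< size {
            let green = Float(y) / Float(size)
            for x in 0 ..< size {
                let red = Float(x) / Float(size)
                // Pixels whose green clearly dominates red and blue get alpha 0;
                // everything else keeps alpha 1.
                let result = removeChromaKeyColor(r: red, g: green, b: blue)
                cubeData[offset] = result[0]
                cubeData[offset + 1] = result[1]
                cubeData[offset + 2] = result[2]
                cubeData[offset + 3] = result[3]
                offset += 4
            }
        }
    }
    let data = cubeData.withUnsafeBufferPointer { Data(buffer: $0) }
    return CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data as NSData
    ])!
}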