How to render a SceneKit shader at a lower resolution?
I'm using SceneKit shader modifiers to add some visual elements to my app, like this:
// A SceneKit scene with orthographic projection
let shaderBundle = Bundle(for: Self.self)
let shaderUrl = shaderBundle.url(forResource: "MyShader.frag", withExtension: nil)!
let shaderString = try! String(contentsOf: shaderUrl)
let plane = SCNPlane(width: 512, height: 512) // 1024x1024 pixels on devices with x2 screen resolution
plane.firstMaterial!.shaderModifiers = [SCNShaderModifierEntryPoint.fragment: shaderString]
let planeNode = SCNNode(geometry: plane)
rootNode.addChildNode(planeNode)
The problem is poor performance: SceneKit painstakingly renders every pixel of the plane that displays the shader. How can I lower the shader's rendering resolution while keeping the plane's on-screen size unchanged?
I've tried shrinking the plane and applying a magnifying scale transform to planeNode, to no avail: the shader was still rendered in the same fine detail as before.
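Roughly what that attempt looked like (a sketch; shaderString comes from the snippet above, and the factor of 4 is just an example):

// Sketch of the attempt: shrink the geometry, then scale the node back up.
let plane = SCNPlane(width: 512 / 4, height: 512 / 4)
plane.firstMaterial!.shaderModifiers = [SCNShaderModifierEntryPoint.fragment: shaderString]
let planeNode = SCNNode(geometry: plane)
planeNode.simdScale = simd_float3(4, 4, 1)
rootNode.addChildNode(planeNode)
// The shader still rasterizes at full screen resolution, so this didn't help.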
Using plane.firstMaterial!.diffuse.contentsTransform didn't help either (or I was doing it wrong).
I know I could make the whole SCNView smaller and then apply an affine upscaling transform if the shader were the only node in the scene, but it isn't: there are other (non-shader) nodes in the same scene, and I'd rather avoid altering their appearance in any way.
In general, without fairly new GPU features known as variable rasterization rate in Metal, or variable rate shading elsewhere, you can't make one object in a scene run its fragment shader at a different resolution from the rest of the scene.
For this case, depending on your setup, you may be able to use SCNTechnique to render the plane in a separate pass at a different resolution, then composite that back into your scene, in the same way some game engines render particles at a lower resolution to save on fill rate. Here's an example.
First, you'll need a Metal file in your project (if you already have one, just add to it) containing the following:
#include <SceneKit/scn_metal>

struct QuadVertexIn {
    float3 position [[ attribute(SCNVertexSemanticPosition) ]];
    float2 uv [[ attribute(SCNVertexSemanticTexcoord0) ]];
};

struct QuadVertexOut {
    float4 position [[ position ]];
    float2 uv;
};

// Vertex function for the compositing quad: passes the UVs through and flips Y.
vertex QuadVertexOut quadVertex(QuadVertexIn v [[ stage_in ]]) {
    QuadVertexOut o;
    o.position = float4(v.position.x, -v.position.y, 1, 1);
    o.uv = v.uv;
    return o;
}

constexpr sampler compositingSampler(coord::normalized, address::clamp_to_edge, filter::linear);

// Samples the low-resolution render target so it can be composited over the main color target.
fragment half4 compositeFragment(QuadVertexOut v [[ stage_in ]],
                                 texture2d<half, access::sample> compositeInput [[ texture(0) ]]) {
    return compositeInput.sample(compositingSampler, v.uv);
}
Then, in your SceneKit code, you can set up and apply the technique like this:
let technique = SCNTechnique(dictionary: [
    "passes": [
        "drawLowResStuff": [
            "draw": "DRAW_SCENE",
            // only draw nodes that are in this category
            "includeCategoryMask": 2,
            "colorStates": ["clear": true, "clearColor": "0.0"],
            "outputs": ["color": "lowResStuff"]
        ],
        "drawScene": [
            "draw": "DRAW_SCENE",
            // don't draw nodes that are in the low-res-stuff category
            "excludeCategoryMask": 2,
            "colorStates": ["clear": true, "clearColor": "sceneBackground"],
            "outputs": ["color": "COLOR"]
        ],
        "composite": [
            "draw": "DRAW_QUAD",
            "metalVertexShader": "quadVertex",
            "metalFragmentShader": "compositeFragment",
            // don't clear what's currently there (the rest of the scene)
            "colorStates": ["clear": false],
            // use alpha blending
            "blendStates": ["enable": true, "colorSrc": "srcAlpha", "colorDst": "oneMinusSrcAlpha"],
            // supply the lowResStuff render target to the fragment shader
            "inputs": ["compositeInput": "lowResStuff"],
            // draw into the main color render target
            "outputs": ["color": "COLOR"]
        ]
    ],
    "sequence": ["drawLowResStuff", "drawScene", "composite"],
    "targets": ["lowResStuff": ["type": "color", "scaleFactor": 0.5]]
])
// mark the plane node as belonging to the category of stuff that gets drawn in the low-res pass
myPlaneNode.categoryBitMask = 2
// apply the technique to the scene view
mySceneView.technique = technique
The test scene consists of two spheres sharing the same texture, with scaleFactor set to 0.25 instead of 0.5 to exaggerate the effect.
If you'd rather have crisp pixelation than the blurry resize described above, change filter::linear to filter::nearest in the Metal code. Also, note that the composited low-resolution content doesn't take the depth buffer into account, so if your plane is supposed to appear "behind" other objects, you'll have to do some more work in the compositing function to fix that.
It seems I managed to solve this with a sort of "render to texture" approach, by nesting a SceneKit scene inside a SpriteKit scene that is displayed by the top-level SceneKit scene.
In more detail, the following SCNNode subclass places a downscaled shader plane inside a SpriteKit SK3DNode, puts that SK3DNode into an intermediary SpriteKit SKScene, and then uses that SKScene as the diffuse contents of an upscaled plane placed in the top-level SceneKit scene.
Oddly, to preserve the original resolution I need to use scaleFactor*2, so to halve the rendering resolution (normally a scale factor of 0.5) I actually need to pass scaleFactor = 1. If anyone happens to know the reason for this strange behavior, or a workaround, please let me know in the comments.
import Foundation
import SceneKit
import SpriteKit

class ScaledResolutionFragmentShaderModifierPlaneNode: SCNNode {
    private static let nestedSCNSceneFrustumLength: CGFloat = 8

    // For shader parameter input
    let shaderPlaneMaterial: SCNMaterial

    // shaderModifier: the shader
    // planeSize: the size of the shader on the screen
    // scaleFactor: the scale to be used for the shader's rendering resolution; the lower, the faster
    init(shaderModifier: String, planeSize: CGSize, scaleFactor: CGFloat) {
        let scaledSize = CGSize(width: planeSize.width*scaleFactor, height: planeSize.height*scaleFactor)

        // Nested SceneKit scene with orthographic projection
        let nestedSCNScene = SCNScene()
        let camera = SCNCamera()
        camera.zFar = Double(Self.nestedSCNSceneFrustumLength)
        camera.usesOrthographicProjection = true
        camera.orthographicScale = Double(scaledSize.height/2)
        let cameraNode = SCNNode()
        cameraNode.camera = camera
        cameraNode.simdPosition = simd_float3(x: 0, y: 0, z: Float(Self.nestedSCNSceneFrustumLength/2))
        nestedSCNScene.rootNode.addChildNode(cameraNode)

        let shaderPlane = SCNPlane(width: scaledSize.width, height: scaledSize.height)
        shaderPlaneMaterial = shaderPlane.firstMaterial!
        shaderPlaneMaterial.shaderModifiers = [SCNShaderModifierEntryPoint.fragment: shaderModifier]
        let shaderPlaneNode = SCNNode(geometry: shaderPlane)
        nestedSCNScene.rootNode.addChildNode(shaderPlaneNode)

        // Intermediary SpriteKit scene
        let nestedSCNSceneSKNode = SK3DNode(viewportSize: scaledSize)
        nestedSCNSceneSKNode.scnScene = nestedSCNScene
        nestedSCNSceneSKNode.position = CGPoint(x: scaledSize.width/2, y: scaledSize.height/2)
        nestedSCNSceneSKNode.isPlaying = true

        let intermediarySKScene = SKScene(size: scaledSize)
        intermediarySKScene.backgroundColor = .clear
        intermediarySKScene.addChild(nestedSCNSceneSKNode)

        let intermediarySKScenePlane = SCNPlane(width: scaledSize.width, height: scaledSize.height)
        intermediarySKScenePlane.firstMaterial!.diffuse.contents = intermediarySKScene
        let intermediarySKScenePlaneNode = SCNNode(geometry: intermediarySKScenePlane)
        let invScaleFactor = 1/Float(scaleFactor)
        intermediarySKScenePlaneNode.simdScale = simd_float3(x: invScaleFactor, y: invScaleFactor, z: 1)

        super.init()
        addChildNode(intermediarySKScenePlaneNode)
    }

    required init?(coder: NSCoder) {
        fatalError()
    }
}
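A usage sketch, assuming a fragment shader modifier string named myShaderModifier (illustrative name) and the same 512x512 plane as in the question; per the scale-factor oddity noted above, passing 1 here halves the rendering resolution:

// Illustrative usage inside the top-level scene's setup.
let shaderNode = ScaledResolutionFragmentShaderModifierPlaneNode(
    shaderModifier: myShaderModifier,
    planeSize: CGSize(width: 512, height: 512),
    scaleFactor: 1 // halves the rendering resolution because of the *2 behavior described above
)
rootNode.addChildNode(shaderNode)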