Present a pixel buffer to a MTKView
Here is my problem: I want to display a pixel buffer that I compute in an MTKView. I searched MTLTexture, MTLBuffer and other Metal objects, but I couldn't find any way to simply display a pixel buffer.
Every tutorial I've seen is about rendering 3D objects with vertex and fragment shaders.
I think the buffer has to be displayed in the drawInMTKView
function (probably with an MTLRenderCommandEncoder), but again, I couldn't find any information about that.
I hope I'm not asking an obvious question.
Thanks
I think I found a solution: https://developer.apple.com/documentation/metal/creating_and_sampling_textures?language=objc.
In this sample, they show how to render an image to a Metal view, using just a few vertex and fragment shaders to render a texture onto a 2D quad.
I'll go from there. Not sure if there is a better (simpler?) way to do it, but I guess this is how Metal wants us to do it.
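If you follow that sample's approach, the first step is getting the CVPixelBuffer into an MTLTexture that the textured quad can sample. Here is a minimal sketch of one way to do that with a CVMetalTextureCache; the helper names makeTextureCache and makeTexture are mine, and it assumes a BGRA, Metal-compatible pixel buffer:

import CoreVideo
import Metal

// Hypothetical helpers: wrap a BGRA CVPixelBuffer in an MTLTexture via a CVMetalTextureCache.
func makeTextureCache(device: MTLDevice) -> CVMetalTextureCache? {
    var cache: CVMetalTextureCache?
    CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &cache)
    return cache
}

func makeTexture(from pixelBuffer: CVPixelBuffer, cache: CVMetalTextureCache) -> MTLTexture? {
    let width = CVPixelBufferGetWidth(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    var cvTexture: CVMetalTexture?
    // Assumes the pixel buffer is kCVPixelFormatType_32BGRA and was created
    // with the kCVPixelBufferMetalCompatibilityKey attribute set.
    CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                              cache,
                                              pixelBuffer,
                                              nil,
                                              .bgra8Unorm,
                                              width,
                                              height,
                                              0,
                                              &cvTexture)
    return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
}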
Welcome!
I suggest using Core Image to render the content of the pixel buffer into the view. This requires the least amount of manual Metal setup.
Set up the MTKView
and some required objects as follows (assuming you have a view controller and a storyboard set up):
import UIKit
import CoreImage
import MetalKit

class PreviewViewController: UIViewController {

    @IBOutlet weak var metalView: MTKView!

    var device: MTLDevice!
    var commandQueue: MTLCommandQueue!
    var ciContext: CIContext!

    var pixelBuffer: CVPixelBuffer?

    override func viewDidLoad() {
        super.viewDidLoad()

        self.device = MTLCreateSystemDefaultDevice()
        self.commandQueue = self.device.makeCommandQueue()

        self.metalView.delegate = self
        self.metalView.device = self.device
        // this allows us to render into the view's drawable
        self.metalView.framebufferOnly = false

        self.ciContext = CIContext(mtlDevice: self.device)
    }

}
In the delegate method you use Core Image to transform the pixel buffer to fit the contents of the view (this is a bonus, adapt it to your use case) and render it using the CIContext:
extension PreviewViewController: MTKViewDelegate {

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {
        // not needed here, but required by MTKViewDelegate
    }

    func draw(in view: MTKView) {
        guard let pixelBuffer = self.pixelBuffer,
              let commandBuffer = self.commandQueue.makeCommandBuffer() else { return }

        // turn the pixel buffer into a CIImage so we can use Core Image for rendering into the view
        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // bonus: transform the image to aspect-fit the view's bounds
        let drawableSize = view.drawableSize
        let scaleX = drawableSize.width / image.extent.width
        let scaleY = drawableSize.height / image.extent.height
        let scale = min(scaleX, scaleY)
        let scaledImage = image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        // center in the view
        let originX = max(drawableSize.width - scaledImage.extent.size.width, 0) / 2
        let originY = max(drawableSize.height - scaledImage.extent.size.height, 0) / 2
        let centeredImage = scaledImage.transformed(by: CGAffineTransform(translationX: originX, y: originY))

        // Create a render destination that allows to lazily fetch the target texture
        // which allows the encoder to process all CI commands _before_ the texture is actually available.
        // This gives a nice speed boost because the CPU doesn't need to wait for the GPU to finish
        // before starting to encode the next frame.
        // Also note that we don't pass a command buffer here, because according to Apple:
        // "Rendering to a CIRenderDestination initialized with a commandBuffer requires encoding all
        // the commands to render an image into the specified buffer. This may impact system responsiveness
        // and may result in higher memory usage if the image requires many passes to render."
        let destination = CIRenderDestination(width: Int(drawableSize.width),
                                              height: Int(drawableSize.height),
                                              pixelFormat: view.colorPixelFormat,
                                              commandBuffer: nil,
                                              mtlTextureProvider: { () -> MTLTexture in
                                                  // the drawable is only requested here, when Core Image actually needs the texture
                                                  return view.currentDrawable!.texture
                                              })

        // render into the view's drawable
        let _ = try! self.ciContext.startTask(toRender: centeredImage, to: destination)

        // present the drawable
        guard let drawable = view.currentDrawable else { return }
        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

}
There is a simpler way to render into the view's drawable texture than using a CIRenderDestination,
but this approach is recommended if you want to achieve high frame rates (see the comments).
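For completeness, here is a minimal sketch of what that simpler, more synchronous path could look like, rendering straight into the drawable's texture with CIContext.render. The method name drawSimple(in:) is mine (you would call this from draw(in:) instead of the CIRenderDestination path), and the device RGB color space is an assumption, so use whatever matches your pixel buffer:

    func drawSimple(in view: MTKView) {
        guard let pixelBuffer = self.pixelBuffer,
              let commandBuffer = self.commandQueue.makeCommandBuffer(),
              let drawable = view.currentDrawable else { return }

        let image = CIImage(cvPixelBuffer: pixelBuffer)

        // render directly into the drawable's texture (requires framebufferOnly = false, as set above)
        self.ciContext.render(image,
                              to: drawable.texture,
                              commandBuffer: commandBuffer,
                              bounds: CGRect(origin: .zero, size: view.drawableSize),
                              colorSpace: CGColorSpaceCreateDeviceRGB())

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }

The drawback is that the CPU waits on the current drawable before encoding, which is why the CIRenderDestination variant above is the better fit for high frame rates.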