How to add a black and white filter on ARKit (Swift 4)

I simply want to convert the basic ARKit view to a black and white view. Right now the basic view renders normally, and I don't know how to add the filter. Ideally, when taking a screenshot, the black and white filter would be applied to the screenshot.

import UIKit
import SceneKit
import ARKit

class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.showsStatistics = true
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }

    @IBAction func changeTextColour(){
        let snapShot = self.sceneView.snapshot()
        UIImageWriteToSavedPhotosAlbum(snapShot, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)
    }
}

The snapshot object should be a UIImage. Apply filters to this UIImage by importing the CoreImage framework and then applying Core Image filters to it. You should tweak the image's exposure and color-control values. For more implementation details, check this answer. Starting from iOS 6, you can also use the CIColorMonochrome filter to achieve the same effect.

Here is Apple's documentation of all the available filters. Click each filter to see how the image looks after the filter is applied.

Here is the Swift 4 code (note that this function needs to live inside a UIImage extension, since it refers to self).

func imageBlackAndWhite() -> UIImage?
{
    if let beginImage = CoreImage.CIImage(image: self)
    {
        let paramsColor: [String: Double] = [kCIInputBrightnessKey: 0.0,
                                             kCIInputContrastKey:   1.1,
                                             kCIInputSaturationKey: 0.0]
        let blackAndWhite = beginImage.applyingFilter("CIColorControls", parameters: paramsColor)

        let paramsExposure: [String: AnyObject] = [kCIInputEVKey: NSNumber(value: 0.7)]
        let output = blackAndWhite.applyingFilter("CIExposureAdjust", parameters: paramsExposure)

        guard let processedCGImage = CIContext().createCGImage(output, from: output.extent) else {
            return nil
        }

        return UIImage(cgImage: processedCGImage, scale: self.scale, orientation: self.imageOrientation)
    }
    return nil
}
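As an alternative to the CIColorControls approach above, the CIColorMonochrome filter mentioned earlier can achieve a similar result. The sketch below is an assumption of how that could look (the gray tint color and intensity of 1.0 are illustrative values, not taken from the original answer):

```swift
import UIKit
import CoreImage

extension UIImage {

    /// Converts the image to black & white using CIColorMonochrome.
    func monochromeFilter() -> UIImage? {

        guard let inputImage = CIImage(image: self) else { return nil }

        // A neutral gray tint gives a classic black & white look;
        // changing the color tints the result (e.g. sepia-like tones).
        let parameters: [String: Any] = [
            kCIInputColorKey: CIColor(red: 0.5, green: 0.5, blue: 0.5),
            kCIInputIntensityKey: 1.0
        ]

        let output = inputImage.applyingFilter("CIColorMonochrome", parameters: parameters)

        guard let cgImage = CIContext().createCGImage(output, from: output.extent) else { return nil }

        return UIImage(cgImage: cgImage, scale: self.scale, orientation: self.imageOrientation)
    }
}
```

Tweaking kCIInputIntensityKey between 0.0 and 1.0 blends between the original colors and the fully monochrome result.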

Filtering an ARSCNView snapshot: If you want to create a black and white screenshot of your ARSCNView, you can do something like this, which returns a UIImage in grayscale (augmentedRealityView refers to an ARSCNView):

/// Converts A UIImage To A High Contrast GrayScaleImage
///
/// - Returns: UIImage
func highContrastBlackAndWhiteFilter() -> UIImage?
{
    //1. Convert It To A CIImage
    guard let convertedImage = CIImage(image: self) else { return nil }

    //2. Set The Filter Parameters
    let filterParameters = [kCIInputBrightnessKey: 0.0,
                            kCIInputContrastKey:   1.1,
                            kCIInputSaturationKey: 0.0]

    //3. Apply The Basic Filter To The Image
    let imageToFilter = convertedImage.applyingFilter("CIColorControls", parameters: filterParameters)

    //4. Set The Exposure
    let exposure =  [kCIInputEVKey: NSNumber(value: 0.7)]

    //5. Process The Image With The Exposure Setting
    let processedImage = imageToFilter.applyingFilter("CIExposureAdjust", parameters: exposure)

    //6. Create A CG GrayScale Image
    guard let grayScaleImage = CIContext().createCGImage(processedImage, from: processedImage.extent) else { return nil }

    return UIImage(cgImage: grayScaleImage, scale: self.scale, orientation: self.imageOrientation)
}

So an example of using this could be something like:

 override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Create A UIImageView Dynamically
    let imageViewResult = UIImageView(frame: CGRect(x: 0, y: 0, width: self.view.bounds.width, height: self.view.bounds.height))
    self.view.addSubview(imageViewResult)

    //2. Create The Snapshot & Get The Black & White Image
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }
    imageViewResult.image = snapShotImage

    //3. Remove The ImageView After A Delay Of 5 Seconds
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        imageViewResult.removeFromSuperview()
    }

}

This would produce a result something like this:

To make your code reusable, you could also create an extension of UIImage:

//------------------------
//MARK: UIImage Extensions
//------------------------

extension UIImage
{

    /// Converts A UIImage To A High Contrast GrayScaleImage
    ///
    /// - Returns: UIImage
    func highContrastBlackAndWhiteFilter() -> UIImage?
    {
        //1. Convert It To A CIImage
        guard let convertedImage = CIImage(image: self) else { return nil }

        //2. Set The Filter Parameters
        let filterParameters = [kCIInputBrightnessKey: 0.0,
                                kCIInputContrastKey:   1.1,
                                kCIInputSaturationKey: 0.0]

        //3. Apply The Basic Filter To The Image
        let imageToFilter = convertedImage.applyingFilter("CIColorControls", parameters: filterParameters)

        //4. Set The Exposure
        let exposure =  [kCIInputEVKey: NSNumber(value: 0.7)]

        //5. Process The Image With The Exposure Setting
        let processedImage = imageToFilter.applyingFilter("CIExposureAdjust", parameters: exposure)

        //6. Create A CG GrayScale Image
        guard let grayScaleImage = CIContext().createCGImage(processedImage, from: processedImage.extent) else { return nil }

        return UIImage(cgImage: grayScaleImage, scale: self.scale, orientation: self.imageOrientation)
    }

}

Which you can then easily use like so:

guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }

Remember that you should place the extension above the class declaration, e.g.:

extension UIImage{

}

class ViewController: UIViewController, ARSCNViewDelegate {

}

As such, based on the code provided in your question, you would end up with something like this:

/// Creates A Black & White ScreenShot & Saves It To The Photo Album
@IBAction func changeTextColour(){

    //1. Create A Snapshot
    guard let snapShotImage = self.augmentedRealityView.snapshot().highContrastBlackAndWhiteFilter() else { return }

    //2. Save It To The Photos Album
    UIImageWriteToSavedPhotosAlbum(snapShotImage, self, #selector(image(_:didFinishSavingWithError:contextInfo:)), nil)

}

/// Callback To Check Whether The Image Has Been Saved
@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {

    if let error = error {
        print("Error Saving ARKit Scene \(error)")
    } else {
        print("ARKit Scene Successfully Saved")
    }
}

Live black & white rendering: Using this brilliant answer, I was also able to render the entire camera feed in black and white using the following method:

1st. Register for the ARSessionDelegate like so:

 augmentedRealitySession.delegate = self

2nd. Then in the following delegate callback add the following:

 //-----------------------
 //MARK: ARSessionDelegate
 //-----------------------

 extension ViewController: ARSessionDelegate{

 func session(_ session: ARSession, didUpdate frame: ARFrame) {

        /*
        Full Credit To 
        */

        //1. Get The Captured Image From The Current Frame
        guard let currentBackgroundFrameImage = augmentedRealityView.session.currentFrame?.capturedImage else { return }

        //2. Lock The Pixel Buffer Before Accessing Its Base Address
        CVPixelBufferLockBaseAddress(currentBackgroundFrameImage, CVPixelBufferLockFlags(rawValue: 0))

        //3. Overwrite The Chroma Plane (Plane 1 Of The YCbCr Buffer) With Neutral Values, Leaving Only Luma
        if let pixelBufferAddressOfPlane = CVPixelBufferGetBaseAddressOfPlane(currentBackgroundFrameImage, 1) {
            let width: size_t = CVPixelBufferGetWidthOfPlane(currentBackgroundFrameImage, 1)
            let height: size_t = CVPixelBufferGetHeightOfPlane(currentBackgroundFrameImage, 1)
            memset(pixelBufferAddressOfPlane, 128, width * height * 2)
        }

        //4. Unlock The Pixel Buffer When Done
        CVPixelBufferUnlockBaseAddress(currentBackgroundFrameImage, CVPixelBufferLockFlags(rawValue: 0))

      }

 }

Which successfully renders the camera feed in black and white:

Filtering elements of an SCNScene in black & white:

As @Confused rightly said, if you decide that you want the camera feed in colour, but the contents of your AR experience in black and white, you can apply a filter directly to an SCNNode using its filters property, which is simply:

An array of Core Image filters to be applied to the rendered contents of the node.

Say, for example, that we dynamically create 3 SCNNodes with a sphere geometry; we can apply a CoreImageFilter to these directly like so:

/// Creates 3 Objects And Adds Them To The Scene (Rendering Them In GrayScale)
func createObjects(){

    //1. Create An Array Of UIColors To Set As The Geometry Colours
    let colours = [UIColor.red, UIColor.green, UIColor.yellow]

    //2. Create An Array Of The X Positions Of The Nodes
    let xPositions: [CGFloat] = [-0.3, 0, 0.3]

    //3. Create The Nodes & Add Them To The Scene
    for i in 0 ..< 3{

        let sphereNode = SCNNode()
        let sphereGeometry = SCNSphere(radius: 0.1)
        sphereGeometry.firstMaterial?.diffuse.contents = colours[i]
        sphereNode.geometry = sphereGeometry
        sphereNode.position = SCNVector3( xPositions[i], 0, -1.5)
        augmentedRealityView.scene.rootNode.addChildNode(sphereNode)

        //a. Create A Black & White Filter & Apply It To The Node
        guard let blackAndWhiteFilter = CIFilter(name: "CIColorControls", withInputParameters: [kCIInputSaturationKey: 0.0]) else { return }
        blackAndWhiteFilter.name = "bw"
        sphereNode.filters = [blackAndWhiteFilter]
    }

}

Which would produce a result something like this:

For a full list of these filters, you can refer to the following: CoreImage Filter Reference

Example project: Here is a complete Example Project which you can download and explore for yourself.

Hope it helps...

If you want to apply the filter in real time, the best approach is to use SCNTechnique. Techniques are used for post-processing and allow rendering the contents of an SCNView in multiple passes, which is exactly what we need (first render the scene, then apply an effect to it).

Here is an example project.


Plist setup

First, we need to describe a technique in a .plist file.

Here is a screenshot of the plist I came up with (for better visualization):

And here is its source:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>sequence</key>
    <array>
        <string>apply_filter</string>
    </array>
    <key>passes</key>
    <dict>
        <key>apply_filter</key>
        <dict>
            <key>metalVertexShader</key>
            <string>scene_filter_vertex</string>
            <key>metalFragmentShader</key>
            <string>scene_filter_fragment</string>
            <key>draw</key>
            <string>DRAW_QUAD</string>
            <key>inputs</key>
            <dict>
                <key>scene</key>
                <string>COLOR</string>
            </dict>
            <key>outputs</key>
            <dict>
                <key>color</key>
                <string>COLOR</string>
            </dict>
        </dict>
    </dict>
</dict>
</plist>

The topic of SCNTechniques is quite broad, and I will only quickly cover what is needed for the case at hand. To get a real grasp of what they are capable of, I recommend reading Apple's comprehensive documentation on techniques.

Technique description

passes is a dictionary containing descriptions of the passes you want the SCNTechnique to perform.

sequence is an array that specifies the order in which those passes are executed, referenced by their keys.

You do not specify the main render pass here (meaning whatever gets rendered without applying SCNTechniques); it is implied, and its resulting color can be accessed with the COLOR constant (more on that shortly).

So the only "extra" pass (besides the main one) we are going to do is apply_filter, which converts colors to black and white (it can be named anything you like; just make sure it uses the same key in both passes and sequence).

Now to the apply_filter pass itself.

Render pass description

metalVertexShader and metalFragmentShader are the names of the Metal shader functions that will be used for drawing.

draw defines what the pass is going to render. DRAW_QUAD stands for:

Render only a rectangle covering the entire bounds of the view. Use this option for drawing passes that process image buffers output by earlier passes.

This means, roughly speaking, that we are going to render a plain "image" with our render pass.

inputs specifies the input resources we will be able to use in the shaders. As I said earlier, COLOR refers to the color data provided by the main render pass.

outputs specifies the outputs. It can be color, depth or stencil, but we only need a color output. The COLOR value means that we are going to render "directly" to the screen (as opposed to rendering into an intermediate target, for example).


Metal shader

Create a .metal file with the following contents:

#include <metal_stdlib>
using namespace metal;
#include <SceneKit/scn_metal>

struct VertexInput {
    float4 position [[ attribute(SCNVertexSemanticPosition) ]];
    float2 texcoord [[ attribute(SCNVertexSemanticTexcoord0) ]];
};

struct VertexOut {
    float4 position [[position]];
    float2 texcoord;
};

// metalVertexShader
vertex VertexOut scene_filter_vertex(VertexInput in [[stage_in]])
{
    VertexOut out;
    out.position = in.position;
    out.texcoord = float2((in.position.x + 1.0) * 0.5 , (in.position.y + 1.0) * -0.5);
    return out;
}

// metalFragmentShader
fragment half4 scene_filter_fragment(VertexOut vert [[stage_in]],
                                    texture2d<half, access::sample> scene [[texture(0)]])
{
    constexpr sampler samp = sampler(coord::normalized, address::repeat, filter::nearest);
    constexpr half3 weights = half3(0.2126, 0.7152, 0.0722);

    half4 color = scene.sample(samp, vert.texcoord);
    color.rgb = half3(dot(color.rgb, weights));

    return color;
}

Note that the function names of the fragment and vertex shaders should be the same names that are specified in the plist file in the pass descriptor.

To get a better understanding of what the VertexInput and VertexOut structures mean, refer to the SCNProgram documentation.

The given vertex function can be used in pretty much any DRAW_QUAD render pass. It basically gives us the normalized coordinate space of the screen (which is accessed with vert.texcoord in the fragment shader).

The fragment function is where all the "magic" happens. There, you can manipulate the texture you got from the main pass. With this setup you can implement a ton of filters/effects and more.

In our case, I used a basic desaturation (zero saturation) formula to get the black and white look (the weights are the standard Rec. 709 luma coefficients).


Swift setup

Now we can finally use all of this in ARKit/SceneKit.

let plistName = "SceneFilterTechnique" // the name of the plist you've created

guard let url = Bundle.main.url(forResource: plistName, withExtension: "plist") else {
    fatalError("\(plistName).plist does not exist in the main bundle")
}

guard let dictionary = NSDictionary(contentsOf: url) as? [String: Any] else {
    fatalError("Failed to parse \(plistName).plist as a dictionary")
}

guard let technique = SCNTechnique(dictionary: dictionary) else {
    fatalError("Failed to initialize a technique using \(plistName).plist")
}

And set it as the ARSCNView's technique:

sceneView.technique = technique

That's it. Now the whole scene will be rendered in grayscale, including when taking snapshots.
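Since technique is an ordinary optional property, the effect can also be toggled at runtime. The following is a minimal sketch, assuming a sceneView outlet and the technique loaded from the plist as shown above (the class and action names are hypothetical):

```swift
import UIKit
import ARKit

final class FilterToggleViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    /// The grayscale technique loaded from the plist, as shown above.
    var grayscaleTechnique: SCNTechnique?

    /// Toggles the black & white post-processing on and off.
    @IBAction func toggleFilter(_ sender: UIButton) {
        if sceneView.technique == nil {
            sceneView.technique = grayscaleTechnique
        } else {
            // Setting the technique to nil restores the default rendering.
            sceneView.technique = nil
        }
    }
}
```

Because the technique applies at the view level, snapshots taken while it is active come out in grayscale too.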

This is probably the easiest and fastest way:

Apply a CoreImage filter to the scene:

https://developer.apple.com/documentation/scenekit/scnnode/1407949-filters

This filter gives a very good black & white photo impression, with nice gray transitions: https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIPhotoEffectMono

You can also use this one, and with it the hue of the result is easy to change as well:

https://developer.apple.com/library/content/documentation/GraphicsImaging/Reference/CoreImageFilterReference/index.html#//apple_ref/doc/filter/ci/CIColorMonochrome
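Applying one of these filters via the filters property mentioned above could be sketched like this (an assumption: the filter is attached to the scene's root node; node filters affect the node's rendered contents, not the ARKit camera background, so the camera feed stays in colour):

```swift
import SceneKit
import CoreImage

/// Renders everything under the scene's root node in black & white.
/// CIPhotoEffectMono takes no input parameters besides the image.
func applyMonoFilter(to sceneView: SCNView) {
    guard let monoFilter = CIFilter(name: "CIPhotoEffectMono") else { return }
    monoFilter.name = "mono"
    sceneView.scene?.rootNode.filters = [monoFilter]
}
```

Depending on your scene, you may prefer to apply the filter to individual nodes instead of the root node, as shown earlier with the spheres.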

And here is proof (in Japanese) of the filter and SceneKit/ARKit working together: http://appleengine.hatenablog.com/entry/advent20171215