How to identify if user clicks on object that has been shown on screen using ARKit - swift

I am new to iOS programming. I want to build an app that lets the user tap on a specific object shown on the screen. I am using addGestureRecognizer so I can tell when the displayed object has been tapped, and then I just want to add another object to the screen.

Here is what I have done so far:

objpizza = make2dNode(image: #imageLiteral(resourceName: "pizza"), width: 0.07, height: 0.07)
objpizza.position = SCNVector3(0, 0, -0.2)
objpizza.name = "none"

self.arView.addGestureRecognizer(UIGestureRecognizer(target: self, action: #selector(selectObject)))
arView.scene.rootNode.addChildNode(objpizza)

Here is the make2dNode function, which just sizes the object:

func make2dNode(image: UIImage, width: CGFloat = 0.1, height: CGFloat = 0.1) -> SCNNode {
    let plane = SCNPlane(width: width, height: height)
    plane.firstMaterial!.diffuse.contents = image
    let node = SCNNode(geometry: plane)
    node.constraints = [SCNBillboardConstraint()]
    return node
}

And here is the function that is never called when I wire it up with self.arView.addGestureRecognizer(UIGestureRecognizer(target: self, action: #selector(selectObject))):

@objc func selectObject() {

    print("Image has been selected")
}

You need to implement:

let tapGesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
arView.addGestureRecognizer(tapGesture)

And the function that recognizes when the user has tapped:

@objc func didTap(_ gesture: UITapGestureRecognizer) {
  object2 = make2dNode(image: nameOfimage, width: 0.07, height: 0.07)
  object2.position = SCNVector3(0, 0, 0.2)
  arView.scene.rootNode.addChildNode(object2)
}
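
For completeness, here is a minimal sketch of how the whole setup from the question could look once a concrete recognizer is used (the arView outlet, the pizza image and make2dNode come from the question; the node name and the didTap body here are only illustrative). The key point is that a plain UIGestureRecognizer never recognizes anything on its own, which is why selectObject was never called:

import UIKit
import ARKit
import SceneKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARSCNView!   // Assumed Outlet Name, Taken From The Question
    var objpizza = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        //1. Build The Pizza Node Exactly As In The Question
        objpizza = make2dNode(image: #imageLiteral(resourceName: "pizza"), width: 0.07, height: 0.07)
        objpizza.position = SCNVector3(0, 0, -0.2)
        objpizza.name = "pizza"
        arView.scene.rootNode.addChildNode(objpizza)

        //2. Use A Concrete Subclass; A Plain UIGestureRecognizer Never Fires Its Action
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(didTap(_:)))
        arView.addGestureRecognizer(tapGesture)
    }

    //3. Called Whenever The View Is Tapped
    @objc func didTap(_ gesture: UITapGestureRecognizer) {
        print("The view was tapped")
    }

    //4. The Helper From The Question, Unchanged
    func make2dNode(image: UIImage, width: CGFloat = 0.1, height: CGFloat = 0.1) -> SCNNode {
        let plane = SCNPlane(width: width, height: height)
        plane.firstMaterial!.diffuse.contents = image
        let node = SCNNode(geometry: plane)
        node.constraints = [SCNBillboardConstraint()]
        return node
    }
}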

Happy coding, bro!

Detecting touches on an SCNNode requires more than just adding a UITapGestureRecognizer to your view.

In order to detect which SCNNode you have touched, you need to use an SCNHitTest (in conjunction with your gesture recognizer), which is:

The process of finding elements of a scene located at a specified point, or along a specified line segment (or ray).

An SCNHitTest looks for:

SCNGeometry objects along the ray you specify. For each intersection between the ray and a geometry, SceneKit creates a hit-test result to provide information about both the SCNNode object containing the geometry and the location of the intersection on the geometry’s surface.
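
To make that a little more concrete, an SCNHitTestResult exposes both pieces of information mentioned in the quote. A minimal sketch (the function and parameter names here are just illustrative, not from the question):

import UIKit
import SceneKit

func logFirstHit(at point: CGPoint, in sceneView: SCNView) {
    //1. hitTest(_:options:) Returns Results Sorted Nearest-First By Default
    guard let result = sceneView.hitTest(point, options: nil).first else { return }

    //2. The SCNNode Whose Geometry The Ray Intersected
    print("Hit node:", result.node.name ?? "unnamed")

    //3. Where On That Geometry's Surface The Intersection Occurred
    print("Local coordinates:", result.localCoordinates)
    print("World coordinates:", result.worldCoordinates)
}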

OK, you may be thinking, but how does this work in my case?

Well, let's begin by creating an SCNNode with an SCNSphere geometry and adding it to our scene:

//1. Create An SCNNode With An SCNSphere Geometry
let nodeOneGeometry = SCNSphere(radius: 0.2)

//2. Set Its Colour To Cyan
nodeOneGeometry.firstMaterial?.diffuse.contents = UIColor.cyan

//3. Assign The Geometry To The Node
nodeOne = SCNNode(geometry: nodeOneGeometry)

//4. Assign A Name For Our Node
nodeOne.name = "Node One"

//5. Position It & Add It To Our ARSCNView
nodeOne.position = SCNVector3(0, 0, -1.5)
augmentedRealityView.scene.rootNode.addChildNode(nodeOne)

You will notice here that I have assigned a name to our SCNNode, which makes keeping track of it (e.g. identifying it from a hitTest) much easier.

Now that we have added our SCNNode to the hierarchy, let's create a UITapGestureRecognizer like so:

//1. Create A UITapGestureRecognizer & Add It To Our MainView
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(checkNodeHit(_:)))
tapGesture.numberOfTapsRequired = 1
self.view.addGestureRecognizer(tapGesture)

Now that we have everything set up, we need to create our checkNodeHit function to detect which node the user has tapped:

/// Runs An SCNHitTest To Check If An SCNNode Has Been Hit
///
/// - Parameter gesture: UITapGestureRecognizer
@objc func checkNodeHit(_ gesture: UITapGestureRecognizer){

    //1. Get The Current Touch Location In The View
    let currentTouchLocation = gesture.location(in: self.augmentedRealityView)

    //2. Perform An SCNHitTest To Determine If We Have Hit An SCNNode
    guard let hitTestNode = self.augmentedRealityView.hitTest(currentTouchLocation, options: nil).first?.node else { return }

    if hitTestNode.name == "Node One"{

        print("The User Has Successfuly Tapped On \(hitTestNode.name!)")

    }
}
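
Tying this back to the question, the same pattern can be used to add another object only when a particular node is tapped. A rough sketch of a variation of checkNodeHit, reusing the make2dNode helper from the question (the node and image names here are just placeholders):

@objc func checkNodeHit(_ gesture: UITapGestureRecognizer){

    //1. Get The Current Touch Location In The View
    let currentTouchLocation = gesture.location(in: self.augmentedRealityView)

    //2. Perform An SCNHitTest To Determine If We Have Hit An SCNNode
    guard let hitTestNode = self.augmentedRealityView.hitTest(currentTouchLocation, options: nil).first?.node else { return }

    //3. Only React When The Pizza Node Was Tapped
    if hitTestNode.name == "pizza" {

        //4. Add A Second Object Slightly Above The First One
        let secondNode = make2dNode(image: #imageLiteral(resourceName: "pizza"), width: 0.07, height: 0.07)
        secondNode.position = SCNVector3(0, 0.1, -0.2)
        secondNode.name = "second pizza"
        self.augmentedRealityView.scene.rootNode.addChildNode(secondNode)
    }
}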

Now, if you wanted to place an SCNNode where the user tapped, you would have to use an ARSCNHitTest instead (ARSCNView's hitTest(_:types:)), which provides:

Information about a real-world surface found by examining a point in the device camera view of an AR session.

By doing this, we can then use the worldTransform property of the result to place virtual content at that location.

For your reference, the worldTransform is:

The position and orientation of the hit test result relative to the world coordinate system.

Positioning in ARKit is expressed in the session's world coordinate system. Again, let's look at how we can use this to place a virtual object:

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Get The Current Touch Location In Our ARSCNView & Perform An ARSCNHitTest For Any Viable Feature Points
    guard let currentTouchLocation = touches.first?.location(in: self.augmentedRealityView),
          let hitTest = self.augmentedRealityView.hitTest(currentTouchLocation, types: .featurePoint).first else { return }

    //2. Get The World Transform From The HitTest & Get The Positional Data From The Matrix (3rd Column)
    let worldPositionFromTouch = hitTest.worldTransform.columns.3

    //3. Create An SCNNode At The Touch Location
    let boxNode = SCNNode()
    let boxGeometry = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
    boxGeometry.firstMaterial?.diffuse.contents = UIColor.cyan
    boxNode.geometry = boxGeometry
    boxNode.position = SCNVector3(worldPositionFromTouch.x, worldPositionFromTouch.y, worldPositionFromTouch.z)

    //4. Add It To The Scene Hierarchy
    self.augmentedRealityView.scene.rootNode.addChildNode(boxNode)

}
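
As a side note, a .featurePoint hit-test only returns a result once ARKit has detected feature points near the touch location. If you are also running plane detection, you can ask for several result types in a single call and simply take the nearest result; a small variation of the method above (still assuming the augmentedRealityView property):

override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {

    //1. Ask For Plane-Based Results As Well As Feature Points In One Hit-Test
    let hitTestTypes: ARHitTestResult.ResultType = [.existingPlaneUsingExtent, .featurePoint]

    //2. Perform The Hit-Test; Results Come Back Sorted Nearest-First
    guard let currentTouchLocation = touches.first?.location(in: self.augmentedRealityView),
          let hitTest = self.augmentedRealityView.hitTest(currentTouchLocation, types: hitTestTypes).first else { return }

    //3. Use The worldTransform Exactly As Before
    let worldPositionFromTouch = hitTest.worldTransform.columns.3
    print("Tapped world position:", worldPositionFromTouch.x, worldPositionFromTouch.y, worldPositionFromTouch.z)
}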

Hope it helps...