ARAnchor for SCNNode
I'm trying to get the anchor of an SCNNode after adding it to the scene of an ARSCNView. My understanding is that an anchor should be created automatically, but I can't seem to retrieve it.
Here's how I add the node. It is kept in a variable called testNode.
let node = SCNNode()
node.geometry = SCNBox(width: 0.5, height: 0.1, length: 0.3, chamferRadius: 0)
node.geometry?.firstMaterial?.diffuse.contents = UIColor.green
sceneView.scene.rootNode.addChildNode(node)
testNode = node
Here's how I try to retrieve it. It always prints nil.
if let testNode = testNode {
print(sceneView.anchor(for: testNode))
}
Does it not create an anchor? If so, is there another way I can retrieve it?
If you look at the Apple Docs, it states:
To track the positions and orientations of real or virtual objects
relative to the camera, create anchor objects and use the add(anchor:)
method to add them to your AR session.
As such, I believe that since you aren't using planeDetection, you need to create an ARAnchor manually if one is needed:
Whenever you place a virtual object, always add an ARAnchor representing its position and orientation to the ARSession. After moving a virtual object, remove the anchor at the old position and create a new anchor at the new position. Adding an anchor tells ARKit that a position is important, improving world tracking quality in that area and helping virtual objects appear to stay in place relative to real-world surfaces.
You can read more about this in the following post.
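To answer the original question directly, here is a minimal sketch (assuming the `sceneView` and `testNode` variables from the question) of one way to make `anchor(for:)` return a value: add an ARAnchor at the node's current world transform, then hand the node back to ARKit via the `renderer(_:nodeFor:)` delegate callback so the ARSCNView associates the two:

```
// Sketch, assuming `sceneView: ARSCNView` and `testNode: SCNNode?`
// from the question, and that the view controller is the ARSCNViewDelegate.

// 1. Create an anchor at the node's current world transform and add it:
if let testNode = testNode {
    let anchor = ARAnchor(transform: testNode.simdWorldTransform)
    sceneView.session.add(anchor: anchor)
}

// 2. Return the existing node for that anchor so ARSCNView links the two;
//    after this, sceneView.anchor(for: testNode!) should no longer be nil.
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    return testNode
}
```

Note this is only a sketch: if your session has several anchors, `nodeFor` would need to check the anchor's identifier before deciding which node to return.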
Anyway, to get you started, I first created an SCNNode variable called currentNode:
var currentNode: SCNNode?
Then, using a UITapGestureRecognizer, I created an ARAnchor manually at the touchLocation:
@objc func handleTap(_ gesture: UITapGestureRecognizer){
//1. Get The Current Touch Location
let currentTouchLocation = gesture.location(in: self.augmentedRealityView)
//2. If We Have Hit A Feature Point Get The Result
if let hitTest = augmentedRealityView.hitTest(currentTouchLocation, types: [.featurePoint]).last {
//3. Create An Anchor At The World Transform
let anchor = ARAnchor(transform: hitTest.worldTransform)
//4. Add It To The Session
augmentedRealitySession.add(anchor: anchor)
}
}
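As an aside, `hitTest(_:types:)` is deprecated on newer iOS versions; the same placement can be done with a raycast. A hedged sketch, reusing the `augmentedRealityView` and `augmentedRealitySession` names from above (iOS 13+):

```
// Sketch: raycast-based replacement for the feature-point hit test above.
if let query = augmentedRealityView.raycastQuery(from: currentTouchLocation,
                                                 allowing: .estimatedPlane,
                                                 alignment: .any),
   let result = augmentedRealitySession.raycast(query).first {
    // Anchor the tap at the raycast's world transform, as before.
    augmentedRealitySession.add(anchor: ARAnchor(transform: result.worldTransform))
}
```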
Having added the anchor, I then created the currentNode in the ARSCNViewDelegate callback like so:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
if currentNode == nil {
currentNode = SCNNode()
let nodeGeometry = SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0)
nodeGeometry.firstMaterial?.diffuse.contents = UIColor.cyan
currentNode?.geometry = nodeGeometry
currentNode?.position = SCNVector3(anchor.transform.columns.3.x, anchor.transform.columns.3.y, anchor.transform.columns.3.z)
node.addChildNode(currentNode!)
}
}
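A design note on the callback above: the `node` that ARKit passes in already sits at the anchor's transform, so instead of copying the translation out of `anchor.transform.columns.3`, the child node can simply stay at its local origin. A sketch of that equivalent variant:

```
// Sketch: the ARKit-provided `node` tracks the anchor, so a child
// added at the local origin inherits the anchor's position automatically.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard currentNode == nil else { return }
    let box = SCNNode(geometry: SCNBox(width: 0.2, height: 0.2, length: 0.2, chamferRadius: 0))
    box.geometry?.firstMaterial?.diffuse.contents = UIColor.cyan
    node.addChildNode(box)
    currentNode = box
}
```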
To test that it worked, e.g. being able to log the corresponding ARAnchor, I changed the tapGesture method to include this at the end:
if let anchorHitTest = augmentedRealityView.hitTest(currentTouchLocation, options: nil).first {
print(augmentedRealityView.anchor(for: anchorHitTest.node))
}
Which printed the following in my console log:
Optional(<ARAnchor: 0x1c0535680 identifier="23CFF447-68E9-451D-A64D-17C972EB5F4B" transform=<translation=(-0.006610 -0.095542 -0.357221) rotation=(-0.00° 0.00° 0.00°)>>)
Hope it helps...