RealityKit ARKit: find an anchor (or entity) from raycast - always nil
When I touch the screen, I use a raycast to place an object (an entity with its anchor) at the hit's worldTransform.
Then I try to tap this object to get its anchor back (or the entity itself).
I am trying to find the previously placed anchor using `raycast` (or `hitTest`), but everything returns nil.
Here is my `onTap` code:
@IBAction func onTap(_ sender: UITapGestureRecognizer) {
    let tapLocation = sender.location(in: arView)
    guard let result = arView.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any).first else { return }
    print("we have anchor??")
    print(result.anchor) // always nil

    // never hit
    if let existResult = arView.hitTest(tapLocation).first {
        print("hit test trigger")
        if let entity = existResult.entity as Entity? {
            NSLog("we have an entity \(entity.name)")
            ...
        }
    }
}
This is how I create the object and its anchor:
let anchor = AnchorEntity(world: position)
anchor.addChild(myObj)
arView.scene.anchors.append(anchor)
// my obj is now visible
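For reference, the placement step described above can be sketched end-to-end like this. This is only a sketch: `makeBox()` is a hypothetical factory returning a `ModelEntity`, and the anchor is built directly from the raycast hit's `worldTransform`:

```swift
import UIKit
import RealityKit
import ARKit

// Sketch: place an object at a tap location (assumes an `arView` outlet
// and a hypothetical `makeBox()` factory returning a ModelEntity).
func place(at tapLocation: CGPoint, in arView: ARView) {
    guard let hit = arView.raycast(from: tapLocation,
                                   allowing: .estimatedPlane,
                                   alignment: .any).first else { return }
    // Anchor the entity at the raycast hit's world transform.
    let anchor = AnchorEntity(world: hit.worldTransform)
    anchor.addChild(makeBox())
    arView.scene.anchors.append(anchor)
}
```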
Do you know why I can't hit the anchor?
Edit: `ARView` configuration:
arView.session.delegate = self
arView.automaticallyConfigureSession = true
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal, .vertical]
NSLog("FINISHED INIT")
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh // .meshWithClassification
    arView.environment.sceneUnderstanding.options.insert([.occlusion])
    arView.debugOptions.insert(.showSceneUnderstanding)
    NSLog("FINISHED with scene reco")
} else {
    NSLog("does not support scene reconstruction")
}
let tapGesture = UITapGestureRecognizer(target: self, action: #selector(onTap))
arView.addGestureRecognizer(tapGesture)
arView.session.run(config)
You can only raycast after ARKit has detected a plane; raycasting works only against planes or feature points. So make sure you run an AR configuration with plane detection (vertical or horizontal, depending on your case):
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal
You can check whether a plane anchor was added in the `renderer(_:didAdd:for:)` delegate method of `ARSCNViewDelegate`:
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        // plane detected
    }
}
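Note that the snippet above uses `ARSCNViewDelegate`, which belongs to SceneKit's `ARSCNView`. In a RealityKit `ARView` setup like the question's, the equivalent hook is `ARSessionDelegate`. A minimal sketch, assuming this object is assigned to `arView.session.delegate`:

```swift
import ARKit

// RealityKit has no ARSCNViewDelegate; observe newly added plane
// anchors through the ARSession delegate instead.
class SessionObserver: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("plane detected")
        }
    }
}
```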
I finally managed to find the solution: my `ModelEntity` (the anchored one) must have a collision shape! So I simply added `entity.generateCollisionShapes(recursive: true)` after creating it.
This is how I generate a simple box:
let box: MeshResource = .generateBox(width: width, height: height, depth: length)
var material = SimpleMaterial()
material.tintColor = color // `tintColor` is deprecated in newer RealityKit; use `color` instead
let entity = ModelEntity(mesh: box, materials: [material])
entity.generateCollisionShapes(recursive: true) // Very important: this enables collision and hit-testing!
return entity
After that, we must tell `arView` to listen for gestures on the entity:
arView.installGestures(.all, for: entity)
And finally:
@IBAction func onTap(_ sender: UITapGestureRecognizer) {
    let tapLocation = sender.location(in: arView)
    if let hitEntity = arView.entity(at: tapLocation) {
        print("touched")
        print(hitEntity.name)
        // touched!
        return
    }
}
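If you also need the anchor itself (as the original question asked), you can walk up from the hit entity through its `anchor` property. A sketch meant to live inside the same `onTap` handler, here used to remove the tapped object from the scene:

```swift
// Retrieve the tapped entity's AnchorEntity and remove the whole
// anchored object from the scene (assumes `arView` from the question).
let tapLocation = sender.location(in: arView)
if let hitEntity = arView.entity(at: tapLocation),
   let anchor = hitEntity.anchor {
    arView.scene.anchors.remove(anchor)
}
```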