Detecting finger on AR object?

I'm trying to create a mobile AR app that places a virtual piano keyboard on a flat surface and lets me play the keys with my fingers.

1) Which tool would be the easiest way to achieve this? (e.g. Vuforia, ARKit)

2) Is it possible to have my finger appear in front of the piano keyboard rather than behind it?

Thanks in advance for your help.

What framework to choose for working with Occlusion:

Both the ARKit and Vuforia frameworks support occlusion. For me, the easiest way to implement occlusion in an AR app is to use ARKit in Xcode. If you prefer Vuforia in Unity and its UI setup, you're welcome to use that instead.

How to use Occlusion in ARKit and Vuforia:

To composite a real hand/finger over an AR object, you have to use the People Occlusion feature in ARKit or Occlusion Management in Vuforia.

Activating depth-channel compositing in ARKit via the frameSemantics instance property is very easy, but remember that it works on iOS 13+ with an A12 chip or newer:

import ARKit

let session = ARSession()
let config = ARWorldTrackingConfiguration()

// People Occlusion is available on iOS 13+ with an A12 chip or newer.
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    config.frameSemantics.insert(.personSegmentationWithDepth)
}

session.run(config)
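
As a usage sketch, in an ARSCNView-based app you would typically run this configuration on the view's own session, e.g. in viewDidLoad. The sceneView outlet name and the horizontal plane detection (for placing the keyboard on a surface) are my assumptions, not part of the original setup:

import UIKit
import ARKit

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!   // assumed outlet to the AR view in your storyboard

    override func viewDidLoad() {
        super.viewDidLoad()

        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // detect flat surfaces to place the keyboard on

        // Enable People Occlusion only on devices that support it.
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
            config.frameSemantics.insert(.personSegmentationWithDepth)
        }

        sceneView.session.run(config)
    }
}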

If you want to see how to turn on Occlusion Management in Unity/Vuforia, download the CylinderTargets sample, which demonstrates the use of a cylinder object together with an occlusion effect.

How to implement a touch:

To register a touch (when a real finger touches an AR piano key) you must implement a collision. You have to use some image-detection feature to find where the fingertip (the nail) is, place an anchor there, and tether a small invisible sphere to that anchor. You need to constantly update the position of that anchor, ideally at 60 fps. Then create one collision shape for the piano key and a second collision shape for the sphere, and allow them to collide. If you're developing with ARKit, use the SceneKit framework to implement the collision event (see the sketch below); if you're developing with Vuforia, use the official documentation to find out how to do it.
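
Here is a minimal SceneKit sketch of that collision setup, assuming the fingertip position is supplied by your own detection code (not shown). The class name PianoViewController, the sceneView outlet, the helpers preparePianoKey(_:) and updateFingertip(worldPosition:), the bit masks and the sphere radius are all illustrative assumptions; also, instead of a contact-delegate event, this sketch simply queries the physics world with contactTest(with:options:) each time the fingertip moves:

import UIKit
import ARKit
import SceneKit

// Sketch: a small invisible sphere follows the detected fingertip, each piano-key
// node gets its own physics shape, and the physics world is queried for contacts
// whenever the fingertip moves.
class PianoViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!   // assumed outlet to the AR view

    // Illustrative bit masks for the two collision categories.
    let fingertipCategory = 1 << 0
    let pianoKeyCategory  = 1 << 1

    var fingertipNode = SCNNode()

    override func viewDidLoad() {
        super.viewDidLoad()

        // Small invisible sphere tethered to the fingertip anchor.
        let sphere = SCNSphere(radius: 0.005)                    // roughly fingertip-sized
        fingertipNode = SCNNode(geometry: sphere)
        fingertipNode.opacity = 0.0                              // invisible, physics only
        fingertipNode.physicsBody = SCNPhysicsBody(
            type: .kinematic,
            shape: SCNPhysicsShape(geometry: sphere, options: nil))
        fingertipNode.physicsBody?.categoryBitMask = fingertipCategory
        sceneView.scene.rootNode.addChildNode(fingertipNode)
    }

    // Give each piano-key node a static body so the fingertip sphere can hit it.
    func preparePianoKey(_ keyNode: SCNNode) {
        keyNode.physicsBody = SCNPhysicsBody(
            type: .static,
            shape: SCNPhysicsShape(node: keyNode, options: nil))
        keyNode.physicsBody?.categoryBitMask = pianoKeyCategory
    }

    // Call this every frame with the latest fingertip position from your detector.
    func updateFingertip(worldPosition: SCNVector3) {
        fingertipNode.worldPosition = worldPosition
        fingertipNode.physicsBody?.resetTransform()              // sync the body with the node

        guard let fingerBody = fingertipNode.physicsBody else { return }
        let contacts = sceneView.scene.physicsWorld.contactTest(with: fingerBody, options: nil)
        for contact in contacts {
            let other = (contact.nodeA == fingertipNode) ? contact.nodeB : contact.nodeA
            if other.physicsBody?.categoryBitMask == pianoKeyCategory {
                print("Key touched: \(other.name ?? "unnamed")") // trigger the note here
            }
        }
    }
}

An event-driven alternative is SceneKit's SCNPhysicsContactDelegate: set contactTestBitMask on both bodies, assign the scene's physicsWorld.contactDelegate, and implement physicsWorld(_:didBegin:), which is the "collision event" mentioned above.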