Can I do ARKit "Continuous Image Tracking" in a World Tracking Configuration with RealityKit?

Update: My assumption that "continuous image tracking" is not possible out of the box with RealityKit ARViews turned out to be incorrect. All I needed to do was create the AnchorEntity for the continuously tracked reference image correctly.

The anchor entity needs to be created with the init(anchor: ARAnchor) initializer. (The init(world:) initializers are correct for anchors fixed in the real world, but not for an anchor that should track the reference image.)
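In other words, the difference comes down to which initializer the anchor entity uses (a minimal sketch; imageAnchor stands for the ARImageAnchor that ARKit delivers for the matched reference image):

// Pinned to a world transform captured at detection time;
// content placed here stays put even if the reference image later moves.
let pinnedAnchor = AnchorEntity(world: imageAnchor.transform)

// Attached to the ARImageAnchor itself; RealityKit keeps the content
// on the reference image as ARKit updates the anchor each frame.
let trackedAnchor = AnchorEntity(anchor: imageAnchor)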

Using ARKit and RealityKit with an ARWorldTrackingConfiguration, I am trying to do "continuous image tracking" (where the reference image is tracked on every frame, so that virtual objects can be anchored to it and appear to stay attached to it as it moves). Because a reference image is recognized only once in world tracking (unlike ARImageTrackingConfiguration, where the reference image is tracked continuously for as long as it is in frame), this is not possible out of the box.
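For comparison, continuous tracking in an image tracking configuration needs nothing more than this (a sketch, reusing the same referenceImages set that is loaded in the code below):

// ARImageTrackingConfiguration tracks reference images on every frame
// for as long as they are visible, with no extra delegate work.
let imageConfiguration = ARImageTrackingConfiguration()
imageConfiguration.trackingImages = referenceImages
imageConfiguration.maximumNumberOfTrackedImages = 1
arView.session.run(imageConfiguration)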

To get the same result in a world tracking configuration, I anchor virtual objects to the reference image in the session(_:didAdd:) delegate method, and use the session(_:didUpdate:) delegate method as an opportunity to remove the ARImageAnchor after each recognition. This causes the reference image to be re-recognized over and over, so the virtual objects can be re-anchored to the image and appear to track it frame by frame.

In the example below, I place two ball markers to track the position of the reference image. The first marker is placed only once, at the location where the reference image was initially detected. The other marker is repositioned every time the reference image is re-detected, and appears to follow it.

This works: the virtual content tracks the reference image in ARWorldTrackingConfiguration much as it would in an image tracking configuration. But while the "animation" is very smooth in ARImageTrackingConfiguration, it is noticeably less smooth and more jumpy in world tracking, as if it were running at 10 or 15 frames per second. (The actual FPS reported by .showStatistics stays close to 60 FPS in both configurations.)

I assume the difference in smoothness comes from the time ARKit spends repeatedly re-recognizing and removing the reference image anchor on every didAdd/didUpdate cycle.

I would like to know whether there is a better technique for getting "continuous image tracking" in ARWorldTrackingConfiguration, and/or whether there is any way to improve the delegate code to achieve this effect (one possible variation is sketched right after the code below).

import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!
    
    // originalImageAnchor is used to visualize the first-detected location of reference image
    // currentImageAnchor should be continuously updated to match current position of ref image
    var originalImageAnchor: AnchorEntity!
    var currentImageAnchor: AnchorEntity!
    
    let ballRadius: Float = 0.02

    override func viewDidLoad() {
        super.viewDidLoad()
        
        guard let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
             bundle: nil) else { fatalError("Missing expected asset catalog resources.") }
        
        arView.session.delegate = self
        arView.automaticallyConfigureSession = false
        arView.debugOptions = [.showStatistics]
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur,
            .disableDepthOfField, .disableFaceOcclusions, .disablePersonOcclusion,
            .disableGroundingShadows, .disableAREnvironmentLighting]

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1  // there is one ref image named "coaster_rb"

        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // Reference image detected. This will happen multiple times because
        // we delete ARImageAnchor in session(_:didUpdate:)
        if let imageName = imageAnchor.name, imageName == "coaster_rb" {

            // If originalImageAnchor is nil, create an anchor and
            // add a marker at initial position of reference image.
            if originalImageAnchor == nil {
                originalImageAnchor = AnchorEntity(world: imageAnchor.transform)
                let originalImageMarker = generateBallMarker(radius: ballRadius, color: .systemPink)
                originalImageMarker.position.y = ballRadius + (ballRadius * 2)
                originalImageAnchor.addChild(originalImageMarker)
                arView.scene.addAnchor(originalImageAnchor)
            }
            
            // If currentImageAnchor is nil, add an anchor and marker at reference image position
            // If currentImageAnchor has already been added, adjust its position to match the ref image
            if currentImageAnchor == nil {
                currentImageAnchor = AnchorEntity(world: imageAnchor.transform)
                let currentImageMarker = generateBallMarker(radius: ballRadius, color: .systemTeal)
                currentImageMarker.position.y = ballRadius
                currentImageAnchor.addChild(currentImageMarker)
                arView.scene.addAnchor(currentImageAnchor)
            } else {
                currentImageAnchor.setTransformMatrix(imageAnchor.transform, relativeTo: nil)
            }
        }
    }
    
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // Delete reference image anchor to allow for ongoing tracking as it moves
        if let imageName = imageAnchor.name, imageName == "coaster_rb" {
            arView.session.remove(anchor: anchors[0])
        }
    }
    
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
            materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}
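As an aside, here is a hedged sketch of one possible variation on the delegate code above (hypothetical, not something from the project): since a world tracking configuration with maximumNumberOfTrackedImages greater than zero keeps delivering updated ARImageAnchor transforms to session(_:didUpdate:), the delegate could reposition the existing anchor entity there instead of removing the ARImageAnchor after every recognition:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
        guard let imageAnchor = anchor as? ARImageAnchor,
              imageAnchor.name == "coaster_rb",
              let trackedEntity = currentImageAnchor else { continue }

        // Follow the updated image anchor instead of deleting it and
        // waiting for the image to be re-detected.
        trackedEntity.setTransformMatrix(imageAnchor.transform, relativeTo: nil)
    }
}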

Continuous image tracking does work out of the box with RealityKit ARViews in a world tracking configuration. A bug in my original code led me to believe otherwise.

Incorrect anchor entity initialization (for what I was trying to accomplish):

currentImageAnchor = AnchorEntity(world: imageAnchor.transform)

Because I want to track the ARImageAnchor assigned to the matched reference image, I should do this instead:

currentImageAnchor = AnchorEntity(anchor: imageAnchor)

The corrected example below places one virtual marker that stays fixed at the reference image's initial position, and a second virtual marker that smoothly tracks the reference image in a world tracking configuration:

import ARKit
import RealityKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!
    
    let ballRadius: Float = 0.02

    override func viewDidLoad() {
        super.viewDidLoad()
        
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil) else {
            fatalError("Missing expected asset catalog resources.")
        }
        
        arView.session.delegate = self
        arView.automaticallyConfigureSession = false
        arView.debugOptions = [.showStatistics]
        arView.renderOptions = [.disableCameraGrain, .disableHDR,
            .disableMotionBlur, .disableDepthOfField,
            .disableFaceOcclusions, .disablePersonOcclusion,
            .disableGroundingShadows, .disableAREnvironmentLighting]

        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 1

        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        if let imageName = imageAnchor.name, imageName == "target_image" {
            
            // AnchorEntity(world: imageAnchor.transform) results in anchoring
            // virtual content to the real world.  Content anchored like this
            // will remain in position even if the reference image moves.
            let originalImageAnchor = AnchorEntity(world: imageAnchor.transform)
            let originalImageMarker = makeBall(radius: ballRadius, color: .systemPink)
            originalImageMarker.position.y = ballRadius + (ballRadius * 2)
            originalImageAnchor.addChild(originalImageMarker)
            arView.scene.addAnchor(originalImageAnchor)

            // AnchorEntity(anchor: imageAnchor) results in anchoring
            // virtual content to the ARImageAnchor that is attached to the
            // reference image.  Content anchored like this will appear
            // stuck to the reference image.
            let currentImageAnchor = AnchorEntity(anchor: imageAnchor)
            let currentImageMarker = makeBall(radius: ballRadius, color: .systemTeal)
            currentImageMarker.position.y = ballRadius
            currentImageAnchor.addChild(currentImageMarker)
            arView.scene.addAnchor(currentImageAnchor)
        }
    }
    
    func makeBall(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
            materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}