Qt3D: How to scale a Scene2D to be the same size as the window (pixel-wise)?

I created a C++ application with an 800x600 window that successfully draws some objects in QML using Qt Quick 2 elements and Qt 3D objects:

The QML code draws a couple of green/yellow rectangles using Qt Quick 2 Rectangle elements inside a Scene2D. The 2D scene is then blitted onto one of the faces of a 3D cube so that it is rendered and displayed in the 3D world. Finally, as shown in the screenshot above, a blue SphereMesh from Qt 3D is rendered in the center.

I have been trying to resize the 3D cube (the one the 2D UI is rendered onto) so that it has the same size as the window, but I couldn't find a way to do it programmatically:

So the question is: how can I resize or scale the 3D cube so that it automatically adjusts to the same size as the window?

I'm looking for a solution where the cube has the same amount of pixels as the window. For instance, on an 800x600 window I would like to see an 800x600 green rectangle.

Here is what I have tried: I can manually adjust the value of camZ, the distance between the Camera and the center of the 3D world, and eyeball it, but that is not a precise solution: if the window is later changed to different dimensions, I would need to do a lot of testing again to figure out what the new value of camZ must be.

Any ideas?

main.cpp:

#include <QGuiApplication>
#include <QQmlContext>

#include <Qt3DQuickExtras/qt3dquickwindow.h>
#include <Qt3DQuick/QQmlAspectEngine>


int main(int argc, char **argv)
{
    QGuiApplication app(argc, argv);

    Qt3DExtras::Quick::Qt3DQuickWindow view;
    view.setSource(QUrl("qrc:/main.qml"));
    auto rootContext = view.engine()->qmlEngine()->rootContext();
    rootContext->setContextProperty("_window", &view);
    view.resize(800, 600);
    view.show();

    return app.exec();
}

main.qml:

import Qt3D.Core 2.12
import Qt3D.Render 2.12
import Qt3D.Extras 2.12
import Qt3D.Input 2.12

import QtQuick 2.0
import QtQuick.Scene2D 2.9
import QtQuick.Controls 1.4
import QtQuick.Layouts 1.2

Entity
{
    id: sceneRoot
    property int w: _window.width
    property int h: _window.height
    property real camZ: 1000

    /* setup camera */

    Camera {
        id: mainCamera
        projectionType: CameraLens.PerspectiveProjection
        fieldOfView: 45
        aspectRatio: _window.width / _window.height
        nearPlane:   0.01
        farPlane: 1000000.0
        position:    Qt.vector3d( 0.0, 0.0, sceneRoot.camZ )
        viewCenter:  Qt.vector3d( 0.0, 0.0, 0.0 )
        upVector:    Qt.vector3d( 0.0, 1.0, 0.0 )
    }

    components: [
        RenderSettings {
            activeFrameGraph: ForwardRenderer {
                camera: mainCamera
                clearColor: "white"
            }
            pickingSettings.pickMethod: PickingSettings.TrianglePicking
        },

        InputSettings {}
    ]

    /* setup a 3D cube to be used as the 2D drawing surface for all Qt Quick 2 stuff */

    Entity {
        id: drawingSurface

        CuboidMesh {
            id: planeMesh
        }

        Transform {
            id: planeTransform
            translation: Qt.vector3d(0, 0, 0)
            scale3D: Qt.vector3d(sceneRoot.w, sceneRoot.h, 1)
        }

        TextureMaterial {
            id: planeMaterial
            texture: offscreenTexture  // created by qmlTexture below
        }

        // picked up by Scene2D’s "entities" property and used as a source for events
        ObjectPicker {
            id: planePicker
            hoverEnabled: false
            dragEnabled: false
        }

        components: [ planeMesh, planeMaterial, planeTransform, planePicker ]
    }

    /* setup Scene2D offscreen texture to be used as canvas by Qt Quick 2 */

    Scene2D {
        id: qmlTexture
        output: RenderTargetOutput {
            attachmentPoint: RenderTargetOutput.Color0
            texture: Texture2D {
                id: offscreenTexture
                width: sceneRoot.w
                height: sceneRoot.h
                format: Texture.RGBA8_UNorm
                generateMipMaps: true
                magnificationFilter: Texture.Linear
                minificationFilter: Texture.LinearMipMapLinear
                wrapMode {
                    x: WrapMode.ClampToEdge
                    y: WrapMode.ClampToEdge
                }
            }
        }

        mouseEnabled: false
        entities: [ drawingSurface ]

        /* Qt Quick 2 rendering */

        Rectangle {
            width: offscreenTexture.width
            height: offscreenTexture.height
            x: 0
            y: 0
            border.color: "red"
            color: "green"

            Component.onCompleted: {
                console.log("Outter rectangle size: " + width + "x" + height + " at " + x + "," + y);
            }

            Rectangle {
                id: innerRect
                height: parent.height*0.6
                width: height
                x: (parent.width/2) - (width/2)
                y: (parent.height/2) - (height/2)
                border.color: "red"
                color: "yellow"
                transform: Rotation { origin.x: innerRect.width/2; origin.y: innerRect.height/2; angle: 45}

                Component.onCompleted: {
                    console.log("Inner rectangle size: " + width + "x" + height + " at " + x + "," + y);
                }
            }
        }

    } // Scene2D

    /* add light source at the same place as the camera */

    Entity {
        PointLight {
            id: light
            color: "white"
            intensity: 1
            constantAttenuation: 1.0
            linearAttenuation: 0.0
        }

        Transform {
            id: lightTransform
            translation: Qt.vector3d(0.0, 0.0, sceneRoot.camZ)
        }

        components: [ light, lightTransform ]
    }

    /* display 3D object */

    Entity {
        SphereMesh {
            id: mesh
            radius: 130
        }

        PhongMaterial {
            id: material
            ambient: "blue"
        }

        Transform {
            id: transform
            translation: Qt.vector3d(0, 0, 0)
        }

        components: [ mesh, material, transform ]
    }

} // sceneRoot

Add these modules to your .pro file:

QT += qml quick 3dquick 3dquickextras

Typically, when you want a texture to cover the whole screen, you use an orthographic projection. In contrast to a perspective projection, objects always appear the same size on screen regardless of their distance to the camera. This type of projection is commonly used to visualize 3D floor plans of buildings or to render UI elements in 3D.

The idea now is that your framegraph has two drawing branches:

  1. Draw the background image
  2. Draw all the objects
                     RenderSurfaceSelector
                                |
                             Viewport
                                |
          -------------------------------------------
          |             |             |             |
     ClearBuffers  LayerFilter   ClearBuffers  LayerFilter
          |             |             |             |
        NoDraw    CameraSelector    NoDraw    CameraSelector

The first ClearBuffers (from left to right) clears all buffers. The first LayerFilter filters for the background layer (which you have to attach to the background entity). The second ClearBuffers clears only the depth buffer (so that the objects are definitely drawn on top of the background). The second LayerFilter filters for the main layer (which you have to attach to all objects you want to draw).

Then create the background camera and set its projection type to orthographic projection:

Camera {
        id: backgroundCamera
        projectionType: CameraLens.OrthographicProjection
        fieldOfView: 45
        aspectRatio: sceneRoot.w / sceneRoot.h
        left: - sceneRoot.w / 2
        right: sceneRoot.w / 2
        bottom: - sceneRoot.h / 2
        top: sceneRoot.h / 2
        nearPlane:   0.1
        farPlane:    1000.0
        position:    Qt.vector3d( 0.0, 0.0, 1.0 )
        viewCenter:  Qt.vector3d( 0.0, 0.0, 0.0 )
        upVector:    Qt.vector3d( 0.0, 1.0, 0.0 )
}

You could also choose -1 and 1 for left-right and bottom-top instead of sceneRoot.w and sceneRoot.h. In that case you would have to adjust the size of the texture plane to (2, 2). I wanted to draw the clicks the user makes onto the texture, which is why I chose the screen dimensions.
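
If you go that route, a minimal sketch of the normalized variant could look like this (it reuses the ids from the full listing below; this is just an illustration of the (2, 2) plane, not code taken from a working project):

Camera {
    id: backgroundCamera
    projectionType: CameraLens.OrthographicProjection
    left:   -1
    right:   1
    bottom: -1
    top:     1
    nearPlane: 0.1
    farPlane:  1000.0
    position:   Qt.vector3d( 0.0, 0.0, 1.0 )
    viewCenter: Qt.vector3d( 0.0, 0.0, 0.0 )
    upVector:   Qt.vector3d( 0.0, 1.0, 0.0 )
}

PlaneMesh {
    id: planeMesh
    width:  2   // matches the [-1, 1] horizontal extent of the orthographic volume
    height: 2   // matches the [-1, 1] vertical extent
}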

Side note: don't use very high or very low values for nearPlane and farPlane. It says somewhere in the Qt3D documentation (I can't find it right now) that inaccuracies occur when the far plane is set larger than 100,000. The same happens if you set the near plane too small. You can look it up on the internet; this is a general issue in 3D computer graphics.

Ok, here is the full code:

import Qt3D.Core 2.12
import Qt3D.Render 2.12
import Qt3D.Extras 2.12
import Qt3D.Input 2.12

import QtQuick 2.0
import QtQuick.Scene2D 2.9
import QtQuick.Controls 1.4
import QtQuick.Layouts 1.2

Entity
{
    id: sceneRoot
    property int w: _window.width
    property int h: _window.height
    property real camZ: 1000

    components: [
        RenderSettings {
            activeFrameGraph:  RenderSurfaceSelector {
                id: surfaceSelector

                Viewport {
                    id: mainViewport
                    normalizedRect: Qt.rect(0, 0, 1, 1)

                    ClearBuffers {
                        buffers: ClearBuffers.ColorDepthBuffer
                        clearColor: Qt.rgba(0.6, 0.6, 0.6, 1.0)

                        NoDraw {
                            // Prevent drawing here, we only want to clear the buffers
                        }
                    }

                    LayerFilter {
                        id: backgroundLayerFilter

                        layers: [backgroundLayer]

                        CameraSelector {
                            id: backgroundCameraSelector
                            camera: backgroundCamera
                        }
                    }

                    ClearBuffers {
                        buffers: ClearBuffers.DepthBuffer

                        NoDraw {
                            // Prevent drawing here, we only want to clear the buffers
                        }
                    }

                    LayerFilter {
                        id: mainLayerFilter

                        layers: [mainLayer]

                        CameraSelector {
                            id: mainCameraSelector
                            camera: mainCamera
                        }
                    }
                }
            }
            pickingSettings.pickMethod: PickingSettings.TrianglePicking
        },

        InputSettings {}
    ]

    Camera {
        id: mainCamera
        projectionType: CameraLens.PerspectiveProjection
        fieldOfView: 45
        aspectRatio: _window.width / _window.height
        nearPlane:   0.1
        farPlane:    1000.0
        position:    Qt.vector3d( 0.0, 0.0, camZ )
        viewCenter:  Qt.vector3d( 0.0, 0.0, 0.0 )
        upVector:    Qt.vector3d( 0.0, 1.0, 0.0 )
    }

    /* setup orthographic background camera */

    Camera {
        id: backgroundCamera
        projectionType: CameraLens.OrthographicProjection
        fieldOfView: 45
        aspectRatio: sceneRoot.w / sceneRoot.h
        left: - sceneRoot.w / 2
        right: sceneRoot.w / 2
        bottom: - sceneRoot.h / 2
        top: sceneRoot.h / 2
        nearPlane:   0.1
        farPlane:    1000.0
        position:    Qt.vector3d( 0.0, 0.0, 1.0 )
        viewCenter:  Qt.vector3d( 0.0, 0.0, 0.0 )
        upVector:    Qt.vector3d( 0.0, 1.0, 0.0 )
    }

    /* setup a plane to be used as the 2D drawing surface for all Qt Quick 2 stuff */

    Entity {
        id: drawingSurface

        PlaneMesh {
            id: planeMesh
            width: sceneRoot.w
            height: sceneRoot.h
        }

        Transform {
            id: planeTransform
            translation: Qt.vector3d(0, 0, 0)
            rotationX: 90
        }

        TextureMaterial {
            id: planeMaterial
            texture: offscreenTexture  // created by qmlTexture below
        }

        Layer {
            id: backgroundLayer
        }

        // picked up by Scene2D’s "entities" property and used as a source for events
        ObjectPicker {
            id: planePicker
            hoverEnabled: false
            dragEnabled: false
        }

        components: [ planeMesh, planeMaterial, planeTransform, planePicker, backgroundLayer ]
    }

    /* setup Scene2D offscreen texture to be used as canvas by Qt Quick 2 */

    Scene2D {
        id: qmlTexture
        output: RenderTargetOutput {
            attachmentPoint: RenderTargetOutput.Color0
            texture: Texture2D {
                id: offscreenTexture
                width: sceneRoot.w
                height: sceneRoot.h
                format: Texture.RGBA8_UNorm
                generateMipMaps: true
                magnificationFilter: Texture.Linear
                minificationFilter: Texture.LinearMipMapLinear
                wrapMode {
                    x: WrapMode.ClampToEdge
                    y: WrapMode.ClampToEdge
                }
            }
        }

        mouseEnabled: false
        entities: [ drawingSurface ]

        /* Qt Quick 2 rendering */

        Rectangle {
            width: offscreenTexture.width
            height: offscreenTexture.height
            x: 0
            y: 0
            border.color: "red"
            color: "green"

            Component.onCompleted: {
                console.log("Outter rectangle size: " + width + "x" + height + " at " + x + "," + y);
            }

            Rectangle {
                id: innerRect
                height: parent.height*0.6
                width: height
                x: (parent.width/2) - (width/2)
                y: (parent.height/2) - (height/2)
                border.color: "red"
                color: "yellow"
                transform: Rotation { origin.x: innerRect.width/2; origin.y: innerRect.height/2; angle: 45}

                Component.onCompleted: {
                    console.log("Inner rectangle size: " + width + "x" + height + " at " + x + "," + y);
                }
            }
        }

    } // Scene2D

    /* layer for everything drawn with the main (perspective) camera */

    Layer {
        id: mainLayer
    }

    /* add light source at the same place as the camera */

    Entity {
        PointLight {
            id: light
            color: "white"
            intensity: 1
            constantAttenuation: 1.0
            linearAttenuation: 0.0
        }

        Transform {
            id: lightTransform
            translation: Qt.vector3d(0.0, 0.0, sceneRoot.camZ)
        }

        components: [ light, lightTransform, mainLayer ]
    }

    /* display 3D object */

    Entity {
        SphereMesh {
            id: mesh
            radius: 130
        }

        PhongMaterial {
            id: material
            ambient: "blue"
        }

        Transform {
            id: transform
            translation: Qt.vector3d(0, 0, 0)
        }

        components: [ mesh, material, transform, mainLayer ]
    }

} // sceneRoot

Screenshot of the result:

By the way: your code produces wrong results because of the drawing onto an offscreen surface. I suggest you create an actual offscreen rendering framegraph and draw your stuff there. Check out this very nice and informative GitHub repo and my C++ Qt3D offscreen renderer implementation.
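
The linked repo has the details; just as a rough, non-authoritative sketch of what such an offscreen framegraph could look like in QML (the id offscreenRenderTexture is made up for illustration, the other ids come from the listing above):

RenderSettings {
    activeFrameGraph: RenderSurfaceSelector {
        RenderTargetSelector {
            // Redirect all drawing into an offscreen render target instead of
            // rendering directly onto the window surface.
            target: RenderTarget {
                attachments: [
                    RenderTargetOutput {
                        attachmentPoint: RenderTargetOutput.Color0
                        texture: Texture2D {
                            id: offscreenRenderTexture   // made-up id
                            width: sceneRoot.w
                            height: sceneRoot.h
                            format: Texture.RGBA8_UNorm
                        }
                    }
                ]
            }

            Viewport {
                normalizedRect: Qt.rect(0, 0, 1, 1)

                ClearBuffers {
                    buffers: ClearBuffers.ColorDepthBuffer
                    clearColor: "white"

                    CameraSelector {
                        camera: mainCamera
                    }
                }
            }
        }
    }
}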

Maybe as a side note: you can definitely achieve the same result with a perspective projection. You can read up on perspective projection on the internet, e.g. here. Essentially you have a system of linear equations where you know the pixel coordinates (where you want the plane to appear on screen) and solve for the 3D points of the plane. But it can get complicated, and I think the solution I posted is easier to use ;)
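
Just to sketch that perspective idea (my own back-of-the-envelope math, not part of the answer above): with a perspective camera looking straight at the plane, a plane at distance camZ exactly fills the view when its height is 2 * camZ * tan(fieldOfView / 2) and its width is that height times the aspect ratio. Applied to the unit cube from the question, it could look roughly like this:

Transform {
    id: planeTransform
    // Size of a plane that exactly fills a perspective view with vertical
    // field of view mainCamera.fieldOfView (in degrees) at distance camZ.
    readonly property real fovRadians: mainCamera.fieldOfView * Math.PI / 180.0
    readonly property real fullHeight: 2.0 * sceneRoot.camZ * Math.tan(fovRadians / 2.0)
    translation: Qt.vector3d(0, 0, 0)
    scale3D: Qt.vector3d(fullHeight * mainCamera.aspectRatio, fullHeight, 1)
}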