How to draw a line between anchors on a plane with ARCore, without an ArFragment
I'm building my app around this Agora ARcore Demo based on Google's hello_ar_java Sample APP.
The app captures the user's tap and checks whether any plane has been detected in the scene; if so, it creates an anchor at that point.
I want to draw a line between the various anchors.
Everything I've found online uses Sceneform and an ArFragment.
So far I've managed to bring Sceneform in without an ArFragment, but the line doesn't show, probably because I don't know what to use in place of this call when there is no ArFragment: nodeToAdd.setParent(arFragment.getArSceneView().getScene());
To integrate Sceneform into my project I took inspiration from this LineView project.
Is there another way that doesn't use Sceneform?
This is how I handle it:
public void onDrawFrame(GL10 gl) {
    // Clear screen to notify driver it should not load any pixels from previous frame.
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    if (mSession == null) {
        return;
    }
    // Notify ARCore session that the view size changed so that the perspective matrix and
    // the video background can be properly adjusted.
    mDisplayRotationHelper.updateSessionIfNeeded(mSession);
    try {
        // Obtain the current frame from the AR session. When the configuration is set to
        // UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
        // camera framerate.
        Frame frame = mSession.update();
        Camera camera = frame.getCamera();
        // Handle taps. Handling only one tap per frame, as taps are usually low frequency
        // compared to frame rate.
        MotionEvent tap = queuedSingleTaps.poll();
        if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
            for (HitResult hit : frame.hitTest(tap)) {
                // Check if any plane was hit, and if it was hit inside the plane polygon.
                Trackable trackable = hit.getTrackable();
                // Create an anchor if a plane or an oriented point was hit.
                if ((trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose()))
                        || (trackable instanceof Point
                            && ((Point) trackable).getOrientationMode()
                                == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
                    // Hits are sorted by depth. Consider only the closest hit on a plane or oriented point.
                    // Cap the number of objects created. This avoids overloading both the
                    // rendering system and ARCore.
                    if (anchors.size() >= 250) {
                        anchors.get(0).detach();
                        anchors.remove(0);
                    }
                    // Adding an Anchor tells ARCore that it should track this position in
                    // space. This anchor is created on the Plane to place the 3D model
                    // in the correct position relative both to the world and to the plane.
                    anchors.add(hit.createAnchor());
                    break;
                }
            }
        }
        // Draw background.
        mBackgroundRenderer.draw(frame);
        // If not tracking, don't draw 3D objects.
        if (camera.getTrackingState() == TrackingState.PAUSED) {
            return;
        }
        // Get projection matrix.
        float[] projmtx = new float[16];
        camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);
        // Get camera matrix and draw.
        float[] viewmtx = new float[16];
        camera.getViewMatrix(viewmtx, 0);
        // Compute lighting from average intensity of the image.
        final float lightIntensity = frame.getLightEstimate().getPixelIntensity();
        if (isShowPointCloud()) {
            // Visualize tracked points.
            PointCloud pointCloud = frame.acquirePointCloud();
            mPointCloud.update(pointCloud);
            mPointCloud.draw(viewmtx, projmtx);
            // The application is responsible for releasing the point cloud resources
            // after using it.
            pointCloud.release();
        }
        // Check if we detected at least one plane. If so, hide the loading message.
        if (mMessageSnackbar != null) {
            for (Plane plane : mSession.getAllTrackables(Plane.class)) {
                if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING
                        && plane.getTrackingState() == TrackingState.TRACKING) {
                    hideLoadingMessage();
                    break;
                }
            }
        }
        if (isShowPlane()) {
            // Visualize planes.
            mPlaneRenderer.drawPlanes(
                    mSession.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
        }
        // Visualize anchors created by touch.
        float scaleFactor = 1.0f;
        for (Anchor anchor : anchors) {
            if (anchor.getTrackingState() != TrackingState.TRACKING) {
                continue;
            }
            // Get the current pose of an Anchor in world space. The Anchor pose is updated
            // during calls to session.update() as ARCore refines its estimate of the world.
            anchor.getPose().toMatrix(mAnchorMatrix, 0);
            // Update and draw the model and its shadow.
            mVirtualObject.updateModelMatrix(mAnchorMatrix, scaleFactor);
            mVirtualObjectShadow.updateModelMatrix(mAnchorMatrix, scaleFactor);
            mVirtualObject.draw(viewmtx, projmtx, lightIntensity);
            mVirtualObjectShadow.draw(viewmtx, projmtx, lightIntensity);
        }
        sendARViewMessage();
    } catch (Throwable t) {
        // Avoid crashing the application due to unhandled exceptions.
        Log.e(TAG, "Exception on the OpenGL thread", t);
    }
}
Then:
public void drawLineButton(View view) {
    AnchorNode anchorNode;
    for (Anchor anchor : anchors) {
        anchorNode = new AnchorNode(anchor);
        anchorNodeList.add(anchorNode);
        // This is the problem, imho: without an ArFragment there is no scene to parent to.
        // anchorNode.setParent(arFragment.getArSceneView().getScene());
        numberOfAnchors++;
    }
    if (numberOfAnchors == 2) {
        drawLine(anchorNodeList.get(0), anchorNodeList.get(1));
    }
}
Here the nodes exist and are real; I don't find any errors, and the lines don't show:
private void drawLine(AnchorNode node1, AnchorNode node2) {
    // Here the knots exist and are real. I don't find any errors, and the lines don't show.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            Vector3 point1, point2;
            point1 = node1.getWorldPosition();
            point2 = node2.getWorldPosition();
            // First, find the vector extending between the two points and define a look
            // rotation in terms of this vector.
            final Vector3 difference = Vector3.subtract(point1, point2);
            final Vector3 directionFromTopToBottom = difference.normalized();
            final Quaternion rotationFromAToB =
                    Quaternion.lookRotation(directionFromTopToBottom, Vector3.up());
            MaterialFactory.makeOpaqueWithColor(getApplicationContext(), new Color(0, 255, 244))
                    .thenAccept(
                            material -> {
                                /* Then, create a rectangular prism using ShapeFactory.makeCube(),
                                   and use the difference vector to extend it to the necessary length. */
                                Log.d(TAG, "drawLine inside .thenAccept");
                                ModelRenderable model = ShapeFactory.makeCube(
                                        new Vector3(.01f, .01f, difference.length()),
                                        Vector3.zero(), material);
                                /* Last, set the world rotation of the node to the rotation calculated
                                   earlier, and set the world position to the midpoint between the
                                   given points. */
                                Anchor lineAnchor = node2.getAnchor();
                                nodeForLine = new Node();
                                nodeForLine.setParent(node1);
                                nodeForLine.setRenderable(model);
                                nodeForLine.setWorldPosition(Vector3.add(point1, point2).scaled(.5f));
                                nodeForLine.setWorldRotation(rotationFromAToB);
                            }
                    );
        }
    });
}
Here is an example of point1, point2, and directionFromTopToBottom from my drawLine() function:
point1: [x=0.060496617, y=-0.39098215, z=-0.21526277]
point2: [x=0.05695567, y=-0.39132282, z=-0.33304527]
directionFromTopToBottom: [x=0.030049745, y=0.0028910497, z=0.9995442]
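As a sanity check on the math (not the rendering), the logged direction really is the normalized difference point1 - point2, so the look-rotation input is fine; a minimal framework-free recomputation:

```java
// Recomputes directionFromTopToBottom = normalize(point1 - point2) in plain Java,
// using the values logged above. If this matches, the geometry is correct and the
// missing line is a rendering/scene problem, not a math problem.
public class LineMath {
    // Returns the normalized vector a - b (the line direction from b toward a).
    static float[] direction(float[] a, float[] b) {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
        return new float[] { dx / len, dy / len, dz / len };
    }

    public static void main(String[] args) {
        float[] point1 = { 0.060496617f, -0.39098215f, -0.21526277f };
        float[] point2 = { 0.05695567f, -0.39132282f, -0.33304527f };
        float[] d = direction(point1, point2);
        // Matches the logged directionFromTopToBottom within float precision.
        System.out.printf("[x=%f, y=%f, z=%f]%n", d[0], d[1], d[2]);
    }
}
```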
Your code never calls your drawLineButton() function, does it? In any case, it looks like you are trying to use pieces of Sceneform (MaterialFactory, ModelRenderable, etc.) while doing pure OpenGL rendering as in hello_ar_java.
Since Sceneform uses Filament, a rendering engine that can sit on top of OpenGL or Vulkan, mixing the two will get you nowhere. So either use Sceneform end to end, or use OpenGL end to end (and learn how OpenGL and Android work together).
Now, if you want to stick with the hello_ar_java sample, work through an OpenGL tutorial so that you can emit one vertex per anchor and draw them with GL_LINES at whatever line width you like. A good OpenGL tutorial is https://learnopengl.com/; I recommend reading through all of the Getting Started section. Keep in mind that it covers desktop OpenGL while Android uses OpenGL ES. There are some differences, but the computer-graphics principles are the same.
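To make the GL_LINES idea concrete, here is a hedged sketch of the CPU side: pack each anchor position (e.g. from anchor.getPose().tx()/ty()/tz()) into a flat vertex array, then hand it to GL. Only packLineVertices is real, framework-free code; the GLES20 calls it would feed are shown as comments because they need an Android GL context.

```java
// Packs 3-D points into the flat float array layout that glDrawArrays(GL_LINES, ...)
// expects: consecutive point pairs form independent line segments
// (points[0]->points[1], points[2]->points[3], ...).
public class LineVertices {
    static float[] packLineVertices(float[][] points) {
        float[] out = new float[points.length * 3];
        for (int i = 0; i < points.length; i++) {
            out[i * 3]     = points[i][0]; // x
            out[i * 3 + 1] = points[i][1]; // y
            out[i * 3 + 2] = points[i][2]; // z
        }
        return out;
    }
    // In onDrawFrame you would then upload and draw (GLES20, not runnable here):
    //   FloatBuffer buf = ByteBuffer.allocateDirect(v.length * 4)
    //           .order(ByteOrder.nativeOrder()).asFloatBuffer();
    //   buf.put(v).position(0);
    //   GLES20.glVertexAttribPointer(positionAttrib, 3, GLES20.GL_FLOAT, false, 0, buf);
    //   GLES20.glDrawArrays(GLES20.GL_LINES, 0, v.length / 3);
    // with a trivial vertex shader that multiplies position by projmtx * viewmtx.
}
```

The attribute name positionAttrib and the shader are assumptions you would supply yourself; the point is only that a line between two anchors is just two vertices in world space, transformed by the same view and projection matrices the sample already computes.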