I am building my app around this Agora ARCore demo, which is itself based on Google's hello_ar_java sample app.
The app captures the user's tap and checks whether any plane was found in the scene. If so, it creates an anchor at that point.
I want to draw a line between the various anchors.
Everything I have found online uses Sceneform and an ArFragment.
So far I have managed an implementation without an ArFragment, but the line doesn't show, probably because of this call, which I don't know how to replace without an ArFragment: nodeToAdd.setParent(arFragment.getArSceneView().getScene());
To implement Sceneform in my project I took a hint from this LineView project. Is there another way that doesn't use Sceneform?
This is how I proceed:
public void onDrawFrame(GL10 gl) {
    // Clear screen to notify driver it should not load any pixels from previous frame.
    GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
    if (mSession == null) {
        return;
    }
    // Notify ARCore session that the view size changed so that the perspective matrix and
    // the video background can be properly adjusted.
    mDisplayRotationHelper.updateSessionIfNeeded(mSession);
    try {
        // Obtain the current frame from ARSession. When the configuration is set to
        // UpdateMode.BLOCKING (it is by default), this will throttle the rendering to the
        // camera framerate.
        Frame frame = mSession.update();
        Camera camera = frame.getCamera();

        // Handle taps. Handling only one tap per frame, as taps are usually low frequency
        // compared to frame rate.
        MotionEvent tap = queuedSingleTaps.poll();
        if (tap != null && camera.getTrackingState() == TrackingState.TRACKING) {
            for (HitResult hit : frame.hitTest(tap)) {
                // Check if any plane was hit, and if it was hit inside the plane polygon.
                Trackable trackable = hit.getTrackable();
                // Creates an anchor if a plane or an oriented point was hit.
                if ((trackable instanceof Plane && ((Plane) trackable).isPoseInPolygon(hit.getHitPose()))
                        || (trackable instanceof Point
                                && ((Point) trackable).getOrientationMode()
                                        == Point.OrientationMode.ESTIMATED_SURFACE_NORMAL)) {
                    // Hits are sorted by depth. Consider only the closest hit on a plane or oriented point.
                    // Cap the number of objects created. This avoids overloading both the
                    // rendering system and ARCore.
                    if (anchors.size() >= 250) {
                        anchors.get(0).detach();
                        anchors.remove(0);
                    }
                    // Adding an Anchor tells ARCore that it should track this position in
                    // space. This anchor is created on the Plane to place the 3D model
                    // in the correct position relative both to the world and to the plane.
                    anchors.add(hit.createAnchor());
                    break;
                }
            }
        }

        // Draw background.
        mBackgroundRenderer.draw(frame);

        // If not tracking, don't draw 3d objects.
        if (camera.getTrackingState() == TrackingState.PAUSED) {
            return;
        }

        // Get projection matrix.
        float[] projmtx = new float[16];
        camera.getProjectionMatrix(projmtx, 0, 0.1f, 100.0f);

        // Get camera matrix and draw.
        float[] viewmtx = new float[16];
        camera.getViewMatrix(viewmtx, 0);

        // Compute lighting from average intensity of the image.
        final float lightIntensity = frame.getLightEstimate().getPixelIntensity();

        if (isShowPointCloud()) {
            // Visualize tracked points.
            PointCloud pointCloud = frame.acquirePointCloud();
            mPointCloud.update(pointCloud);
            mPointCloud.draw(viewmtx, projmtx);
            // Application is responsible for releasing the point cloud resources after
            // using it.
            pointCloud.release();
        }

        // Check if we detected at least one plane. If so, hide the loading message.
        if (mMessageSnackbar != null) {
            for (Plane plane : mSession.getAllTrackables(Plane.class)) {
                if (plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING
                        && plane.getTrackingState() == TrackingState.TRACKING) {
                    hideLoadingMessage();
                    break;
                }
            }
        }

        if (isShowPlane()) {
            // Visualize planes.
            mPlaneRenderer.drawPlanes(
                    mSession.getAllTrackables(Plane.class), camera.getDisplayOrientedPose(), projmtx);
        }

        // Visualize anchors created by touch.
        float scaleFactor = 1.0f;
        for (Anchor anchor : anchors) {
            if (anchor.getTrackingState() != TrackingState.TRACKING) {
                continue;
            }
            // Get the current pose of an Anchor in world space. The Anchor pose is updated
            // during calls to session.update() as ARCore refines its estimate of the world.
            anchor.getPose().toMatrix(mAnchorMatrix, 0);

            // Update and draw the model and its shadow.
            mVirtualObject.updateModelMatrix(mAnchorMatrix, scaleFactor);
            mVirtualObjectShadow.updateModelMatrix(mAnchorMatrix, scaleFactor);
            mVirtualObject.draw(viewmtx, projmtx, lightIntensity);
            mVirtualObjectShadow.draw(viewmtx, projmtx, lightIntensity);
        }
        sendARViewMessage();
    } catch (Throwable t) {
        // Avoid crashing the application due to unhandled exceptions.
        Log.e(TAG, "Exception on the OpenGL thread", t);
    }
}
Then:
public void drawLineButton(View view) {
    AnchorNode anchorNode;
    for (Anchor anchor : anchors) {
        anchorNode = new AnchorNode(anchor);
        anchorNodeList.add(anchorNode);
        // This is the problem, imho: without an ArFragment there is no Scene to parent to.
        // anchorNode.setParent(arFragment.getArSceneView().getScene());
        numberOfAnchors++;
    }
    if (numberOfAnchors == 2) {
        drawLine(anchorNodeList.get(0), anchorNodeList.get(1));
    }
}
Here the nodes exist and are real. I don't get any errors, and yet the line doesn't show:
private void drawLine(AnchorNode node1, AnchorNode node2) {
    // Here the nodes exist and are real. I don't get any errors, and the line doesn't show.
    runOnUiThread(new Runnable() {
        @Override
        public void run() {
            Vector3 point1, point2;
            point1 = node1.getWorldPosition();
            point2 = node2.getWorldPosition();

            // First, find the vector extending between the two points and define a look rotation
            // in terms of this Vector.
            final Vector3 difference = Vector3.subtract(point1, point2);
            final Vector3 directionFromTopToBottom = difference.normalized();
            final Quaternion rotationFromAToB =
                    Quaternion.lookRotation(directionFromTopToBottom, Vector3.up());
            MaterialFactory.makeOpaqueWithColor(getApplicationContext(), new Color(0, 255, 244))
                    .thenAccept(
                            material -> {
                                // Then, create a rectangular prism using ShapeFactory.makeCube(),
                                // and use the difference vector to extend it to the necessary length.
                                Log.d(TAG, "drawLine inside .thenAccept");
                                ModelRenderable model = ShapeFactory.makeCube(
                                        new Vector3(.01f, .01f, difference.length()),
                                        Vector3.zero(), material);
                                // Last, set the world rotation of the node to the rotation calculated
                                // earlier, and set the world position to the midpoint between the
                                // given points.
                                Anchor lineAnchor = node2.getAnchor();
                                nodeForLine = new Node();
                                nodeForLine.setParent(node1);
                                nodeForLine.setRenderable(model);
                                nodeForLine.setWorldPosition(Vector3.add(point1, point2).scaled(.5f));
                                nodeForLine.setWorldRotation(rotationFromAToB);
                            }
                    );
        }
    });
}
Here is an example of point1, point2 and directionFromTopToBottom in my drawLine() function:
point1: [x=0.060496617, y=-0.39098215, z=-0.21526277]
point2: [x=0.05695567, y=-0.39132282, z=-0.33304527]
directionFromTopToBottom: [x=0.030049745, y=0.0028910497, z=0.9995442]
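Those printed values are at least self-consistent: subtracting and normalizing by hand reproduces directionFromTopToBottom, so the math inside drawLine() is not the problem. A standalone sketch of the same subtract/normalize/midpoint arithmetic (LineMath is a hypothetical helper mirroring Sceneform's Vector3.subtract(...).normalized() and Vector3.add(...).scaled(.5f), not part of any SDK):

```java
public class LineMath {
    // Normalized direction from p2 toward p1, matching
    // Vector3.subtract(point1, point2).normalized() in the question.
    public static float[] direction(float[] p1, float[] p2) {
        float dx = p1[0] - p2[0], dy = p1[1] - p2[1], dz = p1[2] - p2[2];
        float len = (float) Math.sqrt(dx * dx + dy * dy + dz * dz);
        return new float[]{dx / len, dy / len, dz / len};
    }

    // Midpoint between the two points, matching
    // Vector3.add(point1, point2).scaled(.5f) used for the node's world position.
    public static float[] midpoint(float[] p1, float[] p2) {
        return new float[]{
                (p1[0] + p2[0]) * 0.5f,
                (p1[1] + p2[1]) * 0.5f,
                (p1[2] + p2[2]) * 0.5f};
    }
}
```

Feeding in the logged point1 and point2 reproduces the logged directionFromTopToBottom to within float precision.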
Your code never calls your drawLineButton() function, does it? In any case, it looks like you are trying to use pieces of Sceneform (MaterialFactory, ModelRenderable, etc.) while doing pure OpenGL rendering, as hello_ar_java does.
Since Sceneform uses Filament as its rendering engine, which can run on OpenGL or Vulkan, mixing the two approaches won't produce anything. So either go all-in on Sceneform, or all-in on OpenGL (and learn how OpenGL and Android work together).
Now, if you want to keep using the hello_ar_java sample, follow an OpenGL tutorial so that you can generate vertices for each anchor and draw them with GL_LINES at whatever line width you like. Here is a good OpenGL tutorial: https://learnopengl.com/ I recommend reading the whole Getting Started section, but keep in mind that it covers OpenGL while Android uses OpenGL ES; there are some differences, but the computer-graphics principles are the same.
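Under that approach, the only non-boilerplate step is turning the tracked anchors into a vertex array for GL_LINES. A minimal sketch, with names of my own invention (LineVertexPacker is not part of ARCore): in onDrawFrame you would read each anchor's world position via anchor.getPose().tx()/ty()/tz(), pack consecutive positions into segment pairs, wrap them in a direct buffer, and draw with GLES20.glDrawArrays(GLES20.GL_LINES, 0, vertexCount) under your view/projection matrices.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.FloatBuffer;
import java.util.List;

public class LineVertexPacker {

    // Packs consecutive anchor positions into GL_LINES segment pairs:
    // N positions produce N-1 segments, i.e. 2*(N-1) vertices of 3 floats each.
    public static float[] packSegments(List<float[]> positions) {
        int segments = Math.max(0, positions.size() - 1);
        float[] vertices = new float[segments * 2 * 3];
        for (int i = 0; i < segments; i++) {
            System.arraycopy(positions.get(i), 0, vertices, i * 6, 3);
            System.arraycopy(positions.get(i + 1), 0, vertices, i * 6 + 3, 3);
        }
        return vertices;
    }

    // Wraps the array in a native-order direct FloatBuffer, the layout
    // that GLES20.glVertexAttribPointer expects.
    public static FloatBuffer toBuffer(float[] vertices) {
        FloatBuffer fb = ByteBuffer.allocateDirect(vertices.length * 4)
                .order(ByteOrder.nativeOrder())
                .asFloatBuffer();
        fb.put(vertices).position(0);
        return fb;
    }
}
```

The packing is pure Java, so it can live outside the GL thread; only the buffer upload and the glDrawArrays call need to happen in onDrawFrame.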