by Ayusch Jain
This article was originally posted here
In the previous post, I explained what ARCore is and how it helps developers build awesome augmented reality apps without needing to understand OpenGL or matrix math.
If you haven't checked it out yet, I highly recommend doing so before moving ahead with this article and diving into ARCore app development.
According to Wikipedia, ARCore is a software development kit developed by Google that allows augmented reality applications to be built.
ARCore uses three key technologies to integrate virtual content with the real environment:
Motion Tracking: This allows the phone to understand its position relative to the world.
Environmental Understanding: This allows the phone to detect the size and location of all types of surfaces: vertical, horizontal, and angled.
Light Estimation: This allows the phone to estimate the environment's current lighting conditions (a minimal sketch of reading the light estimate follows below).
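To make the light-estimation point above a bit more concrete, here is a minimal sketch (not part of this tutorial's app) of how you could read ARCore's per-frame light estimate through Sceneform. It assumes the same arFragment and TAG fields we set up later in this article, plus imports for com.google.ar.core.Frame and com.google.ar.core.LightEstimate:

// Hypothetical snippet: query the light estimate on every frame update.
arFragment.getArSceneView().getScene().addOnUpdateListener(frameTime -> {
    Frame frame = arFragment.getArSceneView().getArFrame();
    if (frame == null) {
        return;
    }
    LightEstimate lightEstimate = frame.getLightEstimate();
    if (lightEstimate.getState() == LightEstimate.State.VALID) {
        // A single scalar describing how bright the environment currently is.
        float pixelIntensity = lightEstimate.getPixelIntensity();
        Log.d(TAG, "Estimated pixel intensity: " + pixelIntensity);
    }
});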
To get started with ARCore app development, you first need to enable ARCore in your project. This is simple, as we will be using Android Studio and the Sceneform SDK. There are two major operations Sceneform performs automatically:
Checking for the availability of ARCore
Asking for camera permission
You don't need to bother with these two steps when creating an ARCore app using the Sceneform SDK, but you do need to include the Sceneform SDK in your project.
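For reference only (this is not part of the tutorial's code): if you ever need to perform the availability check yourself, for example outside a Sceneform ArFragment, ARCore exposes the ArCoreApk class for it. A hedged sketch, assuming a context variable and an import for com.google.ar.core.ArCoreApk:

// Ask ARCore whether this device supports it; Sceneform's ArFragment does this for you.
ArCoreApk.Availability availability = ArCoreApk.getInstance().checkAvailability(context);
if (availability.isTransient()) {
    // The result is not final yet; re-query after a short delay.
}
if (availability.isSupported()) {
    // ARCore is supported (it may still need to be installed or updated on the device).
} else {
    // ARCore is not supported; hide or disable the AR features of your app.
}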
Create a new Android Studio project and select an empty activity.
Add the following dependency to your project-level build.gradle file:

dependencies {
    classpath 'com.google.ar.sceneform:plugin:1.5.0'
}

Add the following to your app-level build.gradle file:
implementation "com.google.ar.sceneform.ux:sceneform-ux:1.5.0"
Now sync the project with the Gradle files and wait for the build to finish. This will install the Sceneform SDK to the project and the Sceneform plugin to Android Studio. The plugin lets you view .sfb files, which are the 3D models rendered in your camera scene, and it also helps you import, view, and build 3D assets.
Now with our Android Studio setup complete and the Sceneform SDK installed, we can get started with writing our very first ARCore app.
First, we need to add the Sceneform fragment to our layout file. This will be the Scene where we place all our 3D models. It takes care of the camera initialization and permission handling.
Head over to your main layout file (in my case, activity_main.xml) and add the Sceneform fragment:
<?xml version="1.0" encoding="utf-8"?><FrameLayout xmlns:android="http://schemas.android.com/apk/res/android" xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent" android:layout_height="match_parent" tools:context=".MainActivity">
<fragment android:name="com.google.ar.sceneform.ux.ArFragment" android:id="@+id/ux_fragment" android:layout_width="match_parent" android:layout_height="match_parent" />
</FrameLayout>
I've set the width and height to match_parent as this will cover my entire activity. You can choose the dimensions according to your requirements.
This is all you need to do in the layout file. Now head over to the Java file, in my case MainActivity.java, and add the method below to your class:
public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
    if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
        Log.e(TAG, "Sceneform requires Android N or later");
        Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
        activity.finish();
        return false;
    }
    String openGlVersionString =
            ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                    .getDeviceConfigurationInfo()
                    .getGlEsVersion();
    if (Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {
        Log.e(TAG, "Sceneform requires OpenGL ES 3.0 or later");
        Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
                .show();
        activity.finish();
        return false;
    }
    return true;
}
This method checks whether your device can support the Sceneform SDK. The SDK requires Android N (API level 24) or newer and OpenGL ES version 3.0 or newer, which is exactly what the code above checks. If a device does not support these, the scene will not be rendered and your application will show a blank screen.
You can, however, still continue to deliver all the other features of your app that don't require the Sceneform SDK.
Now, with the device compatibility check complete, we shall build our 3D model and attach it to the scene.
You will need to add the 3D models that will be rendered on your screen. You can build these models yourself if you are familiar with 3D model creation, or you can visit Poly.
There you'll find a huge repository of 3D assets to choose from. They are free to download; just credit the creator and you are good to go.
In Android Studio, expand the app folder in the project pane on the left-hand side. You'll notice a "sampledata" folder. This folder will hold all of your 3D model assets. Create a folder for your model inside the sampledata folder.
When you download the zip file from Poly, you will most probably find three files:
.mtl file
.obj file
.png file
The most important of these three is the .obj file; it is your actual model. Place all three files inside sampledata -> "your model's folder".
Now right-click on the .obj file. The first option will be Import Sceneform Asset. Click it, keep the default settings, and click Finish in the next window. Gradle will sync to include the asset in the assets folder. Once the Gradle build finishes, you are good to go.
You've finished importing a 3D asset used by Sceneform into your project. Next, let's build the asset from our code and include it in the scene.
Add the following code to your MainActivity.java file (or whatever yours is called). Don't worry, I'll explain all the code line by line:
private static final String TAG = MainActivity.class.getSimpleName();
private static final double MIN_OPENGL_VERSION = 3.0;

ArFragment arFragment;
ModelRenderable lampPostRenderable;

@Override
@SuppressWarnings({"AndroidApiChecker", "FutureReturnValueIgnored"})
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    if (!checkIsSupportedDeviceOrFinish(this)) {
        return;
    }
    setContentView(R.layout.activity_main);
    arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

    ModelRenderable.builder()
            .setSource(this, Uri.parse("LampPost.sfb"))
            .build()
            .thenAccept(renderable -> lampPostRenderable = renderable)
            .exceptionally(throwable -> {
                Toast toast = Toast.makeText(this, "Unable to load lamp post renderable", Toast.LENGTH_LONG);
                toast.setGravity(Gravity.CENTER, 0, 0);
                toast.show();
                return null;
            });
}
First, we find the arFragment that we included in the layout file. This fragment is responsible for hosting the scene. You can think of it as the container of our scene.
Next, we use the ModelRenderable class to build our model. With the help of the setSource method, we load our model from the .sfb file. This file was generated when we imported the asset. The thenAccept method receives the model once it is built, and we assign the loaded model to our lampPostRenderable field.
For error handling, we have the .exceptionally method, which is called in case an exception is thrown.
All this happens asynchronously, so you don't need to worry about multi-threading or deal with handlers XD.
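Under the hood, ModelRenderable.builder().build() returns a Java CompletableFuture. Purely as an illustration (this is not part of the tutorial's app, "Bench.sfb" is a hypothetical second asset, and you would need an import for java.util.concurrent.CompletableFuture), here is a sketch of loading two models and reacting once both are ready:

// Illustrative sketch only: combine two asynchronous model loads.
CompletableFuture<ModelRenderable> lampStage =
        ModelRenderable.builder().setSource(this, Uri.parse("LampPost.sfb")).build();
CompletableFuture<ModelRenderable> benchStage =
        ModelRenderable.builder().setSource(this, Uri.parse("Bench.sfb")).build();

CompletableFuture.allOf(lampStage, benchStage)
        .thenAccept(unused -> {
            // Both futures have completed, so join() will not block here.
            ModelRenderable lamp = lampStage.join();
            ModelRenderable bench = benchStage.join();
            // Store them in fields and use them when the user taps a plane.
        })
        .exceptionally(throwable -> {
            Toast.makeText(this, "Unable to load renderables", Toast.LENGTH_LONG).show();
            return null;
        });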
With the model loaded and stored in the lampPostRenderable variable, we'll now add it to our scene.
The arFragment hosts our scene and will receive the tap events, so we need to set an onTap listener on the fragment to register taps and place objects accordingly. Add the following code to the onCreate method:
arFragment.setOnTapArPlaneListener(
        (HitResult hitresult, Plane plane, MotionEvent motionevent) -> {
            if (lampPostRenderable == null) {
                return;
            }

            Anchor anchor = hitresult.createAnchor();
            AnchorNode anchorNode = new AnchorNode(anchor);
            anchorNode.setParent(arFragment.getArSceneView().getScene());

            TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
            lamp.setParent(anchorNode);
            lamp.setRenderable(lampPostRenderable);
            lamp.select();
        });
We set the onTapArPlaneListener on our AR fragment. What follows is Java 8 lambda syntax; in case you are not familiar with it, I would recommend checking out this guide.
First, we create our anchor from the HitResult using hitresult.createAnchor() and store it in an Anchor object.
Next, we create a node out of this anchor, called an AnchorNode. It is attached to the scene by calling the setParent method on it and passing in the scene from the fragment.
Now we create a TransformableNode, which will be our lamp post, and attach it to the anchor node. The node still doesn't have any information about the object it has to render. We pass that object using the lamp.setRenderable method, which takes a renderable as its parameter. Finally, we call lamp.select();
Phew!! Too much terminology there, but don't worry, I'll explain it all.
Scene: This is the place where all your 3D objects will be rendered. The scene is hosted by the AR fragment we included in the layout. An anchor node is attached to this scene; it acts as the root of the tree, and all the other objects are rendered as its children.
HitResult: This represents an imaginary ray cast from the tapped point on the screen into the real world; it gives the point where that ray intersects a real-world surface.
Anchor: An anchor is a fixed location and orientation in the real world. It can be understood as x, y, z coordinates in 3D space. You can get an anchor's pose from it. A pose is the position and orientation of an object in the scene, and it is used to transform the object's local coordinate space into real-world coordinate space.
TransformableNode: This is a node that can be interacted with. It can be moved around, scaled, rotated, and more. In this example, we can scale and rotate the lamp, hence the name Transformable.
There is no rocket science here. It's really simple. The entire scene can be viewed as a graph, with the Scene as the parent, the AnchorNode as its child, and the different nodes/objects to be rendered branching out from there.
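To make the graph idea concrete, here is a small sketch (again, not part of the tutorial's code) that could sit inside the tap listener above. It reads the anchor's pose and attaches a plain child Node to the lamp, so the tree becomes Scene -> AnchorNode -> lamp -> child. It assumes imports for com.google.ar.core.Pose, com.google.ar.sceneform.Node and com.google.ar.sceneform.math.Vector3, and the 0.5f offset is arbitrary:

// Read the anchor's pose: its position and orientation in real-world space.
Pose pose = anchor.getPose();
Log.d(TAG, "Anchor position in world space: " + pose.tx() + ", " + pose.ty() + ", " + pose.tz());

// A plain Node has no gestures of its own; it simply follows its parent (the lamp).
Node child = new Node();
child.setParent(lamp);
child.setLocalPosition(new Vector3(0f, 0.5f, 0f)); // half a metre above the lamp's origin
child.setRenderable(lampPostRenderable);           // reuse the same renderable just for the demo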
Your final MainActivity.java should look something like this:
package com.ayusch.arcorefirst;

import android.app.Activity;
import android.app.ActivityManager;
import android.content.Context;
import android.net.Uri;
import android.os.Build;
import android.support.v7.app.AppCompatActivity;
import android.os.Bundle;
import android.util.Log;
import android.view.Gravity;
import android.view.MotionEvent;
import android.widget.Toast;

import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;
import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class MainActivity extends AppCompatActivity {
    private static final String TAG = MainActivity.class.getSimpleName();
    private static final double MIN_OPENGL_VERSION = 3.0;

    ArFragment arFragment;
    ModelRenderable lampPostRenderable;

    @Override
    @SuppressWarnings({"AndroidApiChecker", "FutureReturnValueIgnored"})
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (!checkIsSupportedDeviceOrFinish(this)) {
            return;
        }
        setContentView(R.layout.activity_main);
        arFragment = (ArFragment) getSupportFragmentManager().findFragmentById(R.id.ux_fragment);

        // Build the renderable asynchronously from the imported .sfb asset.
        ModelRenderable.builder()
                .setSource(this, Uri.parse("LampPost.sfb"))
                .build()
                .thenAccept(renderable -> lampPostRenderable = renderable)
                .exceptionally(throwable -> {
                    Toast toast = Toast.makeText(this, "Unable to load lamp post renderable", Toast.LENGTH_LONG);
                    toast.setGravity(Gravity.CENTER, 0, 0);
                    toast.show();
                    return null;
                });

        // Place the model on a detected plane whenever the user taps it.
        arFragment.setOnTapArPlaneListener(
                (HitResult hitresult, Plane plane, MotionEvent motionevent) -> {
                    if (lampPostRenderable == null) {
                        return;
                    }

                    Anchor anchor = hitresult.createAnchor();
                    AnchorNode anchorNode = new AnchorNode(anchor);
                    anchorNode.setParent(arFragment.getArSceneView().getScene());

                    TransformableNode lamp = new TransformableNode(arFragment.getTransformationSystem());
                    lamp.setParent(anchorNode);
                    lamp.setRenderable(lampPostRenderable);
                    lamp.select();
                });
    }

    public static boolean checkIsSupportedDeviceOrFinish(final Activity activity) {
        if (Build.VERSION.SDK_INT < Build.VERSION_CODES.N) {
            Log.e(TAG, "Sceneform requires Android N or later");
            Toast.makeText(activity, "Sceneform requires Android N or later", Toast.LENGTH_LONG).show();
            activity.finish();
            return false;
        }
        String openGlVersionString =
                ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                        .getDeviceConfigurationInfo()
                        .getGlEsVersion();
        if (Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {
            Log.e(TAG, "Sceneform requires OpenGL ES 3.0 or later");
            Toast.makeText(activity, "Sceneform requires OpenGL ES 3.0 or later", Toast.LENGTH_LONG)
                    .show();
            activity.finish();
            return false;
        }
        return true;
    }
}
Congratulations!! You've just completed your first ARCore app. Start adding objects and see them come alive in the real world!
This was your first look at how to create a simple ARCore app from scratch with Android Studio. In the next tutorial, I'll go deeper into ARCore and add more functionality to the app.
If you have any suggestions or a topic you'd like a tutorial on, just mention it in the comments section and I'll be happy to oblige.
Like what you read? Don't forget to share this post on Facebook, Whatsapp and LinkedIn.
You can follow me on LinkedIn, Quora, Twitter and Instagram where I answer questions related to Mobile Development, especially Android and Flutter.