Build an augmented reality Android app with Google ARCore

21 min read

Augmented Reality (AR) is a huge buzzword and a topic that really appeals to the imagination of mobile app developers.

In AR applications, a live image of the physical, real world environment is supplemented with virtual content, which provides a more immersive user experience. Pokemon Go is perhaps the first thing that comes to mind when you think of AR mobile apps, but there are many mobile applications that use the power of AR technology. For example, Snapchat uses AR to add filters and masks to the device's camera feed and the Google Translate Word Lens function is made possible by AR.

Whether you are dreaming of creating the next great mobile AR game, or want to improve your existing app with a few AR-powered functions, augmented reality can help you design new and innovative experiences for your users.

In this article I will show you how to get started with AR, using Google's ARCore platform and the Sceneform plug-in. By the end of this article you will have created a simple AR application that analyzes its surroundings, including light sources and the position of walls and floors, and then lets the user place virtual 3D models in the real world.

What is Google ARCore?

ARCore is a Google platform that allows your applications to 'see' and understand the physical world, via your device's camera.

Instead of relying on user input, Google ARCore automatically searches for 'clusters' of feature points that it uses to understand the environment. Specifically, ARCore looks for clusters that indicate the presence of common horizontal and vertical surfaces, such as floors, desks and walls, and then makes these surfaces available to your application as planes. ARCore can also identify light levels and light sources, and uses this information to create realistic shadows for any AR objects that users place within the augmented scene.

ARCore-powered applications can use this understanding of planes and light sources to seamlessly insert virtual objects into the real world, such as annotating a poster with virtual labels, or placing a 3D model on a plane, and that is exactly what we will do in our application.
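To make this more concrete, here is a minimal sketch of how those planes and light estimates can be queried directly. This is purely illustrative: it assumes you already have an ARCore Session and the current camera Frame, whereas later in this tutorial ArFragment and Sceneform will handle all of this for us:

// Ask ARCore for every plane it has detected so far //
for (Plane plane : session.getAllTrackables(Plane.class)) {
    if (plane.getTrackingState() == TrackingState.TRACKING
            && plane.getType() == Plane.Type.HORIZONTAL_UPWARD_FACING) {

        // A floor- or desk-like surface that can host virtual content //
        Log.i(TAG, "Horizontal plane detected, roughly "
                + plane.getExtentX() + "m x " + plane.getExtentZ() + "m");
    }
}

// The light estimate for the current frame; a pixel intensity of around //
// 1.0 indicates "normal" lighting, while lower values mean a darker scene //
float pixelIntensity = frame.getLightEstimate().getPixelIntensity();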

Importing 3D models with the Sceneform plug-in

Working with 3D models usually requires specialist knowledge, but with the release of the Sceneform plug-in, Google has made it possible to render 3D models using Java, without having to learn OpenGL.

The Sceneform plug-in provides a high-level API that you can use to create Renderables from standard Android widgets, shapes or materials, or from 3D asset files such as .OBJ or .FBX.

In our project we will use the Sceneform plug-in to import an .OBJ file into Android Studio. Whenever you import a file with Sceneform, the plug-in will automatically:

  • Convert the asset file to a .sfb file. This runtime-optimized Sceneform Binary (.sfb) is added to your APK and then loaded at runtime. We will use this .sfb file to create a Renderable, consisting of meshes, materials and textures, which can be placed anywhere within the augmented scene.
  • Generate an .sfa file. This is an asset-description file: a text file providing a human-readable description of the .sfb file. Depending on the model, you may be able to change its appearance by editing the text inside the .sfa file.

Please note that the Sceneform plug-in was still in beta at the time of writing, so you may encounter bugs, errors or other strange behavior when using this plug-in.

Installing the Sceneform plug-in

The Sceneform plug-in requires Android Studio 3.1 or higher. If you are unsure which version of Android Studio you are using, select 'Android Studio > About Android Studio' from the toolbar. The subsequent pop-up contains some basic information about your Android Studio installation, including its version number.

To install the Sceneform plug-in:

  • If you are using a Mac, select 'Android Studio > Preferences…' from the Android Studio toolbar, then choose 'Plugins' from the left-hand menu. If you are on a Windows PC, select 'File > Settings > Plugins > Browse repositories'.
  • Search for 'Sceneform'. When 'Google Sceneform Tools' appears, select 'Install'.
  • Restart Android Studio when prompted, and your plug-in will be ready to use.

Sceneform UX and Java 8: Update your project dependencies

Let's start by adding the dependencies that we will use in this project. Open your build.gradle file at the module level and add the Sceneform UX library, which contains the ArFragment that we will use in our layout:

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'

// Sceneform UX provides UX resources, including ArFragment //

    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.7.0"
    implementation "com.android.support:appcompat-v7:28.0.0"
}

Sceneform uses language constructs from Java 8, so we also need to update the source and target compatibility of our project to Java 8:

compileOptions {
    sourceCompatibility JavaVersion.VERSION_1_8
    targetCompatibility JavaVersion.VERSION_1_8
}

Finally, we must apply the Sceneform plug-in:

apply plugin: 'com.google.ar.sceneform.plugin'

The completed build.gradle file should look something like this:

apply plugin: 'com.android.application'

android {
    compileSdkVersion 28
    defaultConfig {
        applicationId "com.jessicathornsby.arcoredemo"
        minSdkVersion 23
        targetSdkVersion 28
        versionCode 1
        versionName "1.0"
        testInstrumentationRunner "androidx.test.runner.AndroidJUnitRunner"
    }

    compileOptions {
        sourceCompatibility JavaVersion.VERSION_1_8
        targetCompatibility JavaVersion.VERSION_1_8
    }

    buildTypes {
        release {
            minifyEnabled false
            proguardFiles getDefaultProguardFile('proguard-android-optimize.txt'), 'proguard-rules.pro'
        }
    }
}

dependencies {
    implementation fileTree(dir: 'libs', include: ['*.jar'])
    implementation 'androidx.appcompat:appcompat:1.0.2'
    implementation 'androidx.constraintlayout:constraintlayout:1.1.3'
    testImplementation 'junit:junit:4.12'
    androidTestImplementation 'androidx.test.ext:junit:1.1.0'
    androidTestImplementation 'androidx.test.espresso:espresso-core:3.1.1'
    implementation "com.google.ar.sceneform.ux:sceneform-ux:1.7.0"
    implementation "com.android.support:appcompat-v7:28.0.0"
}

apply plugin: 'com.google.ar.sceneform.plugin'

Requesting permissions with ArFragment

Our application uses the device's camera to analyze the environment and position 3D models in the real world. Before our application can access the camera, it requires the camera permission, so open your project's Manifest and add the following:
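<uses-permission android:name="android.permission.CAMERA" />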


Android 6.0 gave users the ability to grant, deny and revoke permissions on a per-permission basis. While this improved the user experience, it means Android developers must manually request permissions at runtime and handle the user's response. The good news is that when you are working with Google ARCore, the process of requesting the camera permission and handling the user's response is implemented for you.

The ArFragment component automatically checks whether your app has the camera permission, and then requests it, if required, before creating the AR session. Since we will be using ArFragment in our app, we do not need to write any code to request the camera permission.
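For comparison, here is a minimal sketch of the manual runtime-permission flow that ArFragment spares us from writing. The request code is an arbitrary value of my own choosing:

private static final int CAMERA_PERMISSION_CODE = 200;

private void requestCameraPermission(Activity activity) {

    // Check whether the camera permission has already been granted //
    if (ContextCompat.checkSelfPermission(activity, Manifest.permission.CAMERA)
            != PackageManager.PERMISSION_GRANTED) {

        // If not, ask the user for it at runtime //
        ActivityCompat.requestPermissions(activity,
                new String[]{Manifest.permission.CAMERA}, CAMERA_PERMISSION_CODE);
    }
}

@Override
public void onRequestPermissionsResult(int requestCode, String[] permissions, int[] grantResults) {
    super.onRequestPermissionsResult(requestCode, permissions, grantResults);

    // Only proceed if the user granted the camera permission //
    if (requestCode == CAMERA_PERMISSION_CODE
            && grantResults.length > 0
            && grantResults[0] == PackageManager.PERMISSION_GRANTED) {
        // Safe to start the AR session //
    }
}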

AR Required or AR Optional?

There are two types of applications that use AR functionality:

1. AR Required

If your application relies on Google ARCore to deliver a good user experience, then you need to make sure it is only ever downloaded to devices that support ARCore. If you mark your app as 'AR Required', it will only appear in the Google Play Store on devices that support ARCore.

Since our application does require ARCore, open the Manifest and add the following:
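<uses-feature android:name="android.hardware.camera.ar" android:required="true" />

<meta-data android:name="com.google.ar.core" android:value="required" />

The uses-feature element belongs inside the manifest element, while the meta-data element belongs inside the application element.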


There is also a chance that your application could be downloaded to a device that supports ARCore, but does not have ARCore installed. Since we have marked our app as 'AR Required', Google Play will automatically download and install ARCore alongside your app, if it is not already present on the target device.

Keep in mind that even if your app uses android:required="true", you will still need to check that ARCore is present at runtime, as there is a chance the user may have uninstalled ARCore since downloading your app, or that their version of ARCore is outdated.

The good news is that we are using ArFragment, which automatically checks that ARCore is installed and up to date before creating each AR session, so once again, this is something we do not have to implement manually.

2. AR Optional

If your app contains AR features that are nice to have, but not essential to delivering its core functionality, then you can mark this app as 'AR Optional'. Your app can then check whether Google ARCore is present at runtime, and disable its AR features on devices that do not support ARCore.

If you create an 'AR Optional' app, then ARCore will not be installed automatically alongside your application, even if the device has all the hardware and software required to support ARCore. Your 'AR Optional' app will then need to check whether ARCore is present and up to date, and download the latest version as and when required.
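Here is a minimal sketch of that runtime check, using ARCore's ArCoreApk class. The mUserRequestedInstall field is an assumption of mine: a boolean you would initialize to true and maintain yourself:

// Tracks whether we have already asked the user to install ARCore //
private boolean mUserRequestedInstall = true;

@Override
protected void onResume() {
    super.onResume();
    try {
        switch (ArCoreApk.getInstance().requestInstall(this, mUserRequestedInstall)) {
            case INSTALLED:
                // ARCore is present and up to date: enable your AR features //
                break;
            case INSTALL_REQUESTED:
                // Google Play will prompt the user to install ARCore; this //
                // method runs again when the user returns to the app //
                mUserRequestedInstall = false;
                break;
        }
    } catch (UnavailableException e) {
        // ARCore is not available on this device: leave AR features disabled //
        Log.e(TAG, "ARCore unavailable", e);
    }
}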

If ARCore is not crucial for your app, you can add the following to your manifest:
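<meta-data android:name="com.google.ar.core" android:value="optional" />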


While we have the Manifest open, I am also going to add android:configChanges and android:screenOrientation, to ensure MainActivity handles orientation changes gracefully.

After adding all this to your Manifest, the finished file should look something like this:

<?xml version="1.0" encoding="utf-8"?>
<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    package="com.jessicathornsby.arcoredemo">

    <uses-permission android:name="android.permission.CAMERA" />
    <uses-feature android:name="android.hardware.camera.ar" android:required="true" />

    <application
        android:allowBackup="true"
        android:icon="@mipmap/ic_launcher"
        android:label="@string/app_name"
        android:theme="@style/AppTheme">

        <meta-data android:name="com.google.ar.core" android:value="required" />

        <activity
            android:name=".MainActivity"
            android:configChanges="orientation|screenSize"
            android:screenOrientation="locked">
            <intent-filter>
                <action android:name="android.intent.action.MAIN" />
                <category android:name="android.intent.category.LAUNCHER" />
            </intent-filter>
        </activity>
    </application>

</manifest>

Add ArFragment to your layout

I am using ARCore's ArFragment because it automatically handles a number of important ARCore tasks at the start of each AR session. Most notably, ArFragment checks that a compatible version of ARCore is installed on the device, and that the app currently has access to the camera.

Once ArFragment has verified that the device can support your app's AR features, it creates an ArSceneView and an ARCore session, and your app's AR experience is ready to go!
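Once you have a reference to the fragment (we will create one in our MainActivity shortly), a couple of getters expose these underlying pieces; a quick sketch:

// The view that renders the camera feed and the augmented scene //
ArSceneView sceneView = arCoreFragment.getArSceneView();

// The parent-child node graph that Sceneform renders each frame //
Scene scene = sceneView.getScene();

// The underlying ARCore session (null until the session has been created) //
Session session = sceneView.getSession();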

You can add the ArFragment to a layout file just like a regular Android fragment, so open your activity_main.xml file and add a com.google.ar.sceneform.ux.ArFragment component:

<fragment xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/main_fragment"
    android:name="com.google.ar.sceneform.ux.ArFragment"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />

Download 3D models with Google Poly

There are several ways to create Renderables, but in this article we will use a 3D asset file.

Sceneform supports 3D assets in .OBJ, .glTF and .FBX formats, with or without animations. There are plenty of places where you can acquire 3D models in one of these supported formats, but in this tutorial I will be using an .OBJ file downloaded from Google's Poly repository.

Head over to the Poly website and download the asset you want to use, in .OBJ format (I am using this T-Rex model).

  • Extract the folder, which should contain your model's source asset file (.OBJ, .FBX or .glTF). Depending on the model, this folder may also contain some model dependencies, such as files in the .mtl, .bin, .png or .jpeg formats.

Import 3D models into Android Studio

Once you have your asset, you need to import it into Android Studio using the Sceneform plug-in. This is a multi-step process where you:

  • Create a "sampledata" folder. Sampledata is a folder type for design-time sample data that is not included in your APK, but is available in the Android Studio editor.
  • Drag your original .OBJ asset file into the "sampledata" folder.
  • Run the Sceneform import and conversion on the .OBJ file, which generates the .sfa and .sfb files.

Although it may seem more straightforward, do not drag the .OBJ file directly into your project's "res" directory, as this will cause the model to be included in your APK unnecessarily.

Android Studio projects do not contain a "sampledata" folder by default, so you will need to create one manually:

  • Control-click your project's "app" folder.
  • Select "New > Sample Data Directory" and create a folder named "sampledata".
  • Navigate to the 3D model files you downloaded earlier. Locate the source asset file (.OBJ, .FBX or .glTF) and then drag it into the "sampledata" folder.
  • Check whether your model has any dependencies (such as files in the .mtl, .bin, .png or .jpeg formats). If you do find any of these files, drag them into the "sampledata" folder too.
  • In Android Studio, Control-click your 3D model's source file (.OBJ, .FBX or .glTF) and then select "Import Sceneform Asset".

  • The subsequent window displays some information about the files that Sceneform is going to generate, including where the resulting .sfa file will be saved in your project; I am going to use the "raw" folder.
  • If you are happy with the information you have entered, click 'Finish'.

This import makes a few changes to your project. If you open your project-level build.gradle file, you will see that the Sceneform plug-in has been added as a project dependency:

dependencies {
    classpath 'com.android.tools.build:gradle:3.5.0-alpha06'
    classpath 'com.google.ar.sceneform:plugin:1.7.0'

// NOTE: Do not place your application dependencies here; they belong //
// in the individual module build.gradle files //
}

Open your module-level build.gradle file, and you will find a new sceneform.asset() entry for your imported 3D model:

apply plugin: 'com.google.ar.sceneform.plugin'

// The "Source Asset Path" that you specified during import //

sceneform.asset('sampleData/dinosaur.obj',

// The "Material Path" that you specified during import //

        'default',

// The ".sfa Output Path" that you specified during import //

        'sampleData/dinosaur.sfa',

// The ".sfb Output Path" that you specified during import //

        'src/main/assets/dinosaur')

If you look inside your "sampledata" and "raw" folders, you will see that they now contain new .sfa and .sfb files, respectively.

You can preview the .sfa file in Android Studio's new Sceneform Viewer:

  • Select "View > Tool Windows > Viewer" from the Android Studio menu bar.
  • In the left-hand menu, select your .sfa file. Your 3D model should now appear in the Viewer window.

View your 3D model

Our next task is creating an AR session that understands its surroundings, and lets the user place 3D models within the augmented scene.

This requires that we do the following:

1. Create an ArFragment member variable

The ArFragment does much of the heavy lifting involved in creating an AR session, so we will reference this fragment in our MainActivity class.

In the following snippet, I create a member variable for ArFragment and then initialize it in the onCreate() method:

private ArFragment arCoreFragment;

@Override
protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    ...
    ...
    ...

    setContentView(R.layout.activity_main);

// Find the fragment, using the fragment manager //

    arCoreFragment = (ArFragment)
            getSupportFragmentManager().findFragmentById(R.id.main_fragment);
}

2. Build a Model Renderable

We now need to transform our .sfb file into a ModelRenderable, which will eventually render our 3D object.

Here, I am creating a ModelRenderable from my project's res/raw/dinosaur.sfb file:

private ModelRenderable dinoRenderable;
...
...
...

ModelRenderable.builder()
        .setSource(this, R.raw.dinosaur)
        .build()
        .thenAccept(renderable -> dinoRenderable = renderable)
        .exceptionally(
                throwable -> {
                    Log.e(TAG, "Unable to load renderable");
                    return null;
                });

3. Respond to user input

ArFragment has built-in support for tap, drag, pinch and twist gestures.

In our app, the user will add a 3D model to an ARCore plane by tapping that plane.

To deliver this functionality, we need to register a callback that will be invoked whenever a plane is tapped:

arCoreFragment.setOnTapArPlaneListener(
        (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
            if (dinoRenderable == null) {
                return;
            }

4. Anchor your model

In this step, we will retrieve an ArSceneView and attach it to an AnchorNode, which will serve as the scene's parent node.

The ArSceneView is responsible for performing several important ARCore tasks, including rendering the device's camera images, and displaying the Sceneform UX animation that demonstrates how the user should hold and move their device in order to start the AR experience. The ArSceneView will also highlight any planes it detects, ready for the user to place their 3D models within the scene.

Attached to the ArSceneView component is a Scene, which is a parent-child data structure containing all the nodes that need to be rendered.

We are going to start by creating an AnchorNode, which we will attach to our ArSceneView's Scene.

All anchor nodes remain in the same real-world position, so by creating an anchor node we ensure that our 3D models stay in place within the augmented scene.

Let's create our anchor node:

Anchor anchor = hitResult.createAnchor();
AnchorNode anchorNode = new AnchorNode(anchor);

We can then retrieve an ArSceneView using getArSceneView(), and attach the AnchorNode to its Scene:

anchorNode.setParent(arCoreFragment.getArSceneView().getScene());

5. Add support for moving, scaling, and rotating

Next, I am going to create a TransformableNode. The TransformableNode is responsible for moving, scaling and rotating nodes in response to user gestures.

Once you have created a TransformableNode, you can attach the Renderable to it, which will give the model the ability to scale and move in response to user interaction. Finally, you need to connect the TransformableNode to the AnchorNode, in a child-parent relationship which ensures the TransformableNode and Renderable remain fixed in place within the augmented scene.

TransformableNode transformableNode = new TransformableNode(arCoreFragment.getTransformationSystem());

// Connect the TransformableNode to the AnchorNode //

transformableNode.setParent(anchorNode);
transformableNode.setRenderable(dinoRenderable);

// Select the node //

transformableNode.select();
});
}

The completed MainActivity

After performing all of the above, your MainActivity should look something like this:

import android.app.Activity;
import android.app.ActivityManager;
import androidx.appcompat.app.AppCompatActivity;
import android.content.Context;
import android.net.Uri;
import android.os.Build;
import android.os.Build.VERSION_CODES;
import android.os.Bundle;
import android.util.Log;
import android.view.MotionEvent;

import androidx.annotation.RequiresApi;

import com.google.ar.core.Anchor;
import com.google.ar.core.HitResult;
import com.google.ar.core.Plane;

import com.google.ar.sceneform.AnchorNode;
import com.google.ar.sceneform.rendering.ModelRenderable;
import com.google.ar.sceneform.ux.ArFragment;
import com.google.ar.sceneform.ux.TransformableNode;

public class MainActivity extends AppCompatActivity {
    private static final String TAG = MainActivity.class.getSimpleName();
    private static final double MIN_OPENGL_VERSION = 3.0;

// Create a member variable for the ModelRenderable //

    private ModelRenderable dinoRenderable;

// Create a member variable for the ArFragment //

    private ArFragment arCoreFragment;

    @RequiresApi(api = VERSION_CODES.N)
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        if (!checkDevice(this)) {
            return;
        }

        setContentView(R.layout.activity_main);

// Find the fragment, using the fragment manager //

        arCoreFragment = (ArFragment)
                getSupportFragmentManager().findFragmentById(R.id.main_fragment);

        if (Build.VERSION.SDK_INT >= VERSION_CODES.N) {

// Build the ModelRenderable //

            ModelRenderable.builder()
                    .setSource(this, R.raw.dinosaur)
                    .build()
                    .thenAccept(renderable -> dinoRenderable = renderable)
                    .exceptionally(

// If an error occurs... //

                            throwable -> {

// ...then print the following message to Logcat //

                                Log.e(TAG, "Unable to load renderable");
                                return null;
                            });
        }

// Listen for onTap events //

        arCoreFragment.setOnTapArPlaneListener(
                (HitResult hitResult, Plane plane, MotionEvent motionEvent) -> {
                    if (dinoRenderable == null) {
                        return;
                    }

                    Anchor anchor = hitResult.createAnchor();

// Build an AnchorNode //

                    AnchorNode anchorNode = new AnchorNode(anchor);

// Connect the AnchorNode to the Scene //

                    anchorNode.setParent(arCoreFragment.getArSceneView().getScene());

// Build a TransformableNode //

                    TransformableNode transformableNode = new TransformableNode(arCoreFragment.getTransformationSystem());

// Connect the TransformableNode to the AnchorNode //

                    transformableNode.setParent(anchorNode);

// Attach the Renderable //

                    transformableNode.setRenderable(dinoRenderable);

// Select the node //

                    transformableNode.select();
                });
    }

    public static boolean checkDevice(final Activity activity) {

// If the device is running Android Marshmallow or earlier... //

        if (Build.VERSION.SDK_INT < VERSION_CODES.N) {

// ...then print the following message to Logcat //

            Log.e(TAG, "Sceneform requires Android N or higher");
            activity.finish();
            return false;
        }

// Check the version of OpenGL ES //

        String openGlVersionString =
                ((ActivityManager) activity.getSystemService(Context.ACTIVITY_SERVICE))
                        .getDeviceConfigurationInfo()
                        .getGlEsVersion();

// If the device is running anything less than OpenGL ES 3.0... //

        if (Double.parseDouble(openGlVersionString) < MIN_OPENGL_VERSION) {

// ...then print the following message to Logcat //

            Log.e(TAG, "Requires OpenGL ES 3.0 or higher");
            activity.finish();
            return false;
        }
        return true;
    }
}

You can download the completed project from GitHub.

Testing your Google ARCore augmented reality app

You are now ready to test your application on a physical, supported Android device. If you do not have a device that supports ARCore, it is possible to test your AR app in the Android Emulator (with a little extra configuration, which we will discuss in the next section).

To test your project on a physical Android device:

  • Install your application on the target device.
  • Allow the app to access your device's camera when prompted.
  • If you are prompted to install or update the ARCore app, tap "Continue" and complete the dialog to make sure you are using the latest and best version of ARCore.
  • You should now see a camera feed, complete with an animation of a hand holding a device. Point the camera at a flat surface and move your device in a circular motion, as the animation demonstrates. After a few moments, a collection of dots should appear, indicating that a plane has been detected.

  • If you are happy with the position of these dots, give them a tap, and your 3D model should appear on your chosen plane!

  • Try physically moving around the model; depending on your surroundings, you may be able to move a full 360 degrees around it. You should also check that the object is casting a shadow consistent with the real-world light sources.

Test ARCore on an Android virtual device

To test your ARCore apps on an Android Virtual Device (AVD), you need Android Emulator version 27.2.9 or higher. You must also be signed in to the Google Play Store on your AVD and have OpenGL ES 3.0 or higher enabled.

To check if OpenGL ES 3.0 or higher is currently enabled on your AVD:

  • Start your AVD as normal.
  • Open a new Terminal window (Mac) or a command prompt (Windows).
  • Change directory ("cd") so that the Terminal/Command Prompt points to the location of your Android SDK's "adb" program; for example, my command looks like this:

cd /Users/jessicathornsby/Library/Android/sdk/platform-tools

  • Press the "Enter" key on your keyboard.
  • Copy / paste the following command into the terminal and then press the "Enter" key:

./adb logcat | grep eglMakeCurrent

If the Terminal returns "ver 3 0" or higher, then OpenGL ES is configured correctly. If the Terminal or Command Prompt displays anything lower than 3.0, then you will need to enable OpenGL ES 3.0:

  • Switch back to your AVD.
  • Locate the strip of "Extended controls" buttons that floats alongside the Android Emulator, and then select "Settings > Advanced".
  • Navigate to "OpenGL ES API level> Renderer maximum (up to OpenGL ES 3.1)."
  • Restart the emulator.

In the Terminal/Command Prompt window, copy/paste the following command, and then press the "Enter" key:

./adb logcat | grep eglMakeCurrent

You should now get a result of "ver 3 0" or higher, which means that OpenGL ES is configured correctly.

Finally, make sure that your AVD uses the very latest version of ARCore:

  • Head over to the ARCore GitHub page and download the latest release of ARCore for the emulator. At the time of writing, the most recent release was "ARCore_1.7.0.x86_for_emulator.apk".
  • Drag the APK onto your running AVD.

If you want to test your project on an AVD, install your application and grant it access to the "camera" of the AVD when prompted.

You should now see a camera feed of a simulated room. To test your application, move around this virtual space, find a simulated flat surface, and tap to place a model on that surface.

You can move the virtual camera around the virtual room by holding down the "Option" (macOS) or "Alt" (Linux or Windows) key, and then using any of the following keyboard shortcuts:

  • Move left or right: press A or D.
  • Move down or up: press Q or E.
  • Move forward or backward: press W or S.

You can also "look" around the virtual scene by holding down "Option" or "Alt" and then moving your mouse. This can feel awkward at first, but with practice you should be able to explore the virtual space successfully. Once you have found a simulated surface, click the white dots to place your 3D model on that surface.

Wrapping up

In this article we have created a simple augmented reality app with ARCore and the Sceneform plug-in.

If you decide to use Google ARCore in your own projects, share your creations in the comments below!


Written by

Don Bradman
