DELIVERABLES

An Android app that lets users select two points in the real world and displays the measurement between them.

 

TECHNOLOGIES USED

      • ARCore
      • Android Studio
      • Java

Augmented Reality (AR) is an area of computer science that allows programs to supplement the real world with artificial constructs. These constructs can take the form of static images, directions around an area, or even other environments superimposed on the existing one. By using AR, an app can simulate interactions with the real world in a virtual space. Google's AR solution for Android is ARCore, which lets most existing Android phones run AR apps on their existing hardware. While this doesn't provide as much accuracy as custom hardware might (a phone's single camera versus two cameras building a virtual understanding of the world), it makes AR technology far more accessible.

 

TECHNICAL DEEP-DIVE


The AndPlus Innovation Lab is where our passion projects take place. Here, we explore promising technologies, cultivate new skills, put novel theories to the test and more — all on our time (not yours).


OUR RESEARCH

Google initially approached AR on Android with more custom hardware, but in 2017 it began work on ARCore as a way to make AR accessible to current Android users. In February 2018, ARCore 1.0 was released, making it officially available for use in apps across a wide range of phones, rather than just the small set supported during the beta. ARCore is currently available to developers in several formats: Android, Android NDK, Unity, and the Unreal Engine. For our initial test and experimentation with AR, we chose to dive directly into the Android code to gain a better understanding of how ARCore works and to leverage our experience building other Android apps in Android Studio.

Using the basic tutorials available for ARCore on Android, we were able to create an initial application that prompts the user to install ARCore and ensures it has the proper permissions. Unfortunately, the only other resources available are the API references, so to see how ARCore is actually integrated into an app we had to study the provided sample app. By working through how the sample ARCore project operates and cross-referencing the API documentation, we built up an understanding of the core concepts in ARCore and how they interact with one another: initializing an ARCore Session, getting an updated Frame from it to draw the current view of the world, generating Anchors as real-world points, and then drawing the AR representation based on the Pose of each Anchor. Our initial assumption was that part of the API would provide easy access to draw the camera view and display AR objects, but as we explored the API and sample app we learned that most of the graphical side of ARCore is handled through the OpenGL ES (GL ES) framework, making experience with it important.
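
To make that flow concrete, below is a minimal sketch of the per-frame interaction. ArFlowSketch and handleFrame are our own simplification of the sample app's render loop, not part of the ARCore API; the ARCore calls themselves come from the com.google.ar.core Java package.

    // Minimal sketch of the core ARCore concepts described above (ARCore 1.x Java API).
    // The surrounding class and method are a simplification of the sample app's render
    // loop; the actual rendering (camera image, AR objects) is done with GL ES.
    import com.google.ar.core.Anchor;
    import com.google.ar.core.Frame;
    import com.google.ar.core.HitResult;
    import com.google.ar.core.Pose;
    import com.google.ar.core.Session;
    import com.google.ar.core.exceptions.CameraNotAvailableException;

    public class ArFlowSketch {

        // Called once per rendered frame, after the Session has been created and resumed.
        Anchor handleFrame(Session session, Float tapX, Float tapY)
                throws CameraNotAvailableException {
            // 1. Ask ARCore for its latest understanding of the world.
            Frame frame = session.update();

            // 2. The camera image and any AR objects are drawn here using GL ES,
            //    based on the view/projection matrices from frame.getCamera().

            // 3. If the user tapped the screen, hit-test against detected geometry
            //    and create an Anchor at the first hit: a fixed real-world point.
            if (tapX != null && tapY != null) {
                for (HitResult hit : frame.hitTest(tapX, tapY)) {
                    Anchor anchor = hit.createAnchor();
                    Pose pose = anchor.getPose();  // position/orientation in world space
                    return anchor;                 // 4. draw and measure from this Pose
                }
            }
            return null;
        }
    }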

As our goal was to learn about ARCore and not GL ES, we leveraged much of the graphical logic from the sample app and focused on the interaction with ARCore. From this, we created a basic app that calculates the distance between two points in real-world space.


DELIVERABLE

The resulting app lets the user place two Anchors by tapping the screen; each Anchor represents a real-world location. From these Anchors, we calculate the length using the standard formula for the distance between two three-dimensional points. ARCore defaults its units to meters, so we then convert the length to feet for simplicity. Below is a screenshot with two Anchors placed roughly one foot apart and the corresponding length calculated by the app.
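
A minimal sketch of that calculation, assuming two placed Anchors (MeasureUtil and distanceInFeet are our own names, not part of the ARCore API):

    // Sketch of the length calculation described above. Pose.tx(), ty(), and tz()
    // are the ARCore accessors for an Anchor's translation (in meters) in world space.
    import com.google.ar.core.Anchor;
    import com.google.ar.core.Pose;

    final class MeasureUtil {

        static double distanceInFeet(Anchor first, Anchor second) {
            Pose p1 = first.getPose();
            Pose p2 = second.getPose();

            double dx = p1.tx() - p2.tx();
            double dy = p1.ty() - p2.ty();
            double dz = p1.tz() - p2.tz();

            // Euclidean distance between the two three-dimensional points, in meters.
            double meters = Math.sqrt(dx * dx + dy * dy + dz * dz);

            // ARCore works in meters; convert to feet (1 m = 3.28084 ft).
            return meters * 3.28084;
        }
    }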

 

HOW IT WAS DONE

  • Add the ARCore dependencies to the app
  • Add logic to ensure ARCore is available and initialize the Session (see the sketch after this list)
  • Display the camera view in a GLSurfaceView
  • Register single taps from the user to generate Anchors
  • Calculate the length from each Anchor's Pose (real-world position)
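
As a reference for the second step, here is a rough sketch of the availability check and Session creation, modeled on the pattern in Google's ARCore documentation. MeasureActivity is a hypothetical Activity, error handling is trimmed, and the exact support-library imports may differ from our app.

    // Rough sketch of the ARCore availability check and Session creation (step 2 above),
    // following the documented onResume() pattern. MeasureActivity is hypothetical.
    import android.Manifest;
    import android.content.pm.PackageManager;
    import androidx.appcompat.app.AppCompatActivity;
    import androidx.core.app.ActivityCompat;
    import androidx.core.content.ContextCompat;
    import com.google.ar.core.ArCoreApk;
    import com.google.ar.core.Session;

    public class MeasureActivity extends AppCompatActivity {

        private Session session;
        private boolean userRequestedInstall = true;

        @Override
        protected void onResume() {
            super.onResume();
            if (session != null) {
                return;
            }
            try {
                // Ask ARCore to install or update itself if needed.
                switch (ArCoreApk.getInstance().requestInstall(this, userRequestedInstall)) {
                    case INSTALL_REQUESTED:
                        // Installation was launched; onResume() runs again afterwards.
                        userRequestedInstall = false;
                        return;
                    case INSTALLED:
                        break;
                }

                // The camera permission must be granted before creating the Session.
                if (ContextCompat.checkSelfPermission(this, Manifest.permission.CAMERA)
                        != PackageManager.PERMISSION_GRANTED) {
                    ActivityCompat.requestPermissions(
                            this, new String[] {Manifest.permission.CAMERA}, 0);
                    return;
                }

                session = new Session(this);
            } catch (Exception e) {
                // UnavailableException subclasses: unsupported device, declined install, etc.
            }
        }
    }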

 

LIMITATIONS

As discussed earlier, developing with ARCore on Android requires knowledge of the GL ES framework, which makes it difficult to put something together without prior experience. Since ARCore is available in other formats, it may be worth investigating ways to integrate ARCore into an app that don't require knowledge of this framework. In addition, ARCore does not yet support vertical surface detection (as of version 1.0.0); working with vertical surfaces is still possible, but it requires extra work on the app side. Lastly, the API doesn't provide camera controls, so while the current camera view can be displayed easily, it will not focus on closer objects or allow the user to zoom. These features will most likely appear in later versions, but for now apps using ARCore will have to make do without them.
