Marker-Based AR Content on iOS: Unity vs. ARKit

If we had to name one thing set to take over 2018, it would be augmented and virtual reality. We can already see it happening in many areas of our lives (have you seen our recent blog post about AR retail solutions?). In this field, marker-based AR content is the thing: it offers solutions for many different cases, from recognizing and augmenting museum exhibitions to learning about that gooseberry-flavored yoghurt on the store shelf.

There are a few tools that can be used to create an iOS app that detects and tracks markers. Let’s take a look at three of them: the long-serving Vuforia in a duet with Unity, the brand-new ARKit 1.5 (available in the iOS 11.3 beta) and the even newer, spick-and-span ARKit plugin for Unity.

Veteran Vuforia via Unity

The first approach we will discuss is using Vuforia within Unity. Vuforia is an SDK for mobile devices that enables creating AR applications. It uses computer vision technology for real-time recognition and tracking of 2D images and simple 3D objects, such as boxes. This allows developers to position and orient virtual content in relation to the 2D or 3D targets when they are detected by the camera of a mobile device. The position and orientation of the target is then tracked, so that the user’s perspective corresponds with the real-world scene.

Vuforia offers an SDK for iOS and Android, and it’s also available as an extension to the Unity engine. Here, we will discuss the latter. In fact, Vuforia installs easily alongside Unity: just tick the right box in the components list, where it’s suggested as one of the options. AR applications made with Unity are easily portable to both iOS and Android, and all the development takes place within the Unity environment.

There is a great Getting Started guide worth recommending for novice Vuforia developers. Of course, you need to officially become a Vuforia developer to gain access to the dev portal and community.

Let’s get to the point: the targets. Vuforia targets include 2D images, but it can also recognize multi-targets (e.g. boxes and cylinder objects), text, or even 3D objects of your choice, provided you scan them with a special Android app as described here. But since we’re talking iOS, let’s stick to image targets for now. Vuforia offers a web application that allows you to upload and manage targets. Here you can create a database of the desired targets: all it takes is uploading an image, which must be an 8- or 24-bit PNG or JPG file, RGB or grayscale, no bigger than 2 MB.


So let’s give it a try—say we want the app to recognize a couple of Polidea avatars, like these:

Polidea's avatars

After uploading the images to the Vuforia Target Manager we can see their star ratings and check the analysis of the contrast-based features of each target (yellow crosses in the picture below).

Polidea's avatar in Vuforia

After experimenting with these four avatars, it turned out they are poorly featured, which basically means they don’t have enough detail. Vuforia can’t reliably differentiate them from each other. So let’s try adding a detailed background to each of them.

Polidea avatars with detailed backgrounds

Now the feature analysis looks much better, thanks to the greater complexity of the images. Tips for optimizing target detection and tracking stability are all included in this guide.

Polidea avatars in Vuforia with detailed backgrounds

Whether you need to store information about lots of targets or just a few, you can choose between two kinds of databases in Vuforia. The first option is a device database, which is stored on your device and included in your app at installation. The second option is the Vuforia Cloud Recognition Service, which allows you to recognize image targets through a cloud database, providing more capacity and flexibility.

The Pure-ARKit Approach

Now, what if we want to go conservative and create a standard, native iOS app in Xcode? In iOS 11.3 and above, you can enhance the AR experience by enabling image recognition in ARKit, which is an add-on feature for world-tracking AR sessions (we have talked about the very basics of ARKit in this blog post).

In this approach, adding your image of choice as a marker is quite straightforward too. The first step is to add it to the asset catalog inside the Xcode project. Keep the images you want to look for in the same session inside one resource group; this way, separate sets of images can be used in different sessions. This might be useful if the app should look for certain types of markers in different sessions, for instance based on the location inside a building.

Trying out our fine-tuned Polidea avatars with rich backgrounds indicates that they are still similar to each other—Xcode displays the following warnings:

Xcode warnings about the reference images

…but let’s turn a blind eye and give it a try anyway. From the created asset catalog we can load the ARReferenceImage resources and pass them to the detectionImages property of the world-tracking configuration. And this is pretty much all there is to do: the app can already recognize the images.
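To make this concrete, here is a minimal sketch of what such a view controller could look like. The class name, the outlet and the "Avatars" resource group name are illustrative placeholders rather than anything from an official sample:

```swift
import ARKit
import SceneKit
import UIKit

class ImageDetectionViewController: UIViewController, ARSCNViewDelegate {

    // A plain ARSCNView set up in the storyboard (name is illustrative).
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        // Load every ARReferenceImage from the "Avatars" resource group
        // of the asset catalog ("Avatars" is a placeholder name).
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "Avatars", bundle: nil) else {
            fatalError("Missing expected asset catalog resource group.")
        }

        // Image detection is an add-on to the world-tracking configuration.
        let configuration = ARWorldTrackingConfiguration()
        configuration.detectionImages = referenceImages

        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    // ARKit adds an ARImageAnchor when one of the reference images is detected.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let imageAnchor = anchor as? ARImageAnchor else { return }
        print("Detected image: \(imageAnchor.referenceImage.name ?? "unnamed")")
        // Attach virtual content to `node` here to augment the detected marker.
    }
}
```

When ARKit recognizes one of the reference images, it adds an ARImageAnchor to the session, and the delegate callback above is the natural place to position virtual content relative to the detected marker.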

Also, there is one more thing worth mentioning: the new ARKit finally enables detecting vertical surfaces, too. 🎉
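Enabling it is a single extra line on the same world-tracking configuration (again just a sketch, reusing the sceneView from the previous snippet):

```swift
// ARKit 1.5 (iOS 11.3 and above): detect vertical surfaces alongside horizontal ones.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = [.horizontal, .vertical]
sceneView.session.run(configuration)
```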

ARKit within Unity

The last solution to be discussed is the very new ARKit plugin announced recently on Unity’s blog, thanks to which developers can now take full advantage of the ARKit features mentioned above without moving away from the Unity environment.

This plugin brings two new types of assets: ARReferenceImage and ARReferenceImagesSet, which are used to set up the reference images for detection. Image sets are then mapped to asset catalog groups in the Xcode project generated by Unity, exactly like those described in the previous solution. The ARCameraManager game object in the scene holds a reference to the set of images to be detected. What about the detection results? No surprises here: they are very similar to the previous ones.

What Should I Use?

Choosing from the above options is mostly a matter of convenience. There is no actual difference in accuracy, speed or quality of performance that would be visible to the user, so it’s more about the journey than the destination. There are a couple of questions to ask when deliberating over the discussed solutions:

  1. Do you primarily want to simplify the process of deploying your AR app to both Android and iOS platforms? If so, pick Vuforia and stay in Unity.
  2. Do you prefer to stay native and/or feel goosebumps when opening Unity? Don’t worry, just pick the pure ARKit approach.
  3. Do you feel comfortable with Unity, but don’t want to give up the potential of the new ARKit? Go for the Unity plugin.

If you still can’t decide on the right approach after reading this article, get in touch: Polidea’s experts are waiting for you! :) We have experienced AR developers and designers who can help your business and develop a great AR app.


Monika

Software Engineer

Did you enjoy the read?

If you have any questions, don’t hesitate to ask!
