Hello World with Targets (surface)

Welcome to this tutorial. It walks through all the steps needed to create a native Android/iOS app that uses Onirix's Targets technology. The goal is an app that runs on your AR-compatible device and lets you place 3D content on a surface as part of an Augmented Reality scene.

Setup and Tools

Let's go over the tools we are going to need.

  1. Onirix Studio: the web platform that is the starting point for all our AR projects.
  2. Unity: the most popular and widely used 3D engine for mobile applications. Unity is cross-platform, which allows developers to deploy the same app on multiple platforms.
  3. Onirix SDK (mobile version): allows us to create Targets-based apps.
  4. Full code, available on GitHub.

Step 1: Create a new Targets project

We will start by creating a new project in the studio. In this Getting Started video you can see how to create your first project in Onirix. Inside the project we create a new Target, and use the scene editor to place our first 3D model.

Follow the video from 0:33 to 1:45 and then continue reading. You can take a look at our Scene editor page if you want to know more.

Step 2: Unity platform

The next step is to download Unity and create a new project. Make sure you are running at least version 2018.2.7. When you open Unity, a window appears: go to the Projects tab and click on New project. Give it a name, leave all other options at their defaults, and create it. Initialization may take several seconds.

Create project in Unity

Step 3: Onirix SDK

The next step is to download the Onirix SDK. To do so, go to the Onirix Mobile SDK repository on GitHub and download the latest release (a Unity package), which allows us to create Onirix Targets projects (placing virtual content over images or any surface).

A double click on the downloaded package will start the import process in Unity, which will show you all the contents of the package.

Onirix SDK import screen

We leave everything selected and click on Import. This completes the inclusion of the Onirix SDK in our project.

Step 4: Creating the App

Now we have everything we need to create our first app. We will create an empty element inside the scene called MainController (you can name it as you prefer). This element will control the logic of our application.

Now, we will add a series of Scripts in our recently created MainController. Open the inspector and drag & drop the following scripts inside.

Script 1: OnirixMobileManager

The first one is the OnirixMobileManager. Select the MainController and drag & drop the script onto it.

This component provides surface and marker detection and works as an abstraction layer over both the ARCore and ARKit frameworks. The script needs some fields to be set from the inspector; the complete list is shown below. Although you can provide your own prefabs to customize its behaviour and appearance, we highly recommend using the default ones provided with the Onirix SDK.

The complete list is:

  • Legacy Camera
  • ARCore Camera
  • ARKit Camera
  • Crosshair Prefab
  • Crosshair Found
  • Crosshair Finding
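For orientation, inspector fields like these typically correspond to serialized fields in the component's source. The sketch below uses illustrative names and types, not the SDK's actual code:

```csharp
using UnityEngine;

// Hypothetical sketch of how a component like OnirixMobileManager could
// declare its inspector fields; the real names and types live in the
// Onirix SDK source shipped with the package.
public class SurfaceManagerSketch : MonoBehaviour
{
    [SerializeField] private Camera _legacyCamera;
    [SerializeField] private Camera _arCoreCamera;
    [SerializeField] private Camera _arKitCamera;
    [SerializeField] private GameObject _crosshairPrefab;
    [SerializeField] private Material _crosshairFoundMaterial;
    [SerializeField] private Material _crosshairFindingMaterial;
}
```

Fields marked with [SerializeField] appear in the inspector even though they are private, which is why you can fill them by drag & drop.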

Script 2: DynamicLoadManager

The next script we must include is the DynamicLoadManager. It is responsible for loading elements dynamically while our app runs. For this we need to include the prefab of the object placeholder (LoadingElement prefab), that is, the element that will appear while the asset is being loaded.

Script 3: OnirixAssetLoader

We are not done yet. The next script is the OnirixAssetLoader, a dependency of the previous one. It can be found under dependencies/Onirix SDK/plugins. Select the MainController and drag & drop the script onto it. It asks for four prefabs: the default elements shown when there is an error loading 3D objects, 2D elements, videos, or audios.

The complete list is:

  • Default2DElement
  • Default3DElement
  • DefaultVideoElement
  • DefaultAudioElement

Script 4: Your own script

This is the last script we are going to add: a script of our own that manages the load logic of the Target we created in the Studio. Create a Scripts folder and a new script called MainController, then double-click it to edit it.

All Unity scripts that extend MonoBehaviour come with two default lifecycle methods: Start (called once before the first frame update) and Update (called once per frame). In this tutorial we will only need to fill in the Start method. We begin by declaring a series of serialized fields at class level.
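When you create the script, Unity generates this standard skeleton, which we will fill in:

```csharp
using UnityEngine;

public class MainController : MonoBehaviour
{
    // Called once before the first frame update; all our setup goes here.
    void Start()
    {
    }

    // Called once per frame; unused in this tutorial.
    void Update()
    {
    }
}
```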

[SerializeField] private string _projectToken;     // Onirix project token
[SerializeField] private string _targetOid;        // Target identifier (OID)
[SerializeField] private Button _loadTargetButton; // places the target on tap
[SerializeField] private Text _statusText;         // status messages for the user

The first is a reference to the project token, the ID that allows us to access the Onirix project from the SDK. The second is a reference to the Target identifier in Onirix (OID). We also create two UI references: a button that will let us place the target when we want, and a label that shows the status of the application.

For the Button and Text classes to work we must include the UnityEngine.UI namespace. You can later set values for these serialized fields from the inspector.
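At the top of MainController.cs, that means:

```csharp
using UnityEngine;
using UnityEngine.UI; // provides the Button and Text classes
```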

In the next step, we need a reference to the OnirixMobileManager, which is in charge of everything related to AR in our app. We create a public getter property that fetches that reference lazily.

private OnirixMobileManager _onirixMobileManager;

public OnirixMobileManager MobileManager
{
    get
    {
        if (!_onirixMobileManager)
            _onirixMobileManager = gameObject.GetComponent<OnirixMobileManager>();
        return _onirixMobileManager;
    }
}

Now it's time to link the fields from the Unity editor. Drag & drop the newly created script onto the MainController element, and the corresponding placeholders will appear in the inspector. The first one is the project token, which we get from the Studio: open the project view and use the copy button in the project's contextual menu, then go back to Unity and paste it into the field.

The Target OID works similarly: copy it from the properties panel at the top right and paste it into the corresponding placeholder.

For the two remaining elements we use a canvas, as they are UI elements. Create the canvas and set a reference resolution to work with, for example Full HD (1920x1080). Then add the status text, anchor it to the top center (for example, -100 pixels from the top anchor), adjust its height and width, and center it horizontally. Change the default text to "Look around to detect surfaces", increase the font size, and so on.

After that we add the button we will use to place our Target in the scene. It goes at the bottom, centered. Customize it to your liking and set its text to "Load Target", and we are done.

We can now reference all elements in our script.

Main script inspector

The first thing that we want to change inside the app is preventing the screen from switching to energy saving mode. To achieve this, we use a variable of the Screen class, called sleepTimeout, which allows us to prevent it from ever entering that state.

Screen.sleepTimeout = SleepTimeout.NeverSleep;

The next thing is to initialize the DynamicLoadManager, which will load our Target. To do this we call the Init method on its instance, passing three parameters: the MobileManager, the project token, and a listener that notifies us when a target's assets start downloading, start loading, or finish either step. Our MainController class can act as the listener itself: implement the IDynamicLoadListener interface and its methods. For now we simply write the corresponding status message in each callback.

DynamicLoadManager.Instance.Init(MobileManager, _projectToken, this);

    public void OnTargetAssetsDownloaded(Target target)
    {
        _statusText.text = "Target assets downloaded!";
    }

    public void OnTargetAssetsLoaded(Target target)
    {
        _statusText.text = "Target assets loaded!";
    }

    public void OnTargetAssetsStartDownloading(Target target)
    {
        _statusText.text = "Target assets started downloading!";
    }

    public void OnTargetAssetsStartLoading(Target target)
    {
        _statusText.text = "Target assets started loading!";
    }
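For these callbacks to be picked up, the class declaration must state that MainController implements the listener (assuming IDynamicLoadListener is an interface, as its use here suggests):

```csharp
public class MainController : MonoBehaviour, IDynamicLoadListener
{
    // serialized fields, the MobileManager property,
    // and the four callbacks shown above
}
```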

The next logical step would be to start the surface detection, but since this is a feature of ARCore/ARKit, we cannot use it before ARCore/ARKit has initialized. For this we create a coroutine that waits until that happens. We create it in a new method, WaitForAR, and use the IsReady flag of the MobileManager to know when AR is available; at that point the coroutine invokes the callback it was given. We start the coroutine inside the Start method and we are done.

// Requires "using System.Collections;" for IEnumerator.
private IEnumerator WaitForAR(System.Action onReady)
{
    yield return new WaitUntil(() => MobileManager.IsReady);
    onReady();
}
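Inside Start, the coroutine is launched with StartCoroutine, passing the code that should run once AR is ready:

```csharp
// The lambda runs once MobileManager.IsReady becomes true.
StartCoroutine(WaitForAR(() =>
{
    // Surface detection can safely begin here (next step).
}));
```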

Once ARCore/ARKit has started we can begin with the surface detection. We call the StartSurfaceTarget method of the MobileManager. It receives two callbacks: the first fires when a surface has been detected (we show this via the status text defined earlier and enable the button); the second fires when the surface is lost (we display another status message and disable the button).

    MobileManager.StartSurfaceTarget(
        () =>
        {
            _statusText.text = "Click on the button to place the target";
            _loadTargetButton.interactable = true;
        },
        () =>
        {
            _statusText.text = "Move around to detect a surface";
            _loadTargetButton.interactable = false;
        });

Finally, we add a listener to the load-target button. When it is clicked, we load the corresponding asset with the DynamicLoadManager, passing it the target OID, and update the status text accordingly. The crosshair then marks the place where the asset will be loaded. We also disable the button so that it only becomes active again once a surface has been detected.

_loadTargetButton.onClick.AddListener(() =>
{
    _statusText.text = "Loading target, please wait ...";
    _loadTargetButton.interactable = false;
    DynamicLoadManager.Instance.LoadTarget(_targetOid);
});
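Putting the pieces together, the complete Start method reads roughly as follows. This is a sketch assembled from the snippets in this step; the full source on GitHub is the reference:

```csharp
void Start()
{
    // Keep the screen awake while the AR session runs.
    Screen.sleepTimeout = SleepTimeout.NeverSleep;

    // Wire up the dynamic loader with this class as listener.
    DynamicLoadManager.Instance.Init(MobileManager, _projectToken, this);

    // Start surface detection only once ARCore/ARKit is ready.
    StartCoroutine(WaitForAR(() =>
    {
        MobileManager.StartSurfaceTarget(
            () =>
            {
                _statusText.text = "Click on the button to place the target";
                _loadTargetButton.interactable = true;
            },
            () =>
            {
                _statusText.text = "Move around to detect a surface";
                _loadTargetButton.interactable = false;
            });
    }));

    // Load the target when the user taps the button.
    _loadTargetButton.onClick.AddListener(() =>
    {
        _statusText.text = "Loading target, please wait ...";
        _loadTargetButton.interactable = false;
        DynamicLoadManager.Instance.LoadTarget(_targetOid);
    });
}
```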

Step 5: Exporting the App

Exporting for Android

We only have to build (compile and export) the app. Open Build Settings, choose the destination platform (Android in this case) and click Switch Platform. Don't worry if it takes a while to import the necessary assets.

When the process finishes we have to check four things:

  1. The bundle ID of the app (here we will use com.onirix.hwtargets, but this is completely up to you).
  2. The minimum Android API version (24, Nougat, is the minimum for ARCore).
  3. The .NET compatibility level, set to .NET 2.0 (without subset).
  4. Ensure the ARCore Supported checkbox is enabled in the XR settings.

Finally, we can press "Build and run".

Give the APK file a name and connect your device to start exporting the app. It will take a few minutes while Unity compiles a series of shaders to run on Android; this only happens the first time.

Exporting for iOS

See the Hello World with Targets (marker).