Create an AR app using AR Foundation in Unity


Tested with Unity versions: 2020.3.0f1


In this tutorial, we will set up our development environment and create an AR app using AR Foundation in Unity 2020.3.0f1. First, we will configure everything required to build AR Foundation apps for Android; then we will make a first augmented reality app to verify that the setup is correct.

Several tools and configurations must be set up before you can develop AR applications with Unity for iOS or Android devices. For iOS, you need Xcode (Apple’s integrated development environment) on your development machine; Xcode is required to deploy Unity builds to iOS devices. For Android, you need to install the Android SDK and NDK.

Setup for Android Development

To build Unity augmented reality applications on Android devices, you will need to install Unity Android Build Support. You will also need to install the Android SDK and NDK. By default, Unity installs a Java Development Kit based on OpenJDK.

Install Unity 2020.3.0f1 LTS

If you have already installed this version of Unity, you can move on to the next section.

If you have not installed Unity Hub yet, download it from the Unity website by clicking Download Unity Hub. Once downloaded, install it.

  • Click on Installs and then click on Add
  • Under Recommended Release select Unity 2020.3.0f1 (LTS)
  • Click Next

Select Android Build Support

  • Expand Android Build Support and select both Android SDK and NDK Tools and OpenJDK.
  • Click Next
  • Agree to the End User License Agreement and click Done.
  • Wait until the installation is complete. When it finishes, you should see 2020.3.0f1 listed in your Unity Hub, as shown below.

Setup Android Device

Enable USB Debugging

To see the option for USB debugging mode in Android 4.2 or higher, do the following:

  • Open up your device’s Settings. You can do this by pressing the Menu button while on your home screen and tapping System settings
  • Now scroll to the bottom and tap About phone or About tablet.
  • At the About screen, scroll to the bottom and tap on “Build number” seven times.
  • After the seventh tap, you will see the message “You are now a developer!” pop up.
  • You have now enabled the Developer Options. You can now open the Developer Options and unlock USB debugging mode by going to Settings -> Developer Options -> Debugging -> USB debugging.
  • Connect your device to the computer with a USB cable. You might need to install device-specific drivers.

Setup AR Foundation

AR Foundation allows the creation of augmented reality applications using Unity for both iOS and Android target devices. This means that you can use AR Foundation with Unity to build an AR application that you can deploy to both iOS and Android devices without changing scripts or scene settings.

  • Create a new Unity 3D project using the 2020.3.0f1 version.
  • Once we have created the project, go to Window in Unity Editor’s main menu and open the Package Manager.
  • Select Unity Registry from the Packages dropdown.
  • Find AR Foundation from the list and select it.
  • Click Install.
  • Select ARCore XR Plugin and install it. (You will need to install ARKit XR Plugin if you are using iOS).
  • If you deploy your AR app to both Android and iOS target devices, you will need to install both the ARCore and ARKit XR Plugins.

Create Your First AR Application

For creating augmented reality applications, we do not need the default Camera GameObject in the scene. Go ahead and delete the Main Camera GameObject from the scene hierarchy.

AR Session GameObject

  • Right-click on the scene hierarchy, select XR, and select the AR Session GameObject.

Every AR app requires an ARSession component. The ARSession component controls the life-cycle of an AR experience. It is responsible for enabling and disabling AR on a target platform. If the scene does not contain an ARSession, it will not track its environmental features. AR Session GameObject comes with an ARSession component. If you do not want to add the AR Session GameObject, you will have to attach the ARSession component to any existing GameObjects in the scene.
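If you prefer to guarantee this from code, a minimal sketch might look like the following (the SessionBootstrap class name is hypothetical; ARSession and ARInputManager are AR Foundation components):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper: ensures the scene has an ARSession at runtime.
public class SessionBootstrap : MonoBehaviour
{
    void Awake()
    {
        // If no ARSession exists in the scene, create one so that
        // AR tracking can start on the target device.
        if (FindObjectOfType<ARSession>() == null)
        {
            var sessionGO = new GameObject("AR Session");
            sessionGO.AddComponent<ARSession>();
            sessionGO.AddComponent<ARInputManager>();
        }
    }
}
```

In practice, adding the AR Session GameObject from the XR menu (as above) achieves the same result without any code.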

Note: An AR session is a global construct. An ARSession component manages this global session, so adding multiple ARSession components will manage the same global session.

AR Session Origin GameObject

  • Right-click on the scene hierarchy, select XR, and select the AR Session Origin GameObject.

AR devices provide their data in “session space,” an unscaled space relative to the AR session’s beginning. The purpose of ARSessionOrigin is to perform the appropriate transformation into Unity space.

ARSessionOrigin transforms trackable features, such as planar surfaces and feature points, into their final position, orientation, and scale in the Unity Scene.

This concept is similar to the difference between “model” or “local” space and world space when working with Unity assets. For instance, if you import a car from Maya or any other content creation tool, the door’s position is relative to the modeler’s origin. We call this the “model space” or “local space.” When Unity instantiates it, it also has a world space that’s relative to Unity’s origin.

Similarly, AR trackable features, such as planes produced by an AR device, are in session space relative to the device’s coordinate system. When instantiated in Unity as GameObjects, they also have a world space. AR Foundation needs to know where the session origin should be in the Unity scene to instantiate them in the correct place. ARSessionOrigin provides just that.

ARSessionOrigin also allows scaling virtual content and applying an offset to the AR Camera. If you are scaling or offsetting the ARSessionOrigin, its AR Camera should be a child of the ARSessionOrigin. Because the AR Camera is session-driven, this setup allows the AR Camera and detected trackable features to move together.
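As a sketch of what that scaling looks like in code (the ContentScaler class name is hypothetical), note that scaling the origin up makes spawned content appear proportionally smaller in the real world:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical example: scale the session origin to shrink content.
public class ContentScaler : MonoBehaviour
{
    // Assign the AR Session Origin in the Inspector.
    public ARSessionOrigin m_sessionOrigin;

    void Start()
    {
        // With a scale of 10, a 1 m object appears roughly 10 cm tall,
        // because 10 real-world meters now map onto 1 Unity unit.
        m_sessionOrigin.transform.localScale = Vector3.one * 10f;
    }
}
```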

Enable Plane Detection

  • Select the AR Session Origin GameObject from the scene hierarchy and add the AR Plane Manager component. Specify the AR Plane Manager component to detect horizontal and vertical planes by setting the Detection Mode dropdown to Everything.
  • Select the AR Session Origin GameObject from the scene hierarchy and add the AR Raycast Manager component. The AR Session Origin should look like the below in the Inspector view.
  • We will need a plane prefab so that the AR Plane Manager can display the detected planes’ location and orientation. Right-click on the Hierarchy, select XR, and then select AR Default Plane.
  • Under Assets, create a new folder called Resources. Create another new folder in the Resources directory and name it Prefabs.
  • Drag this AR Default Plane from the Hierarchy to the Prefabs folder to make it a prefab.
  • Delete the AR Default Plane from the Hierarchy. Now drag the prefab from the Project Window to the Plane Prefab variable of AR Plane Manager.
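If you want to observe plane detection as it happens, the AR Plane Manager exposes a planesChanged event you can subscribe to. A small sketch (the PlaneLogger class name is hypothetical):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical helper: logs each plane the AR Plane Manager detects.
public class PlaneLogger : MonoBehaviour
{
    // Assign the AR Session Origin's AR Plane Manager in the Inspector.
    public ARPlaneManager m_planeManager;

    void OnEnable()  => m_planeManager.planesChanged += OnPlanesChanged;
    void OnDisable() => m_planeManager.planesChanged -= OnPlanesChanged;

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        // args also exposes updated and removed planes.
        foreach (var plane in args.added)
            Debug.Log($"Detected plane {plane.trackableId} ({plane.alignment})");
    }
}
```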

Create a Spawnable Object

We will now create a Spawnable Object in our scene. This is the object that we will spawn in our AR application. Open the link below in your browser, then download and import the asset into your project.

  • Right-click on the Hierarchy and create a new empty game object and call it Vase.
  • Add the FloralGoldJar prefab from the downloaded Asset folder into the Vase game object.
  • Set the Vase transform’s scale to (0.2, 0.2, 0.2).
  • Create a tag called Spawnable and set this tag to the Vase.
  • Now, drag this Vase from the Hierarchy to the Prefabs folder to make it a prefab.
  • Delete the Vase from the Hierarchy.

Create the Spawner

In the previous section, we created the Spawnable Vase prefab. We will now build the mechanism for placing it in the scene.

  • Create an empty game object and name it Spawner.
  • Add a new script component and name the script Spawner.
  • Double-click the Spawner script to open it in Visual Studio, and add the following fields to the class.
/// <summary>
/// The reference to ARRaycastManager. This will handle 
/// the ray casting towards the trackable features.
/// </summary>
public ARRaycastManager m_ARRaycastManager;

/// <summary>
/// The prefab for the spawned object. You can have a list
/// of prefabs as well and create some mechanism to let the user
/// select what object they want to place in the scene. 
/// Think of the IKEA app.
/// </summary>
public GameObject m_spawnableObjectPrefab;

/// <summary>
/// Representation of a Position, and a Rotation in 3D Space. 
/// This structure is used primarily in XR applications to 
/// describe the current "pose" of a device in 3D space.
/// </summary>
Pose m_placementPose;

/// <summary>
/// A temporary variable to hold the recently spawned object.
/// </summary>
GameObject m_spawnedObject = null;

In the Start method, set the value of m_spawnedObject to null.

void Start()
{
    m_spawnedObject = null;
}

In the Update method, we will implement the functionality for spawning objects.

private void Update()
{
    //Check for touch inputs. If there is no touch event then return.
    if (Input.touchCount == 0)
        return;

    //If there is a touch event then get the touch position.
    var touchPt = Input.GetTouch(0).position;

    List<ARRaycastHit> hits = new List<ARRaycastHit>();

    //Do a raycast using the ARRaycastManager to get the hits.
    //ARRaycastManager manages an XRRaycastSubsystem, exposing
    //raycast functionality in AR Foundation. Use this component
    //to raycast against trackables (i.e., detected features in
    //the physical environment) when they do not have a presence
    //in the Physics world.
    m_ARRaycastManager.Raycast(touchPt, hits);
    if (hits.Count == 0)
        return;

    //If there is a hit on trackable features then keep a
    //reference to the Pose.
    m_placementPose = hits[0].pose;

    if (Input.GetTouch(0).phase == TouchPhase.Began)
    {
        //Spawn the object at the placement position.
        Spawn(m_placementPose.position);
    }
    else if (Input.GetTouch(0).phase == TouchPhase.Moved && m_spawnedObject != null)
    {
        //If there is a TouchPhase.Moved event and the spawned object
        //is not null then reposition the object based on our touch input.
        m_spawnedObject.transform.position = m_placementPose.position;
    }

    if (Input.GetTouch(0).phase == TouchPhase.Ended)
    {
        //If the TouchPhase has ended then reset m_spawnedObject
        //to null so that we can handle another new spawned object.
        m_spawnedObject = null;
    }
}

Now we will implement the Spawn method as below.

void Spawn(Vector3 position)
{
    m_spawnedObject = Instantiate(m_spawnableObjectPrefab, position, Quaternion.identity);
}
  • Back in the Unity Editor, drag and drop the Vase prefab onto the Spawnable Object Prefab field, and drag the AR Session Origin GameObject (which holds the AR Raycast Manager component) onto the AR Raycast Manager field.
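For reference, the fragments above assemble into a complete Spawner class along these lines (a sketch; names follow the fields and methods shown earlier):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class Spawner : MonoBehaviour
{
    public ARRaycastManager m_ARRaycastManager;
    public GameObject m_spawnableObjectPrefab;

    Pose m_placementPose;
    GameObject m_spawnedObject = null;

    void Update()
    {
        // No touch input this frame: nothing to do.
        if (Input.touchCount == 0)
            return;

        var touchPt = Input.GetTouch(0).position;
        List<ARRaycastHit> hits = new List<ARRaycastHit>();

        // Raycast against detected trackables (e.g. planes).
        m_ARRaycastManager.Raycast(touchPt, hits);
        if (hits.Count == 0)
            return;

        m_placementPose = hits[0].pose;

        if (Input.GetTouch(0).phase == TouchPhase.Began)
            Spawn(m_placementPose.position);
        else if (Input.GetTouch(0).phase == TouchPhase.Moved && m_spawnedObject != null)
            m_spawnedObject.transform.position = m_placementPose.position;

        if (Input.GetTouch(0).phase == TouchPhase.Ended)
            m_spawnedObject = null;
    }

    void Spawn(Vector3 position)
    {
        m_spawnedObject = Instantiate(m_spawnableObjectPrefab, position, Quaternion.identity);
    }
}
```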

Android Build Setting Configuration

In Unity Editor, go to File and click Build Settings.

Click on Player Settings on the Build Settings window.

  • Player Settings > Other Settings > Rendering: If Vulkan appears under Graphics APIs, remove it.
  • Player Settings > Other Settings > Package Name: Create a unique App ID using a Java package name format. For example, use com.GDD.MXRT.
  • Player Settings > Other Settings > Minimum API Level: Android 7.0 ‘Nougat’ (API Level 24) or higher. (For AR Optional apps, the minimum API level is 14.)
  • Player Settings > XR Plug-in Management: Enable ARCore.

Build and run the application on your Android Phone.
