Smartphone Augmented Reality — Lab

Hands-on lab on smartphone augmented reality using ARCore.

Objective

Create an augmented reality application capable of:

  1. detecting an image used as a marker,
  2. displaying a 3D object attached to this marker,
  3. detecting a plane in the environment,
  4. defining a play area,
  5. creating interactions between markers and the area.

Part 1 — Project setup

  1. Create a Unity project.

  2. Switch the platform to Android in Build Settings.

To access logs on the phone, install Android Logcat from the Package Manager.

  3. Install AR Foundation.

  4. Install ARCore XR Plugin (Android) or ARKit XR Plugin (iOS).

  5. Open Project Settings → XR Plug-in Management and enable:

  • ARCore in the Android tab
  • ARKit in the iOS tab (if needed)
  6. In Project Settings → Player → Android:
  • set Minimum API Level: Android 10 / API 29
  • in Graphics APIs, disable Vulkan and keep OpenGLES3 only
  7. Add to the scene:
  • AR Session
  • XR Origin (with AR Camera)
  8. Mobile rendering configuration:
  • In the project, open: Assets/Settings/Mobile_Renderer
  • In the Inspector: Add Renderer Feature → AR Background Renderer Feature

Verify that the application launches on the phone and that the camera feed is visible.

When launching the app on the phone for the first time, allow camera access when requested by the system.

Part 2 — Image detection

  1. Create a new Reference Image Library asset.
  2. Add a marker image.
  3. Add an ARTrackedImageManager component to the XR Origin.
  4. Assign the image library.
  5. Create a prefab for the 3D object to display on the marker.
  6. Assign this prefab to the ARTrackedImageManager.

You can use the markers provided for this lab. The ImageLibrary asset is available here and the markers are available here.

Part 3 — Plane detection (selecting a plane + play area)

Objective

  • detect horizontal/vertical planes using AR Foundation
  • allow the user to tap on a detected plane
  • place a play area (GameBoard) on this plane

3.1 Add required AR components

On the XR Origin object:

ARPlaneManager

  • Detects and maintains planes.
  • Assign a Plane Prefab to visualize planes.

Create a simple Plane Prefab

  1. Create → Empty GameObject
  2. Name the object: ARPlanePrefab
  3. Add the components:
  • AR Plane
  • AR Plane Mesh Visualizer
  • Mesh Renderer
  • Mesh Filter
  4. Create a simple material (e.g. Unlit / Color, transparent).
  5. Assign this material to the Mesh Renderer.
  6. Drag the object into the Project folder to create a prefab.
  7. Assign this prefab in ARPlaneManager → Plane Prefab.

ARRaycastManager

  • Used to cast a ray from the screen to detect planes.

Recommended setting:

  • Detection Mode → Horizontal

3.2 Create a play area prefab (GameBoard)

Create a simple prefab:

  • a Quad lying flat
  • a semi-transparent material
  • scale around 0.5 m
  • optional: BoxCollider

3.3 Implement tap-to-place (New Input System)

In this lab we use the New Input System (Unity Input System Package) to detect taps.

Prerequisites

  1. Open Project Settings → Player
  2. Set Active Input Handling → Input System Package (New)

Principle

Create a script BoardPlacement.cs. The script should:

  1. enable EnhancedTouch (at startup)
  2. retrieve the first active touch
  3. filter the Began phase
  4. perform an ARRaycast toward planes (PlaneWithinPolygon)
  5. create or move the GameBoard at the hit pose

Useful functions / APIs:

  • EnhancedTouchSupport.Enable() / Disable()
  • Touch.activeTouches
  • touch.phase (New Input System)
  • touch.screenPosition
  • ARRaycastManager.Raycast(...)
  • TrackableType.PlaneWithinPolygon
  • Pose
  • Instantiate(...)
  • transform.SetPositionAndRotation(...)
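The placement logic described above could look roughly like the sketch below. Class and field names are assumptions (only `BoardPlacement.cs` is prescribed by the lab), and the `ARRaycastManager` reference and GameBoard prefab must be assigned in the Inspector:

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;
using TouchPhase = UnityEngine.InputSystem.TouchPhase;

// Sketch: places (or moves) the GameBoard prefab on a tapped plane.
public class BoardPlacement : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager; // assign in the Inspector
    [SerializeField] GameObject boardPrefab;          // the GameBoard prefab

    GameObject board;
    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void OnEnable()  => EnhancedTouchSupport.Enable();   // step 1: enable EnhancedTouch
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        if (Touch.activeTouches.Count == 0) return;      // step 2: first active touch
        var touch = Touch.activeTouches[0];
        if (touch.phase != TouchPhase.Began) return;     // step 3: react to the tap only

        // Step 4: raycast from the touch position against detected planes
        if (raycastManager.Raycast(touch.screenPosition, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose; // closest hit comes first

            // Step 5: create the board once, then move it on subsequent taps
            if (board == null)
                board = Instantiate(boardPrefab, pose.position, pose.rotation);
            else
                board.transform.SetPositionAndRotation(pose.position, pose.rotation);
        }
    }
}
```

Reusing a single static `hits` list avoids allocating a new list on every frame, which matters on mobile.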

Checkpoint

You should be able to:

  • see detected planes
  • tap a plane
  • place the play area

Part 4 — Interactions

4.1 Basic interaction

Implement an interaction between:

  • detected markers
  • the play area

Possible examples:

  • an object appears when the marker enters the area
  • the marker moves an object within the area
  • the marker orientation modifies a parameter

This section is intentionally open-ended.

4.2 Resources for going further

The AR Foundation package provides several useful features to enrich your project.

Image tracking

  • ARTrackedImageManager
  • events: trackedImagesChanged
  • access to marker position
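As an illustration of this event (assuming AR Foundation 5.x, where `ARTrackedImageManager` exposes `trackedImagesChanged`; the class name below is a made-up example):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: logs marker detection and tracking updates.
public class MarkerListener : MonoBehaviour
{
    [SerializeField] ARTrackedImageManager imageManager; // assign in the Inspector

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var image in args.added)
            Debug.Log($"Marker detected: {image.referenceImage.name}");

        foreach (var image in args.updated)
            // image.transform gives the marker's current position/rotation
            Debug.Log($"{image.referenceImage.name}: {image.trackingState} at {image.transform.position}");
    }
}
```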

Plane detection

  • ARPlaneManager
  • access to detected planes

AR Raycast

  • ARRaycastManager.Raycast()
  • allows interaction with the real environment

AR Anchors

  • ARAnchorManager
  • allows objects to remain fixed in space
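A minimal sketch of attaching an object to an anchor on a plane (method and class names are assumptions; `AttachAnchor` is the AR Foundation call for plane-bound anchors):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: anchors an object to a detected plane so it stays fixed in space.
public class AnchorExample : MonoBehaviour
{
    [SerializeField] ARAnchorManager anchorManager; // assign in the Inspector

    public void AnchorObject(GameObject obj, ARPlane plane, Pose pose)
    {
        // Creates an anchor tied to the plane at the given pose
        ARAnchor anchor = anchorManager.AttachAnchor(plane, pose);
        if (anchor != null)
            obj.transform.SetParent(anchor.transform, worldPositionStays: false);
    }
}
```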

Light estimation

  • ARCameraManager
  • adapts virtual lighting to the real scene
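For example, a directional light can roughly follow the estimated ambient brightness (a sketch; it assumes Light Estimation is enabled on the AR Camera Manager component, and the class name is made up):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Sketch: drives a scene light from ARCore light estimation.
public class LightEstimator : MonoBehaviour
{
    [SerializeField] ARCameraManager cameraManager; // assign in the Inspector
    [SerializeField] Light sceneLight;

    void OnEnable()  => cameraManager.frameReceived += OnFrame;
    void OnDisable() => cameraManager.frameReceived -= OnFrame;

    void OnFrame(ARCameraFrameEventArgs args)
    {
        // averageBrightness is null unless light estimation is enabled
        if (args.lightEstimation.averageBrightness.HasValue)
            sceneLight.intensity = args.lightEstimation.averageBrightness.Value;
    }
}
```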

Face / body tracking (depending on device)

  • Face Tracking
  • Body Tracking

Official documentation:

https://docs.unity3d.com/Packages/com.unity.xr.arfoundation

Part 5 — Scenario design

You now have all the technical elements needed to create a complete AR experience.

Invent a small application or mini-game using:

  • one or several markers
  • a zone
  • at least one interaction.

Possible extensions

  • multiple markers
  • interactions between markers
  • visual effects
  • AR user interface

Deliverables

  1. Report (PDF) including:
  • description of the application
  • screenshots
  • technical explanations
  • Git link to the Unity project

Grading

  • Project setup — 2 pts
  • Image tracking — 2 pts
  • Plane detection and placement — 4 pts
  • Interactions — 4 pts
  • Scenario design — 5 pts
  • Technical quality — 2 pts
  • Submission — 1 pt