Integrating AR Foundation (Unity) into a Mobile Application
AR Foundation is Unity's cross-platform abstraction over ARKit and ARCore: one codebase, two builds. That does not make the feature set identical, though. LiDAR occlusion works only on iOS Pro devices, while the ML-based Depth API is an Android exclusive for chips without a ToF sensor. Write AR Foundation code without understanding these differences and you get runtime exceptions on half of your devices.
How AR Foundation Differs from Direct ARKit/ARCore Integration
AR Foundation 6.x (Unity 2023 LTS) covers all the key features of both platforms: plane detection, image tracking, face tracking, mesh classification, and point clouds. The implementations under the hood differ, though, and some features work only if the platform backend supports them:
[SerializeField] AROcclusionManager occlusionManager;

void Start()
{
    if (occlusionManager.descriptor?.supportsEnvironmentDepthImage == true)
    {
        occlusionManager.requestedEnvironmentDepthMode = EnvironmentDepthMode.Fastest;
        occlusionManager.requestedOcclusionPreferenceMode = OcclusionPreferenceMode.PreferEnvironmentOcclusion;
    }
    else
    {
        // Fallback: disable occlusion or show a limited mode
        occlusionManager.enabled = false;
    }
}
Without the descriptor check, you get a NotSupportedException on a Pixel 5, which has no ToF sensor.
Scene Architecture
AR Foundation is built on components tied to ARSession and XROrigin:
- ARPlaneManager: plane detection and tracking (see the sketch after this list)
- ARRaycastManager: shooting rays at tracked geometry
- ARTrackedImageManager: marker tracking
- ARAnchorManager: anchor lifecycle management
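Subscribing to plane events follows the same pattern as the image-tracking example later in this section. A minimal sketch, assuming the AR Foundation 6 trackablesChanged event (planeManager and OnPlanesChanged are illustrative names):
[SerializeField] ARPlaneManager planeManager;

void OnEnable() => planeManager.trackablesChanged.AddListener(OnPlanesChanged);
void OnDisable() => planeManager.trackablesChanged.RemoveListener(OnPlanesChanged);

void OnPlanesChanged(ARTrackablesChangedEventArgs<ARPlane> args)
{
    foreach (var plane in args.added)
    {
        // React to newly detected planes, e.g. visualize floors vs. walls
        Debug.Log($"Plane {plane.trackableId} alignment: {plane.alignment}");
    }
}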
Correct object placement on tap via ARRaycastManager:
[SerializeField] ARRaycastManager raycastManager;
[SerializeField] ARAnchorManager anchorManager;
[SerializeField] GameObject prefabToPlace;

private List<ARRaycastHit> hits = new List<ARRaycastHit>();

void Update()
{
    if (Input.touchCount == 0) return;
    var touch = Input.GetTouch(0);
    if (touch.phase != TouchPhase.Began) return;

    if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
    {
        var hitPose = hits[0].pose;
        var anchor = anchorManager.AttachAnchor(
            hits[0].trackable as ARPlane,
            hitPose
        );
        // AttachAnchor can return null if the anchor could not be created
        if (anchor != null)
            Instantiate(prefabToPlace, anchor.transform);
    }
}
AttachAnchor binds the object to a specific plane: if the plane updates its geometry (ARKit constantly refines plane shapes), the object stays on it. Without anchor binding, the object drifts.
Image Tracking: Reference Library
Images are compiled into an XRReferenceImageLibrary via the Unity Editor. Limitations: ARKit allows up to 100 images per library and tracks up to 4 of them simultaneously. ARCore allows up to 1,000 images per database, but simultaneous tracking caps out at 20.
[SerializeField] ARTrackedImageManager trackedImageManager;

void OnEnable() => trackedImageManager.trackablesChanged.AddListener(OnImageChanged);
void OnDisable() => trackedImageManager.trackablesChanged.RemoveListener(OnImageChanged);

void OnImageChanged(ARTrackablesChangedEventArgs<ARTrackedImage> args)
{
    foreach (var added in args.added)
    {
        SpawnContent(added.referenceImage.name, added.transform);
    }
    foreach (var updated in args.updated)
    {
        // TrackingState.Tracking / Limited / None
        SetVisible(updated.referenceImage.name, updated.trackingState == TrackingState.Tracking);
    }
}
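SpawnContent and SetVisible above are not AR Foundation APIs; a minimal sketch of these helpers, assuming content prefabs are matched to reference images by name:
// Hypothetical helpers assumed by the handler above
[SerializeField] GameObject[] contentPrefabs; // prefab names must match reference image names

private readonly Dictionary<string, GameObject> spawned = new Dictionary<string, GameObject>();

void SpawnContent(string imageName, Transform imageTransform)
{
    if (spawned.ContainsKey(imageName)) return;
    var prefab = System.Array.Find(contentPrefabs, p => p.name == imageName);
    if (prefab == null) return;
    // Parent to the tracked image so the content follows its pose updates
    spawned[imageName] = Instantiate(prefab, imageTransform);
}

void SetVisible(string imageName, bool visible)
{
    if (spawned.TryGetValue(imageName, out var go))
        go.SetActive(visible);
}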
Performance: Unity + AR = Handle with Care
Unity is not the lightest environment for AR. Typical issues:
Garbage collector pauses. The GC in .NET/Mono stops the main thread. In a 60 FPS AR session the frame budget is roughly 16 ms, so a pause of that length means a missed frame and visible object jitter. Solution: use List<T> with pre-allocated capacity (and pass it into Raycast, as in the placement example above), and avoid new inside Update(). Object pooling, sketched below, keeps Instantiate/Destroy churn out of the session.
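A minimal pooling sketch (PooledSpawner is a hypothetical helper, not an AR Foundation type):
using System.Collections.Generic;
using UnityEngine;

// Hypothetical pooling helper: reuses instances instead of Instantiate/Destroy,
// so no managed garbage accumulates mid-session.
public class PooledSpawner : MonoBehaviour
{
    [SerializeField] GameObject prefab;
    [SerializeField] int capacity = 16;

    private readonly Stack<GameObject> pool = new Stack<GameObject>();

    void Awake()
    {
        // Pay the allocation cost once, up front, not during tracking
        for (int i = 0; i < capacity; i++)
        {
            var go = Instantiate(prefab);
            go.SetActive(false);
            pool.Push(go);
        }
    }

    public GameObject Take(Pose pose)
    {
        var go = pool.Count > 0 ? pool.Pop() : Instantiate(prefab);
        go.transform.SetPositionAndRotation(pose.position, pose.rotation);
        go.SetActive(true);
        return go;
    }

    public void Return(GameObject go)
    {
        go.SetActive(false);
        pool.Push(go);
    }
}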
Draw call overhead. Each AR object without batching is a separate draw call. Enable GPU Instancing in the material, plus static batching where objects don't move. On Android, use the Vulkan backend instead of OpenGL ES for lower CPU overhead.
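GPU Instancing is normally a checkbox on the material asset, but support can be verified and the flag toggled at runtime; a short sketch using Unity's SystemInfo and Material APIs:
[SerializeField] Material arObjectMaterial;

void Awake()
{
    // Instancing is usually enabled on the material asset in the Editor;
    // this runtime toggle is mostly a defensive double-check.
    if (SystemInfo.supportsInstancing)
    {
        arObjectMaterial.enableInstancing = true;
    }
}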
Texture compression. ASTC for iOS, ETC2 for Android. In an AR Foundation project targeting both, set an Override for each platform in the Texture Import Settings. ASTC on Android requires GPU support, which every ARCore-compatible device has.
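Setting the overrides by hand for every texture is error-prone; a minimal editor-side sketch using an AssetPostprocessor (the formats and sizes here are illustrative defaults):
// Editor-only: applies per-platform compression overrides on texture import.
// Place this in an Editor/ folder.
using UnityEditor;

public class ARTexturePostprocessor : AssetPostprocessor
{
    void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;

        importer.SetPlatformTextureSettings(new TextureImporterPlatformSettings
        {
            name = "iPhone", // platform key for iOS
            overridden = true,
            format = TextureImporterFormat.ASTC_6x6,
            maxTextureSize = 2048
        });

        importer.SetPlatformTextureSettings(new TextureImporterPlatformSettings
        {
            name = "Android",
            overridden = true,
            format = TextureImporterFormat.ETC2_RGBA8,
            maxTextureSize = 2048
        });
    }
}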
Build and Publishing
AR Foundation requires platform-specific Player Settings:
- iOS: Camera Usage Description is mandatory (App Store review requires it), plus the ARKit capability
- Android: the CAMERA permission in AndroidManifest and com.google.ar.core in the dependencies
ARCore checks for Google Play Services for AR at startup, so you must handle the case where AR is unavailable. Without that handling: a silent crash on incompatible devices.
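In AR Foundation the documented availability check is ARSession.CheckAvailability(); a minimal sketch (arSession and ShowFallbackUI are illustrative):
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class ARSupportCheck : MonoBehaviour
{
    [SerializeField] ARSession arSession;

    IEnumerator Start()
    {
        // Query ARCore/ARKit availability before enabling the session
        yield return ARSession.CheckAvailability();

        if (ARSession.state == ARSessionState.NeedsInstall)
        {
            // Triggers the Google Play Services for AR install flow on Android
            yield return ARSession.Install();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            ShowFallbackUI(); // hypothetical: show a non-AR mode instead of crashing
        }
        else
        {
            arSession.enabled = true;
        }
    }

    void ShowFallbackUI() { /* hypothetical non-AR fallback */ }
}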
Timeline
Basic AR Foundation integration with plane detection and object placement (iOS + Android): from 5 days. Marker-based AR with an image library and custom animations: 1–2 weeks. A complete solution with occlusion, face tracking, and publishing to both platforms: 3–6 weeks.