Integrating AR Advertising Formats in Mobile Applications
AR advertising is not a banner with a 3D animation. It is product try-on, object placement in the user's space, or a branded face filter. Each format requires its own tech stack and its own integration work.
Three Main Formats and Implementation
Place-in-Room: AR Object in User Interior
The user points the camera at the floor or a table, and sneakers, a sofa, or a TV appear in the room. This is the most common e-commerce format.
Stack: ARKit plus RealityKit or SceneKit. Models ship in USDZ (iOS) or GLB (Android with Sceneform or model-viewer). Integration options with advertising SDKs:
- Meta Audience Network supports AR ad units via the Spark AR platform; prebuilt effects embed through WebAR or the SDK
- Google AR ads via Google Web Designer plus model-viewer on WebAR
- A custom implementation on ARKit/ARCore without an ads SDK: full control, but no built-in analytics
The key constraint is initialization time. An AR ad unit must go from tap to live AR scene in 2–3 seconds. Plane detection without LiDAR takes 3–6 seconds, and users leave. The solution: show an animated "Point at a flat surface" overlay and start plane detection early, as soon as the banner loads.
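One way to sketch this early-start approach, assuming a RealityKit ARView and the system coaching overlay (the helper function name is hypothetical):

```swift
import ARKit
import RealityKit

// Hypothetical helper: call this when the ad banner loads, not when the
// user taps it, so plane detection has a head start.
func preloadARSession(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)

    // ARCoachingOverlayView shows the system "point at a flat surface" hint
    // until a plane is found, then hides itself.
    let coaching = ARCoachingOverlayView()
    coaching.session = arView.session
    coaching.goal = .horizontalPlane
    coaching.activatesAutomatically = true
    coaching.frame = arView.bounds
    coaching.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(coaching)
}
```

By the time the user taps through to the AR scene, detected planes are often already available and placement is near-instant.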
Face AR: Filters and Try-On
Glasses, hats, makeup, masks. On iOS, face tracking runs via ARFaceTrackingConfiguration:
```swift
let config = ARFaceTrackingConfiguration()
config.isLightEstimationEnabled = true
arView.session.run(config)
```
ARFaceAnchor provides 52 blend shape coefficients: facial expressions, blinking, mouth movement. Glasses are overlaid using ARFaceAnchor.transform, taking face geometry into account so the frame sits correctly on the nose bridge.
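A minimal sketch of anchoring a glasses model to the tracked face, assuming an ARSCNView and a preloaded glasses node from the ad creative (the position offsets are illustrative, not a fitted bridge alignment):

```swift
import ARKit
import SceneKit

// Sketch: the delegate attaches the glasses node to the node that ARKit
// keeps aligned with ARFaceAnchor.transform.
final class GlassesRenderer: NSObject, ARSCNViewDelegate {
    let glassesNode: SCNNode  // loaded from the ad creative, e.g. a .usdz asset

    init(glassesNode: SCNNode) {
        self.glassesNode = glassesNode
    }

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARFaceAnchor else { return }
        // Offset from the face origin toward the nose bridge; tune per model.
        glassesNode.position = SCNVector3(0, 0.025, 0.06)
        node.addChildNode(glassesNode)
    }
}
```

For a tighter fit, the offset can be derived from ARFaceGeometry vertices rather than hardcoded.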
The catch with face tracking: it works only on devices with a TrueDepth camera (iPhone X and later). Without TrueDepth you need the Vision framework with face landmark detection; accuracy is worse, but device coverage is higher.
Marker-based AR: Advertising Activation via Physical Object
The user points the camera at a product package or printed banner, and a 3D animation or video appears. Use ARImageTrackingConfiguration with ARReferenceImage:
```swift
let config = ARImageTrackingConfiguration()
// referenceImages(inGroupNamed:bundle:) returns an optional set
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AdMarkers", bundle: nil) {
    config.trackingImages = referenceImages
    config.maximumNumberOfTrackedImages = 3
}
arView.session.run(config)
```
The marker image must be highly distinctive (histogramContrastScore > 0.85 by ARKit's internal metric). Packaging with large monochromatic areas (white boxes, minimalist designs) tracks poorly; you need textured markers with strong contrast.
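Once a marker is detected, content attaches to the resulting ARImageAnchor. A sketch using an ARSCNViewDelegate, assuming the ad creative is rendered onto a plane sized to the printed marker (the material wiring is left as a comment):

```swift
import ARKit
import SceneKit

func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let imageAnchor = anchor as? ARImageAnchor else { return }
    // physicalSize is the real-world size declared for the reference image,
    // so the overlay matches the printed marker exactly.
    let size = imageAnchor.referenceImage.physicalSize
    let plane = SCNPlane(width: size.width, height: size.height)
    // plane.firstMaterial?.diffuse.contents = adVideoPlayer  // video or 3D content
    let planeNode = SCNNode(geometry: plane)
    planeNode.eulerAngles.x = -.pi / 2  // SCNPlane stands vertical by default
    node.addChildNode(planeNode)
}
```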
AR Advertising Analytics
Standard ad analytics (impressions, clicks) are supplemented with AR-specific metrics:
- Engagement time: how many seconds the user kept the AR object on screen
- Interaction rate: share of users who rotated or resized the object
- Session completion: share of users who reached the CTA ("Buy", "Learn More" button)
Firebase Analytics with custom events is the standard approach: send ar_session_started, ar_object_placed, ar_interaction, and ar_cta_tapped events with parameters (ad_id, placement_id, duration).
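A sketch of one such event using the Firebase Analytics API; the event and parameter names follow the list above, while the wrapper function itself is hypothetical:

```swift
import FirebaseAnalytics

// Hypothetical wrapper: log one AR interaction with the ad attribution
// parameters described in the text.
func logARInteraction(adID: String, placementID: String, duration: TimeInterval) {
    Analytics.logEvent("ar_interaction", parameters: [
        "ad_id": adID,
        "placement_id": placementID,
        "duration": duration
    ])
}
```

Keeping ad_id and placement_id on every event lets you join AR engagement back to the ad server's impression logs.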
What's Truly Critical for AR Advertising
AR asset sizes. A USDZ model for place-in-room should not exceed 10–15 MB, or the user leaves during loading. Optimizations: texture compression via TextureConverter, LOD with three detail levels, and baked lighting instead of real-time PBR.
Fallback for weak devices. Older hardware like the iPhone 7 handles ARKit plane detection poorly. Check ARWorldTrackingConfiguration.isSupported and degrade to a 2D/3D spinning object; that is still better than a white screen.
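The capability check can be sketched as follows; the two presentation functions are hypothetical stand-ins for the real flows:

```swift
import ARKit

// Stand-ins for the actual presentation flows (hypothetical).
func startARPlacement() { /* run the full place-in-room AR session */ }
func showSpinningObjectFallback() { /* SceneKit turntable or 2D carousel, no AR */ }

// Pick the richest experience the device supports before loading AR assets.
func presentAdExperience() {
    if ARWorldTrackingConfiguration.isSupported {
        startARPlacement()
    } else {
        showSpinningObjectFallback()
    }
}
```

Running the check before asset download also avoids fetching a 15 MB USDZ on a device that will never render it.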
Timeline
| Format | Timeline |
|---|---|
| Place-in-room (iOS + Android) | 3–5 weeks |
| Face filter with branding | 2–4 weeks |
| Marker-based AR | 1–2 weeks |
| Full advertising AR SDK with analytics | 8–12 weeks |
Cost is estimated individually. It is important to pin down target devices and required ad formats before work starts.