Retail AR Mobile Application Development
A customer stands in front of a furniture store shelf of sofas, unsure whether that sofa will fit in their living room between the window and the TV stand. This is where AR stops being a toy and becomes a sales tool. Implemented correctly, furniture placement via ARKit or ARCore converts hesitation into a purchase, as the IKEA Place and Wayfair case studies document. Implemented poorly, it drags the app rating down to 2.1 and fills reviews with "the model floats across the floor".
Where ARKit/ARCore Fails in Retail Scenarios
Horizontal plane detection on atypical surfaces. ARKit works well on textured parquet and fails completely on solid white laminate or plain carpeting: plane detection (planeDetection = .horizontal) returns anchors with high uncertainty, and the model drifts or "sinks" below the floor. Solution: LiDAR on iPhone 12 Pro and newer via ARWorldTrackingConfiguration with sceneReconstruction = .meshWithClassification. On non-LiDAR devices, add manual placement with a drag gesture and a visual surface indicator.
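A minimal configuration sketch of the LiDAR fallback described above; the helper name is illustrative, and the capability check is the documented ARKit API:

```swift
import ARKit

// Sketch: world tracking with LiDAR mesh reconstruction when the device
// supports it, plain horizontal plane detection otherwise.
func makeRetailARConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    config.environmentTexturing = .automatic // better PBR reflections

    // Only LiDAR devices (iPhone 12 Pro and newer) pass this check.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    return config
}

// Usage: arView.session.run(makeRetailARConfiguration())
```

On non-LiDAR devices the same configuration runs without mesh reconstruction, which is where the manual drag-to-place gesture takes over.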
Scale and real-world dimension matching. 3D models for AR must have correct 1:1 scale in meters. Often designers provide USDZ or GLB without size metadata—a 2.4 m sofa displays as a stool. Add SCNNode with explicit simdScale based on product dimensions from catalog metadata.
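When the asset arrives without size metadata, the fix is a uniform scale factor computed from the catalog dimension and the model's measured bounding box. A sketch with hypothetical names (`catalogWidthMeters` comes from backend metadata, `modelExtentMeters` from the loaded asset's bounds):

```swift
import Foundation

// Sketch: uniform scale so the model's longest side matches the product's
// real-world size from catalog metadata. Both parameters are in meters.
func uniformScale(catalogWidthMeters: Float, modelExtentMeters: Float) -> Float {
    guard modelExtentMeters > 0 else { return 1.0 } // degenerate model: leave as-is
    return catalogWidthMeters / modelExtentMeters
}

// Applied to all three axes in the scene, e.g. with SceneKit:
//   let s = uniformScale(catalogWidthMeters: 2.4, modelExtentMeters: 1.0)
//   node.simdScale = SIMD3<Float>(repeating: s)
```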
Lighting. AREnvironmentProbeAnchor and ARLightEstimate provide a baseline estimate, but overexposure in retail halls with cold fluorescent lighting makes dark sofas look "plastic". Set environmentTexturing = .automatic (available since ARKit 2) to generate environment maps automatically; it noticeably improves PBR materials without manual tuning.
Separate pain point—iPhone X users (no LiDAR, no second-gen Neural Engine): A11 Bionic handles it, but loading heavy 15+ MB USDZ models drops FPS below 30. Geometry optimization via Reality Composer Pro or Blender + USDZ Tools is mandatory.
Building Retail AR
Stack. iOS—ARKit 6 + RealityKit 2 (preferred) or SceneKit for legacy support. Android—ARCore 1.40 + Scene Viewer or custom render via Filament. Cross-platform—Flutter with ar_flutter_plugin or React Native + react-native-arkit/ViroReact, but native solutions deliver noticeably smoother tracking.
3D content pipeline. Receive source files from supplier (usually OBJ or FBX), convert to USDZ (iOS) and glTF 2.0 (Android) via Reality Converter and Blender 4.x. Automate via scripts—500+ SKU catalogs cannot be processed manually. Simultaneously set up CDN delivery with device caching via URLCache or custom disk cache, avoiding 20 MB loads on each product view.
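For the disk cache, one simple approach is a deterministic on-disk path per SKU, so a repeat product view hits local storage instead of the CDN. A sketch with illustrative names, not a specific library API:

```swift
import Foundation

// Sketch: deterministic cache location for a downloaded 3D model, keyed by
// SKU and file extension ("usdz" on iOS, "glb" on Android via glTF).
func cachedModelURL(sku: String, fileExtension: String, cacheRoot: URL) -> URL {
    // Sanitize the SKU so it is safe as a filename.
    let safe = sku.map { $0.isLetter || $0.isNumber ? $0 : "_" }
    return cacheRoot
        .appendingPathComponent("models", isDirectory: true)
        .appendingPathComponent(String(safe))
        .appendingPathExtension(fileExtension)
}
```

Before starting a download, check `FileManager.default.fileExists(atPath:)` on this URL; combined with URLCache for the HTTP layer, this avoids the 20 MB re-download on every product view.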
Catalog integration. The AR feature is embedded in the existing app as a separate module. The catalog passes a product ID → the backend returns the 3D model URL and dimensions → the AR view controller loads and places the model. Important: handle the fallback case. If no 3D model is available, show the standard product screen instead of crashing.
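The fallback decision can be kept as pure, testable logic, separate from the AR view itself. A sketch under assumed names (in the app, `arSupported` would come from a capability check such as ARWorldTrackingConfiguration.isSupported):

```swift
import Foundation

// Sketch: the AR path is taken only when the backend returned a model URL
// AND the device supports tracking; every other combination falls back to
// the standard product screen rather than failing.
enum ProductScreen: Equatable {
    case arPlacement(modelURL: URL)
    case standard
}

func screenFor(modelURL: URL?, arSupported: Bool) -> ProductScreen {
    guard arSupported, let url = modelURL else { return .standard }
    return .arPlacement(modelURL: url)
}
```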
Product try-on for clothing and accessories. A different story: it requires face/body tracking. ARKit provides ARFaceTrackingConfiguration (front camera only, iPhone X and newer); full-body clothing try-on needs ARBodyTrackingConfiguration (A12 and newer). Algorithm: detect the skeleton, then overlay a 3D clothing mesh with morph targets at key skeleton joints.
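Mode selection can again be pure logic. In the app, the two flags would come from ARBodyTrackingConfiguration.isSupported and ARFaceTrackingConfiguration.isSupported; the function and enum below are illustrative:

```swift
import Foundation

// Sketch: pick a try-on mode from device capabilities, preferring full-body
// tracking (clothing) over face-only tracking (glasses, hats, earrings),
// with a non-AR fallback for unsupported devices.
enum TryOnMode: Equatable {
    case fullBody     // clothing: skeleton + 3D mesh with morph targets
    case faceOnly     // accessories on the front camera
    case unavailable  // show photos / size guide instead
}

func tryOnMode(bodyTrackingSupported: Bool, faceTrackingSupported: Bool) -> TryOnMode {
    if bodyTrackingSupported { return .fullBody }
    if faceTrackingSupported { return .faceOnly }
    return .unavailable
}
```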
Workflow
- Existing app audit—iOS/Android versions, supported devices, current 3D content pipeline or lack thereof.
- AR UX design—use cases, AR screen wireframes, gesture definitions (place, rotate, scale, snap-to-grid).
- AR module development—native or cross-platform, catalog integration.
- 3D content optimization—first batch of models for pilot.
- QA on real devices—tracking testing in varied lighting conditions, different surfaces.
- Publication and support—updates for new ARKit/ARCore versions.
Timeline Estimates
Minimal AR feature (placement for single product category) in existing app: 3–5 weeks. Full AR module with catalog support, CDN, analytics, and Android version: 2–4 months. Costs calculated individually after requirements audit and 3D content volume.