Implementing AR Virtual Try-On (Product Virtual Fitting Room)
Virtual try-on is not just "overlay a 3D object on the body". A credible fit requires accurate body pose tracking, correct model scaling to the user's measurements, and physically plausible lighting interaction. Without all three, the try-on looks like a child's collage, and it doesn't convert.
Stack for Different Product Categories
"Try-on" is an umbrella term that in practice splits into technically distinct tasks:
| Category | Technology | Complexity |
|---|---|---|
| Glasses, jewelry, hats | Face Tracking (ARKit/ARCore) | Medium |
| Clothes, shirts | Body Tracking + mesh deformation | High |
| Shoes | Foot tracking / AR ground plane | Medium |
| Bags, hand accessories | Hand/Wrist Tracking | Medium |
| Large items (furniture) | Plane Detection + 3D placement | Low |
Clothing is the hardest category. Without correct mesh deformation driven by body pose, a garment looks like a cardboard cutout pasted over the person.
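The routing in the table can be expressed as a small dispatch map that selects a tracking pipeline per product category at session start. Category and pipeline names here are illustrative placeholders, not tied to any particular SDK:

```python
# Map product categories to tracking pipelines (names are illustrative).
PIPELINES = {
    "glasses":   "face_tracking",
    "jewelry":   "face_tracking",
    "clothes":   "body_tracking_skinned_mesh",
    "shoes":     "foot_tracking",
    "bags":      "wrist_tracking",
    "furniture": "plane_detection",
}

def pipeline_for(category):
    """Fail loudly for unmapped categories instead of guessing a tracker."""
    try:
        return PIPELINES[category]
    except KeyError:
        raise ValueError(f"no try-on pipeline for category: {category!r}")

print(pipeline_for("clothes"))  # → body_tracking_skinned_mesh
```

Failing fast on an unknown category is deliberate: silently falling back to, say, plane detection produces a try-on that is worse than no try-on.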
Clothes: Body Tracking and Deformation
iOS: ARBodyTrackingConfiguration (A12+ chip, iOS 13+) provides a 91-joint skeleton in world coordinates; ARSkeleton3D exposes jointModelTransforms, one matrix per joint. Over this skeleton we drive a skinned clothing mesh: each mesh vertex is bound to up to four joints with skinning weights, and when a joint moves, the vertices follow in proportion to those weights.
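The weight-driven vertex motion described above is linear blend skinning. A minimal pure-Python sketch of the math (in production this runs in a GPU shader; ARKit's jointModelTransforms are simd_float4x4, modeled here as nested lists):

```python
# Minimal linear blend skinning (LBS) sketch: each vertex follows up to
# four joints, weighted by its skinning weights.

def mat_vec(m, v):
    """Multiply a 4x4 matrix (row-major nested lists) by a 4-vector."""
    return [sum(m[r][c] * v[c] for c in range(4)) for r in range(4)]

def skin_vertex(rest_pos, influences, joint_transforms):
    """
    rest_pos: vertex position in bind pose, homogeneous [x, y, z, 1].
    influences: list of (joint_index, weight); weights sum to 1.
    joint_transforms: per-joint matrices = current_pose * inverse_bind_pose.
    """
    out = [0.0, 0.0, 0.0, 0.0]
    for joint, weight in influences:
        moved = mat_vec(joint_transforms[joint], rest_pos)
        out = [o + weight * m for o, m in zip(out, moved)]
    return out[:3]

# Example: one vertex influenced 70/30 by two joints; joint 1 has moved
# +1 along x relative to its bind pose, joint 0 has not moved.
identity = [[1,0,0,0],[0,1,0,0],[0,0,1,0],[0,0,0,1]]
shifted  = [[1,0,0,1],[0,1,0,0],[0,0,1,0],[0,0,0,1]]
print(skin_vertex([0.0, 0.5, 0.0, 1.0],
                  [(0, 0.7), (1, 0.3)],
                  [identity, shifted]))  # → [0.3, 0.5, 0.0]
```

The vertex ends up 30% of the way toward the moved joint, which is exactly why a badly painted weight map shows up as fabric lagging behind or tearing away from the body.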
The skinned mesh ships as USDZ with skeletal (UsdSkel) skinning data; in RealityKit it loads as a BodyTrackedEntity that follows the body anchor. Preparing the mesh in Blender, rigged against the standard ARKit skeleton, is the critical stage: if the artist is unfamiliar with the ARKit joint hierarchy, the rib cage and spine deform incorrectly.
Android: ARCore provides no body tracking out of the box. The usual options are Google's MediaPipe Pose (built on the BlazePose model, 33 keypoints) or MoveNet from the TensorFlow team. Neither reaches ARKit-level accuracy, but for a marketing try-on it is sufficient. Mesh deformation then happens in a custom OpenGL/Vulkan skinning shader fed by the pose-estimation output.
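A cheaper alternative to full 3D deformation on Android is a flat "sticker" overlay anchored to the pose keypoints. A sketch of deriving such a transform from BlazePose landmarks (indices 11/12 are the left/right shoulders in the 33-landmark topology; `ref_shoulder_px` is a hypothetical asset-authoring parameter):

```python
# Derive a 2D shirt-overlay transform from BlazePose shoulder keypoints.
import math

def shirt_overlay_transform(landmarks, ref_shoulder_px=200.0):
    """
    landmarks: dict {index: (x_px, y_px)} in image pixels.
    ref_shoulder_px: shoulder width the shirt texture was authored for
    (hypothetical asset parameter, not a MediaPipe value).
    Returns (anchor_x, anchor_y, scale, roll_degrees).
    """
    lx, ly = landmarks[11]   # left shoulder
    rx, ry = landmarks[12]   # right shoulder
    width = math.hypot(lx - rx, ly - ry)
    anchor = ((lx + rx) / 2.0, (ly + ry) / 2.0)    # pin to shoulder midpoint
    scale = width / ref_shoulder_px                # match garment to body width
    roll = math.degrees(math.atan2(ly - ry, lx - rx))  # lean of shoulder line
    return anchor[0], anchor[1], scale, roll

x, y, s, r = shirt_overlay_transform({11: (340.0, 220.0), 12: (140.0, 220.0)})
print(x, y, s, r)  # → 240.0 220.0 1.0 0.0
```

This is the level of fidelity the 2D-overlay SDKs discussed below operate at: position, scale, and roll track the body, but the fabric itself never bends.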
Third-Party SDKs as Alternative
For fashion e-commerce teams that don't want to write a deformer from scratch:
Zakeke: SaaS AR try-on for e-commerce, with SDKs for iOS/Android and API integration with the product catalog. Clothes are supported via 2D overlay rather than 3D deformation, which is faster to implement but less realistic.
Snap AR / Lens Studio: try-on through the Snapchat camera, integrated into a native app via the Camera Kit SDK. Ready-made templates for clothes, glasses, and shoes.
Perfect Corp YouCam SDK: specializes in beauty/fashion. Native iOS/Android SDKs with face and body tracking, under an enterprise license.
Scaling for User
Glasses try-on needs no user parameters: face tracking holds them precisely on the nose. Clothes are a different story. The basic approach is standard S/M/L sizes with mesh scaling derived from the shoulder bounding box measured on the tracked skeleton. Precise fitting by chest and waist circumference requires user input or body scanning, which is a separate task with even more complexity.
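The basic S/M/L approach can be sketched as: pick the nearest garment size by the measured shoulder span, then apply the small residual scale to the mesh. The size breakpoints below are made-up placeholders; real values come from the brand's size chart:

```python
# Pick S/M/L plus a residual mesh scale from the shoulder span measured
# on the tracked skeleton (meters, from joint world positions).
SIZES = [("S", 0.38), ("M", 0.42), ("L", 0.46)]  # garment shoulder width, m

def fit_size(shoulder_span_m):
    """Return (size_label, uniform_scale) minimizing leftover scaling."""
    label, ref = min(SIZES, key=lambda s: abs(s[1] - shoulder_span_m))
    return label, shoulder_span_m / ref  # residual scale applied to the mesh

label, scale = fit_size(0.43)
print(label, round(scale, 3))  # → M 1.024
```

Keeping the residual scale close to 1.0 matters: scaling a rigged mesh far from its authored size distorts the proportions the skinning weights were painted for.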
Lighting: Why Try-On Looks "Plastic"
A PBR material with correct roughness/metallic values plus an ARKit environment probe (AREnvironmentProbeAnchor, or automaticEnvironmentTexturing) is the minimum for a realistic appearance. Without an environment map, the glossy leather of a bag and a matte jacket look identical.
Shadows from virtual clothing onto the real body work only with an occluder mesh: an invisible body model that hides geometry behind the person and receives shadows cast onto the real floor and walls. In RealityKit, that's an OcclusionMaterial on the occluder entity.
Timeline and Strategy
Integrating the Perfect Corp or Banuba SDK with a basic effect set (lipstick, eyeshadow, blush) takes 2-3 weeks. UI for catalog selection and save/share flows adds another 1-2 weeks. A custom implementation on MediaPipe (if SDK pricing doesn't fit) starts at 3 months. Cost is quoted individually.