Implementing AR Makeup Virtual Try-On (Makeup Visualization)
Virtual makeup is the most technically demanding face-AR scenario. Lipstick must follow the lip contour precisely, eyeshadow must sit in the eyelid folds, and foundation must blend with the real skin tone. Users instantly notice when a boundary "swims" as the head rotates 20°. That's why most cosmetics brands don't build this themselves — they license a specialized SDK.
Why ARKit/ARCore Are Insufficient for Cosmetics
ARKit's ARFaceAnchor provides a 1,220-vertex face mesh (geometry.vertices) and blendshapes (blendShapes) for expressions. That is sufficient for masks and AR filters. For lipstick it is not: the lip contour in the ARKit mesh is only about 80 points, so rendering produces angular geometry, especially visible on thin lips.
Specialized SDKs work with 400-900 points for the lips alone, and separately track the upper and lower eyelids, the bridge of the nose, and the cheek area.
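One common workaround for a sparse lip contour is to resample it with a spline before building the render mask, so the lipstick edge follows a smooth curve instead of straight segments. A minimal sketch (the contour coordinates are made up for illustration, not real ARKit output):

```python
# Densify a sparse lip contour with Catmull-Rom splines so the lipstick mask
# follows a smooth curve instead of an angular polyline.

def catmull_rom(p0, p1, p2, p3, t):
    """Catmull-Rom interpolation between p1 and p2 at parameter t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return tuple(
        0.5 * ((2 * b) + (-a + c) * t + (2 * a - 5 * b + 4 * c - d) * t2
               + (-a + 3 * b - 3 * c + d) * t3)
        for a, b, c, d in zip(p0, p1, p2, p3)
    )

def densify_closed_contour(points, samples_per_segment=8):
    """Resample a closed polygon so each edge becomes a smooth curve segment."""
    n = len(points)
    out = []
    for i in range(n):
        p0, p1 = points[(i - 1) % n], points[i]
        p2, p3 = points[(i + 1) % n], points[(i + 2) % n]
        for s in range(samples_per_segment):
            out.append(catmull_rom(p0, p1, p2, p3, s / samples_per_segment))
    return out

sparse = [(0.0, 0.0), (1.0, 0.6), (2.0, 0.0), (1.0, -0.8)]  # toy lip outline
dense = densify_closed_contour(sparse)
print(len(sparse), "->", len(dense))  # 4 points become 32
```

This only smooths the curve between tracked points; it cannot recover detail the tracker never saw, which is why the dedicated SDKs track denser lip meshes in the first place.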
Specialized SDKs for Beauty AR
Perfect Corp YouCam SDK — the industry standard. It supports 120+ cosmetic effects out of the box: lipstick, blush, concealer, mascara, eyeshadow, contouring. It uses its own ML face mesh with 478 points. Native SDKs for iOS (Swift/ObjC) and Android (Kotlin/Java). Enterprise license, pricing on request.
ModiFace (owned by L'Oréal) — a comparable level of quality, but it is available via the L'Oréal Group API only and is closed to third-party brands without a partnership.
Banuba Face AR SDK — a flexible alternative with support for custom effects via the Banuba Effect Player, which lets you build effects yourself in an editor. Public license from $149/mo, enterprise pricing on request. Supports iOS, Android, and Web.
Custom implementation via MediaPipe Face Mesh: 468 landmarks, works on iOS and Android via MediaPipe Tasks. Free and open source — but it requires building your own cosmetics renderer, which is 3-5 months of work and still significantly inferior to the specialized SDKs in realism.
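To give a sense of what the custom route involves, here is a sketch of the first step: pulling the lip region out of a 468-landmark face-mesh result to build a lipstick mask. To keep the snippet self-contained, the landmark list is faked with a deterministic grid rather than a real FaceLandmarker result, and the lip index set follows MediaPipe's published topology — treat it as an assumption and verify it against mediapipe's FACEMESH_LIPS before use.

```python
# Sketch: extract the outer lip ring from a MediaPipe-style 468-landmark list
# (normalized x, y in [0, 1]) and convert it to pixel coordinates.
# The index set is assumed from MediaPipe's lip topology -- verify it.

OUTER_LIPS = [61, 146, 91, 181, 84, 17, 314, 405, 321, 375,
              291, 409, 270, 269, 267, 0, 37, 39, 40, 185]

def lip_polygon(landmarks, width, height):
    """Convert normalized (x, y) landmarks to pixel coords for the lip ring."""
    return [(int(landmarks[i][0] * width), int(landmarks[i][1] * height))
            for i in OUTER_LIPS]

# Fake a 468-landmark result: landmark i sits at a deterministic grid position.
fake = [((i % 22) / 22.0, (i // 22) / 22.0) for i in range(468)]
poly = lip_polygon(fake, width=720, height=1280)
print(len(poly), "lip points in pixel space")
```

In a real pipeline this polygon would be rasterized into a mask, feathered, and fed to the renderer — each of those steps is its own chunk of the 3-5 month estimate.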
Key Rendering Techniques
Lipstick. Lip mesh → UV unwrap → a texture with the lipstick color, blended with the original camera image via a multiply/screen blend mode. Opacity controls tone saturation; edges are feathered with a Gaussian blur on the mask.
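The blending step above can be sketched per pixel (in production this runs as a fragment shader over the whole frame; the color values here are illustrative):

```python
# Multiply-blend a lipstick tint into a camera pixel, weighted by the
# feathered lip mask. RGB values are 0..255; mask is 0..1.

def multiply_blend(base, tint):
    """Classic multiply blend: result = base * tint / 255, per channel."""
    return tuple(b * t // 255 for b, t in zip(base, tint))

def apply_lipstick(pixel, tint, mask, opacity=0.8):
    """mask comes from the feathered (Gaussian-blurred) lip mask;
    opacity controls how saturated the lipstick tone appears."""
    blended = multiply_blend(pixel, tint)
    w = mask * opacity
    return tuple(round(p * (1 - w) + b * w) for p, b in zip(pixel, blended))

skin = (210, 160, 150)          # camera pixel on the lips
red = (200, 30, 60)             # lipstick product color
print(apply_lipstick(skin, red, mask=1.0))   # fully inside the lip mask
print(apply_lipstick(skin, red, mask=0.2))   # feathered edge: subtle tint
```

Because multiply darkens, it preserves the lips' natural shading; a screen blend would be used instead for light, glossy shades.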
Eyeshadow. A gradient texture per eyelid that accounts for surface curvature. The main difficulty: when the gaze lowers, the eyelid closes, and the shadow must deform with it. Without correct deformation, the shadow "peels off" the skin.
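One way to keep the shadow glued to the lid is to re-interpolate the shadow mesh every frame between its vertex positions for a fully open and a fully closed eye, driven by an eye-openness coefficient (e.g. an eye-blink blendshape from the tracker). A minimal sketch with illustrative coordinates:

```python
# Deform eyeshadow vertices with the eyelid: linearly interpolate each vertex
# between its "eye open" and "eye closed" position, driven by a per-frame
# eye-closedness coefficient from the face tracker.

def lerp(a, b, t):
    return a + (b - a) * t

def deform_shadow(open_verts, closed_verts, eye_closed):
    """eye_closed in [0, 1]: 0 = eye open, 1 = eyelid fully closed."""
    return [tuple(lerp(o, c, eye_closed) for o, c in zip(ov, cv))
            for ov, cv in zip(open_verts, closed_verts)]

open_pose = [(0.0, 10.0), (5.0, 12.0), (10.0, 10.0)]   # lid crease, eye open
closed_pose = [(0.0, 4.0), (5.0, 3.0), (10.0, 4.0)]    # same vertices, closed
half = deform_shadow(open_pose, closed_pose, eye_closed=0.5)
print(half)
```

Real SDKs use denser meshes and nonlinear skin sliding, but the principle is the same: the shadow geometry is a function of eyelid state, not a static overlay.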
Foundation / Concealer. Not just an overlay — it needs skin-tone matching. The algorithm samples skin pixels outside the application area, determines the undertone (warm/cool/neutral), and adjusts the product color to the real tone. Without this, foundation looks like a mask.
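A simplified version of that matching step: classify the undertone from the red-blue balance of sampled skin pixels and pull the product color toward the measured tone. The threshold and blend weight below are illustrative assumptions, not values from any SDK:

```python
# Skin-tone matching sketch: sample skin pixels outside the application area,
# classify undertone, and shift the foundation color toward the user's tone.

def classify_undertone(skin_pixels, threshold=12):
    """Warm skin shows more red than blue; cool skin the opposite.
    threshold is an illustrative assumption."""
    n = len(skin_pixels)
    avg_r = sum(p[0] for p in skin_pixels) / n
    avg_b = sum(p[2] for p in skin_pixels) / n
    diff = avg_r - avg_b
    if diff > threshold:
        return "warm"
    if diff < -threshold:
        return "cool"
    return "neutral"

def adapt_foundation(product, skin_avg, weight=0.3):
    """Pull the product color partway toward the measured skin tone so the
    result blends instead of sitting on the face like a mask."""
    return tuple(round(p * (1 - weight) + s * weight)
                 for p, s in zip(product, skin_avg))

samples = [(205, 160, 140), (198, 155, 138), (210, 165, 142)]
print(classify_undertone(samples))          # red dominates blue -> warm
avg = tuple(sum(c) / len(samples) for c in zip(*samples))
print(adapt_foundation((220, 190, 170), avg))
```

Production systems do this in a perceptual color space (e.g. Lab) rather than raw RGB, but the structure — measure, classify, adapt — is the same.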
Live Camera vs Photo Mode
For live preview through the camera, 30 fps is the minimum — otherwise lag makes the try-on uncomfortable. Perfect Corp's SDK maintains 30 fps on iPhone X and newer, and 25 fps on Android devices with Snapdragon 845+. On weaker Android devices performance degrades — plan a fallback to photo mode.
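That fallback can be driven by a rolling frame-time window: once sustained FPS drops below a floor, switch the UI from live try-on to photo mode. The class below is an illustrative sketch, not any SDK's API; the 24 fps floor is an assumption in the spirit of the numbers above:

```python
# Rolling FPS monitor: record per-frame render times and signal a fallback to
# photo mode when the sustained average drops below a floor.
from collections import deque

class FpsMonitor:
    def __init__(self, window=30, floor_fps=24.0):
        self.frame_times = deque(maxlen=window)  # last N frame times, in ms
        self.floor_fps = floor_fps

    def record(self, frame_ms):
        self.frame_times.append(frame_ms)

    def should_fallback(self):
        """True once the window is full and average FPS is below the floor."""
        if len(self.frame_times) < self.frame_times.maxlen:
            return False
        avg_ms = sum(self.frame_times) / len(self.frame_times)
        return 1000.0 / avg_ms < self.floor_fps

mon = FpsMonitor()
for _ in range(30):
    mon.record(50.0)            # 50 ms/frame = 20 fps, below the floor
print(mon.should_fallback())    # sustained low FPS -> switch to photo mode
```

Waiting for a full window before deciding avoids flipping modes on a single slow frame (e.g. a GC pause or thermal spike).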
Photo mode: the user selects a photo from the gallery and gets a static makeup render. It is technically simpler (no realtime constraints), and the result is more accurate thanks to the extra processing time.
Timeline and Strategy
Integrating the Perfect Corp or Banuba SDK with a basic effect set (lipstick, eyeshadow, blush) takes 2-3 weeks. UI development for product-catalog selection and saving/sharing adds another 1-2 weeks. A custom implementation on MediaPipe (if SDK pricing doesn't fit) starts at 3 months. Cost is estimated individually.