Implementing AR Furniture Placement in Interiors
IKEA Place brought this scenario to the mass market in 2017, and user expectations have only grown since. "Just show the model" is no longer enough: furniture must sit perfectly level on the detected floor, cast shadows correctly, react realistically to room lighting, and not sink 5 cm into the floor on a non-LiDAR iPhone.
Preparing 3D Furniture Models
This part is often underestimated. A catalog of 500 items, each of which must be a glTF with correct PBR materials, accurate real-world size metadata, and a pivot point sitting exactly on the bottom surface of the object.
Typical problems with models received from the client:
- Pivot point in the center — the table hovers above the floor at half its height
- Scale in centimeters instead of meters — the sofa is the size of the room
- Textures in separate files (not embedded in the GLB) — the model loads without textures
- Y-up vs Z-up mismatch — the table lies on its side
Catalog conversion and normalization is done via the Blender Python API (a batch script) or via the Cesium ion / Sketchfab APIs, depending on catalog size.
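After conversion, a load-time sanity check catches the scale and pivot problems listed above before a broken model reaches the user. A minimal RealityKit sketch, assuming the catalog has already been converted to USDZ for RealityKit and that `CatalogItem` with its `expectedHeight` metadata is a hypothetical type from the app:

```swift
import RealityKit
import Combine

// Hypothetical catalog entry carrying real-world size metadata.
struct CatalogItem {
    let name: String           // asset name in the app bundle
    let expectedHeight: Float  // metres, from catalog metadata
}

func validate(_ item: CatalogItem, completion: @escaping (Bool) -> Void) {
    var cancellable: AnyCancellable?
    cancellable = Entity.loadModelAsync(named: item.name)
        .sink(receiveCompletion: { _ in cancellable?.cancel() },
              receiveValue: { model in
            let bounds = model.visualBounds(relativeTo: nil)
            // An object whose height is ~100x off was probably exported
            // in centimetres instead of metres.
            let heightOK = abs(bounds.extents.y - item.expectedHeight) < 0.1
            // Pivot on the bottom face means the lowest point is at y ≈ 0.
            let pivotOK = abs(bounds.min.y) < 0.01
            completion(heightOK && pivotOK)
        })
}
```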
Object Placement: From Raycast to Stable Position
The standard pipeline was already described in task 514, but furniture has its own specifics: objects are large, and the user wants precise placement — not in the middle of the room, but against a particular wall. This means:
- Detecting horizontal and vertical planes simultaneously
- Snapping to walls — the object "sticks" 15 cm from a vertical plane
- Collision detection between objects — two sofas shouldn't overlap
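Wall snapping reduces to a raycast against vertical planes plus an offset along the wall normal. A sketch under the assumption that the 15 cm standoff is configurable and the object's current height above the floor should be preserved:

```swift
import ARKit
import RealityKit

// Sketch: raycast against a detected vertical plane and keep the object
// 15 cm away from the wall (the offset value is an app-level choice).
func snapToWall(at screenPoint: CGPoint, in arView: ARView, entity: Entity) {
    guard let hit = arView.raycast(from: screenPoint,
                                   allowing: .existingPlaneGeometry,
                                   alignment: .vertical).first else { return }
    // For plane anchors the local +Y axis is the surface normal,
    // which for a wall points horizontally into the room.
    let n = hit.worldTransform.columns.1
    var position = SIMD3<Float>(hit.worldTransform.columns.3.x,
                                entity.position(relativeTo: nil).y, // keep floor height
                                hit.worldTransform.columns.3.z)
    position.x += n.x * 0.15
    position.z += n.z * 0.15
    entity.setPosition(position, relativeTo: nil)
}
```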
Collision detection in RealityKit is done with a CollisionComponent built from ShapeResource.generateBox(size:). Subscribing via ARView.scene.subscribe(to: CollisionEvents.Began.self) delivers the collision event. On intersection, show a red highlight and prohibit placement.
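The pieces above fit together in a few lines. A sketch, where the red-material highlight stands in for whatever visual feedback the app uses (restoring the original materials on CollisionEvents.Ended is left out for brevity):

```swift
import RealityKit
import Combine

var collisionSubscription: Cancellable?

func enableCollision(for model: ModelEntity, in arView: ARView) {
    // Approximate the furniture with a box matching its visual bounds.
    let size = model.visualBounds(relativeTo: model).extents
    model.components.set(CollisionComponent(shapes: [.generateBox(size: size)]))

    // Fires when this model starts overlapping another collision-enabled entity.
    collisionSubscription = arView.scene.subscribe(to: CollisionEvents.Began.self,
                                                   on: model) { _ in
        // Placeholder feedback: tint red while the app blocks placement.
        model.model?.materials = [SimpleMaterial(color: .red, isMetallic: false)]
    }
}
```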
LiDAR: Occlusion and Realistic Space Interaction
On LiDAR devices (iPhone 12 Pro and later Pro models, iPad Pro), enable ARWorldTrackingConfiguration.sceneReconstruction = .meshWithClassification. ARKit then builds a dense mesh of the environment with per-surface classification (wall, floor, ceiling, furniture, door, window).
This gives two benefits for furniture fitting:
Occlusion: real furniture blocks the virtual sofa when the user walks behind it. Enabled via ARView.environment.sceneUnderstanding.options = [.occlusion]. Without LiDAR there is no correct occlusion — the virtual object always renders on top.
Placement without a visible floor: the user wants to place a shelf, but the floor is covered by a rug with a neutral texture that SLAM detects poorly. The LiDAR mesh reconstructs the floor regardless of texture, so placement still works.
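Both settings belong in session setup, guarded so the same code path runs on non-LiDAR devices. A sketch:

```swift
import ARKit
import RealityKit

// Sketch: enable mesh reconstruction and occlusion only where supported.
func configureSceneUnderstanding(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        // Dense environment mesh with per-face classification (wall, floor, ...).
        config.sceneReconstruction = .meshWithClassification
        // Real geometry now hides virtual furniture behind it.
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }
    arView.session.run(config)
}
```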
Interior Lighting
ARWorldTrackingConfiguration.environmentTexturing = .automatic makes ARKit build an HDR environment map from the camera feed. It works, but with a delay: for the first 5-10 seconds the object is lit incorrectly. In a furniture app, where the user looks at the object immediately after placement, this is noticeable.
Improvement: an AREnvironmentProbeAnchor placed manually at the room center. This allows forcing an environment map update on demand (e.g., an "update lighting" button).
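A sketch of the on-demand refresh: note that manual probes require environmentTexturing = .manual on the configuration, and the 2 m probe extent here is a guess for a typical room, not a recommended value.

```swift
import ARKit

// Sketch: replace the environment probe at an assumed room centre,
// forcing ARKit to rebuild the environment map for that volume.
func refreshLighting(session: ARSession, at roomCenter: simd_float4x4) {
    // Drop any previous probe so the map is regenerated from scratch.
    session.currentFrame?.anchors
        .compactMap { $0 as? AREnvironmentProbeAnchor }
        .forEach { session.remove(anchor: $0) }
    session.add(anchor: AREnvironmentProbeAnchor(transform: roomCenter,
                                                 extent: SIMD3<Float>(2, 2, 2)))
}
```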
Multi-Object Placement and Scene Saving
The user places several items, then wants to save the result and return to it later. ARSession.getCurrentWorldMap(completionHandler:) captures the ARWorldMap state, including anchors, which can be serialized to Data. On the next launch, set ARWorldTrackingConfiguration.initialWorldMap = savedMap, and ARKit relocalizes and restores the object positions.
This works only in the same room with sufficient lighting. Relocalization takes 3-15 seconds.
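The save/restore round trip in sketch form; the file name is a placeholder, and error handling is collapsed to `try?` for brevity:

```swift
import ARKit
import Foundation

// Placeholder location for the persisted map.
let mapURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    .appendingPathComponent("room.worldmap")

func saveScene(session: ARSession) {
    session.getCurrentWorldMap { map, _ in
        guard let map = map,
              let data = try? NSKeyedArchiver.archivedData(withRootObject: map,
                                                           requiringSecureCoding: true)
        else { return }
        try? data.write(to: mapURL)
    }
}

func restoreScene(session: ARSession) {
    guard let data = try? Data(contentsOf: mapURL),
          let map = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                            from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map
    // .resetTracking forces relocalisation against the saved map.
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```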
For a shareable screenshot of the result: ARView.snapshot(saveToHDR:completion:) plus UIActivityViewController.
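This pair is only a few lines; a sketch, assuming a presenting view controller is available:

```swift
import RealityKit
import UIKit

// Sketch: capture the AR view and hand the image to the share sheet.
func shareSnapshot(of arView: ARView, from viewController: UIViewController) {
    arView.snapshot(saveToHDR: false) { image in
        guard let image = image else { return }
        let activity = UIActivityViewController(activityItems: [image],
                                                applicationActivities: nil)
        viewController.present(activity, animated: true)
    }
}
```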
Timeline
Basic single-object placement with plane detection takes 5-7 days. Multi-object placement with collisions, wall snapping, and scene saving — 3-5 weeks. LiDAR occlusion support adds about a week. Model catalog conversion is estimated separately by volume. Cost is calculated individually.