Implementing AR Placement of 3D Objects on a Plane
Placing a 3D model on a detected plane is not an evening's work if production stability matters. The object should stay level as the camera moves, not "swim" when the lighting changes, scale sensibly, and not sink 5 centimeters into the surface. Each of these is a separate technical problem.
3D Model Formats and Optimization for AR
The mobile AR standards are USDZ (iOS/RealityKit) and GLB/glTF (ARCore/cross-platform). A typical mistake is taking an unoptimized model from Blender or 3ds Max and trying to load it in AR. A 500k-triangle mesh with 4096×4096 textures and no mip-mapping guarantees an FPS collapse on an iPhone 12 or older.
Target parameters for mobile AR:
| Object Type | Polygons | Textures | GLB Size |
|---|---|---|---|
| Small item (chair, lamp) | up to 30k | 1024×1024 | up to 5 MB |
| Medium object (sofa, table) | up to 80k | 2048×2048 | up to 15 MB |
| Large (furniture set, kitchen) | up to 200k | 2048×2048 | up to 40 MB |
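The budgets above can be encoded as a pre-flight check in an asset pipeline. A small sketch, with the type and thresholds taken from the table; this is a hypothetical helper for a custom pipeline, not part of any AR SDK:

```swift
import Foundation

/// Hypothetical asset budget check mirroring the table above.
/// Not an ARKit/ARCore API; intended for a custom asset pipeline.
struct ARAssetBudget {
    let maxTriangles: Int
    let maxTextureSide: Int   // px, square textures assumed
    let maxFileBytes: Int

    static let small  = ARAssetBudget(maxTriangles: 30_000,  maxTextureSide: 1024, maxFileBytes: 5_000_000)
    static let medium = ARAssetBudget(maxTriangles: 80_000,  maxTextureSide: 2048, maxFileBytes: 15_000_000)
    static let large  = ARAssetBudget(maxTriangles: 200_000, maxTextureSide: 2048, maxFileBytes: 40_000_000)

    /// Returns true when the model stays within every budget limit.
    func fits(triangles: Int, textureSide: Int, fileBytes: Int) -> Bool {
        triangles <= maxTriangles
            && textureSide <= maxTextureSide
            && fileBytes <= maxFileBytes
    }
}
```

Running such a check at import time, before the model ever ships to a device, catches the "500k triangles from Blender" case described above.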
Compression: KTX2 + Basis Universal texture compression for GLB, and HEIC textures inside USDZ. On the RealityKit side, Reality Composer Pro (Xcode 15+) can bake PBR materials straight into the .reality format.
Anchor, Raycast, and Why hitTest is Outdated
In ARKit 4+ and ARCore 1.18+, the preferred way to determine a placement point is a raycast, not the deprecated hitTest. The difference is substantial:
`ARView.raycast(from:allowing:alignment:)` (or an `ARRaycastQuery` passed to `ARSession.raycast(_:)`) returns a list of `ARRaycastResult` for a target of `.estimatedPlane` or `.existingPlaneGeometry`. `.existingPlaneGeometry` is more accurate because it uses the geometry of an already-detected plane; `.estimatedPlane` works where a plane has not yet been locked in.
```swift
guard let query = arView.makeRaycastQuery(
    from: arView.center,
    allowing: .existingPlaneGeometry,
    alignment: .horizontal
) else { return }

let results = arView.session.raycast(query)
if let first = results.first {
    placeEntity(at: first.worldTransform)
}
```
For ARCore, `Frame.hitTest()` is still the documented entry point; filtering its results to `Plane` trackables and checking `Plane.isPoseInPolygon()` gives more stable results at plane edges than accepting every hit.
Stabilizing Object Position
After the initial placement, the object shouldn't "walk" when the camera moves. The standard approach is anchoring through an `AnchorEntity`:
```swift
let anchor = ARAnchor(transform: worldTransform)
arView.session.add(anchor: anchor)

let anchorEntity = AnchorEntity(anchor: anchor)
anchorEntity.addChild(modelEntity)
arView.scene.addAnchor(anchorEntity)
```
Without an explicit `ARAnchor`, the object keeps updating its position with every plane update. This is especially noticeable during the first 10-15 seconds, while ARKit actively refines the plane geometry.
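Even with an anchor, plane refinement can shift the estimated surface under an already-placed object. A small sketch of drift-threshold logic, kept as plain Swift so the math is visible; `DriftMonitor` and the 2 cm threshold are hypothetical, not ARKit API:

```swift
import Foundation

/// Hypothetical helper (not an ARKit API): decides whether a refined
/// plane estimate has drifted far enough from the placed object that
/// re-anchoring (or a short smoothing animation) is worth triggering.
struct DriftMonitor {
    /// Drift below this distance (meters) is ignored as tracking noise.
    let threshold: Float

    /// `placed` and `refined` are world-space positions (x, y, z),
    /// e.g. extracted from transform.columns.3 in real ARKit code.
    func shouldReanchor(placed: (Float, Float, Float),
                        refined: (Float, Float, Float)) -> Bool {
        let dx = placed.0 - refined.0
        let dy = placed.1 - refined.1
        let dz = placed.2 - refined.2
        return (dx * dx + dy * dy + dz * dz).squareRoot() > threshold
    }
}
```

With `DriftMonitor(threshold: 0.02)`, sub-2 cm refinements are absorbed silently, while larger jumps can trigger a deliberate, animated re-anchor instead of a visible snap.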
Shadows and Physical Lighting
An AR object without a shadow looks like it's floating. RealityKit's ARView renders a grounding (contact) shadow automatically (it can be disabled via `renderOptions.insert(.disableGroundingShadows)`). In SceneKit, combine an ambient `SCNLight` with a directional light whose `castsShadow = true`.
Environment texturing (A12 chip and newer): `ARWorldTrackingConfiguration.environmentTexturing = .automatic` makes ARKit build an environment map from the camera feed and apply it to PBR materials. Metallic and glossy surfaces begin reflecting the real surroundings; without this, a chrome object looks like plastic.
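A minimal session configuration that enables this might look as follows (a sketch assuming an existing `arView: ARView`; the scene-reconstruction branch is an optional extra for LiDAR devices):

```swift
import ARKit
import RealityKit

// Sketch: world tracking with automatic environment texturing;
// plane detection limited to horizontal surfaces for placement.
let config = ARWorldTrackingConfiguration()
config.planeDetection = [.horizontal]
config.environmentTexturing = .automatic

// Optional: scene mesh for better occlusion on LiDAR devices.
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh
}

arView.session.run(config)
```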
Moving and Rotating After Placement
A drag gesture to move the object: `arView.installGestures(.translation, for: entity)` in RealityKit (the entity needs collision shapes, e.g. via `generateCollisionShapes(recursive: true)`). For custom behavior, use a `UILongPressGestureRecognizer` plus continuous raycasting while the finger moves.
Rotation around the vertical axis: `arView.installGestures(.rotation, for: entity)`, or manually via a `UIPanGestureRecognizer` driving `simd_quatf(angle:axis:)`. Restrict the rotation axis to Y only; otherwise the object starts "tilting" on an imprecise gesture.
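The Y-only constraint can be sketched as a pure function mapping a pan delta to a yaw quaternion. A minimal `Quat` struct stands in for `simd_quatf` so the math is explicit and testable off-device; `yawRotation` and the `pointsPerRadian` tuning constant are hypothetical names:

```swift
import Foundation

/// Minimal quaternion (stand-in for simd_quatf) so the Y-axis
/// clamping math is explicit and testable off-device.
struct Quat {
    var x: Float, y: Float, z: Float, w: Float
}

/// Maps a horizontal pan delta (in points) to a rotation strictly
/// about the world Y axis. `pointsPerRadian` is a tuning constant.
func yawRotation(panDeltaX: Float, pointsPerRadian: Float = 200) -> Quat {
    let angle = panDeltaX / pointsPerRadian
    // Axis is fixed to (0, 1, 0): x and z components stay zero,
    // so the object can never tilt, however sloppy the gesture.
    return Quat(x: 0, y: sin(angle / 2), z: 0, w: cos(angle / 2))
}
```

In real gesture code, the same idea is `simd_quatf(angle: angle, axis: SIMD3<Float>(0, 1, 0))` applied to the entity's orientation; the key point is that the axis is hard-coded, not derived from the gesture.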
Timeline
Basic single-object placement with raycast, anchoring, and shadows: 4-6 days. Multi-object placement with move/rotate and support for multiple model formats: 10-15 days. Cost is estimated individually.