Implementing AR Distance and Area Measurement
An AR measuring tape is one of the few AR use cases where the user sees value immediately. Production accuracy, however, is painful: on an iPhone 11 without LiDAR, error can reach 5-8%; on an iPad Pro with LiDAR it stays under 1%. This difference needs to be communicated honestly at the design stage.
LiDAR vs SLAM Measurements: Real Difference
Without LiDAR (ARKit Visual SLAM): ARKit determines distance by triangulating feature points, so accuracy depends heavily on surface texture, lighting, and camera path. Up to 2 meters, expect 2-4% error in good lighting; at 5+ meters, 5-10%. Monochrome surfaces (a white table, a uniform floor) add another 3-5 percentage points of error.
With LiDAR (iPhone 12 Pro+, iPad Pro): the time-of-flight sensor emits infrared pulses and measures distance directly, with 0.5-1.5% error at distances up to 5 meters. ARKit with sceneReconstruction: .meshWithClassification builds a dense mesh, so you can pick points on real geometry instead of raycasting against an estimated plane.
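A minimal configuration sketch for opting into the dense mesh when the hardware supports it; the function name is illustrative, the ARKit APIs are real:

```swift
import ARKit

func makeTrackingConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    // Scene reconstruction is only available on LiDAR devices;
    // always check support before setting it.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
        config.sceneReconstruction = .meshWithClassification
    }
    return config
}
```

On non-LiDAR devices the same configuration silently falls back to plane detection only, which keeps a single code path for both tiers.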
For an app that needs certifiable accuracy (construction, insurance, real estate), LiDAR is not an option but a requirement. For household use, SLAM is sufficient.
How Distance Measurement is Built
The user taps the screen, and a point is created in world coordinates via raycast. The distance between two points is the Euclidean distance in 3D space:
```swift
func distance(_ a: simd_float3, _ b: simd_float3) -> Float {
    return simd_distance(a, b)
}
```
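The tap-to-world-point step can be sketched as follows; `worldPoint` is an illustrative helper, while `ARView.raycast(from:allowing:alignment:)` is the real RealityKit API:

```swift
import ARKit
import RealityKit

// Convert a screen tap into a world-space point via raycast.
func worldPoint(at screenPoint: CGPoint, in arView: ARView) -> simd_float3? {
    let results = arView.raycast(from: screenPoint,
                                 allowing: .estimatedPlane,
                                 alignment: .any)
    guard let result = results.first else { return nil }
    // The translation column of the hit transform is the position in world space.
    return simd_float3(result.worldTransform.columns.3.x,
                       result.worldTransform.columns.3.y,
                       result.worldTransform.columns.3.z)
}
```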
To display the line between points in RealityKit, use a ModelEntity with MeshResource.generateBox(size:), stretched along the vector between the points and oriented via Entity.look(at:from:upVector:relativeTo:). The SceneKit alternative is an SCNGeometry with two vertices and a .line primitive element.
The text label showing the distance is a ModelEntity with MeshResource.generateText(), or a billboard SCNNode with an SCNBillboardConstraint so it always faces the camera.
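The "stretched box" approach above can be sketched like this; the function name, thickness default, and material are illustrative choices:

```swift
import RealityKit

// A thin box scaled along the segment between two measured points.
func makeLineEntity(from a: simd_float3, to b: simd_float3,
                    thickness: Float = 0.003) -> ModelEntity {
    let mesh = MeshResource.generateBox(size: 1)   // unit cube, scaled below
    let material = SimpleMaterial(color: .white, isMetallic: false)
    let entity = ModelEntity(mesh: mesh, materials: [material])
    let midpoint = (a + b) / 2
    // Orient the entity's -Z axis from the midpoint toward b...
    entity.look(at: b, from: midpoint, relativeTo: nil)
    // ...then scale: x/y give the line its thickness, z spans the segment.
    entity.scale = [thickness, thickness, simd_distance(a, b)]
    return entity
}
```

Setting the scale after `look(at:from:relativeTo:)` avoids the orientation call overwriting it.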
Area Measurement
The polygon area is computed from points the user places along the perimeter. The shoelace formula (Gauss's area formula) on the 2D projection:
```swift
func polygonArea(_ points: [simd_float3]) -> Float {
    // Project onto the horizontal (XZ) plane
    var area: Float = 0
    let n = points.count
    for i in 0..<n {
        let j = (i + 1) % n
        area += points[i].x * points[j].z
        area -= points[j].x * points[i].z
    }
    return abs(area) / 2
}
```
An important nuance: if the points lie on a sloped surface (e.g., stairs or a ramp), the XZ projection underestimates the area. For sloped surfaces you need the true 3D area via cross products.
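One way to do the 3D version is Newell's method: half the magnitude of the summed cross products of consecutive vertices, which is exact for planar polygons in any orientation. A sketch using the standard-library SIMD3 type (with `cross` written out so the snippet doesn't depend on Apple's simd module):

```swift
// Cross product of two 3D vectors.
func cross(_ a: SIMD3<Float>, _ b: SIMD3<Float>) -> SIMD3<Float> {
    SIMD3(a.y * b.z - a.z * b.y,
          a.z * b.x - a.x * b.z,
          a.x * b.y - a.y * b.x)
}

// Area of a planar (possibly sloped) polygon in 3D via Newell's method.
func polygonArea3D(_ points: [SIMD3<Float>]) -> Float {
    guard points.count >= 3 else { return 0 }
    var sum = SIMD3<Float>(repeating: 0)
    for i in 0..<points.count {
        let j = (i + 1) % points.count
        sum += cross(points[i], points[j])
    }
    // |sum| / 2 is exact for planar polygons, an approximation otherwise.
    let length = (sum.x * sum.x + sum.y * sum.y + sum.z * sum.z).squareRoot()
    return length / 2
}
```

For a flat polygon this agrees with the shoelace result; for a sloped one it returns the true surface area rather than the projected footprint.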
UI Patterns for Accurate Point Placement
A crosshair in the screen center with a confidence indicator: the user should know whether the point will be fixed accurately. If ARKit returns an ARRaycastResult whose target is .estimatedPlane, show an "aim at a surface" warning; if it is .existingPlaneGeometry, the hit is reliable.
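The mapping from raycast result to crosshair state can be sketched as follows; the enum and function are illustrative, `ARRaycastResult.target` is the real ARKit property:

```swift
import ARKit

// Coarse confidence level driving the crosshair UI.
enum HitConfidence { case precise, estimated }

func confidence(for result: ARRaycastResult) -> HitConfidence {
    switch result.target {
    case .existingPlaneGeometry:
        return .precise     // hit a tracked plane's actual geometry
    default:
        return .estimated   // .estimatedPlane etc. — warn the user
    }
}
```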
Snapping to already-placed points when the tap lands within a small radius (roughly 5 cm in world space, typically checked in screen space after projection) is mandatory for closing the polygon in area measurement. Without it, the user cannot precisely close the contour.
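A sketch of the screen-space snap check. It assumes the placed points have already been projected to screen coordinates (e.g. via RealityKit's `ARView.project(_:)`); the 30 pt threshold is an illustrative value to tune:

```swift
import Foundation

// Return the index of the nearest already-placed point within the snap
// radius, or nil if the tap is not close to any of them.
func snappedIndex(tap: CGPoint, projected: [CGPoint],
                  threshold: CGFloat = 30) -> Int? {
    var best: (index: Int, dist: CGFloat)? = nil
    for (i, p) in projected.enumerated() {
        let dx = p.x - tap.x
        let dy = p.y - tap.y
        let d = (dx * dx + dy * dy).squareRoot()
        if d <= threshold && d < (best?.dist ?? .infinity) {
            best = (i, d)
        }
    }
    return best?.index
}
```

When the snap hits the first placed vertex, the polygon is considered closed and the area computation can run.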
Exporting Results
Typical requirements: save a screenshot with the measurements rendered on top, and export to PDF or JSON. In RealityKit, use ARView.snapshot(saveToHDR:completion:). For PDF, use UIGraphicsPDFRenderer with the ARView snapshot overlaid on a schematic floor plan.
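The snapshot-to-PDF step can be sketched like this; the function name, page size, and layout are illustrative, while `UIGraphicsPDFRenderer` and its `pdfData(actions:)` API are real:

```swift
import UIKit

// Wrap a snapshot image into a single-page PDF.
func makePDF(from snapshot: UIImage) -> Data {
    let pageRect = CGRect(x: 0, y: 0, width: 612, height: 792) // US Letter at 72 dpi
    let renderer = UIGraphicsPDFRenderer(bounds: pageRect)
    return renderer.pdfData { context in
        context.beginPage()
        // Fit the snapshot into the page, preserving aspect ratio.
        let scale = min(pageRect.width / snapshot.size.width,
                        pageRect.height / snapshot.size.height)
        let size = CGSize(width: snapshot.size.width * scale,
                          height: snapshot.size.height * scale)
        snapshot.draw(in: CGRect(origin: .zero, size: size))
    }
}
```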
Timeline
Linear distance measurement with two points takes 4-6 days; multi-point area measurement with export, 10-14 days; LiDAR mesh picking support adds another 3-4 days. Cost is calculated individually.