LiDAR scanning implementation in iOS AR app

Complexity: Complex
Estimated turnaround: ~3–5 business days

Implementing LiDAR Scanning in iOS AR Applications

The LiDAR sensor first appeared in the 2020 iPad Pro and the iPhone 12 Pro, and ships in the Pro models since. ARKit consumes its data through ARWorldTrackingConfiguration with sceneReconstruction enabled, and this changes everything about plane detection quality, object occlusion, and scene initialization speed.

Without LiDAR, ARKit detects horizontal planes in 2–5 seconds, and vertical planes take much longer. With LiDAR you get an environment mesh in a fraction of a second. This isn't marketing: it's the difference between "the AR object appears immediately" and "the user waves the phone around for 10 seconds before anything happens".

Where LiDAR Integration Breaks

The first problem: ARMeshGeometry produces a very dense mesh. A typical frame for a medium-sized room contains 50,000–200,000 vertices. Pass it directly to SceneKit or RealityKit without LOD and culling, and the frame rate drops even on an A14.

The solution: use ARMeshAnchor and ARMeshGeometry.faces to build a sparser mesh, and for display generate a ModelEntity with MeshResource.generate(from:) only for visible sections. ARView in RealityKit handles this through sceneUnderstanding.options with the .occlusion flag: it uses only the mesh subset needed for occlusion calculation instead of rendering all of it.
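A minimal sketch of that setup, assuming an existing ARView: the reconstructed mesh is handed to RealityKit for occlusion tests only, so none of it enters the render graph as visible geometry.

```swift
import ARKit
import RealityKit

// Sketch: use the LiDAR mesh for occlusion only, not for rendering.
// Rendering every ARMeshAnchor as visible geometry is what kills the
// frame rate; letting RealityKit consume the mesh internally avoids that.
func configureSceneUnderstanding(for arView: ARView) {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)

    // Only while debugging: draw the reconstructed mesh as a wireframe.
    // arView.debugOptions.insert(.showSceneUnderstanding)
}
```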

The second problem: raycasting in LiDAR mode. An ARRaycastQuery targeting .estimatedPlane behaves differently than one targeting .existingPlaneGeometry. On LiDAR devices the correct path is ARRaycastQuery(origin:direction:allowing: .estimatedPlane, alignment: .any), with subsequent refinement against the mesh. Add .existingPlaneGeometry as a fallback and you get double hits and placement artifacts.
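A sketch of that raycast path, using ARView's convenience query builder (the function name is ours):

```swift
import ARKit
import RealityKit

// On LiDAR devices .estimatedPlane alone is enough: the mesh makes the
// estimate accurate, and adding .existingPlaneGeometry produces double hits.
func placementTransform(at point: CGPoint, in arView: ARView) -> simd_float4x4? {
    guard let query = arView.makeRaycastQuery(from: point,
                                              allowing: .estimatedPlane,
                                              alignment: .any) else { return nil }
    // Results are sorted by distance from the camera; take the nearest hit.
    return arView.session.raycast(query).first?.worldTransform
}
```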

Third: sessionWasInterrupted. When the user backgrounds the app, the LiDAR session discards the accumulated mesh. On restoration you must call session.run(configuration, options: [.removeExistingAnchors, .resetSceneReconstruction]); without .resetSceneReconstruction, stale ARMeshAnchors overlay the new ones with visible drift.
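The recovery can be sketched as a session delegate that reruns the stored configuration once the interruption ends (the class name is ours):

```swift
import ARKit

// Sketch: reset scene reconstruction after an interruption, assuming the
// configuration object is kept around for the lifetime of the session.
final class SessionObserver: NSObject, ARSessionDelegate {
    let configuration: ARWorldTrackingConfiguration

    init(configuration: ARWorldTrackingConfiguration) {
        self.configuration = configuration
    }

    func sessionInterruptionEnded(_ session: ARSession) {
        // Without .resetSceneReconstruction the stale mesh anchors from
        // before the interruption drift against the newly built mesh.
        session.run(configuration,
                    options: [.removeExistingAnchors, .resetSceneReconstruction])
    }
}
```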

How We Build LiDAR Pipeline

We use RealityKit 2 as the main render layer: it integrates directly with ARKit 5+ and renders the mesh through Metal without SceneKit's CPU overhead. The configuration:

let config = ARWorldTrackingConfiguration()
config.sceneReconstruction = .meshWithClassification   // LiDAR mesh + surface labels
config.environmentTexturing = .automatic               // reflections from the camera feed
config.frameSemantics = [.personSegmentationWithDepth] // people occlude AR content
arView.session.run(config)

.meshWithClassification enables surface classification (floor, wall, ceiling, window, door, table, seat), which lets you filter ARMeshAnchors by ARMeshClassification and react only to the surface type you need. For furniture placement or indoor navigation apps this is critical.
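Reading the classification is a bit low-level: ARMeshGeometry exposes it as a per-face buffer of raw ARMeshClassification values. A hedged sketch (the function name is ours) that counts the faces of an anchor labeled as floor:

```swift
import ARKit

// Sketch: count the faces of a mesh anchor classified as .floor.
// The classification source stores one UInt8 per face, holding the
// raw value of ARMeshClassification.
func floorFaceCount(in anchor: ARMeshAnchor) -> Int {
    guard let classification = anchor.geometry.classification else { return 0 }
    let buffer = classification.buffer.contents()
    var count = 0
    for face in 0..<anchor.geometry.faces.count {
        let offset = classification.offset + face * classification.stride
        let raw = buffer.load(fromByteOffset: offset, as: UInt8.self)
        if ARMeshClassification(rawValue: Int(raw)) == .floor {
            count += 1
        }
    }
    return count
}
```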

To occlude virtual objects behind real ones, enable:

arView.environment.sceneUnderstanding.options = [.occlusion, .physics]

.physics adds collisions between AR objects and real surfaces: an AR cube falls onto a table instead of passing through it.
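One detail the flag alone doesn't cover: the virtual entity itself needs a collision shape and a physics body to participate. A minimal sketch (the function name and dimensions are ours):

```swift
import RealityKit

// Sketch: a dynamic cube that collides with the reconstructed mesh once
// sceneUnderstanding.options contains .physics.
func makeFallingCube() -> ModelEntity {
    let cube = ModelEntity(mesh: .generateBox(size: 0.1),
                           materials: [SimpleMaterial(color: .red, isMetallic: false)])
    // Without a collision shape the entity is invisible to the physics world.
    cube.generateCollisionShapes(recursive: true)
    // A dynamic body lets gravity pull the cube down onto real surfaces.
    cube.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                            material: .default,
                                            mode: .dynamic)
    return cube
}
```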

A case from practice: a furniture try-on app on iPhone 13 Pro. Without LiDAR occlusion, the sofa "hung" in front of the user's legs in selfie mode. With .occlusion, the legs correctly occlude the AR object. Plane initialization time: 0.3 seconds versus 4.2 seconds on a non-LiDAR device.

Fallback for Non-LiDAR Devices

LiDAR is available only on iPhone 12 Pro and newer Pro models (plus iPad Pro 2020+). For broad device coverage we write two paths:

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    config.sceneReconstruction = .mesh  // LiDAR path
} else {
    // Plane detection fallback
    config.planeDetection = [.horizontal, .vertical]
}

This is not a simple if: the two branches imply different UX scenarios. On non-LiDAR devices, show a "point at a surface" indicator; on LiDAR devices, offer placement immediately.
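For the non-LiDAR branch, Apple's standard coaching overlay covers the "point at a surface" prompt. A sketch (the function name is ours):

```swift
import ARKit
import RealityKit

// Sketch: show the system coaching overlay only on non-LiDAR devices,
// where plane detection takes seconds; LiDAR devices skip it because
// the mesh arrives almost immediately.
func addCoachingIfNeeded(to arView: ARView) {
    guard !ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else { return }
    let overlay = ARCoachingOverlayView()
    overlay.session = arView.session
    overlay.goal = .horizontalPlane   // "move your device to find a surface"
    overlay.frame = arView.bounds
    overlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    arView.addSubview(overlay)
}
```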

What's Included

  • Setting up ARWorldTrackingConfiguration with sceneReconstruction and surface classification
  • Implementing occlusion and physics via RealityKit sceneUnderstanding
  • Optimizing mesh rendering (LOD, frustum culling, sparse mesh)
  • Correct raycast with LiDAR and fallback for non-LiDAR devices
  • Handling session interruptions and state recovery
  • Testing on real devices (iPhone 12 Pro+, iPad Pro)

Timeline

Complexity and timeline:
  • Basic LiDAR integration with occlusion: 1–2 weeks
  • Full pipeline + fallback + optimization: 3–5 weeks
  • Custom surface classifiers + AR physics: 6–8 weeks

Cost calculated individually after analyzing AR scene requirements and target devices.