AR scene reconstruction 3D mesh building

NOVASOLUTIONS.TECHNOLOGY develops, supports, and maintains iOS, Android, and PWA mobile applications. We have extensive experience publishing mobile applications in popular stores such as Google Play, the App Store, Amazon, AppGallery, and others.
Development and support of all types of mobile applications:
Information and entertainment mobile applications
News apps, games, reference guides, online catalogs, weather apps, fitness and health apps, travel apps, educational apps, social networks and messengers, quizzes, blogs and podcasts, forums, aggregators.
E-commerce mobile applications
Online stores, B2B apps, marketplaces, online exchanges, cashback services, dropshipping platforms, loyalty programs, food and goods delivery, payment systems.
Business process management mobile applications
CRM systems, ERP systems, project management, sales team tools, financial management, production management, logistics and delivery management, HR management, data monitoring systems.
Electronic services mobile applications
Classified ads platforms, online schools, online cinemas, electronic service platforms, cashback platforms, video hosting, thematic portals, online booking and scheduling platforms, online trading platforms.

These are just some of the types of mobile applications we work with; each may have its own features and functionality, tailored to the client's specific needs and goals.

Latest works:
  • Development of a mobile application for FEEDME
  • Development of a mobile application for XOOMER
  • Development of a mobile application for RHL
  • Development of a mobile application for ZIPPY
  • Development of a mobile application for Affhome
  • Development of a mobile application for the FLAVORS company

Implementing 3D Scene Reconstruction (Scene Reconstruction Mesh) in AR

Scene Reconstruction is not just "AR sees the floor". It is a live environment mesh that ARKit updates in real time as the camera moves. ARMeshAnchor objects accumulate room geometry, surfaces get classified, and the app gets data it can work with: detecting obstacles, building navigation graphs, running physics simulations.

Implementing this without killing performance is a task in itself.

Mesh Architecture and Main Pitfalls

ARMeshGeometry stores vertices, normals, and triangle indices. Updates arrive via the session(_:didUpdate:) delegate method — ARKit can send dozens of updated ARMeshAnchor objects per frame. A naive implementation that recreates a MeshResource or SCNGeometry on every update kills the main thread within seconds.

The correct approach: update the mesh only for changed anchors, use MDLMesh as an intermediate format, and pass data into Metal buffers directly. In RealityKit it looks like this:

import ARKit
import RealityKit

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    // Only ARMeshAnchor carries scene-reconstruction geometry
    for anchor in anchors.compactMap({ $0 as? ARMeshAnchor }) {
        updateMeshVisualization(for: anchor)
    }
}

func updateMeshVisualization(for anchor: ARMeshAnchor) {
    let geometry = anchor.geometry
    // Work with geometry.vertices and geometry.faces directly.
    // Don't create a new MeshResource every time — patch the existing
    // resource, keyed by anchor.identifier.
}

The second pitfall is surface classification. ARMeshClassification provides .floor, .ceiling, .wall, .door, .window, .seat, .table, and .none. But classification only works with sceneReconstruction = .meshWithClassification, and only on LiDAR-equipped devices. Without checking ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification), you get either a crash or a silently ignored setting.
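A capability guard along these lines avoids both failure modes (a sketch — adjust it to your own session setup):

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
    config.sceneReconstruction = .meshWithClassification
} else if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    // Device supports geometry but not classification
    config.sceneReconstruction = .mesh
}
// Otherwise leave sceneReconstruction off and fall back to plane detection.
```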

Third — mesh coordinates are local, not world-space. ARMeshGeometry.vertices gives coordinates relative to ARMeshAnchor.transform. To get world coordinates, multiply each vertex by the anchor's transform matrix. Forget this, and the mesh renders in the wrong place.
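A minimal sketch of lifting one vertex into world space (the worldVertex helper name is ours; the buffer read follows the ARGeometrySource offset/stride layout):

```swift
import ARKit
import simd

// Read vertex `index` from an ARMeshAnchor and return it in world space.
func worldVertex(at index: Int, of anchor: ARMeshAnchor) -> SIMD3<Float> {
    let vertices = anchor.geometry.vertices
    // Each vertex is three packed Floats at offset + stride * index
    let pointer = vertices.buffer.contents()
        .advanced(by: vertices.offset + vertices.stride * index)
    let v = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee
    // Homogeneous multiply by the anchor's transform matrix
    let world = anchor.transform * SIMD4<Float>(v.0, v.1, v.2, 1)
    return SIMD3<Float>(world.x, world.y, world.z)
}
```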

How We Build Scene Reconstruction

Basic stack: ARKit 5+, RealityKit 2, and Metal. We don't use SceneKit for the mesh — it is not optimized for dynamic geometry.

Session setup:

let config = ARWorldTrackingConfiguration()
config.sceneReconstruction = .meshWithClassification
arView.debugOptions = [.showSceneUnderstanding] // for debugging
arView.session.run(config)

For mesh visualization in debug mode, draw the wireframe via arView.debugOptions. In production, hide the mesh itself but keep using its data for:

  • Occlusion — virtual objects hidden behind real walls become invisible
  • Physics — PhysicsCollisionComponent lets virtual objects interact with real geometry
  • Raycasting — accurate hits on real surfaces, not just detected planes
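In RealityKit these three uses map onto scene-understanding options (a sketch assuming an existing ARView):

```swift
import ARKit
import RealityKit

let arView = ARView(frame: .zero)
// The reconstruction mesh stays invisible but still occludes virtual
// content, participates in physics, and answers collision raycasts.
arView.environment.sceneUnderstanding.options = [.occlusion, .physics, .collision]
```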

A case from practice: a warehouse navigation AR app. We needed to detect obstacles (shelves, pallets) and build a route around them. We used Scene Reconstruction to fill an occupancy grid: each mesh face with the .none classification (an unrecognized object) was added to the obstacle graph. The NavMesh was rebuilt every 2 seconds — a balance between freshness and CPU load. An iPad Pro M2 holds 60 FPS without drops.
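A simplified sketch of that occupancy pass. The OccupancyGrid type and the 10 cm cell size are illustrative, not the production code; the per-face classification read follows the ARGeometrySource/ARGeometryElement buffer layout:

```swift
import ARKit

// Hypothetical container: marks 10 cm floor-plane cells as blocked.
struct OccupancyGrid {
    var blocked = Set<SIMD2<Int>>()
    let cellSize: Float = 0.1
    mutating func markObstacle(at p: SIMD3<Float>) {
        blocked.insert(SIMD2(Int((p.x / cellSize).rounded()),
                             Int((p.z / cellSize).rounded())))
    }
}

func accumulate(anchor: ARMeshAnchor, into grid: inout OccupancyGrid) {
    let g = anchor.geometry
    guard let cls = g.classification else { return }
    for face in 0..<g.faces.count {
        // One UInt8 classification value per face
        let raw = cls.buffer.contents()
            .advanced(by: cls.offset + cls.stride * face)
            .assumingMemoryBound(to: UInt8.self).pointee
        guard ARMeshClassification(rawValue: Int(raw)) == ARMeshClassification.none
        else { continue }

        // Take the face's first vertex as the obstacle sample…
        let idx = g.faces.buffer.contents()
            .advanced(by: face * g.faces.indexCountPerPrimitive * g.faces.bytesPerIndex)
            .assumingMemoryBound(to: UInt32.self).pointee
        let v = g.vertices.buffer.contents()
            .advanced(by: g.vertices.offset + g.vertices.stride * Int(idx))
            .assumingMemoryBound(to: (Float, Float, Float).self).pointee
        // …and lift it into world space before marking the grid
        let world = anchor.transform * SIMD4<Float>(v.0, v.1, v.2, 1)
        grid.markObstacle(at: SIMD3(world.x, world.y, world.z))
    }
}
```

Rebuilding the NavMesh from grid.blocked on a timer, rather than on every mesh update, is what keeps the CPU budget under control.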

Fallback and Diagnostics

On devices without LiDAR, Scene Reconstruction is unavailable. We offer graceful degradation to plane detection via ARPlaneAnchor. Coarser, but better than a blank screen.
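The fallback fits into session setup like this (a sketch; how you approximate the environment from planes is app-specific):

```swift
import ARKit
import RealityKit

func configureSession(for arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    } else {
        // No LiDAR: approximate the environment with detected planes
        config.planeDetection = [.horizontal, .vertical]
    }
    arView.session.run(config)
}
```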

For mesh quality diagnostics, use arView.debugOptions.insert(.showSceneUnderstanding): the green wireframe shows what ARKit actually sees. This is especially useful when testing non-standard environments (glossy floors and mirrors — LiDAR handles them poorly because of reflections).

Timeline

Basic integration with mesh visualization takes 1–2 weeks. If you need surface classification, physics collisions, and mesh-based navigation — 4–6 weeks. Cost is calculated after a detailed discussion of requirements.