Implementing AR Visualization of Architectural Projects
Showing a client a building that doesn't exist yet has traditionally been the job of a render farm and a video presentation. AR changes the format: the architect arrives at the site, aims a tablet at the land, and the three-dimensional model stands exactly in place at 1:1 or 1:100 scale. The client walks around it and looks inside. This is not a replacement for renders; it is a different tool for a different conversation.
Two Modes: Site-Scale and Table-Scale
Site-scale (lot scale, 1:1). The building is placed on the real site via ARKit Geo Tracking or GPS + compass. The user physically walks around the virtual construction. Geo Tracking requires an A12+ device and a supported city. For countryside plots, GPS alone gives a 3-8 meter error.
Table-scale (model scale, 1:100 and smaller). A model of the district or building is placed on a table or floor like a physical scale model: plane detection plus placement. Suitable for office presentations and exhibition booths. Simpler to implement, and works on all ARKit/ARCore devices.
Often both modes are combined, with switching between "view as model" and "enter building".
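The choice between the two modes can be driven by runtime capability checks. A minimal sketch, assuming helper names of my own (`startSession`, `runTableScale` are not ARKit API); `ARGeoTrackingConfiguration.checkAvailability` performs the per-location coverage check:

```swift
import ARKit
import RealityKit

/// Chooses site-scale (Geo Tracking) when the device and location support it,
/// otherwise falls back to table-scale plane detection.
func startSession(for arView: ARView) {
    // Hardware gate: Geo Tracking needs an A12+ chip.
    guard ARGeoTrackingConfiguration.isSupported else {
        runTableScale(arView)
        return
    }
    // Coverage gate: Geo Tracking is only available in supported cities,
    // so availability at the current location is checked at runtime.
    ARGeoTrackingConfiguration.checkAvailability { available, _ in
        DispatchQueue.main.async {
            if available {
                arView.session.run(ARGeoTrackingConfiguration())
            } else {
                // e.g. a countryside plot outside Geo Tracking coverage
                runTableScale(arView)
            }
        }
    }
}

func runTableScale(_ arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config)
}
```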
Working with Architectural BIM Models
Architects work in Revit, ArchiCAD, and Rhino. Direct import of these formats into ARKit/ARCore is impossible, so conversion is needed:
- Revit → FBX/OBJ via export or Autodesk Forge API → GLTF via gltf-pipeline or Blender
- ArchiCAD → IFC → conversion via IfcOpenShell → GLTF
- Rhino → OBJ/FBX → GLTF
The main problem with BIM models: the Level of Detail (LOD) is too high for real time. A residential complex in Revit at LOD 300 is 10-50M polygons; for AR the budget is 200k-2M at most. Retopology and LOD generation are a mandatory stage, and often take more time than the actual AR development.
Optimization tools: Simplygon (automatic retopology, cloud service), Blender's Decimate modifier (free, manual control), and Reality Composer Pro (Xcode 15+) for final packaging into .reality with baked textures.
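Once converted and decimated, the asset can be bundled and placed with RealityKit. A minimal sketch, assuming the pipeline produced a bundled asset named "District" (hypothetical name) and a table-scale presentation:

```swift
import RealityKit

/// Loads the optimized, converted model from the app bundle and places it
/// on a detected horizontal plane at 1:100 table scale.
func placeModel(in arView: ARView) async throws {
    // "District" is a placeholder for the asset produced by the
    // BIM → GLTF/USDZ conversion pipeline described above.
    let model = try await Entity(named: "District")
    model.scale = SIMD3<Float>(repeating: 0.01)   // 1:100 table scale

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(model)
    arView.scene.addAnchor(anchor)
}
```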
Interactivity: Floors, Apartments, Materials
The client needs more than a static model:
Floor selection. Divide the model into a separate ModelEntity per floor. On a tap gesture on a floor, highlight the selected one and reduce the others' opacity to 0.3. In RealityKit this goes through ModelComponent.materials with SimpleMaterial(color: .yellow.withAlphaComponent(0.5)).
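The highlight logic might look like this in RealityKit; `floors` (the building split into one ModelEntity per floor) and the function name are assumptions for illustration:

```swift
import RealityKit
import UIKit

/// Highlights the tapped floor and fades the rest, as described above.
func select(floor selected: ModelEntity, among floors: [ModelEntity]) {
    for floor in floors {
        if floor === selected {
            // Translucent yellow highlight on the chosen floor.
            floor.model?.materials = [
                SimpleMaterial(color: UIColor.yellow.withAlphaComponent(0.5),
                               isMetallic: false)
            ]
        } else {
            // Fade the remaining floors to 30% opacity.
            floor.model?.materials = [
                SimpleMaterial(color: UIColor.white.withAlphaComponent(0.3),
                               isMetallic: false)
            ]
        }
    }
}
```

Call it from a tap-gesture handler, resolving the tapped floor with `arView.entity(at: location)`.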
Apartment viewing inside. A separate scene with the apartment interior; switch via ARView.scene.anchors.removeAll() plus loading the new scene. Alternatively, enter a VR mode (SceneKit + AVPlayer + 360° panorama).
Facade material change. Brick, plaster, or glass: swap the texture on entity.model?.materials. Store textures in .ktx2 format for fast loading.
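A sketch of the material swap; the `Facade` enum and the texture asset names ("brick", "plaster", "glass") are hypothetical placeholders for textures registered in the app's asset catalog:

```swift
import RealityKit
import UIKit

// Hypothetical facade options; raw values double as texture asset names.
enum Facade: String { case brick, plaster, glass }

/// Replaces the facade texture on a wall entity.
func apply(_ facade: Facade, to wall: ModelEntity) throws {
    // Load the texture for the chosen finish from the asset catalog.
    let texture = try TextureResource.load(named: facade.rawValue)
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    wall.model?.materials = [material]
}
```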
Streaming 3D Content
A full district model in GLB is 50-200 MB, too large to embed in the app. Options:
Progressive loading via Reality Composer Pro: .reality files support streaming, and RealityKit loads LODs as the camera approaches.
Babylon.js + WebXR in WKWebView: a web-based approach with streaming via glTF extensions (MSFT_lod, KHR_materials_variants).
Cesium for Mobile: when the model is pinned to geographic coordinates and 3D Tiles streaming is needed.
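The simplest variant of "don't embed it in the app" is fetching the .reality payload on demand. A sketch with a hypothetical URL; `Entity.load(contentsOf:)` needs a local file, so the payload goes through a temporary location first:

```swift
import Foundation
import RealityKit

/// Downloads a .reality scene on demand instead of bundling it in the app.
func loadRemoteScene(into arView: ARView) async throws {
    // Hypothetical hosting URL for the district scene.
    let remote = URL(string: "https://example.com/district.reality")!
    let (tempURL, _) = try await URLSession.shared.download(from: remote)

    // Move the download to a stable path with the correct file extension.
    let local = FileManager.default.temporaryDirectory
        .appendingPathComponent("district.reality")
    try? FileManager.default.removeItem(at: local)  // clear any stale copy
    try FileManager.default.moveItem(at: tempURL, to: local)

    let scene = try Entity.load(contentsOf: local)
    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(scene)
    await MainActor.run { arView.scene.addAnchor(anchor) }
}
```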
Collaborative Viewing (Multiplayer AR)
Multiple people see one model in one space via MultipeerConnectivity plus an ARKit shared world map: ARSession.getCurrentWorldMap() → transfer via Multipeer → ARWorldTrackingConfiguration.initialWorldMap on the other devices. All participants see the same scene with shared anchors.
This works up to ~30 meters over Wi-Fi/Bluetooth. For larger sites, sync positions through a server via WebSocket.
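The world-map handover can be sketched as follows; `mcSession` is assumed to be an already-connected MCSession, and the MCSessionDelegate wiring is omitted:

```swift
import ARKit
import MultipeerConnectivity

/// Host side: capture the current world map and send it to all peers.
func shareWorldMap(from session: ARSession, over mcSession: MCSession) {
    session.getCurrentWorldMap { map, _ in
        guard let map = map,
              let data = try? NSKeyedArchiver.archivedData(
                  withRootObject: map, requiringSecureCoding: true)
        else { return }  // map unavailable until tracking is stable
        try? mcSession.send(data, toPeers: mcSession.connectedPeers,
                            with: .reliable)
    }
}

/// Peer side: call from MCSessionDelegate's didReceive-data callback.
func receiveWorldMap(_ data: Data, into session: ARSession) {
    guard let map = try? NSKeyedUnarchiver.unarchivedObject(
        ofClass: ARWorldMap.self, from: data) else { return }
    let config = ARWorldTrackingConfiguration()
    config.initialWorldMap = map  // relocalize into the host's space
    session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```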
Timeline
A table-scale visualization with a single model and no interactivity takes 2-3 weeks (including model conversion). With interactive floors, material changes, and site-scale via Geo Tracking: 6-10 weeks. Multiplayer AR adds 2-3 weeks. Cost is calculated individually.