Mobile App for Virtual Exhibitions
A physical exhibition in Milan attracted 3,000 visitors over 5 days. A mobile app with a virtual exhibition serves 30,000 people in the same timeframe without hall rental costs. The challenge isn't just displaying 3D models and photos — a virtual exhibition must convey spatial experience: room navigation, artifact context, ability to examine details up close.
AR vs VR vs 3D Viewer: Format Selection
Augmented Reality (AR). Artifacts are placed in the user's real space: a sculpture in the living room, a painting on the wall. Highly immersive, but requires adequate physical space; implemented via ARKit/ARCore. Best use: individual objects, furniture, installations.
WebXR/VR. A fully virtual exhibition space viewed through a VR headset or a smartphone holder such as Google Cardboard (WebXR has superseded the deprecated WebVR API). Maximum immersion, but a limited audience due to equipment requirements.
3D Walk-through (non-AR). Virtual tour through modeled halls — user navigates like in a first-person game. Wide accessibility (any smartphone), no special conditions needed. Implemented via SceneKit/Metal (iOS), OpenGL ES/Vulkan (Android), or WebGL in WKWebView.
Hybrid Approach. Primary feature — 3D walk-through virtual halls. Optional — AR mode for specific artifacts. This is the most common approach.
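The hybrid approach maps naturally to a capability check at launch: everyone gets the walk-through, and AR/VR are offered only when the hardware supports them. A minimal engine-agnostic sketch in TypeScript (the capability flags are illustrative, not a specific API):

```typescript
// Sketch: choosing presentation modes per device (hypothetical capability flags).
type Mode = "ar" | "vr" | "walkthrough";

interface DeviceCaps {
  hasARRuntime: boolean;  // e.g. ARKit / ARCore available
  hasVRHeadset: boolean;  // e.g. an immersive-vr session is supported
}

// The 3D walk-through is the baseline everyone gets;
// AR and VR are optional upgrades when the hardware allows.
function availableModes(caps: DeviceCaps): Mode[] {
  const modes: Mode[] = ["walkthrough"];    // always offered
  if (caps.hasARRuntime) modes.push("ar");  // per-artifact AR placement
  if (caps.hasVRHeadset) modes.push("vr");  // full immersive tour
  return modes;
}

console.log(availableModes({ hasARRuntime: true, hasVRHeadset: false }));
// → ["walkthrough", "ar"]
```

Keeping the walk-through unconditional is what makes the format work on any smartphone.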
3D Exhibition Space: Content Pipeline
Virtual halls are modeled in Blender or SketchUp → exported as glTF. For iOS: conversion to a USDZ/.reality file via Reality Composer Pro with baked lighting (real-time PBR lighting for large halls is otherwise too heavy). For Android: glTF 2.0 with the KHR_lights_punctual extension via the Filament renderer.
Wall, floor, and ceiling textures: lightmap baking in Blender is mandatory, because mobile GPUs can't handle dynamic shadows for complex geometry in real time. Compress textures to KTX2 containers with ETC2 (Android) and ASTC (iOS) to reduce VRAM usage.
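The compression requirement is easy to motivate with arithmetic. A sketch comparing uncompressed RGBA against ASTC 6x6 for a hypothetical set of hall lightmaps (the block-compression rate is standard; the texture count and sizes are illustrative):

```typescript
// Sketch: estimating GPU memory for hall textures, to motivate KTX2/ASTC/ETC2.
function rgbaBytes(w: number, h: number): number {
  return w * h * 4; // 32-bit RGBA, no mipmaps
}

// ASTC 6x6 stores each 6x6 texel block in 16 bytes (~3.56 bits per pixel).
function astc6x6Bytes(w: number, h: number): number {
  return Math.ceil(w / 6) * Math.ceil(h / 6) * 16;
}

const lightmaps = 12; // hypothetical: one 2048x2048 lightmap per hall section
const raw = lightmaps * rgbaBytes(2048, 2048);
const compressed = lightmaps * astc6x6Bytes(2048, 2048);
console.log(`raw: ${(raw / 2 ** 20).toFixed(0)} MiB, ASTC 6x6: ${(compressed / 2 ** 20).toFixed(0)} MiB`);
// → raw: 192 MiB, ASTC 6x6: 21 MiB
```

An order-of-magnitude saving like this is the difference between a hall that fits in mobile VRAM and one that doesn't.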
3D artifacts: photogrammetry (RealityCapture, Meshroom) for physical objects — scan with 50–200 photos, obtain photorealistic mesh. For 2D art: high-res photo on SCNPlane with emissive material (independent of lighting, color accurate).
Navigation and Interactivity
Room Navigation. On-screen virtual joysticks (left for movement, right for camera rotation) are the gamer standard, but unfamiliar to a general audience. Alternatives better suited to cultural projects: tap-to-move (tap the floor and the camera walks there) and waypoint navigation (a room list with automatic camera transitions). Often both are implemented, with a toggle option.
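Tap-to-move boils down to one ray-plane intersection: unproject the tap into a world-space ray, intersect it with the floor plane, and walk toward the hit point. An engine-agnostic sketch (vector types and the floor height are illustrative):

```typescript
// Sketch: intersect the tap ray with the floor plane (y = floorY).
type Vec3 = { x: number; y: number; z: number };

function rayFloorHit(origin: Vec3, dir: Vec3, floorY = 0): Vec3 | null {
  if (Math.abs(dir.y) < 1e-6) return null; // ray parallel to the floor
  const t = (floorY - origin.y) / dir.y;   // distance along the ray
  if (t <= 0) return null;                 // floor is behind the camera
  return { x: origin.x + t * dir.x, y: floorY, z: origin.z + t * dir.z };
}

// Camera 1.7 m above the floor, looking down and forward:
console.log(rayFloorHit({ x: 0, y: 1.7, z: 0 }, { x: 0, y: -0.5, z: -1 }));
// → { x: 0, y: 0, z: -3.4 }
```

In practice the hit point is then validated against the hall's walkable area before the camera is animated toward it.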
Artifact Close-up. Tapping an object opens a popup card with info; a "View" button switches to an isolated view with an orbit camera (drag to rotate, pinch to zoom). The isolated view also hosts the audio guide: AVAudioPlayer plays an MP3 keyed by exhibitId.
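The orbit controller reduces to spherical coordinates around the artifact, with the vertical angle and radius clamped. An engine-agnostic sketch (limits and field names are illustrative):

```typescript
// Sketch: orbit-camera state for the isolated artifact view.
interface Orbit { theta: number; phi: number; radius: number }

const PHI_MIN = 0.1, PHI_MAX = Math.PI - 0.1; // keep away from the poles
const R_MIN = 0.3, R_MAX = 5.0;               // meters from the artifact

function drag(o: Orbit, dx: number, dy: number): Orbit {
  return {
    theta: o.theta + dx,                                   // horizontal orbit wraps freely
    phi: Math.min(PHI_MAX, Math.max(PHI_MIN, o.phi + dy)), // vertical orbit is clamped
    radius: o.radius,
  };
}

function pinch(o: Orbit, scale: number): Orbit {
  // Pinch-out (scale > 1) zooms in; the radius stays within limits.
  return { ...o, radius: Math.min(R_MAX, Math.max(R_MIN, o.radius / scale)) };
}

// Camera position from the orbit state (artifact at the origin):
function cameraPos(o: Orbit) {
  return {
    x: o.radius * Math.sin(o.phi) * Math.sin(o.theta),
    y: o.radius * Math.cos(o.phi),
    z: o.radius * Math.sin(o.phi) * Math.cos(o.theta),
  };
}

console.log(pinch({ theta: 0, phi: Math.PI / 2, radius: 1 }, 4).radius); // clamped to 0.3
```

The radius clamp is what prevents users from zooming inside the mesh or losing the artifact off-screen.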
AR Mode for Artifacts. From isolated view — "View in AR" button. Standard ARKit placement flow. Return to virtual hall.
Social Features and Analytics
Virtual exhibition without data is a missed opportunity. Track: exhibit_viewed, exhibit_time_spent, audio_guide_played, ar_mode_used, exhibit_shared. Firebase Analytics or Amplitude. Heat map of exhibit popularity for curators.
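The tracked events can be captured as a typed schema, with the curators' heat map as a simple aggregation over dwell time. A sketch (event shapes and exhibit IDs are illustrative, independent of the Firebase or Amplitude SDKs):

```typescript
// Sketch: the event names from the text as a discriminated union.
type ExhibitEvent =
  | { name: "exhibit_viewed"; exhibitId: string }
  | { name: "exhibit_time_spent"; exhibitId: string; seconds: number }
  | { name: "audio_guide_played"; exhibitId: string }
  | { name: "ar_mode_used"; exhibitId: string }
  | { name: "exhibit_shared"; exhibitId: string };

// Heat map for curators: total dwell time per exhibit.
function heatMap(events: ExhibitEvent[]): Map<string, number> {
  const dwell = new Map<string, number>();
  for (const e of events) {
    if (e.name === "exhibit_time_spent") {
      dwell.set(e.exhibitId, (dwell.get(e.exhibitId) ?? 0) + e.seconds);
    }
  }
  return dwell;
}

const events: ExhibitEvent[] = [
  { name: "exhibit_viewed", exhibitId: "amphora-04" },
  { name: "exhibit_time_spent", exhibitId: "amphora-04", seconds: 42 },
  { name: "exhibit_time_spent", exhibitId: "amphora-04", seconds: 18 },
];
console.log(heatMap(events).get("amphora-04")); // → 60
```

Defining the schema once keeps the event names consistent across platforms and across whichever analytics backend is chosen.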
Sharing: screenshot with artifact + exhibition name → UIActivityViewController → Instagram, WhatsApp, etc. Generate OG card on server for URL sharing in messengers.
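The server-side OG card is a few meta tags that messengers fetch to render the link preview. A sketch (the domain, route, and exhibit fields are illustrative):

```typescript
// Sketch: generating Open Graph meta tags for a shared exhibit URL.
interface Exhibit { id: string; title: string; exhibition: string; imageUrl: string }

function escapeHtml(s: string): string {
  return s.replace(/&/g, "&amp;").replace(/</g, "&lt;")
          .replace(/>/g, "&gt;").replace(/"/g, "&quot;");
}

function ogCard(e: Exhibit): string {
  const title = escapeHtml(`${e.title} | ${e.exhibition}`);
  return [
    `<meta property="og:title" content="${title}">`,
    `<meta property="og:image" content="${escapeHtml(e.imageUrl)}">`,
    `<meta property="og:url" content="https://example.com/exhibit/${e.id}">`,
  ].join("\n");
}

console.log(ogCard({
  id: "amphora-04",
  title: "Amphora",
  exhibition: "Virtual Exhibition",
  imageUrl: "https://example.com/img/amphora-04.jpg",
}));
```

The escaping step matters: exhibit titles often contain quotes or ampersands that would otherwise break the meta tags.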
Timeline and Stages
Client content audit (what exists, what needs creation) → 3D space → artifact content pipeline → app development → population → testing on various devices → release.
Minimal virtual exhibition (one hall, 10–20 artifacts, no AR): 8–12 weeks. Multi-hall exhibition with AR mode, audio guide, analytics, and sharing: 4–7 months. Cost calculated individually.







