Mobile VR App for 360° Video Viewing
360° video isn't ordinary video in a spherical wrapper. It's an equirectangular projection that must be unwrapped onto a sphere around the user, kept in sync with head movement, and rendered without seam artifacts, all at a framerate high enough for comfortable VR.
Technical Side: Sphere Mapping
Equirectangular video (2:1 aspect ratio, the YouTube and Facebook 360 standard) is mapped onto an inverted sphere that the user views from the inside. Comfortable viewing requires a minimum of 4K (3840×2160), preferably 5.7K or 8K: spreading 3840 pixels across 360° of yaw yields only about 10–11 pixels per degree per eye, well below the roughly 60 pixels per degree of normal visual acuity, so some "pixel grid" remains visible even at 4K.
In Unity, create an inverted sphere with normals facing inward:
// Option 1: standard Unity Sphere with a material whose shader uses Cull Front,
// so the inside faces render; or a sphere mesh with flipped normals
// XR rig setup goes through the com.unity.xr.management package
// Shader sketch for 360 video:
// vert: pass the mesh UVs through unchanged
// frag: sample _MainTex; flip U horizontally (uv.x = 1 - uv.x) so the image
//       isn't mirrored when viewed from inside the sphere
On native Android, via MediaPlayer + OpenGL ES:
// Create a Surface for MediaPlayer backed by an OpenGL external texture
// (textureId must be bound as GL_TEXTURE_EXTERNAL_OES)
SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
surfaceTexture.setOnFrameAvailableListener(st -> frameAvailable = true);
Surface surface = new Surface(surfaceTexture);
mediaPlayer.setSurface(surface);
mediaPlayer.prepareAsync();
// in the GL render loop: when frameAvailable, call surfaceTexture.updateTexImage()
// and draw the sphere with the external texture
On iOS — AVPlayer + SCNSphere in SceneKit or RealityKit:
let sphere = SCNSphere(radius: 10)
sphere.firstMaterial?.isDoubleSided = true // or inverted normals
sphere.firstMaterial?.diffuse.contents = avPlayer
let sphereNode = SCNNode(geometry: sphere)
sphereNode.scale = SCNVector3(-1, 1, 1) // flip X for correct direction
sceneView.scene.rootNode.addChildNode(sphereNode)
Stereoscopic 360: Top-Bottom and Side-by-Side Formats
Regular 360 video is monocular. Stereoscopic 360 (the true VR depth effect) is encoded in one of two layouts:
- Top-Bottom (TB): the top half of the frame is the left eye, the bottom half the right. Aspect ratio 1:1 instead of 2:1.
- Side-by-Side (SBS): the left eye occupies the left half, the right eye the right half. Aspect ratio 4:1.
When rendering onto the sphere, pass the layout type to the shader and sample the UVs correctly for each eye.
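The per-eye sampling rule can be sketched as a plain function (the enum and method names here are illustrative, not from any SDK):

```java
// Maps a sphere UV coordinate into the correct sub-rectangle of the
// video frame for a given eye and stereo layout.
public class StereoUv {
    enum Layout { MONO, TOP_BOTTOM, SIDE_BY_SIDE }

    // eye: 0 = left, 1 = right; u, v in [0, 1]
    static double[] remap(double u, double v, Layout layout, int eye) {
        switch (layout) {
            case TOP_BOTTOM:
                // left eye samples the top half, right eye the bottom half
                return new double[] { u, v * 0.5 + (eye == 0 ? 0.0 : 0.5) };
            case SIDE_BY_SIDE:
                // left eye samples the left half, right eye the right half
                return new double[] { u * 0.5 + (eye == 0 ? 0.0 : 0.5), v };
            default:
                return new double[] { u, v };
        }
    }
}
```

Note that which half counts as "top" depends on the texture's V-axis origin (OpenGL puts v = 0 at the bottom), so the two TB offsets may need to be swapped per platform.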
Streaming: HLS/DASH for 360
A 10-minute 360 clip at 4K–8K is a 2–8 GB file, so a full download before playback is unacceptable. The solution is adaptive streaming.
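A back-of-the-envelope check of those sizes, assuming typical 360 bitrates of roughly 30 Mbps at 4K and 100 Mbps at 8K (the bitrate figures are illustrative):

```java
// Rough file-size estimate from video bitrate and duration.
public class VideoSize {
    static double sizeGiB(double bitrateMbps, double seconds) {
        // Mbps -> bytes/s -> total bytes -> GiB
        return bitrateMbps * 1e6 * seconds / 8 / (1024.0 * 1024 * 1024);
    }
}
```

For a 10-minute (600 s) clip this gives about 2.1 GiB at 30 Mbps and about 7 GiB at 100 Mbps, matching the 2–8 GB range above.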
For 360 HLS, the stream must carry spherical-projection metadata. The practical route is Google's Spatial Media spec: its metadata injector tool embeds projection metadata directly in the file before segmenting (ffmpeg itself has no dedicated flag for this).
Adaptive bitrate switching is critical: when the head rotates, the entire sphere is potentially visible, but most of the detail only matters in the gaze direction. Viewport-dependent (tile-based) streaming delivers high resolution only for the current gaze direction, reducing traffic roughly 3–5x. It can be implemented via MPEG-OMAF or a custom DASH server that receives viewport info.
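A minimal sketch of the tile-selection side of this, assuming a hypothetical layout where the equirectangular frame is cut into equal-width vertical tiles covering 0–360° of yaw (the tile count and FOV values are illustrative):

```java
import java.util.ArrayList;
import java.util.List;

// Selects which vertical tiles of a tiled equirectangular stream
// should be fetched at high quality for the current gaze direction.
public class TileSelector {
    static List<Integer> visibleTiles(double yawDeg, double fovDeg, int tiles) {
        double tileWidth = 360.0 / tiles;
        List<Integer> result = new ArrayList<>();
        for (int i = 0; i < tiles; i++) {
            double center = i * tileWidth + tileWidth / 2;
            // shortest angular distance between tile center and gaze yaw
            double d = Math.abs(((center - yawDeg) % 360 + 540) % 360 - 180);
            // keep the tile if any part of it falls inside the viewport
            if (d <= fovDeg / 2 + tileWidth / 2) result.add(i);
        }
        return result;
    }
}
```

With 8 tiles and a 90° horizontal FOV, a gaze at yaw 0° selects the four tiles around the wrap point; the remaining tiles can be fetched at a low-bitrate fallback so a fast head turn never shows a hole.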
Spatial Audio
360 video without positional audio is half the experience. Ambisonics (B-format, commonly in its AmbiX variant) is a spatial audio format in which the sound field automatically rotates with the gaze direction.
On Android — MediaPlayer + the Google Resonance Audio SDK (formerly bundled with the Google VR SDK). On iOS — AVAudioEngine with AVAudioEnvironmentNode for positioning spatial sources.
Unity: the Resonance Audio plugin, or Unity's built-in ambisonic support enabled by selecting an Ambisonic Decoder Plugin in Audio Settings.
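The rotation itself is simple for first-order ambisonics: a yaw turn leaves W (omni) and Z (up) unchanged and rotates the horizontal X/Y pair. A sketch, assuming ACN channel ordering (W, Y, Z, X) and a counter-clockwise-positive yaw convention; real SDKs such as Resonance Audio perform this internally, so this is only to show the math:

```java
// First-order AmbiX yaw rotation: counter-rotate the sound field
// about the vertical axis when the head turns.
public class AmbisonicYaw {
    // in/out channel order: { W, Y, Z, X } (ACN ordering)
    static double[] rotateYaw(double[] wyzx, double yawRad) {
        double c = Math.cos(yawRad), s = Math.sin(yawRad);
        double y = wyzx[1], x = wyzx[3];
        // standard 2D rotation of the (X, Y) horizontal components
        return new double[] { wyzx[0], x * s + y * c, wyzx[2], x * c - y * s };
    }
}
```

A source straight ahead (X = 1) rotated by 90° ends up fully in the Y (left/right) component, which is exactly the "sound stays put while the head turns" effect.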
Caching and Offline
Users want to watch 360 tours without internet. Preload in the background via DownloadManager (Android) / URLSessionDownloadTask (iOS) and store the HLS segments on device. For the tour catalog — SQLite with metadata (preview frame, duration, description) and paths to the local files.
Workflow
- Define the content strategy: 360 video sources, monocular or stereoscopic, delivery formats.
- Develop the player: sphere rendering, per-layout UV mapping, head-tracking sync.
- Streaming: HLS/DASH integration, adaptive quality, offline caching.
- Integrate spatial audio.
- Test on target devices in a Cardboard viewer; assess latency and motion sickness.
Timeline Estimates
A basic monocular 360 video player with local playback: about 1 week. A full-featured player with streaming, spatial audio, stereoscopic support, and offline mode: 3–6 weeks.