360-Degree Photo Viewer in Mobile App
A 360° photo is an equirectangular image: a spherical projection in which 360° horizontally and 180° vertically are packed into a rectangle with a 2:1 aspect ratio. On a phone it has to be "unwrapped" back onto a sphere so the user can look around from inside. The task is not just showing a picture — it is rendering the sphere without distortion artifacts and keeping scrolling smooth even on a 50-megapixel JPEG.
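The core of the projection is a pair of formulas: longitude maps to the horizontal texture coordinate, latitude to the vertical. A minimal sketch of that mapping in plain Swift (the function name is illustrative — SceneKit generates these UVs for SCNSphere automatically):

```swift
import Foundation

/// Maps a unit view direction to equirectangular texture coordinates (u, v) in [0, 1].
func equirectangularUV(direction d: SIMD3<Double>) -> (u: Double, v: Double) {
    let longitude = atan2(d.x, d.z)            // -π ... π around the vertical axis
    let latitude  = asin(max(-1, min(1, d.y))) // -π/2 ... π/2
    let u = (longitude + .pi) / (2 * .pi)
    let v = (latitude + .pi / 2) / .pi
    return (u, v)
}
```

Looking straight ahead along +Z, `equirectangularUV(direction: SIMD3(0, 0, 1))` gives (0.5, 0.5) — the center of the panorama.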
Why a Standard UIImageView or ImageView Doesn't Work
UIImageView displays a flat image. A spherical projection needs a 3D render: a sphere with inverted normals (so it is viewed from the inside), the equirectangular texture applied to it, a camera at the center, and rotation control via gyroscope or touch.
SceneKit (SCNSphere) is the fastest path on iOS; Metal if full shader control is needed. On Android: OpenGL ES via GLSurfaceView, or Vulkan on modern devices. Most projects use SceneKit on iOS and Filament/OpenGL ES 3.0 on Android.
iOS Implementation with SceneKit
import SceneKit
import UIKit

let sceneView = SCNView(frame: view.bounds)
let scene = SCNScene()

// Sphere with inverted normals
let sphere = SCNSphere(radius: 10.0)
sphere.segmentCount = 96 // more segments = less distortion at the poles

let material = SCNMaterial()
material.isDoubleSided = true
material.diffuse.contents = UIImage(named: "pano.jpg")
sphere.materials = [material]

let sphereNode = SCNNode(geometry: sphere)
sphereNode.scale = SCNVector3(-1, 1, 1) // negative X scale flips the sphere inside out
scene.rootNode.addChildNode(sphereNode)

// Camera at the center of the sphere
let camera = SCNCamera()
camera.fieldOfView = 90
let cameraNode = SCNNode()
cameraNode.camera = camera
scene.rootNode.addChildNode(cameraNode)

sceneView.scene = scene
view.addSubview(sceneView)
Control via gyroscope: CMMotionManager → Euler angles → SCNNode.eulerAngles. Via touch: UIPanGestureRecognizer with angle accumulation. Inertia on swipe is best done with SCNAction and easing, not physics — SceneKit's physics adds unpredictable damping.
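A minimal sketch of the touch path (the stored yaw/pitch properties and the sensitivity constant are assumptions to be tuned; gyroscope control is analogous, feeding deviceMotion attitude into the same eulerAngles):

```swift
import SceneKit
import UIKit

final class PanoramaController {
    let cameraNode = SCNNode()
    private var yaw: Float = 0   // accumulated horizontal angle, radians
    private var pitch: Float = 0 // accumulated vertical angle, radians

    @objc func handlePan(_ gesture: UIPanGestureRecognizer) {
        let t = gesture.translation(in: gesture.view)
        let sensitivity: Float = 0.005 // radians per point, tuned by eye
        yaw   -= Float(t.x) * sensitivity
        pitch -= Float(t.y) * sensitivity
        pitch = max(-.pi / 2, min(.pi / 2, pitch)) // clamp so the camera can't flip over the poles
        cameraNode.eulerAngles = SCNVector3(pitch, yaw, 0)
        gesture.setTranslation(.zero, in: gesture.view)
    }
}
```

Resetting the translation each callback keeps the deltas incremental, so the angles simply accumulate across the gesture.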
Loading Large Images Without OOM
A 50-megapixel JPEG decoded to a bitmap takes ~200 MB in memory (50,000,000 pixels × 4 bytes BGRA). On an iPhone SE 2nd gen with 3 GB of RAM that is a large slice of the per-app memory limit — under memory pressure the system starts jetsam kills, and the viewer itself risks termination.
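The memory math can be sketched in a couple of lines (pure Swift; 4 bytes per pixel assumes the usual BGRA decode):

```swift
/// Estimated memory for a decoded bitmap: width × height × 4 bytes (BGRA).
func decodedSizeMB(width: Int, height: Int) -> Double {
    Double(width * height * 4) / 1_048_576
}

// A 50 MP equirectangular panorama, 10000 × 5000 pixels:
let mb = decodedSizeMB(width: 10_000, height: 5_000) // ≈ 190 MB
```

The JPEG file size is irrelevant here — a 12 MB file still inflates to this full decoded footprint.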
Solution: downscale before handing the texture to SceneKit, or tile the loading (CATiledLayer works for a flat preview, but SceneKit needs a single texture):

// Progressive loading: 4096×2048 preview first, then full resolution
let thumbnail = image.preparingThumbnail(of: CGSize(width: 4096, height: 2048))
material.diffuse.contents = thumbnail

// Asynchronously load full resolution
Task.detached(priority: .background) {
    let full = await loadFullResImage(url: imageURL) // app-specific loader
    await MainActor.run { material.diffuse.contents = full }
}

preparingThumbnail(of:) (iOS 15+) is synchronous but safe to call off the main thread; its async counterpart byPreparingThumbnail(ofSize:) decodes without touching the main run loop at all.
Google Photo Sphere XMP Metadata Support
Panorama cameras and Google Street View write XMP metadata into the JPEG with the projection type and crop info (GPano:ProjectionType, GPano:CroppedAreaImageWidthPixels, etc.). Without reading it, an incomplete panorama (270° instead of 360°) displays as a full sphere — with stretched edges.
Read it via CGImageSourceCopyMetadataAtIndex (CGImageSourceCopyPropertiesAtIndex does not expose XMP) → parse the GPano:* tags → correct the UV mapping. It is 30–50 lines of code, but without it a third of real-world panoramas look wrong.
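A sketch of the correction step, assuming the GPano values have already been parsed out of the XMP (field names follow the Photo Sphere spec; the exact transform composition is worth verifying against a known test panorama — padding the cropped image into a full 2:1 canvas is the simpler alternative):

```swift
import SceneKit

/// Adjusts UV mapping for a cropped Photo Sphere, given parsed GPano:* values.
/// Assumes the sphere's own UVs span the full 360°×180°.
func applyCrop(material: SCNMaterial,
               fullWidth: Double, croppedWidth: Double, leftOffset: Double,
               fullHeight: Double, croppedHeight: Double, topOffset: Double) {
    let sx = fullWidth / croppedWidth   // stretch texture coords over the missing span
    let sy = fullHeight / croppedHeight
    let tx = -leftOffset / croppedWidth // shift so the crop lands at the right longitude
    let ty = -topOffset / croppedHeight
    var m = SCNMatrix4MakeScale(Float(sx), Float(sy), 1)
    m = SCNMatrix4Translate(m, Float(tx), Float(ty), 0)
    material.diffuse.contentsTransform = m
    material.diffuse.wrapS = .clampToBorderColor // uncovered area renders as border color
    material.diffuse.wrapT = .clampToBorderColor
}
```

For a full 360° panorama all the scale factors are 1 and the transform is the identity, so the function is safe to apply unconditionally.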
Android
Use Filament (Google's physically based renderer) or a GLSurfaceView-based panorama view such as Panorama360View. Filament requires more setup but gives correct tone mapping for HDR panoramas.
Timeline
Basic viewer (iOS, SceneKit, gyroscope + touch): 1–1.5 weeks. Full implementation with tiled loading, XMP metadata, and both iOS and Android: 3–4 weeks. Cost is calculated individually.