Mobile VR App Development (Google Cardboard)
Google Cardboard is the most accessible path into VR: a cardboard housing for a few dollars with a smartphone inside. For developers this means stereoscopic rendering, head tracking via the IMU, and strict performance constraints, all on standard mobile hardware without a specialized VR processor.
Google Cardboard SDK
After Google open-sourced the Cardboard SDK in 2019, it became the official path for building Cardboard apps on iOS and Android. The SDK provides:
- Lens distortion correction (each housing has distortion profile, scanned via QR code)
- Head tracking via IMU fusion (accelerometer and gyroscope data)
- Eye matrices for correct per-eye projection
- Trigger button handling (a magnetic switch on the original Cardboard, a conductive touch button on v2 housings)
Unity integration comes as the com.google.xr.cardboard UPM package. After installing it:
// CardboardHeadTracker is read via Cardboard.SDK
void Update() {
    Cardboard.SDK.UpdateScreenParams();
    // Head position/rotation is applied automatically via the CardboardCamera component
}
Native Android integration via CardboardHeadTracker and CardboardLensDistortion:
// Initialization
headTracker = CardboardHeadTracker.create();
lensDistortion = CardboardLensDistortion.create(encodedDeviceParams, width, height);
// In render loop
headTracker.getPose(monotonic_time_nanos, target_time_nanos, outEyeFromHead);
Stereoscopic Rendering: Split-screen
The screen is split in half: the left half shows the left eye's view, the right half the right eye's. Each half is rendered with a slight camera offset equal to the IPD (interpupillary distance, ~63–65 mm on average). The disparity between the two images creates the stereo effect.
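The per-eye offset can be sketched as a translation of the head position by half the IPD along the head's right vector. This is a minimal sketch under simplified assumptions (the helper names are illustrative; the real SDK supplies complete eye-from-head matrices):

```java
// Sketch: per-eye camera positions from a head position and IPD.
// Assumes the head's right vector is known; names here are illustrative.
public class StereoOffset {
    static final double IPD_M = 0.064; // average interpupillary distance, ~64 mm

    // Returns {x, y, z} for the requested eye; `right` is the head's right vector.
    static double[] eyePosition(double[] head, double[] right, boolean leftEye) {
        double s = (leftEye ? -0.5 : 0.5) * IPD_M; // shift half the IPD per eye
        return new double[] { head[0] + s * right[0],
                              head[1] + s * right[1],
                              head[2] + s * right[2] };
    }

    public static void main(String[] args) {
        double[] head  = {0, 1.7, 0}; // standing eye height in meters
        double[] right = {1, 0, 0};   // head facing -Z, so +X is its right
        double[] l = eyePosition(head, right, true);
        double[] r = eyePosition(head, right, false);
        System.out.printf("left x=%.3f right x=%.3f%n", l[0], r[0]);
    }
}
```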
In Unity this is managed by the CardboardCamera component with two render textures. Each texture is rendered separately, and barrel distortion correction is then applied.
Critical performance issue: two-pass rendering doubles the GPU load. On an average 2020 smartphone at 1080p this means 30–40 FPS without optimizations. Solutions:
- Foveated rendering: reduce resolution toward the screen edges (the periphery is optically degraded by the lenses anyway)
- Single Pass Instanced rendering in Unity (both eyes are rendered in a single instanced draw call per object)
- Reduce the render texture resolution to 0.7–0.8 of the screen resolution
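The last point can be sketched as a simple calculation of the per-eye render target size. A hedged sketch: the 0.75 factor and the half-screen-per-eye layout are the assumptions here, and real projects tune the factor per device:

```java
// Sketch: per-eye render target size for split-screen stereo with a
// resolution scale factor (illustrative; tune the factor per device).
public class EyeBufferSize {
    static int[] perEyeSize(int screenW, int screenH, double scale) {
        int w = (int) Math.round(screenW / 2.0 * scale); // half the screen per eye
        int h = (int) Math.round(screenH * scale);
        return new int[] { w, h };
    }

    public static void main(String[] args) {
        int[] s = perEyeSize(1920, 1080, 0.75);
        System.out.println(s[0] + "x" + s[1]); // 720x810
    }
}
```

At a 0.75 scale each eye renders roughly 44% fewer pixels than a naive half-screen target, which is where most of the GPU headroom comes from.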
Head Tracking and Motion Sickness
Motion sickness sets in when motion-to-photon latency exceeds roughly 20 ms. The IMU samples at 200–1000 Hz, but rendering runs at 60 Hz. To compensate, ATW (Asynchronous TimeWarp) reprojects the last rendered frame to account for the head orientation change between render time and scan-out.
In the Cardboard SDK this reprojection is handled automatically by an additional compositing pass. Make sure Cardboard.SDK.UpdateScreenParams() is called at the start of Update(), not less frequently.
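Conceptually, the core of TimeWarp is the rotation delta between the orientation used for rendering and the orientation at scan-out. A minimal quaternion sketch of that delta (not the SDK's actual warp code, which applies the correction as a full-screen pass on the GPU):

```java
// Sketch: TimeWarp's reprojection rotation is the delta between the head
// orientation sampled at render time and at scan-out time:
//   q_delta = q_display * inverse(q_render)
public class TimeWarpDelta {
    // Quaternion stored as {w, x, y, z}, assumed unit length.
    static double[] multiply(double[] a, double[] b) {
        return new double[] {
            a[0]*b[0] - a[1]*b[1] - a[2]*b[2] - a[3]*b[3],
            a[0]*b[1] + a[1]*b[0] + a[2]*b[3] - a[3]*b[2],
            a[0]*b[2] - a[1]*b[3] + a[2]*b[0] + a[3]*b[1],
            a[0]*b[3] + a[1]*b[2] - a[2]*b[1] + a[3]*b[0]
        };
    }

    // For unit quaternions the conjugate is the inverse.
    static double[] conjugate(double[] q) {
        return new double[] { q[0], -q[1], -q[2], -q[3] };
    }

    static double[] delta(double[] qRender, double[] qDisplay) {
        return multiply(qDisplay, conjugate(qRender));
    }

    public static void main(String[] args) {
        // Head turned 1 degree around Y between render and display.
        double h = Math.toRadians(1.0) / 2.0;
        double[] qRender  = {1, 0, 0, 0};
        double[] qDisplay = {Math.cos(h), 0, Math.sin(h), 0};
        double[] d = delta(qRender, qDisplay);
        System.out.printf("delta w=%.6f y=%.6f%n", d[0], d[2]);
    }
}
```

Even a fraction of a degree of correction applied right before scan-out is what keeps perceived latency under the 20 ms threshold while the renderer itself runs at 60 Hz.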
Scene design recommendations to reduce motion sickness:
- No acceleration-based locomotion (teleportation is preferred)
- A stable horizon or a cockpit reference (cabin, vehicle interior) reduces discomfort
- Don't rescale the world relative to the player at runtime
Input: Button and Gaze
Basic Cardboard has a single button. All interaction is built on:
- Gaze input: a cursor follows the gaze; activation by gaze fixation (dwell time, usually 1.5–2 s)
- Trigger button: selection confirmation, teleportation
The gaze reticle (cursor) is rendered in world space at a fixed distance from the camera. A raycast from the head (center camera) position determines the object under the cursor:
// Unity: Gaze Raycaster (fields live on the same MonoBehaviour)
IGazeable gazeTarget;
float gazeTimer;
bool activated;                       // prevents repeated activation while fixating
const float DWELL_TIME = 1.5f;
[SerializeField] float maxDistance = 10f;
[SerializeField] LayerMask interactableLayer;

void Update() {
    Ray ray = new Ray(Camera.main.transform.position, Camera.main.transform.forward);
    IGazeable hitTarget = null;
    if (Physics.Raycast(ray, out RaycastHit hit, maxDistance, interactableLayer))
        hitTarget = hit.collider.GetComponent<IGazeable>();

    if (hitTarget != gazeTarget) {    // target changed: reset dwell state
        gazeTarget?.OnGazeExit();
        gazeTarget = hitTarget;
        gazeTarget?.OnGazeEnter();    // fire enter once, not every frame
        gazeTimer = 0f;
        activated = false;
    } else if (gazeTarget != null && !activated) {
        gazeTimer += Time.deltaTime;  // accumulate fixation time
        if (gazeTimer >= DWELL_TIME) {
            gazeTarget.OnGazeActivate();
            activated = true;         // activate once per fixation
        }
    }
}
QR Scanning of the Device Profile
On first launch the user scans the QR code on the Cardboard housing, and the SDK loads the lens distortion profile for that specific viewer. Without this step the distortion correction is wrong and the image looks deformed.
After scanning, the Cardboard SDK stores the profile in SharedPreferences (Android) or NSUserDefaults (iOS). Add an instruction screen with a QR icon on first launch and an explicit "Rescan device" button in settings.
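The stored profile is essentially the serialized device params blob from the QR code, persisted as a string. A minimal round-trip sketch (the class and helper names are hypothetical, and the real SDK handles this persistence itself):

```java
import java.util.Base64;

// Sketch: persisting scanned device params as a string. The class and
// method names are illustrative; the SDK manages storage internally.
public class DeviceParamsStore {
    static String encode(byte[] encodedDeviceParams) {
        return Base64.getEncoder().encodeToString(encodedDeviceParams);
    }

    static byte[] decode(String stored) {
        return Base64.getDecoder().decode(stored);
    }

    public static void main(String[] args) {
        byte[] params = {8, 1, 16, 2};       // placeholder bytes, not a real profile
        String stored = encode(params);      // what would land in SharedPreferences
        byte[] back = decode(stored);
        System.out.println(java.util.Arrays.equals(params, back)); // true
    }
}
```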
Workflow
1. Define the app type: passive experience (360 video, tour) or interactive (game, training).
2. Set up the Cardboard SDK: Unity package or native integration, projection configuration.
3. Design the scene within VR constraints: no acceleration-based locomotion, gaze input.
4. Optimize performance: Single Pass Instanced, foveated rendering, LOD.
5. Test on devices across price segments and assess comfort.
Timeline Estimates
A simple passive VR experience (360 content, basic gaze navigation): 1–2 weeks. An interactive VR app with game mechanics, multiple scenes, and a full UI: 2–3 months.