3DoF Head Tracking Implementation in Mobile VR Apps
3DoF (three degrees of freedom) covers rotation only: pitch (nod), yaw (turn), roll (tilt). A smartphone knows its orientation via the IMU: accelerometer and gyroscope. The complexity lies in fusing the two sensors' data into a stable orientation without accumulating drift.
IMU Fusion: Why Gyroscope Alone Isn't Enough
A gyroscope measures angular velocity with high precision and low noise. Integrate it over time and you get a rotation angle. The problem: numerical integration accumulates error. After a few minutes the gyroscope "drifts" by several degrees, and the virtual horizon shifts.
At rest, the accelerometer measures the gravity vector, which gives an absolute pitch/roll reference. The problem: it cannot distinguish gravity from linear acceleration during movement, and its data is noisy.
Solution — Complementary Filter or Madgwick/Mahony filter:
// Android: simplified complementary filter (angles in degrees)
class ComplementaryFilter(private val alpha: Float = 0.98f) {
    var pitch = 0f
        private set
    var roll = 0f
        private set

    fun update(gyroRate: FloatArray, accel: FloatArray, dt: Float) {
        // Gyroscope: integrate angular velocity (Android reports rad/s, so
        // convert to degrees). Fast and accurate short-term, but drifts.
        val gyroPitch = pitch + Math.toDegrees(gyroRate[0].toDouble()).toFloat() * dt
        val gyroRoll = roll + Math.toDegrees(gyroRate[1].toDouble()).toFloat() * dt
        // Accelerometer: absolute orientation from gravity. Slow and noisy.
        val accelPitch = Math.toDegrees(Math.atan2(accel[1].toDouble(), accel[2].toDouble())).toFloat()
        val accelRoll = Math.toDegrees(Math.atan2(-accel[0].toDouble(), accel[2].toDouble())).toFloat()
        // Blend: 98% gyroscope + 2% accelerometer
        pitch = alpha * gyroPitch + (1f - alpha) * accelPitch
        roll = alpha * gyroRoll + (1f - alpha) * accelRoll
    }
}
alpha = 0.98 is a standard value. During fast head movements, temporarily raise alpha (trust the gyroscope more, since accelerometer readings are contaminated by linear acceleration); during slow movements, lower it so the accelerometer corrects drift faster.
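A minimal sketch of that adaptive blending in plain Kotlin: when the accelerometer magnitude deviates from g, the reading is contaminated by linear acceleration, so weight shifts toward the gyroscope. The thresholds and alpha values here are illustrative assumptions, not tuned constants.

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// Pick the blend factor from how far |accel| deviates from gravity (9.81 m/s^2).
// The thresholds and alpha values below are illustrative, not tuned constants.
fun adaptiveAlpha(accel: FloatArray): Float {
    val magnitude = sqrt(accel[0] * accel[0] + accel[1] * accel[1] + accel[2] * accel[2])
    val deviation = abs(magnitude - 9.81f)
    return when {
        deviation > 2.0f -> 0.999f // strong linear acceleration: nearly ignore accelerometer
        deviation > 0.5f -> 0.99f  // moderate movement
        else -> 0.98f              // near-static: let the accelerometer correct drift
    }
}
```

Called once per sensor sample, the result feeds straight into the filter's blend step.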
Android SensorManager: Read IMU Correctly
On Android, the IMU is read via SensorManager. Two options: TYPE_ROTATION_VECTOR (already fused by the system) or raw TYPE_GYROSCOPE + TYPE_ACCELEROMETER with custom fusion.
TYPE_GAME_ROTATION_VECTOR is fused specifically for games: it doesn't use the magnetometer (compass), so it's unaffected by nearby metal objects. For VR it's the best choice:
val sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
val gameRotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR)
sensorManager.registerListener(object : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        // event.values: rotation vector [x*sin(θ/2), y*sin(θ/2), z*sin(θ/2), cos(θ/2)]
        val rotationMatrix = FloatArray(16)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        // Apply to the camera transform
        updateCameraRotation(rotationMatrix)
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}, gameRotationSensor, SensorManager.SENSOR_DELAY_FASTEST) // as fast as the hardware delivers
SENSOR_DELAY_FASTEST — critical for VR. SENSOR_DELAY_GAME (~50Hz) causes noticeable lag on fast turns.
iOS: CoreMotion and CMMotionManager
On iOS it's simpler: CMMotionManager's deviceMotion already contains fused data from the accelerometer, gyroscope, and magnetometer. The attitude is returned as a CMAttitude with roll, pitch, and yaw, or as a quaternion:
let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0 // 60 Hz minimum, prefer 90+
motionManager.startDeviceMotionUpdates(
    using: .xArbitraryZVertical, // independent of magnetic north
    to: .main
) { [weak self] motion, error in
    guard let motion else { return }
    let quaternion = motion.attitude.quaternion
    self?.cameraNode.orientation = SCNQuaternion(
        x: Float(quaternion.x),
        y: Float(quaternion.y),
        z: Float(quaternion.z),
        w: Float(quaternion.w)
    )
}
xArbitraryZVertical is a reference frame independent of magnetic north. The initial direction is arbitrary, which is correct for VR: the user chooses where to look at startup.
Latency: Main Comfort Enemy
Motion-to-photon latency is the time from head movement to the corresponding screen update. The comfort threshold is under 20 ms. A typical pipeline:
- IMU → sensor event: 1–3ms
- Sensor event → camera rotation update: 1–5ms (thread scheduling dependent)
- Camera rotation → render: 8–16ms (one frame at 60–120 FPS)
- Render → display: 8–16ms (display latency)
In total this easily reaches 20–40 ms. ATW (Asynchronous TimeWarp) in the Cardboard SDK takes the last rendered frame and reprojects it for the newest head orientation, reducing perceived motion-to-photon latency without shortening render time.
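The core of such a reprojection can be sketched as plain quaternion math (hypothetical helpers; quaternions assumed stored as [x, y, z, w]): compute the rotation from the pose the frame was rendered with to the freshest IMU pose, then apply it as a final warp pass.

```kotlin
// Rotation delta between the pose used for rendering and the latest IMU pose:
// qDelta = qLatest * inverse(qRendered). In a real renderer this delta feeds a
// full-screen warp shader; here we only show the math.
// Quaternion layout assumed: [x, y, z, w]; unit quaternions only.
fun quatMultiply(a: FloatArray, b: FloatArray): FloatArray = floatArrayOf(
    a[3] * b[0] + a[0] * b[3] + a[1] * b[2] - a[2] * b[1],
    a[3] * b[1] - a[0] * b[2] + a[1] * b[3] + a[2] * b[0],
    a[3] * b[2] + a[0] * b[1] - a[1] * b[0] + a[2] * b[3],
    a[3] * b[3] - a[0] * b[0] - a[1] * b[1] - a[2] * b[2]
)

// The inverse of a unit quaternion is its conjugate.
fun quatConjugate(q: FloatArray) = floatArrayOf(-q[0], -q[1], -q[2], q[3])

fun timewarpDelta(rendered: FloatArray, latest: FloatArray): FloatArray =
    quatMultiply(latest, quatConjugate(rendered))
```

If the head hasn't moved since the frame was rendered, the delta is the identity rotation and the warp is a no-op.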
Recenter (Reset Orientation)
The user turned sideways or stood up, and their "straight ahead" changed. Recenter sets the current head orientation as the new zero:
// iOS
func recenter() {
    referenceAttitude = motionManager.deviceMotion?.attitude.copy() as? CMAttitude
}

// In the update loop: apply the delta relative to the reference
func updateCamera() {
    guard let current = motionManager.deviceMotion?.attitude,
          let reference = referenceAttitude else { return }
    current.multiply(byInverseOf: reference)
    // use current.quaternion as the camera rotation
}
Recenter is usually bound to the Cardboard button or a special gesture (shaking the device).
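Android has no CMAttitude equivalent, so a common approach (sketched here with hypothetical names) is to recenter yaw only, since gravity already anchors pitch and roll:

```kotlin
import kotlin.math.atan2

// Recenter yaw only: pitch and roll are anchored by gravity and shouldn't be reset.
// Quaternion layout assumed: [x, y, z, w].
class YawRecenter {
    private var yawOffset = 0f

    // Extract yaw (rotation about the vertical axis) from a unit quaternion.
    fun yawOf(q: FloatArray): Float =
        atan2(2f * (q[3] * q[2] + q[0] * q[1]), 1f - 2f * (q[1] * q[1] + q[2] * q[2]))

    fun recenter(current: FloatArray) {
        yawOffset = yawOf(current)
    }

    // Yaw relative to the recentered "straight ahead", in radians.
    fun relativeYaw(current: FloatArray): Float = yawOf(current) - yawOffset
}
```

The relative yaw is then composed back into the camera rotation in place of the raw sensor yaw.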
Workflow
1. Choose the IMU API: system rotation vector vs raw gyroscope/accelerometer with custom fusion.
2. Implement sensor reading with minimal latency on a dedicated thread.
3. Sync orientation with rendering; set up recenter.
4. Test drift: a 10-minute session without recenter, assess the accumulated error.
5. Integrate with ATW via the Cardboard SDK.
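The drift test above can also be estimated offline by integrating a constant gyroscope bias over the session length (the bias figure in the comment is an assumption; measure your device's actual bias at rest):

```kotlin
// Accumulate a constant gyroscope bias sample by sample, as the filter would.
// biasDegPerSec is an assumed figure; consumer MEMS gyros often sit in the
// 0.01–0.1 deg/s range after calibration.
fun simulateDrift(biasDegPerSec: Double, sampleHz: Int, seconds: Int): Double {
    val dt = 1.0 / sampleHz
    var angleDeg = 0.0
    repeat(sampleHz * seconds) { angleDeg += biasDegPerSec * dt }
    return angleDeg
}
```

A 0.02 deg/s bias at 100 Hz over a 10-minute session accumulates about 12 degrees of uncorrected drift, which is exactly what the accelerometer term of the complementary filter must pull back.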
Timeline Estimates
Basic 3DoF head tracking via the system rotation vector: 1–2 days. A custom implementation with its own fusion, latency optimization, and recenter: 3–5 days.







