3DoF head tracking in mobile VR app

NOVASOLUTIONS.TECHNOLOGY develops, supports and maintains iOS, Android and PWA mobile applications. We have extensive experience and expertise in publishing mobile applications in popular marketplaces such as Google Play, the App Store, Amazon, AppGallery and others.
Development and support of all types of mobile applications:
Information and entertainment mobile applications
News apps, games, reference guides, online catalogs, weather apps, fitness and health apps, travel apps, educational apps, social networks and messengers, quizzes, blogs and podcasts, forums, aggregators
E-commerce mobile applications
Online stores, B2B apps, marketplaces, online exchanges, cashback services, dropshipping platforms, loyalty programs, food and goods delivery, payment systems.
Business process management mobile applications
CRM systems, ERP systems, project management, sales team tools, financial management, production management, logistics and delivery management, HR management, data monitoring systems
Electronic services mobile applications
Classified ads platforms, online schools, online cinemas, electronic service platforms, cashback platforms, video hosting, thematic portals, online booking and scheduling platforms, online trading platforms

These are just some of the types of mobile applications we work with, and each may have its own features and functionality, tailored to the specific needs and goals of the client.


3DoF Head Tracking Implementation in Mobile VR Apps

3DoF (three degrees of freedom) tracking covers rotation only: pitch (nodding), yaw (turning) and roll (tilting). The smartphone learns its orientation from the IMU: the accelerometer and the gyroscope. The complexity lies in fusing the two sensors' data into a stable orientation that does not accumulate drift.

IMU Fusion: Why Gyroscope Alone Isn't Enough

The gyroscope measures angular velocity with high precision and low noise. Integrating it over time yields a rotation angle. The problem: numerical integration accumulates error. After a few minutes the gyroscope "drifts" by several degrees and the virtual horizon shifts.
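The effect is easy to see with a toy integration (illustrative numbers, not real sensor data): even a tiny constant gyro bias grows linearly into visible drift.

```kotlin
// Toy illustration of gyro drift: integrate a small constant bias
// at a fixed sample rate and watch the error accumulate.
fun integrateYaw(biasDegPerSec: Float, dtSec: Float, steps: Int): Float {
    var yaw = 0f
    repeat(steps) { yaw += biasDegPerSec * dtSec }  // naive numerical integration
    return yaw
}

// A 0.05 deg/s bias sampled at 100 Hz for 2 minutes
// accumulates roughly 6 degrees of yaw drift.
val drift = integrateYaw(0.05f, 0.01f, 12_000)
```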

At rest, the accelerometer measures the gravity vector, which gives an absolute orientation reference for pitch and roll. The problem: the accelerometer cannot distinguish gravity from linear acceleration during movement, and its data is noisy.

The solution is a complementary filter, or a Madgwick/Mahony filter:

// Android: simplified complementary filter (angles kept in degrees)
class ComplementaryFilter(val alpha: Float = 0.98f) {
    private var pitch = 0f
    private var roll = 0f

    // gyroRate: angular velocity in rad/s (as TYPE_GYROSCOPE reports),
    // accel: raw accelerometer readings in m/s^2
    fun update(gyroRate: FloatArray, accel: FloatArray, dt: Float) {
        // Angle from gyroscope (fast, accurate short-term); convert rad to deg
        val gyroPitch = pitch + Math.toDegrees((gyroRate[0] * dt).toDouble()).toFloat()
        val gyroRoll  = roll  + Math.toDegrees((gyroRate[1] * dt).toDouble()).toFloat()

        // Angle from accelerometer (slow, absolute reference)
        val accelPitch = Math.toDegrees(Math.atan2(accel[1].toDouble(), accel[2].toDouble())).toFloat()
        val accelRoll  = Math.toDegrees(Math.atan2(-accel[0].toDouble(), accel[2].toDouble())).toFloat()

        // Blend: 98% gyroscope + 2% accelerometer
        pitch = alpha * gyroPitch + (1f - alpha) * accelPitch
        roll  = alpha * gyroRoll  + (1f - alpha) * accelRoll
    }
}

alpha = 0.98 is a standard value. During fast head movements the accelerometer reading is contaminated by linear acceleration, so temporarily raise alpha (trust the gyroscope more); during slow movement or at rest, lower it so the accelerometer corrects drift faster.
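One common adaptation keys alpha on how far the accelerometer magnitude deviates from 1 g: when the reading is contaminated by linear acceleration, the gyroscope gets more weight. A minimal sketch (the coefficients are illustrative assumptions, not tuned values):

```kotlin
import kotlin.math.abs
import kotlin.math.sqrt

// Adaptive blend coefficient: the further the accelerometer magnitude is
// from 1 g, the more the reading is contaminated by linear acceleration,
// so the filter shifts weight toward the gyroscope.
fun adaptiveAlpha(accel: FloatArray, base: Float = 0.98f, max: Float = 0.999f): Float {
    val g = 9.81f
    val magnitude = sqrt(accel[0] * accel[0] + accel[1] * accel[1] + accel[2] * accel[2])
    val deviation = (abs(magnitude - g) / g).coerceIn(0f, 1f)  // 0 at rest
    return base + (max - base) * deviation                      // more motion -> higher alpha
}
```

At rest this returns the base value 0.98; during vigorous movement it approaches 0.999, effectively freezing the accelerometer correction until the device settles.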

Android SensorManager: Read IMU Correctly

On Android, the IMU is read via SensorManager. There are two options: TYPE_ROTATION_VECTOR (already fused by the system) or raw TYPE_GYROSCOPE + TYPE_ACCELEROMETER with custom fusion.

TYPE_GAME_ROTATION_VECTOR is designed specifically for games: it does not use the magnetometer (compass), so it is unaffected by nearby metal objects. For VR it is the best choice:

val sensorManager = getSystemService(SENSOR_SERVICE) as SensorManager
val gameRotationSensor = sensorManager.getDefaultSensor(Sensor.TYPE_GAME_ROTATION_VECTOR)

sensorManager.registerListener(object : SensorEventListener {
    override fun onSensorChanged(event: SensorEvent) {
        // event.values: quaternion as [x*sin(θ/2), y*sin(θ/2), z*sin(θ/2), cos(θ/2)]
        val rotationMatrix = FloatArray(16)
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values)
        // Apply to camera transform
        updateCameraRotation(rotationMatrix)
    }
    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}, gameRotationSensor, SensorManager.SENSOR_DELAY_FASTEST) // max hardware rate (device-dependent)

SENSOR_DELAY_FASTEST is critical for VR. SENSOR_DELAY_GAME (~50 Hz) causes noticeable lag on fast head turns.

iOS: CoreMotion and CMMotionManager

On iOS it's simpler: CMMotionManager.deviceMotion already contains the fused output of the accelerometer, gyroscope and magnetometer. Attitude is returned as a CMAttitude with roll, pitch and yaw, or as a quaternion:

let motionManager = CMMotionManager()
motionManager.deviceMotionUpdateInterval = 1.0 / 60.0 // 60Hz minimum, prefer 90+

motionManager.startDeviceMotionUpdates(
    using: .xArbitraryZVertical, // independent of magnetic north
    to: .main // a dedicated OperationQueue reduces main-thread jitter
) { [weak self] motion, error in
    guard let motion else { return }
    let quaternion = motion.attitude.quaternion
    self?.cameraNode.orientation = SCNQuaternion(
        x: Float(quaternion.x),
        y: Float(quaternion.y),
        z: Float(quaternion.z),
        w: Float(quaternion.w)
    )
}

xArbitraryZVertical is a reference frame independent of magnetic north. The initial direction is arbitrary, which is correct for VR: the user chooses where to look at startup.

Latency: Main Comfort Enemy

Motion-to-photon latency is the time from a head movement to the corresponding screen update. The comfort threshold is under 20 ms. A typical pipeline:

  • IMU → sensor event: 1–3ms
  • Sensor event → camera rotation update: 1–5ms (thread scheduling dependent)
  • Camera rotation → render: 8–16ms (one frame at 60–120 FPS)
  • Render → display: 8–16ms (display latency)

The total easily reaches 20–40 ms. ATW (Asynchronous TimeWarp), as implemented in mobile VR runtimes, takes the last rendered frame and reprojects it using the newest orientation, effectively cutting motion-to-photon latency without reducing render time.
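Alongside reprojection, simple head-pose prediction also reduces perceived latency: extrapolate the current orientation forward by the expected motion-to-photon time using the gyroscope's angular velocity. A minimal quaternion sketch (plain math, not a platform API):

```kotlin
import kotlin.math.cos
import kotlin.math.sin
import kotlin.math.sqrt

// Unit quaternion with Hamilton product.
data class Quat(val w: Float, val x: Float, val y: Float, val z: Float) {
    operator fun times(o: Quat) = Quat(
        w * o.w - x * o.x - y * o.y - z * o.z,
        w * o.x + x * o.w + y * o.z - z * o.y,
        w * o.y - x * o.z + y * o.w + z * o.x,
        w * o.z + x * o.y - y * o.x + z * o.w
    )
}

// Rotate the current pose forward by (angular velocity * latency).
// omega: gyroscope angular velocity in rad/s, latencySec: predicted
// motion-to-photon time (e.g. 0.02f for 20 ms).
fun predict(current: Quat, omega: FloatArray, latencySec: Float): Quat {
    val rate = sqrt(omega[0] * omega[0] + omega[1] * omega[1] + omega[2] * omega[2])
    if (rate < 1e-6f) return current            // no rotation: nothing to predict
    val angle = rate * latencySec               // radians traversed during latency
    val s = sin(angle / 2f) / rate              // scale axis to unit length
    val delta = Quat(cos(angle / 2f), omega[0] * s, omega[1] * s, omega[2] * s)
    return current * delta
}
```

Prediction works well for the 10–30 ms range typical here; over longer horizons head motion is not constant-rate and the extrapolation overshoots.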

Recenter (Reset Orientation)

The user turned sideways or stood up, so their "straight ahead" changed. Recenter sets the current head orientation as zero:

// iOS
func recenter() {
    referenceAttitude = motionManager.deviceMotion?.attitude.copy() as? CMAttitude
}

// In update: apply delta relative to reference
func updateCamera() {
    guard let current = motionManager.deviceMotion?.attitude,
          let reference = referenceAttitude else { return }
    current.multiply(byInverseOf: reference)
    // use current.quaternion as camera rotation
}

Recenter is usually bound to the Cardboard button or a special gesture (shaking the device).
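An Android-side equivalent can be sketched with plain quaternion math (an assumed helper, not a SensorManager API): remember the orientation at recenter time and express later samples relative to it via its inverse.

```kotlin
// Unit quaternion with Hamilton product and conjugate
// (for unit quaternions the conjugate equals the inverse).
data class Quat(val w: Float, val x: Float, val y: Float, val z: Float) {
    operator fun times(o: Quat) = Quat(
        w * o.w - x * o.x - y * o.y - z * o.z,
        w * o.x + x * o.w + y * o.z - z * o.y,
        w * o.y - x * o.z + y * o.w + z * o.x,
        w * o.z + x * o.y - y * o.x + z * o.w
    )
    fun conjugate() = Quat(w, -x, -y, -z)
}

// Android-side recenter sketch: samples taken after recenter() are
// reported relative to the stored reference orientation.
class Recenterer {
    private var reference = Quat(1f, 0f, 0f, 0f)  // identity until first recenter
    fun recenter(current: Quat) { reference = current }
    fun relativeTo(current: Quat): Quat = reference.conjugate() * current
}
```

Calling relativeTo with the same pose that was recentered yields the identity quaternion, i.e. the user is looking "straight ahead" again.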

Workflow

  • Choose the IMU API: system rotation vector vs raw gyroscope/accelerometer with custom fusion.
  • Implement sensor reading with minimal latency on a dedicated thread.
  • Synchronize orientation with rendering; set up recenter.
  • Test for drift: run a 10-minute session without recenter and assess the accumulated error.
  • Integrate with ATW via the Cardboard SDK.
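The drift assessment above can be reduced to a single number: the angle between the orientation at session start and at session end. A sketch of such a metric (an assumed helper):

```kotlin
import kotlin.math.abs
import kotlin.math.acos

// Angular error in degrees between two unit quaternions [w, x, y, z].
// With the head held still and no recenter, this is the accumulated drift.
fun angularErrorDeg(a: FloatArray, b: FloatArray): Float {
    val dot = a[0] * b[0] + a[1] * b[1] + a[2] * b[2] + a[3] * b[3]
    val clamped = abs(dot).coerceIn(0f, 1f)     // abs: q and -q are the same rotation
    return Math.toDegrees(2.0 * acos(clamped.toDouble())).toFloat()
}
```

A few degrees after 10 minutes is typical for a system rotation vector; significantly more suggests a bug in the custom fusion.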

Timeline Estimates

Basic 3DoF head tracking via the system rotation vector takes 1–2 days. A custom implementation with its own fusion, latency optimization and recenter takes 3–5 days.