Eye Tracking SDK Integration in VR


Eye tracking in VR is more than just knowing "where the player looks." It is gaze-direction data in world coordinates, updated at 90–120 Hz. There are several applications: foveated rendering (spend expensive rendering only where the eye is looking), controller-free interaction, attention analytics, and NPC reactions. Each application demands a different accuracy and a different integration approach.

Platforms and SDKs

The Meta Quest Pro (the base Quest 3 has no eye-tracking hardware) provides eye tracking via the Meta Movement SDK (or OVRPlugin). It exposes the high-level OVREyeGaze component and the low-level OVRPlugin.GetEyeGazesState(). Accuracy is roughly ±1.5–3° angular with good calibration. Eye tracking requires explicit user consent via a permission dialog.
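As a sketch of the component-level path, assuming the Meta XR SDK is installed: OVREyeGaze drives the rotation of the transform it is attached to, so the gaze ray can be read straight off that transform. Treat the property names here as illustrative and check them against your SDK version.

```csharp
using UnityEngine;

// Sketch: reading a world-space gaze ray from an OVREyeGaze-driven transform.
// Attach OVREyeGaze to a GameObject; it rotates that transform to follow the eye.
public class GazeReader : MonoBehaviour
{
    [SerializeField] private OVREyeGaze eyeGaze; // left or right eye, set in the Inspector

    void Update()
    {
        // Eye tracking stays disabled until the user grants the permission
        if (eyeGaze == null || !eyeGaze.EyeTrackingEnabled) return;

        // Gaze ray: origin at the eye anchor, direction along its forward axis
        var gazeRay = new Ray(eyeGaze.transform.position, eyeGaze.transform.forward);
        if (Physics.Raycast(gazeRay, out RaycastHit hit, 10f))
            Debug.Log($"Gaze hit: {hit.collider.name}");
    }
}
```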

OpenXR is the cross-platform path, via the XR_EXT_eye_gaze_interaction extension. In Unity it is exposed as Eye Gaze Interaction in XR Interaction Toolkit 2.3+. The advantage is a single code path for Quest Pro, Varjo, HTC Vive Pro Eye, and Pico 4 Pro.

SteamVR (the Valve Index has no eye tracking; Tobii hardware is integrated in some PC headsets) is served via the Tobii XR SDK, shipped as a separate Unity package.

Foveated Rendering: why needed and how to connect

The Quest 3's GPU budget is tight. Fixed Foveated Rendering (FFR) is the static variant: the screen edges render at reduced resolution, the center at full resolution. It works, but the high-quality center never moves.

Dynamic Foveated Rendering (eye-tracked FFR) uses eye-tracking data so the high-quality zone follows the gaze. The Meta OpenXR SDK supports this via the XR_FB_foveation and XR_FB_foveation_vulkan extensions. In Unity it is configured via OVRManager.eyeTrackingEnabled = true, OVRManager.fixedFoveatedRenderingLevel, and OVRPlugin.useDynamicFixedFoveatedRendering.
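Put together, the setup can look like the sketch below. It uses only the OVRManager/OVRPlugin names mentioned above; the exact API surface varies between Meta XR SDK versions, so treat this as an outline rather than a drop-in script.

```csharp
using UnityEngine;

// Sketch: enabling eye-tracked (dynamic) foveated rendering at startup.
public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        // Request eye tracking (triggers the permission flow on device)
        OVRManager.eyeTrackingEnabled = true;

        // Full resolution at the foveation center, aggressive falloff at the edges
        OVRManager.fixedFoveatedRenderingLevel =
            OVRManager.FixedFoveatedRenderingLevel.High;

        // Let the foveation center follow the tracked gaze instead of staying fixed
        OVRPlugin.useDynamicFixedFoveatedRendering = true;
    }
}
```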

Performance gain: 15–25% of GPU time in heavy scenes. On Quest-class hardware that is sometimes the difference between 72 and 90 fps.

Interactive gaze: gaze selection without hands

Gaze-based interaction means selecting objects by looking at them. A basic implementation:

// OpenXR via XR Interaction Toolkit (XRGazeInteractor, XRI 2.3+)
var gazeInteractor = GetComponent<XRGazeInteractor>();
// Returns the UI element currently under the gaze ray, if any
if (gazeInteractor.TryGetCurrentUIRaycastResult(out RaycastResult result))
    Debug.Log(result.gameObject.name);

The practical complication: the gaze direction jitters. Even with good calibration, raw gaze data carries ±0.5–1° of high-frequency noise, which makes targeting unstable: the object under the gaze flickers between selected and unselected.

The solution is to smooth the gaze cursor with an exponential filter: smoothedGaze = Vector3.Slerp(smoothedGaze, rawGaze, smoothFactor * Time.deltaTime), with smoothFactor ≈ 10–15. This removes the tremor while keeping responsiveness.
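A minimal filter component along those lines, wrapping the Slerp formula above. The `Smooth` method and its caller are our own naming; feed it whatever raw world-space gaze direction your SDK provides each frame.

```csharp
using UnityEngine;

// Exponential smoothing of a noisy gaze direction: slerp toward the raw
// sample each frame. smoothFactor trades responsiveness against jitter.
public class GazeSmoother : MonoBehaviour
{
    [Range(1f, 30f)] public float smoothFactor = 12f; // ~10–15 works well

    private Vector3 smoothedGaze = Vector3.forward;

    // Call once per frame with the raw world-space gaze direction
    public Vector3 Smooth(Vector3 rawGaze)
    {
        smoothedGaze = Vector3.Slerp(
            smoothedGaze, rawGaze.normalized, smoothFactor * Time.deltaTime);
        return smoothedGaze.normalized;
    }
}
```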

Additionally, use dwell-time activation: an object activates only if the gaze stays on it longer than dwellDuration (0.5–1.5 seconds). A visual indicator — an arc or ring around the object that fills like a progress bar — tells the player the activation is in progress.
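The dwell timer itself is a few lines of state. This sketch (our own component and method names) exposes a 0..1 `Progress` value that can drive the ring indicator directly:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Dwell-time activation: fire an event only after the gaze has stayed on
// the object for dwellDuration seconds; reset the timer when the gaze leaves.
public class DwellActivator : MonoBehaviour
{
    public float dwellDuration = 1.0f; // 0.5–1.5 s is a typical range
    public UnityEvent onActivated;

    private float dwellTimer;
    private bool activated;

    // 0..1, suitable for a radial progress indicator
    public float Progress => Mathf.Clamp01(dwellTimer / dwellDuration);

    // Call every frame with whether the (smoothed) gaze ray currently hits this object
    public void Tick(bool gazedAt)
    {
        if (!gazedAt) { dwellTimer = 0f; activated = false; return; }

        dwellTimer += Time.deltaTime;
        if (!activated && dwellTimer >= dwellDuration)
        {
            activated = true;
            onActivated?.Invoke();
        }
    }
}
```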

Permissions and privacy

Eye tracking requires special permissions declared in the Meta Developer Hub. For Meta Store publication there is a separate policy review: the app must not transmit raw eye data to a server without explicit consent. In Unity you need to handle OVRPermissionsRequester and check OVRPlugin.eyeTrackingEnabled before attempting to read any data.
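A defensive startup check might look like the following sketch: request the Android runtime permission on device, then gate all gaze reads behind the plugin's enabled flag. The permission string is the one Meta documents for Quest eye tracking; verify it against your SDK version.

```csharp
using UnityEngine;

// Sketch: gate gaze access behind the permission flow and the runtime flag.
public class EyeTrackingGate : MonoBehaviour
{
    public bool GazeAvailable { get; private set; }

    void Start()
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        // On Quest, eye tracking sits behind an Android runtime permission
        const string permission = "com.oculus.permission.EYE_TRACKING";
        if (!UnityEngine.Android.Permission.HasUserAuthorizedPermission(permission))
            UnityEngine.Android.Permission.RequestUserPermission(permission);
#endif
    }

    void Update()
    {
        // Never read gaze data before the plugin reports eye tracking as enabled
        GazeAvailable = OVRPlugin.eyeTrackingEnabled;
    }
}
```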

For corporate VR solutions (enterprise Quest Pro), eye-tracking data can be used for attention analytics, but that requires separate legal documentation.

Timeline for eye-tracking integration: a basic gaze cursor with selection takes 2–4 business days; Dynamic Foveated Rendering plus gaze interaction and analytics takes 1–2 weeks. Cost is determined after analyzing the requirements.