Eye tracking SDK integration for VR
Eye tracking in VR is more than "where the player looks." It is gaze-direction data in world coordinates, updated at 90–120 Hz. Applications are several: foveated rendering (render at full cost only where the eye looks), controller-free interaction, attention analytics, NPC reactions. Each application requires a different accuracy and a different integration approach.
Platforms and SDKs
Meta Quest Pro provides eye tracking via the Meta Movement SDK (or OVRPlugin); note that Quest 3 ships without eye-tracking hardware. The high-level entry point is the OVREyeGaze component, the low-level one is OVRPlugin.GetEyeGazesState(). Accuracy: roughly ±1.5–3° angular with good calibration. Reading gaze requires explicit user permission (a system permission dialog).
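A minimal read of the Meta path can be sketched as follows. It assumes OVREyeGaze is attached to an anchor object whose transform it rotates toward the gaze; the class and component names here are illustrative, and the 0.5 confidence threshold is our assumption, not an SDK default:

// Sketch: reading gaze via OVREyeGaze (Meta Movement SDK).
// OVREyeGaze drives the transform it is attached to.
using UnityEngine;

public class GazeReader : MonoBehaviour
{
    [SerializeField] private OVREyeGaze eyeGaze; // assign the OVREyeGaze anchor in the scene

    void Update()
    {
        // Low confidence usually means a blink or tracking loss; 0.5f is an illustrative cutoff.
        if (eyeGaze == null || eyeGaze.Confidence < 0.5f) return;

        Vector3 origin = eyeGaze.transform.position;
        Vector3 direction = eyeGaze.transform.forward;
        Debug.DrawRay(origin, direction * 5f, Color.green); // visualize the gaze ray in the editor
    }
}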
OpenXR is the cross-platform path, via the XR_EXT_eye_gaze_interaction extension. In Unity it is exposed as Eye Gaze Interaction in XR Interaction Toolkit 2.3+. The advantage: a single code path for Quest Pro, Varjo, HTC Vive Pro Eye, and Pico 4 Pro.
SteamVR: the Valve Index has no eye tracking; Tobii hardware is integrated into some PC headsets and is accessed via the Tobii XR SDK, shipped as a separate Unity package.
Foveated Rendering: why needed and how to connect
The Quest 3 GPU budget is tight. Fixed Foveated Rendering (FFR) is the static variant: screen edges render at reduced resolution, the center at full resolution. It works, but the high-quality zone is fixed at the lens center rather than following the eye.
Dynamic Foveated Rendering (eye-tracked FFR) uses eye-tracking data so the high-quality zone follows the gaze. The Meta OpenXR SDK supports this via the XR_FB_foveation + XR_FB_foveation_vulkan extensions. In Unity it is configured via OVRManager.eyeTrackingEnabled = true, OVRManager.fixedFoveatedRenderingLevel, and OVRPlugin.useDynamicFixedFoveatedRendering.
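Wiring this up might look like the sketch below. It uses only the flags named above; exact placement of these properties (static vs. instance, OVRManager vs. OVRPlugin) varies between Meta SDK versions, so treat this as a sketch rather than a definitive configuration:

// Sketch: enabling dynamic (eye-tracked) foveated rendering.
using UnityEngine;

public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        OVRManager.eyeTrackingEnabled = true;                // opt in to eye tracking
        OVRManager.fixedFoveatedRenderingLevel =
            OVRManager.FixedFoveatedRenderingLevel.High;     // how aggressively to reduce peripheral resolution
        OVRPlugin.useDynamicFixedFoveatedRendering = true;   // let the high-quality zone follow the gaze
    }
}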
Performance gain: 15–25% of GPU time in heavy scenes. On an eye-tracked headset such as Quest Pro, that can be the difference between 72 and 90 fps.
Interactive gaze: gaze selection without hands
Gaze-based interaction means selecting objects by looking at them. Basic implementation:
// OpenXR via XR Interaction Toolkit (the gaze interactor class in XRI 2.3+ is XRGazeInteractor)
var gazeInteractor = GetComponent<XRGazeInteractor>();
if (gazeInteractor.TryGetCurrentUIRaycastResult(out RaycastResult result))
    Debug.Log($"Gaze over UI element: {result.gameObject.name}");
Practical complication: the gaze direction jitters. Even with good calibration, raw gaze data carries high-frequency noise of ±0.5–1°. This makes targeting unstable: the object under gaze flickers between selected and unselected.
Solution: smooth the gaze cursor with an exponential filter: smoothedGaze = Vector3.Slerp(smoothedGaze, rawGaze, smoothFactor * Time.deltaTime), where smoothFactor ≈ 10–15. This removes the tremor while keeping responsiveness.
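A minimal smoothing component implementing the formula above (class, field, and method names are illustrative):

// Sketch: exponential smoothing of a noisy gaze direction.
using UnityEngine;

public class GazeSmoother : MonoBehaviour
{
    [SerializeField] private float smoothFactor = 12f;   // 10–15 works well in practice
    private Vector3 smoothedGaze = Vector3.forward;

    // Call once per frame with the raw, normalized gaze direction.
    public Vector3 Smooth(Vector3 rawGaze)
    {
        // Frame-rate-aware interpolation toward the latest raw sample.
        smoothedGaze = Vector3.Slerp(smoothedGaze, rawGaze, smoothFactor * Time.deltaTime);
        return smoothedGaze;
    }
}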
Additionally, dwell-time activation: an object activates only if the gaze is held on it longer than dwellDuration (0.5–1.5 seconds). The visual indicator is an arc or ring around the object that fills like a progress bar.
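Dwell-time activation reduces to a per-frame timer that resets whenever the gaze target changes. A sketch (names and the 0..1 progress value driving the ring indicator are illustrative):

// Sketch: dwell-time activation with a progress value for the filling ring.
using UnityEngine;
using UnityEngine.Events;

public class DwellActivator : MonoBehaviour
{
    [SerializeField] private float dwellDuration = 0.8f; // 0.5–1.5 s is typical
    [SerializeField] private UnityEvent onActivated;

    private GameObject currentTarget;
    private float dwellTimer;

    // Call every frame with the object currently under the (smoothed) gaze, or null.
    public float Tick(GameObject gazeTarget)
    {
        if (gazeTarget != currentTarget)
        {
            currentTarget = gazeTarget;  // gaze moved: restart the timer
            dwellTimer = 0f;
            return 0f;
        }
        if (currentTarget == null) return 0f;

        dwellTimer += Time.deltaTime;
        float progress = Mathf.Clamp01(dwellTimer / dwellDuration); // drives the ring fill
        if (progress >= 1f)
        {
            onActivated.Invoke();
            dwellTimer = 0f;             // reset so the event does not fire every frame
        }
        return progress;
    }
}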
Permissions and privacy
Eye tracking requires special permissions in the Meta Developer Hub. For Meta Store publication there is a separate policy review: the app must not transmit raw eye data to a server without explicit consent. In Unity this means handling OVRPermissionsRequester and checking OVRPlugin.eyeTrackingEnabled before attempting to read data.
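A defensive startup check might look like this sketch. The Android permission string is the one Meta documents for eye tracking; the gating logic and class name are our assumptions:

// Sketch: request the eye-tracking permission and gate all gaze reads on it.
using UnityEngine;
#if UNITY_ANDROID
using UnityEngine.Android;
#endif

public class EyeTrackingGate : MonoBehaviour
{
    private const string EyeTrackingPermission = "com.oculus.permission.EYE_TRACKING";

    void Start()
    {
#if UNITY_ANDROID
        if (!Permission.HasUserAuthorizedPermission(EyeTrackingPermission))
        {
            // Shows the system dialog; the result arrives asynchronously.
            Permission.RequestUserPermission(EyeTrackingPermission);
        }
#endif
    }

    public bool CanReadGaze()
    {
        // Never read gaze before both the permission and runtime support are confirmed.
        return OVRPlugin.eyeTrackingEnabled;
    }
}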
For corporate VR solutions (enterprise Quest Pro), eye-tracking data can be used for attention analytics, but this requires separate legal documentation.
Timeline for eye-tracking integration: basic gaze cursor + selection, 2–4 business days; dynamic foveated rendering + gaze interaction + analytics, 1–2 weeks. Cost is determined after analyzing the requirements.