Developing a physical button interaction system in games
Pressing a button with a finger in VR is not the same as clicking a mouse or pressing X on a gamepad. A physical button must depress when pushed, resist, and return. The finger shouldn't penetrate the surface. And the button must activate exactly when the user expects it to – not before, not after.
These are three separate technical tasks, each with its own pitfalls.
Button physics: why colliders alone don't solve it
The first instinct: put a Collider on the button, catch OnTriggerEnter, activate the button. The problem is that a trigger fires on any passage through the zone, not only on a press. A finger enters the collider sideways – the button is "pressed". A hand brushes the edge in passing – also "pressed".
A physical button requires a directional press. The logic: the button activates only on movement along its press axis (usually local -Y). That means not an OnTriggerEnter check, but tracking the finger's position along the button axis every Update.
In the XR Interaction Toolkit approach, the button is an XRBaseInteractable with a custom PhysicalButton component. It tracks float pressDepth = Vector3.Dot(fingerPosition - buttonSurface, buttonAxis). With buttonAxis pointing out of the button, pressDepth goes negative as the finger pushes in: while pressDepth < pressThreshold the button counts as pressed; only when pressDepth rises back above releaseThreshold does it release. The gap between the two thresholds (hysteresis) prevents rapid on/off bouncing around a single activation point.
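The press-depth and hysteresis logic above is language-agnostic; here is a minimal sketch of it in plain Python (the class, thresholds, and sign convention are illustrative, not XR Interaction Toolkit API – this version measures depth along the press direction, so depth grows positive as the finger pushes in):

```python
# Sketch of directional press detection with hysteresis.
# Convention assumed here: press_axis points INTO the button, so depth
# increases as the finger pushes in (sign flipped vs. an outward axis).

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

class PhysicalButtonLogic:
    def __init__(self, press_threshold=0.008, release_threshold=0.004):
        # press_threshold > release_threshold: the gap is the hysteresis band
        assert press_threshold > release_threshold
        self.press_threshold = press_threshold
        self.release_threshold = release_threshold
        self.pressed = False

    def update(self, finger_pos, surface_pos, press_axis):
        """Returns True exactly on the frame the button activates."""
        offset = tuple(f - s for f, s in zip(finger_pos, surface_pos))
        depth = dot(offset, press_axis)  # metres along the press direction
        if not self.pressed and depth > self.press_threshold:
            self.pressed = True
            return True              # fire the click event once
        if self.pressed and depth < self.release_threshold:
            self.pressed = False
        return False

button = PhysicalButtonLogic()
axis = (0.0, -1.0, 0.0)              # local -Y as the press direction
surface = (0.0, 1.0, 0.0)
# Finger descends 1 cm past the surface, then jitters slightly:
frames = [(0.0, 1.0 - d, 0.0) for d in (0.0, 0.005, 0.010, 0.009, 0.010)]
events = [button.update(f, surface, axis) for f in frames]
print(events)  # the activation fires once, not on every frame past threshold
```

Note how the jitter between 9 and 10 mm in the last frames produces no extra events – that is exactly what the hysteresis band buys you.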
Visual and tactile feedback
The button must move. The simple way is to Lerp the button between restPosition and pressedPosition based on pressDepth. But a plain Lerp gives no feeling of physical resistance – the button moves linearly, independent of effort.
A better approach: spring-damper simulation. The button is a Rigidbody with isKinematic = false, and the spring force comes from a ConfigurableJoint with linear motion allowed on a single axis. JointDrive.positionSpring and JointDrive.positionDamper define the response character. This lets the finger literally "push" the button against physical resistance – it doesn't snap to maximum depth instantly; it takes effort.
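To see what positionSpring and positionDamper do conceptually, here is a tiny 1-D spring-damper integrated with semi-implicit Euler in Python – a sketch of the underlying physics, not Unity's solver, and the constants are illustrative, not Unity defaults:

```python
# 1-D damped spring driving a button cap toward a target position.
# spring pulls toward target, damper resists velocity (hence the "effort").

def simulate(x0, target, spring=300.0, damper=6.0, mass=0.05,
             dt=1.0 / 90.0, steps=90):
    """Integrate one second of motion at 90 Hz; returns the position trace."""
    x, v = x0, 0.0
    trace = []
    for _ in range(steps):
        force = spring * (target - x) - damper * v
        v += (force / mass) * dt     # update velocity first (semi-implicit)
        x += v * dt
        trace.append(x)
    return trace

# Finger holds the cap 8 mm down, then releases; the spring returns it.
down = simulate(x0=0.0, target=-0.008)
up = simulate(x0=down[-1], target=0.0)
print(round(down[-1], 4), round(up[-1], 4))
```

Raising `spring` makes the button stiffer; raising `damper` slows and "thickens" the travel – the same trade-off you tune in JointDrive.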
On activation, call XRBaseController.SendHapticImpulse(0.8f, 0.03f) for a short "click" in the controller. Without haptics, physical buttons feel mute.
The "ghost finger" problem
On Meta Quest without Hand Tracking (i.e., with controllers), the "finger" is a virtual ray or a small sphere attached to the controller position. There is no real finger: the button is pressed with the controller tip or with the index finger of the hand model.
With Hand Tracking (Meta Hand Tracking SDK / the OpenXR Hand Interaction extension), fingers exist – better, but more complex. Each finger is a set of joint positions with no physical collider. You need to add small sphere colliders to the fingertip joints (ThumbTip, IndexTip) and configure the physics layers properly: they must interact with buttons, but not conflict with each other or with the avatar body.
A layer collision matrix is mandatory: HandColliders vs. PhysicalButtons = Detect; HandColliders vs. HandColliders = Ignore; HandColliders vs. Environment = Ignore (otherwise fingers get stuck in walls).
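The matrix above is just data, and it helps to keep it in one place rather than scattered across prefabs. A minimal sketch of the same rules in Python (layer names from the text; the default-to-detect fallback is an assumption of this sketch, not Unity behaviour):

```python
# Pairwise layer rules as a lookup table. frozenset makes the pair
# order-independent, like Unity's symmetric collision matrix.
IGNORE, DETECT = 0, 1

LAYER_MATRIX = {
    frozenset(["HandColliders", "PhysicalButtons"]): DETECT,
    frozenset(["HandColliders", "HandColliders"]): IGNORE,
    frozenset(["HandColliders", "Environment"]): IGNORE,
}

def should_collide(layer_a, layer_b):
    """Unlisted pairs default to DETECT in this sketch."""
    return LAYER_MATRIX.get(frozenset([layer_a, layer_b]), DETECT) == DETECT

print(should_collide("HandColliders", "PhysicalButtons"))  # True
print(should_collide("HandColliders", "Environment"))      # False
```

In Unity itself these rules live in Project Settings → Physics → Layer Collision Matrix; the point of the sketch is only that the rules are symmetric and exhaustive.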
Scaling the system
With many buttons in a scene (a control panel, a keyboard), optimization matters. Don't keep an Update() on every button – use Physics.OverlapSphere in a manager that, once per frame, finds the buttons closest to the hand positions and updates only those.
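The manager-side broad phase boils down to a radius query per hand. A sketch of that query in Python (in Unity this would be Physics.OverlapSphere; the 15 cm radius and button list shape are illustrative):

```python
# One query per hand per frame instead of 50 per-button Update() calls.

def buttons_in_reach(hand_pos, buttons, radius=0.15):
    """Return the buttons whose centre lies within `radius` of the hand."""
    r2 = radius * radius  # compare squared distances, no sqrt needed
    return [b for b in buttons
            if sum((h - c) ** 2 for h, c in zip(hand_pos, b["pos"])) <= r2]

# 50 buttons laid out 10 cm apart along X:
buttons = [{"name": f"btn{i}", "pos": (i * 0.1, 0.0, 0.0)} for i in range(50)]
near = buttons_in_reach((0.22, 0.0, 0.0), buttons)
print([b["name"] for b in near])  # prints ['btn1', 'btn2', 'btn3']
```

Only the returned handful get the full press-depth update that frame; the other 47 buttons cost nothing.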
For keyboards, a separate approach works better: a PhysicalKeyboard manager with grid-based detection, with no individual colliders per key.
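Grid-based detection means a fingertip position maps straight to a key index by arithmetic, so per-key colliders are unnecessary. A sketch of the lookup in Python (the 19 mm key pitch, grid size, and rounding scheme are illustrative assumptions):

```python
import math

KEY_PITCH = 0.019            # 19 mm between key centres (typical pitch)
ROWS, COLS = 4, 10
ORIGIN = (0.0, 0.0)          # keyboard-local position of key (row 0, col 0)

def key_under_finger(local_x, local_z):
    """Map a keyboard-local fingertip position to (row, col), or None."""
    # +0.5 then floor = round to the nearest key centre
    col = math.floor((local_x - ORIGIN[0]) / KEY_PITCH + 0.5)
    row = math.floor((local_z - ORIGIN[1]) / KEY_PITCH + 0.5)
    if 0 <= row < ROWS and 0 <= col < COLS:
        return (row, col)
    return None

print(key_under_finger(0.040, 0.019))   # prints (1, 2)
print(key_under_finger(-0.05, 0.0))     # prints None: off the keyboard
```

The fingertip position would first be transformed into the keyboard's local space (in Unity, transform.InverseTransformPoint); after that, one division replaces 40 trigger colliders.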
Timeline: a single physical button with full feedback takes 1–2 days; a system of 10–20 buttons with Hand Tracking integration takes 1–2 weeks. Cost is calculated individually.