Movement Mechanics Design

Our video game development company runs independent projects, co-develops games with clients, and provides additional operational services. Our team's expertise covers all gaming platforms and lets us build a product that matches the customer's vision and players' preferences.

Designing Game Movement Mechanics

A character leaves the ground half a frame after the jump button is pressed, and the player already feels it as "plastic" controls. This is where the work on movement mechanics design begins: not in tweaking Rigidbody.mass at random, but in formalizing the feel of responsiveness through concrete numbers and architectural decisions.

Why "Just CharacterController" Doesn't Work

Unity offers two paths: physical Rigidbody + collider and kinematic CharacterController. Both implementations have specific pitfalls.

CharacterController.Move() doesn't interact with the physics engine directly: the character passes through moving platforms unless you implement a custom riding mechanism. The standard isGrounded flag returns false for a single frame when descending a slope, and the character starts bouncing indefinitely because gravity keeps accumulating in the vertical velocity buffer.
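The usual workaround for the gravity-accumulation problem is to clamp vertical velocity to a small constant downward value while grounded instead of integrating gravity every frame. A minimal sketch, assuming Unity's legacy Input axes; the field names (groundedGravity and the rest) are our own, not a Unity API:

```csharp
using UnityEngine;

[RequireComponent(typeof(CharacterController))]
public class KinematicMover : MonoBehaviour
{
    [SerializeField] float moveSpeed = 5f;
    [SerializeField] float gravity = -25f;
    [SerializeField] float groundedGravity = -2f; // small constant push keeps isGrounded stable on slopes

    CharacterController controller;
    float verticalVelocity;

    void Awake() => controller = GetComponent<CharacterController>();

    void Update()
    {
        Vector3 planar = new Vector3(Input.GetAxisRaw("Horizontal"), 0f, Input.GetAxisRaw("Vertical"));
        planar = Vector3.ClampMagnitude(planar, 1f) * moveSpeed;

        if (controller.isGrounded)
            verticalVelocity = groundedGravity;           // clamp, don't accumulate
        else
            verticalVelocity += gravity * Time.deltaTime; // free fall

        controller.Move((planar + Vector3.up * verticalVelocity) * Time.deltaTime);
    }
}
```

The small negative groundedGravity, rather than zero, is what keeps the controller pressed against descending slopes so isGrounded doesn't flicker.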

A Rigidbody character is more stable against physical interactions, but requires manual friction control: without a PhysicMaterial with zero dynamicFriction on the capsule, the character gets stuck on geometry edges. Meanwhile, Rigidbody.AddForce() in ForceMode.VelocityChange mode behaves predictably only at fixedDeltaTime 0.02—change the physics step, and all tuned sensations "shift."

Ground detection is a separate class of problems. Physics.SphereCast downward with a radius slightly smaller than the capsule is more reliable than Physics.Raycast, but requires careful LayerMask tuning, otherwise the cast will hit the character's own collider.
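A sketch of that SphereCast-based ground check; the mask and distance values here are tuning assumptions, and the component name is illustrative. The character's own layer must be excluded from groundMask so the cast cannot hit its own capsule:

```csharp
using UnityEngine;

public class GroundSensor : MonoBehaviour
{
    [SerializeField] LayerMask groundMask;       // everything except the character's own layer
    [SerializeField] float capsuleRadius = 0.5f;
    [SerializeField] float castDistance = 0.15f; // how far below the feet still counts as "grounded"

    public bool IsGrounded(out RaycastHit hit)
    {
        // Start inside the capsule and cast with a slightly smaller radius,
        // so geometry touching the capsule's side doesn't register as ground.
        Vector3 origin = transform.position + Vector3.up * capsuleRadius;
        float radius = capsuleRadius * 0.95f;
        return Physics.SphereCast(origin, radius, Vector3.down, out hit,
                                  capsuleRadius + castDistance, groundMask,
                                  QueryTriggerInteraction.Ignore);
    }
}
```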

How Movement System Architecture Is Built

A well-designed movement system separates three areas of responsibility: input, state, and physics/movement.

Input is read in Update() and written to the MovementInput structure. States (Idle, Running, Jumping, Falling, Crouching, WallRunning) are managed by a finite state machine—usually a custom class on top of MonoBehaviour, not an Animator State Machine, because transition logic is often nonlinear and tied to game conditions, not animation weights. The position shift itself happens in FixedUpdate() via Rigidbody.MovePosition() or directly via velocity.
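The skeleton of that split might look like the following sketch (all type and field names are illustrative, not from the source). Note the latch on the jump press: Update() can run several times between physics steps, so a one-frame GetButtonDown must be held until FixedUpdate() consumes it:

```csharp
using UnityEngine;

public struct MovementInput
{
    public Vector2 Move;
    public bool JumpPressed;
    public bool JumpHeld;
}

public enum MoveState { Idle, Running, Jumping, Falling, Crouching, WallRunning }

[RequireComponent(typeof(Rigidbody))]
public class PlayerMovement : MonoBehaviour
{
    [SerializeField] float runSpeed = 6f;

    Rigidbody rb;
    MovementInput input;
    MoveState state = MoveState.Idle;

    void Awake() => rb = GetComponent<Rigidbody>();

    void Update()
    {
        // 1. Input: read once per rendered frame.
        input.Move = new Vector2(Input.GetAxisRaw("Horizontal"), Input.GetAxisRaw("Vertical"));
        input.JumpPressed |= Input.GetButtonDown("Jump"); // latch until the physics step consumes it
        input.JumpHeld = Input.GetButton("Jump");

        // 2. State: transitions live in a plain FSM, not the Animator.
        if (state == MoveState.Idle && input.Move.sqrMagnitude > 0.01f)
            state = MoveState.Running;
        else if (state == MoveState.Running && input.Move.sqrMagnitude <= 0.01f)
            state = MoveState.Idle;
    }

    void FixedUpdate()
    {
        // 3. Physics: apply movement at the fixed timestep.
        Vector3 planar = new Vector3(input.Move.x, 0f, input.Move.y) * runSpeed;
        rb.velocity = new Vector3(planar.x, rb.velocity.y, planar.z);
        input.JumpPressed = false; // consumed
    }
}
```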

For platformers with air control, the airControlCurve is important: horizontal acceleration in the air should be less than on the ground, but not zero. Implemented via AnimationCurve in ScriptableObject character settings—the designer changes the curve in the inspector without touching code.
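A minimal sketch of such a settings asset, assuming the curve maps normalized horizontal speed to the fraction of ground acceleration available in the air (the asset and field names are illustrative):

```csharp
using UnityEngine;

[CreateAssetMenu(menuName = "Movement/MovementSettings")]
public class MovementSettings : ScriptableObject
{
    public float groundAcceleration = 40f;

    // x: current horizontal speed normalized to max speed; y: fraction of
    // ground acceleration available in the air. Editable in the inspector.
    public AnimationCurve airControlCurve = AnimationCurve.Linear(0f, 0.5f, 1f, 0.2f);

    public float AirAcceleration(float normalizedSpeed)
        => groundAcceleration * airControlCurve.Evaluate(normalizedSpeed);
}
```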

Coyote time and jump buffering are mandatory components for responsive controls. The former allows jumping within coyoteTimeDuration (typically 0.1–0.15 seconds) after leaving a platform. The latter saves a jump press to a buffer for jumpBufferDuration (0.1–0.2 seconds) and executes it at the first opportunity. Without these two mechanics, the player constantly "misses" jumps at platform edges.
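Both timers reduce to comparing timestamps, as in this sketch (durations are the typical values from the text; the class and method names are our own):

```csharp
using UnityEngine;

public class JumpAssist : MonoBehaviour
{
    [SerializeField] float coyoteTimeDuration = 0.12f;
    [SerializeField] float jumpBufferDuration = 0.15f;

    float lastGroundedTime = float.NegativeInfinity;
    float lastJumpPressedTime = float.NegativeInfinity;

    public void NotifyGrounded() => lastGroundedTime = Time.time;
    public void NotifyJumpPressed() => lastJumpPressedTime = Time.time;

    // True when a buffered press overlaps with recent ground contact.
    public bool ShouldJump()
    {
        bool coyote = Time.time - lastGroundedTime <= coyoteTimeDuration;
        bool buffered = Time.time - lastJumpPressedTime <= jumpBufferDuration;
        return coyote && buffered;
    }

    // Reset both timestamps so one press cannot trigger two jumps.
    public void ConsumeJump()
    {
        lastJumpPressedTime = float.NegativeInfinity;
        lastGroundedTime = float.NegativeInfinity;
    }
}
```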

Variable jump height is another point where details decide everything. If the jump button is released mid-ascent, vertical velocity is cut to minJumpVelocity. This is implemented with a simple check in Update(): if rb.velocity.y > minJumpVelocity && !jumpButtonHeld, then rb.velocity = new Vector3(rb.velocity.x, minJumpVelocity, rb.velocity.z).
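The check from the text as a complete component; minJumpVelocity is a tuning value, and the component name is illustrative:

```csharp
using UnityEngine;

[RequireComponent(typeof(Rigidbody))]
public class VariableJump : MonoBehaviour
{
    [SerializeField] float minJumpVelocity = 3f;

    Rigidbody rb;
    void Awake() => rb = GetComponent<Rigidbody>();

    void Update()
    {
        bool jumpButtonHeld = Input.GetButton("Jump");
        // Still ascending faster than the minimum, but the button is released:
        // cut vertical velocity so a short tap gives a short hop.
        if (rb.velocity.y > minJumpVelocity && !jumpButtonHeld)
            rb.velocity = new Vector3(rb.velocity.x, minJumpVelocity, rb.velocity.z);
    }
}
```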

Approximate Timeline by Scale

| Scale | Description | Timeframe |
|---|---|---|
| Basic | 2D/3D character, ground, jump, coyote time | 2–5 days |
| Medium | + double jump, dash, wall jump, crouch | 1–2 weeks |
| Extended | + swimming, climbing, ragdoll transition, moving platforms | 2–4 weeks |
| Full system | + procedural footstep IK, lean, network replication | 4–8 weeks |

Genre-Specific Considerations

In first-person shooters, Camera and CharacterController live in separate hierarchies and update independently—this prevents camera jitter when the body collides physically. Head bobbing is implemented at the camera level via a sine function of distance traveled, not through animation of the parent object.
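A sketch of distance-driven head bob (frequency and amplitude values are illustrative). Driving the sine by distance traveled rather than by time keeps the bob in sync with actual movement speed and freezes it the moment the player stops:

```csharp
using UnityEngine;

public class HeadBob : MonoBehaviour
{
    [SerializeField] Transform body;            // the movement root the camera follows
    [SerializeField] float bobFrequency = 1.6f; // cycles per unit of distance traveled
    [SerializeField] float bobAmplitude = 0.05f;

    float distanceTraveled;
    Vector3 lastBodyPosition;
    Vector3 baseLocalPosition;

    void Start()
    {
        lastBodyPosition = body.position;
        baseLocalPosition = transform.localPosition;
    }

    void LateUpdate()
    {
        Vector3 delta = body.position - lastBodyPosition;
        delta.y = 0f;                           // ignore vertical motion such as jumps
        distanceTraveled += delta.magnitude;
        lastBodyPosition = body.position;

        float offset = Mathf.Sin(distanceTraveled * bobFrequency * 2f * Mathf.PI) * bobAmplitude;
        transform.localPosition = baseLocalPosition + Vector3.up * offset;
    }
}
```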

In a top-down RPG with NavMeshAgent, click-to-move requires a clear separation: NavMeshAgent controls the path, but not the animation. The Animator parameters Speed and Direction are calculated from agent.velocity each frame, not from key presses. A common mistake is enabling updateRotation = true on the agent while simultaneously rotating the transform manually, which causes rotation jitter.
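A sketch of that velocity-driven sync; the parameter names Speed and Direction follow the text, while the component name and the hash caching are our own additions. The agent keeps updateRotation enabled, and the transform is never rotated manually here:

```csharp
using UnityEngine;
using UnityEngine.AI;

[RequireComponent(typeof(NavMeshAgent), typeof(Animator))]
public class AgentAnimatorSync : MonoBehaviour
{
    NavMeshAgent agent;
    Animator animator;
    static readonly int SpeedHash = Animator.StringToHash("Speed");
    static readonly int DirectionHash = Animator.StringToHash("Direction");

    void Awake()
    {
        agent = GetComponent<NavMeshAgent>();
        animator = GetComponent<Animator>();
    }

    void Update()
    {
        // Planar speed from the agent drives the locomotion blend tree.
        Vector3 planarVelocity = agent.velocity;
        planarVelocity.y = 0f;
        animator.SetFloat(SpeedHash, planarVelocity.magnitude);

        // Signed angle between facing and velocity drives turn blending.
        float direction = Vector3.SignedAngle(transform.forward, planarVelocity, Vector3.up);
        animator.SetFloat(DirectionHash, planarVelocity.sqrMagnitude > 0.01f ? direction : 0f);
    }
}
```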

Project Workflow

We start with analyzing the GDD and reference games—writing down specific numbers from mechanics (running speed in units/sec, jump height, dash time). Then we design the MovementSettings ScriptableObject with all parameters and PlayerMovement MonoBehaviour with documented areas of responsibility.

The prototype is assembled from primitives without animations: just colliders and logic. This allows us to nail the feel of the controls in a day or two, without spending time on Animator integration. After the feel is approved, we connect the Animator Controller with a Blend Tree for locomotion and fit the final colliders to the mesh geometry.

Testing covers edge cases: a jump through a 1-unit gap, moving platforms with rotation, and transitions between NavMesh Surfaces of different scenes during additive loading. These situations most often reveal problems with ground detection and velocity accumulation.