Mobile Strategy Game Development
Mobile strategy games are a genre where the backend kills projects more often than weak gameplay does. A player builds a city, dispatches troops, and closes the app; when they return six hours later, everything must be exactly where it should be. Async timers, attacks from other players, resource balance: all of it must live on the server, not the client.
Architecture: client as display layer
The main mistake in strategy games is trusting the client. If build timers are computed client-side, they can be manipulated by changing the system clock. If resources and troop counts are computed locally, they can be hacked. The correct architecture:
The server is the single source of truth. All game state lives on the server: resources, buildings, troops, timers. The client displays a snapshot of that state and sends commands (BuildCommand, AttackCommand, CollectCommand). The server validates each command, applies it, and returns a new snapshot or a delta.
For persistence: PostgreSQL with a strict schema for game state, plus Redis for hot data (active timers, alliance online status). A REST API handles meta operations; WebSocket delivers real-time notifications ("you are under attack," "building complete").
On the client, use optimistic UI updates with rollback: when the player taps "collect resources," the UI updates immediately while the request goes to the server in parallel. If the server returns an error, the UI reverts. This eliminates the feeling of a laggy game even on a good connection.
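The pattern can be sketched language-agnostically; here is a Python version with illustrative names (`OptimisticStore`, `apply_optimistic`), not a real framework API. The store snapshots state before each prediction so an error response can restore it.

```python
import copy

class OptimisticStore:
    """Client-side state holder: apply a local prediction immediately,
    then confirm or roll back when the server responds."""

    def __init__(self, state: dict):
        self.state = state
        self._snapshots = {}  # request_id -> state captured before the prediction

    def apply_optimistic(self, request_id: str, prediction: dict) -> None:
        # Snapshot first, then update; the UI re-renders from self.state right away.
        self._snapshots[request_id] = copy.deepcopy(self.state)
        self.state.update(prediction)

    def confirm(self, request_id: str, server_delta: dict) -> None:
        # Server values win over the local prediction (e.g. rounding differences).
        self._snapshots.pop(request_id, None)
        self.state.update(server_delta)

    def rollback(self, request_id: str) -> None:
        # Server rejected the command: restore the pre-prediction state.
        snapshot = self._snapshots.pop(request_id, None)
        if snapshot is not None:
            self.state = snapshot
```

A production client also has to decide what rollback means when several requests are in flight at once; this sketch simply restores the snapshot taken before the failed one.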
Map and rendering thousands of objects
For strategies with large maps (classic 4X or a war game with hundreds of players), the standard approach is a tiled map with LOD. Use Unity Tilemap plus CompositeCollider2D for basic geometry. On zoom-out, replace detailed tiles with an atlas texture of the entire region (a RenderTexture snapshot), disable colliders, and stop animation updates.
For other players' markers on the large map, use GPU instancing via Graphics.DrawMeshInstanced: 1,000 player markers in one draw call instead of 1,000 separate GameObjects. Positions and colors are passed through a MaterialPropertyBlock.
Real case: alliance war with 100 participants
On a 4X project, the server timed out whenever 30+ players attacked one castle simultaneously. The problem: every attack wrote synchronously to the same PostgreSQL row. The solution: a command queue on Redis Streams. Attacks are enqueued, a worker processes them sequentially and publishes each result via WebSocket. Perceived latency is instant (the UI shows "attack sent"); actual processing takes 100–300 ms. Players notice no difference, and the server stays up.
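The queue-and-single-worker pattern from this case can be sketched as follows. To keep the example self-contained and runnable, an in-memory deque stands in for the Redis Stream (in production this would be XADD on enqueue and XREADGROUP in the worker), and a plain list stands in for the WebSocket publisher; all names are illustrative.

```python
from collections import deque

class AttackQueue:
    """Sequential command queue: one worker serializes all writes to the
    contested castle state, so there is no row contention."""

    def __init__(self):
        self.stream = deque()     # stand-in for a Redis Stream
        self.notifications = []   # stand-in for WebSocket pushes

    def enqueue_attack(self, attacker: str, damage: int) -> str:
        # Enqueue is cheap, so the UI can respond instantly.
        self.stream.append({"attacker": attacker, "damage": damage})
        return "attack sent"

    def run_worker(self, castle: dict) -> None:
        # A single consumer drains the stream; each entry becomes exactly
        # one write to the castle state, processed in arrival order.
        while self.stream:
            cmd = self.stream.popleft()
            castle["hp"] = max(0, castle["hp"] - cmd["damage"])
            self.notifications.append(
                {"to": cmd["attacker"], "castle_hp": castle["hp"]}
            )
```

The design choice: instead of making 30 concurrent transactions faster, the queue removes concurrency on the hot row entirely and moves the cost into a short, bounded processing delay the player never perceives.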
Push notifications as retention tool
Firebase Cloud Messaging is mandatory. Triggers: building complete, base under attack, resources full. On iOS, request permission via UNUserNotificationCenter.requestAuthorization not on first launch but after the first completed building, when the player already understands the value of notifications. Permission conversion is then 60–70% versus 30–40% for a startup request.
Timeline
| Scale | Duration |
|---|---|
| Single-player strategy (no PvP) | 5–8 months |
| PvP with async attacks | 8–12 months |
| Full war-game with alliances, real-time map | 14–20 months |
Budget a minimum of 4 weeks of pre-production for server architecture design. Launching a strategy game without a planned backend means rewriting it on a live audience, which is extremely painful with async progression.
Cost is calculated individually after analyzing server requirements, map scope, and PvP mechanics.