Procedural Animation Implementation in Mobile Apps
Procedural animations are computed in real time by an algorithm rather than played back from recorded frames: particles following a finger, a wave reacting to sound, a background shape changing with incoming data. This is a distinct class of work that requires an understanding of physics modeling and GPU programming.
When It's Needed
Procedural animation is justified when:
- Animation responds to real-time input (touch, accelerometer, microphone)
- Animation depends on data unknown beforehand
- Variability is effectively infinite, making a keyframe approach impractical
- Need "living" background or ambient animation effect
Particle Systems
On iOS—CAEmitterLayer + CAEmitterCell. A built-in particle engine supporting gravity, emissionRange, velocity, and lifetime; performant and GPU-accelerated. Limitation: every particle shares the same contents (a bitmap or shape), with no per-particle custom behavior.
let emitter = CAEmitterLayer()
emitter.emitterPosition = CGPoint(x: view.center.x, y: -10)
emitter.emitterShape = .line
emitter.emitterSize = CGSize(width: view.bounds.width, height: 1)
let cell = CAEmitterCell()
cell.contents = UIImage(named: "particle")?.cgImage
cell.birthRate = 50
cell.lifetime = 4
cell.velocity = 100
cell.velocityRange = 50
cell.emissionRange = .pi / 4
cell.scale = 0.1
cell.scaleRange = 0.05
emitter.emitterCells = [cell]
view.layer.addSublayer(emitter)
For custom per-particle behavior—a Metal shader, or SpriteKit with SKEmitterNode. SpriteKit supports .sks files edited visually in Xcode, so particle effects can be configured without code.
On Android—no built-in particle engine. Use Canvas with ValueAnimator, or OpenGL ES via GLSurfaceView. For production—the Konfetti library (simple cases) or a custom View with a Canvas.drawBitmap loop driven by Choreographer.FrameCallback.
In Flutter—the particles_flutter package or flame (a game engine with particle support). For simple cases—a CustomPainter with an AnimationController and a list of particles, each with its own physics.
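Whichever renderer you pick, the core loop is the same: keep a list of particles and advance each one's own velocity and lifetime every frame. A minimal sketch in Swift (the Particle/ParticleSystem names and all constants are illustrative):

```swift
import Foundation

// Each particle carries its own position, velocity, and remaining lifetime.
struct Particle {
    var x, y: Double     // position in points
    var vx, vy: Double   // velocity in points/second
    var life: Double     // seconds until the particle dies
}

struct ParticleSystem {
    var particles: [Particle] = []
    let gravity = 300.0  // downward acceleration, points/s^2

    mutating func spawn(at x: Double, _ y: Double) {
        let angle = Double.random(in: 0..<(2 * .pi))
        let speed = Double.random(in: 50...150)
        particles.append(Particle(x: x, y: y,
                                  vx: cos(angle) * speed,
                                  vy: sin(angle) * speed,
                                  life: Double.random(in: 1...3)))
    }

    // Advance every particle by dt; drop the dead ones.
    mutating func update(dt: Double) {
        for i in particles.indices {
            particles[i].vy += gravity * dt
            particles[i].x += particles[i].vx * dt
            particles[i].y += particles[i].vy * dt
            particles[i].life -= dt
        }
        particles.removeAll { $0.life <= 0 }
    }
}
```

Each frame, the renderer (Canvas, CustomPainter, or Core Graphics) then simply draws the particles at their current positions.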
Physical Modeling: Springs and Inertia
Physics-based procedural animation is not the spring(dampingRatio:) API. It is numerical integration of an equation of motion on every frame.
A simple spring system (semi-implicit Euler integration):
// Update each frame via CADisplayLink
func update(dt: Double) {
    let springForce = (targetPosition - currentPosition) * stiffness
    let dampingForce = velocity * -damping
    let acceleration = (springForce + dampingForce) / mass
    velocity += acceleration * dt
    currentPosition += velocity * dt
}
This gives behavior unconstrained by library parameters: assign any mass, stiffness, and damping. It's used for "stretchy" elements that follow the finger with a lag, hair and tail animation, and flag effects.
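The same equations can be exercised headlessly to check stability before wiring them to a display link. A self-contained sketch (the stiffness, damping, and target values are illustrative):

```swift
// Spring integrated with semi-implicit Euler, no UI dependencies.
struct Spring {
    var position = 0.0
    var velocity = 0.0
    let target = 100.0
    let stiffness = 120.0  // spring constant k
    let damping = 12.0     // viscous damping coefficient c
    let mass = 1.0

    mutating func update(dt: Double) {
        let springForce = (target - position) * stiffness
        let dampingForce = -velocity * damping
        let acceleration = (springForce + dampingForce) / mass
        velocity += acceleration * dt
        position += velocity * dt
    }
}

// Step at 120 Hz for five simulated seconds; the spring settles at the target.
var spring = Spring()
for _ in 0..<600 { spring.update(dt: 1.0 / 120.0) }
```

With these constants the system is underdamped (critical damping would be 2√(km) ≈ 21.9), so it overshoots and oscillates briefly before settling; tune the three constants to trade bounciness for settling speed.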
Noise Algorithms for Organic Motion
Perlin noise and simplex noise are the basis of organic ambient animations: a background that "breathes" without repeating patterns, a shape that slowly deforms, a color that shifts smoothly.
On iOS, compute noise in a Metal shader or on the CPU (GameplayKit's GKNoise also provides ready-made Perlin noise):
// simplexNoise2D — a user-supplied simplex noise function returning a Float in [-1, 1]
let noiseValue = simplexNoise2D(x: Float(time * 0.3), y: Float(index) * 0.5)
let yOffset = noiseValue * amplitude
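Since simplexNoise2D above is user-supplied, here is a compact stand-in: 2D value noise, which produces the same kind of smooth, deterministic output (the hash multipliers are arbitrary mixing constants, not from any particular library):

```swift
import Foundation

// Integer-lattice hash → pseudo-random value in [-1, 1], deterministic per (x, y).
func latticeHash(_ x: Int, _ y: Int) -> Double {
    var h = UInt64(bitPattern: Int64(x)) &* 0x9E3779B97F4A7C15
    h ^= UInt64(bitPattern: Int64(y)) &* 0xBF58476D1CE4E5B9
    h = (h ^ (h >> 31)) &* 0x94D049BB133111EB
    h ^= h >> 29
    return Double(h % 1_000_000) / 500_000.0 - 1.0
}

// Smoothstep fade, as in classic Perlin noise.
func fade(_ t: Double) -> Double { t * t * (3 - 2 * t) }

// Bilinearly blend the four surrounding lattice values → smooth noise in [-1, 1].
func valueNoise2D(x: Double, y: Double) -> Double {
    let xi = Int(floor(x)), yi = Int(floor(y))
    let tx = fade(x - floor(x)), ty = fade(y - floor(y))
    let a = latticeHash(xi, yi),     b = latticeHash(xi + 1, yi)
    let c = latticeHash(xi, yi + 1), d = latticeHash(xi + 1, yi + 1)
    let top = a + (b - a) * tx
    let bottom = c + (d - c) * tx
    return top + (bottom - top) * ty
}
```

Value noise is blockier than simplex noise but is a few lines of code and plenty for ambient backgrounds; feed it slowly increasing time on one axis, as in the snippet above.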
In Flutter—the noise package or plain Dart code. Custom fragment shaders (the FragmentProgram API) let you compute float noise(vec2 p) directly in GLSL.
Performance
Key rule: no particle logic on the main thread. CADisplayLink at 60 fps (120 for ProMotion, via preferredFrameRateRange on iOS 15+) is the minimum. For complex scenes—a Metal compute shader updating particle positions in parallel on the GPU.
Profile via Instruments → Core Animation / Metal System Trace. Targets: GPU utilization below 60%, frame time under 8 ms for 120 Hz.
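One timing pattern worth noting: a fixed-timestep accumulator, which keeps the physics step constant and deterministic even when display-link callbacks arrive at irregular intervals. A sketch (the FixedStepLoop name is illustrative):

```swift
// Accumulate real elapsed time; drain it in constant-size physics ticks.
struct FixedStepLoop {
    let step = 1.0 / 120.0  // physics tick matching a 120 Hz target
    var accumulator = 0.0
    var ticks = 0           // counts physics updates, for illustration

    // Called once per display-link frame with the real elapsed time.
    mutating func frame(elapsed: Double, update: (Double) -> Void) {
        accumulator += elapsed
        while accumulator >= step {
            update(step)    // always advance by the same dt
            accumulator -= step
            ticks += 1
        }
    }
}
```

With CADisplayLink, pass `targetTimestamp - timestamp` as the elapsed time; if the display drops to 60 Hz, the loop simply runs two physics ticks per frame instead of one.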
What's Included
Design the animation algorithm from your requirements or references. Implement it with optimization for the target FPS. Integrate it with the app's event system (input, data updates). Profile and optimize.
Timeline: 3–5 days depending on physics model complexity and target platform.