Implementing Photo Filter Overlays in Mobile Applications
Carousel filter strips with live previews, like Instagram's filter gallery: the user swipes through filters and sees results in real time. If the preview lags, or applying the final filter takes two-plus seconds, the UX falls apart. The entire approach here centers on speed.
How Filters Work at the GPU Level
A filter is a pixel-by-pixel transformation: take each pixel's RGBA value, apply some math, write the new value. A naive CPU implementation on a 12 MP photo takes 200–400 ms; a GPU processes the same pixels in parallel in 5–15 ms.
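To make the cost concrete, here is a sketch of the naive CPU path: a hypothetical brightness filter walking a flat RGBA8 buffer. At 12 MP this loop touches roughly 48 million bytes sequentially, which is why it lands in the hundreds of milliseconds on a phone.

```swift
import Foundation

// Naive per-pixel CPU filter (illustrative): scale R, G, B of each
// RGBA8 pixel by `factor`, clamping to 0...255. Alpha is untouched.
func applyBrightness(_ pixels: inout [UInt8], factor: Double) {
    for i in stride(from: 0, to: pixels.count, by: 4) {
        for c in 0..<3 {  // R, G, B; pixels[i + 3] is alpha
            let v = Double(pixels[i + c]) * factor
            pixels[i + c] = UInt8(min(255, max(0, v)))
        }
    }
}
```

The GPU runs the same body as a shader, but for every pixel at once instead of one at a time.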
iOS. Two approaches: CIFilter (Core Image) and Metal shaders.
CIFilter is simpler and built into the system. For most photo filters it suffices: CIColorMonochrome, CIPhotoEffectChrome, CIVignette, CIColorCurves (iOS 16+). Build a filter chain by feeding filter.outputImage into the next filter's inputImage. Create one CIContext up front and reuse it: a CIContext is expensive to construct, and it renders on the GPU by default (.useSoftwareRenderer is false unless you opt in), so the real performance killer is recreating the context per frame, not the renderer option.
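A minimal sketch of such a chain using the typed CIFilterBuiltins API (iOS 13+); the specific filters and parameter values here are illustrative, not a recommendation:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// One GPU-backed context, reused for every render.
let context = CIContext()

func applyChromeWithVignette(to input: CIImage) -> CGImage? {
    // Chain filters by passing outputImage into the next filter.
    let chrome = CIFilter.photoEffectChrome()
    chrome.inputImage = input

    let vignette = CIFilter.vignette()
    vignette.inputImage = chrome.outputImage
    vignette.intensity = 0.8
    vignette.radius = 1.5

    guard let output = vignette.outputImage else { return nil }
    return context.createCGImage(output, from: input.extent)
}
```

Nothing renders until createCGImage is called; Core Image concatenates the chain into a single GPU pass where it can.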
Metal is needed when you want effects unavailable in CIFilter: duotone with custom colors, zonal adjustments, high-resolution LUT filters. Write a fragment shader in MSL:
fragment float4 duotoneFilter(VertexOut in [[stage_in]],
                              texture2d<float> tex [[texture(0)]],
                              constant float3 &shadowColor [[buffer(0)]],
                              constant float3 &highlightColor [[buffer(1)]]) {
    constexpr sampler s(filter::linear);
    float4 color = tex.sample(s, in.texCoord);
    // Rec. 601 luma weights
    float luma = dot(color.rgb, float3(0.299, 0.587, 0.114));
    // Dark pixels map to shadowColor, bright pixels to highlightColor
    return float4(mix(shadowColor, highlightColor, luma), color.a);
}
Android. GPUImage for Android (a port of the popular iOS GPUImage library, built on OpenGL ES). GPUImageFilter for single filters, GPUImageFilterGroup for chains. For preview, GPUImageView renders straight to a GL surface without an intermediate Bitmap. RenderScript is deprecated since Android 12; Google's RenderScript Intrinsics Replacement Toolkit covers the common intrinsics (blur, blend, LUT), and fully custom GPU work means OpenGL ES or Vulkan.
Flutter. The image package (pub.dev) does CPU processing and is too slow for a real-time preview. For real time, use the ColorFiltered widget with ColorFilter.matrix() for basic corrections; complex filters go through a Flutter Texture backed by native OpenGL/Metal code on the platform side.
LUT Filters: Professional Approach
Look Up Table: a 3D table, 64×64×64 (or 16×16×16 for mobile). Each input RGB maps to an output RGB via trilinear interpolation between the nearest table entries. A designer creates the LUT in Lightroom/Photoshop; the developer loads it as a texture in the shader. On iOS use CIColorCube or, when color management matters, CIColorCubeWithColorSpace. One LUT texture replaces a whole stack of color-correction operations.
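A sketch of the CIColorCube path, assuming the cube data arrives as RGBA floats; for illustration the cube here is built in code as an identity LUT, where a real app would load data exported from Lightroom/Photoshop:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Build an n×n×n identity LUT (output == input) as RGBA floats in the
// b-major, r-minor order CIColorCube expects. Illustrative stand-in
// for a designer-exported cube.
func identityCubeData(dimension n: Int) -> Data {
    var values = [Float]()
    values.reserveCapacity(n * n * n * 4)
    for b in 0..<n {
        for g in 0..<n {
            for r in 0..<n {
                values.append(Float(r) / Float(n - 1))
                values.append(Float(g) / Float(n - 1))
                values.append(Float(b) / Float(n - 1))
                values.append(1)  // alpha
            }
        }
    }
    return values.withUnsafeBufferPointer { Data(buffer: $0) }
}

func applyLUT(to input: CIImage, cube: Data, dimension: Int) -> CIImage? {
    let filter = CIFilter.colorCube()
    filter.inputImage = input
    filter.cubeDimension = Float(dimension)
    filter.cubeData = cube
    return filter.outputImage
}
```

Swapping LUTs then means swapping one Data blob, with no shader changes.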
Preview Carousel Without Lag
Generate previews in advance: when the screen loads, create a downsampled copy (around 300×300 px) via UIGraphicsImageRenderer, run it through each filter, and cache the results in NSCache. By the time the user scrolls the carousel, the images are already ready. The final full-resolution render starts only after a filter is selected.
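A sketch of that pre-warming step. The `filters` list of (name, CIFilter) pairs and the shared `context` are assumptions standing in for your own filter registry:

```swift
import UIKit
import CoreImage

let previewCache = NSCache<NSString, UIImage>()

// Downsample so each filter runs over ~90k pixels instead of 12M.
func makeThumbnail(_ image: UIImage, side: CGFloat = 300) -> UIImage {
    let scale = side / max(image.size.width, image.size.height)
    let size = CGSize(width: image.size.width * scale,
                      height: image.size.height * scale)
    return UIGraphicsImageRenderer(size: size).image { _ in
        image.draw(in: CGRect(origin: .zero, size: size))
    }
}

func prewarmPreviews(for image: UIImage,
                     filters: [(name: String, filter: CIFilter)],
                     context: CIContext) {
    let thumb = makeThumbnail(image)
    guard let input = CIImage(image: thumb) else { return }
    DispatchQueue.global(qos: .userInitiated).async {
        for (name, filter) in filters {
            filter.setValue(input, forKey: kCIInputImageKey)
            guard let out = filter.outputImage,
                  let cg = context.createCGImage(out, from: input.extent)
            else { continue }
            previewCache.setObject(UIImage(cgImage: cg),
                                   forKey: name as NSString)
        }
    }
}
```

The carousel cells then read previewCache synchronously; a miss just means that filter's preview is still rendering.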
Saving Filter Settings
The user chose "Chrome" with intensity 0.7; when they return to the post, that value should be restored. Save the filter parameters (filterName, intensity), not the rendered image: this lets the user keep editing the photo after saving a draft, with no quality loss.
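A hypothetical draft model for those parameters; the field names are illustrative, and any Codable-friendly store (JSON file, database column) works:

```swift
import Foundation

// Persist what the user chose, not pixels: edits stay non-destructive.
struct FilterSettings: Codable {
    let filterName: String   // e.g. "CIPhotoEffectChrome"
    let intensity: Double    // 0.0...1.0
}

let settings = FilterSettings(filterName: "CIPhotoEffectChrome",
                              intensity: 0.7)
let data = try JSONEncoder().encode(settings)       // store with the draft
let restored = try JSONDecoder().decode(FilterSettings.self, from: data)
```

On restore, rebuild the CIFilter from filterName and re-apply intensity to the untouched original.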
Trigger the final full-resolution render only before publishing or exporting. This keeps the original untouched and lets the user undo the filter at any time.
Timeline
6–10 CIFilter-based filters with carousel preview — 3–4 days. LUT system with custom Metal/OpenGL shaders — 5–7 days.