Implementing Data Caching in Mobile Applications
An app without caching makes a network request every time the user navigates to a screen. On a slow connection, with no connection, or on expensive mobile data, the user sees a spinner or a blank screen. Caching solves this, but implementing it correctly is harder than it seems: you need to decide what to cache, for how long, how to invalidate it, and what to show on error.
Cache Layers
Good cache architecture uses several levels:
- Memory cache — in-process memory, the fastest layer. Data lives until the app restarts. For images: NSCache on iOS (auto-evicts under memory pressure), LruCache on Android.
- Disk cache — data in files or a local DB; survives restarts. For API responses: Room/SQLite with a timestamp; for images: DiskLruCache inside Coil/Glide.
- Network — the source of truth, accessed only when fresh data is needed.
The cache-first, refresh-in-background strategy (stale-while-revalidate) is the most user-friendly: show the cache instantly, fetch an update in parallel, and redraw if the data changed.
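The memory layer from the list above can be as simple as a size-bounded LruCache. A minimal sketch (the `ProductMemoryCache` name, entry count, and the `Product` model are illustrative assumptions):

```kotlin
import android.util.LruCache

// In-memory layer: survives navigation, cleared on process death.
// Sized in entries here; for bitmaps, override sizeOf() and size by bytes.
class ProductMemoryCache(maxEntries: Int = 200) {
    private val cache = LruCache<String, Product>(maxEntries)

    fun get(id: String): Product? = cache.get(id)
    fun put(product: Product) { cache.put(product.id, product) }
    fun clear() = cache.evictAll()
}
```

A repository would consult this layer first, then fall through to disk and network.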
Caching API Responses via Room
import androidx.room.Entity
import androidx.room.PrimaryKey
import java.io.IOException
import kotlinx.coroutines.flow.Flow
import kotlinx.coroutines.flow.flow

// Entity with timestamp for invalidation
@Entity(tableName = "products_cache")
data class ProductCacheEntity(
    @PrimaryKey val id: String,
    val categoryId: String,
    val payload: String,      // JSON string
    val cachedAt: Long,       // Unix timestamp
    val etag: String? = null
)

// Repository: cache-first logic
class ProductRepository(
    private val api: ProductApi,
    private val dao: ProductCacheDao,
    private val cacheMaxAge: Long = 5 * 60 * 1000L // 5 minutes
) {
    fun getProductsByCategory(categoryId: String): Flow<List<Product>> = flow {
        // 1. Emit cache immediately
        val cached = dao.getByCategory(categoryId)
        if (cached.isNotEmpty()) {
            emit(cached.map { it.toProduct() })
        }

        // 2. Check freshness
        val oldestEntry = cached.minOfOrNull { it.cachedAt } ?: 0L
        val needsRefresh = System.currentTimeMillis() - oldestEntry > cacheMaxAge

        if (needsRefresh || cached.isEmpty()) {
            try {
                val fresh = api.getProducts(categoryId)
                dao.upsertAll(fresh.map { it.toCacheEntity(categoryId) })
                emit(fresh)
            } catch (e: IOException) {
                // Network unavailable — cache already emitted, do nothing
                if (cached.isEmpty()) throw e // nothing to show — propagate
            }
        }
    }
}
This pattern is the foundation of offline-first. A UI subscribed to the Flow receives data twice: first from the cache, then fresh from the network.
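The repository relies on a ProductCacheDao, which is not shown above. A plausible Room DAO for it could look like this (method names match the repository's calls; the SQL itself is an assumption; @Upsert requires Room 2.5+, on older versions use @Insert(onConflict = OnConflictStrategy.REPLACE)):

```kotlin
import androidx.room.Dao
import androidx.room.Query
import androidx.room.Upsert

@Dao
interface ProductCacheDao {
    @Query("SELECT * FROM products_cache WHERE categoryId = :categoryId")
    suspend fun getByCategory(categoryId: String): List<ProductCacheEntity>

    @Upsert
    suspend fun upsertAll(entities: List<ProductCacheEntity>)

    // Cleanup helper for TTL-based eviction
    @Query("DELETE FROM products_cache WHERE cachedAt < :threshold")
    suspend fun deleteOlderThan(threshold: Long)
}
```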
ETag and Last-Modified
Instead of purely time-based invalidation, use HTTP cache validators. The server sends ETag: "v42"; the next request includes If-None-Match: "v42", and if the resource is unchanged the server returns 304 Not Modified without a body. This saves traffic.
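The cache entity above already stores an etag, so it can also be used manually when you manage the cache yourself. A sketch assuming Retrofit (the endpoint path, DTO type, and 304 handling are illustrative; Retrofit omits a @Header whose value is null):

```kotlin
import retrofit2.Response
import retrofit2.http.GET
import retrofit2.http.Header
import retrofit2.http.Path

interface ProductApi {
    @GET("categories/{id}/products")
    suspend fun getProducts(
        @Path("id") categoryId: String,
        @Header("If-None-Match") etag: String? = null
    ): Response<List<ProductDto>>
}

// In the repository:
// val response = api.getProducts(categoryId, cachedEtag)
// if (response.code() == 304) { /* cache still valid — just refresh cachedAt */ }
// else { /* parse body, store the new ETag from response.headers()["ETag"] */ }
```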
OkHttp (the de facto standard HTTP client on Android and in React Native) supports HTTP caching out of the box:
val cache = Cache(
    directory = File(context.cacheDir, "http-cache"),
    maxSize = 10L * 1024 * 1024 // 10 MB
)

val client = OkHttpClient.Builder()
    .cache(cache)
    .build()
On iOS, URLSession offers similar behavior through URLCache. Note that the HTTP cache only works if the server sends correct Cache-Control headers: if the backend responds with Cache-Control: no-store, nothing will be cached regardless of client settings.
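When the backend sends no usable Cache-Control headers and you cannot change it, a common client-side workaround is an OkHttp network interceptor that rewrites the response headers. A hedged sketch (the 5-minute max-age is an arbitrary choice; you are overriding what the server declared, so use with care):

```kotlin
import okhttp3.Interceptor
import okhttp3.OkHttpClient

val forceCacheInterceptor = Interceptor { chain ->
    val response = chain.proceed(chain.request())
    response.newBuilder()
        .header("Cache-Control", "public, max-age=300") // cache for 5 minutes
        .removeHeader("Pragma")
        .build()
}

val client = OkHttpClient.Builder()
    .cache(cache) // the Cache instance from the example above
    .addNetworkInterceptor(forceCacheInterceptor)
    .build()
```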
Image Caching
For images, off-the-shelf libraries handle caching better than any custom solution:
- Android: Coil 2.x — Kotlin-first, Compose-ready, memory + disk cache, placeholder/error states
- iOS: SDWebImage or Kingfisher — async loading, NSCache + disk, progressive JPEG
- React Native: react-native-fast-image (wrapper over SDWebImage/Glide)
// Coil in Compose
AsyncImage(
    model = ImageRequest.Builder(context)
        .data(product.imageUrl)
        .memoryCacheKey(product.id)
        .diskCacheKey(product.imageUrl)
        .crossfade(true)
        .build(),
    contentDescription = product.title,
    placeholder = painterResource(R.drawable.placeholder),
    error = painterResource(R.drawable.error_image)
)
Cache Invalidation
This is the hardest part of caching. Common strategies:
- TTL (Time To Live) — cache lives N minutes, then stale. Simple, predictable.
- Event-based — server sends push on data change → invalidate specific cache. Precise, but requires server support.
- Version-based — the server sends a dataVersion with each response; the client compares it to the saved value.
- Pull-to-refresh — the user explicitly requests an update. Always needed as a fallback.
A multi-layer cache implementation with Room, an HTTP cache, and invalidation takes roughly 1–2 weeks depending on the data types involved. Cost is estimated individually.