Smart Home Security Cameras Mobile App Development
Video surveillance is the most technically demanding part of a smart home mobile app. Live video streaming from cameras, motion detection notifications, archive playback, P2P connections to cameras without a dedicated IP: each of these requires non-trivial engineering.
Video Stream: Protocols and Latency
Three main protocols for IP cameras:
RTSP — the standard for most NVRs, DVRs and IP cameras (Hikvision, Dahua, Reolink, Amcrest). Low latency (300–800 ms), but no native browser support, and playback requires raw TCP/UDP sockets. On Flutter: flutter_vlc_player (libVLC under the hood) or media_kit — both support RTSP. On React Native: react-native-vlc-media-player.
HLS (HTTP Live Streaming) — works everywhere, latency 3–15 seconds. A server (FFmpeg, MediaMTX) ingests the camera's RTSP stream and re-serves it as HLS. Ideal for archive playback, acceptable for live monitoring, unusable for two-way communication.
WebRTC — minimal latency (< 500 ms), peer-to-peer when the network allows. Used for intercom and baby monitor scenarios. On Flutter: flutter_webrtc. On React Native: react-native-webrtc. Requires STUN/TURN servers for NAT traversal: coturn self-hosted, or managed TURN from Twilio/Cloudflare.
For most home security apps: WebRTC for live + HLS for archive.
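The HLS leg of this recommendation is usually just an FFmpeg process behind the backend. A minimal sketch of building that restream command, assuming placeholder RTSP URL and output directory (not real endpoints):

```python
# Sketch: restream a camera's RTSP feed as HLS with FFmpeg.
# The RTSP URL and output path below are illustrative placeholders.

def rtsp_to_hls_cmd(rtsp_url: str, out_dir: str, segment_seconds: int = 4) -> list[str]:
    """Build an FFmpeg command that copies the camera stream into HLS segments."""
    return [
        "ffmpeg",
        "-rtsp_transport", "tcp",           # TCP is more robust than UDP on flaky links
        "-i", rtsp_url,
        "-c", "copy",                       # no re-encoding: low CPU load
        "-f", "hls",
        "-hls_time", str(segment_seconds),  # segment length drives end-to-end latency
        "-hls_list_size", "5",              # keep a short live playlist window
        "-hls_flags", "delete_segments",    # rotate old segments on disk
        f"{out_dir}/live.m3u8",
    ]

cmd = rtsp_to_hls_cmd("rtsp://192.168.1.10:554/stream1", "/var/www/hls")
```

Shorter segments bring HLS latency down toward its floor, at the cost of more playlist churn; MediaMTX does the same job without hand-rolled FFmpeg invocations.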
P2P and NAT Traversal
User watches home camera while roaming abroad. Camera is behind home router without public IP. Options:
UPnP/Port forwarding — the user has to configure their own router. Unrealistic for consumer products.
TURN relay — all traffic goes through a relay server. Always works, but the server costs money and adds latency.
Hole punching (ICE via STUN) — a direct P2P connection established by exchanging external IP/port pairs. Works in roughly 75–85% of cases; fails behind symmetric NAT, where a TURN fallback is needed. coturn doubles as a STUN server: free, deployable on any VPS.
Tunnels (WireGuard, ZeroTier, Tailscale) — home devices join a VPN mesh, and the mobile app connects to the same mesh. Tailscale ships official mobile clients. The most reliable option, but requires router/server setup.
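The practical fallback order above (direct LAN, hole-punched, TURN relay) mirrors ICE candidate types. A minimal sketch of that selection logic, where `connect_via` is a hypothetical transport probe, not a real library call:

```python
# Sketch of the NAT-traversal fallback order: try direct candidates
# first, fall back to a TURN relay. Candidate type names follow ICE;
# connect_via() is a hypothetical probe that attempts a connection.

from typing import Callable, Optional

ICE_PRIORITY = ["host", "srflx", "relay"]  # LAN, hole-punched (STUN), TURN relay


def pick_route(candidates: dict[str, str],
               connect_via: Callable[[str], bool]) -> Optional[str]:
    """Return the first candidate type that actually connects, cheapest first."""
    for kind in ICE_PRIORITY:
        addr = candidates.get(kind)
        if addr and connect_via(addr):
            return kind
    return None
```

In a real app the WebRTC stack (flutter_webrtc / react-native-webrtc) runs this negotiation internally; the sketch only shows why a TURN server must still be provisioned even when hole punching usually succeeds.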
Real project: an apartment intercom camera on a Raspberry Pi + WebRTC + coturn. Hole punching worked in 80% of cases; TURN relay covered the rest. Average video latency in P2P mode was 180–250 ms, sufficient for an intercom.
Motion Detection Notifications
Camera detects motion → notification to phone → user watches recording.
Detection at the camera level: most IP cameras fire an HTTP webhook or publish an MQTT message on trigger. The backend receives it and sends an FCM/APNs push with content-available: 1 (an iOS silent push for background processing).
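A sketch of the backend half of that chain: turning a camera webhook into an FCM HTTP v1 message body. The message field names follow the FCM v1 schema; `camera_id` and `snapshot_url` are assumed to come from a hypothetical webhook payload:

```python
# Sketch: map a camera motion webhook to an FCM HTTP v1 message dict.
# device_token / camera_id / snapshot_url are illustrative inputs.

def motion_to_fcm(device_token: str, camera_id: str, snapshot_url: str) -> dict:
    return {
        "message": {
            "token": device_token,
            # data-only payload; the client decides how to render it
            "data": {
                "event": "motion",
                "camera_id": camera_id,
                "snapshot_url": snapshot_url,
            },
            # iOS: silent push so the app can process in the background
            "apns": {
                "payload": {"aps": {"content-available": 1}},
                "headers": {
                    "apns-push-type": "background",
                    "apns-priority": "5",  # APNs requires priority 5 for background pushes
                },
            },
        }
    }
```

The dict is then POSTed to the FCM v1 `messages:send` endpoint with an OAuth2 service-account token.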
On iOS: a UNNotificationServiceExtension adds a preview image to the notification. The camera includes a snapshot URL in the webhook; the extension downloads it and attaches it, so the user sees a picture right in the notification shade.
On Android: a BigPictureStyle notification via Firebase Messaging. Download the snapshot, then attach the bitmap with NotificationCompat.BigPictureStyle.
Motion detection via ML on mobile client (background frame processing) — kills battery. Don't do this on phone.
Recording and Archive
Local recording to the camera's SD card, with archive playback through the app. Use the NVR API (Hikvision ISAPI, Blue Iris API) for recording navigation.
Archive timeline — a horizontal time strip with motion markers. On Flutter: a CustomPainter calling Canvas.drawRect for each segment. Pinch-to-zoom changes the time scale.
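The timeline math is the same regardless of UI framework: map motion segments (unix seconds) to x-pixel rectangles at a given zoom. A minimal sketch, with illustrative numbers:

```python
# Sketch of the timeline geometry: convert (start, end) motion segments
# in unix seconds into (x, width) pixel rects for a given zoom level.
# The same formula feeds Canvas.drawRect in a Flutter CustomPainter.

def segment_rects(segments: list[tuple[int, int]], window_start: int,
                  px_per_sec: float) -> list[tuple[float, float]]:
    """Return (x, width) in pixels for each motion segment."""
    return [((s - window_start) * px_per_sec, (e - s) * px_per_sec)
            for s, e in segments]

# Pinch-to-zoom simply rescales px_per_sec:
rects = segment_rects([(100, 160), (300, 330)], window_start=0, px_per_sec=0.5)
# → [(50.0, 30.0), (150.0, 15.0)]
```

Keeping the painter a pure function of `(segments, window_start, px_per_sec)` makes zoom and pan repaints trivial to test.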
Cloud storage: upload motion segments to S3/GCS. The server cuts segments with FFmpeg, triggered by webhook signals. Retain 7–30 days via a TTL/lifecycle policy.
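The webhook-triggered cut can again be an FFmpeg command built server-side. A sketch, assuming a recorded source file and illustrative pre/post-roll values:

```python
# Sketch: cut a clip around a motion event from a recorded source,
# ready for upload to S3/GCS. pre_roll/post_roll are illustrative.

def motion_clip_cmd(source: str, event_ts: float, out_path: str,
                    pre_roll: float = 5.0, post_roll: float = 10.0) -> list[str]:
    """Build an FFmpeg command extracting [event - pre_roll, event + post_roll]."""
    start = max(event_ts - pre_roll, 0.0)
    duration = pre_roll + post_roll
    return [
        "ffmpeg",
        "-ss", f"{start:.1f}",   # seek before -i: fast keyframe-based seek
        "-i", source,
        "-t", f"{duration:.1f}",
        "-c", "copy",            # no re-encode; cut lands on nearest keyframes
        out_path,
    ]

cmd = motion_clip_cmd("/recordings/cam1.mp4", event_ts=60.0, out_path="/tmp/clip.mp4")
```

With `-c copy` the cut snaps to keyframes, which is acceptable for motion clips; re-encoding gives frame-exact cuts at a CPU cost.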
Common Mistakes
Using Image.network to refresh MJPEG snapshots every second — each update recreates the widget, causing UI flicker. Use a custom ImageStream with caching, or a proper MJPEG player.
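The core of a "proper MJPEG player" is frame extraction from the byte stream rather than whole-image reloads. A minimal sketch that scans for JPEG start/end markers (framework-agnostic; the same logic ports to Dart):

```python
# Sketch: extract complete JPEG frames from an MJPEG byte stream by
# scanning for the JPEG SOI (FF D8) and EOI (FF D9) markers. Within
# entropy-coded JPEG data, 0xFF is byte-stuffed, so FF D9 reliably
# marks the end of a frame.

def extract_frames(buf: bytes) -> tuple[list[bytes], bytes]:
    """Return (complete JPEG frames, leftover bytes to keep for next read)."""
    frames = []
    while True:
        start = buf.find(b"\xff\xd8")           # SOI: start of image
        if start < 0:
            return frames, b""
        end = buf.find(b"\xff\xd9", start + 2)  # EOI: end of image
        if end < 0:
            return frames, buf[start:]          # incomplete frame: keep tail
        frames.append(buf[start:end + 2])
        buf = buf[end + 2:]
```

Feeding each extracted frame into a single reused image widget avoids the recreate-and-flicker cycle entirely.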
Not thinking about background mode: on iOS PiP (Picture-in-Picture) for video stream requires AVPlayerViewController or explicit AVFoundation support. flutter_vlc_player supports PiP starting from version 7.x.
Timeline
Single camera, RTSP/HLS viewing, motion notifications — 5–7 weeks. Multi-camera view, P2P/WebRTC, archive with timeline, two-way communication — 3–5 months. Cost depends on camera types and cloud storage requirements.