Screen Broadcasting from Mobile Device
Screen broadcasting on iOS works only through ReplayKit. Capturing UIScreen directly is a dead end: UIScreen.main.snapshotView(afterScreenUpdates:) returns a one-off view snapshot, not a frame stream. ReplayKit is the only public Apple API for real-time screen capture.
Two ReplayKit Modes and When to Use What
RPScreenRecorder (in-app recording). Captures only the app's own content; the user sees no system picker. Suits gameplay recording and app UI capture. Downside: capture stops as soon as the app is backgrounded.
RPBroadcastSampleHandler (broadcast upload extension). Runs as a system extension in a process separate from the main app. Captures the entire device screen, including other apps and notifications. The user starts it via Control Center or RPSystemBroadcastPickerView. This is the mode needed for whole-screen broadcasting.
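From the main app, the system picker is usually embedded with RPSystemBroadcastPickerView. A minimal sketch (the extension bundle identifier is a placeholder):

```swift
import ReplayKit
import UIKit

// Embed the system broadcast picker in the main app's UI.
// preferredExtension pins the picker to our own extension so the
// user doesn't pick from a list of all installed broadcast apps.
// "com.yourapp.broadcast-extension" is a placeholder bundle ID.
func makeBroadcastPicker() -> RPSystemBroadcastPickerView {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
    picker.preferredExtension = "com.yourapp.broadcast-extension"
    picker.showsMicrophoneButton = true  // let the user toggle mic audio
    return picker
}
```

The picker renders as a single record button; tapping it shows Apple's own start/stop sheet, which cannot be bypassed programmatically.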
Broadcast Extension architecture:
iOS System → RPBroadcastSampleHandler (Extension process)
↓
CMSampleBuffer (video + audio)
↓
App Group shared container (if communicating with main app)
↓
RTMP/SRT → broadcast server
The extension has no UI and is limited to roughly 50 MB of RAM. That is a hard limit: all encoding and network sending must fit in this budget.
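The remaining headroom can be checked from inside the extension with os_proc_available_memory (iOS 13+), so frames can be dropped before the system kills the process. A sketch (the 10 MB threshold is an assumption, not an Apple recommendation):

```swift
import os

// Inside the extension: os_proc_available_memory() returns how many
// bytes remain before the process hits its memory limit (~50 MB for
// a broadcast upload extension).
func hasHeadroom(for bytesNeeded: Int) -> Bool {
    os_proc_available_memory() > bytesNeeded
}

// Example policy: drop a frame instead of encoding it when fewer
// than ~10 MB remain, rather than let the system terminate us.
let shouldDropFrame = !hasHeadroom(for: 10 * 1024 * 1024)
```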
RPBroadcastSampleHandler: Implementation
import ReplayKit

class BroadcastHandler: RPBroadcastSampleHandler {
    // RTMPStream stands in for your RTMP/SRT client type
    // (e.g. HaishinKit); the exact API varies by library.
    private var rtmpStream: RTMPStream?

    override func broadcastStarted(withSetupInfo setupInfo: [String: NSObject]?) {
        // Initialize the RTMP or SRT connection here.
        // setupInfo carries data declared by the main app in Info.plist.
    }

    override func processSampleBuffer(_ sampleBuffer: CMSampleBuffer,
                                      with sampleBufferType: RPSampleBufferType) {
        switch sampleBufferType {
        case .video:
            rtmpStream?.append(sampleBuffer)
        case .audioApp:
            rtmpStream?.append(sampleBuffer) // app audio
        case .audioMic:
            // Microphone audio arrives as a separate stream and
            // requires explicit user permission.
            break
        @unknown default:
            break
        }
    }

    override func broadcastFinished() {
        rtmpStream?.close()
    }
}
Passing parameters (stream key, endpoint) from the main app to the extension goes through App Group UserDefaults:

// Main app: write
let defaults = UserDefaults(suiteName: "group.com.yourapp.broadcast")
defaults?.set(streamKey, forKey: "streamKey")
// Extension: read, e.g. in broadcastStarted
let streamKey = defaults?.string(forKey: "streamKey")
Latency and Bandwidth
A ReplayKit screen broadcast adds roughly 2–5 seconds of buffering. This is not a bug: Apple buffers frames partly for privacy (for example, hiding password dialogs), and the delay cannot be reduced from the app side.
Resolution depends on the device model: modern iPhones deliver frames at native scale (for example, 1170×2532 on iPhone 13), far more than a stream needs. In processSampleBuffer the frame must be downscaled before encoding, via VTPixelTransferSession or CIContext:

// Scale to 1280×720 before sending to the encoder (pseudocode;
// VideoToolbox exposes this as VTPixelTransferSessionTransferImage)
let scaledBuffer = pixelTransferSession.scale(pixelBuffer, to: CGSize(width: 1280, height: 720))
Without the downscale, the extension blows through its 50 MB RAM budget almost as soon as the encoder starts working on native-resolution frames.
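A fuller sketch of that downscale path, assuming iOS 16+ (where VTPixelTransferSession became available on iOS; earlier systems typically fall back to CIContext). The pixel format and pool settings are assumptions; the source CVPixelBuffer comes from CMSampleBufferGetImageBuffer on the captured sample:

```swift
import VideoToolbox
import CoreVideo

// Downscales captured frames to 1280×720 using VTPixelTransferSession,
// reusing destination buffers from a CVPixelBufferPool to stay inside
// the extension's memory budget.
final class FrameDownscaler {
    private var session: VTPixelTransferSession?
    private var pool: CVPixelBufferPool?

    init?(width: Int = 1280, height: Int = 720) {
        VTPixelTransferSessionCreate(allocator: kCFAllocatorDefault,
                                     pixelTransferSessionOut: &session)
        let attrs: [CFString: Any] = [
            kCVPixelBufferWidthKey: width,
            kCVPixelBufferHeightKey: height,
            kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ]
        CVPixelBufferPoolCreate(kCFAllocatorDefault, nil, attrs as CFDictionary, &pool)
        if session == nil || pool == nil { return nil }
    }

    func scale(_ source: CVPixelBuffer) -> CVPixelBuffer? {
        guard let session, let pool else { return nil }
        var dest: CVPixelBuffer?
        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &dest)
        guard let dest else { return nil }
        // Scales (and converts pixel format if needed) source into dest.
        let status = VTPixelTransferSessionTransferImage(session, from: source, to: dest)
        return status == noErr ? dest : nil
    }
}
```

Pooling the destination buffers matters here: allocating a fresh CVPixelBuffer per frame fragments the small heap the extension is allowed.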
Android: MediaProjection API
On Android the equivalent is MediaProjection. The user confirms via a system dialog (startActivityForResult with MediaProjectionManager.createScreenCaptureIntent()). After obtaining a MediaProjection, create a VirtualDisplay and point it at MediaCodec through a Surface:
val virtualDisplay = mediaProjection.createVirtualDisplay(
    "ScreenCapture",
    width, height, dpi,
    DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
    mediaCodec.createInputSurface(), null, null
)
Unlike iOS, there is no ReplayKit-style buffering delay: capture is near real-time. But Android 14+ adds a requirement: a MediaProjection session must run inside a foreground service declared with type FOREGROUND_SERVICE_TYPE_MEDIA_PROJECTION.
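In practice that declaration lives in the manifest. A minimal fragment (the service class name is a placeholder):

```xml
<!-- AndroidManifest.xml -->
<uses-permission android:name="android.permission.FOREGROUND_SERVICE" />
<uses-permission android:name="android.permission.FOREGROUND_SERVICE_MEDIA_PROJECTION" />

<application>
    <!-- ScreenCaptureService is a placeholder name -->
    <service
        android:name=".ScreenCaptureService"
        android:foregroundServiceType="mediaProjection" />
</application>
```

On Android 14+, calling mediaProjection.createVirtualDisplay before startForeground() with this type throws a SecurityException.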
User Notification and Watchdog Timer
On iOS the extension can run indefinitely, but exceeding 50 MB kills the process silently, with no callback to the main app. Monitor it with a heartbeat through the App Group: the extension writes a timestamp every 5 seconds and the app checks it. If the timestamp hasn't updated for 15 seconds, treat the extension as crashed and show a warning.
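A sketch of that heartbeat (the App Group suite name and key are placeholders):

```swift
import Foundation

let suite = "group.com.yourapp.broadcast"  // placeholder App Group ID
let heartbeatKey = "extensionHeartbeat"

// Extension side: call from a 5-second Timer (or from
// processSampleBuffer, throttled) to publish a liveness timestamp.
func writeHeartbeat() {
    UserDefaults(suiteName: suite)?.set(Date().timeIntervalSince1970,
                                        forKey: heartbeatKey)
}

// App side: poll this; if the last heartbeat is older than 15 s,
// treat the extension as dead and surface a warning to the user.
func extensionLooksAlive(timeout: TimeInterval = 15) -> Bool {
    guard let last = UserDefaults(suiteName: suite)?
        .object(forKey: heartbeatKey) as? TimeInterval else { return false }
    return Date().timeIntervalSince1970 - last < timeout
}
```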
Timeline
An iOS Broadcast Extension with RTMP/SRT, downscaling, and App Group communication: 2–3 weeks. Android MediaProjection: 1.5–2 weeks. Cross-platform with shared management code: 4–5 weeks. Cost is estimated individually.