Mobile Application Development for Robot Control
Controlling a robot from a phone is almost always a real-time task: latency under 100 ms and reliable handling of disconnections. The communication protocol, command architecture and reconnection scheme matter more than the UI. Below are the specific transports and patterns that work in production.
Transport and Protocols
ROS 2 + rosbridge (WebSocket)
The most common stack for ROS robots: rosbridge_server runs on the robot, the phone connects via WebSocket and publishes/subscribes to topics using the JSON-based rosbridge protocol. roslibjs-compatible wrappers exist for mobile clients, but for native Android/iOS it is usually simpler to write the client yourself — 300–400 lines of code including reconnection and a message queue.
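A minimal sketch of the rosbridge JSON frames such a hand-written client would send for /cmd_vel. The envelope shape follows the rosbridge v2 protocol; the function names and the choice of the ROS 2 type string `geometry_msgs/msg/Twist` are assumptions here, and the actual WebSocket transport (OkHttp or similar) is left out:

```kotlin
// One-time "advertise" frame announcing the topic and message type.
fun advertiseCmdVel(): String =
    """{"op":"advertise","topic":"/cmd_vel","type":"geometry_msgs/msg/Twist"}"""

// "publish" frame carrying linear.x and angular.z velocities.
fun publishCmdVel(linear: Double, angular: Double): String =
    """{"op":"publish","topic":"/cmd_vel","msg":""" +
    """{"linear":{"x":$linear,"y":0.0,"z":0.0},""" +
    """"angular":{"x":0.0,"y":0.0,"z":$angular}}}"""
```

Each string is sent as a single WebSocket text frame; advertise once after connecting, then publish on every joystick update.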
Problem: rosbridge is JSON over WebSocket, which is slow for high-frequency topics. /cmd_vel with Twist commands at 20 Hz over JSON is ~40 kB/s of traffic. If the robot sits behind Wi-Fi with a real throughput of 1 Mbit/s, that is fine. Over LTE with jitter, packets arrive in bursts and commands execute with delay. Solution: reduce the command rate to 10 Hz and add a watchdog: if no command arrives from the phone for 500 ms, the robot transitions to safe_stop.
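The watchdog logic itself is tiny. A sketch, in Kotlin only for illustration — on a real robot this lives in the controller node; the 500 ms default and the injectable clock are assumptions made for testability:

```kotlin
// Trips to safe_stop when no command arrives within the timeout.
class CommandWatchdog(
    private val timeoutMs: Long = 500,
    private val now: () -> Long = System::currentTimeMillis
) {
    private var lastCommandAt: Long = now()

    // Call on every velocity command received from the phone.
    fun feed() { lastCommandAt = now() }

    // True when the timeout has elapsed since the last command.
    fun shouldSafeStop(): Boolean = now() - lastCommandAt > timeoutMs
}
```

The control loop calls shouldSafeStop() on every tick and zeroes the velocity targets when it returns true.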
MQTT for Lightweight Control
For IoT robots without ROS (AGVs, sorters, custom platforms): an MQTT broker (Mosquitto or EMQ X) plus a lightweight protocol. The phone publishes to robot/{id}/cmd; the robot is subscribed. Telemetry flows back on robot/{id}/state and robot/{id}/battery.
Use QoS 1 (at least once) for control commands — QoS 0 loses packets on unstable Wi-Fi, and QoS 2 adds extra round-trips. A retained message on robot/{id}/state lets a newly connected client get the current state immediately instead of waiting for the next update.
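A sketch of this topic scheme. The topic builders are pure functions; the publish calls in the comment assume the Eclipse Paho client and are illustrative, not the only option:

```kotlin
// Topic scheme from the text: robot/{id}/cmd, robot/{id}/state, robot/{id}/battery.
object RobotTopics {
    fun cmd(id: String) = "robot/$id/cmd"
    fun state(id: String) = "robot/$id/state"
    fun battery(id: String) = "robot/$id/battery"
}

// With Eclipse Paho the publishes would look roughly like:
//   client.publish(RobotTopics.cmd("agv-7"),
//                  MqttMessage(payload).apply { qos = 1 })          // QoS 1 for commands
//   client.publish(RobotTopics.state("agv-7"),
//                  MqttMessage(state).apply { qos = 1; isRetained = true })  // retained state
```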
UDP for Real-time (< 50 ms)
For minimal latency: a direct UDP socket to the robot's control port. On Android, a DatagramSocket in CoroutineScope(Dispatchers.IO), sending every 50 ms. No delivery guarantee — and that is a plus: a stale command never blocks a fresh one in a queue. Used for manipulator control, where command latency matters more than guaranteed delivery.
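A sketch of such a sender. The 8-byte frame layout (two big-endian floats: linear, angular) and the port are assumptions — match them to whatever your robot's control firmware expects:

```kotlin
import java.net.DatagramPacket
import java.net.DatagramSocket
import java.net.InetAddress
import java.nio.ByteBuffer

// Packs a velocity command into a fixed 8-byte frame (big-endian floats).
fun packVelocity(linear: Float, angular: Float): ByteArray =
    ByteBuffer.allocate(8).putFloat(linear).putFloat(angular).array()

// Fire-and-forget send; on Android call this from Dispatchers.IO every 50 ms.
fun sendVelocity(
    socket: DatagramSocket, host: InetAddress, port: Int,
    linear: Float, angular: Float
) {
    val bytes = packVelocity(linear, angular)
    socket.send(DatagramPacket(bytes, bytes.size, host, port))
}
```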
Mobile Client Architecture
class RobotControlViewModel(
    private val robotRepository: RobotRepository
) : ViewModel() {

    private val _robotState = MutableStateFlow<RobotState>(RobotState.Disconnected)
    val robotState: StateFlow<RobotState> = _robotState

    fun sendVelocityCommand(linear: Float, angular: Float) {
        viewModelScope.launch {
            robotRepository.publishVelocity(
                TwistCommand(linear = linear, angular = angular)
            )
        }
    }

    fun connect(robotIp: String) {
        // connect() returns a Flow of connection states; collect it in
        // viewModelScope directly — wrapping launchIn in an extra launch is redundant.
        robotRepository.connect(robotIp)
            .onEach { state -> _robotState.value = state }
            .launchIn(viewModelScope)
    }
}
RobotRepository encapsulates the specific transport — WebSocket, MQTT or UDP — so a transport change doesn't touch the ViewModel or the UI. This matters: in real projects the hardware or the protocol often changes between prototype and production.
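The RobotState the ViewModel exposes can be a small sealed hierarchy. The exact variants below are an assumption — extend them with whatever your transport actually reports; the sealed class makes the when exhaustive, so the compiler flags any UI code that forgets a new state:

```kotlin
sealed class RobotState {
    object Disconnected : RobotState()
    object Connecting : RobotState()
    data class Connected(val robotIp: String) : RobotState()
    data class Error(val reason: String) : RobotState()
}

// Exhaustive mapping to a UI label — adding a state breaks the build until handled.
fun statusLabel(state: RobotState): String = when (state) {
    RobotState.Disconnected -> "Disconnected"
    RobotState.Connecting -> "Connecting…"
    is RobotState.Connected -> "Connected to ${state.robotIp}"
    is RobotState.Error -> "Error: ${state.reason}"
}
```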
Virtual Joystick and Input Handling
MotionEvent.ACTION_MOVE arrives up to 60 times per second during fast finger movement. Sending a command per event overloads the channel. Throttle the flow to the latest value per 50 ms window — e.g. Flow's sample(50), or a custom throttleLatest operator — so intermediate values are discarded and the added latency is at most 50 ms.
Dead zone in the joystick center: filter 10–15% of the radius to zero. Without it, hand micro-tremor produces constant low-speed commands and the robot "jerks" at rest.
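A sketch of that radial dead zone. Inputs within `deadZone` of center map to zero, and the remaining range is rescaled so full deflection still reaches 1.0; the 12% default is an assumption within the 10–15% band above:

```kotlin
import kotlin.math.hypot

fun applyDeadZone(x: Float, y: Float, deadZone: Float = 0.12f): Pair<Float, Float> {
    val r = hypot(x, y)
    if (r < deadZone) return 0f to 0f
    // Rescale magnitude from [deadZone, 1] to [0, 1], keeping direction.
    val scaled = ((r - deadZone) / (1f - deadZone)).coerceAtMost(1f)
    return (x / r * scaled) to (y / r * scaled)
}
```

Without the rescaling step the stick would jump from zero straight to 12% speed at the dead-zone edge.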
Video Stream from Robot Camera
For H.264 streams from the robot's IP camera, ExoPlayer (Android) or AVPlayer with HLS (iOS) are the obvious choices, but HLS latency is 5–30 seconds — unacceptable for control. The correct path: WebRTC (latency < 200 ms) or RTSP via LibVLC / an ffmpeg-to-MediaCodec pipeline (latency 100–300 ms).
WebRTC is more complex to set up (it needs a STUN/TURN server, or P2P on the same network), but gives the lowest latency and adaptive bitrate.
Safety and Emergency Scenarios
- Watchdog on the robot: no command for 1–2 seconds → safe_stop or transition to autonomous position hold
- PIN/QR authentication when connecting to a specific robot (prevents controlling the wrong one)
- Command lock while the protective cover is open (via MQTT status or GPIO)
Timeframes
A basic client with joystick, telemetry and video stream on one platform: 3–5 weeks. A cross-platform solution with multiple protocols, map building and autonomous missions: 2–4 months. Estimate only after studying the robot platform and the latency requirements.