Developing a Mobile App for Robot Control
Controlling a robot from a mobile device splits into several fundamentally different tasks: real-time teleoperation (a joystick for a wheeled platform or manipulator), state monitoring (battery, temperature, task status), mission programming (waypoints, pick-and-place sequences), and video streaming. Each has different latency, reliability, and technology-stack requirements.
ROS 2 and rosbridge: Standard in Research and Industrial Robotics
ROS 2 (Robot Operating System 2) is the de facto standard in academia and increasingly in industrial robotics (Universal Robots UR series, MiR mobile platforms, Boston Dynamics Spot SDK). rosbridge_suite exposes a WebSocket API for interacting with ROS topics, services, and parameters from any language.
import 'dart:async';
import 'dart:convert';

import 'package:web_socket_channel/web_socket_channel.dart';

class RosbridgeClient {
  late WebSocketChannel _channel;
  final _topicStreams = <String, StreamController<Map<String, dynamic>>>{};
  int _opId = 0;

  Future<void> connect(String url) async {
    _channel = WebSocketChannel.connect(Uri.parse(url));
    _channel.stream.listen(_handleMessage);
  }

  // Subscribe to a ROS topic; incoming JSON is decoded with fromJson
  Stream<T> subscribe<T>(String topic, String type,
      T Function(Map<String, dynamic>) fromJson) {
    final controller = _topicStreams.putIfAbsent(
        topic, () => StreamController<Map<String, dynamic>>.broadcast());
    _channel.sink.add(jsonEncode({
      'op': 'subscribe',
      'topic': topic,
      'type': type,
      'id': 'sub_${_opId++}',
    }));
    return controller.stream.map(fromJson);
  }

  // Publish to a ROS topic
  void publish(String topic, String type, Map<String, dynamic> message) {
    _channel.sink.add(jsonEncode({
      'op': 'publish',
      'topic': topic,
      'type': type,
      'msg': message,
    }));
  }

  void _handleMessage(dynamic raw) {
    final msg = jsonDecode(raw as String) as Map<String, dynamic>;
    if (msg['op'] == 'publish') {
      final topic = msg['topic'] as String;
      _topicStreams[topic]?.add(msg['msg'] as Map<String, dynamic>);
    }
  }
}
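A minimal usage sketch of the client above: the topic name, rosbridge address, and message shape here are assumptions (the field layout follows sensor_msgs/BatteryState, where percentage is a 0..1 float):

```dart
Future<void> monitorBattery() async {
  final ros = RosbridgeClient();
  await ros.connect('ws://192.168.1.42:9090'); // rosbridge default port is 9090

  // Decode only the field we care about from the JSON payload
  final battery = ros.subscribe<double>(
    '/battery_state',
    'sensor_msgs/BatteryState',
    (msg) => (msg['percentage'] as num).toDouble(),
  );
  battery.listen((pct) => print('Battery: ${(pct * 100).toStringAsFixed(0)}%'));
}
```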
Teleoperation: Joystick and Latency
Controlling a wheeled platform with a virtual joystick means publishing geometry_msgs/Twist to the /cmd_vel topic. The critical requirement is latency: delays above 200 ms make control uncomfortable, and above 500 ms they become dangerous.
import 'dart:async';
import 'dart:ui';

const double maxLinearSpeed = 0.5;  // m/s
const double maxAngularSpeed = 1.0; // rad/s

class TeleopController {
  TeleopController(this._rosbridge);
  final RosbridgeClient _rosbridge;
  Timer? _publishTimer;
  double _currentLinear = 0.0;
  double _currentAngular = 0.0;

  void startTeleop(Stream<Offset> joystickInput) {
    joystickInput.listen((offset) {
      _currentLinear = offset.dy * maxLinearSpeed;    // m/s
      _currentAngular = -offset.dx * maxAngularSpeed; // rad/s
    });
    // Publish Twist at a fixed 10 Hz.
    // Don't tie publishing to joystick events: the ROS controller
    // expects a consistent command frequency.
    _publishTimer = Timer.periodic(const Duration(milliseconds: 100), (_) {
      _rosbridge.publish('/cmd_vel', 'geometry_msgs/Twist', {
        'linear': {'x': _currentLinear, 'y': 0.0, 'z': 0.0},
        'angular': {'x': 0.0, 'y': 0.0, 'z': _currentAngular},
      });
    });
  }

  void stopTeleop() {
    _publishTimer?.cancel();
    // Explicit stop: safety first
    _rosbridge.publish('/cmd_vel', 'geometry_msgs/Twist', {
      'linear': {'x': 0.0, 'y': 0.0, 'z': 0.0},
      'angular': {'x': 0.0, 'y': 0.0, 'z': 0.0},
    });
  }
}
A watchdog on the robot side is standard practice for mobile platforms: if no /cmd_vel message arrives for 0.5 seconds, the robot performs an emergency stop. A dropped WiFi link or a network switch then means the robot stops itself.
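The app side can mirror this with a dead-man's switch: stop publishing and send an explicit zero Twist whenever the app leaves the foreground. A sketch, assuming the TeleopController above (the class name TeleopLifecycleGuard is hypothetical):

```dart
import 'package:flutter/widgets.dart';

/// Hypothetical helper: releases the robot whenever the app is backgrounded,
/// complementing the robot-side /cmd_vel watchdog.
class TeleopLifecycleGuard with WidgetsBindingObserver {
  TeleopLifecycleGuard(this._teleop) {
    WidgetsBinding.instance.addObserver(this);
  }
  final TeleopController _teleop;

  @override
  void didChangeAppLifecycleState(AppLifecycleState state) {
    if (state != AppLifecycleState.resumed) {
      _teleop.stopTeleop(); // cancels the 10 Hz timer and publishes zero Twist
    }
  }
}
```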
Video Streaming: web_video_server or WebRTC
web_video_server is a ROS package that streams sensor_msgs/Image or sensor_msgs/CompressedImage topics over HTTP as MJPEG or H.264. For a Flutter mobile client:
// MJPEG from web_video_server
Widget buildCameraView(String robotIp, String topic) {
  final url = 'http://$robotIp:8080/stream?topic=$topic&type=mjpeg&quality=70';
  return MjpegStreamView(url: url); // custom widget with an MJPEG decoder
}
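The decoder inside that custom widget can be sketched as follows, assuming package:http for the streamed request: read the multipart HTTP stream and emit one JPEG frame at a time by scanning for the JPEG start (FF D8) and end (FF D9) markers. Boundary parsing and error handling are omitted; each emitted frame can be handed to Image.memory.

```dart
import 'dart:typed_data';
import 'package:http/http.dart' as http;

/// Sketch of an MJPEG frame splitter over a streamed HTTP response.
Stream<Uint8List> mjpegFrames(String url) async* {
  final client = http.Client();
  final response = await client.send(http.Request('GET', Uri.parse(url)));
  final buffer = <int>[];
  await for (final chunk in response.stream) {
    buffer.addAll(chunk);
    final start = _indexOfMarker(buffer, 0xD8); // SOI
    final end = _indexOfMarker(buffer, 0xD9);   // EOI
    if (start != -1 && end > start) {
      yield Uint8List.fromList(buffer.sublist(start, end + 2));
      buffer.removeRange(0, end + 2);
    }
  }
}

int _indexOfMarker(List<int> data, int second) {
  for (var i = 0; i < data.length - 1; i++) {
    if (data[i] == 0xFF && data[i + 1] == second) return i;
  }
  return -1;
}
```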
For latency below 200 ms, use WebRTC: the webrtc_ros package or Janus Gateway. WebRTC delivers real-time video with sub-100 ms delay via flutter_webrtc:
import 'package:flutter_webrtc/flutter_webrtc.dart';

class RobotVideoCall {
  late RTCPeerConnection _peerConnection;
  final RTCVideoRenderer _videoRenderer = RTCVideoRenderer();

  Future<void> startStream() async {
    await _videoRenderer.initialize();
    _peerConnection = await createPeerConnection({
      'iceServers': [{'urls': 'stun:stun.l.google.com:19302'}],
    });
    _peerConnection.onTrack = (RTCTrackEvent event) {
      if (event.track.kind == 'video') {
        _videoRenderer.srcObject = event.streams.first;
      }
    };
    // Signaling via rosbridge or a separate WebSocket;
    // _signalingChannel is an app-specific transport
    final offer = await _peerConnection.createOffer();
    await _peerConnection.setLocalDescription(offer);
    _signalingChannel.send(offer.sdp);
  }
}
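The other half of the handshake is applying the remote answer. A sketch, assuming the signaling transport delivers the raw answer SDP string back to the app (the function name is hypothetical; RTCSessionDescription is the flutter_webrtc type):

```dart
import 'package:flutter_webrtc/flutter_webrtc.dart';

/// Hypothetical completion of the handshake: the robot (or gateway) answers
/// the offer, and the answer SDP arrives back over the signaling channel.
Future<void> onAnswerReceived(RTCPeerConnection pc, String answerSdp) async {
  await pc.setRemoteDescription(RTCSessionDescription(answerSdp, 'answer'));
}
```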
Navigation and Waypoints
For autonomous navigation (ROS Navigation Stack, Nav2), send a goal pose as geometry_msgs/PoseStamped: to /move_base_simple/goal in ROS 1, or in ROS 2 Nav2 to the /goal_pose topic (or the /navigate_to_pose action server when you need feedback and cancellation):
import 'dart:math';

Future<void> navigateTo(double x, double y, double yaw) async {
  final quaternion = yawToQuaternion(yaw);
  final nowMs = DateTime.now().millisecondsSinceEpoch;
  // ROS 2 Nav2: publish the goal pose topic via rosbridge
  _rosbridge.publish('/goal_pose', 'geometry_msgs/PoseStamped', {
    'header': {
      'frame_id': 'map',
      'stamp': {'sec': nowMs ~/ 1000, 'nanosec': (nowMs % 1000) * 1000000},
    },
    'pose': {
      'position': {'x': x, 'y': y, 'z': 0.0},
      'orientation': quaternion,
    },
  });
}

// Planar rotation: only yaw is non-zero, so the quaternion
// reduces to (0, 0, sin(yaw/2), cos(yaw/2))
Map<String, double> yawToQuaternion(double yaw) {
  return {
    'x': 0.0, 'y': 0.0,
    'z': sin(yaw / 2),
    'w': cos(yaw / 2),
  };
}
Map display uses nav_msgs/OccupancyGrid from the /map topic. The raster map (an int8 array: 0 = free, 100 = occupied, -1 = unknown) is converted to a bitmap and rendered via flutter_map or a custom CustomPainter.
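A CustomPainter sketch for the raster case. The int8 data arrives via rosbridge as a JSON list; the cells-per-row width comes from the map's info (nav_msgs/MapMetaData). The painter name is an assumption:

```dart
import 'package:flutter/material.dart';

/// Sketch: paint an OccupancyGrid cell by cell. Fine for small maps;
/// large maps should be rasterized once into a ui.Image instead.
class OccupancyGridPainter extends CustomPainter {
  OccupancyGridPainter(this.data, this.width);
  final List<int> data; // int8 values: 0 free, 100 occupied, -1 unknown
  final int width;      // cells per row, from nav_msgs/MapMetaData

  @override
  void paint(Canvas canvas, Size size) {
    final height = data.length ~/ width;
    final cell = size.width / width;
    final paintCell = Paint();
    for (var i = 0; i < data.length; i++) {
      final v = data[i];
      paintCell.color = v == -1
          ? Colors.grey
          : Color.lerp(Colors.white, Colors.black, v / 100)!;
      final x = (i % width) * cell;
      // Occupancy grids are row-major with the origin at the bottom-left,
      // so flip the y axis for screen coordinates
      final y = (height - 1 - i ~/ width) * cell;
      canvas.drawRect(Rect.fromLTWH(x, y, cell, cell), paintCell);
    }
  }

  @override
  bool shouldRepaint(OccupancyGridPainter old) =>
      old.data != data || old.width != width;
}
```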
Proprietary Robot SDKs
Boston Dynamics Spot has the Spot SDK (Python + gRPC; a mobile client goes through a REST wrapper). Universal Robots offer UR RTDE (Real-Time Data Exchange) for telemetry and URScript over a socket for commands. Doosan and KUKA ship their own SDKs with REST APIs or Modbus TCP.
Developing a mobile app for ROS 2 robot control with teleoperation, video streaming, and navigation takes 12–18 weeks. Integration with a specific robot's proprietary SDK is quoted separately. Cost is estimated individually after a system architecture review.