Implementing a Collaborative AR Experience
A collaborative session in ARKit lets multiple devices on the same local network exchange AR data: feature points, anchors, and transforms. Unlike Cloud Anchors, there is no external server; the devices build a common map together, in real time.
It is enabled with one line in the configuration, yet turns out to be significantly more complex than it looks.
How a Collaborative Session Works
let config = ARWorldTrackingConfiguration()
config.isCollaborationEnabled = true
arView.session.run(config)
After this, ARKit starts generating ARSession.CollaborationData packets with environment information from the device. You must send them to the other participants over a transport of your choice (MultipeerConnectivity, WebSocket, GameKit):
func session(_ session: ARSession, didOutputCollaborationData data: ARSession.CollaborationData) {
    guard let encoded = try? NSKeyedArchiver.archivedData(
        withRootObject: data, requiringSecureCoding: true
    ) else { return }
    // Send `encoded` to all peers over your transport.
}

// On the receiving side:
if let collaborationData = try? NSKeyedUnarchiver.unarchivedObject(
    ofClass: ARSession.CollaborationData.self, from: received
) {
    arView.session.update(with: collaborationData)
}
The devices then start building a common map, each contributing its own feature points. After 10–30 seconds the maps merge, and all participants share the same coordinate system.
What Goes Wrong
MultipeerConnectivity is an unreliable transport. MCSession periodically drops peers without clear errors: a device stays in the connectedPeers list, but its data stops arriving. The fix: a heartbeat every 2 seconds plus an automatic reconnect after three missed pings.
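The missed-ping tracking could be sketched like this. `HeartbeatMonitor` is a hypothetical helper, not an ARKit or MultipeerConnectivity API: the transport layer calls `recordPing(from:)` whenever a heartbeat arrives and `tick()` on its own 2-second timer, and peers that miss three consecutive ticks are reported as lost so the caller can tear down and reconnect.

```swift
import Foundation

// Hypothetical helper for the heartbeat scheme described above.
struct HeartbeatMonitor {
    // Peer ID -> consecutive missed heartbeats.
    private(set) var missedPings: [String: Int] = [:]
    let maxMissed = 3

    // Call when any data (or an explicit ping) arrives from a peer.
    mutating func recordPing(from peer: String) {
        missedPings[peer] = 0
    }

    // Call once per heartbeat interval; returns peers that should be
    // dropped and reconnected from scratch.
    mutating func tick() -> [String] {
        var lost: [String] = []
        for (peer, misses) in missedPings {
            let next = misses + 1
            missedPings[peer] = next
            if next >= maxMissed { lost.append(peer) }
        }
        for peer in lost { missedPings.removeValue(forKey: peer) }
        return lost
    }
}
```

Keeping the counter per peer means one flaky device does not trigger a reconnect for the whole group.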
CollaborationData packet size. With active camera movement, each device produces up to 50–100 KB/s. With 5 participants that is 250–500 KB/s of outgoing traffic. MCSession with reliable delivery (.reliable) builds up a queue and lags. Switch to .unreliable for CollaborationData: losing individual packets is non-critical, and ARKit recovers. Send critical data (object placement) with .reliable on a separate channel.
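The two-channel split can be modeled as a small policy type. `PeerMessage` and its delivery rule are illustrative, not part of any framework: collaboration data tolerates loss because ARKit keeps emitting fresh packets, while object-placement events must arrive exactly as sent. On Apple platforms the two cases map to MCSession's .unreliable and .reliable send modes.

```swift
import Foundation

// Illustrative message type for routing traffic to the right channel.
enum PeerMessage {
    case collaborationData(Data)   // high-volume ARKit map data
    case objectPlacement(Data)     // user actions that must not be dropped

    // True when the message must go over the reliable channel
    // (MCSession's .reliable mode); false maps to .unreliable.
    var needsReliableDelivery: Bool {
        switch self {
        case .collaborationData: return false  // loss is fine, avoids queue buildup
        case .objectPlacement:   return true   // loss causes participant desync
        }
    }
}
```

Centralizing the decision in one place keeps the transport code from hard-coding delivery modes at every call site.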
The map-merge moment. Before the merge, each device works in its own local coordinate system. Anchors added before the merge can jump during synchronization. Watch for ARParticipantAnchor: when another participant appears in the scene, the maps have merged:
func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    let participants = anchors.compactMap { $0 as? ARParticipantAnchor }
    if !participants.isEmpty {
        // Maps have merged: objects can now be placed
        enableObjectPlacement()
    }
}
Placing objects before an ARParticipantAnchor appears leads to desync between participants.
iOS-only limitation. Collaborative sessions work only between Apple devices. For Android, the only option is ARCore Cloud Anchors.
Practical Case
A corporate AR app for collaborative office planning: 4 participants with iPad Pros in one room, running a collaborative session over MultipeerConnectivity. Each participant could drag virtual furniture, and the others saw the changes with roughly 100 ms of delay.
The main problem: when a participant left the room, MCSession dropped them, but ARKit kept emitting data into the void and the outgoing buffer kept growing. On their return, an avalanche of accumulated packets froze the UI. The fix: on peer loss, immediately clear the buffer and reconnect from scratch.
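The fix could be sketched as a per-peer outgoing buffer that is both capped and dropped wholesale the moment a peer disconnects, so a returning peer starts from a clean state instead of receiving stale packets. All names here (`PeerSendBuffer`, the cap of 32) are assumptions for illustration.

```swift
import Foundation

// Hypothetical per-peer outgoing queue with a hard cap and a
// clear-on-disconnect policy, as described in the case above.
final class PeerSendBuffer {
    private var queues: [String: [Data]] = [:]
    // Assumption: cap chosen to bound memory; tune per app.
    let maxQueued = 32

    func enqueue(_ packet: Data, for peer: String) {
        var queue = queues[peer, default: []]
        queue.append(packet)
        // Drop the oldest packets first; stale map data is useless anyway.
        if queue.count > maxQueued { queue.removeFirst(queue.count - maxQueued) }
        queues[peer] = queue
    }

    func pendingCount(for peer: String) -> Int {
        queues[peer]?.count ?? 0
    }

    // On peer loss: discard everything queued for that peer before reconnecting.
    func peerDisconnected(_ peer: String) {
        queues.removeValue(forKey: peer)
    }
}
```

Even with the clear-on-disconnect fix, the cap matters: it bounds memory during the seconds between the last real delivery and the moment the heartbeat declares the peer lost.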
What's Included
- Setting up ARWorldTrackingConfiguration with isCollaborationEnabled
- Implementing the transport layer via MultipeerConnectivity or GameKit
- Handling the session lifecycle: discovering participants, map merge, peer exit
- Syncing user data on top of the shared AR space
- Testing on real devices under different network conditions
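For the user-data sync layer, one workable shape is a small Codable envelope for app-level events (e.g. furniture placement) sent over the reliable channel, separate from raw ARKit collaboration data. The type and field names below are assumptions for illustration, not an established protocol.

```swift
import Foundation

// Hypothetical envelope for user events synced on top of the shared AR space.
struct SyncMessage: Codable, Equatable {
    enum Kind: String, Codable { case place, move, remove }
    let kind: Kind
    let objectID: UUID      // stable ID agreed on by all participants
    let transform: [Float]  // 4x4 matrix flattened to 16 floats
}

// JSON is compact enough for low-rate user events; the high-volume
// collaboration data stays in its own binary channel.
func encode(_ message: SyncMessage) throws -> Data {
    try JSONEncoder().encode(message)
}

func decode(_ data: Data) throws -> SyncMessage {
    try JSONDecoder().decode(SyncMessage.self, from: data)
}
```

Keeping a stable objectID per virtual object lets late-arriving `move` events update the right piece of furniture on every device.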
Timeline
A basic collaborative session with MultipeerConnectivity: 2–3 weeks. With full edge-case handling, state sync, and UI: 4–7 weeks. Cost is estimated individually.