Implementing Persistent AR (Saving AR Scenes Between Sessions)
A user places furniture in AR, closes the app, opens it again — and everything is still in place. That's Persistent AR. ARKit implements it with ARWorldMap: a snapshot of the world-space map that can be serialized, saved, and loaded in the next session.
It sounds simple. In practice there are several non-trivial issues: map quality, relocalization recovery, and edge-case handling.
How ARWorldMap Works
ARWorldMap contains feature points — visual landmarks ARKit uses for localization. The more numerous and diverse the points, the more precise the scene recovery. The map is saved via:
```swift
arView.session.getCurrentWorldMap { worldMap, error in
    guard let worldMap = worldMap else { return }
    let data = try? NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                 requiringSecureCoding: true)
    // Save `data` to disk or to the cloud.
}
```
Loading in a new session:
```swift
let worldMap = try? NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                       from: data)
let config = ARWorldTrackingConfiguration()
config.initialWorldMap = worldMap
session.run(config, options: [.resetTracking, .removeExistingAnchors])
```
After the session runs with initialWorldMap, ARKit attempts to relocalize — to match the current camera view against the saved map. Track progress via session(_:cameraDidChangeTrackingState:): the state goes from .limited(.relocalizing) to .normal once relocalization succeeds.
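A minimal sketch of that delegate callback; the `onRelocalized` completion hook is an illustrative assumption, not an ARKit API:

```swift
import ARKit

// Sketch: observe relocalization progress through the session delegate.
// `onRelocalized` is a hypothetical hook supplied by the host app.
final class RelocalizationObserver: NSObject, ARSessionDelegate {
    var onRelocalized: (() -> Void)?

    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .limited(.relocalizing):
            // Still matching the camera view against the saved map;
            // a good moment to show "point the camera where you saved the scene".
            break
        case .normal:
            onRelocalized?()
        default:
            break
        }
    }
}
```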
Real Problems Encountered
Poor map quality. Each ARFrame reports worldMappingStatus: .notAvailable, .limited, .extending, or .mapped. Save a map while the status is still .limited and you get poor relocalization. Disable the "Save" button until the status reaches at least .extending, and show the user a hint: "Slowly walk around the room".
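The gating logic can be sketched in the frame-update delegate callback (the `saveButton` outlet here is illustrative):

```swift
import ARKit
import UIKit

// Sketch: enable saving only once world mapping is good enough.
// `saveButton` is an illustrative UIButton, not from the original text.
final class SaveGatekeeper: NSObject, ARSessionDelegate {
    let saveButton: UIButton

    init(saveButton: UIButton) {
        self.saveButton = saveButton
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        switch frame.worldMappingStatus {
        case .extending, .mapped:
            saveButton.isEnabled = true
        default:
            // .notAvailable or .limited — saving now would produce a weak map.
            saveButton.isEnabled = false
        }
    }
}
```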
Relocalization never happens. You return to the same room, but the lighting has changed (day vs. night, curtains drawn). ARKit matches visual feature points, and under radically different lighting the landmarks no longer match. A partial workaround: save maps under several lighting conditions and load the one closest to the current conditions. There is no full solution at the ARKit level — it's a fundamental limitation of visual-inertial odometry.
Anchors drift after recovery. Anchors are saved inside ARWorldMap, and on recovery each anchor's position is only as precise as the map itself. Objects where precision is critical (e.g., floor markings) can shift by 2–5 cm. For such cases, create the ARAnchor with a name and, after relocalization, snap it to the nearest surface via a raycast.
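One way to sketch that snap-to-surface pass, assuming a RealityKit `ARView` and anchors named `"floorMarking"` (both names are illustrative):

```swift
import ARKit
import RealityKit

// Sketch: after relocalization, re-pin named anchors to the nearest
// horizontal surface found by a raycast through their screen position.
func snapFloorAnchors(in arView: ARView) {
    guard let anchors = arView.session.currentFrame?.anchors else { return }
    for anchor in anchors where anchor.name == "floorMarking" {
        let worldPos = SIMD3<Float>(anchor.transform.columns.3.x,
                                    anchor.transform.columns.3.y,
                                    anchor.transform.columns.3.z)
        // Project the drifted anchor into screen space, then raycast
        // from that point onto a detected horizontal plane.
        guard let screenPoint = arView.project(worldPos),
              let hit = arView.raycast(from: screenPoint,
                                       allowing: .estimatedPlane,
                                       alignment: .horizontal).first
        else { continue }
        // Replace the drifted anchor with one on the detected surface.
        arView.session.remove(anchor: anchor)
        arView.session.add(anchor: ARAnchor(name: "floorMarking",
                                            transform: hit.worldTransform))
    }
}
```

Note this only works while the anchor is in the camera's view; anchors behind the user have to wait until they are visible again.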
Map size. An ARWorldMap of a large room runs 5–20 MB. Storing it on-device is fine; syncing it to CloudKit uncompressed is expensive. Compress it with NSData.compressed(using: .lzfse) (iOS 13+) or the Compression framework's LZMA to get it down to roughly 1–4 MB.
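A sketch of the compress-before-upload step (Apple platforms only, since `NSData.compressed(using:)` is a Foundation-on-Darwin API):

```swift
import Foundation

// Sketch: shrink serialized ARWorldMap data before syncing to the cloud.
// LZFSE is Apple's speed/ratio-balanced codec; use .lzma for smaller blobs
// at higher CPU cost.
func compressMapData(_ mapData: Data) throws -> Data {
    try (mapData as NSData).compressed(using: .lzfse) as Data
}

func decompressMapData(_ blob: Data) throws -> Data {
    try (blob as NSData).decompressed(using: .lzfse) as Data
}
```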
Storing Custom Data with Map
Anchors themselves are saved in ARWorldMap.anchors. But app-level data (what sits on each anchor — a sofa, a lamp, a price table) should be stored separately and linked by ARAnchor.identifier:
```swift
// On saving
let metadata: [String: Any] = [
    anchor.identifier.uuidString: ["type": "sofa", "modelName": "ikea_kallax"]
]
// On recovery, match saved entries back to anchors by UUID.
```
This is standard practice, yet it's often skipped in favor of stuffing data into ARAnchor.name — a single untyped string capped at 256 characters.
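A typed version of that side-table can be sketched with Codable; `PlacedItem` and the helper functions are illustrative names:

```swift
import Foundation

// Sketch: store per-anchor app data outside the world map, keyed by
// ARAnchor.identifier (a UUID). `PlacedItem` is an illustrative type.
struct PlacedItem: Codable, Equatable {
    let type: String
    let modelName: String
}

func encodeMetadata(_ items: [UUID: PlacedItem]) throws -> Data {
    // JSON object keys must be strings, so key by uuidString.
    let keyed = Dictionary(uniqueKeysWithValues:
        items.map { ($0.key.uuidString, $0.value) })
    return try JSONEncoder().encode(keyed)
}

func decodeMetadata(_ data: Data) throws -> [UUID: PlacedItem] {
    let keyed = try JSONDecoder().decode([String: PlacedItem].self, from: data)
    // Silently drop entries whose key is not a valid UUID.
    return Dictionary(uniqueKeysWithValues:
        keyed.compactMap { key, value in UUID(uuidString: key).map { ($0, value) } })
}
```

On recovery, iterate ARWorldMap.anchors and look each anchor's identifier up in the decoded dictionary to know which model to spawn.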
Case
An interior-design app with 3,000 active users saves furniture-placement plans between sessions. The main pain point: users saved the map immediately after startup, while the mapping status was still .limited, and then complained that objects "float". We added a map-quality indicator to the UI (green/yellow/red) that blocks saving until it turns green. Complaints dropped sharply.
Timeline
| Functionality | Timeline |
|---|---|
| Basic scene save/restore | 1–2 weeks |
| Cloud sync + multi-device | 3–4 weeks |
| Smart relocalization + quality management | 2–3 weeks |
Cost is calculated individually after a requirements analysis.