Implementing an AR Portal (Immersion in a Virtual Scene)
An AR portal is a frame in the real world through which another environment is visible: a virtual forest, a historical interior, a different planet. Inside the portal is a 360° environment or a full 3D scene; outside is the real world. The user can look in, walk around the frame, and step inside.
Technically this is a problem of stencil management and culling masks. It is implemented via Metal or SceneKit; RealityKit 1.x doesn't support the technique directly.
How It Works: Stencil Buffer
The portal frame is geometry that writes to the stencil buffer but not to the color buffer. The virtual scene content renders only where the stencil equals 1 (inside the frame), while the real world renders everywhere the stencil equals 0.
In SceneKit this is implemented via SCNMaterial with a custom Metal shader:
// Material for the portal frame: writes to the stencil buffer only,
// no color or depth output
portalFrameMaterial.writesToDepthBuffer = false
portalFrameMaterial.colorBufferWriteMask = []
// In the Metal pass: stencilReference = 1, stencilWriteMask = 0xFF

// Material for the portal content: renders only where stencil == 1
contentMaterial.readsFromDepthBuffer = true
// In the Metal pass: stencilCompareFunction = .equal, stencilReference = 1
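The Metal-side comments above can be sketched as two depth/stencil states. This is an illustrative sketch, not the project's exact code: pipeline setup and the `encoder.setStencilReferenceValue(1)` call on the render command encoder are omitted.

```swift
import Metal

// Sketch: the two MTLDepthStencilState configurations for a stencil portal.
func makePortalStencilStates(device: MTLDevice)
    -> (frame: MTLDepthStencilState?, content: MTLDepthStencilState?) {
    // Frame pass: always pass the test and write 1 into the stencil buffer.
    let frameStencil = MTLStencilDescriptor()
    frameStencil.stencilCompareFunction = .always
    frameStencil.depthStencilPassOperation = .replace
    frameStencil.writeMask = 0xFF

    let frameDesc = MTLDepthStencilDescriptor()
    frameDesc.frontFaceStencil = frameStencil
    frameDesc.backFaceStencil = frameStencil

    // Content pass: draw only where stencil == 1; don't modify the stencil.
    let contentStencil = MTLStencilDescriptor()
    contentStencil.stencilCompareFunction = .equal
    contentStencil.readMask = 0xFF
    contentStencil.writeMask = 0x00

    let contentDesc = MTLDepthStencilDescriptor()
    contentDesc.frontFaceStencil = contentStencil
    contentDesc.backFaceStencil = contentStencil

    return (device.makeDepthStencilState(descriptor: frameDesc),
            device.makeDepthStencilState(descriptor: contentDesc))
}
```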
RealityKit 2 added custom Metal render passes, but a full stencil portal is still easier to build with SceneKit or pure Metal, using ARSCNView or ARView with custom render callbacks.
What Complicates Implementation
Occlusion from inside. When the user steps through the portal, the rendering logic must invert: the outside world is hidden and the virtual environment fills the screen. Detect the crossing by checking whether ARCamera.transform lies inside the portal volume and switch the rendering mode; a simple AABB check works for most scenarios.
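A minimal sketch of that AABB check, assuming the portal volume is axis-aligned in world space. `PortalVolume` and `isInside` are hypothetical names; in a real app the transform comes from `session.currentFrame.camera.transform`.

```swift
import simd

// Hypothetical axis-aligned bounding box around the portal volume,
// in world coordinates (meters).
struct PortalVolume {
    var min: SIMD3<Float>
    var max: SIMD3<Float>

    func contains(_ p: SIMD3<Float>) -> Bool {
        p.x >= min.x && p.x <= max.x &&
        p.y >= min.y && p.y <= max.y &&
        p.z >= min.z && p.z <= max.z
    }
}

// Extract the camera's world position (translation column) and test it.
func isInside(cameraTransform: simd_float4x4, volume: PortalVolume) -> Bool {
    let pos = SIMD3<Float>(cameraTransform.columns.3.x,
                           cameraTransform.columns.3.y,
                           cameraTransform.columns.3.z)
    return volume.contains(pos)
}
```

Hysteresis (slightly different enter and exit bounds) helps avoid flickering when the camera hovers right at the boundary.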
Portal ceiling and floor. The frame is a single plane, but the virtual scene must be bounded on the sides, top, and bottom, or content "leaks" past the frame when viewed at an angle. Solution: surround the virtual volume with additional invisible "walls" that have color writes disabled; they seal the stencil mask outside the frame.
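One way to build such a wall in SceneKit (a sketch under assumptions, not the project's code: sizes, rendering order, and node placement are illustrative):

```swift
import SceneKit

// Sketch: an invisible "wall" that seals the mask around the virtual
// volume. It writes depth but no color, so portal content behind it is
// hidden while the real camera feed shows through.
func makeMaskWall(width: CGFloat, height: CGFloat) -> SCNNode {
    let plane = SCNPlane(width: width, height: height)
    let material = SCNMaterial()
    material.colorBufferWriteMask = []   // no color output
    material.writesToDepthBuffer = true  // still occludes content behind it
    material.isDoubleSided = true
    plane.materials = [material]

    let node = SCNNode(geometry: plane)
    node.renderingOrder = -1             // draw before the portal content
    return node
}
```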
Lighting at the boundary. The real scene is lit by ARKit's environment map, while the virtual scene uses its own skybox or IBL, so the portal boundary shows a hard transition. Soften it with scene fog (SCNScene.fogStartDistance) inside the virtual volume and alpha blending on the frame edges.
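A possible fog setup for the virtual scene (the distances and color here are illustrative, not values from the project):

```swift
import SceneKit
import UIKit

// Sketch: fog inside the virtual scene softens the lighting seam
// at the portal boundary.
let virtualScene = SCNScene()
virtualScene.fogStartDistance = 4.0    // meters; full clarity up to here
virtualScene.fogEndDistance = 12.0     // fully fogged beyond this
virtualScene.fogDensityExponent = 2.0  // quadratic falloff
virtualScene.fogColor = UIColor(white: 0.92, alpha: 1.0)
```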
Performance. Two scenes render simultaneously: the real world through the AR camera and the virtual scene inside the portal. On older devices (iPhone X, iPhone 8) FPS drops below 30 from the doubled draw calls. Optimizations: simplify the virtual scene geometry (LOD), use baked lighting instead of realtime, and limit the draw distance inside the portal.
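The LOD step can be sketched with SceneKit's built-in mechanism. `midPoly` and `lowPoly` are hypothetical pre-decimated geometries; the screen-space thresholds are illustrative.

```swift
import SceneKit

// Sketch: attach lower-poly variants to content nodes inside the portal
// so distant objects render cheaply.
func applyLOD(to node: SCNNode, mid midPoly: SCNGeometry, low lowPoly: SCNGeometry) {
    node.geometry?.levelsOfDetail = [
        // Switch to midPoly when the node covers less than ~200 px on screen,
        SCNLevelOfDetail(geometry: midPoly, screenSpaceRadius: 200),
        // and to lowPoly below ~80 px.
        SCNLevelOfDetail(geometry: lowPoly, screenSpaceRadius: 80)
    ]
}
```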
Case Study
A museum app: AR portals in a hall with a reconstruction of Ancient Rome. The frame is a marble arch, 2×3 meters in world space. Inside: photogrammetry scans of real artifacts with IBL lighting. Three AR markers on the hall floor set the portal positions via ARImageAnchor. When the user enters, the audio environment switches from the museum to a Rome atmosphere via AVAudioEnvironmentNode with positional sound.
The main problem found in testing: "leakage" of the virtual scene through real walls. When a user looked at the portal through a glass partition, the stencil didn't account for real geometry. Solution: depth occlusion via ARMatteGenerator, masking real opaque surfaces.
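The positional-audio switch can be sketched as follows. This is a hedged outline, not the app's code: positions are illustrative, and loading/scheduling the ambience buffer on the player is omitted.

```swift
import AVFoundation

// Sketch: a spatialized "Rome" ambience routed through an environment node.
let engine = AVAudioEngine()
let environment = AVAudioEnvironmentNode()
let romePlayer = AVAudioPlayerNode()

engine.attach(environment)
engine.attach(romePlayer)
// Spatialization requires a mono source feeding the environment node.
let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
engine.connect(romePlayer, to: environment, format: mono)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// Ambience source sits at the portal center (world coordinates, meters).
romePlayer.position = AVAudio3DPoint(x: 0, y: 1.5, z: -2)
romePlayer.renderingAlgorithm = .HRTF

// Each frame: move the listener to the AR camera pose.
environment.listenerPosition = AVAudio3DPoint(x: 0, y: 1.6, z: 0)
```

On portal entry the museum ambience fades out and `romePlayer` starts; on exit, the reverse.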
What's Included
- Metal shader for the stencil-based portal
- Enter/exit detection via an AABB check
- Virtual environment: skybox, IBL, LOD geometry optimization
- Depth occlusion so real objects correctly hide the portal
- Positional sound on portal crossing
- Testing on devices from iPhone SE to iPhone 15 Pro
Timeline
| Complexity | Timeline |
|---|---|
| Basic portal with simple 360° skybox | 2–3 weeks |
| Portal with 3D scene, occlusion, sound | 4–6 weeks |
| Multiple portals + enter detection + environment switching | 7–10 weeks |
Cost is calculated individually.