AR environment occlusion effects implementation



Implementing AR Environment Occlusion Effects

Environment occlusion is when real objects correctly hide virtual ones. Place an AR sofa against a wall, then walk between the camera and the sofa — the sofa should partially disappear behind your silhouette. Without occlusion the AR object is always drawn on top, and the illusion of presence breaks immediately. Technically it is one of the most demanding AR effects, because it requires real-world depth information in real time.

Two Fundamentally Different Paths to Occlusion

LiDAR path (iPhone 12 Pro and later Pro models, iPad Pro): ARWorldTrackingConfiguration.sceneReconstruction = .mesh builds a dense mesh of the environment. The mesh writes values into the depth buffer, so an AR object behind it is occluded. Accuracy is 1–2 cm at distances up to 5 meters, in real time with no delay.

ML path (any ARKit/ARCore device): a neural network predicts a depth map from the RGB image (monocular depth estimation) — ARKit's sceneDepth (ARDepthData) or ARCore's Depth API (DepthPoint). Accuracy is worse (5–15 cm), with artifacts at object edges and a slight 1–3 frame delay.

For production: if your audience uses iPhone Pro / iPad Pro, take the LiDAR path. For a mass-market app, use ML depth, with LiDAR acceleration where available.
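That device-path decision can be sketched as a launch-time capability check. The `OcclusionPath` enum and function name below are illustrative; the ARKit capability queries themselves are real API:

```swift
import ARKit

// Illustrative enum — not an ARKit type.
enum OcclusionPath { case lidarMesh, personSegmentation, none }

func selectOcclusionPath() -> OcclusionPath {
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        return .lidarMesh          // LiDAR devices: full environment occlusion
    }
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
        return .personSegmentation // A12+ without LiDAR: people-only occlusion
    }
    return .none                   // no depth source: AR content renders on top
}
```

Run the check once at session start and pick the render path accordingly; the capability queries are cheap class methods.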

RealityKit: Occlusion in One Line

On devices with LiDAR and iOS 15+:

arView.environment.sceneUnderstanding.options = [
    .occlusion,        // real objects occlude AR
    .collision,        // AR objects collide with real geometry
    .physics,          // physics relative to real surfaces
    .receivesLighting  // AR objects lit like real
]

This is literally all that's needed for basic occlusion in RealityKit. Under the hood, ARKit builds the environment mesh and RealityKit uses it as an occluder.
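A fuller sketch of the session setup behind that one-liner, assuming a LiDAR device and iOS 15+ (the function name is illustrative):

```swift
import ARKit
import RealityKit

func startOcclusionSession(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    // Build the dense environment mesh that RealityKit will use as an occluder.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    arView.environment.sceneUnderstanding.options.formUnion([.occlusion, .physics])
    arView.session.run(config)
}
```

Without the `sceneReconstruction = .mesh` line there is no mesh to occlude against, so the `.occlusion` option alone does nothing on LiDAR devices.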

Without LiDAR, the same API falls back to ARKit person segmentation (A12 and newer): only the human silhouette occludes AR objects, while tables, chairs and walls don't occlude anything. Limited, but better than nothing.

Person Occlusion: A Separate Task

ARWorldTrackingConfiguration.frameSemantics = .personSegmentation makes ARKit segment the person in the frame and create a stencil mask, so AR objects "behind" the person are occluded by their silhouette. Works on A12 and newer without LiDAR.

Mask quality: there is a small 2–5 pixel halo around the body edge — noticeable against dark backgrounds, acceptable against natural ones.

For custom rendering: ARFrame.segmentationBuffer is a CVPixelBuffer of per-pixel classes (ARFrame.SegmentationClass.person), used in Metal shaders for custom compositing.
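Enabling person occlusion is a small configuration change (a sketch; `.personSegmentationWithDepth` additionally orders the person against AR objects by estimated depth, while plain `.personSegmentation` always composites the person in front):

```swift
import ARKit

let config = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Depth-aware variant: a person can stand in front of or behind AR objects.
    config.frameSemantics.insert(.personSegmentationWithDepth)
}
// session.run(config) — then, for manual compositing, read
// frame.segmentationBuffer (and frame.estimatedDepthData) each frame.
```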

ARCore Depth API (Android)

On devices with a ToF sensor (Samsung Galaxy S21 Ultra, LG V60) the depth is hardware-measured; on the rest it is ML depth from the ARCore Depth API:

val frame = session.update()
if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
    // May throw NotYetAvailableException during the first frames of the session.
    val depthImage = frame.acquireDepthImage16Bits()
    try {
        // Depth in millimeters, one 16-bit unsigned short per pixel.
        // ARCore recommends uploading it as an OpenGL texture for shading.
    } finally {
        depthImage.close() // depth images must be released back to ARCore
    }
}

ARCore ships an occlusion shader in its samples — an OpenGL fragment shader that discards AR-object fragments lying deeper than the real geometry. It's the foundation for occlusion in Sceneform or a custom renderer.

Custom Occlusion Shader (Metal / OpenGL ES)

For non-standard requirements (a blurred edge at the occlusion boundary, artistic stylization) — a custom Metal shader:

constexpr sampler s(address::clamp_to_edge, filter::linear);

fragment float4 occlusionFragment(
    VertexOut in [[stage_in]],
    texture2d<float> arDepthMap [[texture(0)]],
    texture2d<float> vrObject  [[texture(1)]]
) {
    float2 uv = in.texCoord;
    float realDepth = arDepthMap.sample(s, uv).r; // depth of the real scene
    float vrDepth = in.depth;                     // depth of the AR object

    if (vrDepth > realDepth + 0.01) {
        discard_fragment(); // AR object is behind real geometry
    }

    return vrObject.sample(s, uv);
}

The 0.01 offset (1 cm) prevents z-fighting on surfaces the AR object is "pressed" against.
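The shader's depth test with bias can be expressed as a plain function, which is handy for unit-testing the threshold logic on the CPU (the function name is illustrative, not part of any framework):

```swift
// CPU reference for the shader's occlusion test.
// Depths are in meters from the camera; `bias` is the anti-z-fighting offset.
func isOccluded(virtualDepth: Float, realDepth: Float, bias: Float = 0.01) -> Bool {
    return virtualDepth > realDepth + bias
}
```

A fragment pressed exactly against a real surface (equal depths) stays visible, which is the whole point of the bias.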

Soft Occlusion: Blurred Edges

At the edges of the LiDAR mesh the occlusion transition sometimes looks "pixelated" — depth resolution is finite. Soft occlusion smooths the transition with a bilateral blur of the depth mask. In Metal: a custom kernel with a depth-aware blur (blurring only within a depth threshold).
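The weighting such a kernel uses can be sketched on the CPU: a Gaussian falloff over pixel distance, gated to zero when the neighbour's depth differs by more than a threshold, so the blur never bleeds across an occlusion edge. The names and the 10 cm default are illustrative choices, not framework API:

```swift
import Foundation

// Depth-aware blur weight: Gaussian in screen space, zero across depth edges.
func depthAwareWeight(pixelDistance: Float,
                      sigma: Float,
                      centerDepth: Float,
                      neighborDepth: Float,
                      depthThreshold: Float = 0.1) -> Float {
    // Gate: neighbours on the other side of an occlusion edge contribute nothing.
    guard abs(centerDepth - neighborDepth) <= depthThreshold else { return 0 }
    // Standard Gaussian spatial falloff.
    return exp(-(pixelDistance * pixelDistance) / (2 * sigma * sigma))
}
```

In the real Metal kernel the same two factors are computed per tap and the weighted sum is normalized by the total weight.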

It's noticeable only on close inspection, but for a premium AR product it's worth implementing.

Shadows from Virtual Objects on Real Surfaces

A bonus on top of occlusion: the AR sofa casts a shadow on the real floor. In RealityKit with sceneUnderstanding.receivesLighting, shadows are automatic. In a custom renderer: render a shadow map from the AR object and project it onto the real-world geometry from the LiDAR mesh.

Without LiDAR, shadows land only on ARKit-detected planes (floor, table): the user sees a "flat" shadow that doesn't follow the real surface relief.
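A sketch of that no-LiDAR case in RealityKit: anchor the model to a detected horizontal plane and rely on ARView's grounding shadows, which are on by default (the box is a placeholder for a real model):

```swift
import RealityKit

// Anchor content to a detected horizontal plane; RealityKit renders grounding
// shadows from virtual objects onto the plane by default (they can be turned
// off with ARView.RenderOptions.disableGroundingShadows).
let anchor = AnchorEntity(plane: .horizontal)
let sofa = ModelEntity(mesh: .generateBox(size: 0.5)) // placeholder model
anchor.addChild(sofa)
// arView.scene.addAnchor(anchor)
```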

Timeline

LiDAR occlusion via the RealityKit API: 2–3 days of integration. Person-segmentation occlusion: 3–5 days. ARCore Depth occlusion on Android: 1–2 weeks (custom shader). Soft occlusion with custom Metal/GLSL shading: an additional 1–2 weeks. Cost is calculated individually.