Integrating ARCore into an Android Application
ARCore runs on 400+ Android device models, and that's the main complexity. On a Pixel 7 with the Tensor chip, tracking stays stable. On a budget Redmi with the same ARCore version, planes aren't detected because of the weak camera and vibration from cheap OIS. Testing on one device and assuming it works everywhere is a mistake.
Compatibility and Degradation
ARCore is supported only on devices from Google's official supported-devices list. Before starting a session, this check is mandatory:
val availability = ArCoreApk.getInstance().checkAvailability(context)
when (availability) {
    ArCoreApk.Availability.SUPPORTED_INSTALLED -> startArSession()
    ArCoreApk.Availability.SUPPORTED_NOT_INSTALLED -> promptInstall()
    ArCoreApk.Availability.UNSUPPORTED_DEVICE_NOT_CAPABLE -> showFallback()
    else -> { /* SUPPORTED_APK_TOO_OLD, UNKNOWN_ERROR; UNKNOWN_CHECKING is transient, re-check shortly */ }
}
UNSUPPORTED_DEVICE_NOT_CAPABLE means the device will never get support. Show the fallback; don't try to install ARCore. There's no point fighting this wall.
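The branching above reduces to a pure decision function, which is handy for unit tests that run off-device. The enums below are local, illustrative mirrors of the ArCoreApk.Availability values, not the real ARCore types; mapping SUPPORTED_APK_TOO_OLD to the install prompt assumes you call ArCoreApk.requestInstall(), which handles updates as well as fresh installs:

```kotlin
// Local mirror of the ArCoreApk.Availability values we branch on
// (illustrative only; the real enum lives in com.google.ar.core).
enum class ArAvailability {
    SUPPORTED_INSTALLED,
    SUPPORTED_NOT_INSTALLED,
    SUPPORTED_APK_TOO_OLD,
    UNSUPPORTED_DEVICE_NOT_CAPABLE,
    UNKNOWN_ERROR,
}

// What the app should do for each state.
enum class ArAction { START_SESSION, PROMPT_INSTALL, SHOW_FALLBACK }

fun resolveArAction(availability: ArAvailability): ArAction = when (availability) {
    ArAvailability.SUPPORTED_INSTALLED -> ArAction.START_SESSION
    // APK_TOO_OLD is recoverable: requestInstall() also handles updates.
    ArAvailability.SUPPORTED_NOT_INSTALLED,
    ArAvailability.SUPPORTED_APK_TOO_OLD -> ArAction.PROMPT_INSTALL
    // NOT_CAPABLE and unknown errors both end in the non-AR fallback.
    ArAvailability.UNSUPPORTED_DEVICE_NOT_CAPABLE,
    ArAvailability.UNKNOWN_ERROR -> ArAction.SHOW_FALLBACK
}
```

Keeping this mapping separate from the Android plumbing means the fallback logic is covered by plain JVM tests, with no emulator involved.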
AR Session Architecture
ARCore on Android is built around the Session object from com.google.ar.core. Rendering integrates either via GLSurfaceView (the old way) or via ArSceneView from Sceneform, which Google has deprecated. For new projects, take SceneView: an actively maintained Sceneform fork with a Kotlin coroutines API.
// build.gradle
implementation("io.github.sceneview:arsceneview:2.0.3")

val arSceneView = binding.arSceneView
arSceneView.onSessionCreated = { session ->
    session.configure(
        Config(session).apply {
            planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
            lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
            depthMode = Config.DepthMode.AUTOMATIC
        }
    )
}
ENVIRONMENTAL_HDR is the key flag for realistic lighting. ARCore captures an HDR environment map and applies it to PBR materials, so an object picks up shadows and reflections from the real scene.
Depth API
On devices with a hardware depth sensor (a ToF camera, as on the Samsung Galaxy S20 Ultra or Huawei P30 Pro), DepthMode.AUTOMATIC uses a real depth map. On the rest, ARCore estimates depth from camera motion with the single RGB camera. Real object occlusion:
arSceneView.onSessionUpdated = { session, frame ->
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        try {
            val depthImage = frame.acquireDepthImage16Bits()
            // Pass to shader for per-pixel occlusion
            depthImage.close()
        } catch (e: NotYetAvailableException) {
            // Depth isn't ready in the first frames of a session
        }
    }
}
Don't forget .close(): Image objects from ARCore hold native memory, and a leak leads to OutOfMemoryError within minutes of an active session. Kotlin's use { } block closes the image even if the shader hand-off throws.
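The DEPTH16 format behind acquireDepthImage16Bits packs each sample into 16 bits: in the Android ImageFormat.DEPTH16 layout, the low 13 bits carry range in millimeters and the top 3 bits are a confidence field. Decoding a sample is a pure function; a sketch, assuming that documented layout, to apply after pulling shorts out of the plane buffer:

```kotlin
// Decode one DEPTH16 sample to meters: low 13 bits are range in
// millimeters, top 3 bits are the confidence field (masked off here).
fun depth16ToMeters(sample: Short): Float {
    val raw = sample.toInt() and 0xFFFF  // reinterpret as unsigned 16-bit
    return (raw and 0x1FFF) / 1000f      // strip confidence bits, mm -> m
}
```

A value of 0 means ARCore has no depth estimate for that pixel, so treat zero-depth samples as "unknown" rather than "at the camera".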
Plane Detection and Hit Test
Standard object placement by tap:
arSceneView.onGestureListener = object : DefaultARSceneViewGestureListener(arSceneView) {
    override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
        val hitResults = arSceneView.frame?.hitTest(e.x, e.y) ?: return false
        val hitResult = hitResults.firstOrNull { hit ->
            (hit.trackable as? Plane)?.isPoseInPolygon(hit.hitPose) == true
        } ?: return false

        val anchor = hitResult.createAnchor()
        val node = ModelNode(modelFileLocation = "models/chair.glb")
        node.anchor = anchor
        arSceneView.addChild(node)
        return true
    }
}
The isPoseInPolygon check is important: hitTest can return a point that lies on the plane's infinite surface but outside the detected boundary polygon, and the object ends up hanging in mid-air.
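Conceptually, isPoseInPolygon is a 2D point-in-polygon test against the plane's boundary in plane-local coordinates. A minimal ray-casting version of the same idea, as a pure-Kotlin illustration (not ARCore's actual implementation):

```kotlin
// Ray-casting point-in-polygon test in 2D (plane-local x/z).
// `polygon` is a list of (x, z) vertices in order; an illustrative
// stand-in for what Plane.isPoseInPolygon checks internally.
fun isPointInPolygon(px: Float, pz: Float, polygon: List<Pair<Float, Float>>): Boolean {
    var inside = false
    var j = polygon.lastIndex
    for (i in polygon.indices) {
        val (xi, zi) = polygon[i]
        val (xj, zj) = polygon[j]
        // Count crossings of a horizontal ray from (px, pz) with each edge.
        val crosses = (zi > pz) != (zj > pz) &&
            px < (xj - xi) * (pz - zi) / (zj - zi) + xi
        if (crosses) inside = !inside
        j = i
    }
    return inside
}
```

An odd number of edge crossings means the point is inside; this is why a hit on the plane's infinite surface but outside the polygon gets filtered out.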
Augmented Images
AugmentedImageDatabase handles marker-based AR. The database is compiled ahead of time with the arcoreimg CLI, which evaluates image quality (a score of 75 or higher is needed for reliable tracking):
arcoreimg eval-img --input_image_path=marker.png
# Image Quality Score: 87 (acceptable range: 0-100, >= 75 recommended)
val imageDatabase = AugmentedImageDatabase(session)
val bitmap = BitmapFactory.decodeStream(assets.open("marker.png"))
imageDatabase.addImage("product-marker", bitmap, 0.10f) // physical width in meters: 10 cm
config.augmentedImageDatabase = imageDatabase
Monochrome or symmetric-pattern images score low: ARCore can't extract unique feature points from them. Logos with fine detail work better.
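In CI it's convenient to gate markers on that score automatically. A small helper that parses the arcoreimg output shown above; the regex assumes the exact "Image Quality Score: N" line format, so treat this as an illustrative sketch:

```kotlin
// Pull the integer score out of `arcoreimg eval-img` output
// (assumes the "Image Quality Score: N" line format shown above).
fun markerScore(arcoreimgOutput: String): Int? =
    Regex("""Image Quality Score:\s*(\d+)""")
        .find(arcoreimgOutput)
        ?.groupValues?.get(1)?.toInt()

// Gate marker acceptance at the recommended threshold of 75.
fun isMarkerUsable(arcoreimgOutput: String, threshold: Int = 75): Boolean =
    (markerScore(arcoreimgOutput) ?: 0) >= threshold
```

Failing the build on a low score catches bad markers before they ship, instead of surfacing as flaky tracking in the field.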
Testing on Real Devices
The ARCore emulator doesn't provide real tracking. You must test on:
- Flagship (Pixel, Galaxy S-series) — reference performance
- Mid-range (Samsung A-series, Xiaomi Redmi Note) — real audience
- Device without depth sensor — depth API degradation check
Firebase Test Lab offers physical ARCore-capable devices, so basic session-startup checks can be automated.
Timeline
Basic ARCore integration with plane detection and GLB model placement: 3–5 days. Augmented Images with marker database and custom content: 5–8 days. Full solution with depth occlusion, custom lighting, and 200+ device support: 3–5 weeks. Cost calculated individually after requirements analysis.