Photogrammetry object capture for AR

Complexity: Complex · Estimated turnaround: ~3–5 business days

Implementing Photogrammetry for Objects in AR (Object Capture)

Apple's Object Capture API appeared in macOS 12 and builds high-quality 3D models from a series of photos via photogrammetry. Shoot an object from 20–200 angles on an iPhone and you get a USDZ file with a 4K texture and geometry ready for AR. No manual modeling, no 3D editor.

But "just take photos" only works under ideal conditions. In practice, there are many limitations.

How Object Capture Works

PhotogrammetrySession from RealityKit analyzes photo overlap, builds a point cloud, recovers geometry (depth estimation plus multi-view stereo), and generates a mesh with texture.

import RealityKit

// Folder of HEIC/JPEG photos captured on the iPhone
let folderWithImages = URL(fileURLWithPath: "/path/to/photos", isDirectory: true)
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")

let session = try PhotogrammetrySession(input: folderWithImages)
try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])

// session.outputs is an AsyncSequence; iterate it from an async context
for try await output in session.outputs {
    switch output {
    case .processingComplete:
        print("Done")
    case .requestProgress(_, let fractionComplete):
        print("Progress: \(fractionComplete)")
    default:
        break
    }
}

The detail parameter sets the level of detail: .preview (fast, rough), .reduced, .medium, .full, .raw (maximum, Mac Pro only). For mobile AR, .medium is the typical choice: a balance between quality and file size.
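Because process(requests:) accepts an array, one session can produce several detail levels in a single pass, for example a fast .preview for checking coverage alongside the final .medium model. A minimal sketch, assuming the session created above; previewURL and finalURL are hypothetical output paths:

```swift
// Sketch only: one PhotogrammetrySession servicing two requests.
// previewURL / finalURL are hypothetical destinations, not part of the API.
try session.process(requests: [
    .modelFile(url: previewURL, detail: .preview),  // rough draft for a quick coverage check
    .modelFile(url: finalURL, detail: .medium)      // final model for mobile AR
])
```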

Important: PhotogrammetrySession runs only on a Mac, either Apple Silicon or an Intel Mac, with macOS 12+. It does not run on the iPhone: the phone only shoots; processing happens on the Mac.

What Makes Result Good or Bad

Lighting. Photogrammetry hates harsh shadows and glare. Ideal: diffuse light on a cloudy day, or a light box. In direct sun, harsh shadows "bake" into the texture as artifacts. On glossy surfaces (metal, glass, lacquered plastic) the algorithm cannot recover geometry from mirror-like reflections.

Coverage of angles. You need overlapping shots from different points: a horizontal ring around the object (every 15°), two more rings elevated at 30° and 60°, plus a top-down shot. Total: 100–200 photos for a medium-sized object. Fewer, and the mesh gets holes; more is redundant, and processing time grows.
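As a rough sanity check, the shot pattern above can be turned into a small planning helper. This is a hypothetical function, just arithmetic on the pattern described, not part of any Apple API:

```swift
// Hypothetical shot-plan helper: rings of photos at a fixed azimuth step,
// plus one top-down shot.
func plannedShots(azimuthStepDegrees: Double, rings: Int, topShot: Bool = true) -> Int {
    let perRing = Int((360.0 / azimuthStepDegrees).rounded())
    return perRing * rings + (topShot ? 1 : 0)
}

// Three rings (0°, 30°, 60°) every 15° plus a top shot:
print(plannedShots(azimuthStepDegrees: 15, rings: 3))  // 73
// A denser 10° step pushes the count into the 100+ range:
print(plannedShots(azimuthStepDegrees: 10, rings: 3))  // 109
```

Denser azimuth steps buy fewer mesh holes at the cost of longer processing, which matches the trade-off above.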

Object size. Object Capture works best for objects of 10–50 cm. Small items (coins, jewelry) need high-resolution macro photos. Large objects (furniture, a car) need a special shooting strategy with overlapping passes.

Object texture. On uniform surfaces (a white ball, a clean metal cylinder) photogrammetry finds no feature points, and the geometry reconstructs incorrectly. Solution: temporarily apply a matting powder (chalk spray) to add texture, then wash it off after shooting.

Pipeline from Shooting to AR

  1. Shoot on iPhone. Use apps like Reality Composer or specialized third-party tools (Polycam, Luma AI) for coverage control. Polycam shows a heat map in real time, so you can see where angles are missing.

  2. Transfer to Mac. AirDrop or iCloud Drive; the result is a folder of HEIC/JPEG photos.

  3. Process via Object Capture. Time: 20–60 minutes for 100 photos on an M1 Mac at .medium detail. A Mac Pro handles .raw detail in about the same time.

  4. Optimize USDZ. Output from Object Capture at .full or .raw can weigh 200–500 MB. For AR you need to:

    • Reduce the polygon count via Blender's Decimate modifier (to 50,000–100,000 polygons)
    • Compress textures via TextureConverter
    • Hit a target size for mobile AR of 5–20 MB
  5. Verify and publish. Check in Quick Look on iPhone, verify the scale, and upload to a CDN.
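The optimization step can be scripted. One way, assuming Blender is installed at the standard macOS path and a hypothetical decimate.py script (not shown here) applies the Decimate modifier and texture compression, is to drive Blender headless from Swift:

```swift
import Foundation

// Sketch only: drives Blender headless to optimize a captured model.
// The Blender path and decimate.py are assumptions for illustration.
let blender = Process()
blender.executableURL = URL(fileURLWithPath: "/Applications/Blender.app/Contents/MacOS/Blender")
blender.arguments = [
    "--background",               // run without a UI
    "--python", "decimate.py",    // hypothetical script: Decimate modifier + texture compression
    "--",                         // everything after this is passed to the script
    "model.usdz", "model_optimized.usdz"
]
try blender.run()
blender.waitUntilExit()
```

Running the optimization as a subprocess keeps the whole photos-to-USDZ pipeline in one script per object.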

Phone Alternatives

Luma AI and Polycam are apps that do 3D reconstruction on the phone or in the cloud. Luma uses Neural Radiance Fields (NeRF); the result is softer, with fewer artifacts on complex surfaces, but a NeRF mesh needs extra processing before AR (cloud conversion to USDZ). Polycam combines LiDAR scanning on iPhone Pro models with photogrammetry.

ARCore Geospatial Creator and Gaussian Splatting are newer approaches for volumetric scenes; they are not yet production-ready for AR catalogs due to integration complexity.

Case

An antique shop needed 80 items for an AR catalog. We shot in the shop with a 60×60 cm light box and an iPhone 14 Pro. An automated script handled processing: folder of photos → Object Capture → Blender Python for decimation and texture compression → USDZ on a CDN. Of the 80 objects, 12 had glossy surfaces and were processed separately with chalk spray. Average time from shoot to a ready AR model: 2.5 hours, including processing on an M2 MacBook Pro.

Timeline

Task                                        Timeline
Set up shooting pipeline + training         2–3 days
Automate Object Capture → optimization      1 week
Process 50–100 objects                      2–4 weeks

Cost is calculated after reviewing the object types and the required quality.