AI posture analysis via mobile camera in mobile app

NOVASOLUTIONS.TECHNOLOGY develops, supports, and maintains iOS, Android, and PWA mobile applications. We have extensive experience publishing mobile applications on popular marketplaces such as Google Play, the App Store, Amazon, AppGallery, and others.
Development and support of all types of mobile applications:
  • Information and entertainment: news apps, games, reference guides, online catalogs, weather apps, fitness and health apps, travel apps, educational apps, social networks and messengers, quizzes, blogs and podcasts, forums, aggregators
  • E-commerce: online stores, B2B apps, marketplaces, online exchanges, cashback services, dropshipping platforms, loyalty programs, food and goods delivery, payment systems
  • Business process management: CRM and ERP systems, project management, sales team tools, financial management, production management, logistics and delivery management, HR management, data monitoring systems
  • Electronic services: classified ads platforms, online schools, online cinemas, electronic service platforms, cashback platforms, video hosting, thematic portals, online booking and scheduling platforms, online trading platforms

These are just some of the types of mobile applications we work with; each may have its own features and functionality, tailored to the client's needs and goals.


AI-Powered Posture Analysis via Mobile Camera

The front camera captures the user, and a pose estimation model outputs 17-33 skeletal key points in real time. Geometry does the rest: joint angles, center-of-mass shift, tilt of the shoulder line. That is posture analysis.

Pose Estimation: Model Selection

There are two main paths on iOS: the Apple Vision framework with VNDetectHumanBodyPoseRequest, or MediaPipe Pose (BlazePose). On Android: ML Kit Pose Detection or the same MediaPipe.

Apple Vision — Native iOS Choice

import Vision
import AVFoundation

class PostureAnalyzer: NSObject {
    /// Computed metrics delivered to the UI layer.
    struct PostureMetrics {
        let shoulderTilt: Double   // degrees; 0 = level shoulders
        let headOffset: Double     // head shift as a fraction of shoulder width
    }

    var postureObserver: ((PostureMetrics) -> Void)?
    private let poseRequest = VNDetectHumanBodyPoseRequest()

    func analyze(sampleBuffer: CMSampleBuffer) {
        let handler = VNImageRequestHandler(cmSampleBuffer: sampleBuffer, orientation: .up)
        do {
            try handler.perform([poseRequest])
            guard let observation = poseRequest.results?.first else { return }
            processBodyPose(observation)
        } catch {
            print("Pose detection failed: \(error)")
        }
    }

    private func processBodyPose(_ observation: VNHumanBodyPoseObservation) {
        guard
            let leftShoulder = try? observation.recognizedPoint(.leftShoulder),
            let rightShoulder = try? observation.recognizedPoint(.rightShoulder),
            let nose = try? observation.recognizedPoint(.nose),
            leftShoulder.confidence > 0.6,
            rightShoulder.confidence > 0.6
        else { return }

        // Shoulder tilt angle
        let shoulderDelta = leftShoulder.location.y - rightShoulder.location.y
        let shoulderWidth = abs(leftShoulder.location.x - rightShoulder.location.x)
        guard shoulderWidth > 0 else { return }  // shoulders overlap: skip frame
        let shoulderTiltAngle = atan2(shoulderDelta, shoulderWidth) * 180 / .pi

        // Head offset from shoulder center
        let shoulderMidX = (leftShoulder.location.x + rightShoulder.location.x) / 2
        let headOffset = (nose.location.x - shoulderMidX) / shoulderWidth

        postureObserver?(PostureMetrics(
            shoulderTilt: shoulderTiltAngle,
            headOffset: headOffset
        ))
    }
}

The confidence > 0.6 check filters low-confidence key points. Anything below that threshold is ignored; otherwise the model "hallucinates" points for joints that are off-screen.

VNDetectHumanBodyPoseRequest returns 19 points in normalized [0, 1] coordinates. Vision's coordinate system is flipped in Y (0 is the bottom of the image), while UIView coordinates put 0 at the top, so invert Y before drawing.
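The flip can be wrapped in a small helper. A minimal sketch; the function name and parameters are ours, not part of the Vision API:

```swift
import CoreGraphics

// Vision returns points in normalized [0, 1] space with the origin at the
// bottom-left; UIKit's origin is at the top-left, so Y must be flipped.
func viewPoint(fromVisionPoint p: CGPoint, in viewSize: CGSize) -> CGPoint {
    CGPoint(x: p.x * viewSize.width,
            y: (1 - p.y) * viewSize.height)
}
```

Run every key point through this before drawing it onto the preview.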

MediaPipe Pose — Cross-Platform Variant

MediaPipe BlazePose outputs 33 landmarks, including points on the face and hands. It is more accurate than Vision on shoulders and hips, but heavier on resources. On mobile, use the Lite or Full model; Heavy is only for powerful devices without battery constraints.

// iOS via MediaPipe Tasks SDK
import MediaPipeTasksVision

let options = PoseLandmarkerOptions()
options.baseOptions.modelAssetPath = "pose_landmarker_lite.task"  // bundled .task model is required
options.runningMode = .liveStream
options.numPoses = 1
options.minPoseDetectionConfidence = 0.5
options.minPosePresenceConfidence = 0.5
options.poseLandmarkerLiveStreamDelegate = self
let poseLandmarker = try PoseLandmarker(options: options)

Posture Metrics: What We Measure

Good posture is geometry, formalized:

Metric                 | Norm                   | How we calculate
Shoulder tilt          | < 5°                   | atan2(Δy shoulders, Δx shoulders)
Forward head posture   | < 15°                  | angle of the ear-shoulder line to vertical
Torso tilt             | ±3°                    | deviation from the vertical through shoulders and hips
Shoulder Y symmetry    | < 3% of screen height  | difference in shoulder Y coordinates

Forward head posture is the most common issue among computer users. Measure it as the angle between the ear→shoulder vector and vertical. In Vision: take the leftEar → leftShoulder vector and compute its angle to the screen's Y axis.
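The angle can be computed with a small helper. A sketch under assumptions: the function name is ours, and the points are in Vision's normalized coordinates (Y increases upward):

```swift
import CoreGraphics

// Angle (degrees) between the shoulder→ear vector and the vertical axis.
// 0° = ear directly above the shoulder; larger values = head drifting forward.
func forwardHeadAngle(ear: CGPoint, shoulder: CGPoint) -> Double {
    let dx = ear.x - shoulder.x
    let dy = ear.y - shoulder.y
    return Double(abs(atan2(dx, dy)) * 180 / .pi)
}
```

A result above the 15° norm from the table would flag forward head posture.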

Real-Time Analysis: Performance

Running pose estimation on every AVCaptureSession frame (30 fps) is too expensive on older devices. Strategies:

  • Don't analyze every frame: throttle inside the AVCaptureVideoDataOutput sample buffer callback, e.g. to every 100 ms (10 fps of analysis)
  • Execute VNDetectHumanBodyPoseRequest on a background queue, never the main thread; VNImageRequestHandler.perform() is synchronous and blocks its thread

private let analysisQueue = DispatchQueue(label: "posture.analysis", qos: .userInitiated)
private var lastAnalysisTime: CFTimeInterval = 0

func captureOutput(_ output: AVCaptureOutput,
                   didOutput sampleBuffer: CMSampleBuffer,
                   from connection: AVCaptureConnection) {
    let now = CACurrentMediaTime()
    guard now - lastAnalysisTime > 0.1 else { return }  // 10 fps
    lastAnalysisTime = now

    analysisQueue.async {
        self.analyze(sampleBuffer: sampleBuffer)
    }
}

User Feedback

Two modes:

  1. Real-time overlay: lines drawn over the camera preview highlight deviations (e.g. red lines for uneven shoulders). Use ARKit's ARSCNView, or simply a CAShapeLayer on top of the preview layer
  2. Session analysis: the user holds the phone for 30 seconds and receives a session report

Add a sound or haptic signal on deviation, plus a timer for how long good posture is held. For gamification: a daily streak of good posture.
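A session report reduces to simple aggregation over captured samples. A minimal sketch, assuming a `PostureMetrics` struct like the one used in the Vision example; the `SessionReport` type and the thresholds are illustrative:

```swift
struct PostureMetrics {
    let shoulderTilt: Double   // degrees
    let headOffset: Double     // fraction of shoulder width
}

struct SessionReport {
    let goodPostureRatio: Double   // share of samples within the norms
    let worstShoulderTilt: Double  // degrees
}

// Aggregate a session's samples into a report; thresholds are illustrative.
func summarize(_ samples: [PostureMetrics]) -> SessionReport {
    guard !samples.isEmpty else {
        return SessionReport(goodPostureRatio: 0, worstShoulderTilt: 0)
    }
    let good = samples.filter { abs($0.shoulderTilt) < 5 && abs($0.headOffset) < 0.15 }
    let worst = samples.map { abs($0.shoulderTilt) }.max() ?? 0
    return SessionReport(goodPostureRatio: Double(good.count) / Double(samples.count),
                         worstShoulderTilt: worst)
}
```

At 10 fps of analysis, a 30-second session yields about 300 samples to aggregate.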

For correction recommendations, link metrics to specific exercises. Forward head posture above 20°: a chest stretch and posterior neck strengthening, with illustrations and video.
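The mapping can be as simple as a rule table. A hypothetical sketch: the types, thresholds, and exercise names here are ours, for illustration only:

```swift
struct Recommendation {
    let issue: String
    let exercises: [String]
}

// Rule-based mapping from measured angles to corrective exercises.
func recommendations(forwardHeadAngle: Double, shoulderTilt: Double) -> [Recommendation] {
    var result: [Recommendation] = []
    if forwardHeadAngle > 20 {
        result.append(Recommendation(
            issue: "Forward head posture",
            exercises: ["Chest stretch", "Posterior neck strengthening"]))
    }
    if abs(shoulderTilt) > 5 {
        result.append(Recommendation(
            issue: "Uneven shoulders",
            exercises: ["Unilateral trapezius stretch"]))
    }
    return result
}
```

Each rule then links to its illustration and video in the content library.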

Development Process

Choose and integrate a pose estimation engine (Vision vs. MediaPipe, depending on requirements). Implement the geometric posture metrics. Throttle and optimize performance on real devices. Build the UI: camera preview, overlay, and reports. Assemble the library of correction recommendations.

Timeframe Estimates

Real-time analysis with basic metrics and an overlay: 1–2 weeks. A complete app with session history, personal recommendations, and gamification: 2–4 weeks.