Implementing Pet Activity Monitoring via Mobile App
Pet activity trackers (Whistle Go Explore, Tractive GPS, Fi Collar) solve two tasks: geolocation and health monitoring. Geolocation is the straightforward part: an LTE-M/GPS tracker reporting to a server. Activity monitoring is harder: processing IMU data from the collar accelerometer and classifying behavior into sleep, walk, run, play, and eat. That is where the real development effort goes.
Activity Classification from IMU
The collar tracker contains a 3-axis accelerometer (ADXL345, MPU-6050, or similar) sampling at 50 Hz, i.e. 50 acceleration vectors per second. Classifying on-device saves battery, since raw data never has to be streamed, but it requires an ML model in the firmware or in the mobile app.
On iOS, use Core ML with a model trained on labeled activity data, converted via coremltools from TensorFlow or PyTorch. Input tensor: a 2.5-second window at 50 Hz = 125 samples × 3 axes. Output: softmax over the classes {sleep, rest, walk, run, play, eat}.
```swift
import CoreML

class PetActivityClassifier {
    private let model: PetActivityMLModel   // generated Core ML wrapper class
    private var window: [[Double]] = []     // sliding buffer of [x, y, z] samples
    private let windowSize = 125            // 2.5 s at 50 Hz
    private let strideSize = 25             // advance 0.5 s -> 80% window overlap

    init(model: PetActivityMLModel) {
        self.model = model
    }

    // Feed one accelerometer sample; returns a prediction once per stride.
    func processSample(x: Double, y: Double, z: Double) -> ActivityClass? {
        window.append([x, y, z])
        guard window.count >= windowSize else { return nil }

        let input = try? MLMultiArray(shape: [1, NSNumber(value: windowSize), 3],
                                      dataType: .double)
        for (i, sample) in window.prefix(windowSize).enumerated() {
            input?[i * 3]     = NSNumber(value: sample[0])
            input?[i * 3 + 1] = NSNumber(value: sample[1])
            input?[i * 3 + 2] = NSNumber(value: sample[2])
        }
        window.removeFirst(strideSize)  // slide the window forward

        guard let modelInput = input,
              let prediction = try? model.prediction(input: modelInput)
        else { return nil }
        // ActivityClass: String-backed enum over the model's class labels
        return ActivityClass(rawValue: prediction.classLabel)
    }
}
```
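The sliding-window bookkeeping can be checked in isolation, without a Core ML model. A minimal sketch (the `SlidingWindow` type and its defaults are illustrative, not part of any SDK):

```swift
import Foundation

// Standalone sketch of the windowing used by the classifier: a window of
// 125 samples advanced by a stride of 25, so with 50 Hz input one full
// window is emitted every 0.5 s once the buffer is warm.
struct SlidingWindow {
    private var buffer: [[Double]] = []
    let size: Int
    let stride: Int

    init(size: Int = 125, stride: Int = 25) {
        self.size = size
        self.stride = stride
    }

    // Push one sample; returns a full window when enough have accumulated.
    mutating func push(_ sample: [Double]) -> [[Double]]? {
        buffer.append(sample)
        guard buffer.count >= size else { return nil }
        let window = Array(buffer.prefix(size))
        buffer.removeFirst(stride)
        return window
    }
}
```

With these defaults the first window appears at sample 125 and the next one 25 samples later, which is what keeps classification latency at half a second.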
On Android, use TensorFlow Lite with the NNAPI delegate for hardware acceleration. The model ships in assets and loads via Interpreter; the windowing logic is identical.
Syncing Data from Tracker
The tracker accumulates activity aggregates (per-minute summaries) and syncs over BLE when the phone comes into range. Protocol: GATT with an Indicate characteristic. The tracker signals data readiness, and the app reads it in 20-byte chunks (the default 23-byte ATT MTU minus the 3-byte header). To speed this up, request a larger MTU: gatt.requestMtu(512) on Android; on iOS the system negotiates the MTU automatically, and the app queries the usable payload size via peripheral.maximumWriteValueLength(for: .withResponse).
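Reassembling the chunked payload is plain byte bookkeeping. A sketch under an assumed framing (the 2-byte little-endian length prefix and the `ChunkAssembler` type are illustrative, not the actual tracker protocol):

```swift
import Foundation

// Reassembles a synced payload from 20-byte notification chunks.
// Assumed framing: the tracker prefixes the payload with a 2-byte
// little-endian length; real trackers define their own framing.
final class ChunkAssembler {
    private var buffer = Data()

    // Feed each GATT notification chunk; returns the payload once complete.
    func append(_ chunk: Data) -> Data? {
        buffer.append(chunk)
        guard buffer.count >= 2 else { return nil }
        let total = Int(buffer[buffer.startIndex])
                  | (Int(buffer[buffer.startIndex + 1]) << 8)
        guard buffer.count - 2 >= total else { return nil }
        return buffer.dropFirst(2).prefix(total)
    }
}
```

The point of the length prefix is that BLE notifications carry no "end of message" marker, so the app must know when the last chunk has arrived.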
GPS trackers with LTE-M (Tractive, Fi) sync through the cloud, so the mobile app is a client to a REST API. Geofencing runs on the server: "home" and "yard" zones, with a push notification when the pet exits the perimeter.
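The server-side zone check itself is simple geometry: distance from the pet's latest fix to the zone center versus the zone radius. A sketch (zone names and the `GeofenceZone` type are illustrative):

```swift
import Foundation

// Circular geofence zone, e.g. "home" or "yard".
struct GeofenceZone {
    let name: String
    let centerLat: Double, centerLon: Double
    let radiusMeters: Double
}

// Haversine great-circle distance between two coordinates, in meters.
func haversineMeters(_ lat1: Double, _ lon1: Double,
                     _ lat2: Double, _ lon2: Double) -> Double {
    let r = 6_371_000.0 // mean Earth radius, meters
    let dLat = (lat2 - lat1) * .pi / 180
    let dLon = (lon2 - lon1) * .pi / 180
    let a = sin(dLat / 2) * sin(dLat / 2)
          + cos(lat1 * .pi / 180) * cos(lat2 * .pi / 180)
          * sin(dLon / 2) * sin(dLon / 2)
    return 2 * r * asin(sqrt(a))
}

// True when the fix lies outside the zone -> trigger the push notification.
func hasExited(_ zone: GeofenceZone, lat: Double, lon: Double) -> Bool {
    haversineMeters(zone.centerLat, zone.centerLon, lat, lon) > zone.radiusMeters
}
```

In practice the server would also debounce: one noisy GPS fix outside the radius should not fire a push, so exits are usually confirmed over two or three consecutive fixes.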
Health and Activity Dashboard
Daily activity stats are the main screen: a ring chart of the day's activity (sleep/rest/walk/run/play) and trend graphs over weeks and months. On iOS, HealthKit integration via HKWorkout is possible, but pets are not a standard HealthKit subject, so it is optional; the app's own storage is sufficient.
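The ring chart is fed by aggregating the per-minute classification labels into daily shares. A minimal sketch (the `dailyShares` helper is illustrative; class names follow the model output above):

```swift
import Foundation

// Activity classes matching the classifier's softmax output.
enum ActivityClass: String, CaseIterable {
    case sleep, rest, walk, run, play, eat
}

// minuteLabels: one classified label per minute of the day.
// Returns each class's share of the day for the ring chart segments.
func dailyShares(_ minuteLabels: [ActivityClass]) -> [ActivityClass: Double] {
    guard !minuteLabels.isEmpty else { return [:] }
    var counts: [ActivityClass: Int] = [:]
    for label in minuteLabels { counts[label, default: 0] += 1 }
    return counts.mapValues { Double($0) / Double(minuteLabels.count) }
}
```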
Metrics owners care about:
- Active movement minutes per day
- Distance (calculated from steps or GPS)
- Calories (rough estimate from pet weight and activity)
- Sleep quality (night movement)
- Week-on-week comparison
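The calorie metric really is only a rough estimate. One way to sketch it is a MET-style table keyed by activity class, scaled by body weight; the coefficients below are illustrative placeholders, not validated veterinary values:

```swift
import Foundation

// Illustrative kcal-per-kg-per-minute coefficients by activity class.
// These numbers are placeholders; real values would come from
// veterinary energy-expenditure data per species and breed.
let kcalPerKgPerMinute: [String: Double] = [
    "rest": 0.02, "walk": 0.06, "run": 0.14, "play": 0.10
]

// minutes: activity class -> minutes spent in it today.
func estimatedCalories(weightKg: Double, minutes: [String: Double]) -> Double {
    minutes.reduce(0) { total, entry in
        total + weightKg * (kcalPerKgPerMinute[entry.key] ?? 0) * entry.value
    }
}
```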
Anomaly pushes ("Fluffy is 80% less active than usual") go out via Firebase Cloud Messaging, driven by server-side analytics against the baseline activity of the previous 7 days.
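The baseline comparison behind that push is a small calculation, sketched here in Swift for illustration (the 50% notification threshold is an assumed tuning parameter, not a value from the source):

```swift
import Foundation

// Percent drop of today's active minutes versus the mean of the
// previous 7 days; nil when there is no usable baseline.
func activityDropPercent(today: Double, previous7Days: [Double]) -> Double? {
    guard !previous7Days.isEmpty else { return nil }
    let baseline = previous7Days.reduce(0, +) / Double(previous7Days.count)
    guard baseline > 0 else { return nil }
    return max(0, (baseline - today) / baseline * 100)
}

// Assumed threshold: notify the owner on a drop of 50% or more.
func shouldNotify(dropPercent: Double, threshold: Double = 50) -> Bool {
    dropPercent >= threshold
}
```

Clamping the drop at zero means days that are *more* active than the baseline simply produce no anomaly, which keeps the push channel for concerning changes only.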
Geolocation: GPS + Geofencing
For GPS trackers with cellular connectivity, the app pulls positions periodically via the API: every 1-5 minutes in battery-save mode, every 10-30 seconds in live-tracking mode. The map is MapKit (iOS) or the Google Maps SDK (Android), with the pet's track history per day.
App-level geofencing with CLRegion/CLCircularRegion via CLLocationManager on iOS monitors the phone's position, not the tracker's, so it only helps while the pet is near the phone. Server-side geofencing on the tracker's GPS fixes, with push notifications, is more reliable.
Developing a mobile app for a BLE pet activity tracker with behavior classification and a dashboard takes 6-10 weeks; with GPS integration and server geofencing, 3-4 months. Cost is quoted individually after reviewing the tracker protocol and platform requirements.