Integrating Siri for Voice Control in an iOS App
Siri Shortcuts let users trigger your app's functionality by voice without opening it manually. The user says "Hey Siri, add milk to the list," and the app processes the command in the background, without showing any UI. This is not UIApplicationShortcutItem for 3D Touch quick actions — it is full-fledged natural-language integration through SiriKit.
Two Approaches: Intents vs NSUserActivity
NSUserActivity + INVoiceShortcut — the simplest path. Mark the activity as eligible for Siri prediction, and the user assigns a phrase to it in the Shortcuts app or Siri settings:
// In a view controller: describe the current screen as a continuable activity
let activity = NSUserActivity(activityType: "com.yourapp.openDashboard")
activity.title = "Open Dashboard"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true // required for Siri Suggestions
activity.suggestedInvocationPhrase = "Open my dashboard"
userActivity = activity // UIResponder property; set it on the view controller, not the view
activity.becomeCurrent()
When the user speaks the assigned phrase, the app opens and receives the activity through application(_:continue:restorationHandler:). Drawback: this cannot run in the background — the app has to launch.
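A minimal sketch of the receiving side, assuming a UIKit AppDelegate and the hypothetical activity type from the snippet above:

```swift
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Called when Siri (or Spotlight/Handoff) launches the app with an activity.
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        guard userActivity.activityType == "com.yourapp.openDashboard" else {
            return false // not our activity type; decline
        }
        // Navigate to the dashboard screen here.
        return true
    }
}
```

In a scene-based app the same callback arrives in the scene delegate's scene(_:continue:) instead.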
INIntent + INExtension — more powerful. The command is handled in the background by an app extension, without launching the main app. That is how "Turn on the light" or "Order a taxi" work.
Implementation Through Custom Intent
Xcode → File → New → Target → Intents Extension. Create a .intentdefinition file describing the intent and its parameters:
Intent: AddItemIntent
Parameters:
- itemName: String (required)
- listName: String (optional)
Response:
- success: "Added \(itemName) to \(listName)"
- failure: "Failed to add"
Handler in extension:
import Intents

class AddItemIntentHandler: NSObject, AddItemIntentHandling {

    func handle(intent: AddItemIntent,
                completion: @escaping (AddItemIntentResponse) -> Void) {
        guard let itemName = intent.itemName else {
            completion(AddItemIntentResponse(code: .failure, userActivity: nil))
            return
        }
        // Access shared App Group container
        let store = ListStore(appGroup: "group.com.yourapp")
        store.addItem(itemName, to: intent.listName ?? "Main List")

        let response = AddItemIntentResponse(code: .success, userActivity: nil)
        response.itemName = itemName
        completion(response)
    }

    func resolveItemName(for intent: AddItemIntent,
                         with completion: @escaping (INStringResolutionResult) -> Void) {
        if let name = intent.itemName, !name.isEmpty {
            completion(.success(with: name))
        } else {
            completion(.needsValue())
        }
    }
}
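For the phrase to surface in Siri Suggestions, the main app should also donate the intent whenever the user performs the action manually. A sketch, assuming AddItemIntent is the class Xcode generates from the .intentdefinition above:

```swift
import Intents

// Donate an interaction each time the user adds an item in the app's own UI,
// so Siri can learn the pattern and suggest a shortcut for it.
func donateAddItem(name: String, list: String) {
    let intent = AddItemIntent()
    intent.itemName = name
    intent.listName = list
    intent.suggestedInvocationPhrase = "Add to my list"

    let interaction = INInteraction(intent: intent, response: nil)
    interaction.donate { error in
        if let error = error {
            // Donation failures are non-fatal; log and move on.
            print("Intent donation failed: \(error.localizedDescription)")
        }
    }
}
```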
App Groups are mandatory for the extension to access the main app's data. The extension and the main target are added to the same App Group (group.com.yourapp), and the shared container and UserDefaults are initialized through this identifier.
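A sketch of sharing data across the group boundary (the group identifier is the hypothetical one used throughout this text; both targets must have the App Groups capability enabled with it):

```swift
import Foundation

// Lightweight shared state via the group's UserDefaults suite
let sharedDefaults = UserDefaults(suiteName: "group.com.yourapp")
sharedDefaults?.set(["Milk"], forKey: "mainList")

// For larger data (e.g. an SQLite file), use the shared container URL
let databaseURL = FileManager.default
    .containerURL(forSecurityApplicationGroupIdentifier: "group.com.yourapp")?
    .appendingPathComponent("lists.sqlite")
```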
Siri Vocabulary: Improving Recognition Accuracy
For user-specific terms (the user's playlist names, task lists, contact groups) — register them at runtime with INVocabulary.shared().setVocabularyStrings(_:of:). For app-wide terms known at build time — an AppIntentVocabulary.plist file in the main bundle with global phrases.
Without this, Siri may transcribe "add to Inbox" as "add to inbox" (lowercase) or "add to index" — especially with non-standard names.
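A sketch of runtime vocabulary registration; the list names are hypothetical examples, and the vocabulary type should match your domain:

```swift
import Intents

// Teach Siri the user's own list names so they are transcribed correctly.
// The full vocabulary must be re-registered on every call (it replaces,
// not appends), ordered by expected frequency of use.
let listNames = NSOrderedSet(array: ["Inbox", "Groceries", "Weekend Projects"])
INVocabulary.shared().setVocabularyStrings(listNames, of: .notebookItemGroupName)
```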
Testing
The Simulator supports intent testing through Xcode: run the Intents Extension scheme with a query configured in the scheme's options, using parameters from the .intentdefinition. Voice recognition, however, works only on a real device. The first intent request takes extra time: Siri adapts its model to the user.
Debugging tools — os_log with your app's own subsystem inside the extension (filter by it in Console.app), plus the standard Xcode console while the extension scheme is attached.
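A minimal logging sketch for the extension; the subsystem and category names are examples, not fixed values:

```swift
import os

// A dedicated logger makes extension output filterable in Console.app
let log = Logger(subsystem: "com.yourapp.intents", category: "AddItem")

let itemName = "Milk" // example value; in practice, from the intent
log.info("Handling AddItemIntent for item: \(itemName, privacy: .public)")
```

Note the privacy annotation: string interpolations in os logs are redacted by default on device, so mark non-sensitive values .public to see them.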
Work Process
1. Analyze the app's functionality: which actions make sense to invoke by voice, and with what parameters.
2. Choose the approach: NSUserActivity (simple navigation shortcuts) or custom Intents (background actions).
3. Develop: .intentdefinition, the intent handler, App Groups for shared state.
4. Register the intent in the main app's and the extension's Info.plist.
5. Test on a real device; set up Siri Vocabulary for domain-specific terms.
Timeline Benchmarks
Simple shortcuts via NSUserActivity — about 1 day. A custom INIntent with an app extension and App Groups — 3–5 days, including device testing and vocabulary setup.