Implementing In-App Feedback (Screenshot + Annotation + Description) in Mobile Apps
Traditional feedback forms provide weak context: a user writes "the button doesn't work," and the developer has no way to tell which button or under what conditions. In-app feedback with screenshot capture and annotation tools solves this: users literally show what's wrong, and the level of detail in bug reports rises dramatically.
Screenshot Capture
iOS—UIGraphicsImageRenderer
func captureScreenshot() -> UIImage? {
    // Find the key window via connected scenes
    // (UIApplication.shared.windows is deprecated since iOS 15)
    let keyWindow = UIApplication.shared.connectedScenes
        .compactMap { $0 as? UIWindowScene }
        .flatMap { $0.windows }
        .first { $0.isKeyWindow }
    guard let window = keyWindow else { return nil }
    let renderer = UIGraphicsImageRenderer(bounds: window.bounds)
    return renderer.image { ctx in
        window.layer.render(in: ctx.cgContext)
    }
}
Important nuance: layer.render does not capture content from WKWebView and ARSCNView—they render through a separate GPU context. For WebView, use WKWebView.takeSnapshot(with:):
webView.takeSnapshot(with: nil) { image, error in
    // Insert into the final screenshot via Core Graphics
}
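The compositing step can be sketched as follows: draw the base screenshot first, then draw the web view's snapshot at the web view's frame. This assumes the frame is already expressed in the base image's coordinate space.

```swift
import UIKit

// Sketch: composite a WKWebView snapshot over the base screenshot.
// `frame` is assumed to be the web view's frame in the window's coordinates.
func composite(webSnapshot: UIImage, over baseImage: UIImage, at frame: CGRect) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: baseImage.size)
    return renderer.image { _ in
        baseImage.draw(at: .zero)
        // Draw the separately captured web content where the web view sits
        webSnapshot.draw(in: frame)
    }
}
```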
Android—PixelCopy API
Before Android 8.0, View.getDrawingCache() was used, but it doesn't capture SurfaceView or TextureView content (video, maps, camera). On Android 8.0+ (API 26), use PixelCopy:
fun captureScreenshot(activity: Activity, callback: (Bitmap?) -> Unit) {
    val bitmap = Bitmap.createBitmap(
        activity.window.decorView.width,
        activity.window.decorView.height,
        Bitmap.Config.ARGB_8888
    )
    PixelCopy.request(activity.window, bitmap, { result ->
        callback(if (result == PixelCopy.SUCCESS) bitmap else null)
    }, Handler(Looper.getMainLooper()))
}
For Flutter, use RenderRepaintBoundary (the target widget must be wrapped in a RepaintBoundary carrying the same key):
import 'dart:ui' as ui;
import 'package:flutter/rendering.dart';
import 'package:flutter/widgets.dart';

Future<ui.Image> captureWidget(GlobalKey key) async {
  final boundary = key.currentContext!.findRenderObject()
      as RenderRepaintBoundary;
  return boundary.toImage(pixelRatio: 3.0);
}
Annotation Tool
After capturing the screenshot, users should highlight the problem area. Basic tools include: marker (freehand drawing), arrow, rectangular selection, and text label. Optionally—pixelization (blur) to hide sensitive data before sending.
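The pixelization step can be sketched with Core Image's CIPixellate filter, applied only to a user-selected region. The region is assumed to be given in image coordinates; note that Core Image uses a bottom-left origin, so the Y axis is flipped.

```swift
import UIKit
import CoreImage

// Sketch: pixelate a region of the screenshot to hide sensitive data.
// `region` is assumed to be in image coordinates (top-left origin).
func pixelate(_ image: UIImage, in region: CGRect, scale: CGFloat = 20) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIPixellate") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(scale, forKey: kCIInputScaleKey)
    guard let pixelated = filter.outputImage else { return nil }
    // Flip the region into Core Image's bottom-left coordinate system
    let ciRegion = CGRect(x: region.origin.x,
                          y: input.extent.height - region.maxY,
                          width: region.width,
                          height: region.height)
    // Keep the pixelated pixels only inside the region, original elsewhere
    let composed = pixelated.cropped(to: ciRegion).composited(over: input)
    let context = CIContext()
    guard let cg = context.createCGImage(composed, from: input.extent) else { return nil }
    return UIImage(cgImage: cg)
}
```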
Canvas Implementation on iOS
class AnnotationCanvasView: UIView {
    private var paths: [UIBezierPath] = []
    private var currentPath: UIBezierPath?
    var strokeColor: UIColor = .red
    var strokeWidth: CGFloat = 3.0

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        let path = UIBezierPath()
        path.lineWidth = strokeWidth // set here so the live stroke matches finished ones
        path.move(to: point)
        currentPath = path
    }

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let point = touches.first?.location(in: self) else { return }
        currentPath?.addLine(to: point)
        setNeedsDisplay()
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        if let path = currentPath { paths.append(path) }
        currentPath = nil
        setNeedsDisplay()
    }

    override func draw(_ rect: CGRect) {
        strokeColor.setStroke()
        for path in paths {
            path.stroke()
        }
        currentPath?.stroke()
    }
}
The final annotated screenshot is created by merging the canvas layer over the screenshot image via UIGraphicsImageRenderer.
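The merge described above can be sketched like this: draw the screenshot, then render the canvas layer on top, scaling it to match the screenshot's size in case the canvas was displayed at a different resolution.

```swift
import UIKit

// Sketch: flatten the annotation canvas over the captured screenshot.
func mergeAnnotations(screenshot: UIImage, canvas: AnnotationCanvasView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: screenshot.size)
    return renderer.image { ctx in
        screenshot.draw(at: .zero)
        // Scale the canvas to the screenshot's size before rendering its layer
        ctx.cgContext.scaleBy(x: screenshot.size.width / canvas.bounds.width,
                              y: screenshot.size.height / canvas.bounds.height)
        canvas.layer.render(in: ctx.cgContext)
    }
}
```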
Ready-Made Libraries
If you don't need to build a canvas from scratch: PSPDFKit Annotations (commercial, professional), Pen (iOS, open source), Annotatable (Flutter). For most product tasks, a custom canvas takes 2–3 days to develop and gives complete UX control.
Metadata Collection
Automatically attach metadata to the screenshot:
struct FeedbackPayload: Encodable {
    let screenshot: Data    // JPEG, quality 0.7
    let description: String
    let appVersion: String
    let osVersion: String
    let deviceModel: String
    let screenName: String  // current screen (router/NavigationStack)
    let userId: String?
    let sessionId: String   // UUID for correlation with logs
    let timestamp: Date
}
screenName is especially important: it immediately shows which screen the issue occurred on, without having to question the user.
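Populating the payload can be sketched as follows. The `currentScreenName()` and `currentSessionId` helpers are hypothetical app-specific pieces (a router hook and a per-launch UUID); everything else comes from UIKit and the bundle.

```swift
import UIKit

// Sketch: fill FeedbackPayload automatically.
// `currentScreenName()` and `currentSessionId` are hypothetical helpers.
func makePayload(screenshot: UIImage, description: String) -> FeedbackPayload? {
    guard let jpeg = screenshot.jpegData(compressionQuality: 0.7) else { return nil }
    let info = Bundle.main.infoDictionary
    return FeedbackPayload(
        screenshot: jpeg,
        description: description,
        appVersion: info?["CFBundleShortVersionString"] as? String ?? "unknown",
        osVersion: UIDevice.current.systemVersion,
        deviceModel: UIDevice.current.model, // "iPhone"; use utsname for the exact model id
        screenName: currentScreenName(),
        userId: nil,
        sessionId: currentSessionId,
        timestamp: Date()
    )
}
```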
Sending and Storage
Send the screenshot as multipart/form-data. For backend storage, use S3 or a similar object store with pre-signed URLs. Attach a link to the image to a ticket in your tracker (Jira, Linear, Sentry).
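Building the multipart body by hand can be sketched as below. The endpoint URL and field names ("payload", "screenshot") are assumptions; `jsonData` is the JSON-encoded FeedbackPayload.

```swift
import Foundation

// Sketch: multipart/form-data request with JSON metadata plus the JPEG.
// URL and part names are assumptions, not a fixed contract.
func makeUploadRequest(jsonData: Data, screenshot: Data) -> URLRequest {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: URL(string: "https://api.example.com/feedback")!)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")

    var body = Data()
    // Part 1: JSON metadata
    body.append("--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"payload\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: application/json\r\n\r\n".data(using: .utf8)!)
    body.append(jsonData)
    // Part 2: JPEG screenshot
    body.append("\r\n--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"screenshot\"; filename=\"screenshot.jpg\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: image/jpeg\r\n\r\n".data(using: .utf8)!)
    body.append(screenshot)
    // Closing boundary
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body
    return request
}
```

The request can then be sent with a plain URLSession data task.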
Example via Sentry:
let attachment = Attachment(
    data: screenshotData,
    filename: "screenshot.jpg",
    contentType: "image/jpeg"
)
SentrySDK.capture(message: feedback.description) { scope in
    scope.addAttachment(attachment)
    scope.setTag(value: feedback.screenName, key: "screen")
}
Timeline Estimates
Custom implementation with canvas annotation, screenshot capture, and delivery to Jira/Sentry: 1–1.5 weeks. Integrating a ready-made annotation library with a custom UI wrapper: 3–5 days.