AI Image Editing (Outpainting) for Mobile App

NOVASOLUTIONS.TECHNOLOGY develops, supports, and maintains iOS, Android, and PWA mobile applications. We have extensive experience and expertise in publishing mobile applications on popular marketplaces such as Google Play, the App Store, the Amazon Appstore, AppGallery, and others.
Development and support of all types of mobile applications:
Information and entertainment mobile applications
News apps, games, reference guides, online catalogs, weather apps, fitness and health apps, travel apps, educational apps, social networks and messengers, quizzes, blogs and podcasts, forums, aggregators
E-commerce mobile applications
Online stores, B2B apps, marketplaces, online exchanges, cashback services, dropshipping platforms, loyalty programs, food and goods delivery, payment systems.
Business process management mobile applications
CRM systems, ERP systems, project management, sales team tools, financial management, production management, logistics and delivery management, HR management, data monitoring systems
Electronic services mobile applications
Classified ads platforms, online schools, online cinemas, electronic service platforms, cashback platforms, video hosting, thematic portals, online booking and scheduling platforms, online trading platforms

These are just some of the types of mobile applications we work with, and each of them may have its own specific features and functionality, tailored to the specific needs and goals of the client.

Latest works
  • Development of a mobile application for FEEDME
  • Development of a mobile application for XOOMER
  • Development of a mobile application for RHL
  • Development of a mobile application for ZIPPY
  • Development of a mobile application for Affhome
  • Development of a mobile application for the FLAVORS company

Implementing AI Image Editing (Outpainting) in a Mobile App

Outpainting extends an image beyond its original borders. A user uploads a 9:16 photo and wants 16:9 — the model paints in the sides while preserving style and context. Or the user wants to "pull the camera back" and show more of the environment around the subject. DALL-E 2 supports this natively; Stable Diffusion handles it via the same inpainting mechanism with an expanded canvas.

Client-side logic: canvas preparation

The essence of outpainting: place the original image on a larger canvas (with transparent or neutral padding), send it to the API as image + mask (where the mask marks the new, empty areas), and receive the filled-in result.

func prepareOutpaintingCanvas(
    original: UIImage,
    targetSize: CGSize,
    placement: CGPoint // where to place original on canvas
) -> (image: UIImage, mask: UIImage) {
    // Create canvas of desired size
    UIGraphicsBeginImageContextWithOptions(targetSize, false, 1.0)
    let context = UIGraphicsGetCurrentContext()!

    // Fill with gray (neutral color for expansion areas)
    context.setFillColor(UIColor.gray.cgColor)
    context.fill(CGRect(origin: .zero, size: targetSize))

    // Insert original
    original.draw(at: placement)
    let compositeImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()

    // Create grayscale mask: black = preserve (original), white = paint over (new areas)
    // For DALL-E this must become an alpha mask — see convertToAlphaMask below
    UIGraphicsBeginImageContextWithOptions(targetSize, false, 1.0)
    let maskContext = UIGraphicsGetCurrentContext()!

    // White (transparent for DALL-E) — new areas
    maskContext.setFillColor(UIColor.white.cgColor)
    maskContext.fill(CGRect(origin: .zero, size: targetSize))

    // Black (opaque) — original image
    maskContext.setFillColor(UIColor.black.cgColor)
    maskContext.fill(CGRect(origin: placement, size: original.size))
    let maskImage = UIGraphicsGetImageFromCurrentImageContext()!
    UIGraphicsEndImageContext()

    return (compositeImage, maskImage)
}

The position of the original on the canvas determines the direction of expansion:

  • Canvas center → expand on all sides
  • Left edge → expand right only
  • Arbitrary → user controls via drag
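
The placement point passed to prepareOutpaintingCanvas can be derived from the chosen direction. A minimal sketch — the ExpandDirection enum and the placement(for:original:canvas:) helper are illustrative names, not part of the code above:

```swift
import Foundation // CGPoint / CGSize

enum ExpandDirection { case allSides, right, left, down }

func placement(for direction: ExpandDirection, original: CGSize, canvas: CGSize) -> CGPoint {
    switch direction {
    case .allSides: // centered → the model paints on every side
        return CGPoint(x: (canvas.width - original.width) / 2,
                       y: (canvas.height - original.height) / 2)
    case .right:    // pinned to the left edge → expand right only
        return CGPoint(x: 0, y: (canvas.height - original.height) / 2)
    case .left:     // pinned to the right edge → expand left only
        return CGPoint(x: canvas.width - original.width,
                       y: (canvas.height - original.height) / 2)
    case .down:     // pinned to the top edge → expand downward only
        return CGPoint(x: (canvas.width - original.width) / 2, y: 0)
    }
}
```

The arbitrary (drag-controlled) case simply bypasses this helper and uses the user's offset directly.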

UI for controlling expansion

Interaction pattern:

  1. The user sees the original image inside the canvas "frame"
  2. They can drag the image within the frame (or move the frame itself)
  3. They select the final aspect ratio: 16:9, 4:3, 1:1, or custom
  4. They press "Expand"

// Android: draggable image inside the canvas with GestureDetector
class OutpaintingView(context: Context, private val originalBitmap: Bitmap) : View(context) {
    var imageOffsetX = 0f
    var imageOffsetY = 0f

    // The canvas is larger than the image, so valid offsets are 0..(canvas size − image size)
    private val maxOffsetX: Float
        get() = (width - originalBitmap.width).coerceAtLeast(0).toFloat()
    private val maxOffsetY: Float
        get() = (height - originalBitmap.height).coerceAtLeast(0).toFloat()

    private val gestureDetector = GestureDetector(context, object : GestureDetector.SimpleOnGestureListener() {
        override fun onScroll(e1: MotionEvent?, e2: MotionEvent, dx: Float, dy: Float): Boolean {
            // distanceX/Y are "previous minus current", so subtract to follow the finger
            imageOffsetX = (imageOffsetX - dx).coerceIn(0f, maxOffsetX)
            imageOffsetY = (imageOffsetY - dy).coerceIn(0f, maxOffsetY)
            invalidate()
            return true
        }
    })

    // Without this override the GestureDetector never receives touch events
    override fun onTouchEvent(event: MotionEvent): Boolean =
        gestureDetector.onTouchEvent(event) || super.onTouchEvent(event)

    override fun onDraw(canvas: Canvas) {
        canvas.drawColor(Color.DKGRAY) // expansion-area background
        canvas.drawBitmap(originalBitmap, imageOffsetX, imageOffsetY, null)
    }
}
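
Step 3 above (choosing the final aspect ratio) boils down to computing the canvas size. A possible sketch — canvasSize(for:ratioW:ratioH:) is an illustrative helper, assuming the canvas must fully contain the original:

```swift
import Foundation

// Smallest canvas with the requested W:H ratio that fully contains the original
func canvasSize(for original: CGSize, ratioW: CGFloat, ratioH: CGFloat) -> CGSize {
    let target = ratioW / ratioH
    let current = original.width / original.height
    if target > current {
        // Wider target: keep the height, extend the width
        return CGSize(width: (original.height * target).rounded(.up), height: original.height)
    } else {
        // Taller (or equal) target: keep the width, extend the height
        return CGSize(width: original.width, height: (original.width / target).rounded(.up))
    }
}
```

The result then becomes targetSize for prepareOutpaintingCanvas.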

Send to DALL-E 2

// DALL-E 2: image and mask sent as PNG with alpha channel
// In mask: transparency = edit, opacity = preserve

func outpaint(composite: UIImage, mask: UIImage, prompt: String, targetSize: String = "1024x1024") async throws -> UIImage {
    // Convert mask: white pixels → transparent (alpha = 0)
    let alphaMask = convertToAlphaMask(mask)

    guard let compositeData = composite.pngData(),
          let maskData = alphaMask.pngData() else { throw OutpaintError.conversionFailed }

    // Request identical to inpainting — same /v1/images/edits endpoint
    return try await sendInpaintRequest(imageData: compositeData, maskData: maskData, prompt: prompt, size: targetSize)
}

private func convertToAlphaMask(_ mask: UIImage) -> UIImage {
    guard let ciImage = CIImage(image: mask) else { return mask }
    // Invert the grayscale mask: white (new areas) → black, black (original) → white
    let inverted = ciImage.applyingFilter("CIColorInvert")
    // CIMaskToAlpha maps luminance to alpha: white → opaque, black → transparent.
    // After the inversion, the new areas end up transparent — exactly what DALL-E expects.
    let alphaMask = inverted.applyingFilter("CIMaskToAlpha")
    // Render through CIContext so pngData() later produces a real PNG with an alpha channel
    let ciContext = CIContext()
    guard let cgImage = ciContext.createCGImage(alphaMask, from: alphaMask.extent) else { return mask }
    return UIImage(cgImage: cgImage)
}
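
The sendInpaintRequest transport used above is assumed rather than shown. A sketch of building the multipart request for /v1/images/edits — makeImageEditRequest is a hypothetical helper, with error handling and response parsing omitted:

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking // URLRequest lives here on non-Apple platforms
#endif

func makeImageEditRequest(
    imageData: Data, maskData: Data, prompt: String, size: String, apiKey: String
) -> URLRequest {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/images/edits")!)
    request.httpMethod = "POST"
    request.setValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.setValue("multipart/form-data; boundary=\(boundary)", forHTTPHeaderField: "Content-Type")

    var body = Data()
    // Text fields (prompt, size, n)
    func field(_ name: String, _ value: String) {
        body.append("--\(boundary)\r\nContent-Disposition: form-data; name=\"\(name)\"\r\n\r\n\(value)\r\n".data(using: .utf8)!)
    }
    // PNG attachments (composite image and alpha mask)
    func file(_ name: String, _ data: Data) {
        body.append("--\(boundary)\r\nContent-Disposition: form-data; name=\"\(name)\"; filename=\"\(name).png\"\r\nContent-Type: image/png\r\n\r\n".data(using: .utf8)!)
        body.append(data)
        body.append("\r\n".data(using: .utf8)!)
    }
    file("image", imageData)
    file("mask", maskData)
    field("prompt", prompt)
    field("size", size)
    field("n", "1")
    body.append("--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body
    return request
}
```

The response arrives as JSON with a URL (or base64 data) of the generated image, which still has to be downloaded and decoded.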

Size limitations

DALL-E 2 accepts only square images: 256×256, 512×512, or 1024×1024. For outpainting to 16:9 you need an intermediate step: generate at 1024×1024 and then crop to the desired ratio, or run several outpainting iterations.

Stable Diffusion via Replicate or FAL accepts arbitrary sizes (in multiples of 64), which is more convenient for outpainting to non-standard ratios.
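
A small sketch of snapping arbitrary dimensions to that 64-pixel grid before sending a request (snappedTo64 is an illustrative name):

```swift
import Foundation

// Snap each dimension to the nearest multiple of 64, with a floor of 64
func snappedTo64(_ size: CGSize) -> CGSize {
    func snap(_ v: CGFloat) -> CGFloat { max(64, (v / 64).rounded() * 64) }
    return CGSize(width: snap(size.width), height: snap(size.height))
}
```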

Common issues

A visible seam at the border between the original and the painted area — happens when the original sits with a hard edge against the neutral background. Solution: a slight gradient transition (feathering) at the mask edges — blur the mask by 10–20 pixels.
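
One way to feather, assuming the grayscale mask from prepareOutpaintingCanvas: a CIGaussianBlur pass before the alpha conversion. featheredMask is an illustrative helper, and the 15-pixel radius is just a starting point:

```swift
import CoreImage
import UIKit

func featheredMask(_ mask: UIImage, radius: Double = 15) -> UIImage {
    guard let ciImage = CIImage(image: mask) else { return mask }
    let blurred = ciImage
        .applyingFilter("CIGaussianBlur", parameters: [kCIInputRadiusKey: radius])
        .cropped(to: ciImage.extent) // the blur expands the extent; crop back to the original
    let context = CIContext()
    guard let cgImage = context.createCGImage(blurred, from: blurred.extent) else { return mask }
    return UIImage(cgImage: cgImage)
}
```

The softened edge lets the model blend the painted region into the original pixels instead of butting up against a hard boundary.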

Style mismatch — the model paints the new area in a completely different style. The prompt should describe the entire scene, not just the new part: "sunny urban landscape, photorealism, warm tones" rather than "continue".

Timeline

Basic outpainting with DALL-E 2 (fixed expansion directions) takes 5–7 days. An interactive draggable canvas with arbitrary positioning, multiple iterations, and history takes 3–4 weeks.