Deploying Automatic1111 (SDXL WebUI)
Automatic1111 (A1111) is the most widely used web interface for Stable Diffusion, with an ecosystem of over 1,000 extensions. It suits teams that want a ready-made UI for interactive generation plus a REST API for automation.
Production deployment
# Clone and install
git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui
cd stable-diffusion-webui
# Place models in the right directories:
# ./models/Stable-diffusion/ — main checkpoints (.safetensors)
# ./models/Lora/ — LoRA files
# ./models/ControlNet/ — ControlNet models
# ./models/VAE/ — VAE checkpoints
# Launch with the API and optimizations enabled.
# Note: --xformers and --opt-sdp-attention select alternative attention
# backends — pick one (--opt-sdp-attention needs no extra install on PyTorch 2.x)
./webui.sh \
  --api \
  --api-auth user:password \
  --listen \
  --port 7860 \
  --xformers \
  --medvram-sdxl \
  --no-progressbar-hiding
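For production it is worth running the WebUI under a process supervisor rather than a shell session. A minimal systemd unit sketch, assuming the install path `/opt/stable-diffusion-webui` and a dedicated `sdwebui` user (both placeholders — adjust to your host):

```ini
# /etc/systemd/system/sd-webui.service — paths, user, and credentials are illustrative
[Unit]
Description=Stable Diffusion WebUI (Automatic1111)
After=network-online.target

[Service]
Type=simple
User=sdwebui
WorkingDirectory=/opt/stable-diffusion-webui
ExecStart=/opt/stable-diffusion-webui/webui.sh --api --api-auth user:password --listen --port 7860 --xformers --medvram-sdxl
Restart=on-failure
RestartSec=10

[Install]
WantedBy=multi-user.target
```

Enable it with `systemctl enable --now sd-webui`; `Restart=on-failure` brings the service back up after an OOM or driver crash.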
Nginx reverse proxy
server {
    listen 443 ssl;
    server_name sd.example.com;
    ssl_certificate /etc/ssl/sd.crt;
    ssl_certificate_key /etc/ssl/sd.key;

    location / {
        proxy_pass http://127.0.0.1:7860;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        # Gradio's UI uses websockets, so upgrade the connection
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_read_timeout 300s;  # generation can take > 60 s
        proxy_send_timeout 300s;
    }
}
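Clients behind the proxy authenticate with HTTP Basic Auth (the credentials passed to `--api-auth`). A minimal stdlib sketch of building such a request; the hostname and credentials are the placeholders from the config above:

```python
import base64
from urllib.request import Request


def basic_auth_header(username: str, password: str) -> str:
    """Build the Authorization header value that --api-auth expects."""
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    return f"Basic {token}"


# Build (but don't send) a request to a proxied API endpoint;
# /sdapi/v1/sd-models lists the available checkpoints.
req = Request(
    "https://sd.example.com/sdapi/v1/sd-models",
    headers={"Authorization": basic_auth_header("user", "password")},
)
```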
API usage
import base64

import httpx


class A1111Client:
    def __init__(self, base_url: str, username: str | None = None, password: str | None = None):
        self.base_url = base_url.rstrip("/")
        self.auth = (username, password) if username else None

    async def txt2img(self, payload: dict) -> list[bytes]:
        async with httpx.AsyncClient(timeout=300, auth=self.auth) as client:
            resp = await client.post(f"{self.base_url}/sdapi/v1/txt2img", json=payload)
            resp.raise_for_status()
            return [base64.b64decode(img) for img in resp.json()["images"]]

    async def interrogate(self, image_bytes: bytes, model: str = "clip") -> str:
        """Infer a prompt that describes an existing image."""
        payload = {
            "image": base64.b64encode(image_bytes).decode(),
            "model": model,  # "clip" or "deepdanbooru"
        }
        async with httpx.AsyncClient(timeout=60, auth=self.auth) as client:
            resp = await client.post(f"{self.base_url}/sdapi/v1/interrogate", json=payload)
            resp.raise_for_status()
            return resp.json()["caption"]

    async def upscale(self, image_bytes: bytes, scale: float = 2.0, upscaler: str = "ESRGAN_4x") -> bytes:
        payload = {
            "image": base64.b64encode(image_bytes).decode(),
            "upscaling_resize": scale,
            "upscaler_1": upscaler,
        }
        async with httpx.AsyncClient(timeout=120, auth=self.auth) as client:
            resp = await client.post(f"{self.base_url}/sdapi/v1/extra-single-image", json=payload)
            resp.raise_for_status()
            return base64.b64decode(resp.json()["image"])
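A typical `txt2img` payload bundles the prompt with SDXL-appropriate generation settings. A small helper sketch with assumed defaults (the field names follow the `/sdapi/v1/txt2img` schema; the sampler name and defaults are illustrative and may differ across WebUI versions):

```python
def build_sdxl_payload(
    prompt: str,
    negative_prompt: str = "",
    width: int = 1024,
    height: int = 1024,
    steps: int = 30,
    cfg_scale: float = 7.0,
    sampler_name: str = "DPM++ 2M Karras",
) -> dict:
    """Assemble a txt2img request body with SDXL's native 1024x1024 resolution."""
    return {
        "prompt": prompt,
        "negative_prompt": negative_prompt,
        "width": width,
        "height": height,
        "steps": steps,
        "cfg_scale": cfg_scale,
        "sampler_name": sampler_name,
    }


# Usage with the client above (inside an async context):
# payload = build_sdxl_payload("a photo of a lighthouse at dawn")
# images = await A1111Client("https://sd.example.com", "user", "password").txt2img(payload)
```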
Useful extensions
| Extension | Function |
|---|---|
| ControlNet | Conditioning on pose, edges, and depth |
| ADetailer | Automatic face/hand detail inpainting |
| Ultimate SD Upscale | Tile-based upscaling of large images |
| Regional Prompter | Different prompts for different image regions |
| AnimateDiff | Video generation from a prompt |
| IP-Adapter | Style transfer from a reference image |
System requirements
| Configuration | VRAM | Images/hour (1024×1024) |
|---|---|---|
| RTX 3060 12GB | 12 GB | ~120 (without xformers) |
| RTX 3090 24GB | 24 GB | ~300 |
| RTX 4090 24GB | 24 GB | ~600 |
| 2× A10G | 2×24 GB | ~800 (with batching) |
Timeframe: a basic deployment with several models takes 4–8 hours; a production setup with authentication, monitoring, and a reverse proxy takes 1–2 days.