Implementing Real-Time Notifications (WebSocket/SSE) on a Website
Real-time notifications are among the most frequently requested website features. Technically, there's a fundamental difference between WebSocket and SSE: SSE is a one-way stream from server to client over regular HTTP, while WebSocket is a two-way channel. For notifications, SSE is often sufficient.
SSE vs WebSocket: When to Choose What
SSE works if you only need to receive events from the server: new messages, status updates, alerts. It runs over standard HTTP, supports automatic reconnection, and requires no additional client libraries. One caveat: over HTTP/1.1, browsers cap connections at roughly six per origin, so serve the stream over HTTP/2 if users may keep many tabs open.
WebSocket is needed if the client also sends data in real-time: chat, games, collaborative editing.
SSE: Server → Client (one-way, HTTP)
WebSocket: Server ↔ Client (two-way, WS protocol)
SSE: Implementation on Node.js/Express
// server/routes/notifications.ts
import { Router, Request, Response } from 'express';
import { authMiddleware } from '../middleware/auth';
// Assumed app-level modules; adjust paths to your project:
import { getUnreadNotifications } from '../services/notifications';
import type { Notification } from '../types/notification';

const router = Router();

// Map userId -> Set<Response>
const clients = new Map<string, Set<Response>>();

router.get('/stream', authMiddleware, (req: Request, res: Response) => {
  const userId = req.user!.id; // populated by authMiddleware

  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    'Connection': 'keep-alive',
    'X-Accel-Buffering': 'no', // important for nginx
  });

  // Heartbeat every 30 seconds, otherwise proxies/browsers drop the idle connection
  const heartbeat = setInterval(() => {
    res.write(':heartbeat\n\n'); // comment line, ignored by EventSource
  }, 30_000);

  // Register client
  if (!clients.has(userId)) clients.set(userId, new Set());
  clients.get(userId)!.add(res);

  // Initial snapshot of unread notifications
  getUnreadNotifications(userId).then((notifications) => {
    res.write(sseEvent('init', notifications));
  });

  req.on('close', () => {
    clearInterval(heartbeat);
    clients.get(userId)?.delete(res);
    if (clients.get(userId)?.size === 0) clients.delete(userId);
  });
});

function sseEvent(type: string, data: unknown, id?: string): string {
  let msg = '';
  if (id) msg += `id: ${id}\n`;
  msg += `event: ${type}\n`;
  msg += `data: ${JSON.stringify(data)}\n\n`;
  return msg;
}

// Public function to send a notification to a user
export function pushNotification(userId: string, notification: Notification) {
  const userClients = clients.get(userId);
  if (!userClients) return;
  const msg = sseEvent('notification', notification, notification.id);
  userClients.forEach((res) => res.write(msg));
}

export default router;
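The wire format that sseEvent emits is plain text, terminated by a blank line; the id field feeds the browser's Last-Event-ID header on reconnect. A self-contained copy of the helper, shown with its exact output:

```typescript
// Same sseEvent helper as above, duplicated here so the snippet runs standalone.
function sseEvent(type: string, data: unknown, id?: string): string {
  let msg = '';
  if (id) msg += `id: ${id}\n`;
  msg += `event: ${type}\n`;
  msg += `data: ${JSON.stringify(data)}\n\n`; // trailing blank line terminates the event
  return msg;
}

const frame = sseEvent('notification', { title: 'New comment' }, 'n42');
console.log(frame);
// id: n42
// event: notification
// data: {"title":"New comment"}
```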
Client Side: EventSource
// client/notificationService.ts
// Assumes notificationStore, toast, navigate, and the Notification type come from your app
class NotificationService {
  private es: EventSource | null = null;
  private reconnectDelay = 1000;

  connect() {
    this.es = new EventSource('/api/notifications/stream', {
      withCredentials: true, // send the session cookie
    });

    this.es.addEventListener('init', (e) => {
      const notifications = JSON.parse((e as MessageEvent).data);
      notificationStore.setAll(notifications);
    });

    this.es.addEventListener('notification', (e) => {
      const notification = JSON.parse((e as MessageEvent).data);
      notificationStore.add(notification);
      this.showToast(notification);
    });

    this.es.addEventListener('error', () => {
      // EventSource auto-reconnects on its own, but not after fatal errors;
      // close and retry manually with exponential backoff
      this.es?.close();
      setTimeout(() => {
        this.reconnectDelay = Math.min(this.reconnectDelay * 2, 30_000);
        this.connect();
      }, this.reconnectDelay);
    });

    this.es.addEventListener('open', () => {
      this.reconnectDelay = 1000; // reset on successful connection
    });
  }

  disconnect() {
    this.es?.close();
    this.es = null;
  }

  private showToast(notification: Notification) {
    // Integration with the toast library of your choice (sonner, react-hot-toast, etc.)
    toast(notification.title, {
      description: notification.body,
      action: notification.actionUrl
        ? { label: 'Open', onClick: () => navigate(notification.actionUrl!) }
        : undefined,
    });
  }
}
WebSocket Variant: Queue Integration
In production, notifications aren't written directly from the request handler to SSE clients; a queue sits between the application layer and delivery:
HTTP Request → DB save → Redis Publish → WebSocket Server → Client
// Emit side (from any service/worker); db is your ORM client (Prisma assumed here)
import { createClient } from 'redis';

const pub = createClient({ url: process.env.REDIS_URL });
await pub.connect();

async function emitNotification(userId: string, notification: Notification) {
  await db.notifications.create({ data: notification }); // persist first
  await pub.publish(`notifications:${userId}`, JSON.stringify(notification));
}

// The WebSocket server (io = Socket.IO Server instance) subscribes via a separate
// Redis connection: a client in subscriber mode can't issue regular commands
const sub = createClient({ url: process.env.REDIS_URL });
await sub.connect();

io.on('connection', (socket) => {
  const userId = socket.data.userId;

  // Subscribe to the personal channel. Keep a reference to the listener so that
  // disconnecting one socket doesn't kill subscriptions of the user's other tabs.
  const onMessage = (message: string) => {
    socket.emit('notification', JSON.parse(message));
  };
  sub.subscribe(`notifications:${userId}`, onMessage);

  socket.on('notification:read', async (notificationId: string) => {
    await db.notifications.update({
      where: { id: notificationId },
      data: { readAt: new Date() },
    });
  });

  socket.on('disconnect', () => {
    // Remove only this socket's listener, not the whole channel subscription
    sub.unsubscribe(`notifications:${userId}`, onMessage);
  });
});
Notification Structure
interface Notification {
  id: string;
  userId: string;
  type: 'comment' | 'mention' | 'order' | 'system' | 'alert';
  title: string;
  body: string;
  actorId?: string;     // who initiated
  entityType?: string;  // 'post' | 'order' | ...
  entityId?: string;
  actionUrl?: string;
  imageUrl?: string;
  readAt?: Date;
  createdAt: Date;
}
Grouping and Batching
If many events arrive per second (bulk send, data stream), the client receives a separate SSE event for each. Better to group:
// Server-side buffer: flush at most once per 200 ms per user
const pendingByUser = new Map<string, Notification[]>();

function bufferNotification(userId: string, notification: Notification) {
  if (!pendingByUser.has(userId)) {
    pendingByUser.set(userId, []);
    setTimeout(() => flushUser(userId), 200); // first event schedules the flush
  }
  pendingByUser.get(userId)!.push(notification);
}

function flushUser(userId: string) {
  const batch = pendingByUser.get(userId) ?? [];
  pendingByUser.delete(userId);
  if (batch.length === 0) return;
  if (batch.length === 1) {
    pushToClient(userId, sseEvent('notification', batch[0]));
  } else {
    pushToClient(userId, sseEvent('notifications:batch', batch));
  }
}
// pushToClient (assumed): writes the raw SSE string to every connection in clients.get(userId)
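The client then needs a matching listener for the batch event. A minimal sketch, assuming the NotificationService, notificationStore, and toast from earlier; parseBatch is a hypothetical helper added here:

```typescript
// Hypothetical helper: parse a notifications:batch payload into an array.
function parseBatch(eventData: string): Array<{ id: string; title: string }> {
  const parsed = JSON.parse(eventData);
  return Array.isArray(parsed) ? parsed : [parsed]; // tolerate single-object payloads
}

// Inside NotificationService.connect() (sketch, browser-only):
// this.es.addEventListener('notifications:batch', (e) => {
//   const batch = parseBatch((e as MessageEvent).data);
//   batch.forEach((n) => notificationStore.add(n));
//   // One summary toast instead of N separate ones:
//   toast(`${batch.length} new notifications`);
// });

const batch = parseBatch('[{"id":"a","title":"x"},{"id":"b","title":"y"}]');
console.log(batch.length); // 2
```

Showing a single summary toast for a batch avoids flooding the screen when dozens of events arrive in one flush.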
Scaling SSE
SSE keeps an HTTP connection open — each connected user takes one file descriptor. Node.js comfortably maintains 10k+ connections, but with horizontal scaling (multiple instances) a user may be connected to instance A while a notification is generated on instance B. Redis Pub/Sub solves this — each instance subscribes to all channels and delivers only to its clients.
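The fan-in each instance runs can be sketched as follows. routeRedisMessage is a hypothetical helper; the wiring comment assumes a node-redis v4 subscriber plus the clients map and pushNotification from the SSE server above:

```typescript
// Hypothetical helper: map a Redis pub/sub message to a local user delivery.
function routeRedisMessage(
  channel: string,
  message: string,
  deliver: (userId: string, payload: string) => void,
): boolean {
  const match = /^notifications:(.+)$/.exec(channel);
  if (!match) return false;   // ignore unrelated channels
  deliver(match[1], message); // only reaches users connected to THIS instance
  return true;
}

// Wiring sketch (assumes a node-redis v4 subscriber `sub` and the
// pushNotification function from the SSE server above):
// await sub.pSubscribe('notifications:*', (message, channel) => {
//   routeRedisMessage(channel, message, (userId, msg) =>
//     pushNotification(userId, JSON.parse(msg)),
//   );
// });

const delivered: Array<[string, string]> = [];
routeRedisMessage('notifications:u1', '{"id":"n1"}', (u, m) => delivered.push([u, m]));
console.log(delivered); // [['u1', '{"id":"n1"}']]
```

Delivery to users connected elsewhere is a no-op on this instance, which is exactly what makes pattern-subscribing every instance to all channels safe.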
For nginx: proxying SSE requires disabling buffering:
location /api/notifications/stream {
    proxy_pass http://app_backend;
    proxy_http_version 1.1;        # required for keepalive with an empty Connection header
    proxy_buffering off;
    proxy_cache off;
    proxy_read_timeout 3600s;
    proxy_set_header Connection '';
    chunked_transfer_encoding on;
}
An SSE notification implementation with Redis takes about 2–3 days; adding a WebSocket layer with two-way logic (read receipts, typing indicators) takes roughly another day.