# Implementing Real-Time Status Updates with Server-Sent Events in Next.js
In modern web applications, providing real-time status updates is crucial for delivering an exceptional user experience. While traditional polling mechanisms work, they often introduce unnecessary latency and resource consumption. This article explores how we implemented **Server-Sent Events (SSE)** in our Next.js-based image processing platform to achieve near-instant status updates with minimal overhead.
## The Challenge: Real-Time Image Processing Status
At [Online Image Upscaler](https://onlineimageupscaler.com), users upload images for AI-powered enhancement that can take 10-30 seconds to complete. Our initial implementation used polling every 2-5 seconds, which resulted in:
- **High latency**: Users waited 1-2 seconds after processing completed before seeing results
- **Resource waste**: 15+ HTTP requests per image during a 30-second processing window
- **Poor user experience**: Visible delays and unnecessary server load
We needed a better solution that could push status updates in real-time while maintaining simplicity and reliability.
## Why Server-Sent Events?
When evaluating real-time communication options, we considered three approaches:
| Feature | Traditional Polling | SSE | WebSocket |
| -------------------- | ----------------------------------- | ------------------------ | --------------------------- |
| **Latency** | High (polling interval dependent) | Near real-time | Real-time |
| **Complexity** | Simple | Simple | Complex |
| **Resource Usage** | High (frequent requests) | Low (long connection) | Low |
| **Auto-Reconnect** | Manual | Native browser support | Manual |
| **HTTP Protocol** | Standard HTTP | Standard HTTP | Protocol upgrade required |
SSE emerged as the ideal choice because:
1. **Near real-time performance**: Updates are pushed immediately when state changes occur
2. **Simple implementation**: Built on standard HTTP, no protocol upgrades needed
3. **Automatic reconnection**: Browser handles reconnection automatically
4. **Better security**: Works seamlessly with existing authentication and HTTPS
5. **Lower overhead**: Single persistent connection vs. multiple polling requests
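Part of SSE's simplicity is its wire format: each message is just a few `field: value` lines terminated by a blank line, and lines starting with `:` are comments that `EventSource` silently ignores. A small sketch (the `formatSSE` helper is illustrative, not part of the implementation below):

```typescript
// Sketch of the SSE wire format: one or more "field: value" lines,
// terminated by a blank line. `formatSSE` is a hypothetical helper.
function formatSSE(data: unknown, event?: string): string {
  const lines: string[] = []
  if (event) lines.push(`event: ${event}`)    // optional named event type
  lines.push(`data: ${JSON.stringify(data)}`) // payload line
  return lines.join('\n') + '\n\n'            // blank line ends the message
}

// A line starting with ":" is a comment -- useful as a heartbeat,
// since EventSource ignores it entirely and onmessage never fires.
const heartbeat = ': heartbeat\n\n'
```

This comment-line trick is what makes the heartbeat mechanism shown later essentially free on the client side.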
## Architecture Overview
Our implementation follows this flow:
```
User uploads image
↓
Processing task created (returns historyId)
↓
Frontend establishes SSE connection
↓
Backend processes image via Replicate API
↓
Webhook receives completion notification
↓
SSE pushes status update to clients
↓
Frontend updates UI in real-time
```
## Implementation Details
### 1. SSE Connection Manager
We created a lightweight connection manager to handle multiple SSE connections efficiently:
```typescript
// lib/sse-manager.ts
interface SSEConnection {
  write: (data: string) => void
  close?: () => void
}

class SSEManager {
  private connections = new Map<number, Set<SSEConnection>>()

  addConnection(historyId: number, connection: SSEConnection) {
    if (!this.connections.has(historyId)) {
      this.connections.set(historyId, new Set())
    }
    this.connections.get(historyId)!.add(connection)
  }

  removeConnection(historyId: number, connection: SSEConnection) {
    const conns = this.connections.get(historyId)
    if (conns) {
      conns.delete(connection)
      if (conns.size === 0) {
        this.connections.delete(historyId)
      }
    }
  }

  broadcast(historyId: number, data: unknown) {
    const conns = this.connections.get(historyId)
    if (!conns || conns.size === 0) return

    const message = `data: ${JSON.stringify(data)}\n\n`
    conns.forEach(connection => {
      try {
        connection.write(message)
      } catch (error) {
        // Connection closed, remove it
        this.removeConnection(historyId, connection)
      }
    })
  }
}

export const sseManager = new SSEManager()
```
This manager allows us to:
- Track connections by `historyId` (processing task ID)
- Broadcast messages to all clients watching the same task
- Automatically clean up closed connections
### 2. SSE API Endpoint
Our Next.js API route creates the SSE connection:
```typescript
// app/api/realtime/status/route.ts
import { NextRequest } from 'next/server'
import { sseManager } from '@/lib/sse-manager'
import { auth } from '@/lib/auth'

export async function GET(request: NextRequest) {
  const searchParams = request.nextUrl.searchParams
  const historyId = parseInt(searchParams.get('historyId') || '0', 10)
  if (!historyId) {
    return new Response('Missing or invalid historyId', { status: 400 })
  }

  // Validate user permissions
  const session = await auth()
  // ... permission checks ...

  // Create SSE stream
  const encoder = new TextEncoder()
  let isClosed = false

  const stream = new ReadableStream({
    start(controller) {
      // Send initial connection message
      controller.enqueue(
        encoder.encode(`data: {"type":"connected","historyId":${historyId}}\n\n`)
      )

      // Create connection object
      const connection: { write: (data: string) => void; close?: () => void } = {
        write: (data: string) => {
          if (!isClosed) {
            try {
              controller.enqueue(encoder.encode(data))
            } catch (error) {
              isClosed = true
              sseManager.removeConnection(historyId, connection)
            }
          }
        },
        close: () => {
          if (!isClosed) {
            isClosed = true
            controller.close()
          }
        },
      }

      // Register connection
      sseManager.addConnection(historyId, connection)

      // Heartbeat to keep the connection alive through proxies
      const heartbeatInterval = setInterval(() => {
        if (isClosed) {
          clearInterval(heartbeatInterval)
          return
        }
        try {
          controller.enqueue(encoder.encode(': heartbeat\n\n'))
        } catch (error) {
          clearInterval(heartbeatInterval)
          isClosed = true
          sseManager.removeConnection(historyId, connection)
        }
      }, 30000) // Every 30 seconds

      // Handle connection abort (client disconnected)
      request.signal.addEventListener('abort', () => {
        isClosed = true
        clearInterval(heartbeatInterval)
        sseManager.removeConnection(historyId, connection)
        try {
          controller.close()
        } catch {
          // Stream already closed
        }
      })
    },
  })

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      'Connection': 'keep-alive',
      'X-Accel-Buffering': 'no', // Disable Nginx buffering
    },
  })
}
```
Key implementation details:
- **Initial connection message**: Confirms the connection is established
- **Heartbeat mechanism**: Sends a comment line every 30 seconds to keep the connection alive
- **Connection cleanup**: Properly handles client disconnects and aborts
- **Security**: Validates user permissions before allowing connections
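One limitation worth noting: `EventSource` cannot attach custom headers, so server-to-server consumers of an endpoint like this typically read the stream with `fetch` and parse the frames themselves. A minimal parser sketch (a hypothetical helper, not part of our codebase; it assumes each `data` payload fits on one line, as ours does):

```typescript
// Splits buffered SSE text into JSON payloads, skipping comment
// lines (heartbeats) and any fields other than "data".
function parseSSEBuffer(buffer: string): unknown[] {
  const events: unknown[] = []
  for (const frame of buffer.split('\n\n')) {
    for (const line of frame.split('\n')) {
      if (line.startsWith('data: ')) {
        events.push(JSON.parse(line.slice(6)))
      }
      // Lines starting with ':' are comments (our heartbeat) -- ignored.
    }
  }
  return events
}
```

In a browser, the native `EventSource` handles all of this (plus reconnection) for free, which is why we use it on the frontend.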
### 3. Webhook Integration
When the external AI processing service completes, it sends a webhook to our backend:
```typescript
// app/api/replicate/webhook/route.ts
import { NextRequest } from 'next/server'
import { prisma } from '@/lib/prisma'
import { sseManager } from '@/lib/sse-manager'

export async function POST(request: NextRequest) {
  const webhookData = await request.json()
  const predictionId = webhookData.id
  const status = webhookData.status // 'succeeded', 'failed', 'canceled'

  // Find the corresponding processing history
  const processingHistory = await prisma.processingHistory.findFirst({
    where: { referenceId: predictionId, referenceType: 'replicate_prediction' }
  })
  if (!processingHistory) {
    return new Response('Unknown prediction', { status: 404 })
  }

  if (status === 'succeeded' && webhookData.output) {
    // Update database
    await prisma.processingHistory.update({
      where: { id: processingHistory.id },
      data: { status: 'COMPLETED', /* ... */ }
    })

    // Broadcast SSE notification
    sseManager.broadcast(processingHistory.id, {
      type: 'completed',
      historyId: processingHistory.id,
      status: 'completed',
      message: 'Processing completed',
      timestamp: new Date().toISOString(),
    })
  }

  return new Response('OK')
}
```
This ensures that as soon as processing completes, all connected clients receive immediate notification.
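The snippet above shows only the success path; `failed` and `canceled` statuses deserve a broadcast too, so clients stop waiting. One way to keep that mapping testable is a small pure helper (hypothetical, with illustrative names) that the webhook handler could call before broadcasting:

```typescript
interface StatusUpdate {
  type: string
  historyId: number
  status: string
  message: string
}

// Maps a Replicate webhook status onto the payload broadcast over SSE.
// Returns null for intermediate statuses that don't warrant a push.
function toStatusUpdate(historyId: number, replicateStatus: string): StatusUpdate | null {
  switch (replicateStatus) {
    case 'succeeded':
      return { type: 'completed', historyId, status: 'completed', message: 'Processing completed' }
    case 'failed':
      return { type: 'failed', historyId, status: 'failed', message: 'Processing failed' }
    case 'canceled':
      return { type: 'canceled', historyId, status: 'canceled', message: 'Processing canceled' }
    default:
      return null // e.g. 'starting', 'processing' -- no broadcast needed
  }
}
```

Keeping the mapping pure means it can be unit-tested without mocking Prisma or the SSE manager.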
### 4. Frontend Implementation
On the client side, we use the native `EventSource` API with a hybrid approach:
```typescript
// hooks/useImageProcessor.ts
const startPollingStatus = (imageId: string, historyId: number) => {
  let eventSource: EventSource | null = null
  let backupPollingInterval: ReturnType<typeof setInterval> | undefined

  // Establish SSE connection
  const sseUrl = `/api/realtime/status?historyId=${historyId}`
  eventSource = new EventSource(sseUrl)

  eventSource.onopen = () => {
    console.log('SSE connection established')
    // Use long-interval polling as a backup (every 30 seconds)
    backupPollingInterval = setInterval(() => poll(), 30000)
  }

  eventSource.onmessage = (event) => {
    const data = JSON.parse(event.data)
    // Skip the initial connection confirmation
    // (heartbeats are comment lines and never reach onmessage)
    if (data.type === 'connected') return
    if (data.status === 'completed') {
      // Immediately fetch the full result from the database
      poll()
      // Clean up
      eventSource?.close()
      clearInterval(backupPollingInterval)
    }
  }

  eventSource.onerror = () => {
    console.warn('SSE connection error, falling back to polling')
    // Fall back to regular short-interval polling
  }
}
```
**Hybrid Strategy Benefits**:
- **Primary**: SSE provides near-instant updates
- **Fallback**: 30-second polling ensures reliability if SSE fails
- **Best of both worlds**: Real-time performance with guaranteed delivery
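When the `onerror` handler above fires, one reasonable strategy before abandoning SSE entirely is a capped exponential backoff between reconnect attempts. A sketch (the retry parameters are illustrative, not tuned constants from our production setup):

```typescript
// Delay before the nth SSE reconnect attempt: 1s, 2s, 4s, ... capped at 30s.
// Past some maxAttempts, the caller should give up on SSE and rely on polling.
function reconnectDelayMs(attempt: number, baseMs = 1000, capMs = 30000): number {
  return Math.min(baseMs * 2 ** attempt, capMs)
}

// Sketch of the fallback wiring (browser-only; EventSource is not in Node):
// let attempt = 0
// function connect() {
//   const es = new EventSource(sseUrl)
//   es.onopen = () => { attempt = 0 }
//   es.onerror = () => {
//     es.close()
//     if (attempt < maxAttempts) setTimeout(connect, reconnectDelayMs(attempt++))
//     else startShortPolling() // abandon SSE, poll every few seconds
//   }
// }
```

Note that `EventSource` already retries automatically on transient drops; explicit backoff like this mainly matters when the server is rejecting connections outright.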
## Performance Results
After implementing SSE, we observed significant improvements:
| Metric | Before (Polling) | After (SSE) | Improvement |
| ---------------------------- | ------------------ | ----------------- | --------------------------- |
| **Average latency** | 1-2 seconds | < 100ms | **10-20x faster** |
| **HTTP requests per task** | 15+ | 1 | **93% reduction** |
| **Server load** | High | Low | **Significant reduction** |
| **User experience** | Noticeable delay | Instant updates | **Much improved** |
## Key Takeaways
1. **SSE is perfect for one-way updates**: If you only need server-to-client communication, SSE is simpler than WebSocket
2. **Hybrid approach ensures reliability**: Combining SSE with polling provides both real-time performance and guaranteed delivery
3. **Connection management matters**: Properly tracking and cleaning up connections prevents memory leaks
4. **Heartbeat keeps connections alive**: Regular heartbeat messages prevent proxy servers from closing idle connections
5. **Security is crucial**: Always validate user permissions before allowing SSE connections
## Conclusion
Implementing SSE in our Next.js application dramatically improved the user experience while reducing server load. The solution is simple, maintainable, and leverages browser-native features. For applications that need real-time server-to-client updates, SSE offers an excellent balance of performance, simplicity, and reliability.
If you're dealing with similar real-time status update challenges, I highly recommend considering SSE as your solution. The implementation is straightforward, and the performance gains are substantial.
---
**Have you implemented SSE in your projects? Share your experiences and challenges in the comments below!**