Developing an Online Video Editor
A browser-based video editor is one of the most technically demanding projects in frontend development. The key decision to make at the start is where the final video rendering happens: in the browser or on the server. The entire stack depends on this choice.
Browser vs Server: where to render
Client-side rendering (WebCodecs API + FFmpeg.wasm):
- Doesn't require server computing power for rendering
- Limited by user's CPU performance
- FFmpeg.wasm: rendering a 10-minute video takes roughly 5–15 minutes
- WebCodecs support: Chrome 94+, Firefox 130+, Safari 16.4+
Server-side rendering (Remotion + FFmpeg):
- Rendering on powerful servers (GPU optional)
- The user doesn't have to wait: they get a link to the finished file
- Remotion can render React components to video
- Scales well via AWS Lambda
For a commercial product with long videos, choose server-side rendering; for a lightweight tool (clips up to 2 minutes), client-side rendering is enough.
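The rule of thumb above can be sketched as a small helper. The 120-second threshold comes from the "clips up to 2 minutes" guideline and is an assumption, not a hard limit; the WebCodecs flag would come from feature detection (e.g. checking for a VideoEncoder constructor in the browser).

```typescript
type RenderTarget = 'client' | 'server';

// Pick a render target from project duration and client capability.
// Thresholds are illustrative, not prescriptive.
function chooseRenderTarget(
  durationSec: number,
  clientHasWebCodecs: boolean,
): RenderTarget {
  return clientHasWebCodecs && durationSec <= 120 ? 'client' : 'server';
}
```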
The Timeline
The central UI concept is a timeline with tracks. The data structure:
interface VideoProject {
  id: string;
  duration: number; // seconds
  fps: number; // 24 | 30 | 60
  width: number;
  height: number;
  tracks: Track[];
}

interface Track {
  id: string;
  type: 'video' | 'audio' | 'text' | 'image' | 'effect';
  clips: Clip[];
  muted: boolean;
  locked: boolean;
  volume: number; // 0–1
}

interface Clip {
  id: string;
  trackId: string;
  assetId: string; // reference to uploaded file
  startTime: number; // position on timeline (seconds)
  duration: number; // clip duration
  trimStart: number; // trim source file start
  trimEnd: number; // trim source file end
  speed: number; // 0.25 – 4.0
  opacity: number;
  transform?: ClipTransform;
  filters?: VideoFilter[];
}
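The one non-obvious calculation hiding in this model is mapping a timeline position into a position in the source file, since trim and playback speed both shift it. A minimal sketch (the `ClipTiming` subset mirrors the fields of `Clip` above):

```typescript
interface ClipTiming {
  startTime: number; // position on timeline (seconds)
  duration: number;  // clip duration on timeline
  trimStart: number; // seconds trimmed from the source start
  speed: number;     // 0.25 – 4.0
}

// Source-file time for a given timeline time, or null when the clip
// is not active at that moment.
function timelineToSourceTime(
  clip: ClipTiming,
  timelineTime: number,
): number | null {
  const clipTime = timelineTime - clip.startTime;
  if (clipTime < 0 || clipTime > clip.duration) return null;
  // Speed > 1 consumes source material faster than timeline time passes.
  return clip.trimStart + clipTime * clip.speed;
}
```

This is the same formula the preview player uses below when it seeks each `<video>` element.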
Browser preview
For previewing before the final render, use HTML5 <video> elements synchronized with the timeline via currentTime:
const PreviewPlayer: React.FC = () => {
  const { currentTime, isPlaying, tracks } = useEditorStore();
  const videoRefs = useRef<Map<string, HTMLVideoElement>>(new Map());

  useEffect(() => {
    // Synchronize every video clip with the timeline position
    tracks.forEach(track => {
      track.clips.forEach(clip => {
        const video = videoRefs.current.get(clip.id);
        if (!video) return;
        const clipTime = currentTime - clip.startTime;
        const isActive = clipTime >= 0 && clipTime <= clip.duration;
        video.style.display = isActive ? 'block' : 'none';
        if (isActive) {
          // Map timeline time to source time (trim offset + speed)
          const targetTime = clip.trimStart + clipTime * clip.speed;
          // Only seek when drift exceeds 50 ms to avoid stutter
          if (Math.abs(video.currentTime - targetTime) > 0.05) {
            video.currentTime = targetTime;
          }
          if (isPlaying) {
            video.play().catch(() => { /* autoplay may be blocked */ });
          } else {
            video.pause();
          }
        } else {
          video.pause();
        }
      });
    });
  }, [currentTime, isPlaying, tracks]); // tracks must be in the deps

  return (
    <div className="preview-stage">
      {tracks.flatMap(track =>
        track.clips.map(clip => (
          <video
            key={clip.id}
            src={getAssetUrl(clip.assetId)} // getAssetUrl: app-specific asset lookup
            ref={el => {
              if (el) videoRefs.current.set(clip.id, el);
              else videoRefs.current.delete(clip.id);
            }}
            muted={track.muted}
            playsInline
          />
        ))
      )}
    </div>
  );
};
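What the snippet above leaves out is what advances currentTime during playback. One common approach is a requestAnimationFrame loop that measures real elapsed time between frames and pushes it into the store. A sketch with the scheduler injected so the loop stays testable; the store shape matches the useEditorStore used above, but the schedule/cancel hooks are assumptions (in the browser, pass requestAnimationFrame / cancelAnimationFrame):

```typescript
type FrameScheduler = (cb: (nowMs: number) => void) => number;

// Keep the playhead inside the project bounds.
function clampTime(t: number, duration: number): number {
  return Math.max(0, Math.min(duration, t));
}

function startPlaybackClock(opts: {
  getState: () => { currentTime: number; isPlaying: boolean; duration: number };
  setCurrentTime: (t: number) => void;
  schedule: FrameScheduler;     // e.g. requestAnimationFrame
  cancel: (id: number) => void; // e.g. cancelAnimationFrame
}): () => void {
  let last: number | null = null; // timestamp of the previous frame
  let frameId = 0;
  const tick = (now: number) => {
    const { currentTime, isPlaying, duration } = opts.getState();
    if (last !== null && isPlaying) {
      // Advance by real elapsed time so playback speed is frame-rate independent.
      opts.setCurrentTime(clampTime(currentTime + (now - last) / 1000, duration));
    }
    last = now;
    frameId = opts.schedule(tick);
  };
  frameId = opts.schedule(tick);
  return () => opts.cancel(frameId); // stop function for cleanup
}
```

Driving the store from wall-clock deltas (rather than a fixed per-frame increment) keeps the preview in sync even when the browser drops frames.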
Scrubbing on timeline:
const Timeline: React.FC = () => {
  const { setCurrentTime, duration } = useEditorStore();
  const railRef = useRef<HTMLDivElement>(null);

  const handleMouseDown = (e: React.MouseEvent) => {
    const seekTo = (clientX: number) => {
      const rect = railRef.current?.getBoundingClientRect();
      if (!rect) return;
      const pct = Math.max(0, Math.min(1, (clientX - rect.left) / rect.width));
      setCurrentTime(pct * duration);
    };
    seekTo(e.clientX); // jump to the clicked position immediately, not only on drag
    const handleMove = (ev: MouseEvent) => seekTo(ev.clientX);
    document.addEventListener('mousemove', handleMove);
    document.addEventListener('mouseup', () => {
      document.removeEventListener('mousemove', handleMove);
    }, { once: true });
  };

  return (
    <div ref={railRef} className="timeline-rail" onMouseDown={handleMouseDown}>
      <TimelinePlayhead />
    </div>
  );
};
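A timeline ruler usually labels the playhead and tick marks with frame-accurate timecode rather than raw seconds. A small sketch, using the project's fps field from VideoProject above (the MM:SS.FF layout is an assumption; adapt it to your ruler design):

```typescript
// Format a timeline position as MM:SS.FF for display on the ruler.
function formatTimecode(seconds: number, fps: number): string {
  const whole = Math.floor(seconds);
  const mm = Math.floor(whole / 60).toString().padStart(2, '0');
  const ss = (whole % 60).toString().padStart(2, '0');
  // Frame index within the current second, derived from the project fps.
  const ff = Math.floor((seconds - whole) * fps).toString().padStart(2, '0');
  return `${mm}:${ss}.${ff}`;
}
```

For example, at 30 fps a playhead at 65.5 s reads as "01:05.15".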