I built a system that generates short-form videos from long-form sources using prompt-driven retrieval and automated assembly. Editors describe the desired short (e.g., “the coach’s heated reaction after the equalizer, crowd + scoreboard”), and the pipeline searches analyzed data to find the right moments, stitches them into a 30–60 second sequence, applies subtitles, and auto-reframes for platforms like Reels, Shorts, and TikTok. The result can be downloaded or opened for fine-tuning inside Tessact’s editor.

Tech Stack
ElasticSearch, LangGraph, LangChain, Remotion, React, TypeScript
My Role
I designed the end-to-end pipeline: prompt → semantic retrieval → clip curation → auto-edit assembly → subtitles and aspect-ratio transforms → export to Tessact editor. ElasticSearch powers the semantic search over analyzed metadata; LangGraph/LangChain orchestrate the prompt logic and tool use; Remotion renders the final video with dynamic overlays.
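The retrieval stage described above can be sketched as a kNN query against ElasticSearch. This is a minimal illustration, not the production query: it assumes clip segments are indexed with precomputed embeddings in a `dense_vector` field, and the index field names (`segment_embedding`, `videoId`, etc.) are hypothetical.

```typescript
// Sketch of the semantic retrieval step. Assumes clip segments are indexed
// in ElasticSearch with a dense_vector field holding precomputed embeddings.
// Field names are hypothetical placeholders, not the real schema.
interface ClipHit {
  videoId: string;
  startSec: number;
  endSec: number;
  description: string;
}

// Builds an ElasticSearch 8.x kNN search body for a prompt embedding.
function buildClipQuery(promptVector: number[], k = 20) {
  return {
    knn: {
      field: "segment_embedding", // dense_vector field (hypothetical)
      query_vector: promptVector, // embedding of the editor's prompt
      k,
      num_candidates: k * 5, // wider candidate pool improves recall
    },
    _source: ["videoId", "startSec", "endSec", "description"],
  };
}
```

The prompt itself would be embedded upstream (by the LangGraph/LangChain layer) before the query is issued.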
Key Features
1. Semantic search over analyzed data to find clips matching a prompt and story intent
2. Automatic cut list generation targeting 30–60s duration with context-safe boundaries
3. Subtitle generation with inline editing (timing, line breaks, font/position) and title overlay controls
4. Auto reframing with multiple aspect-ratio presets (9:16, 1:1, 4:5, 16:9) and smart focal tracking
5. Remotion-based rendering pipeline for reproducible, templated motion graphics
6. One-click export: download the short or open it in Tessact’s editor for non-destructive edits
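The duration-targeted cut list generation in step 2 can be sketched as a greedy packer over scored candidate segments. This is a simplified illustration, not the production logic: it assumes context-safe boundary snapping has already produced clean candidate segments upstream, and only handles packing to the 30–60s window.

```typescript
interface Segment {
  startSec: number;
  endSec: number;
  score: number; // relevance to the prompt, higher is better
}

// Greedy cut-list builder: take the highest-scoring segments until the
// total duration lands inside the [minSec, maxSec] target window, then
// restore chronological order for assembly.
function buildCutList(candidates: Segment[], minSec = 30, maxSec = 60): Segment[] {
  const ranked = [...candidates].sort((a, b) => b.score - a.score);
  const chosen: Segment[] = [];
  let total = 0;
  for (const seg of ranked) {
    const dur = seg.endSec - seg.startSec;
    if (total + dur > maxSec) continue; // skip: would overshoot the cap
    chosen.push(seg);
    total += dur;
    if (total >= minSec) break; // minimum target reached
  }
  return chosen.sort((a, b) => a.startSec - b.startSec);
}
```

A real assembler would also weigh narrative ordering and avoid jump cuts between adjacent selections; the greedy pass here only captures the duration constraint.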
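At its core, the auto-reframing in step 4 computes a crop rectangle for the target aspect ratio centered on a tracked focal point. A simplified per-frame sketch (real focal tracking would smooth the crop across frames rather than recompute it independently):

```typescript
interface Crop {
  x: number;
  y: number;
  width: number;
  height: number;
}

// Computes a crop for `targetRatio` (width/height, e.g. 9/16 for vertical),
// anchored on a focal point given in normalized [0,1] coordinates
// (e.g. from face/subject tracking), clamped to the frame bounds.
function reframe(
  srcW: number,
  srcH: number,
  targetRatio: number,
  focalX = 0.5,
  focalY = 0.5
): Crop {
  const srcRatio = srcW / srcH;
  let width = srcW;
  let height = srcH;
  if (srcRatio > targetRatio) {
    width = Math.round(srcH * targetRatio); // source too wide: crop sides
  } else {
    height = Math.round(srcW / targetRatio); // source too tall: crop top/bottom
  }
  const x = Math.min(Math.max(Math.round(focalX * srcW - width / 2), 0), srcW - width);
  const y = Math.min(Math.max(Math.round(focalY * srcH - height / 2), 0), srcH - height);
  return { x, y, width, height };
}
```

For example, reframing 1920×1080 footage to 9:16 keeps the full 1080px height and crops to a 608px-wide vertical window around the focal point. The same function covers the 1:1, 4:5, and 16:9 presets by changing `targetRatio`.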
Impact
1. Reduced short-form turnaround from hours to minutes by automating clip discovery and assembly
2. Enabled consistent, on-brand subtitles and titles without After Effects or manual keyframing
3. Increased throughput of publish-ready shorts: editors produced an observed 2–3× more finished edits per day