Now in public beta

JSX for AI Video

A declarative programming language for Claude Code. AI agents write the JSX; you get videos.

$ bun install vargai ai

portrait.tsx

portrait.mp4


AI Native

Made for AI Agents

AI agents write good varg-react code out of the box. The declarative API maps naturally to how LLMs think about structure.
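
For example, an entire scene is a single declarative tree. Here is a minimal sketch using the components named in the FAQ below (<Clip>, <Image>, <Speech>); the default-export entry point and the voice prop are assumptions, not documented API:

scene.tsx

import { Clip, Image, Speech } from "vargai";

export default (
  <Clip duration={5}>
    {/* the prompt prop drives AI image generation and feeds the cache key */}
    <Image prompt="golden-hour portrait, 35mm film look" />
    <Speech voice="narrator">Generated with varg.</Speech>
  </Clip>
);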

Clear errors, easy fixes

For the 5% of cases where agents don't one-shot it, they receive clear runtime errors with actionable messages. No cryptic stack traces. No hidden state to debug.

  • Agents generate correct code 95% of the time
  • Clear runtime errors for the other 5%
  • No magic strings or hidden state
  • Type-safe props catch mistakes early
$ varg render scene.tsx
Error: Clip duration required when using Video
  at Clip (line 12)
  hint: add duration={5} or duration="auto"

# agent fixes and retries
$ varg render scene.tsx
Rendering 3 clips...
Done! output/scene.mp4
CORE PRIMITIVES

16 components. Infinite possibilities.

A small, composable set of building blocks for assembling any video pipeline — from simple clips to full characters.

<Animate>

Add motion, camera movement, and subtle dynamics to static content using simple declarative controls.
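
A sketch of the shape this takes; the zoom and pan props here are illustrative assumptions, not the documented API:

import { Animate, Image } from "vargai";

export const skyline = (
  <Animate zoom={1.1} pan="left">
    {/* slow push-in while drifting left over a static image */}
    <Image prompt="city skyline at dusk" />
  </Animate>
);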

PRODUCTION PATTERNS

See it in action

Real patterns from production. Copy, paste, render.

09-mirror-selfie.tsx

output.mp4

FAQ

What is varg?

A TypeScript SDK that unifies AI video, image, voice, and music generation under one API. Instead of learning five different SDKs, you learn one. It comes with a JSX syntax for composing videos declaratively — write <Clip>, <Image>, <Speech> and get a rendered MP4.

Does it use React?

No. The JSX syntax looks like React but doesn't use React at all. It's a custom JSX runtime that transforms your components into FFmpeg render instructions. You get the familiar developer experience without the React dependency.
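
Conceptually, the runtime's jsx factory builds a plain object tree that the renderer later walks to emit FFmpeg commands. A simplified sketch of the idea, not varg's actual source:

// Plain-object element tree: no React, no reconciler
type VargNode = {
  type: unknown;                      // "Clip", "Image", ... or a component function
  props: Record<string, unknown>;     // later mapped to FFmpeg inputs and filters
  children: VargNode[];
};

export function jsx(type: unknown, props: Record<string, unknown> = {}): VargNode {
  const { children, ...rest } = props;
  return {
    type,
    props: rest,
    children: [children].flat().filter(Boolean) as VargNode[],
  };
}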

Do I need Bun?

Bun is the default and recommended runtime. Node.js works too, but Bun gives you faster installs, native TypeScript, and better performance. The SDK uses Bun-specific APIs like Bun.file() but falls back gracefully when they're unavailable.
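
The fallback pattern is roughly this (a sketch of the idea, not the SDK's actual code):

import { readFile } from "node:fs/promises";

// Use Bun's fast file API when present, otherwise fall back to node:fs
async function readText(path: string): Promise<string> {
  const bun = (globalThis as any).Bun;
  if (bun) return bun.file(path).text();
  return readFile(path, "utf8");
}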

Which AI providers does it support?

fal.ai (video, image, lipsync), ElevenLabs (voice, music), OpenAI (Sora video), Replicate (1000+ models), and Higgsfield (character generation). You only need API keys for the providers you actually use.

How does caching work?

Every element gets a cache key based on its props. Generate an image with prompt="sunset" — it's cached. Render again with the same prompt — instant hit, no API call. The cache is content-addressed (like Git) and stored as files, so it survives restarts. That saves real money on API costs.
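
In miniature, the key derivation looks like this (a conceptual sketch; the real scheme may differ):

import { createHash } from "node:crypto";

// Same element + same props => same key => same cached file on disk
function cacheKey(element: string, props: Record<string, unknown>): string {
  return createHash("sha256")
    .update(JSON.stringify({ element, props }))
    .digest("hex");
}

// cacheKey("Image", { prompt: "sunset" }) is identical on every render,
// so the second render is a file read instead of an API call.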

Can I run it in the browser?

No. The SDK requires FFmpeg for video composition, file system access for caching, and server-side AI API calls. Use it in Node.js/Bun servers, CLI tools, or serverless functions with FFmpeg layers.

How much does it cost?

The SDK itself: $0. Apache 2.0 license. You pay AI providers directly: fal.ai ~$0.01-0.10/image, ~$0.50-2.00/video. ElevenLabs has a free tier, then ~$0.30/1K characters. Caching dramatically reduces costs on iterative work.

How is this different from Remotion?

Different tools for different jobs. Remotion renders React frame-by-frame — great for motion graphics and data viz. Varg uses AI generation + FFmpeg composition — great for AI-generated content, talking heads, and ads. Use Remotion for precise animations, Varg for AI-powered content.

Does it work with Next.js?

Yes, with constraints. It works in Server Components, API Routes, and Server Actions. It does not work in Client Components, the Edge Runtime, or Vercel Serverless (FFmpeg + timeout limits). Best setup: a separate Node.js/Bun service for video rendering.
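
With that setup the Next.js side stays thin. A hypothetical Route Handler that forwards to your own render service (the endpoint and payload shape are assumptions):

// app/api/render/route.ts
export async function POST(req: Request) {
  const { scene } = await req.json();
  // RENDER_SERVICE_URL points at a separate Bun/Node service running varg + FFmpeg
  const res = await fetch(`${process.env.RENDER_SERVICE_URL}/render`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify({ scene }),
  });
  return Response.json(await res.json());
}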

How long does a render take?

Depends on what you generate. Image (Flux Schnell): 3-5s. Video (Kling 2.5): 90-180s. Voice (ElevenLabs): 2-5s. FFmpeg composition: 5-30s. Cached element: <100ms. A 30-second video might take 3-5 minutes on first render, 10 seconds cached.

Vary & Generate. Scale.

$ bun install vargai ai