CineMorph

16-bit HDR cinematographic DNA. Extract it. Remix it. Blend two films into one.


Inspiration

Every film has a visual signature. You can recognize Kubrick or Villeneuve from a single frame. But what exactly creates that look?

It's not one thing. It's the lighting, the color palette, the lens choice, the atmosphere—all working together in a specific combination. That combination is the director's cinematographic DNA.

The problem is that when you try to recreate these looks with AI, you're gambling. You type "make it look like Blade Runner" and hope for the best. Did the model grab the lighting? The color grade? The lens distortion? You have no idea. And if it's wrong, your only option is to reroll and pray.

FIBO changes that. Every visual parameter is exposed as structured JSON. That means you can extract the cinematographic DNA from any image and manipulate each element independently.

Extract what makes a frame work. Remix it. Blend two films into something new.


What it does

CineMorph is a suite of tools that give you deterministic control over cinematic style.

Studio — DNA Extraction

Upload any image. CineMorph analyzes it and returns the complete visual breakdown as structured JSON:

  • Camera: angle, focal length, depth of field, lens distortion
  • Lighting: direction, intensity, color temperature, softness, number of sources
  • Color: palette, saturation, contrast, grade, temperature
  • Atmosphere: fog, grain, mood, time of day, weather
  • Composition: framing, symmetry, rule of thirds, leading lines

You see exactly what was extracted. No black box. The DNA panel displays the full structure so you can verify it matches your perception of the image.
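
A rough, trimmed example of the shape of an extracted DNA structure. The field names here are illustrative, not FIBO's exact schema:

```python
# Illustrative excerpt of an extracted DNA structure (hypothetical field names).
dna = {
    "camera": {"angle": "low", "focal_length_mm": 35, "depth_of_field": "shallow"},
    "lighting": {"direction": "backlit", "color_temperature_k": 3200, "softness": 0.7},
    "color": {"palette": ["teal", "amber"], "saturation": 0.6, "contrast": "high"},
    "atmosphere": {"fog": 0.4, "grain": 0.2, "time_of_day": "night", "weather": "rain"},
    "composition": {"framing": "wide", "symmetry": 0.3, "rule_of_thirds": True},
}
```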

Remix — Controlled Variation

Here's where FIBO's disentanglement shines.

Change one parameter—shift the lighting from overcast to golden hour—and regenerate. The scene stays identical. Same subjects, same composition, same framing. Only the lighting changes.

Other tools would give you an entirely different image. CineMorph preserves everything you didn't touch. That's the core differentiator: controlled variation instead of starting over.
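
A minimal sketch of the idea, reusing the hypothetical dna dict from the extraction example; the real call goes through the /remix endpoint described under Architecture:

```python
import copy

def remix_dna(dna: dict, seed: int, section: str, key: str, value) -> dict:
    """Change exactly one parameter while keeping the extraction seed.

    Same seed + modified parameters = the same scene with only that
    parameter shifted (a sketch, not the production /remix handler).
    """
    remixed = copy.deepcopy(dna)
    remixed[section][key] = value
    return {"dna": remixed, "seed": seed}

# Overcast -> golden hour; nothing else is touched.
payload = remix_dna(dna, seed=1234, section="lighting", key="color_temperature_k", value=3500)
```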

Blend — Style Interpolation

This is the feature I haven't seen anywhere else.

Take two films—Blade Runner and Amélie—and mathematically interpolate their DNA at any ratio. 70% noir cyberpunk, 30% Parisian whimsy. Drag the slider and watch the style shift in real time.

The result: visual styles that couldn't exist naturally. Combinations no cinematographer has ever created because no one could precisely control the mix until now.
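
At its simplest, the ratio is a weighted blend of each numeric parameter. A sketch for a single field (values illustrative):

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation: t=0 returns a, t=1 returns b."""
    return a + (b - a) * t

# 70% Blade Runner / 30% Amélie -> blend 0.3 of the way toward the second DNA.
blade_runner_temp_k = 9000   # cold, illustrative value
amelie_temp_k = 3200         # warm, illustrative value
hybrid_temp_k = lerp(blade_runner_temp_k, amelie_temp_k, 0.3)   # 7260 K
```

The full parameter-type handling is sketched under Blend Math below.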

Director Presets

Curated parameter sets that capture iconic cinematographic signatures:

  • Kubrick: Wide-angle lenses, naturalistic top-lighting, cold desaturated palette, symmetrical composition
  • Tarantino: Warm grain, low angles, saturated primaries, anamorphic distortion
  • Wong Kar-wai: Neon color bleed, shallow focus, step-printed motion, melancholic atmosphere
  • Villeneuve: Vast negative space, diffused naturalism, minimal palette, slow reveals

One click applies their visual language to your scene—not by replacing your image, but by merging their DNA with yours. Your subjects stay. Their style transfers.
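
Conceptually, a preset is a partial DNA override rather than a full scene description. An illustrative (not the shipped) Kubrick-style override might look like this; the merge itself is sketched under Preset Application below:

```python
# Illustrative partial override; values are approximations, not the shipped preset.
KUBRICK_PRESET = {
    "camera": {"focal_length_mm": 18, "lens_distortion": "wide"},
    "lighting": {"direction": "top", "softness": 0.8},
    "color": {"saturation": 0.35, "temperature": "cold"},
    "composition": {"symmetry": 0.95},
}
```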

16-bit HDR Export

Professional color depth for production workflows. No banding, no compression artifacts. Ready for color grading pipelines.
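
One way to write that out, assuming the pipeline hands back a float RGB frame in [0, 1] (a sketch, not the exact export code):

```python
import numpy as np
import tifffile

frame = np.random.rand(1080, 1920, 3)                    # placeholder for a generated frame
frame_16 = (frame * 65535.0).round().astype(np.uint16)   # scale to the full 16-bit range
tifffile.imwrite("cinemorph_frame.tiff", frame_16)        # 16-bit TIFF, grading-friendly
```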


How I built it

CineMorph is built on FIBO's JSON-native generation, with a FastAPI backend handling orchestration and a React frontend for the studio interface.

Architecture

I chose a unified architecture with four core endpoints sharing a common backend:

  • /extract: Analyzes uploaded image via FIBO, returns complete DNA as structured JSON
  • /remix: Takes DNA with user modifications, preserves seed, regenerates with changes isolated
  • /blend: Accepts two DNA structures and a ratio, interpolates mathematically, generates hybrid
  • /preset: Merges director-specific parameter overrides with extracted DNA

All generation routes through FAL's API for FIBO inference. Seed management is handled server-side to ensure consistency between extraction and remix.
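
A stripped-down sketch of two of those routes; the request models and the FIBO calls themselves are simplified:

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class RemixRequest(BaseModel):
    dna: dict        # full extracted DNA
    seed: int        # seed captured at extraction time
    overrides: dict  # only the parameters the user actually changed

class BlendRequest(BaseModel):
    dna_a: dict
    dna_b: dict
    ratio: float     # 0.0 = pure A, 1.0 = pure B

@app.post("/remix")
async def remix(req: RemixRequest):
    # Merge overrides into the DNA, reuse the stored seed,
    # then call FIBO's refine path via FAL (omitted here).
    ...

@app.post("/blend")
async def blend(req: BlendRequest):
    # Interpolate the two DNA structures at req.ratio, then generate the hybrid.
    ...
```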

UI Implementation

I used React with TypeScript and Vite for the build system. Tailwind handles styling with a dark theme and film grain overlays. Zustand manages state across the extraction, remix, and blend workflows.

Each director preset has its own gradient treatment matching their actual visual palette—Kubrick's UI tints cold, Tarantino's runs warm.

DNA Extraction

The extraction identifies visual parameters by analyzing:

  • Lighting patterns: Direction, shadow density, highlight distribution
  • Color clustering: Dominant palette, temperature bias, saturation levels
  • Composition markers: Subject placement, symmetry scores, depth planes
  • Atmospheric elements: Grain presence, fog density, bokeh characteristics

The extraction happens server-side via FIBO's analysis capabilities, with parsed DNA returned to the UI for visualization and editing.
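
The analysis itself runs through FIBO, but as a rough illustration of what "dominant palette" means here, a k-means sketch (not CineMorph's actual extraction code):

```python
import numpy as np
from sklearn.cluster import KMeans

def dominant_palette(rgb_image: np.ndarray, k: int = 5) -> np.ndarray:
    """Return the k dominant colors of an (H, W, 3) uint8 image as a (k, 3) array."""
    pixels = rgb_image.reshape(-1, 3).astype(np.float32)
    centers = KMeans(n_clusters=k, n_init="auto").fit(pixels).cluster_centers_
    return centers.round().astype(np.uint8)
```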

Security

CineMorph processes images through FAL's hosted FIBO inference. No images are stored permanently—extraction and generation happen in real time.


Challenges I ran into

Scene Consistency

Early versions had a critical flaw: you'd extract DNA from a shot of a car against brutalist architecture, tweak the lighting, and FIBO would generate a woman in a window. Completely different image.

The issue: seed wasn't being preserved between extraction and remix. Every generation started fresh, so parameter changes couldn't be isolated.

The fix was server-side seed management. The extraction endpoint captures and stores the seed, and remix passes it back explicitly. Same seed + modified parameters = controlled variation.
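
A minimal sketch of that seed store, keyed here by an image id (the real app keys by session):

```python
import secrets

SEED_STORE: dict[str, int] = {}

def capture_seed(image_id: str) -> int:
    """At extraction time, pick and remember a seed for this image."""
    return SEED_STORE.setdefault(image_id, secrets.randbelow(2**32))

def remix_arguments(image_id: str, modified_dna: dict) -> dict:
    """Same seed + modified parameters = controlled variation."""
    # "prompt_json" is an illustrative argument name, not FIBO's exact schema.
    return {"seed": SEED_STORE[image_id], "prompt_json": modified_dna}
```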

Preset Application

First implementation of director presets generated generic "Spielberg-looking" images instead of applying his lighting choices to YOUR scene. I was replacing the DNA instead of merging it.

The restructure: director presets are now parameter overrides that merge with extracted DNA. Your scene's composition and subjects stay intact; only the stylistic parameters change.
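
The merge is a plain recursive override. A sketch, assuming presets and extracted DNA share the same nested-dict shape:

```python
def apply_preset(extracted_dna: dict, preset: dict) -> dict:
    """Recursively merge a director preset into extracted DNA.

    Only keys present in the preset are overridden; everything else in
    the scene's DNA survives untouched.
    """
    merged = dict(extracted_dna)
    for key, value in preset.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = apply_preset(merged[key], value)
        else:
            merged[key] = value
    return merged
```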

FIBO Refine Workflow

You can't just call generate with modified parameters and expect isolated changes. FIBO has a specific refinement endpoint designed for iterative editing that preserves context.

Remix now calls the refine endpoint, not generate. The original context passes through, and only specified parameters are modified.
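
Roughly what that call looks like through the fal-client SDK; the endpoint id and argument names below are placeholders, and the real ones come from FAL's FIBO docs:

```python
import fal_client

def refine(dna: dict, seed: int, image_url: str):
    # "bria/fibo/refine", "prompt_json", and "image_url" are placeholder names.
    return fal_client.subscribe(
        "bria/fibo/refine",
        arguments={
            "image_url": image_url,   # original context passes through
            "prompt_json": dna,       # modified DNA; only the changed parameters differ
            "seed": seed,             # reuse the extraction seed
        },
    )
```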

Blend Math

Interpolating two DNA structures isn't straightforward. Some parameters interpolate linearly (color temperature), others need different curves (atmosphere mood), and some are categorical (lens type).

I built a custom interpolation system that handles each parameter type appropriately, with smooth transitions across the full blend range.
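
A simplified version of that dispatch, assuming the nested-dict DNA shape from earlier and equal-length palettes:

```python
def blend_value(a, b, t: float):
    """Blend one DNA parameter at ratio t (0 = all a, 1 = all b)."""
    if isinstance(a, (int, float)) and isinstance(b, (int, float)):
        return a + (b - a) * t                 # numeric: linear blend
    if isinstance(a, list) and isinstance(b, list):
        split = round(len(a) * (1 - t))        # palettes: weighted share from each side
        return a[:split] + b[split:]
    return a if t < 0.5 else b                 # categorical (e.g. lens type): snap

def blend_dna(dna_a: dict, dna_b: dict, t: float) -> dict:
    """Recursively blend two DNA structures at ratio t."""
    out = {}
    for key, a_val in dna_a.items():
        b_val = dna_b.get(key, a_val)          # fall back to A if B lacks the key
        if isinstance(a_val, dict) and isinstance(b_val, dict):
            out[key] = blend_dna(a_val, b_val, t)
        else:
            out[key] = blend_value(a_val, b_val, t)
    return out

# hybrid = blend_dna(blade_runner_dna, amelie_dna, 0.3)   # the 70/30 mix from the Blend section
```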


Accomplishments that I'm proud of

It Actually Works

CineMorph isn't a mockup. Extract pulls real DNA. Remix makes real isolated changes. Blend creates real hybrid styles. The deterministic control FIBO promises actually holds up.

Blend Mode is Novel

I haven't found another tool that mathematically interpolates between two images' visual DNA. The 70/30 Blade Runner/Amélie hybrid creates looks that couldn't exist any other way.

Authentic Director Presets

I didn't guess at "Kubrick = blue and symmetric." I studied actual frames, extracted patterns, and built parameter sets that capture what makes each director recognizable. Users have correctly identified which preset was applied without being told.

The DNA Panel

You're not wondering what the AI understood—you see the complete structured breakdown and can verify it matches your perception. Transparency, not magic.


What I learned

FIBO's Disentanglement is Real

Other image models treat style as monolithic. FIBO actually separates the components so you can manipulate them independently. That's a genuine technical capability, not marketing.

Seed Management is Everything

So much of professional image generation is fighting for consistency. Having deterministic control over what changes and what stays the same is the difference between a toy and a tool.

Demo-Driven Development Works

Building toward specific moments I wanted to show in the video kept scope focused and features meaningful. Every feature exists because it demos well AND solves a real problem.


What's next for CineMorph

Video Support

Extract DNA from film clips, not just frames. Apply consistent style across sequences.

Community Presets

A library where cinematographers share extracted DNA from classic films. Download "Roger Deakins — Blade Runner 2049" as a preset.

Production Integration

Export DNA as LUT files or camera metadata for DaVinci Resolve, Premiere, or Nuke.

Custom DNA Profiles

Train signatures from multiple reference images. Feed CineMorph ten frames from a film and it learns a signature that no single frame could capture.


Use Cases

Film Student Reference

You're studying Villeneuve's visual style for a project. Instead of scrubbing through Arrival trying to articulate what makes it look that way, extract the DNA. See the exact parameters. Apply them to your own footage.

Music Video Pre-Production

Director wants "70% Fincher, 30% Gondry." Previously that's a vibes conversation. Now it's a slider. Generate reference frames before the shoot so everyone's aligned on the look.

Game Cinematics

Your cutscenes need consistent visual style across dozens of shots. Extract DNA from your hero frame, apply it to every subsequent generation. Deterministic consistency at scale.

Personal Projects

You just want your vacation photos to look like a Wes Anderson film. One click. Done.

Built With

  • FIBO (via FAL)
  • FastAPI
  • React + TypeScript + Vite
  • Tailwind CSS
  • Zustand