
Simify

Inspiration

We’ve all been there: staring at a static diagram in a textbook or watching a passive video tutorial, trying to grasp a complex concept like Orbital Mechanics or Binary Search Trees. The disconnect is real: learning shouldn't be about memorizing static images; it should be about playing with variables and seeing the consequences.

We asked ourselves: "What if you could talk to a computer, and instead of just writing back text, it built a simulation for you?"

We wanted to move beyond "Text-to-Text" and unlock "Text-to-Simulation." We wanted to build an engine that turns curiosity ("What happens if gravity is doubled?") into immediate, interactive reality. That desire to bridge the gap between abstract questions and visual answers is what birthed Simify.


What it does

Simify is a Generative UI engine that transforms natural language prompts into fully interactive, animated STEM lessons.

  • Ask Anything: The user types a query like "Show me how a Riemann Sum approximates an integral" or "Simulate a damped spring oscillation."
  • AI Architecting: Behind the scenes, Simify uses Gemini 3 Pro to act as a "pedagogical architect." Crucially, it doesn't try to "run" the physics itself. Instead, it designs the blueprint, setting initial conditions, constants, and layout logic.
  • Deterministic Engine Execution: The system hands this blueprint to our custom, high-performance rendering engine. This engine runs real, deterministic code, solving differential equations for physics or executing actual sorting algorithms in real-time. The AI decides what to build; the engine ensures how it behaves is mathematically rigorous.
  • Interactive Play: The user isn't just watching a video. They are viewing a live simulation running at 60fps. They can pan, zoom, and interact with a world where gravity, friction, and logic gates behave exactly as they would in a dedicated physics engine, not as an LLM hallucination.

It effectively turns the LLM into a game developer that builds educational mini-games in seconds, specifically tailored to your question.
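
To make the blueprint idea concrete, here is an illustrative plan for "Simulate a damped spring oscillation." The field names are a sketch, not our exact schema, and spring_mass stands in for whichever primitive the engine actually exposes:

```json
{
  "title": "Damped Spring Oscillation",
  "script": "Watch the amplitude decay as damping bleeds energy from the spring.",
  "nodes": [
    {
      "id": "spring1",
      "type": "spring_mass",
      "params": { "mass": 1.0, "stiffness": 40, "damping": 0.8, "initialDisplacement": 0.5 }
    }
  ],
  "events": [
    { "t": 0, "action": "highlight_node", "target": "spring1" },
    { "t": 4, "action": "update_node", "target": "spring1", "params": { "damping": 2.0 } }
  ]
}
```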


How we built it

Simify is built on a sophisticated Two-Stage Generative Architecture that separates intent from execution:

1. The "Architect" (Conceptual Layer)

When a user submits a prompt, our backend (Node.js/Express) sends it to Gemini 3 Flash. We instruct the model to act as a "Visual Teacher," generating a high-level Design Spec. It outlines the educational goal, writes a script, and plans the visual scene in natural language first.
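
As a minimal sketch of this stage (assuming the @google/generative-ai Node SDK; the model ID, route, and prompt here are placeholders, not our production values):

```ts
import express from "express";
import { GoogleGenerativeAI } from "@google/generative-ai";

const app = express();
app.use(express.json());

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
// Placeholder model ID; the write-up names Gemini 3 Flash for this stage.
const architect = genAI.getGenerativeModel({ model: "gemini-1.5-flash" });

// Stage 1: turn the raw user prompt into a natural-language Design Spec.
app.post("/api/design-spec", async (req, res) => {
  const { prompt } = req.body as { prompt: string };
  const result = await architect.generateContent(
    `You are a Visual Teacher. Write a high-level Design Spec for an ` +
      `interactive STEM lesson answering: "${prompt}". State the educational ` +
      `goal, write a narration script, and describe the visual scene in prose.`
  );
  res.json({ designSpec: result.response.text() });
});

app.listen(3000);
```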

2. The "Engineer" (Implementation Layer)

We feed that Design Spec into a second, strictly constrained AI agent whose sole job is to translate the creative intent into valid, executable JSON that conforms to our custom LessonPlan schema. Crucially, it generates simulation parameters (e.g., "initial velocity = 50 m/s"), not pre-baked animation frames. It selects from our library of primitives (e.g., projectile_motion, binary_tree, logic_gate) to build the scene.
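
Roughly, the shape the Engineer must emit looks like this. This is a hypothetical reconstruction; the real schema is larger and these field names are our own:

```ts
// Hypothetical reconstruction of the LessonPlan shape described above.
type PrimitiveType =
  | "projectile_motion"
  | "binary_tree"
  | "logic_gate"
  | "spring_mass";

interface SceneNode {
  id: string;
  type: PrimitiveType; // must come from the primitive library
  // Simulation parameters only (e.g. initial velocity), never baked frames.
  params: Record<string, number | string | boolean>;
}

interface TimelineEvent {
  t: number; // seconds into the lesson
  action: "update_node" | "highlight_node";
  target: string; // id of the SceneNode being acted on
  params?: Record<string, number | string | boolean>;
}

interface LessonPlan {
  title: string;
  script: string; // narration written by the Architect stage
  nodes: SceneNode[];
  events: TimelineEvent[];
}
```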

3. The Rendering Engine (Frontend)

We didn't use Unity or Godot. We built a custom React + TypeScript Physics Engine from scratch.

  • Universal Renderer: A dynamic component system that parses the incoming JSON and hot-swaps visual components (Nodes).
  • The Timeline System: We built a custom hook (useLessonPlayer) that executes an array of time-stamped events (update_node, highlight_node), perfectly synchronizing the visual animations with the lesson script (a minimal sketch follows this list).
  • True Simulation Loop: This is the differentiator. Unlike video generation models that "dream" movement, our engine runs a deterministic 60fps game loop. The SolarSystem component calculates Keplerian orbits every frame; the SpringMass component solves differential equations in real-time. The LLM provides the initial conditions, but the rendering engine enforces the laws of physics, ensuring the simulation is mathematically accurate and hallucination-free.
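
A minimal sketch of the timeline idea behind useLessonPlayer; the real hook also handles pause, seek, and React state, and the event shape mirrors the hypothetical schema above:

```ts
import { useEffect, useRef } from "react";

// Event shape mirrors the hypothetical schema sketched earlier.
type TimelineEvent = {
  t: number;
  action: "update_node" | "highlight_node";
  target: string;
};

// Walk a time-sorted event array and fire each event as the lesson clock
// passes its timestamp.
function useLessonPlayer(
  events: TimelineEvent[],
  dispatch: (e: TimelineEvent) => void
) {
  const cursor = useRef(0);

  useEffect(() => {
    cursor.current = 0;
    const start = performance.now();
    let raf = 0;
    const tick = (now: number) => {
      const t = (now - start) / 1000; // lesson clock in seconds
      while (cursor.current < events.length && events[cursor.current].t <= t) {
        dispatch(events[cursor.current]); // e.g. highlight a node in the scene
        cursor.current += 1;
      }
      raf = requestAnimationFrame(tick);
    };
    raf = requestAnimationFrame(tick);
    return () => cancelAnimationFrame(raf);
  }, [events, dispatch]);
}
```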
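
And the deterministic half: a sketch of the kind of fixed-timestep loop a SpringMass-style component can run, integrating m x'' = -k x - c x' with semi-implicit Euler. The constants and structure are illustrative, not the component's actual code:

```ts
// Fixed-timestep integrator for a damped spring, m x'' = -k x - c x'.
// The LLM supplies only the parameters; the engine owns the math.
interface SpringParams {
  mass: number;
  stiffness: number; // k
  damping: number; // c
  initialDisplacement: number;
}

function runSpring(p: SpringParams, onFrame: (x: number) => void) {
  const dt = 1 / 60; // 60 Hz physics step, independent of render rate
  let x = p.initialDisplacement;
  let v = 0;
  let acc = 0; // time accumulator decouples physics from frame timing
  let last = performance.now();

  const loop = (now: number) => {
    acc += Math.min((now - last) / 1000, 0.25); // clamp long frames
    last = now;
    while (acc >= dt) {
      const a = (-p.stiffness * x - p.damping * v) / p.mass;
      v += a * dt; // semi-implicit Euler: velocity first,
      x += v * dt; // then position with the updated velocity
      acc -= dt;
    }
    onFrame(x); // hand the deterministic state to the renderer
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}
```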

Challenges we ran into

  • The "Hallucination" Problem: Getting an LLM to output valid JSON that compiles into a working application is notoriously difficult. Early versions would invent non-existent components like super_cool_rocket. We solved this by implementing the strict Two-Stage Prompting pipeline and a rigorous schema validation layer that forces the AI to "think" before it codes.
  • Animation Synchronization: Keeping the visuals in step with the script was a nightmare of timing: a "swapped array element" has to animate at exactly the moment the narration mentions it. We had to build a deterministic event system where the AI predicts the timestamp of every action.
  • Complex Physics in JSON: Representing dynamic physics (like orbital trajectories) in a static JSON format was tricky. We had to abstract the physics into "Primitives" where the AI only configures the parameters (gravity constant, initial velocity) rather than writing the math logic itself.
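
A sketch of what that validation layer can look like, assuming zod for schema enforcement (the write-up does not name the actual library):

```ts
import { z } from "zod";
declare const rawModelOutput: string; // JSON text from the Engineer stage

// Anything outside the primitive library fails parsing, so an invented
// component like "super_cool_rocket" never reaches the renderer.
const ParamsSchema = z.record(z.union([z.number(), z.string(), z.boolean()]));

const LessonPlanSchema = z.object({
  title: z.string(),
  script: z.string(),
  nodes: z.array(
    z.object({
      id: z.string(),
      type: z.enum(["projectile_motion", "binary_tree", "logic_gate", "spring_mass"]),
      params: ParamsSchema,
    })
  ),
  events: z.array(
    z.object({
      t: z.number().nonnegative(),
      action: z.enum(["update_node", "highlight_node"]),
      target: z.string(),
      params: ParamsSchema.optional(),
    })
  ),
});

// safeParse reports errors instead of throwing, so the backend can feed the
// validation issues back to the model and ask it to try again.
const parsed = LessonPlanSchema.safeParse(JSON.parse(rawModelOutput));
if (!parsed.success) console.error(parsed.error.issues);
```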

Accomplishments that we're proud of

  • The "Cooking" Experience: We managed to make the generation process feel magical. The transition from a text prompt to a living, breathing canvas feels like true sci-fi.
  • Diverse Domain Support: We aren't just doing math. Simify successfully handles Computer Science (sorting algos, BSTs), Physics (kinematics, springs), Calculus (integrals), and Logic Circuits (K-Maps) all within the same unified engine.
  • Custom Game Engine: Building an InfiniteCanvas with zoom/pan capabilities and an entity-component system in pure React was a heavy lift, but it results in a buttery smooth 60fps experience.
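
The heart of any pan/zoom canvas is a single world-to-screen transform; a minimal sketch of that math (not the actual InfiniteCanvas code):

```ts
// One camera drives the whole canvas: every node lives in world space and is
// drawn through a single transform, so pan and zoom are O(1) state updates.
interface Camera {
  scale: number;
  offsetX: number;
  offsetY: number;
}

const toScreen = (cam: Camera, wx: number, wy: number) => ({
  x: wx * cam.scale + cam.offsetX,
  y: wy * cam.scale + cam.offsetY,
});

// Zoom about the cursor: the world point under the pointer stays fixed.
function zoomAt(cam: Camera, px: number, py: number, factor: number): Camera {
  return {
    scale: cam.scale * factor,
    offsetX: px - (px - cam.offsetX) * factor,
    offsetY: py - (py - cam.offsetY) * factor,
  };
}
```

Applied as a single CSS transform on a wrapper element, this keeps child components in world coordinates while panning and zooming never touch their props.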

What we learned

  • LLMs as Architects: We learned that LLMs are much better at coding when you ask them to design first and implement second. Treating the model like a two-person team (Designer + Dev) drastically reduced errors.
  • Generative UI is the Future: We discovered that the most powerful interface isn't one we build, but one the AI builds for us.

What's next for Simify

  • User Interaction: Allowing the user to grab the "Gravity" slider mid-simulation and derail the lesson to see what happens.
  • More Primitives: Expanding our library to include Chemistry (molecule bonding) and Biology (cellular mitosis).
  • Teacher Mode: Allowing educators to "edit" the generated JSON to fine-tune lessons for their classes.
