What inspired us
AI coding assistants generate code that works but doesn't match your codebase's style. They don't know your function names, patterns, or how your code connects. We built FastCTX to give them that context.
What we learned
LLMs don't need every detail of your code, just the important parts: function signatures, class relationships, and who calls what. Neo4j's graph database is a natural fit for storing and querying these code relationships, and the Model Context Protocol (MCP) makes it straightforward to surface them to AI tools.
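The "important parts" can be pulled out mechanically. A minimal sketch using Python's `ast` module (the sample source and helper name are illustrative, not our actual extractor):

```python
import ast

# Hypothetical sample source to outline.
SOURCE = """
class Greeter:
    def greet(self, name):
        return format_name(name)

def format_name(name):
    return name.title()
"""

def extract_outline(source):
    """Collect signatures and caller->callee edges, skipping everything else."""
    tree = ast.parse(source)
    signatures, calls = [], []
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            args = ", ".join(a.arg for a in node.args.args)
            signatures.append(f"{node.name}({args})")
            # Record simple-name calls made inside this function.
            for sub in ast.walk(node):
                if isinstance(sub, ast.Call) and isinstance(sub.func, ast.Name):
                    calls.append((node.name, sub.func.id))
    return signatures, calls

signatures, calls = extract_outline(SOURCE)
print(sorted(signatures))  # ['format_name(name)', 'greet(self, name)']
print(calls)               # [('greet', 'format_name')]
```

A few dozen lines like these tell an LLM far more about a codebase than raw file dumps of the same token budget.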
How we built it
We use Neo4j's LLM Graph Builder to extract code entities and relationships from source files. This creates a knowledge graph where functions, classes, and modules are nodes connected by relationships like "calls", "imports", and "inherits". An MCP server queries this graph to provide context to AI assistants.
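Querying the real graph needs a live Neo4j instance, but the shape of the data can be sketched in memory as (source, RELATION, target) triples; the node names below are made up for illustration:

```python
# In-memory stand-in for the knowledge graph the LLM Graph Builder produces.
EDGES = [
    ("app.create_user", "CALLS", "db.insert_user"),
    ("app.create_user", "CALLS", "utils.hash_password"),
    ("app", "IMPORTS", "db"),
    ("AdminUser", "INHERITS", "User"),
]

def neighbors(graph, node, relation):
    """Answer a context query such as 'what does create_user call?'."""
    return [dst for src, rel, dst in graph if src == node and rel == relation]

print(neighbors(EDGES, "app.create_user", "CALLS"))
# ['db.insert_user', 'utils.hash_password']
```

In Neo4j itself the same question is one Cypher query, roughly `MATCH (f:Function {name: "create_user"})-[:CALLS]->(g) RETURN g.name` (labels are our schema, not a Neo4j builtin).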
Challenges we faced
Setting up the Neo4j backend and getting the LLM Graph Builder to understand code structure took time. We had to define the right schema for code entities and tune the extraction process. Integrating with MCP required careful API design to make graph queries feel natural to AI assistants.
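Making graph queries "feel natural" mostly meant hiding Cypher behind simple tool calls. A sketch of the query-building side (the function name, parameters, and relationship labels are our assumptions, not a fixed MCP API):

```python
def context_query(depth=1):
    """Build a Cypher query fetching a symbol's neighborhood up to `depth` hops.

    The symbol name is passed separately as the $name query parameter;
    the Function label and relationship types reflect our schema.
    """
    return (
        "MATCH (n:Function {name: $name})"
        f"-[:CALLS|IMPORTS|INHERITS*1..{depth}]-(m) "
        "RETURN n, m"
    )

query = context_query(depth=2)
print(query)
```

An MCP tool wrapping this takes a plain symbol name from the assistant, runs the query with `$name` bound, and returns the neighborhood as context.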
Why it matters
Now when you ask an AI to write code, it knows your codebase's patterns. It suggests functions that actually exist and follows your naming conventions. It's the difference between a helpful assistant and one that actually understands your project.
Built With
- fastapi
- langchain
- react

