Inspiration
We wanted to chat with an AI about the information that lives on Piazza, but the security and privacy of user data was a concern. Our goal was to keep the benefits of a chatbot LLM while putting privacy and security first.
What it does
Chatzar is an interface that lets users chat with an LLM that has been fine-tuned on encrypted data. Queries are encrypted on device, so the exchange is fully end-to-end encrypted (E2EE).
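A minimal sketch of the on-device encryption idea, shown in Python for brevity (the real client logic lives in the React frontend); the key handling and query text here are purely illustrative:

```python
# Illustrative only: symmetric encryption where the key never leaves the
# user's device, so the server only ever handles ciphertext.
from cryptography.fernet import Fernet

device_key = Fernet.generate_key()   # generated and stored on the device
cipher = Fernet(device_key)

query = "When is the midterm?"       # hypothetical user query
encrypted_query = cipher.encrypt(query.encode())   # this is what leaves the device

# Responses are likewise decrypted locally, never in plaintext on the wire.
decrypted = cipher.decrypt(encrypted_query).decode()
assert decrypted == query
```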
How we built it
A React frontend sends queries to our FastAPI backend, which ties together the other services, Qdrant for vector search and Llama 3 for inference, and returns the response to the user.
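A rough sketch of that query flow, assuming a Qdrant collection named "course_docs", a sentence-transformers embedder, and a Llama 3 model served via Ollama; the endpoint names and parameters are assumptions, not our exact code:

```python
from fastapi import FastAPI
from pydantic import BaseModel
from qdrant_client import QdrantClient
from sentence_transformers import SentenceTransformer
import httpx

app = FastAPI()
qdrant = QdrantClient(url="http://localhost:6333")
embedder = SentenceTransformer("all-MiniLM-L6-v2")

class Query(BaseModel):
    text: str

@app.post("/chat")
async def chat(query: Query):
    # Embed the query and retrieve the most relevant chunks from Qdrant.
    vector = embedder.encode(query.text).tolist()
    hits = qdrant.search(collection_name="course_docs", query_vector=vector, limit=3)
    context = "\n".join(hit.payload["text"] for hit in hits)

    # Pass the retrieved context plus the question to Llama 3 for inference.
    prompt = f"Context:\n{context}\n\nQuestion: {query.text}\nAnswer:"
    async with httpx.AsyncClient() as client:
        resp = await client.post(
            "http://localhost:11434/api/generate",
            json={"model": "llama3", "prompt": prompt, "stream": False},
            timeout=60.0,
        )
    return {"answer": resp.json()["response"]}
```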
Challenges we ran into
Getting everything to work together, since there are so many moving pieces: the frontend, the backend, the vector database, and the remotely hosted model.
Accomplishments that we're proud of
Integrating the frontend with a vector database and connecting it to an LLM hosted in a remote container.
What we learned
How to build a retrieval-augmented generation (RAG) application.
What's next for Chatzar
Adding the ability to upload encrypted data and fine-tune on it.
Built With
- fastapi
- llama
- qdrant
- react