Inspiration: I lost my uncle (my Dad's little brother), who was a prison warder in Anambra State, Nigeria. He was violently killed while transporting a prisoner to the courthouse. A few months before he passed, he sent me a voice note to encourage and inspire me because I had just moved to America and was going through a rough time. His voice note became a source of solace and strength for me whenever I was going through anything, even years after he was gone. To me, the voice note felt like a preserved piece of his essence, an "echo" still reverberating from the vibrations he made while alive. I built Echo Vault to simulate the comfort my uncle's voice note gave me, but to make it even more advanced. I have no technical background, so when I first came up with the idea in 2020, it was more of a sci-fi fantasy. A few years later, AI and vibe coding have made it possible to actually preserve and immortalize a user's legacy digitally. And now, unlike my uncle's voice note, which was static, the user's legacy can be dynamic and realistic, using the user's real personality and memories to interact with their loved ones after they pass.

How it works: Echo Vault extracts the user's memories and personality traits through a diary session called a "check-in", or through the quick memory window on the home dashboard. The check-in is guided by a digital professional biographer, which is really just Gemini with a system prompt. The AI biographer walks the user through a series of smart questions designed to elicit the most salient memories and personality traits from the user's responses. When the session is concluded, the user clicks "complete check-in". That triggers an edge function that trains a personality model on the contents of the interaction: it extracts traits and memories from each conversation, and the extracted summaries are converted into vector embeddings that we use as a knowledge base to augment the Gemini LLM with the user's personality. The combination of a user's personality traits, memories, and behavioural patterns is what I refer to as an "echo". An echo is an application of RAG + an LLM to simulate a user's consciousness, and it is the core of Echo Vault. Fortunately for us, the hackathon was sponsored by ElevenLabs and Tavus, which made me realize I could integrate their conversational API interfaces to give my echo both my voice and my face. I added voice and video avatars for paid subscribers and left the text echo free to use.

The user is required to check in at least once every 2 weeks; this is another core aspect of the project. If a user misses 6 check-ins in a row, a deceased protocol is automatically triggered that releases the user's echo to whoever they appointed during onboarding, such as a spouse, relative, or friend. Users can appoint multiple echo custodians. Custodians can interact with the echo for free for the duration the user was subscribed, after which they have the option to extend the subscription themselves. In summary, Echo Vault extracts the user's memories and personality into 768-dimensional vector embeddings using Gemini's text-embedding-004 model, and captures the user's voice and likeness using Tavus and ElevenLabs. The goal is to digitally immortalize the user's essence through controlled diary sessions and vector extraction.
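
As an illustration of how the echo could answer a loved one, here is a minimal sketch of the RAG flow in TypeScript against the Gemini and Supabase client libraries. The `match_memories` similarity function, the table shape, and the `gemini-1.5-flash` chat model are assumptions made for the sketch, not the exact pieces inside Echo Vault.

```typescript
// Hedged sketch: assembling an "echo" reply with retrieval-augmented generation.
// Assumes extracted memories live in Supabase with a 768-dim pgvector column and
// a Postgres function `match_memories` (hypothetical name) for similarity search.
import { GoogleGenerativeAI } from "@google/generative-ai";
import { createClient } from "@supabase/supabase-js";

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_ANON_KEY!);

// 1. Embed the loved one's question with text-embedding-004 (768 dimensions).
async function embed(text: string): Promise<number[]> {
  const model = genAI.getGenerativeModel({ model: "text-embedding-004" });
  const res = await model.embedContent(text);
  return res.embedding.values;
}

// 2. Retrieve the most relevant extracted memories/traits for this user.
async function retrieveMemories(userId: string, queryEmbedding: number[]): Promise<string[]> {
  const { data, error } = await supabase.rpc("match_memories", {
    user_id: userId,
    query_embedding: queryEmbedding,
    match_count: 5,
  });
  if (error) throw error;
  return (data ?? []).map((row: { content: string }) => row.content);
}

// 3. Augment a Gemini chat model with the retrieved personality context.
export async function echoReply(userId: string, question: string): Promise<string> {
  const memories = await retrieveMemories(userId, await embed(question));
  const model = genAI.getGenerativeModel({ model: "gemini-1.5-flash" }); // placeholder model choice
  const prompt = [
    "You are speaking as this person, in their remembered voice and values.",
    "Relevant memories and traits:",
    ...memories.map((m) => `- ${m}`),
    `Their loved one asks: ${question}`,
  ].join("\n");
  const result = await model.generateContent(prompt);
  return result.response.text();
}
```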

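The missed-check-in rule can be pictured as a scheduled sweep, roughly like the sketch below. The `profiles` and `custodians` tables, their columns, and the notification helper are hypothetical names used only to illustrate the logic of six missed two-week check-ins.

```typescript
// Sketch of a scheduled job enforcing the deceased protocol.
// Assumption: a `profiles` table with `last_check_in` and `echo_released`
// columns, and a `custodians` table keyed by user id (hypothetical names).
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_SERVICE_ROLE_KEY!);

const CHECK_IN_INTERVAL_DAYS = 14; // a check-in is expected every 2 weeks
const MISSED_LIMIT = 6;            // 6 missed check-ins in a row triggers the protocol

export async function runDeceasedProtocolSweep() {
  // Anyone whose last check-in is older than 6 * 14 days is considered unresponsive.
  const cutoff = new Date(Date.now() - MISSED_LIMIT * CHECK_IN_INTERVAL_DAYS * 24 * 60 * 60 * 1000);

  const { data: lapsed, error } = await supabase
    .from("profiles")
    .select("id")
    .lt("last_check_in", cutoff.toISOString())
    .eq("echo_released", false);
  if (error) throw error;

  for (const profile of lapsed ?? []) {
    // Mark the echo as released, then notify every appointed custodian.
    await supabase.from("profiles").update({ echo_released: true }).eq("id", profile.id);
    const { data: custodians } = await supabase
      .from("custodians")
      .select("email")
      .eq("user_id", profile.id);
    for (const custodian of custodians ?? []) {
      await notifyCustodian(custodian.email, profile.id);
    }
  }
}

// Stand-in for however custodian invitations are actually delivered.
async function notifyCustodian(email: string, userId: string) {
  console.log(`Would email ${email} access to the echo of user ${userId}`);
}
```
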
How we built it: We built the project primarily on Bolt.new. I began the conception process on an app I had built about 90% of before the hackathon began, called Crystallizer. Crystallizer is programmed to pull a rough business idea out of your head and turn it into a fully functional, organized business plan and PRD. I brainstormed with Crystallizer to get the full business plan and PRD, then pasted Crystallizer's output into ChatGPT and asked it to optimize the PRD so it would be easily buildable by the Bolt.new agent. I used deep research to ensure the refinement was thorough and realistic. I then posted the PRD and prompt into Bolt and asked it to begin by scaffolding the first page, and the rest was history. Bolt gave me a beautiful UI and set up the entire database schema so solidly that it really helped stabilize my development, because I usually get derailed by database issues. I built the dashboard and all the pages on Bolt, then ran into an error loop when trying to add some of my edge functions. That forced me to move to Cursor to complete wiring the scaffold into the back end. A platform that was super helpful to me across every step was Supabase; it helped me quickly gain a working understanding of schema planning and maintenance.

Challenges we ran into: I had a hard time adding the edge functions on Bolt because the Supabase integration was logged into my old account. I had created a new Supabase project for this account, and hard as I tried, the UI wouldn't let me log in. So I had to manually connect my Supabase project using the anon key and URL. That worked fine for setting up the tables, but somehow triggered extensive errors when I tried to create the edge functions. Another challenge was determining the best way to create a data pipeline the user could use to gradually train their personality model. With the help of ChatGPT and Bolt, we were able to create a watertight plan that actually worked for us, using retrieval-augmented generation backed by an LLM. Another challenge was integrating Tavus to train avatars for us via its API. It was really difficult, and I ended up achieving it with two edge functions and a webhook working together. I had a similar experience while integrating Stripe.
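
For anyone curious about the webhook half of that Tavus wiring, the pattern looked roughly like the sketch below: one edge function starts avatar training through the API, and a second receives the completion callback and records the result. The payload field names and the `avatars` table are placeholders for illustration, not Tavus's documented schema.

```typescript
// Hedged sketch of a Supabase edge function acting as the webhook receiver.
// The callback body is assumed to carry an identifier and a status; the real
// Tavus payload may differ, so treat these field names as placeholders.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_SERVICE_ROLE_KEY")!,
);

Deno.serve(async (req) => {
  if (req.method !== "POST") {
    return new Response("Method not allowed", { status: 405 });
  }

  const payload = await req.json();
  const replicaId = payload.replica_id; // placeholder field name
  const status = payload.status;        // e.g. "ready" once training finishes

  if (replicaId && status === "ready") {
    // Record that the avatar finished training so the conversational UI
    // can start using it for video sessions.
    await supabase
      .from("avatars") // hypothetical table
      .update({ status: "ready" })
      .eq("replica_id", replicaId);
  }

  return new Response(JSON.stringify({ received: true }), {
    headers: { "Content-Type": "application/json" },
  });
});
```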

Accomplishments that we're proud of: We were able to successfully simulate synthetic consciousness. I am extremely competitive and a lover of sci-fi, but I have no coding skill, so when I heard of a no-code hackathon, I felt it was a chance to show how much my vibe coding had improved. I never intended to actually invent the echo system, but I did. Synthetic consciousness is no longer just sci-fi, as I have shown.

What we learned: I learnt that there is hardly any limit to what we can create once we understand that LLMs are operating systems and language is how we program them. A combination of strong language skills and a formidable appreciation of logic can make a vibe coder even better than a traditional coder.

What's next for Echo Vault: I hope to get acquired by Tavus, since I can't afford to pay for their API. If I don't get acquired, I hope to win some prize money from this hackathon to invest in API costs and staffing, since I'm a one-man show.

Updates


I have been out of circulation for a while since falling ill right after deploying Echo Vault. I returned today and updated auth so that users can only create profiles after confirming their email address; old unconfirmed users will still be allowed to log in and use their accounts. Also, judges can reach out to ugo.nwune@gmail.com from their official emails and I will create full premium accounts for them to bypass the paywall and test the full functionality of Echo Vault.
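
For context, the email gate is conceptually simple. A minimal sketch of the check, assuming profile creation passes through an edge function and a `profiles` table (hypothetical name), could look like this:

```typescript
// Sketch: only allow profile creation for users with a confirmed email.
// Assumes the client forwards its auth token; table and field names are placeholders.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
  Deno.env.get("SUPABASE_URL")!,
  Deno.env.get("SUPABASE_ANON_KEY")!,
);

Deno.serve(async (req) => {
  const token = req.headers.get("Authorization")?.replace("Bearer ", "") ?? "";
  const { data: { user }, error } = await supabase.auth.getUser(token);

  if (error || !user) {
    return new Response("Unauthorized", { status: 401 });
  }
  if (!user.email_confirmed_at) {
    // New accounts must confirm their email before a profile is created;
    // older, already-active accounts are not affected by this check.
    return new Response("Please confirm your email first", { status: 403 });
  }

  const body = await req.json();
  await supabase.from("profiles").insert({ id: user.id, display_name: body.display_name });
  return new Response(JSON.stringify({ ok: true }), {
    headers: { "Content-Type": "application/json" },
  });
});
```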
