Inspiration
We were inspired by emerging systems like SAM3D/SAM3 that analyze existing images and videos to extract information about their contents, and we wanted to build something meaningful on top of that capability.
What it does
CREATE.world lets users relive past experiences that currently exist only as video recordings and images.
How we built it
Using Google's MediaPipe library, we track joint motion data from just your webcam. We then feed that motion data into a Three.js scene so the user can interact with a 3D world using their body.
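To give a sense of how the pieces connect, here is a minimal sketch of webcam pose tracking driving a Three.js object with the @mediapipe/tasks-vision PoseLandmarker. The model URL, the right-wrist landmark index, and the coordinate scaling are illustrative assumptions rather than our exact implementation.

```ts
// Sketch: MediaPipe pose landmarks from the webcam drive a Three.js object.
// Assumes an ES module context with a <video> element on the page.
import { FilesetResolver, PoseLandmarker } from "@mediapipe/tasks-vision";
import * as THREE from "three";

const video = document.querySelector("video") as HTMLVideoElement;
video.srcObject = await navigator.mediaDevices.getUserMedia({ video: true });
await video.play();

// Load the WASM bundle and a pose landmarker model (lite variant assumed here).
const vision = await FilesetResolver.forVisionTasks(
  "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
);
const landmarker = await PoseLandmarker.createFromOptions(vision, {
  baseOptions: {
    modelAssetPath:
      "https://storage.googleapis.com/mediapipe-models/pose_landmarker/pose_landmarker_lite/float16/1/pose_landmarker_lite.task",
  },
  runningMode: "VIDEO",
  numPoses: 1,
});

// Minimal Three.js scene with a sphere that follows the user's right wrist.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(50, innerWidth / innerHeight, 0.1, 100);
camera.position.z = 3;
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);
const hand = new THREE.Mesh(new THREE.SphereGeometry(0.1), new THREE.MeshNormalMaterial());
scene.add(hand);

const RIGHT_WRIST = 16; // MediaPipe Pose landmark index for the right wrist

function tick() {
  const result = landmarker.detectForVideo(video, performance.now());
  const pose = result.landmarks[0];
  if (pose) {
    const wrist = pose[RIGHT_WRIST]; // normalized [0, 1] image coordinates
    // Map normalized coordinates into scene space (mirrored horizontally for a webcam feel).
    hand.position.set((0.5 - wrist.x) * 4, (0.5 - wrist.y) * 3, -wrist.z);
  }
  renderer.render(scene, camera);
  requestAnimationFrame(tick);
}
tick();
```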
Challenges we ran into
It was a struggle to run and deploy some powerful, freshly released models due to their steep hardware requirements and their limited support for cloud deployment paths (Dockerfiles, Hugging Face Inference, HTTP endpoints, etc.).
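For context, the hosted-inference route we experimented with looked roughly like the sketch below, which POSTs to a Hugging Face Inference API endpoint over HTTP. The model ID is a placeholder, and the exact endpoint behavior and response shape depend on the model; newly released, heavyweight models often are not served this way at all, which was exactly the problem we hit.

```ts
// Sketch of calling a model over the Hugging Face Inference API via HTTP.
// HF_TOKEN is a personal access token; the model ID passed in is a placeholder.
const HF_TOKEN = process.env.HF_TOKEN;

async function queryModel(modelId: string, inputs: unknown): Promise<unknown> {
  const response = await fetch(`https://api-inference.huggingface.co/models/${modelId}`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${HF_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ inputs }),
  });
  if (!response.ok) {
    // Large or freshly released models frequently fail here
    // (no hosted backend, still loading, or too big to serve).
    throw new Error(`Inference request failed: ${response.status} ${await response.text()}`);
  }
  return response.json();
}

// Example call with a placeholder model ID:
// const result = await queryModel("some-org/some-new-model", "a prompt or image payload");
```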
Accomplishments that we're proud of
Successfully using motion-tracking software and seamlessly integrating it with a 3D environment built in Three.js.
What we learned
How to use Bun (bun i and bun) for quickly installing dependencies and running the app, and Google's MediaPipe for motion tracking.
What's next for CREATE.world
Improve our integration with 3D model generation technologies to make building worlds even easier.
Built With
- 3js
- hugging-face
- mediapipe
- react