Inspiration

When hacking first began, the team was making paper cranes and fortune tellers over dinner while getting to know one another. We started talking about how difficult it is to follow YouTube tutorials and booklets, because they never give a 360-degree view of the piece from every angle as you fold. We thought an XR version of a 3D model could solve this issue.

What it does

Our project scans a sticker with a mobile device and overlays an AR model of a piece of paper on top of it. That virtual paper becomes a tap-by-tap tutorial for folding an origami fortune teller: the maker advances through each fold step by step while keeping a full view of the model at their fingertips, so they know what their piece should look like from every angle with nothing blocking it.
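As a minimal sketch, the tap-to-advance behavior can be handled by a small Unity script that plays the next fold's Animator state on each screen tap (the class name, state names, and step count below are assumptions for illustration, not our exact code):

```csharp
using UnityEngine;

// Hypothetical sketch: advance the fold animation by one named step per screen tap.
public class FoldStepController : MonoBehaviour
{
    [SerializeField] private Animator animator;   // Animator on the AR paper model
    [SerializeField] private int totalSteps = 8;  // number of fold steps (assumed)

    private int currentStep;

    void Update()
    {
        // When the user taps the screen, play the next step's animation state.
        if (Input.touchCount > 0 && Input.GetTouch(0).phase == TouchPhase.Began)
        {
            currentStep = Mathf.Min(currentStep + 1, totalSteps);
            animator.Play("Step" + currentStep);  // assumes states named Step1..StepN
        }
    }
}
```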

How we built it

We built our project using Blender to create and animate the 2D-to-3D paper models, and Unity's AR tooling to make those models interact with the real world. The Blender animation was exported and imported into Unity, where we built the mobile app interface and wrote C# scripts that tie the sticker scan and the animated objects together.
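As a rough illustration of how the sticker scan can drive placement of the animated model, here is a sketch using Unity AR Foundation's image tracking (the class and field names are hypothetical, and our actual scripts may differ):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Hypothetical sketch: when the sticker (a reference image) is detected,
// place the animated paper model on top of it.
public class StickerSpawner : MonoBehaviour
{
    [SerializeField] private ARTrackedImageManager imageManager;
    [SerializeField] private GameObject paperModelPrefab;  // Blender model imported into Unity

    void OnEnable()  => imageManager.trackedImagesChanged += OnChanged;
    void OnDisable() => imageManager.trackedImagesChanged -= OnChanged;

    private void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (var trackedImage in args.added)
        {
            // Parent the model to the tracked image so it stays pinned to the sticker.
            Instantiate(paperModelPrefab, trackedImage.transform);
        }
    }
}
```

Parenting the spawned model to the tracked image keeps the virtual paper anchored to the sticker as the phone moves around it.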

Challenges we ran into

Our team wanted to learn something new by building something we had little experience with. None of us had worked with AR or VR before, Blender was completely new to us, and a few of us had little to no experience with GitHub. We overcame these challenges by teaching each other what we knew and asking mentors for help.

Accomplishments that we're proud of

We are proud of how we came together as a team. Knowing this topic was entirely new to all of us, we treated it as an opportunity to divide up tasks, learn independently, and compare notes, so we could stay as efficient and on task as possible within the 36 hours we had to hack.

What we learned

Beyond teamwork, we learned Blender animation and AR development in Unity, tools that no one on the team had used before. Through careful exploration of their UIs, we built a workflow that let us create what we had in mind, with only minor changes to accommodate gaps in our knowledge.

What's next for ARigami

ARigami could add a replay button, generate AR tutorials from real-world videos of origami tutorials, and offer additional tutorials of increasing complexity.
