Inspiration
Fashion is the largest eCommerce market segment, expected to grow at 8.6% per year and reach a total market size of US$1003.5 billion by the end of 2025. But people find it difficult to judge how fashion accessories will look on them, especially when shopping online, and try-ons at physical stores have become less common because of the risks they pose during the Covid-19 pandemic. The existing virtual try-on solutions are cumbersome, costly, and closed-source. So we came up with a solution called Ararel.
What it does
The Ararel App is a complete eCommerce solution for apparel businesses. It employs state-of-the-art AR and machine learning technologies to present an accurate, real-time preview of how a fashion accessory will look on a person, enabling users to make an informed decision. The app is built as a PWA, which gives it a single code base across all platforms, advanced caching capabilities, and a better user experience, and it uses TWA to present itself as a native app on Android. Firebase and native JavaScript libraries let the app run serverless, which keeps maintenance low and easy.
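The caching that makes a PWA usable offline typically follows a cache-first strategy in the service worker. Here is a minimal sketch of that strategy, with a plain Map standing in for the browser Cache API so the logic can run outside the browser; in a real service worker you would call `caches.match()` and `cache.put()` inside a `'fetch'` event handler. The names below are illustrative, not Ararel's actual code.

```javascript
// Cache-first lookup: serve a cached response if we have one, otherwise
// fetch from the network and remember the result for next time. This is
// what lets previously visited parts of a PWA render while offline.
function cacheFirst(cache, url, fetchFn) {
  if (cache.has(url)) {
    return cache.get(url); // served from cache, so it works offline
  }
  const response = fetchFn(url); // fall back to the network
  cache.set(url, response);      // store the response for next time
  return response;
}
```

The same pattern, registered for the app shell and the 3D model assets, is what allows components of the app to load without a network connection.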
How I built it
We first built the AR component of the app. The initial version used the face-api.js library's machine-learning functions to calculate the position of a person's eyes and the ml5 library to render sunglasses onto them. But this approach had a few problems, one of them being that the sunglasses were 2D, so they would remain static instead of moving with the person. To solve this, the app now plots a 3D mesh of the person's eyes: it first tracks them, then uses predicted depth data (some cameras, like the ones used in VR headsets, can provide depth data directly, but here the input comes from a webcam, so the depth has to be predicted) to prepare a data frame with three rows representing the x coordinate, the y coordinate, and the predicted depth. It then detects whether the face is rotated and rotates the 3D model with it.

The next issue was the size of the glasses: if a person moves back, the glasses should shrink, and vice versa. To fix this, the app first sets the size of the sunglasses using the maximum perpendicular distance between any two straight coordinates of the eyes as the initial height. Then, as the person moves back, it uses the predicted depth to scale the 3D model accordingly.

Once the 3D model was done, a responsive, clean progressive web app was coded in HTML, CSS, and JavaScript and connected to Firebase. The app's code was minified, the app was hosted on Netlify, and it was exported as a native Android app using TWA. This approach now allows components of the app to be rendered even when the user is offline.
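The tracking maths above can be sketched with a few pure functions. The names, the point format, and the specific formulas here are our illustration of the idea, not the app's actual code: eye centres arrive as `{x, y, z}` points, where `z` is the depth predicted for the webcam frame.

```javascript
// Screen-space distance between the two eye centres; the glasses' initial
// width is derived from this in the first (calibration) frame.
function eyeDistance(left, right) {
  return Math.hypot(right.x - left.x, right.y - left.y);
}

// Scale factor relative to the calibration frame: as the wearer moves back,
// the on-screen eye distance shrinks and the 3D model shrinks with it.
function glassesScale(left, right, baseDistance) {
  return eyeDistance(left, right) / baseDistance;
}

// Head yaw estimated from the predicted depth difference between the eyes:
// if one eye is predicted farther from the camera, the face is turned, and
// the 3D model is rotated by the same angle.
function headYaw(left, right) {
  return Math.atan2(right.z - left.z, right.x - left.x);
}
```

With equal depths the yaw is zero (face straight on), and halving the on-screen eye distance halves the model's scale, which matches the move-back behaviour described above.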
Challenges I ran into
Time was one of the major problems we encountered, as we wanted to build an app that was very clean and had plenty of room for expansion. Making 3D AR models work in JavaScript was also a very big challenge.
Accomplishments that I'm proud of
We were able to achieve most of the functionality we wanted in Ararel, to the point where it is very close to being usable by small businesses. By focusing our target market on a niche area, we were able to develop an effective solution to a very prevalent problem, while still ensuring plenty of room for expansion so that the app works well in the long run.
What I learned
We learned a lot about 3D rendering and depth simulation in JavaScript.
What's next for Ararel
Implementing more accessories and reaching out to businesses that may want to deploy this technology.
Built With
- css3
- face-api.js
- firebase
- html5
- javascript
- ml5.js
- pwa
- twa

