Inspiration
Clothing shopping is a diverse experience, eliciting both joy and stress in different individuals, often rooted in their unique relationships with their bodies. For some, it's a delightful adventure, a chance to explore personal style and express themselves through fashion. Trying on various outfits can boost confidence and offer a sense of empowerment. For others, however, it can be anxiety-inducing, as societal standards and self-criticism often loom large. These individuals might feel the weight of unrealistic body expectations or struggle with body image issues, making the process of finding the right clothes a daunting one. Our project works toward embracing diversity, celebrating individuality, and fostering a more inclusive, body-positive shopping environment where everyone can enjoy the experience at their own pace and comfort level.
What it does
In our virtual reality experience, users commence their journey within a serene and accommodating changing room scene. Within this immersive environment, they encounter a menu that facilitates the input of measurements, obtained in reality with a fabric measuring tape. As users submit these measurements, our system crafts a virtual representation of their body tailored precisely to their provided data. Subsequently, users have the liberty to select their preferred attire, before unveiling a mirror that offers a true-to-life reflection of the chosen clothing on their unique body type. This dynamic mirror enables users to engage more intimately with their virtual selves, empowering them to assess how the garments drape and conform as they subtly manipulate their arms and legs.
Our software has been crafted with user comfort at the forefront of its design philosophy. While other virtual shopping simulations have struggled to serve customers dealing with self-esteem issues, we implemented several deliberate elements to address these concerns:
- Neutral Changing Room Environment: Users find themselves in a tranquil changing room setting. It strikes a balance between a bustling retail space and the stark sterility of a plain white backdrop, creating an ambiance that promotes ease and comfort.
- Generic Models: To foster a sense of detachment between the user and the virtual body, we employ generic models. This deliberate choice minimizes negative body-related thoughts, enabling users to approach the experience with a more neutral mindset, focused on the apparel's aesthetics.
- Neutral Lighting: Evaluation begins under neutral lighting, providing a calm, serene environment in which to focus on the clothes.
- Ambient Music Customization: Recognizing that sensory preferences vary, we let users toggle ambient music on or off, tailoring the auditory backdrop to their personal inclinations.
In essence, our VR shopping environment is meticulously designed to foster inclusivity, empower users with a positive and neutral perspective, and ultimately redefine the shopping experience in a manner that prioritizes individual comfort and self-confidence.
How we built it
The system was developed using the Unity game engine, with the aid of the UMA (Unity Multipurpose Avatar) plugin. This versatile tool allowed us to create models with adaptable bone structures, affording a high degree of customization. Our environment assets, including floor designs, curtains, and lighting fixtures, were crafted in Blender and seamlessly integrated into Unity using the standard 3D asset and design pipeline. We take pride in highlighting several key features:
- The user interface (UI/UX) was designed within the Unity framework, with graphical elements crafted using sprites. The functional aspects were implemented in C#. Users are presented with the option to choose between male and female body representations. Following this selection, users are prompted to input measurements previously obtained with a fabric measuring tape. Subsequently, a model is generated, sculpted to mirror the proportions determined by the user's input.
- User movements via handheld controllers, coupled with head rotation, influence the virtual body's actions through the application of inverse kinematics. These actions are reflected in a mirror, thoughtfully implemented through a C# script that utilizes the view of a second camera.
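As a rough illustration of the measurement step above, the mapping from tape-measure inputs to avatar proportions can be sketched with UMA's DNA system. This is a hypothetical sketch, not our exact script: the DNA names (`height`, `waist`) and the centimeter ranges used for normalization are assumptions that depend on the UMA race asset in use.

```csharp
using System.Collections.Generic;
using UMA.CharacterSystem;
using UnityEngine;

// Hypothetical sketch: maps tape-measure inputs (in cm) onto UMA DNA sliders.
public class MeasurementApplier : MonoBehaviour
{
    public DynamicCharacterAvatar avatar; // assigned in the Inspector

    public void Apply(float heightCm, float waistCm)
    {
        // UMA exposes each body parameter as a 0..1 "DNA" value.
        Dictionary<string, DnaSetter> dna = avatar.GetDNA();

        // Normalize real-world measurements into that 0..1 range.
        // The min/max constants here are illustrative guesses.
        dna["height"].Set(Mathf.InverseLerp(140f, 200f, heightCm));
        dna["waist"].Set(Mathf.InverseLerp(55f, 120f, waistCm));

        // Rebuild the avatar mesh so the new proportions take effect.
        avatar.BuildCharacter();
    }
}
```

Since UMA treats every body parameter as a normalized slider, the main design decision is choosing sensible real-world ranges to normalize each measurement against.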
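The mirror itself follows the standard Unity pattern of pointing a second camera at the avatar and drawing that camera's output onto a quad. The sketch below is illustrative rather than our exact implementation, assuming a `mirrorCamera` and `mirrorSurface` wired up in the Inspector.

```csharp
using UnityEngine;

// Hypothetical sketch: renders a second camera's view onto a quad so it acts as a mirror.
public class MirrorView : MonoBehaviour
{
    public Camera mirrorCamera;    // camera at the mirror's position, facing the avatar
    public Renderer mirrorSurface; // quad the "reflection" is displayed on

    void Start()
    {
        // Route the second camera's output into a texture instead of the headset.
        var tex = new RenderTexture(1024, 1024, 24);
        mirrorCamera.targetTexture = tex;
        mirrorSurface.material.mainTexture = tex;

        // Flip the quad's UVs horizontally so the image reads as a mirror reflection.
        mirrorSurface.material.mainTextureScale = new Vector2(-1f, 1f);
        mirrorSurface.material.mainTextureOffset = new Vector2(1f, 0f);
    }
}
```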
Challenges we ran into
UMA isn't widely used in the Unity community, which unfortunately means comprehensive documentation is scarce. When we hit obstacles, we spent hours combing through the plugin's large source repository in search of fixes. Working with VR hardware added another layer of complexity: with more variables in play, a wider array of errors can manifest.
Accomplishments that we're proud of
Building an entire scene in Unity is not a quick task, and we worked very hard to expedite the process without sacrificing too many key features.
What we learned
Debugging while sleepy is not best practice :(
What's next for Virtual Fit Studio
We would like to add functionality we did not have time to get to: for instance, the ability to change lighting color and saturation, a hex-code selector for clothing color, placing the user in different surroundings such as a park or an office, and more.

