Inspiration
We've always loved technology and AI, but fashion has never really been our strong suit. Picking out outfits and matching clothes is something we've struggled with more often than not. When our team got access to the new Snap Spectacles with AR/VR capabilities, we saw an opportunity to bridge that gap.
That’s how SnapDrobe was born — a fusion of Snapchat’s visual-first experiences and a smart, AI-powered wardrobe assistant. We wanted to create a seamless way for anyone (fashion experts or beginners like us) to build, manage, and interact with their wardrobe in a smarter way. Imagine seeing a cool jacket at a store or spotting someone with a stylish outfit — with SnapDrobe, all you have to do is say “add to wardrobe,” and it captures the essence of what you’re looking at.
SnapDrobe helps you not just build your wardrobe effortlessly, but also suggests complete outfits based on your collection, the event, the weather, and the time of day, all powered by AI. Our goal was to make fashion less stressful and a lot more fun, using cutting-edge technology and natural interaction through the Snap Spectacles.
What it does
SnapDrobe transforms how you build and interact with your wardrobe — all through a simple voice command and your Snap Spectacles. Here’s what SnapDrobe does:
Instant Wardrobe Capture
While wearing the Snap Spectacles, if you see an outfit, accessory, or item you like — whether it’s in a store, on a person, or anywhere else — you simply say, “Add to wardrobe.”
SnapDrobe captures an image of the item through the Spectacles.
AI-Powered Item Description
Instead of saving bulky images, SnapDrobe uses AI to analyze the captured image and generate a detailed description of the clothing item (e.g., “light blue denim jacket with silver buttons, oversized fit”).
This description is then stored in a cloud database (DynamoDB) as part of your personal wardrobe collection.
Smart Outfit Suggestions
When you want help picking what to wear, SnapDrobe comes to the rescue.
You can talk to it naturally — for example, “I’m going to a rooftop party tonight” or “I need something casual for a sunny afternoon stroll.”
SnapDrobe analyzes:
- Your saved wardrobe items
- The weather at your location
- The event context you provide
- The time of day
Based on all these factors, it recommends a full outfit for you.
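As a rough sketch of how these context signals could be combined, here is a minimal Python illustration. The class name, thresholds, and bucket labels are our own illustration, not the exact production logic, which uses live OpenWeather data:

```python
from dataclasses import dataclass

@dataclass
class OutfitContext:
    """Context signals SnapDrobe weighs before suggesting an outfit."""
    event: str          # e.g. "rooftop party"
    temperature_c: float
    hour: int           # 0-23, local time

    @property
    def weather_bucket(self) -> str:
        # Illustrative fixed cut-offs; the real service reads live weather data.
        if self.temperature_c < 10:
            return "cold"
        if self.temperature_c < 22:
            return "mild"
        return "warm"

    @property
    def time_of_day(self) -> str:
        if 5 <= self.hour < 12:
            return "morning"
        if 12 <= self.hour < 18:
            return "afternoon"
        return "evening"

ctx = OutfitContext(event="rooftop party", temperature_c=24.0, hour=21)
print(ctx.weather_bucket, ctx.time_of_day)  # warm evening
```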
Complete Look Recommendations
The outfit suggestion isn’t just one piece of clothing — SnapDrobe gives you a head-to-toe look:
- Topwear (shirt, t-shirt, etc.)
- Bottomwear (jeans, pants, skirts, etc.)
- Footwear (shoes, boots, sneakers)
- Optional Accessories (caps, jackets, bags)
Visual Outfit Previews
To help you visualize the final look, SnapDrobe generates an AI-powered mockup image showing the complete outfit styled together.
This way, you can actually see what your outfit will look like before putting it on.
In short, SnapDrobe acts like your personal fashion assistant — helping you grow your wardrobe effortlessly and making smart, stylish outfit decisions using AI and real-world context — all with the natural, hands-free experience enabled by Snap Spectacles.
How we built it
We were a team of three who came together to build SnapDrobe, each taking ownership of different parts of the project. None of us had prior experience working with Snap Spectacles or Lens Studio, so the first big step was diving deep into the documentation and understanding how the Spectacles hardware and the Spectacles Interaction Kit worked.
Snap Spectacles + Lens Studio
One of us focused entirely on building the experience inside the Snap Spectacles.
Using Lens Studio and TypeScript, we designed and built the UI that the user sees when interacting through the Spectacles.
We used the Spectacles Interaction Kit to capture user inputs like the “Add to wardrobe” or “Ask for outfit” commands.
A lot of time was spent understanding how to build interactive, responsive interfaces for an AR/VR environment — completely new territory for us.
Backend Architecture
For the backend, we built two main API endpoints using Python along with FetchAI’s UAgent framework to handle communications between the devices and the cloud.
Submit Endpoint (POST)
This endpoint captures the visual input from the Spectacles. However, instead of simply storing the raw image, we process it using the Gemini API:
- The Gemini model analyzes the image and generates a structured JSON containing:
- Name of the clothing item
- Target gender
- Primary and secondary colors (with hex codes for precision)
- Material type
- Best-suited weather conditions
- Ideal time of day to wear it
- Suggested occasions (casual, formal, etc.)
This structured JSON is then stored in Amazon DynamoDB as a new entry in the user’s digital wardrobe.
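A minimal sketch of how the model's structured output can be validated before it reaches the database. The field names and helper are our own illustration (the real pipeline's schema may differ), and the final DynamoDB write is only indicated in a comment:

```python
import json

# Fields the Gemini prompt asks for per captured item (illustrative names).
REQUIRED_FIELDS = {
    "name", "target_gender", "primary_color_hex", "secondary_color_hex",
    "material", "weather", "time_of_day", "occasions",
}

def parse_item_description(gemini_response_text: str) -> dict:
    """Validate the model's JSON so only well-formed items are stored."""
    item = json.loads(gemini_response_text)
    missing = REQUIRED_FIELDS - item.keys()
    if missing:
        raise ValueError(f"Gemini response missing fields: {sorted(missing)}")
    return item

# Example of the kind of payload the prompt is designed to elicit:
sample = json.dumps({
    "name": "light blue denim jacket",
    "target_gender": "unisex",
    "primary_color_hex": "#9DB8D2",
    "secondary_color_hex": "#C0C0C0",
    "material": "denim",
    "weather": "mild",
    "time_of_day": "any",
    "occasions": ["casual"],
})
item = parse_item_description(sample)
# A boto3 call such as table.put_item(Item=item) would then persist it.
```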
Ask Endpoint (POST)
When the user wants an outfit recommendation, they voice their request through the Spectacles. This input is recorded and sent to the backend.
- The backend uses speech-to-text conversion.
- The text prompt is analyzed with Gemini API, combined with real-time data fetched from the OpenWeather API (such as current weather, temperature, time of day, and location).
- It pulls all items from the DynamoDB wardrobe, matches items based on the context (event type, weather suitability, etc.), and intelligently selects a full outfit (topwear, bottomwear, footwear, accessories).
- Once the outfit is assembled, Gemini generates a visual mockup showing what the complete outfit looks like together.
- This generated outfit preview is then displayed back to the user on the Spectacles.
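The item-matching step can be sketched as a simple rule-based filter — one item per category that fits the current weather and occasion. This is an illustrative stand-in for the Gemini-driven selection, with our own field names:

```python
def pick_outfit(wardrobe, weather, occasion):
    """Pick one wardrobe item per category matching weather and occasion.

    Illustrative rule-based stand-in for the AI-driven selection step.
    Returns None when no complete outfit can be assembled.
    """
    outfit = {}
    for category in ("topwear", "bottomwear", "footwear", "accessory"):
        candidates = [
            item for item in wardrobe
            if item["category"] == category
            and weather in item["weather"]
            and occasion in item["occasions"]
        ]
        if candidates:
            outfit[category] = candidates[0]["name"]
        elif category != "accessory":  # accessories are optional
            return None
    return outfit

wardrobe = [
    {"category": "topwear", "name": "linen shirt",
     "weather": ["warm", "mild"], "occasions": ["casual"]},
    {"category": "bottomwear", "name": "chino shorts",
     "weather": ["warm"], "occasions": ["casual"]},
    {"category": "footwear", "name": "white sneakers",
     "weather": ["warm", "mild"], "occasions": ["casual", "formal"]},
]
print(pick_outfit(wardrobe, "warm", "casual"))
```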
End-to-End Flow
From seeing an item in the real world → capturing its description → building a personal wardrobe → requesting a context-aware outfit suggestion → and visualizing the suggested look — SnapDrobe delivers the entire cycle seamlessly using Snap Spectacles, AI models, cloud databases, and real-time environment sensing.
Building this system from scratch — especially learning Lens Studio, Spectacles Interaction Kit, setting up cloud infrastructure, and chaining AI models with user prompts — made this an incredibly challenging but rewarding project.
Challenges we ran into
- None of us had prior experience with Snap Spectacles or Lens Studio, so we had to learn the hardware, the Spectacles Interaction Kit, and AR interface design from scratch while building.
- Designing interactive, responsive interfaces for an AR/VR environment was completely new territory and took significant trial and error.
- Chaining multiple AI models (image analysis, speech-to-text, and outfit generation) into one reliable pipeline, and wiring it to cloud infrastructure, was difficult under tight hackathon timelines.
Accomplishments that we're proud of
- Successfully built a full end-to-end system connecting AR wearables, AI processing, cloud databases, and real-time environment sensing.
- Designed and deployed an intuitive, voice-activated wardrobe management system.
- Navigated and built a production-ready experience in Lens Studio and Snap Spectacles without any prior AR/VR experience.
- Learned to integrate multiple technologies seamlessly, from AI language models to distributed agents using FetchAI.
- Created an experience that genuinely solves a real-world pain point — making fashion decisions easier, fun, and accessible.
What we learned
Working with the Gemini API:
Learned how to structure prompts, parse AI-generated outputs, and integrate Gemini into real-world applications for both text and visual generation.
Building with FetchAI and UAgent:
Explored how agent-based systems can manage backend workflows, enabling dynamic, distributed communication between devices and cloud services.
Getting comfortable with TypeScript:
Rapidly picked up TypeScript and used it to build robust UI and interactions inside Lens Studio.
Lens Studio and Snap developer tools:
Gained hands-on experience with Snap’s AR development environment and learned the critical importance of documentation and platform guidelines.
Teamwork and collaboration:
Learned the importance of clear communication, division of responsibilities, and supporting each other through steep technical challenges under tight timelines.
What's next for SnapDrobe
Expand personalization:
Implement advanced user profiles and style learning over time for even better outfit recommendations.
Try-on AR previews:
Enable AR overlays of recommended outfits onto the user’s body through the Spectacles.
Friend wardrobes:
Allow users to share and collaborate on wardrobes with friends and get outfit suggestions together.
Contextual notifications:
Proactively suggest outfits based on the user’s calendar events, upcoming trips, or even spontaneous weather changes.
Offline mode:
Enable SnapDrobe to function even when network connectivity is limited by caching wardrobe items locally.
SnapDrobe is just getting started — we believe it can change how people interact with their personal style forever.
Built With
- amazon-dynamodb
- fetchai
- geminiapi
- lensstudio
- openweather
- python
- render.com
- spectacles-interaction-kit
- typescript
