Inspiration
We grew up in "ingredient households": there was food in the fridge, but no plan. That turned into:
- wasted groceries
- indecisiveness
- ordering takeout when we could have cooked
SnapChef is our answer: snap a few photos of your fridge, confirm its contents, and instantly see the recipes you can cook right now!
What it does
- Fridge scan: user takes/uploads photos
- Identify ingredients: computer vision detects foods, user can add/remove anything we missed
- User toggles: mark ingredients as MUST, OPTIONAL, or EXCLUDE
- Pantry basics assumed: salt, oil, pepper, etc. are included automatically but hidden from the UI
- Shows cookable recipes: fast search over the recipe set, sortable by best match or shortest cook time
- Open recipes: steps and ingredients for the selected dish are shown in-app, with a link to the original
How we built it
Frontend: Ionic React (TypeScript), which runs on both web and mobile
Computer vision:
- trained an object-detection model on ingredient datasets found online
- used YOLOv8 as the base model and fine-tuned it on Roboflow
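A minimal sketch of how the app might query the fine-tuned model through Roboflow's hosted inference endpoint. The model path (`snapchef-fridge/1`), API key, and confidence threshold are placeholders, not the real project values:

```typescript
// Shape of a single detection returned by the inference API.
type Prediction = { class: string; confidence: number };

// Send a base64-encoded fridge photo to a (hypothetical) Roboflow-hosted
// model and return the raw detections.
async function detectIngredients(imageBase64: string): Promise<Prediction[]> {
  const url = "https://detect.roboflow.com/snapchef-fridge/1?api_key=YOUR_KEY";
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: imageBase64,
  });
  const data = await res.json();
  return data.predictions ?? [];
}

// Collapse raw detections into the unique, confident ingredient list
// that the user then confirms or edits in the UI.
function uniqueIngredients(preds: Prediction[], minConf = 0.5): string[] {
  const seen = new Set<string>();
  for (const p of preds) {
    if (p.confidence >= minConf) seen.add(p.class.toLowerCase());
  }
  return [...seen].sort();
}
```

Deduplicating and thresholding before showing results is what makes the "add/remove anything we missed" confirmation step manageable.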
Backend: Node.js + SQLite, with a recipe database derived from the Food.com dataset on Kaggle
- tables: recipes, recipe_ingredients
- strict matches: every MUST ingredient must appear, no EXCLUDE ingredient may appear, and each OPTIONAL match improves the rank
- context guards: prevent false hits (e.g. "garlic" != "garlic powder")
- speed: a token index + precomputed ingredient counts keep queries fast over the large dataset
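The matching rules above can be sketched as follows. The function names and the compound-extender list are illustrative, and the real backend runs these rules against the token index rather than scanning strings per query:

```typescript
type Pref = "MUST" | "OPTIONAL" | "EXCLUDE";

// Words that extend an ingredient into a different one, so that a plain
// "garlic" does not count as a hit on "garlic powder". Illustrative list.
const COMPOUND_EXTENDERS = new Set(["powder", "salt", "paste", "oil"]);

// Does a recipe ingredient line contain the user's ingredient as a
// whole phrase, guarded against compound false hits?
function matchesIngredient(recipeIng: string, userIng: string): boolean {
  const tokens = recipeIng.toLowerCase().split(/[^a-z]+/).filter(Boolean);
  const phrase = userIng.toLowerCase().split(/[^a-z]+/).filter(Boolean);
  for (let i = 0; i + phrase.length <= tokens.length; i++) {
    if (!phrase.every((t, j) => tokens[i + j] === t)) continue;
    // Context guard: reject when the match is extended into a compound.
    const next = tokens[i + phrase.length];
    if (next === undefined || !COMPOUND_EXTENDERS.has(next)) return true;
  }
  return false;
}

// Apply the strict rules: null means "not cookable"; otherwise a higher
// score (count of OPTIONAL matches) ranks the recipe higher.
function scoreRecipe(recipeIngs: string[], prefs: Map<string, Pref>): number | null {
  let score = 0;
  for (const [ing, pref] of prefs) {
    const present = recipeIngs.some((r) => matchesIngredient(r, ing));
    if (pref === "MUST" && !present) return null;
    if (pref === "EXCLUDE" && present) return null;
    if (pref === "OPTIONAL" && present) score++;
  }
  return score;
}
```

Returning `null` instead of a low score is the "strict" part: a recipe missing a MUST ingredient is filtered out entirely rather than ranked lower.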
Challenges we ran into
- Messy fridge photos: mitigated with a manual confirmation step
- Ingredient ambiguity: mitigated with phrase-level matching
- Dataset size/performance: replaced many slow LIKE queries with a token index
- UX clarity: long ingredient lists cluttered the screen, so we hid the assumed pantry basics
- Computer vision accuracy: only ~75%, since we had just an hour of training time
Accomplishments that we're proud of
- A functional cross-platform app with a very simple flow (photos --> ingredients --> recipes)
- Clean, simple UI/UX
- Fast, strict recipe search
What we learned
- Correctness matters more than volume: strict matches beat lots of loose substitutions!
- Efficiency with large datasets: we had to be smarter about which data we query and how
- Small UX choices make big differences: clutter and extra details add friction and errors
What's next
- A better computer vision model, trained on a larger dataset with more time, for more accurate identification
- nutrition/compost information to further sustainability goals
- nudges based on the estimated expiration dates of ingredients
- household sharing so roommates/family can sync their ingredients
