Inspiration
AI can generate UI mockups easily. The hard part is turning them into real code. DevSketch Mobile does exactly that — capture any mockup, get Flutter code in seconds. No cloud, no waiting.
What it does
- Capture any UI mockup (camera or upload)
- Detect 12 UI element types using custom-trained AI
- Generate working Flutter code with Material 3
- Export complete Flutter project ready to build
No cloud. No API calls. 100% on-device.
How we built it
Off-the-shelf pre-trained object detectors aren't trained on UI elements, so we trained our own:
- Dataset: VINS — 4,800 labeled UI screenshots
- Model: YOLOv8-Nano, 50 epochs, exported to Core ML
- Stack: SwiftUI + Core ML + Vision framework
- Inference: ~250ms on the Apple Neural Engine (Arm); see the inference sketch below
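For the curious, here's a minimal sketch of what the Vision + Core ML inference path looks like. The `UIDetector` class name and the `DetectedElement` type are illustrative stand-ins for the generated model class and our result type, not the exact project code:

```swift
import Vision
import CoreML
import UIKit

/// Minimal sketch of on-device UI element detection with Vision + Core ML.
/// `UIDetector` stands in for the compiled YOLOv8 .mlpackage; names are illustrative.
struct DetectedElement {
    let label: String          // e.g. "Button", "TextField"
    let confidence: Float
    let boundingBox: CGRect    // normalized, Vision coordinate space (origin bottom-left)
}

func detectElements(in image: UIImage, completion: @escaping ([DetectedElement]) -> Void) {
    guard let cgImage = image.cgImage,
          let coreMLModel = try? UIDetector(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else {
        completion([])
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // With NMS baked into the exported pipeline, Vision returns
        // VNRecognizedObjectObservation values directly.
        let observations = request.results as? [VNRecognizedObjectObservation] ?? []
        let elements = observations.compactMap { obs -> DetectedElement? in
            guard let top = obs.labels.first else { return nil }
            return DetectedElement(label: top.identifier,
                                   confidence: top.confidence,
                                   boundingBox: obs.boundingBox)
        }
        completion(elements)
    }
    request.imageCropAndScaleOption = .scaleFill

    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: .up)
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

Because NMS is configured in the exported model pipeline, Vision hands back recognized-object observations directly and no manual box decoding is needed.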
Challenges we ran into
- Dataset selection — Finding UI-specific training data (solved with VINS)
- Core ML export — Configuring non-maximum suppression (NMS) in the exported model pipeline
- Layout logic — Converting 2D bounding boxes into a Flutter widget hierarchy (see the layout sketch after this list)
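The layout challenge boils down to: sort detections top-to-bottom, group boxes whose vertical extents overlap into Rows, and wrap the rows in a Column. Here's a simplified sketch of that idea; the `Box` type, the grouping rule, and the widget mapping are illustrative, not our exact implementation:

```swift
import CoreGraphics

/// Simplified sketch of turning detected boxes into a Flutter widget tree:
/// sort top-to-bottom, group vertically overlapping boxes into Rows,
/// and wrap everything in a Column.
struct Box {
    let label: String      // e.g. "Button", "TextField", "Image"
    let rect: CGRect       // assumed already converted to top-left image coordinates
}

func flutterWidget(for label: String) -> String {
    switch label {
    case "Button":    return "ElevatedButton(onPressed: () {}, child: const Text('Button'))"
    case "TextField": return "const TextField()"
    case "Image":     return "const Placeholder(fallbackHeight: 120)"
    default:          return "const Text('\(label)')"
    }
}

func generateLayout(from boxes: [Box]) -> String {
    // 1. Sort by vertical position.
    let sorted = boxes.sorted { $0.rect.minY < $1.rect.minY }

    // 2. Greedily group boxes that overlap the current row's first element.
    var rows: [[Box]] = []
    for box in sorted {
        if var last = rows.last, let anchor = last.first,
           box.rect.minY < anchor.rect.maxY {
            last.append(box)
            rows[rows.count - 1] = last
        } else {
            rows.append([box])
        }
    }

    // 3. Emit a Column of Rows, ordering each row left-to-right.
    let children = rows.map { row -> String in
        let widgets = row
            .sorted { $0.rect.minX < $1.rect.minX }
            .map { flutterWidget(for: $0.label) }
        return widgets.count == 1
            ? widgets[0]
            : "Row(children: [\(widgets.joined(separator: ", "))])"
    }
    return "Column(children: [\(children.joined(separator: ", "))])"
}
```

Single-element rows collapse to the widget itself, which keeps the generated tree from being needlessly nested.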
Accomplishments that we're proud of
- Custom-trained AI model — Not generic pre-trained weights, but a purpose-built UI detector
- ~250ms inference — Real-time, on-device detection using the Apple Neural Engine on Arm-based silicon
- 12 UI element types — Comprehensive detection for common mobile UI patterns
- Complete project export — Not just code snippets, but runnable Flutter apps with pubspec.yaml, main.dart, and proper structure (a minimal export sketch follows this list)
- Privacy-first architecture — Zero cloud dependencies, all processing on-device
- Clean codebase — Modular services for camera, ML inference, code generation, and export
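As a rough sketch, the export step boils down to writing templated pubspec.yaml and lib/main.dart files into a shareable folder. The templates and names below are illustrative placeholders, not the exact ones we generate:

```swift
import Foundation

/// Minimal sketch of the project export step: write a runnable Flutter project
/// (pubspec.yaml + lib/main.dart) into a temporary folder that can be shared.
/// The templates below are illustrative placeholders.
func exportFlutterProject(named name: String, bodyWidget: String) throws -> URL {
    let root = FileManager.default.temporaryDirectory
        .appendingPathComponent(name, isDirectory: true)
    let lib = root.appendingPathComponent("lib", isDirectory: true)
    try FileManager.default.createDirectory(at: lib, withIntermediateDirectories: true)

    let pubspec = """
    name: \(name)
    description: Generated by DevSketch Mobile
    environment:
      sdk: ">=3.0.0 <4.0.0"
    dependencies:
      flutter:
        sdk: flutter
    flutter:
      uses-material-design: true
    """

    let mainDart = """
    import 'package:flutter/material.dart';

    void main() => runApp(const GeneratedApp());

    class GeneratedApp extends StatelessWidget {
      const GeneratedApp({super.key});

      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          theme: ThemeData(useMaterial3: true),
          home: Scaffold(body: \(bodyWidget)),
        );
      }
    }
    """

    try pubspec.write(to: root.appendingPathComponent("pubspec.yaml"),
                      atomically: true, encoding: .utf8)
    try mainDart.write(to: lib.appendingPathComponent("main.dart"),
                       atomically: true, encoding: .utf8)
    return root
}
```

A full export also needs the usual platform folders, which running `flutter create .` inside the exported directory can regenerate from these files.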
What we learned
- How to train and export custom YOLOv8 models for Core ML
- The power of on-device ML with the Apple Neural Engine on Arm-based silicon
- Core ML and Vision framework integration patterns
- Template-based code generation strategies
- The critical importance of dataset selection for domain-specific ML tasks
Key insight: The model is only as good as its training data. VINS gave us clean, labeled UI screenshots — and that made all the difference.
What's next for DevSketch Mobile
- Expand Training Data — Train on more diverse UI styles (iOS, Material, custom designs)
- Hand-drawn Sketch Support — Fine-tune model on wireframes and paper sketches
- Multi-page Detection — Recognize navigation flows between screens
- More Frameworks — Support React Native and SwiftUI output
- Real-time Preview — Live Flutter preview directly on device
- Figma Integration — Export detected elements to Figma for design refinement