Inspiration

AI can generate UI mockups easily. The hard part is turning them into real code. DevSketch Mobile does exactly that — capture any mockup, get Flutter code in seconds. No cloud, no waiting.

What it does

  1. Capture any UI mockup (camera or upload)
  2. Detect 12 UI element types using custom-trained AI
  3. Generate working Flutter code with Material 3
  4. Export complete Flutter project ready to build

No cloud. No API calls. 100% on-device.

How we built it

Off-the-shelf object detectors aren't trained on UI elements, so we trained our own:

  • Dataset: VINS — 4,800 labeled UI screenshots
  • Model: YOLOv8-Nano, 50 epochs, exported to CoreML
  • Stack: SwiftUI + Core ML + Vision framework
  • Inference: ~250ms on Apple Neural Engine (Arm)
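
Here's a minimal Swift sketch of the inference side of that stack, assuming the exported model compiles to a generated class we'll call UIDetector (the name is illustrative, not our actual class):

```swift
import CoreML
import Vision

// Minimal inference sketch. `UIDetector` stands in for the class Xcode
// generates from the exported Core ML model; the name is illustrative.
func detectElements(in image: CGImage) throws -> [VNRecognizedObjectObservation] {
    let config = MLModelConfiguration()
    config.computeUnits = .all  // let Core ML schedule work onto the Neural Engine

    let detector = try UIDetector(configuration: config)
    let request = VNCoreMLRequest(model: try VNCoreMLModel(for: detector.model))
    request.imageCropAndScaleOption = .scaleFill  // match the model's square input

    try VNImageRequestHandler(cgImage: image).perform([request])

    // With NMS baked into the exported pipeline, Vision returns deduplicated
    // observations (class label + normalized bounding box) directly.
    return request.results as? [VNRecognizedObjectObservation] ?? []
}
```

Baking NMS into the exported pipeline is also why Vision can hand back ready-made observations instead of raw output tensors.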

Challenges we ran into

  • Dataset selection — Finding UI-specific training data (solved with VINS)
  • CoreML export — Configuring NMS in the exported model pipeline
  • Layout logic — Converting flat 2D bounding boxes into a nested Flutter widget hierarchy (sketched after this list)
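
To make the layout challenge concrete, here's a simplified sketch of the kind of heuristic involved: sort detections top-to-bottom, group boxes whose vertical centers overlap into rows, then emit a Column of Rows. The names, threshold, and widget table below are hypothetical stand-ins, not our exact implementation:

```swift
import CoreGraphics

struct Detection {
    let label: String  // one of the 12 detected element types, e.g. "Button"
    let box: CGRect    // normalized coordinates, origin at top-left
}

// Hypothetical heuristic: sort top-to-bottom, bucket boxes whose vertical
// centers overlap into rows, then emit each row left-to-right.
func flutterLayout(for detections: [Detection]) -> String {
    var rows: [[Detection]] = []
    for d in detections.sorted(by: { $0.box.minY < $1.box.minY }) {
        if let anchor = rows.last?.first,
           abs(d.box.midY - anchor.box.midY) < anchor.box.height / 2 {
            rows[rows.count - 1].append(d)   // same visual row
        } else {
            rows.append([d])                 // start a new row
        }
    }
    let rowSource = rows.map { row -> String in
        let children = row
            .sorted { $0.box.minX < $1.box.minX }
            .map(widget(for:))
            .joined(separator: ", ")
        return "Row(children: [\(children)])"
    }.joined(separator: ",\n  ")
    return "Column(children: [\n  \(rowSource)\n])"
}

// Map a detected label to a Material 3 widget snippet (partial table).
func widget(for detection: Detection) -> String {
    switch detection.label {
    case "Button":    return "FilledButton(onPressed: () {}, child: Text('Button'))"
    case "TextField": return "TextField()"
    case "Image":     return "Placeholder()"
    default:          return "SizedBox()"
    }
}
```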

Accomplishments that we're proud of

  • Custom-trained AI model — Not generic pre-trained weights, but a purpose-built UI detector
  • ~250ms inference — Real-time detection on-device via the Apple Neural Engine (Arm)
  • 12 UI element types — Comprehensive detection for common mobile UI patterns
  • Complete project export — Not just code snippets, but runnable Flutter apps with pubspec.yaml, main.dart, and proper structure (sketched after this list)
  • Privacy-first architecture — Zero cloud dependencies, all processing on-device
  • Clean codebase — Modular services for camera, ML inference, code generation, and export
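
Here's a pared-down sketch of the template approach behind the project export. The template strings, paths, and function name are illustrative stand-ins for our production templates:

```swift
import Foundation

// Hypothetical export sketch: fill string templates for pubspec.yaml and
// main.dart, then write a minimal runnable Flutter project tree.
func exportProject(named name: String, body: String, to root: URL) throws {
    let pubspec = """
    name: \(name)
    environment:
      sdk: ">=3.0.0 <4.0.0"
    dependencies:
      flutter:
        sdk: flutter
    """

    let mainDart = """
    import 'package:flutter/material.dart';

    void main() => runApp(MaterialApp(
      theme: ThemeData(useMaterial3: true),
      home: Scaffold(body: \(body)),
    ));
    """

    let lib = root.appendingPathComponent("lib")
    try FileManager.default.createDirectory(at: lib, withIntermediateDirectories: true)
    try pubspec.write(to: root.appendingPathComponent("pubspec.yaml"),
                      atomically: true, encoding: .utf8)
    try mainDart.write(to: lib.appendingPathComponent("main.dart"),
                       atomically: true, encoding: .utf8)
}
```

The body parameter is where the generated widget tree from the layout step gets spliced in, which is what turns detections into a project you can run with flutter run.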

What we learned

  • How to train and export custom YOLOv8 models for CoreML
  • The power of on-device ML with the Apple Neural Engine on Arm silicon
  • Core ML and Vision framework integration patterns
  • Template-based code generation strategies
  • The critical importance of dataset selection for domain-specific ML tasks

Key insight: The model is only as good as its training data. VINS gave us clean, labeled UI screenshots — and that made all the difference.

What's next for DevSketch Mobile

  1. Expand Training Data — Train on more diverse UI styles (iOS, Material, custom designs)
  2. Hand-drawn Sketch Support — Fine-tune model on wireframes and paper sketches
  3. Multi-page Detection — Recognize navigation flows between screens
  4. More Frameworks — Support React Native and SwiftUI output
  5. Real-time Preview — Live Flutter preview directly on device
  6. Figma Integration — Export detected elements to Figma for design refinement

Built With

Core ML, Flutter, Swift, SwiftUI, Vision, YOLOv8
