Inspiration
We were inspired by the magical intersection of visual art and music, particularly Chrome Music Lab's playful approach to making music creation accessible. We wanted to build something that lets anyone be a composer without musical training, where the simple act of drawing creates a unique soundtrack.
What it does
Melody Canvas transforms your drawings into living, animated musical compositions. Draw anything with colors and shapes, then press play to watch your artwork dance to its own AI-generated soundtrack. The system analyzes your drawing's colors, patterns, and mood to automatically create matching music with appropriate instruments - warm colors become bass and drums, cool colors become bells and strings. Your strokes animate in sync with the beat, creating a mesmerizing audio-visual experience.
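To make that mapping concrete, here is a minimal TypeScript sketch of the warm/cool rule described above; the function name, hue thresholds, and instrument buckets are our own illustration, not the project's actual code.

```typescript
type Instrument = "bass" | "drums" | "bells" | "strings";

// Hypothetical mapping: convert an RGB stroke color to a hue, then bucket
// warm hues into the rhythm section and cool hues into melodic instruments.
function strokeToInstrument(r: number, g: number, b: number): Instrument {
  const max = Math.max(r, g, b);
  const min = Math.min(r, g, b);
  let hue = 0; // grayscale strokes fall through to the warm bucket
  if (max !== min) {
    if (max === r) hue = (60 * (g - b)) / (max - min) + (g < b ? 360 : 0);
    else if (max === g) hue = (60 * (b - r)) / (max - min) + 120;
    else hue = (60 * (r - g)) / (max - min) + 240;
  }
  if (hue < 90 || hue >= 330) {
    // Warm: reds become bass, oranges and yellows become drums.
    return hue < 45 || hue >= 330 ? "bass" : "drums";
  }
  // Cool: greens and cyans become bells, blues and purples become strings.
  return hue < 210 ? "bells" : "strings";
}
```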
How we built it
We built it using React with TypeScript for the frontend, p5.js for the drawing canvas, and Tone.js for music synthesis. The backend uses Express.js with OpenAI's Vision API to analyze drawings and GPT-4 to generate musical compositions. We implemented automatic instrument mapping based on color analysis, beat-synchronized animations, and a two-layer control system for intuitive interaction. The entire experience runs in the browser with real-time audio synthesis.
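As a rough sketch of that backend flow, the snippet below shows how an Express route could send a canvas snapshot to a vision-capable OpenAI model and get back a composition for the frontend to play. The route path, prompt, response shape, and `gpt-4o` model name are assumptions for illustration, not our exact implementation.

```typescript
import express from "express";
import OpenAI from "openai";

const app = express();
app.use(express.json({ limit: "10mb" })); // canvas snapshots can be large
const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Hypothetical route: one vision-capable chat call both describes the
// drawing's colors and mood and emits a composition for Tone.js.
app.post("/api/analyze", async (req, res) => {
  const { imageDataUrl } = req.body; // canvas snapshot as a data URL

  const completion = await openai.chat.completions.create({
    model: "gpt-4o", // stand-in for whichever GPT-4 vision model is used
    messages: [
      {
        role: "user",
        content: [
          {
            type: "text",
            text:
              "Analyze this drawing's colors, shapes, and mood, then return " +
              "a JSON composition: tempo, key, and per-instrument note patterns.",
          },
          { type: "image_url", image_url: { url: imageDataUrl } },
        ],
      },
    ],
  });

  res.json({ composition: completion.choices[0].message.content });
});

app.listen(3000);
```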
Challenges we ran into
Getting animations to sync properly with the music required careful coordination between Tone.js scheduling and our animation engine. We also struggled to make the AI-generated music sound natural rather than robotic; adding chord progressions, velocity variation, and timing humanization fixed it. Layout responsiveness was tricky: we had to balance cramped controls against wasted space. Finally, we simplified the experience by removing manual instrument selection, which made it more magical and accessible.
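The sketch below shows the shape of both fixes, assuming Tone.js: notes are scheduled on the transport, visuals are routed through `Tone.Draw` so they land on the same beat, and small random offsets are added to timing and velocity. The chord progression and the `animateStrokesOnBeat` hook are placeholders, not our real code.

```typescript
import * as Tone from "tone";

const synth = new Tone.PolySynth(Tone.Synth).toDestination();
const chordProgression = [
  ["C4", "E4", "G4"],
  ["A3", "C4", "E4"],
  ["F3", "A3", "C4"],
  ["G3", "B3", "D4"],
];
let step = 0;

Tone.Transport.scheduleRepeat((time) => {
  const chord = chordProgression[step % chordProgression.length];
  step++;

  // Humanization: nudge each onset by a few milliseconds and vary the
  // velocity so playback doesn't sound mechanically quantized.
  const humanTime = time + (Math.random() - 0.5) * 0.02;
  const velocity = 0.7 + Math.random() * 0.25;
  synth.triggerAttackRelease(chord, "8n", humanTime, velocity);

  // Tone.Draw fires the callback on the animation frame closest to the
  // audio event, which is what keeps strokes dancing on the beat.
  Tone.Draw.schedule(() => {
    animateStrokesOnBeat(step);
  }, time);
}, "4n");

// Hypothetical animation hook; the real one would live in the p5.js layer.
function animateStrokesOnBeat(beat: number) {
  console.log(`beat ${beat}`);
}

async function play() {
  await Tone.start(); // browsers require a user gesture to start audio
  Tone.Transport.start();
}
```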
Accomplishments that we're proud of
We created a truly zero-barrier music tool where anyone can compose just by drawing. The automatic instrument assignment based on drawing analysis feels magical, and the smooth beat-synchronized animations make drawings come alive. The AI integration produces genuinely musical compositions that match the artwork's mood, not just random notes.
What we learned
We learned that simplicity enhances creativity - removing manual controls made the experience more engaging. Synchronizing visual and audio elements requires careful timing coordination. AI can effectively bridge the gap between visual art and music when given the right prompts and analysis. User testing revealed that automatic, intelligent defaults create better experiences than excessive manual controls.
What's next for Melody Canvas
- Collaborative drawing sessions where multiple users create music together in real time
- Style transfer to generate music in specific genres (jazz, electronic, classical)
- A gallery feature to share and remix other users' musical drawings
- MIDI export so creations can be used in professional music software
- Mobile touch optimization for drawing on tablets and phones
- More sophisticated AI models for even richer musical interpretations
Built With
- Canvas API
- Drizzle ORM
- Express.js
- Framer Motion
- JavaScript
- Neon (serverless database)
- Node.js
- OpenAI GPT-4
- OpenAI Vision API
- p5.js
- PostgreSQL
- Radix UI
- React
- React Hook Form
- shadcn/ui
- Tailwind CSS
- TanStack Query
- Tone.js
- TypeScript
- Vite
- Web Audio API
- wouter
- Zod
