Inspiration
Some people perceive a connection between color and sound, a form of synesthesia known as chromesthesia. We took inspiration from that and decided to tap into the musical patterns embedded in the visible world.
What it does
Musique reads pixel-by-pixel data from an image uploaded by the user, maps that data to notes and chords in a musical scale, and plays the resulting sequence with a piano as the instrument.
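The pixel-to-note mapping can be sketched roughly like this. The scale, the brightness-based mapping, and the sampling step are illustrative assumptions, not the project's actual mapping:

```python
# Illustrative sketch of mapping pixel data to notes in a scale.
# The pentatonic scale and brightness mapping are assumptions,
# not Musique's actual implementation.

# C major pentatonic over two octaves, as MIDI note numbers.
PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76, 79, 81]

def pixel_to_note(r, g, b):
    """Map an RGB pixel to a note in the scale.

    Average brightness (0-255) picks the scale degree, so dark
    pixels play low notes and bright pixels play high notes.
    """
    brightness = (r + g + b) // 3
    index = brightness * len(PENTATONIC) // 256
    return PENTATONIC[index]

def image_to_melody(pixels, step=1000):
    """Sample every `step`-th pixel so large images stay playable."""
    return [pixel_to_note(r, g, b) for r, g, b in pixels[::step]]

print(pixel_to_note(10, 10, 10))     # dark pixel -> 60 (low)
print(pixel_to_note(250, 250, 250))  # bright pixel -> 81 (high)
```

Chords could then be formed by grouping notes from adjacent pixels or image regions.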
How we built it
We wrote the backend in Python: OpenCV analyzes the images, and Python code synthesizes the musical notes and chords. We designed a sleek frontend webpage with HTML and CSS, and used Flask and Google App Engine to serve and host it.
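A minimal sketch of how such a Flask service could be wired up. The `/upload` route, the `analyze()` helper, and its byte-to-note placeholder logic are all hypothetical, standing in for the project's actual OpenCV analysis:

```python
# Sketch of a Flask service accepting an image upload and returning
# notes. The route name and analyze() helper are illustrative
# assumptions, not Musique's actual endpoints or analysis code.
from flask import Flask, request, jsonify

app = Flask(__name__)

def analyze(image_bytes):
    # Placeholder for the OpenCV pixel analysis: map the first few
    # bytes to MIDI-style note numbers purely for illustration.
    return [60 + (b % 12) for b in image_bytes[:8]]

@app.route("/upload", methods=["POST"])
def upload():
    f = request.files.get("image")
    if f is None:
        return jsonify(error="no image uploaded"), 400
    return jsonify(notes=analyze(f.read()))

# On App Engine, the `app` object is served by the platform's WSGI
# entry point (e.g. gunicorn) rather than app.run().
```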
Challenges we ran into
Mapping pixel data to notes and chords proved quite difficult because the pre-existing musical libraries had design flaws that occasionally crashed our program. Additionally, synthesizing audio in Python requires the PyAudio library, which depends on a C library called PortAudio. Because Google App Engine does not let us install that native dependency, the audio feature works on localhost but not on the deployed webpage.
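One PortAudio-free direction (a sketch of an alternative, not the project's implementation) is to render the notes server-side into a WAV file using only the standard library, and let the browser play the file instead of the server playing the audio:

```python
# Sketch: render notes to a WAV with only the standard library,
# avoiding PyAudio/PortAudio entirely. Illustrative only.
import math
import struct
import wave

SAMPLE_RATE = 44100

def midi_to_hz(note):
    """Convert a MIDI note number to frequency in Hz (A4 = 69 = 440 Hz)."""
    return 440.0 * 2 ** ((note - 69) / 12)

def render_wav(notes, path, duration=0.3, amplitude=0.4):
    """Write a mono 16-bit WAV playing each note as a sine tone."""
    frames = bytearray()
    for note in notes:
        freq = midi_to_hz(note)
        for i in range(int(SAMPLE_RATE * duration)):
            sample = amplitude * math.sin(2 * math.pi * freq * i / SAMPLE_RATE)
            frames += struct.pack("<h", int(sample * 32767))
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(bytes(frames))

render_wav([60, 64, 67], "arpeggio.wav")  # C major arpeggio
```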
Accomplishments that we're proud of
All of us were fairly new to sound synthesis and frontend programming, so the fact that we have a product fit to demonstrate is an achievement in itself.
What we learned
We all gained experience with frontend design and web programming using HTML and CSS. We also learned how to host our own web services.
What's next for Musique
We are still looking for a workaround for the PortAudio error on Google App Engine. We also hope to implement a user review system whose feedback trains an ML algorithm to improve the harmony and quality of the melodies the program produces.

