💡 Inspiration 💡
Some of our team members have played musical instruments in the past. They know the pain of stopping or pausing mid-piece to flip their sheet music. Taking a hand off the instrument to turn the page is not ideal, since both hands should stay on the instrument at all times. Since the eyes are already on the sheet music to read it, why not use the eyes to turn the pages instead, and keep our hands on the instrument?
❓ What it does ❓
TuneTurn tracks the angle of the musician's head using OpenCV to detect whether the head is turned to the left or right, and flips the page of the sheet music accordingly, allowing for a hands-free experience. The app also includes a library of stored music that the user can browse and listen to, getting a better idea of how to actually play the song of their choice before they start.
🤔 How we built it 🤔
Using computer vision (OpenCV) in an app deployed on a Qualcomm HDK 8450, we track the movement of the head and trigger a page change based on which way the head is turned, determined from the angle formed by facial landmarks. We then send a request to a Flask server, making sure only one request is sent even though the user's head stays in the turned position for a few hundred milliseconds. The server processes the request and flips the sheet accordingly, using pyautogui to press the computer's left/right arrow keys.
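The "send only one request while the head stays turned" behavior is an edge-trigger: a flip fires only on the transition into a turned state, not on every frame. A minimal sketch of that logic (the state names, threshold, and endpoint URL are our illustration, not the project's exact code):

```python
LEFT, CENTER, RIGHT = "left", "center", "right"

def classify_turn(angle_deg, threshold=15.0):
    """Map a head-yaw angle (degrees) to a turn state.
    The 15-degree threshold is an assumed value for illustration."""
    if angle_deg > threshold:
        return RIGHT
    if angle_deg < -threshold:
        return LEFT
    return CENTER

class PageFlipDebouncer:
    """Emit a flip event only on the CENTER -> turned transition,
    so a head held turned across many frames sends a single request."""
    def __init__(self):
        self.last_state = CENTER

    def update(self, state):
        flip = None
        if state != CENTER and self.last_state == CENTER:
            # In the app, this is where the Flask request would go, e.g.
            # requests.post(f"http://server:5000/flip/{state}")  # hypothetical endpoint
            flip = state
        self.last_state = state
        return flip
```

On the server side, the corresponding Flask handler would call `pyautogui.press("left")` or `pyautogui.press("right")` depending on the direction in the request.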
😰 Challenges we ran into 😰
It was our first time using OpenCV, and we struggled to accurately track the coordinates of the desired landmarks on the user's face, since choosing the wrong ones would make it difficult to reliably detect whether the head is turned. Another major challenge was using Android Studio to run the ML on the Qualcomm hardware, as we had never used it before. We primarily code in Python and found the switch to Java quite difficult, especially since we had to do relatively complex ML work with OpenCV.
🥇 Accomplishments that we're proud of 🥇
We are very proud of building an accurate OpenCV pipeline by choosing good landmarks and calculating the angle between them to detect whether, and in which direction, the user's head is turned. We are also proud of making very different platforms interact with each other, and of hosting our app on a completely different device, the Qualcomm HDK 8450, instead of the laptops we are used to.
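To illustrate the landmark-angle idea: when the head turns, the nose tip shifts horizontally relative to the midpoint between the eye corners. A toy version of that computation (the specific landmarks, the sign convention, and the threshold are assumptions for illustration; with a mirrored camera feed the left/right labels may need swapping):

```python
import math

def head_yaw_degrees(left_eye, right_eye, nose_tip):
    """Rough yaw estimate from three (x, y) landmarks: normalize the
    nose tip's horizontal offset from the eye midpoint by the eye span."""
    mid_x = (left_eye[0] + right_eye[0]) / 2.0
    eye_span = abs(right_eye[0] - left_eye[0])
    offset = (nose_tip[0] - mid_x) / eye_span  # roughly in [-1, 1]
    offset = max(-1.0, min(1.0, offset))
    return math.degrees(math.asin(offset))

def turn_direction(angle_deg, threshold=15.0):
    """Classify the yaw angle; None means the head is facing forward."""
    if angle_deg > threshold:
        return "right"
    if angle_deg < -threshold:
        return "left"
    return None
```

In practice the landmarks would come from a face-tracking model such as MediaPipe Face Mesh, which is in our stack, and the threshold would be tuned so normal reading movements do not trigger a flip.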
🎼 What we learned 🎼
We learned how to build effective mobile apps using Android Studio and how to power our app with unfamiliar hardware. We took a large step out of our comfort zone and experimented with new technologies, allowing us to make great progress in a short time. We also distributed tasks carefully to keep our workflow streamlined.
😮 What's next for TuneTurn 😮
We are planning to make our feature that converts recorded music into sheet-music notes more accurate, so we can better compare the user's playing with the original piece. To do this, we might build and train our own model, and if that proves too hard we will look for a better API.
Built With
- android-studio
- flask
- java
- mediapipe
- opencv
- python
- qualcomm-hdk-8450

