Inspiration
The inspiration behind Rythmiq starts with the theme of UGA Hacks X: Rock and Roll. We knew we wanted to use an AI model to solve a problem, and what better problem in rock and roll than drumming? Though rarely the focus, a rock band is driven by its drummer, so having a good one is key. So we built Rythmiq, the perfect AI drummer companion, to always output the best beats and keep the flow going.
What it does
Rythmiq takes a MIDI stream as input and outputs a drum beat. Under the hood, Rythmiq is trained on a large collection of drumming patterns paired with their corresponding melodies. This gives Rythmiq the best chance of analyzing your melody and producing a drum pattern that blends seamlessly with it.
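To make the melody-in, drums-out idea concrete, here is a minimal sketch (not Rythmiq's actual model, which is learned): it quantizes melody note onsets onto a 16-step bar and lays a basic rock pattern over them. The function name and the fixed kick/snare/hi-hat placements are our own illustrative choices.

```python
# Illustrative sketch only -- Rythmiq's real output comes from a trained
# model, not hand-written rules. Here we quantize melody onsets into a
# 16-step bar and overlay a stock rock beat: kick on beats 1 and 3,
# snare on 2 and 4, hi-hat on every eighth note.

def melody_to_drum_grid(onset_times, bpm=120, steps=16):
    """Map melody note-onset times (seconds) to a 16-step drum grid."""
    step_len = (60.0 / bpm) / 4          # sixteenth-note duration in seconds
    grid = {"kick": [0] * steps, "snare": [0] * steps, "hat": [0] * steps}
    for i in range(0, steps, 2):
        grid["hat"][i] = 1               # eighth-note hi-hats
    for i in (0, 8):
        grid["kick"][i] = 1              # kick on beats 1 and 3
    for i in (4, 12):
        grid["snare"][i] = 1             # snare on beats 2 and 4
    # Add a hat hit wherever the melody actually plays a note,
    # so the drums follow the melody's rhythm.
    for t in onset_times:
        step = int(round(t / step_len)) % steps
        grid["hat"][step] = 1
    return grid

pattern = melody_to_drum_grid([0.0, 0.25, 1.0, 1.5])
print(pattern["kick"])   # kick lands on steps 0 and 8
```

A learned model replaces these fixed rules with patterns inferred from training data, but the input/output shape is the same: note onsets in, a per-instrument step grid out.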
How we built it
Rythmiq is built on top of a fine-tuned Magenta MusicVAE model that generates the drum lines. Since every member of our group owned a GPU, we pooled our processing power to fine-tune the Magenta LSTM model on over 60,000 song MIDI files from the Lakh MIDI Dataset and the Million Song Dataset. We ran the distributed training through WSL2 with Conda, CUDA, and cuDNN. We also leveraged HuggingFace to build a small-scale transformer model.
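For reference, the Magenta side of this pipeline roughly follows the standard MusicVAE workflow: convert a directory of MIDI files into NoteSequence records, then launch training against them. The paths and the exact config name below are placeholders, not our actual run settings.

```shell
# Convert a folder of MIDI files into TFRecord NoteSequences
# (paths are illustrative).
convert_dir_to_note_sequences \
  --input_dir=midi_data/ \
  --output_file=notesequences.tfrecord \
  --recursive

# Fine-tune a MusicVAE drum config on those sequences;
# the config name here is one of Magenta's stock drum configs.
music_vae_train \
  --config=cat-drums_2bar_small \
  --run_dir=./vae_run/ \
  --mode=train \
  --examples_path=notesequences.tfrecord
```

Running this across several machines is where WSL2, Conda, CUDA, and cuDNN version-matching came in.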
Challenges we ran into
We encountered many setbacks on our journey. We initially planned to build the entire project using transformers and HuggingFace, but we realized we did not have the VRAM required to run the training algorithm even with our combined GPUs. Our biggest hiccup was figuring out how to set up the environment for distributed, GPU-accelerated training: Magenta depends on very dated libraries and has sparse documentation, making it tough to find the right library versions required to run the appropriate scripts.
Accomplishments that we're proud of
We originally believed this would be a very ambitious project, especially since none of us had prior experience creating or training complex AI models. Even though much of our time was spent on a product we couldn't deliver, we still think our end result is better than anything we could have imagined.
What we learned
Our main takeaway from this hackathon is the experience we gained developing various AI models. We became proficient with technologies such as CUDA, NumPy, pretty_midi, and more. This project opened our eyes to pursuing similar projects that we would've been too intimidated to attempt before.
What's next for Rythmiq
Our immediate next step for Rythmiq is adding the ability to generate a background bass track to accompany the drums. In the long term, we'd love to keep building out your virtual band, all in one web app!
Built With
- auth0
- css
- javascript
- magenta
- next.js
- typescript