Inspiration

According to the National Institutes of Health, about 35% of people over 70 and nearly all people over 85 experience mobility limitations that make it harder to complete daily tasks with confidence and ease [1]. This loss of mobility significantly reduces elders’ quality of life and puts them at increased risk of injury. The NIH suggests tackling it with low-impact, steady-state physical activity. What better way to do that than dance!

Our dance game generates a choreography representation from a source video, then compares the user’s movement and form against the original. This works not only for dance videos but also for weight-lifting and other fitness videos where form and physical movement are key.

Dancing games break expectations for user interfaces: there are no checkboxes or dropdowns; you Just Moove! We designed the platform to be as accessible as possible, pairing a high-contrast UI with clutter-free frames so players can use it to its fullest potential. The scoring component adds an addictive element that keeps the game engaging and fun!

[1] Freiberger, Ellen, et al. “Mobility in Older Community-Dwelling Persons: A Narrative Review.” Frontiers in Physiology, vol. 11, 881, 15 Sep. 2020, doi:10.3389/fphys.2020.00881.

What it does

Just Moove! first asks the player to pick a video from a library. While playing, players see themselves alongside the video, both overlaid with a “skeleton” used to track and compare their movements. The game then generates a score by tracking their overall accuracy and form.

How we built it

Just Moove! is powered by an AI algorithm trained to identify and scale human features; this is what produces the “skeletons.” On top of that algorithm, we designed a scoring system rooted in min-max normalization and cosine similarity. To make our game extendable to other videos on the internet, we also developed a pipeline for scraping, trimming, uploading, and tracing videos with the same algorithm.
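The scoring idea can be sketched in a few lines: min-max normalization makes the comparison invariant to where the player stands and how large they appear on camera, and cosine similarity then compares the normalized poses. This is a minimal illustration, not our exact implementation; the landmark coordinates and function names here are made up for the example.

```python
import numpy as np

def min_max_normalize(points):
    """Scale an (N, 2) array of landmark coordinates into [0, 1] per axis,
    so body size and position on screen do not affect the score."""
    mins = points.min(axis=0)
    maxs = points.max(axis=0)
    span = np.where(maxs - mins == 0, 1.0, maxs - mins)  # avoid divide-by-zero
    return (points - mins) / span

def pose_similarity(player_pose, reference_pose):
    """Cosine similarity between two flattened, normalized poses (1.0 = identical)."""
    a = min_max_normalize(np.asarray(player_pose)).ravel()
    b = min_max_normalize(np.asarray(reference_pose)).ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# A pose matches itself even when shifted and scaled on screen:
pose = np.array([[0.1, 0.2], [0.4, 0.9], [0.7, 0.3]])
shifted_scaled = pose * 2.0 + 5.0
print(round(pose_similarity(pose, shifted_scaled), 3))  # → 1.0
```

Because min-max normalization cancels any positive scaling and translation, a player who mirrors the choreography from the back of the room scores the same as one standing close to the camera.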

Challenges we ran into

We wanted to accelerate our game with a GPU, but learned that our underlying ML library, MediaPipe, was not compiled with GPU support. We changed gears and optimized the pose-fitting inference for CPUs: we minimized the number of “landmark” points identified by the model and saved work by processing the source videos before any real-time movement begins.
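Those two CPU optimizations can be sketched as follows. The landmark subset below is an illustrative assumption (the indices follow MediaPipe Pose’s 33-landmark layout, but the exact joints we keep are not specified here), and the 30 fps lookup rate is likewise an assumption:

```python
import numpy as np

# Illustrative subset: track only the joints that matter for scoring.
# These indices mirror MediaPipe Pose's 33-landmark layout (shoulders,
# elbows, hips, knees); the exact subset is an assumption for this sketch.
KEY_LANDMARKS = [11, 12, 13, 14, 23, 24, 25, 26]

def keep_key_landmarks(all_landmarks):
    """Drop landmarks that don't affect the score, shrinking per-frame work."""
    return np.asarray(all_landmarks)[KEY_LANDMARKS]

def reference_pose_at(precomputed_poses, t_seconds, fps=30):
    """The source video is traced once, offline; at game time a cheap
    array lookup by playback timestamp replaces a second model inference."""
    idx = min(int(t_seconds * fps), len(precomputed_poses) - 1)
    return precomputed_poses[idx]

# Simulated precomputed reference: 90 frames of 33 (x, y) landmarks each.
full = np.random.rand(90, 33, 2)
reference = np.stack([keep_key_landmarks(f) for f in full])
print(reference.shape)                           # → (90, 8, 2)
print(reference_pose_at(reference, 1.5).shape)   # → (8, 2)
```

With the reference poses precomputed, the real-time loop only ever runs one pose inference per frame (on the webcam feed), which is what keeps the game playable on older laptops.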

Accomplishments that we're proud of

We achieved real-time motion tracking, even on older laptops! We built a clean, accessible user interface. We devised a scoring system that visibly incentivizes movement matching the source video. The end product is largely a self-contained, complete experience, and we think there is real room for social benefit.

What we learned

Figma, OpenCV, pygame, user interaction, troubleshooting GPU hardware acceleration, motion capture paradigms

What's next for Just Moove!

We want to allow users to upload any video they like, and we got really close to achieving this. Speed control would also be a useful feature, and the skeleton rendering is somewhat basic and could be made to look nicer.
