Inspiration:
After experiencing the pain of not being able to communicate my range-of-motion progression after shoulder surgery, I knew rehab patients would greatly benefit from a simple tracking system that quantifies joint dynamics. As graduate biomedical engineering candidates, my teammate and I hacked together a cheap, effective method of tracking joints that doesn't require expensive, bulky sensors.
What it does:
Leveraging advancements in computer vision and the introduction of depth cameras into new smartphones (e.g., the iPhone X), we track patients' joint movements and provide real-time, actionable insights to guide them through at-home physical therapy exercises. In parallel, the data is analyzed and streamed to their therapists, who can monitor each patient's progression throughout the rehabilitation journey. This empowers therapists with the analytical tools they need to prioritize their time during therapy sessions.
Unlike the commercial body-tracking solutions on the market today, our approach is a software-only solution that requires no extra sensors or hardware, which significantly lowers the barriers to patient adoption.
How we built it:
We looked for simple techniques that could detect and track body pose or markers using hardware already available at home, such as a mobile phone or a laptop. After hours of searching, tracking.js, a JavaScript library that brings OpenCV-style computer vision to the browser, stood out for its cross-platform (web-based) compatibility and its accurate, frame-by-frame color tracking.
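As a rough illustration, tracking.js lets a color tracker be wired to a webcam feed in a few lines; the element id below is a placeholder, not our actual markup:

```javascript
// Minimal sketch of color-marker detection with tracking.js.
// '#webcam' is a placeholder id for a <video> element on the page.
var tracker = new tracking.ColorTracker(['magenta', 'cyan', 'yellow']);

tracker.on('track', function (event) {
  // event.data holds one bounding box per color blob found in the current frame.
  event.data.forEach(function (rect) {
    console.log(rect.color, 'marker at', rect.x, rect.y, rect.width, rect.height);
  });
});

// Stream frames from the user's camera into the tracker.
tracking.track('#webcam', tracker, { camera: true });
```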
We used colored stickers as markers, placed them at keypoints (joints), and tracked them with a web camera. Neighboring keypoints were then connected to compute the joint angle using kinematics equations. We calculated the real-time joint angle and angular velocity as references for the user and plotted both variables in a dynamic graph powered by the canvasjs library.
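A simplified sketch of that geometry is below; the function names, variable names, and the "chartContainer" div id are illustrative rather than our exact implementation:

```javascript
// Joint angle at the middle marker (e.g., the elbow) from three tracked marker centers,
// each given as {x, y} pixel coordinates.
function jointAngle(proximal, joint, distal) {
  var v1 = { x: proximal.x - joint.x, y: proximal.y - joint.y };
  var v2 = { x: distal.x - joint.x, y: distal.y - joint.y };
  var dot = v1.x * v2.x + v1.y * v2.y;
  var mags = Math.hypot(v1.x, v1.y) * Math.hypot(v2.x, v2.y);
  return Math.acos(dot / mags) * 180 / Math.PI; // degrees
}

// Angular velocity as a finite difference between consecutive frames.
var lastAngle = null;
var lastTime = null;
function angularVelocity(angleDeg, timeMs) {
  var omega = lastAngle === null ? 0 : (angleDeg - lastAngle) / ((timeMs - lastTime) / 1000);
  lastAngle = angleDeg;
  lastTime = timeMs;
  return omega; // degrees per second
}

// Live plot with CanvasJS: push new points and re-render on every frame.
var anglePoints = [];
var chart = new CanvasJS.Chart("chartContainer", {
  data: [{ type: "line", dataPoints: anglePoints }]
});
function plotAngle(timeSec, angleDeg) {
  anglePoints.push({ x: timeSec, y: angleDeg });
  chart.render();
}
```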
Challenges we ran into:
The initial idea was to use the ARKit platform and the iPhone X's TrueDepth front camera to track the joints in 3D space. However, it turned out that Apple only provides a face-tracking API, and developing and training our own models would have required a much longer timeframe.
We quickly pivoted to a web-based computer vision platform and used geometric techniques to quantify joint angles and angular velocity in real time. The markers were simple RGB colors that the camera could track.
Adjusting the digital color meter and calibrating the camera to filter out background noise that could interfere with marker tracking was also challenging and required multiple iterations.
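One way to make that calibration concrete in tracking.js is to register a custom marker color with an explicit tolerance; the RGB reference values and tolerance below are placeholders, not the exact values we converged on:

```javascript
// Register a custom marker color so background pixels of a similar hue are rejected.
// The reference RGB values (sampled with a digital color meter) and tolerance are illustrative.
tracking.ColorTracker.registerColor('markerGreen', function (r, g, b) {
  var tol = 40; // tighten or loosen after checking the marker's values under room lighting
  return Math.abs(r - 30) < tol && Math.abs(g - 180) < tol && Math.abs(b - 60) < tol;
});

var tracker = new tracking.ColorTracker(['markerGreen']);
```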
Accomplishments that we're proud of:
It feels great to showcase a proof-of-concept MVP that offers patients a practical, data-driven therapy guide. Patients can now have higher confidence in the efficacy of their at-home exercises and communicate their progress to their therapists more objectively.
What we learned:
We built the entire codebase in JavaScript. Since none of us had prior JavaScript experience, we enjoyed quickly learning the language features and library functions needed to implement our idea.
What's next for EzTherapy:
- Porting to other platforms (e.g., ARKit, to interface with the iPhone X's depth-sensing camera)
- Leveraging deep learning to classify postures (e.g., with TensorFlow, Caffe, etc.)
- Testing and validating the methodology through partnering with clinicians
Built With
- canvas.js
- javascript
- opencv
- tracking.js
