Inspiration

We built this app because people always know exactly where they feel pain—but explaining it can be hard. By letting patients point out discomfort in augmented reality before they even see a doctor, we can reduce wait times, make pain descriptions more accurate, and give healthcare workers a head start, all without invasive tests or guesswork.

What it does

PainAR lets users show exactly where on their body the discomfort is, circling the painful area and indicating how it has spread. Users can then interact with our chatbot to describe the pain's intensity, depth, and other qualities. Because entries are stored in a log, users can easily refer back and update their symptoms for a more detailed report.
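As a sketch of what a stored log entry might look like, here is a minimal TypeScript shape for one pain record plus a helper that keeps the log sorted newest-first. The field names and types are our assumptions for illustration, not the app's actual schema:

```typescript
// Hypothetical shape of one pain log entry (names are assumptions).
interface PainEntry {
  bodyPart: string;            // e.g. "left shoulder"
  intensity: number;           // 1-10 scale reported via the chatbot
  depth: "surface" | "deep";   // how deep the pain feels
  spread: boolean;             // whether the pain has spread since last entry
  recordedAt: string;          // ISO 8601 timestamp
}

// Append a new entry and return the log sorted newest-first,
// so the latest symptoms are the easiest to refer back to.
function addEntry(log: PainEntry[], entry: PainEntry): PainEntry[] {
  return [...log, entry].sort(
    (a, b) => b.recordedAt.localeCompare(a.recordedAt)
  );
}
```

Keeping entries immutable and sorted makes it straightforward to diff consecutive records when generating a symptom report.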

How we built it

We built a React Native frontend and a separate backend, and ran the app in an Android Studio emulator during development.
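A minimal sketch of how a React Native frontend running in the Android emulator could talk to a backend on the development machine. The endpoint path and payload fields are assumptions for illustration; the `10.0.2.2` address is the Android emulator's standard alias for the host machine's localhost:

```typescript
// Assumed base URL: 10.0.2.2 is how the Android emulator reaches
// a backend running on the host machine's localhost.
const API_BASE = "http://10.0.2.2:3000";

// Hypothetical report payload (fields are assumptions).
interface PainReport {
  bodyPart: string;
  intensity: number;
}

// Build a POST request that the frontend could send to the backend.
function buildReportRequest(report: PainReport): {
  url: string;
  options: { method: string; headers: Record<string, string>; body: string };
} {
  return {
    url: `${API_BASE}/reports`,
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(report),
    },
  };
}
```

The returned `url` and `options` would be passed straight to `fetch(url, options)` from the React Native side.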

Challenges we ran into

The primary challenge we ran into was getting React Native to work with MediaPipe so we could track the user's body parts in real time. We also struggled with the initial React Native setup, since we were all new to the technology.
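Part of the difficulty is that MediaPipe pose landmarkers report landmarks as coordinates normalized to [0, 1] relative to the camera frame, so drawing a pain circle over the camera preview requires scaling them to screen pixels. A minimal sketch of that conversion (the types here are ours, not MediaPipe's):

```typescript
// Normalized landmark as produced by a pose tracker: both axes in [0, 1].
interface NormalizedLandmark {
  x: number; // 0..1, left to right across the camera frame
  y: number; // 0..1, top to bottom
}

interface ScreenPoint {
  x: number; // pixels
  y: number; // pixels
}

// Scale a normalized landmark to the preview view's pixel coordinates,
// e.g. to center a drawn pain circle on a tracked body part.
function toScreen(
  landmark: NormalizedLandmark,
  viewWidth: number,
  viewHeight: number
): ScreenPoint {
  return {
    x: landmark.x * viewWidth,
    y: landmark.y * viewHeight,
  };
}
```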

Accomplishments that we're proud of

We are proud of being able to link a React Native frontend with a backend given that we were all inexperienced with React Native.

What we learned

We learned that getting Expo to handle workloads from more complex native libraries such as Vision Camera and MediaPipe causes a lot of headache-inducing problems.
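One concrete source of friction with libraries like Vision Camera under Expo is that they ship native code, so their Expo config plugin has to be registered in `app.json` before prebuilding. A sketch of what that fragment can look like (the permission text here is a placeholder):

```json
{
  "expo": {
    "plugins": [
      [
        "react-native-vision-camera",
        {
          "cameraPermissionText": "PainAR needs camera access to track your body position."
        }
      ]
    ]
  }
}
```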

What's next for PainAR

The next step for PainAR would be to generate a summarized report with AR visualization for doctors to review before, during, and after appointments. This could streamline communication between doctors and patients, cutting down on hospital wait times and providing more accurate descriptions of pain for better relief.
