Inspiration
Skin cancer is one of the most common types of cancer, yet it can easily go unnoticed while the risk of harm grows. It can also be hard to tell whether a suspicious spot is skin cancer at all, and if so, what kind. This is where our app SkinScope comes into play.
What it does
SkinScope is a skin cancer detector that analyzes a photo of a skin lesion, suggests what type of cancer it may be, and immediately provides information about it. If someone suspects that an area of their skin may be cancerous, getting an initial analysis is quick and easy compared to ignoring it, waiting for a doctor's appointment, or trying to make sense of confusing information online. SkinScope also has a built-in AI chat tool that can provide further details about skin cancer, helping users understand their personal situation and find additional assistance.
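As a rough sketch of how such a chat assistant could be wired up, the snippet below sends a user question to a hosted LLM with a skin-cancer-focused system prompt. The provider, model name, and prompt are illustrative assumptions; the submission does not say which chat backend SkinScope uses.

```python
# Minimal sketch of the chat assistant. The LLM provider, model name, and
# system prompt are assumptions for illustration only.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "You are SkinScope's assistant. Explain skin cancer types, risk factors, "
    "and next steps in plain language, and always recommend seeing a doctor "
    "for an actual diagnosis."
)


def ask_skinscope(question: str) -> str:
    """Send a user question to the chat model and return its reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content


print(ask_skinscope("What does a malignant melanoma result mean?"))
```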
How we built it
We built SkinScope using cutting-edge AI technologies to ensure accurate and efficient functionality. For image analysis, we used YOLO, a powerful real-time object detection framework, to capture and identify skin diseases and cancers in real time. To support the diagnostic process, we trained two AI models with TensorFlow: a detection model to locate areas of concern on the skin and a classification model to label those regions as benign or malignant. We trained on a comprehensive dataset of labeled skin images and applied data augmentation techniques to improve model accuracy and generalization. The integration of YOLO for precise detection and TensorFlow for robust classification gave us an efficient and reliable diagnostic pipeline. Finally, we combined these models with an intuitive user interface, enriching the experience with detailed cancer information, treatment options, and inspiring survivor stories, making SkinScope both a powerful tool and a source of hope.
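The sketch below shows the shape of that detection-then-classification pipeline: YOLO proposes lesion bounding boxes, and a TensorFlow classifier labels each cropped region as benign or malignant. The weight files, class names, and input size are placeholders for illustration, not our exact artifacts.

```python
# Two-stage pipeline sketch: YOLO detection followed by TensorFlow classification.
# File names, class labels, and input size below are assumed placeholders.
import numpy as np
import tensorflow as tf
from ultralytics import YOLO

# Stage 1: a YOLO model fine-tuned to localize suspicious skin lesions.
detector = YOLO("lesion_detector.pt")  # hypothetical fine-tuned weights

# Stage 2: a Keras classifier with a two-class softmax output (benign/malignant).
classifier = tf.keras.models.load_model("lesion_classifier.keras")  # hypothetical
CLASS_NAMES = ["benign", "malignant"]
INPUT_SIZE = (224, 224)  # assumed classifier input size


def analyze(image_path: str):
    """Detect lesions in an image, then classify each detected region."""
    findings = []
    for result in detector(image_path):      # run YOLO detection
        img = result.orig_img                # original image as a NumPy array (BGR);
                                             # channel order must match classifier training
        for box in result.boxes:
            x1, y1, x2, y2 = map(int, box.xyxy[0].tolist())
            crop = img[y1:y2, x1:x2]                         # cut out the detected region
            crop = tf.image.resize(crop, INPUT_SIZE) / 255.0  # resize and scale to [0, 1]
            probs = classifier.predict(crop[tf.newaxis, ...], verbose=0)[0]
            findings.append({
                "box": (x1, y1, x2, y2),
                "label": CLASS_NAMES[int(np.argmax(probs))],
                "confidence": float(np.max(probs)),
            })
    return findings


print(analyze("example_photo.jpg"))
```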
Challenges we ran into
While building SkinScope, we ran into several challenges. One major problem was the long training time for our AI models, since training deep learning models on large datasets is computationally intensive. Given the time constraints of the hackathon, this forced us to optimize our workflows and prioritize key functionalities. Despite our efforts, the models' accuracy was not as high as we had hoped, primarily because of the limited time for fine-tuning and testing. Balancing the need for precision with the fast-paced demands of the event required creative problem-solving and adaptability. These challenges pushed us to focus on building a scalable foundation that could be improved further after the hackathon.
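For context, one common way to shorten training under hackathon time pressure is transfer learning: freezing a pretrained backbone and training only a small classification head. The snippet below is a generic sketch of that idea (dataset layout, image size, and hyperparameters are assumed), not our exact training code.

```python
# Generic transfer-learning sketch: a frozen pretrained backbone plus a small
# trainable head trains far faster than a network trained from scratch.
# Dataset path, image size, and hyperparameters are illustrative assumptions.
import tensorflow as tf

IMG_SIZE = (224, 224)

base = tf.keras.applications.MobileNetV2(
    input_shape=IMG_SIZE + (3,), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the backbone to save training time

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMG_SIZE + (3,)),
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1),  # MobileNetV2 expects [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),      # benign vs. malignant
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Assumes data/train/benign and data/train/malignant subfolders of labeled images.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=IMG_SIZE, batch_size=32
)
model.fit(train_ds, epochs=5)
```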
Accomplishments that we're proud of
We are incredibly proud of the innovative solution we built in such a short time. One of our key accomplishments was successfully integrating YOLO for real-time skin lesion detection, allowing us to accurately pinpoint areas of concern. Coupled with TensorFlow, we developed both a detection AI and a classification AI, providing a comprehensive solution for diagnosing potential skin cancer. Despite the challenges, we managed to create a functional prototype that can detect lesions, classify them, and provide detailed cancer information and treatment options. Additionally, we included inspiring survivor stories, offering both hope and education to users. The seamless user experience, which combines technology, education, and empathy, is something we’re truly proud of. Most importantly, we created a foundation that has the potential to make a real impact in early cancer detection and patient support.
What we learned
Throughout the development of SkinScope, we learned many important lessons. First, we gained a deeper understanding of the complexities involved in training AI models for medical applications. The challenge of handling large datasets and fine-tuning models for accurate predictions highlighted the importance of data quality and extensive testing. We also learned the importance of collaboration under tight time constraints, as our team had to balance development, optimization, and presentation while staying focused on delivering a functional prototype. Additionally, integrating YOLO and TensorFlow taught us the intricacies of combining detection and classification models in a single workflow, and the need to continuously improve models for higher accuracy. Finally, we realized how essential it is to blend technology with empathy by creating a product that not only provides diagnostic insights but also offers educational value and emotional support to users facing a cancer diagnosis. These lessons have not only helped us build SkinScope but will also shape our future work in AI and healthcare innovation.
What's next for SkinScope
Looking ahead, we aim to expand SkinScope by incorporating features that further support users in their cancer journey. One of the key additions will be the integration of treatment location data—providing users with a map of nearby hospitals, clinics, and treatment centers. We also plan to include doctor locations and reviews, so users can find trusted healthcare professionals with ease. By offering personalized, location-based resources alongside our detection and educational features, we hope to make the entire process more accessible and empowering. These future updates will help make SkinScope not only a diagnostic tool but also a comprehensive support system for users navigating the complexities of cancer treatment.
Built With
- css
- gsap
- html
- javascript
- next
- python
- react
- react-native
- spline
- tensorflow
- yolo

