Inspiration

Melanoma is the most dangerous form of skin cancer, with roughly 250,000 cases diagnosed each year and over 30,000 annual deaths. A key step in current melanoma therapy is identifying and treating the disease early: patient survival rises drastically when surface abnormalities are caught sooner, with a 98% five-year survival rate for Stage I detection.

What it does

To use Delve, the user takes pictures of any concerning skin abnormalities. The images are processed and combined with the user's personal information to return a health report. The report includes the likelihood that the given skin lesion is malignant, notes on any growth since previous scans, and whether the patient should seek further guidance.

How we built it

We built Delve's backend around a customized TensorFlow image-classification network: a custom-built nine-layer convolutional neural network, which we trained on Google Cloud Platform to speed up computation. The mobile front end was written in Android Studio.
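The exact architecture is not spelled out here, but a nine-layer Keras CNN for binary lesion classification might look like the sketch below. The input resolution, layer widths, and optimizer are illustrative assumptions, not Delve's actual values:

```python
import tensorflow as tf
from tensorflow.keras import layers

def build_model(input_shape=(128, 128, 3)):
    """Hypothetical nine-layer CNN sketch; all sizes are assumptions."""
    model = tf.keras.Sequential([
        tf.keras.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),   # 1
        layers.MaxPooling2D(),                                     # 2
        layers.Conv2D(64, 3, activation="relu", padding="same"),   # 3
        layers.MaxPooling2D(),                                     # 4
        layers.Conv2D(128, 3, activation="relu", padding="same"),  # 5
        layers.MaxPooling2D(),                                     # 6
        layers.Flatten(),                                          # 7
        layers.Dense(128, activation="relu"),                      # 8
        layers.Dense(1, activation="sigmoid"),                     # 9: P(malignant)
    ])
    model.compile(optimizer="adam",
                  loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model
```

A sigmoid output suits the report's "likelihood of malignancy" framing: the network emits a single probability that can be thresholded or shown to the user directly.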

Challenges we ran into

One of the main challenges we ran into was git, our version control system. Git is an incredible tool, but it can throw unfamiliar problems at first-time users. We found this out the hard way: about halfway through the hackathon we couldn't find our main app files and had to rewrite everything. Additionally, we struggled with the intricacies of mobile app development at first because of our limited background in app design.

Accomplishments that we're proud of

Finishing this project has been a fantastic accomplishment for our team. We are particularly proud of our work on the CNN due to the difficulties associated with limited available data and limited compute power.

What we learned

As we developed Delve, we learned practical implementation details and best practices for designing our neural network. Initially, we found it difficult to balance deeper, larger, and more powerful networks against the computational constraints of our local devices; Google Cloud Platform helped resolve that tradeoff.
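The tradeoff between network size and compute can be made concrete with a quick parameter count. A Conv2D layer with a k×k kernel, c_in input channels, and c_out filters holds (k·k·c_in + 1)·c_out weights and biases, so widening every layer roughly quadruples the cost (the layer sizes below are illustrative, not Delve's):

```python
def conv_params(kernel, c_in, c_out):
    """Weights + biases for one Conv2D layer: (k*k*c_in + 1) * c_out."""
    return (kernel * kernel * c_in + 1) * c_out

# Doubling both channel counts of a 3x3 conv layer:
narrow = conv_params(3, 32, 64)    # (9*32 + 1) * 64  = 18,496
wide   = conv_params(3, 64, 128)   # (9*64 + 1) * 128 = 73,856
```

Counts like these add up quickly across layers, which is why training a deeper network locally was impractical and cloud GPUs became attractive.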

What's next for Delve

One improvement we would like to implement in a future version of Delve is using previous scans when predicting the current risk of melanoma. Currently, our app simply notes when a lesion has grown, but does not correlate this growth with a risk factor for the patient. With this addition to our model, we would be able to assess the patient's risk more accurately.
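One simple way to realize this would be a post-processing step that boosts the classifier's output when a lesion grows quickly between scans. The function below is purely a hypothetical sketch: the weighting, cap, and area-based growth measure are invented for illustration, not a validated clinical rule:

```python
def adjusted_risk(model_prob, prev_area_mm2, curr_area_mm2, months_elapsed):
    """Hypothetical sketch: nudge the CNN's malignancy probability upward
    in proportion to the lesion's monthly growth rate. All constants are
    illustrative assumptions, not clinically validated values."""
    if months_elapsed <= 0 or prev_area_mm2 <= 0:
        return model_prob  # no usable history; fall back to the raw output
    growth_rate = (curr_area_mm2 - prev_area_mm2) / prev_area_mm2 / months_elapsed
    boost = min(max(growth_rate, 0.0) * 0.5, 0.3)  # ignore shrinkage, cap the bump
    return min(model_prob + boost, 1.0)
```

For example, a lesion that doubled in area over two months would raise a 0.20 model score to 0.45, while an unchanged lesion would leave the score untouched.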
