Inspiration
Seeing the registry of deaf clubs and collaboratives at UF inspired our group to pursue solutions that truly fit the needs of the blind.
What it does
The app consists of two parts. The first is a calendar chat bot that can be operated entirely through sound and touch: it takes in information about your schedule, events, and tasks for the week and reminds you throughout the week when events are coming up. The second part gives blind students independent mobility across campus. Using YOLO computer vision, we can accurately identify a variety of obstacles in a user's path, such as cars, chairs, and other people, and the app notifies the user when a person enters their radius. It can also judge when a user can safely cross a street by tracking a car's apparent size (its size relative to the frame of reference).
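The frame-of-reference idea above can be sketched as a simple heuristic: a car whose bounding box (as reported by the detector) grows frame to frame is approaching the user. This is a minimal illustration with assumed box format and threshold, not our exact implementation:

```python
def box_area(box):
    """Area in pixels of an (x1, y1, x2, y2) bounding box."""
    x1, y1, x2, y2 = box
    return max(0, x2 - x1) * max(0, y2 - y1)

def is_safe_to_cross(car_boxes_over_time, growth_threshold=1.1):
    """Hypothetical crossing heuristic.

    car_boxes_over_time: one list of bounding boxes per tracked car,
    ordered by frame. If any car's apparent size grows by more than
    growth_threshold between consecutive frames, it is treated as
    approaching and crossing is flagged unsafe.
    """
    for history in car_boxes_over_time:
        areas = [box_area(b) for b in history]
        for prev, curr in zip(areas, areas[1:]):
            if prev > 0 and curr / prev > growth_threshold:
                return False  # a car is getting larger, i.e. closer
    return True
```

A static or receding car keeps roughly the same box area, so it passes the check, while an oncoming car's box area grows quickly and trips the threshold.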
How we built it
We built the app using React for an interactive UI and Flask for an ML-capable backend. To add NLP to the scheduling software, we made calls to the OpenAI API to fill in calendar information from the user's messages.
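A sketch of how that extraction call might be structured is below. The prompt wording, JSON field names, and model name are assumptions for illustration, not our exact backend code:

```python
EXTRACTION_PROMPT = (
    "Extract the event title, date, and time from the user's message. "
    "Reply with JSON using the keys: title, date, time."
)

def build_messages(user_utterance):
    """Assemble the chat messages for the OpenAI API: a system prompt
    asking for calendar fields as JSON, plus the user's spoken input."""
    return [
        {"role": "system", "content": EXTRACTION_PROMPT},
        {"role": "user", "content": user_utterance},
    ]

# The actual call (requires an OPENAI_API_KEY; model choice is illustrative):
# from openai import OpenAI
# client = OpenAI()
# resp = client.chat.completions.create(
#     model="gpt-3.5-turbo",
#     messages=build_messages("CS exam Friday at 2pm"),
# )
# event_fields = resp.choices[0].message.content  # JSON string to parse
```

The Flask backend can then parse the returned JSON and write the event into the user's calendar, asking a follow-up question when a field comes back empty.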
Challenges we ran into
We originally wanted to build a mobile app but ran into a variety of difficulties with React Native, which led us to pursue a web-based solution instead.
Accomplishments that we're proud of
We are proud that our ML model was able to detect the safest times to cross the street. We are also proud of our scheduling software, which can ask the user for more information and act as a personal assistant.
What we learned
We learned to build collaboratively, iterating together as a team rather than each working independently.
What's next for Canvas Blind
In the future, we hope Canvas Blind can be used by blind students on campuses all around the world.