Inspiration
A task as simple as navigating one's surroundings can be both difficult and dangerous for people who are blind or visually impaired. It should not have to be this challenging; everyone deserves to be able to get where they need to go.
What it does
Echo gives users feedback based on their surroundings: the iPhone vibrates through haptic feedback as the user approaches an object or obstacle, and the vibration grows stronger the closer they get. Obstacles are detected using the iPhone's LiDAR sensor, which emits laser pulses and measures their return time, much like echolocation, to determine the distance between the user and the obstacle.
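As a rough illustration of the idea, here is a minimal Swift sketch of how a measured distance could be mapped to haptic strength. The 0.3 m to 3.0 m range and the use of UIImpactFeedbackGenerator are assumptions for the example, not necessarily what Echo actually ships with.

```swift
import UIKit

// Map a measured distance (in metres) to a haptic intensity in [0, 1],
// growing stronger as the obstacle gets closer. The 0.3 m - 3.0 m range
// is an illustrative assumption, not Echo's real tuning.
func hapticIntensity(forDistance distance: Float,
                     minDistance: Float = 0.3,
                     maxDistance: Float = 3.0) -> Float {
    let clamped = min(max(distance, minDistance), maxDistance)
    // Linear falloff: 1.0 when very close, 0.0 at the far edge of the range.
    return 1.0 - (clamped - minDistance) / (maxDistance - minDistance)
}

// Fire a single tap whose strength reflects the current distance.
// UIImpactFeedbackGenerator accepts an intensity parameter on iOS 13+.
func pulse(forDistance distance: Float) {
    let generator = UIImpactFeedbackGenerator(style: .heavy)
    generator.prepare()
    generator.impactOccurred(intensity: CGFloat(hapticIntensity(forDistance: distance)))
}
```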
How we built it
This iOS application was built entirely in Swift using the Xcode IDE.
Challenges we ran into
The biggest challenge we ran into was measuring the depth of objects relative to the user. This was difficult because of the involved process Apple requires to access depth data through its APIs.
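For context, this is the kind of ARKit scene-depth access involved on a LiDAR-equipped iPhone. The centre-pixel sampling and the class name below are illustrative choices for the sketch, not Echo's exact code.

```swift
import ARKit

// Sketch of reading LiDAR depth via ARKit's scene-depth frame semantics.
// We sample the centre pixel of the depth map as a rough "distance straight
// ahead"; a real app may average a region or track multiple points.
final class DepthReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Scene depth is only available on LiDAR-equipped devices.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }

        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Depth values are Float32 metres; read the centre of the frame.
        let rowPointer = base.advanced(by: (height / 2) * rowBytes)
        let distance = rowPointer.assumingMemoryBound(to: Float32.self)[width / 2]
        print("Obstacle roughly \(distance) m ahead")
    }
}
```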
Accomplishments that we're proud of
We are proud of learning a new programming language, as no one in the group had used Swift before. We are also proud that we got haptic feedback working on our phones, which took a long time to get right.
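As a hint of what that involved, here is a minimal Core Haptics sketch of a continuous, intensity-controlled vibration; the sharpness value and one-second duration are placeholder assumptions rather than Echo's actual parameters.

```swift
import CoreHaptics

// Sketch of a continuous vibration whose strength reflects proximity,
// using Core Haptics (supported on iPhone 8 and later).
final class ProximityHaptics {
    private var engine: CHHapticEngine?

    init() {
        guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }
        engine = try? CHHapticEngine()
        try? engine?.start()
    }

    // Play a one-second continuous buzz; intensity is expected in 0...1.
    func buzz(intensity: Float) {
        let params = [
            CHHapticEventParameter(parameterID: .hapticIntensity, value: intensity),
            CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
        ]
        let event = CHHapticEvent(eventType: .hapticContinuous,
                                  parameters: params,
                                  relativeTime: 0,
                                  duration: 1.0)
        if let pattern = try? CHHapticPattern(events: [event], parameters: []),
           let player = try? engine?.makePlayer(with: pattern) {
            try? player.start(atTime: CHHapticTimeImmediate)
        }
    }
}
```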
What we learned
We learned that projects need to be planned out beforehand, including thorough research into their feasibility. At the same time, the team needs to be ready for changes and able to adapt quickly.