Inspiration

In 2018, more than 5,500 children aged six and below were diagnosed with developmental problems, up from an average of 4,362 new cases yearly from 2015 to 2017. It was estimated that one in 150 children in Singapore was on the autism spectrum, a higher rate than the global figure of one in 160. There has been a growing gap between supply and demand for services for the post-18 group, with the number of special education (SPED) school graduates increasing at an average of 5 percent a year over the last three years.

President of the Autism Resource Centre (Singapore) and chairman of the Autism Association (Singapore) Denise Phua said: “The cliff effect after formal schooling is well known, but there continues to be a dearth of services for adults across the autism spectrum.”

While efforts to understand and empathize with people on the alexithymia spectrum are very much appreciated today, we believe more can be done to strengthen the cognitive empathy of those on the spectrum. Hence, we came up with EmoSense, a training tool that lets emotionally impaired people practice recognizing the emotions of the person they are speaking to, with volunteers acting as their conversation partners.

What it does

EmoSense is a breakthrough application crafted to make a meaningful impact on the lives of emotionally handicapped individuals. By leveraging the power of Machine Learning and Generative Artificial Intelligence, EmoSense focuses on enhancing emotional understanding, a skill that can significantly improve social interactions and overall quality of life.

This innovative tool uses speech recognition to convert spoken words to text. With the help of OpenAI, it deciphers the emotions conveyed in the words while taking into account the speed and volume of speech. During setup, caretakers can also set the level of assistance required, emphasizing specific emotions the beneficiary needs additional help with.
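A minimal sketch of how such a pipeline could fold the transcript and prosody cues into a single prompt for the language model. The function name, thresholds, and wording below are our illustrative assumptions, not EmoSense's actual code:

```python
def build_emotion_prompt(transcript, words_per_minute, volume_db, focus_emotions=None):
    """Combine the transcript and rough prosody cues into one model prompt.

    All names and threshold values here are illustrative sketches,
    not the app's real implementation.
    """
    # Rough prosody hints: fast/loud speech often signals excitement or anger,
    # while slow/quiet speech may signal sadness or hesitation.
    pace = "fast" if words_per_minute > 160 else "slow" if words_per_minute < 110 else "moderate"
    loudness = "loud" if volume_db > -10 else "quiet" if volume_db < -30 else "normal"

    prompt = (
        f'The speaker said: "{transcript}"\n'
        f"They spoke at a {pace} pace and a {loudness} volume.\n"
        "In one or two plain sentences, explain what emotion the speaker "
        "is most likely feeling and why."
    )
    if focus_emotions:
        # Caretakers can flag emotions the beneficiary needs extra help with.
        prompt += (
            "\nPay special attention to whether the speaker shows: "
            + ", ".join(focus_emotions) + "."
        )
    return prompt
```

The resulting string would then be sent as a chat message to the OpenAI API, and the model's reply shown to the user as the tailored explanation.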

The app offers personalized guidance to help individuals with emotional disabilities better comprehend the emotions of others, allowing them to better relate to those around them through clear explanations tailored to their unique needs.

How we built it

We used Python libraries such as SpeechRecognition, tkinter, and NumPy. Additionally, we used the OpenAI API for emotion comprehension.

Challenges we ran into

It is inconvenient for the user to manually start and stop recordings. Hence, we used loudness thresholds to start and stop recording automatically.

Our team was made up of a very diverse group of people from different stages of life. With different levels of technical abilities and ways of thinking, we needed to work together and capitalize on each other's skills to create our product in a day. With different schedules and commitments, we needed good task delegation to ensure progress was made.

Loudness thresholds do not work well in noisy environments, so we added a manual stop button for cases where the program cannot determine when the user has stopped talking.
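The threshold logic above can be sketched as follows. The function name, frame layout, thresholds, and hangover length are our illustrative choices, not the exact implementation: recording starts when frame loudness rises above a start threshold and stops once loudness stays below a stop threshold for several consecutive frames (the noisy-room fallback is the manual stop button):

```python
import numpy as np

def detect_speech_segment(frames, start_db=-35.0, stop_db=-45.0, hangover=5):
    """Return (start_index, end_index) of the first speech segment, or None.

    frames: sequence of 1-D numpy arrays of audio samples in [-1, 1].
    Threshold and hangover values are illustrative, not EmoSense's real ones.
    """
    eps = 1e-10  # avoid log10(0) on silent frames
    start, quiet_run = None, 0
    for i, frame in enumerate(frames):
        rms = np.sqrt(np.mean(np.square(frame)))
        level_db = 20 * np.log10(rms + eps)      # frame loudness in dBFS
        if start is None:
            if level_db > start_db:              # loudness rose: start recording
                start = i
        else:
            if level_db < stop_db:               # quiet frame: count toward stop
                quiet_run += 1
                if quiet_run >= hangover:        # sustained silence: stop recording
                    return (start, i - hangover + 1)
            else:
                quiet_run = 0                    # speech resumed: reset the count
    return (start, len(frames)) if start is not None else None
```

The hangover of several quiet frames prevents a brief pause mid-sentence from cutting the recording short, which was the main failure mode of a naive single-frame threshold.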

Accomplishments that we're proud of

EmoSense's intuitive interface and non-intrusive design create an environment where learning about emotions feels natural and comfortable. This newfound emotional intelligence can lead to more successful social interactions, helping emotionally handicapped individuals better understand the feelings of others and express their own emotions effectively.

We created a powerful tool that opens doors to improved emotional comprehension, giving emotionally handicapped individuals the chance to learn about emotions at their own pace.

What we learned

We learned to integrate ChatGPT, a nascent generative AI tool, into our programme to make this new technology accessible to the emotionally handicapped.

What's next for EmoSense

- Integrate audio-recording hardware for easy accessibility and portability by the emotionally handicapped community.
- Cater to a wider spectrum of emotionally impaired people.
- Launch the app, website, and social media channels.
