Inspiration

Wanting to experiment with both hardware and software this hackathon, we decided to go with the fun idea of a self-feeding robot. Aimed at people with mobility issues, or simply those of us who are lazy, we were inspired to build a robotic arm that operates completely hands-free!

What it does

This 3D-printed robotic arm tracks your face and moves toward you with a spoon in "hand", depositing food straight into your mouth. It was designed to scoop sustenance up from a bowl, then move up and toward you while your mouth is open. Once your mouth closes and the food is consumed, the arm moves back out and repeats the process, guided by the face tracking.
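In rough terms, that cycle is a small state machine. Here is a minimal Python sketch of the behavior described above; the state names and transitions are our illustration, not the actual project code:

```python
from enum import Enum, auto

class FeedState(Enum):
    SCOOP = auto()          # pick food up from the bowl
    APPROACH = auto()       # move the spoon toward the tracked mouth
    WAIT_AT_MOUTH = auto()  # hold position while the mouth is open
    RETRACT = auto()        # pull back once the mouth closes

def next_state(state, mouth_open, at_mouth):
    """Advance the feed cycle one step based on the face-tracking inputs."""
    if state is FeedState.SCOOP:
        return FeedState.APPROACH
    if state is FeedState.APPROACH:
        return FeedState.WAIT_AT_MOUTH if at_mouth else FeedState.APPROACH
    if state is FeedState.WAIT_AT_MOUTH:
        return FeedState.WAIT_AT_MOUTH if mouth_open else FeedState.RETRACT
    return FeedState.SCOOP  # after retracting, start the next cycle
```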

How we built it

This robot arm was built with modified 3D-printed parts and wired with an Arduino Uno, six SG90 servos, and two power supply modules. The face-tracking software used MediaPipe FaceMesh and OpenCV, capturing video from an external USB webcam, drawing the landmarks onto the video, and extracting facial landmark positions for each frame. That data was fed into the Jacobian pseudo-inverse method, implemented in NumPy, to calculate the inverse kinematics. After computing the change in each joint angle, the result was sent to the Arduino board, which was loaded with the Firmata protocol, allowing us to use PyFirmata to drive our servos in real time.
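A minimal sketch of the face-tracking side, using the standard MediaPipe FaceMesh solution API (landmark indices 13/14 are the inner lips and 33/263 the outer eye corners; the 0.08 threshold is a placeholder that would need tuning):

```python
import cv2
import mediapipe as mp

face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True)
cap = cv2.VideoCapture(0)  # the external USB webcam

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_face_landmarks:
        lm = results.multi_face_landmarks[0].landmark
        # Normalizing the lip gap by face width keeps the measure stable
        # as the face moves closer to or farther from the camera.
        mouth_gap = abs(lm[13].y - lm[14].y)
        face_width = abs(lm[33].x - lm[263].x)
        mouth_open = mouth_gap / face_width > 0.08
```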
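The core of the inverse kinematics is the pseudo-inverse update Δθ = J⁺(θ)·e, where e is the error between the target and the current end-effector position. A simplified planar sketch in NumPy (the real arm has more joints, and these link lengths are placeholders):

```python
import numpy as np

LINK_LENGTHS = np.array([0.10, 0.10, 0.08])  # metres; placeholder dimensions

def forward_kinematics(theta):
    """End-effector (x, y) of a planar serial arm with cumulative joint angles."""
    angles = np.cumsum(theta)
    return np.array([np.sum(LINK_LENGTHS * np.cos(angles)),
                     np.sum(LINK_LENGTHS * np.sin(angles))])

def jacobian(theta):
    """2xN Jacobian of the end-effector position with respect to joint angles."""
    angles = np.cumsum(theta)
    J = np.zeros((2, len(theta)))
    for i in range(len(theta)):
        # Joint i rotates every link from i onward.
        J[0, i] = -np.sum(LINK_LENGTHS[i:] * np.sin(angles[i:]))
        J[1, i] = np.sum(LINK_LENGTHS[i:] * np.cos(angles[i:]))
    return J

def ik_step(theta, target, gain=0.5):
    """One Jacobian pseudo-inverse update toward the target point."""
    error = target - forward_kinematics(theta)
    return theta + gain * (np.linalg.pinv(jacobian(theta)) @ error)
```

Iterating `ik_step` until the error is small yields the joint angles to send to the servos each frame.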
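On the hardware side, once the Uno is flashed with the StandardFirmata sketch, PyFirmata exposes each servo as a pin that accepts an angle in degrees. A sketch with a hypothetical serial port and pin mapping:

```python
import time
import numpy as np
from pyfirmata import Arduino

board = Arduino('/dev/ttyACM0')    # port name depends on your machine
SERVO_PINS = [3, 5, 6, 9, 10, 11]  # hypothetical PWM pin assignment
servos = [board.get_pin(f'd:{pin}:s') for pin in SERVO_PINS]

def write_angles(theta_rad):
    """Convert joint angles in radians to the SG90s' 0-180 degree range."""
    for servo, angle in zip(servos, np.degrees(theta_rad)):
        servo.write(float(np.clip(angle, 0, 180)))
    time.sleep(0.02)  # brief pause so the servos can follow
```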

Challenges we ran into

We initially planned to use JavaScript, with React and TensorFlow for the face tracking and Johnny-Five for the Arduino communication. After many hours of struggling with conflicting dependencies and outdated documentation, we made breakthroughs on both fronts by switching to Python. The next major challenge was wiring the servos: we needed six of them, and the Arduino couldn't deliver enough power through the default 9-volt-to-5-volt power supply that came with the kit. We were also feeding every captured face landmark straight into the servos, which made them very unstable and twitchy. We were especially astonished when our servos started spinning continuously, when they should only ever turn 180 degrees. Finally, we faced issues with the 3D printing: we had to modify the model to fit servos that might not have enough torque, and we struggled to get the materials to construct it properly.
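A common fix for that kind of twitchiness (not necessarily the exact one we ended up with) is to low-pass filter the targets before they reach the servos, for example with an exponential moving average; `ALPHA` here is a placeholder to tune:

```python
import numpy as np

ALPHA = 0.2  # 0 < ALPHA <= 1; lower values mean smoother but laggier motion

class AngleSmoother:
    """Exponential moving average over the per-frame target joint angles."""
    def __init__(self):
        self.state = None

    def smooth(self, target):
        target = np.asarray(target, dtype=float)
        if self.state is None:
            self.state = target
        else:
            self.state = ALPHA * target + (1 - ALPHA) * self.state
        return self.state
```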

Accomplishments that we're proud of

We're proud of the adversity we overcame while constructing this project. We each learned a lot from tackling the problem areas in our own sections, and from the small achievements along the way. The face detection works amazingly well, mapping the landmarks onto your face in real time and detecting whether your mouth is open. Learning to connect hardware and software in a way that was both smooth and effective was a valuable experience.

What we learned

We learned about compatibility and dependency issues when working with different libraries and versions. We also learned many hardware lessons, such as how to power servos and how to drive an Arduino directly from a computer using Firmata. Much of our team's hardware experience came from online simulators such as Tinkercad, so experiencing overheating motors, power supply issues, and the occasional fun electric shock was a great way to see the difference between theory and hands-on practice.

What's next for the Automatic Human Feeder?

We plan to clean up the math for controlling the arm's servos, as the rotational constraints can occasionally cause the program to crash. We also want to add automatic scooping so the arm can pick up food on its own. We plan to rework some dimensions of the arm that had to be modified on the spot due to equipment constraints. Finally, creating an online simulator that mimics the arm's movement will allow for optimal motion tests without risking the robot's singular body/limb.
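One simple way to handle those rotational constraints would be to clamp each IK solution to its servo's physical range before writing it, for example (the per-joint limits here are hypothetical):

```python
import numpy as np

# Hypothetical per-joint limits in degrees; real values would come from
# measuring how far each servo can actually move inside the printed arm.
JOINT_LIMITS = np.array([[10, 170], [20, 160], [0, 180],
                         [0, 180], [30, 150], [0, 180]])

def clamp_angles(angles_deg):
    """Keep every joint command inside its servo's reachable range."""
    return np.clip(angles_deg, JOINT_LIMITS[:, 0], JOINT_LIMITS[:, 1])
```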

Built With

3d-printing, arduino, mediapipe, numpy, opencv, pyfirmata, python
