Inspiration

We like Kermit!

What it does

Kerminator uses a camera built into his eye to track the user and follows them by turning his head. He can perform some arm motions and answers questions in an AI-generated voice.

How we built it

Kerminator follows the user using an Arducam camera (IMX219 sensor) connected to a Raspberry Pi. YOLOface was used to track faces, and Kerminator's movement is driven by 4 servos and 1 VEX motor, all controlled over I2C from the Raspberry Pi.
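A minimal sketch of the tracking loop, assuming a PCA9685-style servo driver behind Adafruit's ServoKit library (the writeup doesn't name the driver board) and OpenCV's Haar cascade standing in for YOLOface:

```python
import cv2
from adafruit_servokit import ServoKit  # assumes a PCA9685 I2C servo driver

kit = ServoKit(channels=16)
PAN_CHANNEL = 0     # hypothetical channel for the neck servo
pan_angle = 90.0    # start with the head centered
GAIN = 0.05         # degrees of correction per pixel of horizontal error

cap = cv2.VideoCapture(0)
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)  # stand-in detector; the real project used YOLOface

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, 1.1, 5)
    if len(faces):
        x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
        error = (x + w / 2) - frame.shape[1] / 2  # pixels off-center
        # Proportional correction; flip the sign if the head turns away.
        pan_angle = max(0.0, min(180.0, pan_angle - GAIN * error))
        kit.servo[PAN_CHANNEL].angle = pan_angle
```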

Voice pipeline: microphone input => STT => NLP response => TTS => back to the Raspberry Pi
Implementation: PyAudio => Whisper => Llama 2 7B => Tortoise TTS => SCP
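As a sketch of the glue between stages (assuming the openai-whisper and llama-cpp-python packages; the model path and prompt are ours, and the PyAudio capture and Tortoise/SCP steps are elided):

```python
import whisper
from llama_cpp import Llama

stt = whisper.load_model("base")  # STT stage
llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf")  # hypothetical GGUF path

def answer(wav_path: str) -> str:
    question = stt.transcribe(wav_path)["text"]
    prompt = f"You are Kermit the Frog. Answer briefly.\nQ: {question}\nA:"
    out = llm(prompt, max_tokens=128, stop=["Q:"])
    return out["choices"][0]["text"].strip()

# The reply then goes to Tortoise TTS, and the rendered WAV is copied back
# to the Pi for playback, e.g.:
#   scp reply.wav pi@<pi-hostname>:~/reply.wav
```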

We trained the TTS model to mimic Kermit's voice.
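Tortoise clones a voice from short reference clips, so synthesis looks roughly like this (the "kermit" voice folder is our naming, populated with reference WAVs):

```python
import torchaudio
from tortoise.api import TextToSpeech
from tortoise.utils.audio import load_voice

tts = TextToSpeech()
# load_voice reads the reference WAVs from tortoise/voices/kermit/
voice_samples, conditioning_latents = load_voice("kermit")
gen = tts.tts_with_preset(
    "Hi-ho, Kermit the Frog here!",
    voice_samples=voice_samples,
    conditioning_latents=conditioning_latents,
    preset="fast",
)
torchaudio.save("reply.wav", gen.squeeze(0).cpu(), 24000)  # Tortoise outputs 24 kHz
```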

Challenges we ran into

VEX motors aren't meant to be driven from a Raspberry Pi. As such, getting the motor to run over the Pi's I2C bus was a hassle and required us to directly manipulate register values. Furthermore, the motor expects a different variety of I2C, which caused issues since the Pi has only one I2C bus.
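What "directly manipulating register values" amounts to, sketched with smbus2; the device address and register offset below are placeholders, not the real VEX values:

```python
from smbus2 import SMBus

VEX_ADDR = 0x60    # placeholder I2C address for the motor controller
SPEED_REG = 0x2A   # placeholder control-register offset

def set_motor_speed(speed: int) -> None:
    """Write a speed byte straight into the motor's control register."""
    with SMBus(1) as bus:  # /dev/i2c-1, the Pi's single user I2C bus
        bus.write_byte_data(VEX_ADDR, SPEED_REG, speed & 0xFF)

set_motor_speed(64)  # forward at a modest speed
```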

Accomplishments that we're proud of

Combining mechanical engineering with software to create a unified final product.

What we learned

How I2C works.

What's next for Kerminator

Fine-tuning, better manufacturing, and a less unsettling place to put the camera.

Built With

llama-2, pyaudio, python, raspberry-pi, tortoise-tts, whisper, yolo
