Inspiration

We all had some exposure to robotics, but didn't

What it does

Jacques attempts to reproduce a simple drawing that a user has traced, without having the image as a physical reference.

How we built it

Jacques uses a Raspberry Pi to control his movement, while a second Raspberry Pi with a camera, perched atop an impromptu bridge made of Soylent boxes, directs the robot; the two Pis communicate over sockets.
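The camera-Pi-to-robot link could be sketched roughly like this. This is a minimal illustration, not the project's actual code: the message format, host address, and function names (`encode_command`, `send_command`) are all hypothetical, assuming the camera Pi streams small movement commands to the robot Pi as newline-delimited JSON over TCP.

```python
import json
import socket

ROBOT_HOST = "192.168.0.42"  # hypothetical address of the robot Pi
ROBOT_PORT = 9000            # hypothetical port

def encode_command(dx, dy):
    """Serialize one movement step as a newline-delimited JSON message."""
    return (json.dumps({"dx": dx, "dy": dy}) + "\n").encode("utf-8")

def decode_command(raw):
    """Parse one newline-delimited JSON message back into a dict."""
    return json.loads(raw.decode("utf-8").strip())

def send_command(sock, dx, dy):
    """Push one movement step from the camera Pi to the robot Pi."""
    sock.sendall(encode_command(dx, dy))

# On the camera Pi, usage would look roughly like:
#   with socket.create_connection((ROBOT_HOST, ROBOT_PORT)) as sock:
#       send_command(sock, 1.5, -2.0)
```

Newline-delimited JSON keeps the protocol human-readable and easy to debug when lag or dropped messages show up, which matters more than throughput for a plotter-speed robot.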

Challenges we ran into

Where to begin...? At the beginning of the hackathon, we had little idea of how to construct an environment for the robot to draw in. We thought outside the box, and borrowed a bunch of Soylent boxes to make a bridge for a Raspberry Pi with a camera to rest upon.

The most challenging part of the project was communicating accurately between the camera Pi and Jacques. Lag proved to be a difficult variable to control, along with noise such as glare, which we solved with a cleverly selected non-reflective marker made out of Amazon packaging material. The conversion between coordinates in the initial picture and Jacques's position, along with the algorithm for directing Jacques, was also incredibly challenging to implement, and we only partially succeeded in doing so.
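The picture-to-position conversion and the directing step could be sketched as below. This is a hedged illustration, not the algorithm we shipped: it assumes the overhead camera sees exactly the axis-aligned drawing area, and the names (`pixel_to_workspace`, `step_toward`) and the fixed step size are invented for the example.

```python
import math

def pixel_to_workspace(px, py, frame_w, frame_h, work_w_cm, work_h_cm):
    """Map a pixel coordinate in the overhead camera frame to centimetres
    on the drawing surface, assuming the frame covers exactly the
    drawing area with no lens distortion or perspective skew."""
    return (px / frame_w * work_w_cm, py / frame_h * work_h_cm)

def step_toward(current, target, step_cm=0.5):
    """One iteration of a naive directing loop: move a fixed-size step
    straight toward the next waypoint on the traced path, snapping to
    the waypoint once it is within one step."""
    dx, dy = target[0] - current[0], target[1] - current[1]
    dist = math.hypot(dx, dy)
    if dist <= step_cm:
        return target
    return (current[0] + dx / dist * step_cm,
            current[1] + dy / dist * step_cm)
```

In practice lag means the robot has often overshot by the time the next camera frame arrives, which is exactly why this kind of open-loop stepping only partially works without feedback correction.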

Accomplishments that we're proud of

We're super proud of getting Jacques to trace users' drawings, even though he's not a master artiste just yet. Everything from creating Jacques's environment, to manipulating image/video data with OpenCV, to implementing the motion algorithms was really rewarding.

What we learned

What's next for Jacques

Better tracing accuracy, more complicated input, and not falling off tables in the middle of his painting sessions.
