Inspiration
Blackjack is one of the few casino games that is mathematically "beatable," yet most players lose money because they play by gut feel rather than data. Inspired by the card-counting premise of the movie 21, we wanted to build an AI algorithm that could automatically feed us the next best move through the Ray-Ban Meta glasses.
What it does
Our algorithm tracks every card dealt in real time via the Ray-Ban Meta glasses to calculate the true count (a measure of how rich the remaining deck is in high cards). It uses the Kelly Criterion to compute the exact dollar amount to bet based on the player's current advantage, maximizing bankroll growth while guarding against ruin. After determining the optimal move (Hit, Stand, Double, or Surrender), it uses GUI automation to take control of the user's cursor and type the move directly into the chat of an Instagram Livestream, where it is visible through the Meta glasses.
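The writeup does not name the counting system, so the sketch below assumes the common Hi-Lo count, in which 2–6 score +1, 7–9 score 0, and tens, faces, and aces score −1; the true count is the running count divided by the estimated number of decks remaining.

```python
# Hi-Lo card-counting sketch (assumed system; the project may use another).
# 2-6 count +1, 7-9 count 0, 10/J/Q/K/A count -1.
HI_LO = {r: +1 for r in "23456"} | {r: 0 for r in "789"} \
        | {r: -1 for r in ["10", "J", "Q", "K", "A"]}

def true_count(cards_seen: list[str], num_decks: int = 6) -> float:
    """Running count divided by the estimated decks remaining."""
    running = sum(HI_LO[c] for c in cards_seen)
    decks_remaining = max(num_decks - len(cards_seen) / 52, 0.5)
    return running / decks_remaining

# Mostly low cards seen -> positive count (deck is rich in high cards).
print(round(true_count(["2", "3", "4", "5", "6", "K"]), 2))  # -> 0.68
```

A positive true count means the shoe favors the player, which is what drives both the bet sizing and the deviation plays.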
How we built it
Using Python, we implemented a lookup table for Basic Strategy and a dynamic formula for Kelly Criterion bet sizing. For the Instagram integration, we used PyAutoGUI, which lets our script interact with the OS layer: it moves the mouse to the Instagram comment box's screen coordinates and types the advice as if a human were entering it. To verify our math, we ran a 50,000-hand simulation and used matplotlib to visualize bankroll growth over time, showing that the strategy overcomes the house edge in most runs.
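The two core pieces are simple to sketch. Below, the strategy-table fragment and the edge-per-true-count figure are illustrative assumptions (our full tables cover every hand, and the real player edge depends on the house rules), not the exact values used in the project:

```python
# Sketch: Basic Strategy lookup plus Kelly bet sizing.
# The table fragment and edge estimate below are illustrative assumptions.

# Fragment of a hard-total table: (player_total, dealer_upcard) -> move.
BASIC_STRATEGY = {
    (16, 10): "Surrender",
    (16, 6): "Stand",
    (11, 6): "Double",
    (12, 2): "Hit",
}

def best_move(player_total: int, dealer_upcard: int) -> str:
    # Default to "Hit" for entries omitted from this fragment.
    return BASIC_STRATEGY.get((player_total, dealer_upcard), "Hit")

def kelly_bet(bankroll: float, true_count: float) -> float:
    """Kelly fraction f* = edge / variance; blackjack variance is ~1.3.
    Rule of thumb (assumed): each true-count point above 1 adds ~0.5% edge."""
    edge = 0.005 * (true_count - 1)
    if edge <= 0:
        return 0.0  # no advantage: bet the table minimum instead
    return round(bankroll * edge / 1.3, 2)

print(best_move(16, 10))     # -> Surrender
print(kelly_bet(10_000, 3))  # ~0.77% of a $10,000 bankroll -> 76.92
```

Capping bets at a small Kelly fraction like this is what keeps variance from wiping out the bankroll even when the count is favorable.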
Challenges we ran into
A big challenge was distinguishing the dealer's hand from the player's, since recognizing individual cards with computer vision (via the Gemini API) was already quite difficult. Due to time constraints, we were not able to integrate a signal telling our algorithm when the deck is reshuffled. In addition, image analysis takes a long time, which can delay the recommended move.
What we learned
Connecting our Python code to the Ray-Ban Meta glasses was far more complex than anticipated, since we had to hack together a communication bridge without official API support. We spent a long time debugging connection drops and data latency that could have been avoided had we mapped out the data flow earlier. This experience taught us that outlining a robust technical pipeline before writing a single line of code is essential.
What's next for 21 Eyes
In the future, syncing our algorithm with a mobile app would make the system much easier to deploy and use. As the Meta Wearables Device Access Toolkit is still very new, we hope that added features and broader device access will improve our computer-vision setup.