Inspiration

Dylan and Casey have always done web development, but they've been meaning to try game development. While both used Unity to make exactly one game each a few years ago, they wanted to learn Unreal Engine given its reputation at high-profile gaming companies. The decision to choose Unreal was inspired by the desire to learn the key tool that lets so many people turn their hobbies into careers, even if they don't intend to enter the gaming industry.

Likewise, Bryant and Thao have learned all about AI in their classes and heard all the hype surrounding it, but never really had a chance to implement it in a project before. Their goal: train, refine, and deploy an AI model themselves rather than relying on something entirely prebuilt.

This project combines those two passions into one high-tech game demo.

What it does

Trigger Finger Tango is an FPS.

A Finger Pointer Shooter!

Trigger Finger Tango unites traditional web development and traditional game development, crossing boundaries that are rarely crossed in either field. This project combines two distinct fields into one cohesive game, leveraging web-based machine learning (the MediaPipe library and gesture recognition) to smoothly power and control one of the gaming industry's most cutting-edge game engines.

How we built it

lots of tutorials

This project is a three-part, non-traditional application:

  1. A React.js frontend capturing gesture input
  2. A Node.js TypeScript WebSocket server backend acting as the bridge between web and Unreal (a minimal sketch follows this list)
  3. An Unreal Engine 5 shooting game remotely controlled via the React.js frontend
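
The bridge server itself can be tiny. Here is a minimal sketch of the idea using the ws package; the port number and the relay-to-every-other-client strategy are illustrative assumptions, not our exact code:

    // bridge.ts -- sketch of the web <-> Unreal relay
    // (assumptions: port number, relay strategy)
    import { WebSocketServer, WebSocket } from "ws";

    const wss = new WebSocketServer({ port: 8080 });

    wss.on("connection", (socket) => {
      socket.on("message", (data) => {
        // Forward every gesture message to all other connected clients,
        // so the Unreal client receives whatever the React frontend sends.
        for (const client of wss.clients) {
          if (client !== socket && client.readyState === WebSocket.OPEN) {
            client.send(data.toString());
          }
        }
      });
    });

A dumb relay like this keeps the server stateless: the browser and Unreal never need to know about each other, only about the bridge.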

Here is the flow of data:

  1. Webcam input is captured on the React.js frontend.
  2. The webcam input is fed into our custom-trained MediaPipe gesture model, trained to recognize the game's two hand positions: "resting" and "shooting" (see the capture-loop sketch after this list).
  3. Information about the hand, such as the landmark positions and shooting status, is transmitted via WebSockets to the Node.js WebSocket server backend.
  4. The magic lives here: The WebSocket server unites JavaScript and Unreal, transmitting the gesture information to Unreal Engine.
  5. Unreal Engine has a WebSocket client implementation, so it listens for incoming gesture data and acts accordingly, shooting when needed.
  6. The assets and game physics, such as the player movement and bottle-shattering mechanics, were all programmed in Unreal.
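
To make steps 1-3 concrete, here is a hedged sketch of the frontend capture loop built on MediaPipe's tasks-vision GestureRecognizer; the model path, label names, and JSON message shape are assumptions for illustration rather than our exact code:

    // gesture-capture.ts -- sketch of the webcam -> gesture -> WebSocket loop
    // (model path, label names, and message shape are illustrative assumptions)
    import { FilesetResolver, GestureRecognizer } from "@mediapipe/tasks-vision";

    async function startCapture(video: HTMLVideoElement) {
      const vision = await FilesetResolver.forVisionTasks(
        "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
      );
      const recognizer = await GestureRecognizer.createFromOptions(vision, {
        baseOptions: { modelAssetPath: "/models/trigger_finger.task" }, // custom-trained model
        runningMode: "VIDEO",
        numHands: 1,
      });

      const socket = new WebSocket("ws://localhost:8080"); // the bridge server

      const loop = () => {
        const result = recognizer.recognizeForVideo(video, performance.now());
        if (result.gestures.length > 0 && socket.readyState === WebSocket.OPEN) {
          socket.send(
            JSON.stringify({
              // "shooting" / "resting" are the two labels we trained
              shooting: result.gestures[0][0].categoryName === "shooting",
              landmarks: result.landmarks[0], // 21 normalized hand landmarks
            })
          );
        }
        requestAnimationFrame(loop);
      };
      requestAnimationFrame(loop);
    }

On the Unreal side, the WebSocket client just parses this JSON and fires when the shooting flag flips.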

Challenges we ran into

making sure we didn't eat too much food

Unreal's a monster to install and use. Our laptops were whirring and overheating throughout the entire hackathon, and we spent several hours fixing installation issues. Unreal is also the industry standard for game development, carrying with it a high skill ceiling and a steep learning curve. It was a huge challenge for Casey and Dylan to transfer their Unity knowledge (Unity's workflow is almost completely backwards compared to Unreal's) into something usable in 36 hours. Unreal also kept hitting build errors that caused them to constantly lose progress.

Edit: 5:13am, four hours before the hackathon ends. Merging Unreal Engine Blueprints and C++ is the biggest monster we've faced. Somehow everything worked perfectly once, but never again after that. It's hard to debug when some things are abstracted away inside Unreal's Blueprints, and when the engine itself just decides to crash randomly once in a while.

Bryant and Thao also had a lot of trouble figuring out how to custom-train the gesture recognition model. It was their first time building a project with AI and training a model themselves, so they watched a lot of demos and tutorials to understand what it really means to train a gesture recognition model.

Accomplishments that we're proud of

running the MediaPipe demo and installing Unreal

The entire team learned a LOT! No one came in entirely comfortable with the tools they ended up using, and we still managed to make a cool concept work!

What we learned

unity > unreal

Everything!

Unity is 100x easier than Unreal Engine Q.Q, but Unreal feels more powerful and has a more capable built-in physics system.

What's next for Finger Guns

Originally, this project was meant to be a multiplayer shooting duel, but it was already hard enough picking up Unreal Engine 5 in 36 hours. We still want to add that multiplayer dueling aspect.

We also want to train the gesture model to recognize "dual wielding" guns to make the game feel more immersive.

Also, we want to move more of the game's Blueprint logic into C++ source code.

Lastly, we want to free the game from the keyboard entirely, relying on gestures to traverse the world and aim.

Built With

React.js, Node.js, TypeScript, WebSockets, MediaPipe, Unreal Engine 5, C++
