Inspiration
I began looking over the hardware MLH was bringing, originally considering the Oculus Rift, but finally settled on the Leap Motion after seeing its unusual input method and its potential for quick implementation in the Unity3D engine, which gave me enough time to feasibly create a fresh experience solo under the time constraints.
What it does
Hand2Hand uses the Leap Motion as a controller for 3D representations of two players' hands. These virtual hands are armed with thumb-mounted swords, and the players essentially fence, each trying to strike the other player's palms.
How I built it
Hand2Hand is built in the Unity3D engine using the Leap Motion as its primary form of input. I built on the official Leap Motion Unity implementation for basic hand controls, writing all of my game mechanic code in C# with MonoDevelop. Version control and backup were handled with both GitHub and Google Drive. All models were made from standard primitives, and I created all audio in Linux MultiMedia Studio. Texturing and all other image work was done in GIMP 2. The zip file containing the playable executable is hosted on Google Drive, while the source remains on GitHub.
Challenges I ran into
- The Leap Motion depends on an IR camera. The bright lighting in the fieldhouse interferes heavily with its already imperfect hand-tracking accuracy.
- The players' swords, unlike their hands, are not physical objects in the real world. When a blade impacts something in the game world, nothing causes the players to respond physically, removing the recoil that is vital to a kinetically satisfying sword fight.
- Working solo at a largely team-focused event put me at a relative disadvantage: no progress could be made while I slept, and all art, design, and graphics work fell to me on top of the programming workload.
Accomplishments that I'm proud of
- As a substitute for kinetic feedback actually stopping the players' real hands upon striking a target or parrying an attack, the swords "break" on impact, requiring the player to briefly remove their hand from sensor range to get a new sword. This prompts players to physically react, approximating the recoil from a blow, without any actual kinetic feedback.
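The break mechanic above could be sketched as a Unity component along these lines. This is a minimal illustrative sketch, not the project's actual code: the class, the `HandInSensorRange` stub, and all field names are hypothetical, and in the real game the hand-presence check would come from the Leap Motion tracking data.

```csharp
// Hypothetical sketch of the sword-break mechanic (not the actual
// Hand2Hand source). Assumed to be attached to a sword object.
using UnityEngine;

public class BreakableSword : MonoBehaviour
{
    public AudioClip breakSound;   // illustrative: sound played on impact
    private bool broken = false;

    void OnCollisionEnter(Collision collision)
    {
        // Break on any impact: the opposing palm, or the other sword (a parry).
        if (broken) return;
        broken = true;
        AudioSource.PlayClipAtPoint(breakSound, transform.position);
        GetComponent<Renderer>().enabled = false;  // hide the broken blade
        GetComponent<Collider>().enabled = false;  // stop registering hits
    }

    void Update()
    {
        // A new sword appears only after the hand has left sensor range
        // and returned, forcing the physical "recoil" motion.
        if (broken && !HandInSensorRange())
        {
            broken = false;
            GetComponent<Renderer>().enabled = true;
            GetComponent<Collider>().enabled = true;
        }
    }

    private bool HandInSensorRange()
    {
        // Placeholder: the real game would query Leap Motion hand tracking
        // for whether this player's hand is currently visible.
        return true;
    }
}
```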
- Handling this project solo and reaching a passable end product with far less sleep loss than I expected starting out.
- Several players who tested my game seemed to genuinely enjoy playing it, and player enjoyment is what I consider the ultimate indicator of success for a game of this nature.
What I learned
The value of mechanical and auditory elements in a simulation that encourage users to provide their own kinetic feedback where a hardware solution would be significantly more expensive, complicated, time-consuming, or otherwise impractical.
What's next for Hand2Hand
I intend to purchase a Leap Motion to continue development and to experiment with more than two players, multiple Leap Motion controllers, pairing the Leap Motion with the Oculus Rift, and either a physical shield or an image-processing layer that blocks out bright light sources to minimize interference and maximize tracking accuracy.
Playing Hand2Hand
A playable version of the game for Windows PC is located in the .zip file on Google Drive linked on this page. A standard keyboard and a properly set-up Leap Motion controller in the standard face-up orientation are required to play this game. Leap Motion hand tracking works better in areas with lower light. More info on Leap Motion setup can be found at https://www.leapmotion.com/setup. A README.txt with a detailed control description is provided in the .zip in case the in-game tutorial text is insufficient.
Built With
- audacity
- c#
- leap-motion
- linux-multimedia-studio
- unity
- windows-10
