Several months ago, I had a chance to play with the Leap Motion Controller and implement several hand-gesture-recognition applications using the Leap Motion Python SDK. Here is a brief account of my personal journey.
Pygame is a Python package designed for writing games. The wiki page of Pygame is here and the official website of Pygame is here. I use Pygame to display a crazyface icon on screen that can be controlled by hand gestures. For example, showing the left hand over the Leap Motion Controller moves the crazyface icon to the left, while showing the right hand moves it to the right. Furthermore, showing both hands over the controller moves the icon down, and performing certain special gestures moves it up or changes it into something else.
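To make the Pygame side concrete, here is a minimal sketch, assuming the Leap Motion V2 Python SDK (the `Leap` module) and an icon file named `crazyface.png` (a hypothetical filename). It simply polls the controller once per frame and nudges the icon based on which hands are visible:

```python
import Leap
import pygame

STEP = 5  # pixels moved per frame; an assumption to tune

def main():
    controller = Leap.Controller()
    pygame.init()
    screen = pygame.display.set_mode((640, 480))
    icon = pygame.image.load("crazyface.png")  # hypothetical icon file
    x, y = 320, 240
    clock = pygame.time.Clock()

    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False

        frame = controller.frame()
        hands = frame.hands
        if len(hands) == 2:
            y += STEP              # both hands: move the icon down
        elif len(hands) == 1:
            if hands[0].is_left:
                x -= STEP          # left hand: move left
            else:
                x += STEP          # right hand: move right

        screen.fill((0, 0, 0))
        screen.blit(icon, (x, y))
        pygame.display.flip()
        clock.tick(60)

    pygame.quit()

if __name__ == "__main__":
    main()
```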
VPython is a Python package which allows developers to create and animate 3D objects such as boxes and spheres on screen. Some teachers have used VPython to teach students to conduct physics simulations. The wiki page of VPython is here and the official website of VPython is here. I use VPython to display a ball that bounces over a platform automatically when no hand is over the Leap Motion Controller. When a hand is over the controller, the hand can grab the ball and move it in 3D space. Note that the Pygame program just uses the basic features provided by the Leap Motion Python SDK (such as recognizing the left hand, the right hand, some simple gestures, etc.), while the VPython program needs to calculate the mapping between the physical space in the real world and the virtual space on the screen, so that the distance a hand moves stays in sync with the distance the ball moves.
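The mapping itself can be sketched as follows, assuming the V2 SDK's InteractionBox (which normalizes palm coordinates into [0, 1]) and classic VPython's `visual` module (the era that matches the Python 2.7 Leap SDK; VPython 7 imports from `vpython` instead). The scene scale constant is an assumption to tune:

```python
import Leap
from visual import box, sphere, vector, rate

SCENE_RANGE = 5.0  # half-width of the virtual space, in scene units

def palm_to_scene(frame, hand):
    # Normalize the palm position into [0, 1] inside the interaction box,
    # then shift to [-1, 1] and scale into scene units.
    p = frame.interaction_box.normalize_point(hand.palm_position, True)
    return vector((p.x * 2 - 1) * SCENE_RANGE,
                  (p.y * 2 - 1) * SCENE_RANGE,
                  (p.z * 2 - 1) * SCENE_RANGE)

controller = Leap.Controller()
platform = box(pos=vector(0, -3, 0), size=vector(8, 0.2, 8))
ball = sphere(pos=vector(0, 2, 0), radius=0.5)
velocity = vector(0, 0, 0)

while True:
    rate(60)
    frame = controller.frame()
    if not frame.hands.is_empty:
        # A hand is present: let it carry the ball.
        ball.pos = palm_to_scene(frame, frame.hands[0])
        velocity = vector(0, 0, 0)
    else:
        # No hand: simple bouncing under gravity.
        velocity.y -= 9.8 / 60
        ball.pos += velocity / 60
        if ball.pos.y < platform.pos.y + ball.radius:
            ball.pos.y = platform.pos.y + ball.radius
            velocity.y = -velocity.y
```

The key design point is that all hand positions pass through the interaction box, so the program never deals with raw millimeters and the ball's motion is always proportional to the hand's motion.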
The Pygame program and the VPython program above are software running on a laptop. However, combining the Leap Motion Controller with hardware systems such as Arduino and uArm opens another dimension of gesture recognition and applications.
Arduino is an open source hardware/software platform and is very popular in the maker community. The wiki page of Arduino is here and the official website of Arduino is here. The idea of testing the integration of a Leap Motion Controller and an Arduino board is simple:

1. Connect the Arduino board to two LEDs, marked LED1 and LED2.
2. Write a Python program to detect the appearance of the left hand or the right hand over the Leap Motion Controller.
3. If the left hand is present, the Python program sends a command to the Arduino board to light up LED1; if the right hand is present, it sends another command to light up LED2.

The idea is simple, but the program is somewhat tricky because it has to maintain a small state machine to keep track of the status of the two LEDs. What happens next after this simple test? Well, controlling a drone or a vehicle (here) or conducting a virtual reality simulation (here) could be some cool ideas.
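A minimal sketch of the host-side program might look like this. The serial port name and the one-byte command protocol are assumptions, not fixed by the original experiment; the matching Arduino sketch would read each byte and drive the pins accordingly:

```python
import time
import serial  # pyserial
import Leap

# Assumed protocol: '1'/'a' turns LED1 on/off, '2'/'b' does the same
# for LED2. The port name /dev/ttyACM0 is also an assumption.
ser = serial.Serial("/dev/ttyACM0", 9600)
controller = Leap.Controller()

led1_on = False  # LED1 tracks the left hand
led2_on = False  # LED2 tracks the right hand

while True:
    frame = controller.frame()
    left = any(h.is_left for h in frame.hands)
    right = any(h.is_right for h in frame.hands)

    # Only send a command when the desired state actually changes,
    # so the serial link is not flooded with duplicate commands.
    if left != led1_on:
        ser.write(b"1" if left else b"a")
        led1_on = left
    if right != led2_on:
        ser.write(b"2" if right else b"b")
        led2_on = right

    time.sleep(0.05)
```

Keeping `led1_on`/`led2_on` on the host is exactly the small state machine mentioned above: without it, the program would resend the same command on every frame.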
uArm is an open source robot arm that comes with an Arduino SDK and a Python SDK. Its Python SDK allows developers to move the robot arm and control its gripper. With some tricks for recognizing special gestures such as an open palm and a closed fist, one can implement some interesting actions on the uArm, although it is not good enough to play a piano yet. :-P
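Here is a rough sketch of that idea, assuming a pyuarm-style Python SDK with `UArm()`, `set_position()`, and `set_gripper()` methods; the method names differ across uArm SDK versions, so treat them as placeholders. The open-palm/closed-fist detection uses the Leap V2 SDK's `grab_strength`, which runs from 0.0 (open palm) to 1.0 (closed fist):

```python
import time
import Leap
from pyuarm import UArm  # assumed import path; adapt to your SDK version

arm = UArm()
controller = Leap.Controller()
gripper_closed = False

while True:
    frame = controller.frame()
    if not frame.hands.is_empty:
        hand = frame.hands[0]
        # Follow the palm. Leap reports millimeters; the uArm expects its
        # own workspace coordinates, so a real program needs a calibrated
        # mapping here rather than this direct pass-through.
        p = hand.palm_position
        arm.set_position(p.x, p.z, p.y)
        # Closed fist -> close the gripper; open palm -> open it.
        should_close = hand.grab_strength > 0.8
        if should_close != gripper_closed:
            arm.set_gripper(should_close)
            gripper_closed = should_close
    time.sleep(0.05)
```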
So far, we have covered several interesting topics about the Leap Motion Controller, some software packages (e.g., Pygame/VPython), and some hardware systems (e.g., Arduino/uArm). I think the above topics are good enough as a semester course for college students or as projects for makers. However, the challenge down the road is whether we can recognize advanced static/dynamic hand gestures and apply them in a bigger context. Making a simple rock-paper-scissors game or combining such gestures into the Pygame/VPython/Arduino/uArm applications is certainly a good exercise. However, recognizing sign languages is definitely not an easy task. I haven't uploaded my programs to GitHub yet. To be continued …