Oct 17, 2013


Master Project

Motion Gesture Recognition for Human Computer Interaction

mita Karikatti

1. Committee Members and Signatures:

Approved by Date

__________________________________ _____________

Advisor: Dr. Edward Chow

__________________________________ _____________

Committee member:
Dr. Tim Chamillard

_________________ _____________

Committee member:
Dr. Chuan Yue

2. Introduction

In order to better meet human requirements, motion estimation at a distance has recently gained more and more interest from computer vision researchers. It is a particularly attractive modality from a surveillance perspective and for human computer interaction (HCI), or more generally human machine interaction (HMI).

Hand gesture recognition, as one instance of the gesture recognition problem, is important because the motion of human hands can provide abundant information about human intention and implicit meaning to machines in the real world. Many reports on intelligent human machine interaction using hand gesture recognition have already been presented [1]; the approaches can be mainly divided into "Data Glove based" and "Vision based".

The "Data Glove based" methods use a special input device named "hand data sensor glove" for digitizing hand and finger motions into multi-parametric data. It is possible to analyse 3D hand motion with the sensing data. However, the device is expensive and users might feel uncomfortable when they communicate with a machine while wearing it.

Without specialized tracking devices, one of the greatest challenges of the system is to reliably detect and track the position of the hands using computer vision techniques. The "Vision based" methods use only a vision sensor, the camera [2]. In general, the entire system of vision based hand gesture recognition can be simpler than the Data Glove approach, and it allows human friendly interaction with no extra device. Vision based hand gesture recognition is a challenging problem in the field of computer vision and pattern analysis, since it involves difficult algorithmic problems such as camera calibration, image segmentation, feature extraction, and so on.



The main objective of the project is to develop a Human Computer Interface (HCI). Fig 1 depicts the block diagram of the proposed algorithm.

Fig 1. Block diagram of the proposed algorithm.

The project starts with hand detection in a live video stream. For hand segmentation we use either a skin colour detection or a motion detection method.
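As an illustration of the skin colour option, skin pixels can be found by thresholding the chroma channels of the image. The sketch below is written in Python with NumPy (the project itself is developed in MATLAB), and the YCbCr bounds are a commonly cited rule of thumb, not values taken from this report:

```python
import numpy as np

def skin_mask(rgb):
    """Return a boolean mask of likely skin pixels.

    rgb: H x W x 3 uint8 array. The image is converted to YCbCr and the
    chroma channels are thresholded; the bounds (77-127 for Cb, 133-173
    for Cr) are a common heuristic, assumed here for illustration.
    """
    rgb = rgb.astype(np.float64)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # standard RGB -> YCbCr chroma conversion (BT.601 coefficients)
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return (cb >= 77) & (cb <= 127) & (cr >= 133) & (cr <= 173)
```

A skin-toned pixel such as RGB (200, 140, 110) falls inside both chroma ranges, while a saturated green pixel does not.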

The second and most important step of this project is to detect the important point, i.e. the marker on the hand. The project aims to develop a motion gesture recognition system in which the user draws a pattern in the air; hence the efficiency of the project lies in accurate marker detection. Two markers are required: one marker guides the pattern, while the other is used to start and stop the motion gesture. There are two candidate solutions for marker detection:

- Fingertip detection, using the fingertip as a marker along with the thumb.
- Wearing caps of two different colours on two fingers.

Both methods will be tested and the more accurate one will be selected.
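For the coloured-cap option, a minimal sketch of marker detection is to threshold on the cap colour and take the centroid of the matching pixels. The `target` colour and `tol` tolerance below are hypothetical parameters for illustration; the report leaves the exact detection method open:

```python
import numpy as np

def marker_centroid(rgb, target, tol=40):
    """Locate a coloured cap marker as the centroid of pixels whose
    RGB value lies within `tol` of `target` per channel.

    Returns (row, col) as floats, or None if no pixel matches.
    """
    diff = np.abs(rgb.astype(np.int16) - np.asarray(target, dtype=np.int16))
    mask = np.all(diff <= tol, axis=-1)      # pixels close to the cap colour
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())
```

Running this per frame yields the marker coordinates whose sequence forms the drawn pattern; the second marker's presence or absence can serve as the start/stop signal.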

The coordinates of the marker in consecutive frames will be recorded and used to create a binary pattern in an image. After the binary pattern is created, any noise will be filtered using morphological operations.
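The two steps above, rasterizing the recorded marker coordinates into a binary image and then cleaning it with a morphological opening, can be sketched as follows. This is a plain NumPy stand-in for MATLAB's image tools (e.g. imopen), and the 3x3 structuring element is an assumption:

```python
import numpy as np

def trajectory_image(points, shape):
    """Rasterize recorded (row, col) marker coordinates into a binary image."""
    img = np.zeros(shape, dtype=bool)
    for r, c in points:
        img[r, c] = True
    return img

def opening(img):
    """3x3 binary opening (erosion then dilation) to drop isolated noise
    pixels while preserving solid strokes."""
    def shifts(a):
        # the 9 neighbourhood-shifted views of the image
        pad = np.pad(a, 1, constant_values=False)
        return [pad[i:i + a.shape[0], j:j + a.shape[1]]
                for i in range(3) for j in range(3)]
    eroded = np.logical_and.reduce(shifts(img))
    return np.logical_or.reduce(shifts(eroded))
```

An isolated pixel vanishes under opening, while a 3x3 stroke fragment survives intact.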

Another important step of the project is to extract meaningful features from the pattern. More relevant features result in higher accuracy. Different feature extraction methods will be tested and the best one will be chosen.
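As one of the candidate feature extraction methods, PCA projects each pattern vector onto the directions of greatest variance. The NumPy sketch below is only illustrative; the report does not commit to PCA as the final method:

```python
import numpy as np

def pca_features(X, k):
    """Project row vectors in X (n_samples x n_dims) onto the top-k
    principal components.

    Returns the k-dimensional features and the component matrix.
    """
    Xc = X - X.mean(axis=0)
    # right singular vectors of the centred data are the principal axes
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    components = Vt[:k]
    return Xc @ components.T, components
```

For rank-deficient data, e.g. points lying on a line, a single component reconstructs the centred data exactly.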

The extracted features will be used to train the classifier at the initial stage. Once the classifier is trained with sufficient data, the system will be ready to interact with the user. A Support Vector Machine (SVM) will be used as the classifier.
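For context, a linear SVM can be trained with the Pegasos stochastic subgradient scheme. The sketch below only illustrates the max-margin objective; the project plans to use an off-the-shelf SVM in MATLAB, and the bias term is omitted as in the original Pegasos formulation:

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Train a linear SVM (no bias) by the Pegasos subgradient method.

    X: n x d feature matrix; y: labels in {-1, +1}.
    lam is the regularization strength (a hypothetical default).
    """
    n, d = X.shape
    w = np.zeros(d)
    t = 0
    rng = np.random.default_rng(0)
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)          # decaying step size
            w *= (1 - eta * lam)           # shrink (regularization term)
            if y[i] * (X[i] @ w) < 1:      # hinge loss is active
                w += eta * y[i] * X[i]
    return w

def predict(X, w):
    return np.where(X @ w >= 0, 1, -1)
```

On linearly separable data through the origin this recovers a separating direction; the real system would feed in the extracted pattern features instead.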

3.1 Tasks:

3.1.1 Already Complete

- Study of different feature extraction methods such as characteristic loci, PCA etc.
- Study of various classifiers such as ANN, SVM etc.
- Acquiring frames from the webcam and processing them.
- Familiarization with the MATLAB programming language.

3.1.2 In Progress

- Colour detection and segmentation algorithm development for gesture recognition.

3.1.3 Future

(Listed from highest to lowest priority)

Must be done:

- Development of the motion gesture algorithm on the live video stream.
- Perform various operations on the PC based on gestures.
- Write a user manual for the gesture recognition system.
- Test the code with different feature extraction methods.
- Test the code with different methods to detect the pointer in the motion gesture.
- Compare the results of SVM and ANN for pattern recognition.
- Write thesis.

3.2 Deliverables:

- Motion gesture recognition system developed in MATLAB.
- A thesis report documenting the design and implementation of the motion gesture recognition system and a comparative study of results using different methods.

4.0 References

1. Ying Wu and Thomas S. Huang, "Vision-based gesture recognition: A review," LNCS: Gesture-Based Communication in Human-Computer Interaction: International Gesture Workshop, vol. 1739, pp. 103, 2004.

2. F. Quek, "Toward a vision-based hand gesture interface," in Proceedings of Virtual Reality Software and Technology, Singapore, 1994, pp. 17.

3. Takahiro Watanabe and Masahiko Yachida, "Real-time gesture recognition using eigenspace from multi-input image sequences," IEEE Computer and System in Japan, vol. 30, no. 13, pp. 810-821, 1999.

4. R. Bloem, H. N. Gabow, and F. Somenzi, "An algorithm for strongly connected component analysis in n log n symbolic steps," LNCS, vol. 1954, pp. 37-54, Nov 2000.

6. James, D. and S. Mubarak, "Recognizing Hand Gestures," European Conference on Computer Vision, pp. 331-340, Stockholm, Sweden, May 1994.

7. Just, A. and S. Marcel, "A comparative study of two state-of-the-art sequence processing techniques for hand gesture recognition," Computer Vision and Image Understanding, vol. 113, no. 4, pp. 532-543, Apr 2009.

8. Yanghee, N. and W. KwangYun, "Recognition of Space-Time Hand-Gestures using Hidden Markov Model," ACM Symposium on Virtual Reality Software and Technology.