International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering
ISSN: 2278-8875 (Online), 2320-3765 (Print)


Survey Paper on Hand Gesture Recognition

Manjunatha M B1, Pradeepkumar B P2, Santhosh S Y3
  1. Professor, Dept. of ECE, AIT, Tumkur, Karnataka, India
  2. Asst. Professor, Dept. of ECE, AIT, Tumkur, Karnataka, India
  3. PG Scholar, Dept. of ECE, AIT, Tumkur, Karnataka, India


Abstract

This survey paper reviews real-time implementations and novel methods for hand-pose estimation that can be used in vision-based human interfaces. Different methodologies for segmentation, feature extraction, and classification are discussed and compared. The reported experimental results indicate the feasibility of the surveyed methods for vision-based interfaces under different environmental conditions, and the databases and classifiers each method uses for fast, real-time processing are noted.

Keywords

segmentation, feature extraction, classifiers.

I. INTRODUCTION

Human hand movements have controlling functions (e.g., object grasping, object designing) and logically explainable functions (e.g., sign language, pointing). Most “human skills” belong to the controlling function. Since it is difficult to describe manipulative hand movements, teaching human skills to a system is a hard task.
Gesture recognition is important for developing an attractive alternative to prevalent human–computer interaction modalities. Many of the surveyed papers focus on the problem of recognizing dynamic hand gestures. They concentrate on single-palm gestures, which are sequences of distinct hand shapes and postures. A given hand shape can undergo motion and discrete changes; however, continuous deformations are not permitted in the earlier approaches. These gestures are distinguished on the basis of the hand shapes involved and the nature of the motion. The authors developed real-time recognition systems that can recognize such gestures in spite of individual variations and can detect the start and end of gesture sequences automatically.
Gesture generation may range from simple actions, such as using our hand to point at and move objects around, to more complex ones that express our feelings and allow us to communicate with others in an effective manner. To exploit the use of gestures in HCI, it is necessary to provide the means by which they can be interpreted by computers. The HCI interpretation of gestures requires that dynamic and/or static configurations of the human hand, arm, and even other parts of the human body be measurable by the machine.

II. RELATED RESEARCH

In [1], a novel hand-pose estimation method is proposed for vision-based interfaces. A voxel model of the hand is constructed from silhouette images obtained from a multi-viewpoint camera system; the method can be applied not only to skill-teaching systems but also to interfaces for virtual reality and 3-D design systems. In [2], an HMM-based gesture recognition system is developed that uses both the temporal and shape characteristics of the gesture. Single-handed dynamic gestures are considered; a gesture is composed of a sequence of epochs, and each epoch is characterized by the motion of a distinct hand shape. The method of [3] obtains a difference image by subtracting one image from the next in the sequence, measures the entropy, separates the hand region from the images, tracks the hand region, and recognizes hand gestures. Through entropy measurement, color information whose distribution is close to skin color is obtained for regions with large entropy values, and the hand region is extracted from the input images. In [4], a PCA-ICA based representation of hand articulation is proposed for tracking hand-finger gestures in an image sequence: the dimensionality of the hand-motion space is reduced by PCA, and ICA is then applied to extract local feature vectors. The most fundamental result of [5] is that forearm EMG signals can be used to build a high-accuracy classifier for predicting hand gestures. Top-end accuracies above 90% indicate that classification is a promising technique for controlling a prosthetic hand, and comparisons of ANN and RF showed them to be equivalent, indicating that either is a good candidate for future EMG classification studies. In [6], a computer-vision color-tracking algorithm is developed and applied to tracking the human hand. The system is based on three main stages: skin-color detection, hand tracking, and hand-gesture recognition; it handles 26 hand gestures representing the letters A to Z in American Sign Language (ASL). Reference [7] presents a real-time method for hand detection and gesture classification. Since hands are non-rigid body parts, it is difficult to segment and track their regions precisely; by using both depth and color properties, the authors reconstruct the gesture trajectory in real time. In [8], an overview of algorithms for hand-gesture segmentation is presented. A universal algorithm for segmenting hand-gesture images does not exist; on the contrary, most techniques are tailored to a particular application and may work only under certain hypotheses and assumptions. Many input technologies require physical touch, while others provide input without touch, based on other human characteristics such as speech or hand gestures. The advantage of hand-gesture input is that the user can interact with the application from a distance without traditional input devices such as a keyboard or mouse [9]. In [10], a gesture recognition system is developed that is shown to be robust for JSL gestures. The system is fully automatic, works in real time, and is fairly robust to background clutter; its advantage lies in its ease of use. Experiments on a single-hand database achieved recognition accuracy of up to 99%.
A simple and effective algorithm for hand detection is suggested in [11]. When an input image of the hand is detected, the algorithm articulates and sectors the hand portions and then operates on the fingers of the detected hand. The proposed model works well in simple conditions with an accuracy of 95%, but because of its weaker localization its efficiency decreases with complex backgrounds. The approach of [12] provides simple and fast localization of the hand gesture. It draws directly on the image content through a combination of motion and color cues: the color map is statistically generated from a generic skin-color model, and motion probabilities are obtained from simple frame differencing.

III. METHODOLOGY

A. Algorithm-1

The method obtains a difference image by subtracting one image from the next in the sequence, measures the entropy, separates the hand region from the images, tracks the hand region, and recognizes hand gestures. Through entropy measurement, color information whose distribution is close to skin color is obtained for regions with large entropy values, and the hand region is extracted from the input images [3]. The main steps are listed below, followed by a small illustrative sketch.
Step-1: Take the image sequence
Step-2: Compute the difference image of neighbouring frames
Step-3: For each sub-block, evaluate the PIM (picture information measure)
Step-4: Get the mean and variance of the PIM values
Step-5: Extract the hand region
Step-6: Extract the contour using the centroid and chain code
Step-7: Get the centroidal profile; if no gesture is recognized, go to Step-2
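A minimal sketch of Steps 2-5 is given below, assuming Python with OpenCV and NumPy, 16x16 sub-blocks, and the PIM taken as the block-histogram sum minus its largest bin; these parameter choices (and this reading of the PIM) are illustrative assumptions rather than the exact settings of [3].

import cv2
import numpy as np

def block_pim(block, bins=256):
    # Picture Information Measure of one grey-level block:
    # histogram sum minus the largest bin (assumed definition).
    hist = cv2.calcHist([block], [0], None, [bins], [0, 256]).ravel()
    return hist.sum() - hist.max()

def candidate_hand_mask(prev_frame, curr_frame, block=16, k=1.0):
    # Step 2: difference image of neighbouring frames (grey level).
    diff = cv2.absdiff(cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY))
    h, w = diff.shape
    pims = np.zeros((h // block, w // block), dtype=np.float32)
    for by in range(h // block):
        for bx in range(w // block):
            # Step 3: evaluate PIM for each sub-block.
            pims[by, bx] = block_pim(diff[by * block:(by + 1) * block,
                                          bx * block:(bx + 1) * block])
    # Step 4: mean and spread of the PIM values set the threshold.
    thresh = pims.mean() + k * pims.std()
    # Step 5: keep sub-blocks above the threshold as the candidate hand region.
    mask = (pims > thresh).astype(np.uint8) * 255
    return cv2.resize(mask, (w, h), interpolation=cv2.INTER_NEAREST)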

B. Algorithm-2

The experiment described in [5] collects and classifies sEMG signals associated with a set of hand gestures. The goals are (1) to show the feasibility of using classification of surface electromyogram (sEMG) signals collected from a human’s forearm muscles to predict the specific hand gesture associated with a set of signals, (2) to improve the classification accuracy, and (3) to simplify the classification model by reducing the feature set [5].
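As a hedged illustration of how such sEMG recordings might be turned into inputs for a classifier, the sketch below computes common time-domain features (mean absolute value, RMS, zero crossings, waveform length) per channel over fixed windows; this particular feature set and windowing are assumptions for illustration, not the exact features of [5].

import numpy as np

def emg_features(window):
    # window: (n_samples, n_channels) sEMG segment -> flat feature vector.
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)         # waveform length
    return np.concatenate([mav, rms, zc, wl])

def build_dataset(windows, labels):
    # Stack per-window features into (n_windows, n_features) with matching labels.
    X = np.vstack([emg_features(w) for w in windows])
    y = np.asarray(labels)
    return X, y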

C. Algorithm-3

The major task of BGS is to build an explicit model of the background. Segmentation is then performed to extract foreground objects by calculating the difference between the current frame and the background model. A good BGS algorithm should be robust to changing illumination conditions, able to ignore the movement of small background elements, and capable of incorporating new objects into the background model [8]. The main steps are listed below, followed by a short sketch.
Step-1: Input the video stream
Step-2: Pre-process the stream for background modeling
Step-3: Perform foreground detection
Step-4: Post-process the stream
Step-5: Obtain the foreground mask
Step-6: Perform background modeling
Step-7: Obtain the background model from the foreground detection
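A minimal pipeline in the spirit of Steps 1-7 is sketched below using OpenCV's Gaussian-mixture background subtractor (MOG2), which maintains the background model internally as frames are fed to it; the blur and morphological kernel sizes are illustrative assumptions.

import cv2

def foreground_masks(video_path):
    cap = cv2.VideoCapture(video_path)                              # Step 1: input video stream
    bgs = cv2.createBackgroundSubtractorMOG2(detectShadows=False)   # Steps 6-7: background model
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        blurred = cv2.GaussianBlur(frame, (5, 5), 0)                # Step 2: pre-processing
        mask = bgs.apply(blurred)                                   # Step 3: foreground detection
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)       # Step 4: post-processing
        yield mask                                                  # Step 5: foreground mask
    cap.release()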

D. Algorithm-4

The procedure starts with the acquisition phase. Since standard input devices such as the keyboard and position/pointing devices have been declared non grata for this domain of applications, the endeavor is oriented towards possible alternatives: user-friendly and smart interfaces inspired by the natural behavior of users in real-world scenarios [9]. While choosing the image-capturing device, the point of installation also has to be considered. The main steps are listed below, with a short sketch of the contour analysis after them.
Step-1: Take the captured image as an image sequence
Step-2: Perform background subtraction and hand detection; locate the hand position using a Haar cascade
Step-3: Perform hand tracking using CamShift and Lucas-Kanade optical flow
Step-4: Find and extract the biggest contour (by area) and find its convex hull
Step-5: Count the number of convexity defects and find the orientation of the bounding rectangle
Step-6: Model the gesture and interpret it as a meaningful command
Step-7: Perform the appropriate action
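The sketch below illustrates Steps 4 and 5 on a binary hand mask (for example, the output of the background-subtraction stage), assuming OpenCV 4.x; the defect-depth threshold is an illustrative assumption.

import cv2

def analyse_hand(mask, depth_thresh=10000):
    # Step 4: find the biggest contour by area and its convex hull.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0, None
    hand = max(contours, key=cv2.contourArea)
    hull = cv2.convexHull(hand, returnPoints=False)       # hull as point indices
    defects = cv2.convexityDefects(hand, hull)
    # Step 5: count deep convexity defects (valleys between fingers) and
    # take the bounding rectangle as a rough orientation cue.
    count = 0
    if defects is not None:
        for start, end, far, depth in defects[:, 0]:
            if depth > depth_thresh:                      # depth is fixed-point (1/256 px)
                count += 1
    bounding_rect = cv2.boundingRect(hand)                # (x, y, w, h)
    return count, bounding_rect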

IV. RESULTS AND DISCUSSION

In the first algorithm, the experiment was performed for six kinds of hand posture.
The experimental results for the six kinds of hand gesture show a recognition rate of more than 95% per person and 90-100% per gesture at 5 frames/s using the entropy analysis method [3].
The second algorithm uses different classifiers: Artificial Neural Network (ANN), Random Forest (RF), One Nearest-Neighbor (1NN), Decision Tree with Boosting (DT/B), Support Vector Machine (SVM), and Decision Tree (DT).
For detecting hand gestures from EMG with these classifiers, ANN-6 and RF-30 were the superior classifiers when using five channel features, and ANN-6 proved effective even when using as few as two features. ANOVA comparisons of ANN and RF showed them to be equivalent, indicating that either is a good candidate for future EMG classification studies [5].
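As a hedged sketch of such a comparison, the snippet below cross-validates scikit-learn stand-ins for the six classifiers (ANN-6 as an MLP with six hidden units, RF-30 as a 30-tree forest, 1NN, boosted decision trees, an RBF SVM, and a plain decision tree) on a feature matrix such as the one built earlier; the hyper-parameters are illustrative assumptions, not the settings of [5].

from sklearn.ensemble import AdaBoostClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

CLASSIFIERS = {
    "ANN-6": MLPClassifier(hidden_layer_sizes=(6,), max_iter=2000),
    "RF-30": RandomForestClassifier(n_estimators=30),
    "1NN": KNeighborsClassifier(n_neighbors=1),
    "DT/B": AdaBoostClassifier(n_estimators=50),   # boosted decision stumps
    "SVM": SVC(kernel="rbf"),
    "DT": DecisionTreeClassifier(),
}

def compare_classifiers(X, y, folds=5):
    # Report cross-validated accuracy for each candidate classifier.
    for name, clf in CLASSIFIERS.items():
        scores = cross_val_score(clf, X, y, cv=folds)
        print(f"{name:6s} accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")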
In the third algorithm, different background-subtraction techniques are compared, their advantages and disadvantages are discussed, and color-based segmentation analysis is also covered. Among the background-subtraction techniques discussed, Gaussian modeling is used for robust classification of objects and gives good results compared with the other two types, the histogram of oriented gradients handles slight changes in shape, and the force-field method handles occlusions [8].
Among the color-based segmentation approaches, the clustering technique gives an unsupervised classification, adaptive clustering imposes spatial constraints, and the histogram-threshold technique needs no prior information about the image and gives a fast approach.
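A hedged sketch of a threshold-based color segmentation pass is given below: fixed skin-tone bounds in the YCrCb space followed by a morphological opening. The specific bounds are commonly used illustrative values, not figures taken from [8].

import cv2
import numpy as np

SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)     # Y, Cr, Cb lower bounds (assumed)
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)  # Y, Cr, Cb upper bounds (assumed)

def skin_mask(frame_bgr):
    # Return a binary mask of likely skin pixels without any training step.
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)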
For gaming applications, the fourth algorithm is best suited in terms of time response. Once the hand has been detected, the application tracks the different gestures performed by the user's hand and generates a contour around it. Four hand gestures are defined within the application for interaction with the virtual game, each with an assigned command (function) that controls the application and, in turn, the virtual game [9].

V. CONCLUSION

In this paper, different methodologies, segmentation techniques, feature-extraction techniques, and classifiers for the recognition and implementation of hand posture and gesture modeling have been discussed and compared. Among the classifiers compared, the ANN classifier has the best recognition rate and accuracy, but the corresponding work does not discuss the time response of the system. To obtain a good time response, a multi-class SVM classifier can be used. Such an interface enables human users to control smart environments with hand gestures. The surveyed research classifies simple hand postures as well as hand gestures consisting of any combination of the predefined hand postures.

References

  1. Etsuko Ueda, Yoshio Matsumoto, Masakazu Imai, and Tsukasa Ogasawara, “A Hand-Pose Estimation for Vision-Based Human Interfaces,” IEEE Transactions on Industrial Electronics, Vol. 50, No. 4, August 2003.
  2. Aditya Ramamoorthy, Namrata Vaswani, Santanu Chaudhury, and Subhashis Banerjee, “Recognition of dynamic hand gestures,” Pattern Recognition, Vol. 36, pp. 2069-2081, 2003.
  3. Jong-Shill Lee, Young-Joo Lee, Eung-Hyuk Lee, and Seung-Hong Hong, “Hand region extraction and gesture recognition from video stream with complex background through entropy analysis,” Proceedings of the 26th Annual International Conference of the IEEE EMBS, San Francisco, CA, USA, September 1-5, 2004.
  4. Makoto Kato, Yen-Wei Chen, and Gang Xu, “Articulated Hand Tracking by PCA-ICA Approach,” Proceedings of the 7th International Conference on Automatic Face and Gesture Recognition (FGR’06), IEEE, 2006.
  5. Gene Shuman, “Using Forearm Electromyograms to Classify Hand Gestures,” 2009 IEEE International Conference on Bioinformatics and Biomedicine.
  6. Sulochana M. Nadageri, S. D. Swarkar, and A. D. Gawande, “Hand Gesture Recognition Using CAMSHIFT Algorithm,” Third International Conference on Emerging Trends in Engineering and Technology.
  7. Sung-il Kang, Annah Roh, and Hyunki Hong, “Using Depth and Skin Color for Hand Gesture Classification,” 2011 IEEE International Conference on Consumer Electronics (ICCE).
  8. Rohit Kumar Gupta, “A Comparative Analysis of Segmentation Algorithms for Hand Gesture Recognition,” 2011 Third International Conference on Computational Intelligence, Communication Systems and Networks.
  9. Siddharth S. Rautaray and Anupam Agrawal, “Interaction with Virtual Game through Hand Gesture Recognition,” 2011 International Conference on Multimedia, Signal Processing and Communication Technologies.
  10. Nguyen Dang Binh and Toshiaki Ejima, “Real-Time Hand Gesture Recognition Using Pseudo 3-D Hidden Markov Model,” Proceedings of the 5th IEEE International Conference on Cognitive Informatics (ICCI’06), IEEE, 2006.
  11. M. Ali Qureshi, Abdul Aziz, Muhammad Ammar Saeed, and Muhammad Hayat, “Implementation of an Efficient Algorithm for Human Hand Gesture Identification,” IEEE, 2011.
  12. Annamária R. Várkonyi-Kóczy and Balázs Tusor, “Human–Computer Interaction for Smart Environment Applications Using Fuzzy Hand Posture and Gesture Models,” IEEE Transactions on Instrumentation and Measurement, Vol. 60, No. 5, May 2011.
  13. Lim Wei Howe, Farrah Wong, and Ali Chekima, “Comparison of Hand Segmentation Methodologies for Hand Gesture Recognition,” IEEE, 2008.
  14. Predeep Kumar B. P., “Dynamic Hand Gesture Recognition,” IFRSA International Journal of Graphics and Image Processing (IJGIP), ISSN 2249-5452, Vol. 2, Issue 1, April 2012.
  15. Predeep Kumar B. P., “Design and Development of HCI System Based on Gesture Recognition Using SVM,” IFRSA International Journal of Graphics and Image Processing (IJGIP), ISSN 2249-5452, Vol. 2, Issue 2, July 2012.
  16. Predeep Kumar B. P., “Advanced Video Compression Using H.264,” International Journal of Emerging Technology and Advanced Engineering (IJETAE), ISSN 2250-2459, Vol. 3, Issue 1, January 2013.
  17. Predeep Kumar B. P., “Dynamic Hand Gesture Using CBIR,” IAEME International Journal of Computer Engineering and Technology (IJCET), Vol. 4, Issue 3, pp. 340-342, May-June 2013.
  18. Predeep Kumar B. P., “Design and Development of HCI Using Gesture Recognition,” International Conference on Emerging Innovative Technology for a Sustainable World (ICEITSW-2013), Oklahoma State University, 2013.
  19. Predeep Kumar B. P., “Design and Development of Human Computer Interface Using SVM,” International Conference on Emerging Trends in Engineering (ICETE-12), 15-16 May 2012.