
Implementation of Real Time Hand Gesture Recognition

Manasa Srinivasa H S, Suresha H S
M.Tech Student, Department of ECE, Don Bosco Institute of Technology, Bangalore, Karnataka, India
Associate Professor, Department of ECE, Don Bosco Institute of Technology, Bangalore, Karnataka, India

Abstract

Gestures are among the most common forms of communication and play an important role in interaction, whether with humans or with machines. The main objective of this paper is to count, in real time, the number of fingers that are unfolded. The recognition process consists of three main steps: background subtraction using the codebook algorithm; calculation of the contour, convex hull and convexity defects; and counting of the fingers from the computed defect points. The background subtraction step produces a binary image in which the hand appears white and the background black. The system recognizes hand gestures in real time and is implemented on an Intel Atom processor using OpenCV.

Keywords

Codebook algorithm; background subtraction; foreground image; contour; convexity defects; convex hull

INTRODUCTION

Recognizing gestures in real time is the main objective of this paper. Gestures can be classified as online or offline: online gestures are direct-manipulation gestures processed as they are performed, whereas offline gestures are processed after the user interaction. Common kinds of gesture recognition include face gesture recognition, sign language recognition and hand gesture recognition.
Gesture recognition faces several challenges. One is latency: recognizing different gestures in image processing applications can be slow. Another is the lack of a common gesture language, since different people perform the same gesture differently. A third is robustness: lighting conditions and background noise make it difficult to recognize different gesture postures.

RELATED WORK

In paper [4], a real time hand gesture recognition system for unconstrained environments is introduced. Three phases are used to recognize hand gestures in real time: real time hand tracking, gesture learning, and gesture recognition using Hidden Markov Models. A Kalman filter together with blob analysis is used to track the hand in real time. The second phase learns the gestures and stores them, and the third phase recognizes the gesture shape based on the American Sign Language.
In paper [5], continuous gestures are recognized against stationary backgrounds. Four phases are introduced to recognize the continuous gestures: real time hand tracking and extraction to locate the hand and extract the hand region; feature extraction using Fourier descriptors; training of the hand gestures with a hidden Markov model; and finally recognition of the gesture.
In paper [3], vision based hand gesture interaction is discussed. A specific sign is required to trigger hand detection and tracking, after which the hand is segmented based on motion and colour cues.
In paper [7], a real time hand gesture recognition algorithm for dynamic environments is discussed for interaction with robots. Using a cascade of boosted classifiers, the algorithm detects hands based on hand positions, velocities and static gestures.

PROPOSED WORK

A. Block Diagram
The proposed block diagram consists of three stages, as shown in Figure 1.
[Figure 1: Block diagram of the proposed system]
As shown in the block diagram, the input is taken from a web camera in real time. The real time video is converted into a fixed number of frames, and these frames are the input to the codebook algorithm. The codebook algorithm converts each colour image, a three channel image, into a binary image, a single channel image. This background subtracted image shows the hand in white and the background in black. The binary image is then used to calculate the contour, convex hull and convexity defects, and finally the unfolded fingers are counted from the computed defect points.
B. Frames
As shown in Figure 1, the input from the web camera is real time video, which is divided into a number of frames. In this work the number of frames ranges from 30 to 300. The web camera used is a Tag web camera with a 16 megapixel sensor.
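A minimal sketch of this capture stage is shown below, assuming OpenCV's Python bindings (cv2); the camera index 0 and the upper limit of 300 frames are illustrative choices taken from the range given above.

import cv2

cap = cv2.VideoCapture(0)              # open the web camera (device index 0 assumed)
frames = []

while len(frames) < 300:               # upper end of the 30-300 frame range used here
    grabbed, frame = cap.read()        # read one BGR colour frame
    if not grabbed:
        break
    frames.append(frame)

cap.release()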
C. Codebook algorithm
Let M be a training sequence of N RGB vectors observed at a single pixel:
M = {m1, m2, m3, ..., mN}
The codebook B built for that pixel consists of X codewords:
B = {b1, b2, b3, ..., bX}
Each codeword bi, i = 1, 2, ..., X, has an RGB vector Ui = (Ri, Gi, Bi) and a six-tuple of auxiliary values
auxi = (Imin, Imax, fi, λi, mi, ni)
where
Imin, Imax are the minimum and maximum brightness of the pixels assigned to the codeword;
fi is the frequency with which the codeword has occurred;
λi is the maximum negative run-length (MNRL), the longest interval during the training period in which the codeword has not recurred;
mi, ni are the access times of the first and last occurrence of the codeword.
The algorithm construction is shown below.
[Codebook construction and background subtraction algorithm]
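The sketch below illustrates the per-pixel codebook idea in Python with NumPy. It is deliberately simplified: it models only the brightness of a grayscale pixel, whereas the construction above also keeps the RGB vector and applies a colour-distortion test; the thresholds EPS, ALPHA and BETA are illustrative assumptions rather than values from the paper, and the pruning of stale codewords by their MNRL λi is omitted for brevity.

import numpy as np

EPS   = 10     # brightness tolerance when matching a codeword (assumed value)
ALPHA = 0.7    # lower scaling of the learned brightness range (assumed value)
BETA  = 1.3    # upper scaling of the learned brightness range (assumed value)

def train_codebooks(gray_frames):
    # Build, for every pixel, a list of codewords [Imin, Imax, f, lam, m, n].
    h, w = gray_frames[0].shape
    books = [[[] for _ in range(w)] for _ in range(h)]
    for t, frame in enumerate(gray_frames, start=1):
        for y in range(h):
            for x in range(w):
                val = int(frame[y, x])
                for cw in books[y][x]:
                    if cw[0] - EPS <= val <= cw[1] + EPS:   # matches this codeword
                        cw[0] = min(cw[0], val)             # update Imin
                        cw[1] = max(cw[1], val)             # update Imax
                        cw[2] += 1                          # frequency f
                        cw[3] = max(cw[3], t - cw[5])       # MNRL lambda
                        cw[5] = t                           # last access time n
                        break
                else:                                       # no match: create a new codeword
                    books[y][x].append([val, val, 1, t - 1, t, t])
    return books

def segment(gray_frame, books):
    # Return a binary mask: 255 = foreground (hand), 0 = background.
    h, w = gray_frame.shape
    mask = np.full((h, w), 255, np.uint8)
    for y in range(h):
        for x in range(w):
            val = int(gray_frame[y, x])
            for cw in books[y][x]:
                if ALPHA * cw[0] - EPS <= val <= BETA * cw[1] + EPS:
                    mask[y, x] = 0                          # pixel explained by background model
                    break
    return mask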
D. Contour, Convex hull and Convexity defect calculation
The contour is the exact boundary of the hand image, that is, its boundary pixels, and it forms the basis of hand recognition. The convex hull is the smallest convex region enclosing the contour of the hand image. Convexity defect points are the points where the contour deviates most deeply from the convex hull, and they occur between adjacent fingers.
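A sketch of this stage using OpenCV's Python bindings is given below, assuming that mask is the binary background subtracted image from the codebook stage, that frame is the corresponding colour frame, and that the OpenCV 4.x return signatures apply.

import cv2

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
hand = max(contours, key=cv2.contourArea)                    # largest contour taken as the hand

hull_points  = cv2.convexHull(hand)                          # hull as points, for drawing
hull_indices = cv2.convexHull(hand, returnPoints=False)      # hull as contour indices, for defects
defects      = cv2.convexityDefects(hand, hull_indices)      # rows of (start, end, farthest, depth)

cv2.drawContours(frame, [hand], -1, (0, 255, 0), 2)          # hand contour
cv2.drawContours(frame, [hull_points], -1, (0, 255, 255), 2) # convex hull (yellow in BGR)
if defects is not None:
    for start_i, end_i, far_i, depth in defects[:, 0]:
        far = tuple(hand[far_i][0])                          # deepest point between two fingers
        cv2.circle(frame, far, 5, (255, 0, 0), -1)           # defect point (blue in BGR)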

EXPERIMENTAL RESULTS

The input video is taken from the web camera in real time and converted into frames, after which the steps shown in Figure 1 are carried out to count the number of fingers. The experimental results are shown below.
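The finger counts reported in the figures are derived from the defect points. One common counting rule, sketched below under the assumption that hand and defects come from the previous stage, keeps only the deep defects whose angle at the defect point is acute and treats each of them as a valley between two adjacent fingers; the angle and depth thresholds are illustrative assumptions, not values from the paper.

import numpy as np

def count_fingers(hand, defects, angle_limit=np.pi / 2, depth_limit=10000):
    # Count unfolded fingers from the convexity defects of the hand contour.
    if defects is None:
        return 0
    valleys = 0
    for start_i, end_i, far_i, depth in defects[:, 0]:
        start = hand[start_i][0].astype(float)
        end   = hand[end_i][0].astype(float)
        far   = hand[far_i][0].astype(float)
        a = np.linalg.norm(start - far)
        b = np.linalg.norm(end - far)
        c = np.linalg.norm(start - end)
        # angle at the defect point via the law of cosines; an acute, deep
        # defect is taken as the valley between two extended fingers
        angle = np.arccos(np.clip((a * a + b * b - c * c) / (2 * a * b), -1.0, 1.0))
        if angle < angle_limit and depth > depth_limit:
            valleys += 1
    # n valleys correspond to n + 1 extended fingers; no valleys means a fist
    # (distinguishing a fist from a single finger needs an extra check not shown here)
    return valleys + 1 if valleys > 0 else 0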
[Figure 2: Raw hand image and background subtracted image for count zero]
Figure 2 shows the raw hand image and the background subtracted image for a count of zero. The raw image is taken from the real time web camera. The background, including some extra objects, is subtracted from the raw image to give the background subtracted image shown.
[Figure 3: Raw hand image and background subtracted image for count one]
Figure 3 shows the raw hand image and the background subtracted image for a count of one.

[Figure 4: Raw hand image and background subtracted image for count two]

Figure 4 shows the raw hand image and the background subtracted image for a count of two, and Figure 5 shows them for a count of three.
[Figure 5: Raw hand image and background subtracted image for count three]
The blue colour in the raw hand images marks the convexity defect points. The raw image and background subtracted image for a count of three are shown in Figure 5.
[Figure 6: Raw hand image and background subtracted image for count four]
The yellow colour in all the figures shows the convex hull of the hand image. Figure 6 shows the raw hand image and the background subtracted image for a count of four, and Figure 7 shows them for a count of five. The red colour in the images indicates the count of the number of unfolded fingers.
[Figure 7: Raw hand image and background subtracted image for count five]

CONCLUSION

The core of the proposed work is to count the number of unfolded fingers using an Intel Atom processor and OpenCV. Depending on the requirements of the application, the results can be used, for example, to let physically challenged persons drive a wheelchair with different gestures, to operate robots, and in many other scenarios. Future work is to recognize full hand gestures in addition to counting the unfolded fingers.

References

  1. Massimo Piccardi, "Background subtraction techniques: a review," IEEE International Conference on Systems, Man and Cybernetics, 2004; Monuri Hemantha and M. V. Srikanth, "Simulation of real time hand gesture recognition for physically impaired," International Journal of Advanced Research in Computer and Communication Engineering, Vol. 2, Issue 11, November 2013.
  2. Siddharth S. Rautaray and Anupam Agrawal, "Real time hand gesture recognition system for dynamic applications," International Journal of UbiComp (IJU), Vol. 3, No. 1, January 2012.
  3. Yikai Fang, Kongqiao Wang, Jian Cheng and Hanqing Lu, "A real time hand gesture recognition method," 1-4244-1017-7/07, ©2007 IEEE.
  4. Nguyen Dang Binh, Enokida Shuichi and Toshiaki Ejima, "Real time hand tracking and recognition system," GVIP 05 Conference, 19-21 December 2005, CICC, Cairo, Egypt.
  5. Feng-Sheng Chen, Chih-Ming Fu and Chung-Lin Huang, "Hand gesture recognition using a real time tracking method and hidden Markov models," received 15 January 2001; revised 2 January 2003; accepted 20 March 2003.
  6. Cristina Manresa, Javier Varona, Ramon Mas and Francisco J. Perales, "Real time hand tracking and gesture recognition for human computer interaction."
  7. M. Correa, J. Ruiz-del-Solar, R. Verschae, J. Lee-Ferng and N. Castillo, "Real time hand gesture recognition for human robot interaction."