Keywords
Human-computer interaction, microcontroller, webcam, MATLAB, RF module.
I. INTRODUCTION
Nowadays, most human-computer interaction (HCI) is based on mechanical devices such as keyboards, mice, joysticks or gamepads. In recent years there has been growing interest in a class of methods based on computational vision, owing to its ability to recognize human gestures in a natural way. Such methods take as input images acquired from a camera or from a stereo pair of cameras, and their main goal is to measure the hand configuration at each time instant. To simplify this task, many gesture recognition applications resort to uniquely coloured gloves or to markers on the hands or fingers. In addition, a controlled background makes it possible to localize the hand efficiently, even in real time. These two conditions, however, impose restrictions on the user and on the interface setup. We have specifically avoided solutions that require coloured gloves, markers or a controlled background because of the initial requirements of our application: it must work for different people, without any attachment on them, and against unpredictable backgrounds.
Our application uses images from a low-cost web camera placed in front of the work area, where the recognized gestures act as the input for particular robotic arm motions. The webcam is connected to a computer or laptop for the human-machine interface; the computer runs the MATLAB 7 tool under Windows XP. The webcam feeds the captured frames to the computer, and MATLAB recognizes the intended gesture by comparing it against stored gesture values and generates the respective output. This output is transmitted wirelessly through an RF module. The receiver section accepts the transmitted signal and passes it to an AVR microcontroller, which checks the received values. The output of the microcontroller drives the motors mounted in the robotic arm, producing the respective motion of the arm.
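As a concrete illustration of the acquisition stage, the following is a minimal MATLAB sketch for grabbing a frame from the webcam. The Image Acquisition Toolbox is assumed, and the 'winvideo' adaptor and device ID 1 are host-specific assumptions that may differ per machine.

```matlab
% Minimal sketch: frame acquisition from a USB webcam.
% Assumes the Image Acquisition Toolbox; 'winvideo' and device ID 1
% are host-specific assumptions.
vid = videoinput('winvideo', 1);        % open the first webcam
set(vid, 'ReturnedColorSpace', 'rgb');  % ask for RGB frames
frame = getsnapshot(vid);               % capture a single frame
imshow(frame);                          % inspect the captured frame
delete(vid);                            % release the device
```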
In this paper we propose a real-time, non-invasive hand tracking and gesture recognition system and divide our method into three steps. The first step is hand segmentation, where the image region that contains the hand has to be located. Shape cues could be used for this purpose, but hand shape varies greatly during natural motion; we therefore choose skin colour as the hand feature, since it is a distinctive cue of hands and is invariant to scale and rotation. The next step is to track the position and orientation of the hand, to prevent errors from the segmentation phase; we use pixel-based tracking for the temporal update of the hand state. In the last step we use the estimated hand state to extract several hand features that define a deterministic gesture recognition process. Finally, we present the system's performance evaluation results, which show that our method works well in unconstrained environments, such as industrial settings, and for several users.
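To make the segmentation and tracking steps concrete, the sketch below thresholds skin colour in YCbCr space and extracts the centroid of the largest blob as the tracked hand position. The Cb/Cr ranges are commonly used literature values, not the paper's own calibration, and the Image Processing Toolbox and file name are assumptions.

```matlab
% Minimal sketch: skin-colour segmentation in YCbCr and centroid
% extraction for pixel-based tracking. Cb/Cr ranges are common
% literature values (assumed, not the paper's calibration).
img  = imread('hand_frame.jpg');        % hypothetical input frame
ycc  = rgb2ycbcr(img);
cb   = ycc(:, :, 2);
cr   = ycc(:, :, 3);
mask = cb >= 77 & cb <= 127 & cr >= 133 & cr <= 173;  % skin pixels
mask = bwareaopen(mask, 500);           % remove small noise blobs
mask = imfill(mask, 'holes');           % fill gaps inside the hand
stats = regionprops(mask, 'Centroid', 'Area');
[~, k]   = max([stats.Area]);           % keep the largest region
centroid = stats(k).Centroid;           % hand position for tracking
```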
II. RELATED WORK
Today, a number of robotic arms are used in robotics research, many with unique features and design criteria. In this section, a brief overview of some recent, widely used and influential robotic arms is given. In the robotics field, several research efforts have been directed towards recognizing human hand gestures.
The following are a few popular systems:
A. Vision-based Gesture Recognition [3] - This recognition system was developed in the field of service robotics, where the researchers designed a robot to perform cleaning tasks. They built a gesture-based interface to control a mobile robot equipped with a manipulator. The interface uses a camera to track a person and recognize different gestures involving arm motion. A fast, adaptive tracking algorithm enables the robot to track and follow a person reliably through an office environment with changing lighting conditions. Two gesture recognition methods, a template-based approach and a neural-network-based approach, were compared and combined with the Viterbi algorithm for the recognition of gestures defined through arm motion. The result is an interactive clean-up task in which the user guides the robot to the specific locations that need to be cleaned and instructs it to pick up available trash.
B. Motion Capture Sensor Recognition [4] - This recognition technique made it possible to implement an accelerometer-based system to communicate with an industrial robotic arm wirelessly. In this particular project the robotic arm is powered by an ARM7-based LPC1768 core. A MEMS three-dimensional accelerometer sensor captures the gestures of the human arm and produces three analog output voltages, one per axis, and two flex sensors are used to control the gripper movement.
C. Accelerometer-based Gesture Recognition - This gesture recognition methodology has become increasingly popular within a very short span of time. The low-to-moderate cost and relatively small size of accelerometers make them an effective tool for detecting and recognizing human body gestures. Several studies have been conducted on the recognition of gestures from acceleration data using Artificial Neural Networks (ANNs).
III. PROPOSED SYSTEM

The figure shows the proposed system framework, in which a webcam is connected to a laptop/computer that runs the gesture recognition system. A pair of wireless communication modules is connected to the gesture recognition system and to the robot controller respectively. The webcam is used to obtain image data of the various hand movements. The image or video acquired as input may be noisy, or performance may degrade if the surroundings are mistaken for the hand region. The acquired data is therefore enhanced and processed further to make it fit for matching against the gestures stored in the database. The data is then processed to recognize the gesture, and each gesture corresponds to a different robot control command. A wireless module sends these robot control commands to the robot controller, and the robotic arm performs actions according to the different human hand gestures; in this way human-robot interaction is achieved. The gesture recognition system is developed with the MATLAB tool.
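A minimal sketch of this command dispatch is given below: a recognized gesture label is mapped to a one-byte robot control command and written to a serial-attached RF transmitter. The gesture names, command codes, COM port and baud rate are all illustrative assumptions; the paper does not specify them.

```matlab
% Minimal sketch: mapping a recognized gesture to a one-byte command
% and sending it to a serial-attached RF transmitter. Gesture names,
% codes, COM port and baud rate are illustrative assumptions.
codes = containers.Map( ...
    {'grip_open', 'grip_close', 'arm_left', 'arm_right'}, ...
    {uint8(1), uint8(2), uint8(3), uint8(4)});
gesture = 'grip_open';                  % output of the recognizer
s = serial('COM3', 'BaudRate', 9600);   % assumed RF link settings
fopen(s);
fwrite(s, codes(gesture), 'uint8');     % one command byte over RF
fclose(s);
delete(s);
```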
The Robotic Arm Unit comprises a microcontroller (PIC16F877A) that takes decisions depending on the received code. The different microcontroller interfaces implemented in the Robotic Arm Unit are shown in figure B. The PIC16F877A works on 5 V, while the RF module works on 3.3 V and operates in the 2.4 GHz frequency range. DC motors are used to physically drive the application as per the received code. The DC motors work on 12 V and are driven through an L293D motor driver, which is capable of driving two DC motors at a time. To protect against the back EMF generated by a DC motor when its direction of rotation changes, the driver has internal protection circuitry; we have additionally provided back-EMF protection by connecting a four-diode configuration across each DC motor. An LCD is used in the project to visualize the output of the application. We have used a 16x2 LCD, which has 16 columns and 2 rows. The LCD can also be used to check the output of the different modules interfaced with the microcontroller; it thus plays a vital role in observing the output and debugging the system module-wise in case of a failure.
IV. ALGORITHM
Step 1: The webcam captures the hand motion performed by the user.
Step 2: The PC/laptop compares the captured gesture with the stored database through MATLAB.
Step 3: If a match is found, proceed to Step 4; otherwise, return to Step 2.
Step 4: The generated output is transmitted through the wireless RF module.
Step 5: The receiver accepts the transmitted signal and passes it on to the microcontroller.
Step 6: The microcontroller compares the received values with the stored data.
Step 7: If the values match, proceed to Step 8; otherwise, the command is ignored.
Step 8: The motors give the respective motion to the robotic arm.
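Steps 1-4 on the PC side could be organized as the loop sketched below. Here matchGesture is a hypothetical helper standing in for the MATLAB comparison of Step 2, and the serial settings are assumptions, as before.

```matlab
% Sketch of Steps 1-4 (PC side): capture, compare, branch, transmit.
% matchGesture is a hypothetical helper standing in for the Step 2
% database comparison; serial settings are assumptions.
vid = videoinput('winvideo', 1);        % Step 1: webcam source
s   = serial('COM3', 'BaudRate', 9600); % assumed RF link
fopen(s);
for i = 1:100                           % bounded loop for the sketch
    frame = getsnapshot(vid);           % Step 1: capture the gesture
    [matched, code] = matchGesture(frame);  % Step 2: compare
    if matched                          % Step 3: decision point
        fwrite(s, code, 'uint8');       % Step 4: transmit via RF
    end                                 % no match: capture again
end
fclose(s);
delete(s);
delete(vid);
```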
V. OBJECTIVE
The main objective of this project is to investigate the characteristics and performance of a robotic arm developed to mimic the human hand in manipulating objects, using a PIC-based wireless system.
The following are the additional objectives of the proposed system:
A. To fabricate robot hands that are capable of applying independent forces to a grasped object.
B. To produce a wireless artificial robotic hand that mimics the human hand in manipulating objects, and that contributes to solving the robot end-effector grasping problem and the difficulty of robot reprogramming.
C. To control the movement by using a glove integrated with the hand and teleoperated through an RF wireless module.
D. To design the control parts of the robot hand using a mid-range microcontroller from the PIC/AVR/ARM family as the controller.
VI. EXPERIMENTAL RESULT
In order to get reliable recognition, it is quite important that the features extracted from the training image are detectable even under changes in image scale, noise and illumination. Such points generally lie in high-contrast regions of the image, for example on object edges. Gesture recognition is initially performed by matching each keypoint independently to the database of keypoints extracted from training images. Many of these initial matches will be incorrect due to ambiguous features or features that arise from background clutter. Therefore, clusters of features that agree on an object and its pose are first identified, as these clusters have a much higher probability of being correct than individual feature matches. Each cluster is then checked by performing a detailed geometric fit to the model, and the result is used to accept or reject the interpretation.
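The matching-plus-geometric-verification pipeline described above can be sketched in MATLAB as follows. The paper cites Lowe's SIFT [1]; SURF features from the Computer Vision Toolbox are used here as a stand-in detector, and the file names and inlier threshold are assumptions.

```matlab
% Sketch: independent keypoint matching followed by a geometric fit.
% The paper cites Lowe's SIFT [1]; SURF is used here as a stand-in
% detector. File names and the inlier threshold are assumptions.
train = rgb2gray(imread('train_gesture.jpg'));  % database image
query = rgb2gray(imread('query_frame.jpg'));    % input image
p1 = detectSURFFeatures(train);
p2 = detectSURFFeatures(query);
[f1, v1] = extractFeatures(train, p1);
[f2, v2] = extractFeatures(query, p2);
pairs = matchFeatures(f1, f2);          % independent initial matches
m1 = v1(pairs(:, 1));
m2 = v2(pairs(:, 2));
% Geometric verification: keep only matches consistent with a single
% similarity transform, rejecting clutter-induced false matches.
[tform, in1] = estimateGeometricTransform(m1, m2, 'similarity');
accepted = in1.Count >= 10;             % assumed acceptance threshold
```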
Results from our implementation are shown in figures C and D. Noise handling is an essential part of our approach, since noise could result in inefficient or false matching; in this implementation we have therefore used parameters that help keep the feature matching robust to noise.
The input image representing the character O is a colour image. When it is applied as an input query image, it matches the scaled image of the character O present in the database. The MATLAB tool compares the input image with the database images and passes the appropriate output to the processing block.
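One simple way to realize this database comparison is a normalized correlation score against each stored template, as in the sketch below. The database file names, the 64x64 working size and the 0.8 acceptance threshold are illustrative assumptions.

```matlab
% Sketch: comparing a query image against stored database templates
% with a normalized 2-D correlation score. File names, the 64x64
% working size and the 0.8 threshold are illustrative assumptions.
db = {'O.jpg', 'L.jpg', 'V.jpg'};       % hypothetical database files
q  = imresize(rgb2gray(imread('query_frame.jpg')), [64 64]);
best = 0;  bestIdx = 0;
for k = 1:numel(db)
    t = imresize(rgb2gray(imread(db{k})), [64 64]);
    r = corr2(double(q), double(t));    % correlation coefficient
    if r > best
        best = r;  bestIdx = k;
    end
end
if best > 0.8                           % assumed acceptance threshold
    fprintf('Matched database image %s (score %.2f)\n', ...
        db{bestIdx}, best);
end
```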
Some of the transformations that we tested in our implementation are listed below, with a test sketch following the list:
Saturation - Even though colour was removed from the input query image, from the database image, or from both, the results remained correct.
Scale - We scaled down the source image as well as the destination image, and the recognition was still correct.
Rotation - We took pictures at a skewed angle and then performed the matching, and the results came out correct.
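These three robustness tests can be reproduced with standard image transforms, as sketched here. countMatches is a hypothetical wrapper around the feature-matching pipeline shown earlier, and the scale factor and rotation angle are arbitrary test values.

```matlab
% Sketch: re-running the matcher under the three tested transforms.
% countMatches is a hypothetical wrapper around the feature-matching
% pipeline above; the scale factor and angle are arbitrary test values.
src = imread('train_gesture.jpg');      % assumed database image
dst = imread('query_frame.jpg');        % assumed query image
grayDst  = rgb2gray(dst);               % saturation (colour) removed
smallDst = imresize(dst, 0.5);          % scaled-down version
rotDst   = imrotate(dst, 30);           % skewed/rotated version
fprintf('gray: %d, scaled: %d, rotated: %d matches\n', ...
    countMatches(src, grayDst), ...
    countMatches(src, smallDst), ...
    countMatches(src, rotDst));
```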
VII. APPLICATION AND FUTURE WORK
Controlling a robot in real time through hand gestures is a novel approach whose applications are myriad. The proliferation of service robots among domestic users and industries in the upcoming years will require such methods extensively. The approach has huge potential once it is further optimized (its time complexity is currently high) with the help of hardware having better specifications. The use of a more efficient wireless communication technique and a camera mounted on the robot unit would improve the performance of the system to a great extent and can be incorporated in future work.
VIII. CONCLUSION
A low-cost computer vision system that can be executed on a common PC equipped with a low-power USB webcam was one of the main objectives of our work, and it has been implemented successfully. We experimented with around 30 hand gesture images and achieved a high average precision; the best classification rate, 97%, was obtained under different lighting conditions. A drawback of this method is that the hand should be properly placed with respect to the webcam so that the entire hand region is captured; if the hand is not placed properly, the gesture is not recognized appropriately. Gestures in this method involve only one hand, which reduces the number of gestures that could otherwise be made using both hands.
References
- David G. Lowe, “Distinctive Image Features from Scale-Invariant Keypoints”, International Journal of Computer Vision, Vol. 60, No. 2, pp. 91-110, 2004.
- Mokhtar M. Hasan and Pramod K. Mishra, “Hand Gesture Modeling and Recognition Using Geometric Features: A Review”, Canadian Journal on Image Processing and Computer Vision, Vol. 3, No. 1, March 2012.
- S. Waldherr, R. Romero and S. Thrun, “A Gesture Based Interface for Human-Robot Interaction”, Autonomous Robots, Springer, Vol. 9, No. 2, pp. 151-173, 2000.
- K. Brahmani, K. S. Roy and Mahaboob Ali, “ARM 7 Based Robotic Arm Control by Electronic Gesture Recognition Unit Using MEMS”, International Journal of Engineering Trends and Technology, Vol. 4, No. 4, pp. 50-63, April 2013.
- J. Yang, W. Bang, E. Choi, S. Cho, J. Oh, J. Cho, S. Kim, E. Ki and D. Kim, “A 3D Hand-Drawn Gesture Input Device Using Fuzzy ARTMAP-Based Recognizer”, Journal of Systemics, Cybernetics and Informatics, Vol. 4, No. 3, pp. 1-7, 2006.
- R. W. Jasutkar and Shubhangi J. Moon, “A Real Time Hand Gesture Recognition Technique by Using Embedded Device”, International Journal of Advanced Engineering Sciences and Technologies, Vol. 2, No. 1, pp. 43-46, May 2005.
- Rafiqul Zaman Khan and Noor Adnan Ibraheem, “Survey on Gesture Recognition for Hand Image Postures”, International Journal of Computer and Information Science, Vol. 5, No. 3, pp. 110-121, 2012.
- G. R. S. Murthy and R. S. Jadon, “A Review of Vision Based Hand Gestures Recognition”, International Journal of Information Technology and Knowledge Management, Vol. 2, No. 2, pp. 405-410, 2009.