ISSN Online: 2320-9801 | Print: 2320-9798


VISUAL FINGER INPUT SENSING ROBOT MOTION

Mr. Vaibhav Shersande1, Ms. Samrin Shaikh2, Mr. Mohsin Kabli3, Mr. Swapnil Kale4, Mrs. Ranjana Kedar5
  1,2,3,4. Student, Dept. of Computer Engineering, KJ College of Engineering & Management Research, Pune, MH, India
  5. Professor, Dept. of Computer Engineering, KJ College of Engineering & Management Research, Pune, MH, India

Published in the International Journal of Innovative Research in Computer and Communication Engineering.

Abstract

The purpose of this paper is to control the motion of a robot using only the fingers of the hand. The robot can be made to perform hard-core tasks that are impossible or too difficult for human beings, such as handling boilers in mechanical industries or performing operations in outer space, guided purely by visual recognition of the fingers. The robot can also be made to act as an assistant, providing comfort to the operator. Large-scale implementation of this approach could lead to the development of an independent humanoid capable of performing operations interactively with respect to the surrounding vision. The proposed system is completely wireless and includes user authentication.

Keywords

finger interaction, mechanical robot, space robot, motion control.

INTRODUCTION

Robots have become an indispensable part of the modern world, with applications in numerous fields. The most important aspect of a robot is that it should be user friendly: it should perform tasks in response to visual inputs given by simple finger movements, so that general users can operate it in day-to-day life without back-end knowledge [1-5]. To widen the scope of intelligent robotic systems, large-scale research is being conducted worldwide on user-friendly robotic motion control. The world is progressing at a faster pace, and developing technology brings with it automation and wireless control of objects. Progress in vision technology has made it easier to provide natural visual inputs that a robot can accept and act on [6]. Various algorithms have been proposed to improve hand gesture recognition [7]. This paper proposes to control the motion of a robot through interaction with the fingers.

EXISTING SYSTEM

Existing robotic systems are operated manually. Most are wired systems, which brings a number of drawbacks such as a limited operational area, higher maintenance, and increased system complexity. Many systems are also sensor-based, which further raises both cost and maintenance requirements.

PROPOSED SYSTEM

Besides verbal communication, motions of various parts of the body are widely used for human communication, among which hand gestures are the most common. The working of the robot is explained below.
The proposed method consists of two sections:
A. Operator Section:
In this section the main focus is on recognition of the finger. Input from the user is captured through a web camera connected to the operator's personal computer. The captured image is compared with the system's database and processed using image processing techniques [8-9]. A micro-controller is connected to the operator's personal computer and to the RF transmitter. Keys are sent to the controller through serial communication between the operator's personal computer and the controller, and the code dumped in the micro-controller, written in Embedded C, forwards the appropriate key to the RF transmitter. The RF module enables wireless transmission and reception of data between the two sections.
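The PC-side serial link described above can be sketched as follows. This is an illustrative assumption, not the paper's implementation: the port name, baud rate, key map, and function names are all stand-ins, and the pyserial package is assumed on the PC side.

```python
# Hypothetical sketch of the operator-side serial link: a recognized
# gesture is mapped to a one-byte key and written to the serial port the
# micro-controller listens on. Key values and port settings are assumptions.
KEY_MAP = {"forward": b"1", "backward": b"2", "left": b"3", "right": b"4"}

def key_for_gesture(gesture):
    # b"0" acts as a stop/unrecognized key in this sketch.
    return KEY_MAP.get(gesture, b"0")

def send_key(port_name, gesture):
    import serial  # pyserial; imported here so the key map works without hardware
    with serial.Serial(port_name, baudrate=9600, timeout=1) as link:
        link.write(key_for_gesture(gesture))

# Example (requires hardware): send_key("/dev/ttyUSB0", "forward")
```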
There are a number of techniques available for the recognition of fingers. The following image-processing algorithm is used in this section:
1. Capture the background image.
2. Capture the Image with object (finger).
3. Convert both the images into grayscale.
4. Subtract both the images.
5. Convert the image obtained into black and white image.
6. Initialize threshold = 25.
7. For each pixel of the subtracted image:
If pixel > threshold
Convert pixel to white, i.e. 1
Else
Convert pixel to black, i.e. 0
8. Open the database for images.
9. For each image in the database:
Find the correlation factor of the final binary image with that image.
10. Find the maximum value from the set of correlation factors and the position of the corresponding image.
11. Based on the position and the maximum value, send a character to the controller.
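The steps above can be sketched in Python with NumPy. This is a minimal illustration, not the paper's implementation: the array shapes, the grayscale weights, and the database of reference images are assumptions, and a real system would capture frames from the webcam.

```python
# Minimal sketch of the recognition pipeline: background subtraction,
# thresholding, and database matching by 2-D correlation.
import numpy as np

def to_grayscale(rgb):
    # Step 3: luminance-weighted grayscale conversion (assumed weights).
    return rgb @ np.array([0.299, 0.587, 0.114])

def segment_finger(background, frame, threshold=25):
    # Steps 4-7: subtract the background and keep pixels that changed by
    # more than the threshold (1 = white = finger region, 0 = black).
    diff = np.abs(to_grayscale(frame) - to_grayscale(background))
    return (diff > threshold).astype(np.uint8)

def corr2(a, b):
    # Step 9: 2-D correlation coefficient of two equally sized images.
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def best_match(binary, database):
    # Steps 10-11: position and score of the best-matching database image.
    scores = [corr2(binary, ref) for ref in database]
    pos = int(np.argmax(scores))
    return pos, scores[pos]
```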

B. Robot section:

The robot section mainly handles the control of the robotic motion. The transmitted character is received by the RF receiver. This section consists of a micro-controller connected to the receiver and to the robot. According to the code dumped in the micro-controller, written in Embedded C, the controller processes the received keys and carries out the appropriate operation.

SYSTEM ARCHITECTURE

The overall system architecture is divided into two sections: the user section and the robot section. The robot is a moving object connected to the control section via RF communication. The direction of the robot is controlled by the movement of the finger shown in front of the camera. The GUI can be created in any of the available languages, such as MATLAB or Java. For each finger movement in front of the camera, a unique coded signal is sent through the RF transmitter; the signal is then processed and the robot moves accordingly.

MATHEMATICAL MODEL

1. Subtraction of Background Image and Image with Object:
X -> array of pixels of the background image
Y -> array of pixels of the image with the object
OPERATOR: - (subtraction)
EXPRESSION:
Z = IMSUBTRACT(X, Y)
DESCRIPTION:
Subtract each element of array Y from the corresponding element of array X and return the difference in output array Z.
Equivalent: Z(i, j) = X(i, j) - Y(i, j)
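A minimal NumPy sketch of this subtraction follows. One assumption is worth noting: for unsigned 8-bit images, MATLAB's IMSUBTRACT saturates negative differences at 0 rather than wrapping around, and the paper does not state the image class.

```python
# Sketch of IMSUBTRACT for uint8 images: element-wise difference,
# saturating at 0 (MATLAB-style unsigned integer arithmetic).
import numpy as np

def imsubtract_u8(x, y):
    # Widen to a signed type, subtract, then clip back into uint8 range.
    diff = x.astype(np.int16) - y.astype(np.int16)
    return np.clip(diff, 0, 255).astype(np.uint8)
```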
2. Converting the Image into Black and White:
Initialize threshold = 25 (approximately).
Out -> matrix of pixels, initialized to 0
OPERATOR:
Comparison (> and <=)
EXPRESSION:
If matrix(i, j) > 25
    Out(i, j) = 255
Else
    Out(i, j) = 0
DESCRIPTION:
matrix is the image obtained after subtraction. Each pixel value of matrix is compared with the threshold and the result is stored in Out.
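The thresholding expression above can be sketched with NumPy as follows: pixels of the subtracted image greater than the threshold become white (255), all others black (0).

```python
# Sketch of the black-and-white conversion step.
import numpy as np

def to_black_and_white(matrix, threshold=25):
    out = np.zeros_like(matrix, dtype=np.uint8)  # default: black (0)
    out[matrix > threshold] = 255                # above threshold: white
    return out
```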
3. Comparing the Image with Each Image in the Database:
A and B are matrices or vectors of the same size.
R -> the correlation output
EXPRESSION:
R = CORR2(A, B)
DESCRIPTION:
Compute the 2-D correlation coefficient of A and B over all indices i and j:
R = sum_ij (A(i,j) - mean(A)) (B(i,j) - mean(B)) / sqrt( sum_ij (A(i,j) - mean(A))^2 * sum_ij (B(i,j) - mean(B))^2 )
4. Finding the Maximum Correlation Factor:
A -> vector of correlation factors
m -> the maximum correlation factor
Pos -> position of the corresponding image in the database
EXPRESSION:
[m, Pos] = MAX(A)
DESCRIPTION:
The correlation factors of the captured image with every database image are compared; the maximum factor is obtained together with the position of the corresponding image in the database.
5. Sending Characters to the Controller:
OPERATOR:
Comparison (>, <=)
Logical (&&)
EXPRESSION:
If pos > 0 && pos <= 10
    Transmit '1'
If pos > 10 && pos <= 20
    Transmit '2'
If pos > 20 && pos <= 30
    Transmit '3'
If pos > 30 && pos <= 40
    Transmit '4'
DESCRIPTION:
The obtained position is compared against these ranges and the corresponding character is transmitted.
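The position-to-character mapping can be sketched as below. The fourth range (31-40) is an assumption following the pattern of the first three ranges; None stands for a position outside every gesture group.

```python
# Sketch of mapping the best-match database position to the character
# transmitted to the controller (ten reference images per gesture assumed).
def char_for_position(pos):
    if 0 < pos <= 10:
        return '1'
    if 10 < pos <= 20:
        return '2'
    if 20 < pos <= 30:
        return '3'
    if 30 < pos <= 40:
        return '4'
    return None  # no matching gesture group
```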

RESULT

At the operator's side, the finger is shown in front of the web camera at the user's personal computer, where it is processed as described above. After processing, the robot moves in the specified direction or performs the appropriate operation.

CONCLUSION

The proposed system is capable of sensing visual finger input and translating it into robot motion. Based on finger-robot interaction, it can create and control robot motions for a wide variety of applications. The method can lead to a user-friendly robot capable of operating in fields ranging from home appliances to industry. Supported applications include industrial automation and home automation; the system is also suitable for people with speech and hearing impairments, thus removing the language barrier.

References

  1. http://www.cyberbotics.com/ (Cyberbotics Webots™)
  2. http://www.microsoft.com/korea/robotics/studio.mspx
  3. S. Y. Nam et al., “A study on design and implementation of the intelligent robot simulator which is connected to an URC system,” IE, vol. 44, no. 4, pp. 157-164, 2007.
  4. http://www.microsoft.com/downloads/ (Microsoft Speech SDK 5.1)
  5. K. H. Seok and Y. S. Kim, “A new robot motion authoring method using HTM,” International Conference on Control, Automation and Systems, pp. 2058-2061, Oct. 2008.
  6. H. Kim, M. Y. Ock, T. W. Kang, and S. W. Kim, “Development of Finger Pad by Webcam,” The Korea Society of Marine Engineering, pp. 397-398, June 2008.
  7. K. H. Lee, J. H. Choi, “Hand Gesture Sequence Recognition using Morphological Chain Code Edge Vector,” Korea Society of Computer Information, vol. 9, pp. 85-91.
  8. A. Licsar, T. Sziranyi, “User-adaptive hand gesture recognition system with interactive training”, Image and Vision Computing 23, pp. 1102-1114, 2005.
  9. K. W. Kim, W. J. Lee, C. H. Jeon, “A Hand Gesture Recognition Scheme using WebCAM,” The Institute of Electronics Engineers of Korea, pp. 619- 620, June. 2008.
  10. M. W. Spong, S. Hutchinson, M. Vidyasagar, “ROBOT MODELING AND CONTROL”, WILEY, 2006.