ISSN: 2320-9801 (Online), 2320-9798 (Print)


MEMS Accelerometer based Digital Pen Recognition using Neural Networks

A. Geetha Vinothini1, V. Vishnu Prasath2
  1. PG Student, Department of Applied Electronics, Sri Subramanya College of Engineering and Technology, Palani, Tamilnadu, India
  2. Assistant Professor, Department of ECE, Sri Subramanian College of Engineering and Technology, Palani, Tamilnadu, India

International Journal of Innovative Research in Computer and Communication Engineering

Abstract

Nowadays, the growth of miniaturization technologies in electronic circuits and components has greatly decreased the dimensions and weight of consumer electronic products, such as smart phones and handheld computers, and thus made them handier and more convenient. Owing to the rapid development of computer technology, human-computer interaction (HCI) techniques have become an indispensable component of our daily life. Recently, an attractive alternative has been proposed: a portable device embedded with inertial sensors that senses human activity and captures motion trajectory information from accelerations for recognizing gestures or handwriting. This paper presents an accelerometer-based digital pen for handwritten digit and gesture trajectory recognition. The digital pen consists of a tri-axial accelerometer, a microcontroller, and a Zigbee wireless transmission module for sensing and collecting the accelerations of handwriting and gesture trajectories. Users can use the pen to write digits or make hand gestures, and the accelerations of hand motions measured by the accelerometer are wirelessly transmitted to a computer for online trajectory recognition; the system thus serves as a human-computer interaction tool. By changing the orientation of the MEMS (micro-electro-mechanical systems) accelerometer, the corresponding alphabetical characters can be displayed on the PC.

Keywords

Accelerometer, MEMS, Gesture, Human-Computer Interaction.

I. INTRODUCTION

A. NEURAL NETWORKS

An Artificial Neural Network (ANN) is an information processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. The key element of this paradigm is the novel structure of the information processing system. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. ANNs, like people, learn by example. An ANN is configured for a specific application, such as pattern recognition or data classification, through a learning process. Learning in biological systems involves adjustments to the synaptic connections that exist between the neurons. This is true of ANNs as well. Neural network simulations appear to be a recent development. However, this field was established before the advent of computers, and has survived at least one major setback and several eras.
Many important advances have been boosted by the use of inexpensive computer emulations. Following an initial period of enthusiasm, the field survived a period of frustration and disrepute. During this period when funding and professional support was minimal, important advances were made by relatively few researchers.
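
To make the idea of learning by weight adjustment concrete, the following minimal C sketch trains a single artificial neuron (a perceptron with a step activation) on the logical AND function. It is a textbook illustration only, not the network employed later in this work.

#include <stdio.h>

int main(void)
{
    /* Training data for the logical AND function. */
    const float x[4][2] = {{0, 0}, {0, 1}, {1, 0}, {1, 1}};
    const float t[4]    = {0, 0, 0, 1};
    float w[2] = {0.0f, 0.0f}, b = 0.0f;   /* "synaptic" weights and bias */
    const float lr = 0.1f;                 /* learning rate               */

    for (int epoch = 0; epoch < 20; epoch++) {
        for (int i = 0; i < 4; i++) {
            float s = w[0] * x[i][0] + w[1] * x[i][1] + b; /* weighted sum */
            float y = (s > 0.0f) ? 1.0f : 0.0f;            /* step output  */
            float e = t[i] - y;                            /* error        */
            w[0] += lr * e * x[i][0];     /* adjust the connection weights */
            w[1] += lr * e * x[i][1];
            b    += lr * e;
        }
    }
    printf("learned weights: w0=%.2f w1=%.2f b=%.2f\n", w[0], w[1], b);
    return 0;
}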

B. GESTURE RECOGNITION

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Many approaches use cameras and computer vision algorithms to interpret sign language; however, the identification and recognition of posture, gait, proxemics, and human behaviours are also subjects of gesture recognition techniques.
Gesture recognition can be seen as a way for computers to begin to understand human body language, thus building a richer bridge between machines and humans than primitive text user interfaces or even GUIs (graphical user interfaces), which still limit the majority of input to the keyboard and mouse. Gesture recognition enables humans to communicate with machines (HMI) and interact naturally without any mechanical devices. Using gesture recognition, it is possible to point a finger at the computer screen so that the cursor moves accordingly, which could potentially make conventional input devices such as the mouse, keyboard, and even touch screen redundant. Gesture recognition can be conducted with techniques from computer vision and image processing, and the literature includes ongoing work on capturing gestures, or more general human pose and movement, with cameras connected to a computer.
Gesture recognition combined with pen computing not only reduces the hardware footprint of a system but also widens the range of physical-world objects that can serve as input devices in place of digital ones such as keyboards and mice; taken further, this idea could remove the need for conventional monitors and may eventually lead to holographic displays. The term gesture recognition has also been used more narrowly to refer to non-text-input handwriting symbols, such as inking on a graphics tablet, multi-touch gestures, and mouse gesture recognition.

C. MEMS ACCELEROMETER

Micro-Electro-Mechanical Systems (MEMS) is the integration of mechanical elements, sensors, actuators, and electronics on a common silicon substrate through micro-fabrication technology. While the electronics are fabricated using integrated circuit (IC) process sequences (e.g., CMOS, bipolar, or BiCMOS processes), the micromechanical components are fabricated using compatible "micromachining" processes that selectively etch away parts of the silicon wafer or add new structural layers to form the mechanical and electromechanical devices.
An example of a commonly used MEMS sensor is the accelerometer found in consumer electronic devices such as game controllers (Nintendo Wii), personal media players and cell phones (Apple iPhone, Nokia mobile phones, HTC PDAs), and a number of digital cameras and other "smart" devices. MEMS accelerometer sensors are mainly designed on the principle of differential capacitance.
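
As a small illustration of how such a sensor's output becomes a physical quantity, the C sketch below converts a raw 10-bit ADC reading from a ratiometric analog MEMS accelerometer into acceleration in g. The 3.3 V supply, mid-supply zero-g offset, and 300 mV/g sensitivity are assumed values typical of an ADXL335-class part, not figures taken from this work.

#include <stdio.h>

int main(void)
{
    const float vref        = 3.3f;   /* assumed ADC reference / sensor supply (V) */
    const float zero_g_volt = 1.65f;  /* assumed output at 0 g, nominally Vs/2 (V) */
    const float sens        = 0.300f; /* assumed sensitivity (V per g)             */

    unsigned raw = 612;                           /* example 10-bit ADC count      */
    float volts  = (float)raw * vref / 1023.0f;   /* count -> volts                */
    float accel  = (volts - zero_g_volt) / sens;  /* volts -> acceleration in g    */

    printf("raw=%u -> %.3f V -> %.2f g\n", raw, volts, accel);
    return 0;
}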

II. RELATED WORKS

Due to the tremendous progress in pattern recognition technology, handwriting-based human-computer interaction (HCI) has become an indispensable component of our daily life. According to the input signals, handwritten character recognition can be divided into online and offline recognition. Online recognition recognizes the stroke trajectories of handwritten characters, while offline recognition identifies images of handwritten characters. Therefore, the input signals for online and offline recognition are, respectively, the coordinates of the pen tip as functions of time and the scanned image of the handwritten character. Since online recognition can translate human intentions to a computer more intuitively and effectively than the offline method, input devices for online handwriting recognition have been widely developed over the last decade, such as ultrasonic digital pens, infrared digital pens, and touch pads. However, the drawback of these input devices is that they must be operated under ambit (writing-area) restrictions. Recently, pen-based input devices embedded with inertial sensors have been proposed to sense or capture the acceleration signals generated by writing trajectories.
Inertial-sensor-based input devices can provide the coordinates of the pen tip as functions of time for online handwriting recognition [7], [8]. A significant advantage of inertial-sensor-based input devices for handwriting recognition is that they can be operated without any external reference or ambit restrictions [3], [4], [5], [6]. Recently, many researchers have focused on the development of inertial-sensor-based input devices. To name a few, Wang and Chuang [7] presented an accelerometer-based digital pen with a trajectory recognition algorithm for 2D handwritten digit recognition. Their algorithm extracted time- and frequency-domain features from the accelerations and then selected the most important features by kernel-based class separability (KBCS) and linear discriminant analysis (LDA). Finally, a probabilistic neural network (PNN) recognized the handwritten digits based on the reduced features, and the recognition rate reached 98%. Choi et al. [8] proposed a digital pen with a tri-axial accelerometer to recognize 3D handwritten digits. They used under-sampling and principal component analysis (PCA) to reduce the data size and the computation time, and a hidden Markov model (HMM) was then utilized to classify the movement signals. The user-dependent and user-independent recognition rates using the HMM classifier were 100% and 68.6%, respectively. Zhou et al. [12] developed a micro inertial measurement unit (μIMU) composed of an accelerometer and a gyroscope to recognize 2D handwritten digits and 2D handwritten English letters; the recognition rates were about 85% and 64%, respectively, using a self-organizing map (SOM) classifier. The above-mentioned results show that HMMs and NNs are effective in dealing with handwriting recognition problems. However, both HMMs and NNs consume considerable computational time in the training stage to obtain acceptable recognition rates.
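
For orientation, the C sketch below computes simple time-domain features (mean, standard deviation, and average power) over one acceleration window, the kind of features such recognizers typically start from; the specific feature sets and selection methods (e.g., KBCS and LDA) are those reported in the cited works and are not reproduced here.

#include <math.h>
#include <stdio.h>

/* Mean, standard deviation and average power of one acceleration window. */
static void window_features(const float *a, int n,
                            float *mean, float *stddev, float *power)
{
    float sum = 0.0f, sq = 0.0f;
    for (int i = 0; i < n; i++) {
        sum += a[i];
        sq  += a[i] * a[i];
    }
    *mean   = sum / (float)n;
    *power  = sq  / (float)n;
    *stddev = sqrtf(*power - (*mean) * (*mean));
}

int main(void)
{
    const float ax[8] = {0.1f, 0.3f, 0.5f, 0.4f, 0.0f, -0.2f, -0.4f, -0.1f};
    float m, s, p;
    window_features(ax, 8, &m, &s, &p);
    printf("mean=%.3f std=%.3f power=%.3f\n", m, s, p);
    return 0;
}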

III. PROPOSED WORK

The proposed system consists of a pen section and a PC section. The pen-type portable device comprises a tri-axial accelerometer, a microcontroller, and an RF wireless transmission module. The acceleration signals measured by the tri-axial accelerometer are transmitted to a computer via the wireless module. Users can utilize this digital pen to write digits and make hand gestures at normal speed, and the measured acceleration signals of these motions are recognized by the trajectory recognition algorithm.
The tri-axial accelerometer measures the acceleration signals generated by the user's hand motions; tilting the sensor changes the generated acceleration signals. The acceleration signals are analog and are converted into digital form by the PIC microcontroller. The wireless transceiver then transmits the digitized signals to the PC, where the recognized alphabetical characters are displayed.
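
A minimal firmware sketch of this pen-section data path is given below: the PIC16F877A samples two analog accelerometer channels and streams framed samples over its UART to the wireless module. The XC8-style toolchain, 20 MHz crystal, 9600 baud rate, AN0/AN1 wiring, the 'S' framing byte, and the omission of configuration bits are assumptions made for illustration; this is not the authors' firmware.

/* Sketch only: PIC16F877A reads two accelerometer axes and streams them   */
/* over the UART to a Zigbee/RF module. Configuration bits are omitted.    */
#include <xc.h>
#include <stdint.h>

#define _XTAL_FREQ 20000000UL          /* assumed 20 MHz crystal            */

static void uart_init(void)
{
    SPBRG = 129;                       /* 9600 baud with BRGH = 1 at 20 MHz */
    TXSTAbits.BRGH = 1;
    TXSTAbits.SYNC = 0;
    RCSTAbits.SPEN = 1;                /* enable the serial port pins       */
    TXSTAbits.TXEN = 1;                /* enable the transmitter            */
}

static void uart_putc(uint8_t c)
{
    while (!PIR1bits.TXIF)
        ;                              /* wait for an empty transmit buffer */
    TXREG = c;
}

static void adc_init(void)
{
    ADCON1 = 0x80;                     /* right-justified result, ports analog */
    ADCON0 = 0x41;                     /* Fosc/8 clock, channel 0, ADC on      */
}

static uint16_t adc_read(uint8_t channel)
{
    ADCON0 = (uint8_t)(0x41 | (channel << 3)); /* select channel, keep ADC on */
    __delay_us(20);                            /* acquisition time            */
    ADCON0bits.GO_nDONE = 1;
    while (ADCON0bits.GO_nDONE)
        ;                                      /* wait for the conversion     */
    return ((uint16_t)ADRESH << 8) | ADRESL;
}

void main(void)
{
    uart_init();
    adc_init();                        /* RA0/RA1 are inputs after reset     */

    for (;;) {
        uint16_t x = adc_read(0);      /* X axis on AN0 (assumed wiring)     */
        uint16_t y = adc_read(1);      /* Y axis on AN1 (assumed wiring)     */

        uart_putc('S');                /* simple framing: 'S' + 4 data bytes */
        uart_putc((uint8_t)(x >> 8)); uart_putc((uint8_t)x);
        uart_putc((uint8_t)(y >> 8)); uart_putc((uint8_t)y);

        __delay_ms(10);                /* roughly 100 samples per second     */
    }
}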

IV. SIMULATION RESULTS

The controller program was implemented in Embedded C. The software used for simulation in this project is Proteus (Labcenter Electronics). The simulated results and the prototype model of the proposed system are shown below (Figs. 3-6). An advantage of this approach is its potential for mobility: the accelerometer can be used independently with an embedded processor or by connecting a wireless module. In the simulated model, the input device is a potential divider (crimp port) instead of the MEMS accelerometer, since the accelerometer is not available in the simulator's component library; the crimp port is used to vary the acceleration value.
In the Proteus simulation, the X-axis and Y-axis values are generated as analog signals and fed to the PIC16F877A, which produces the corresponding digital signal. The generated signal consists of a set of data-point values, which are plotted to obtain the recognized structure. In MATLAB, the X and Y coordinate values are plotted and the structure is recognized; the structure recognized using MATLAB is shown in Fig. 6. The recognized alphabetical characters are displayed on the PC.
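
To illustrate the PC-side handling of the received data points, the sketch below reads the hypothetical 'S'-framed samples introduced in Section III from standard input and writes them out as comma-separated values, which MATLAB can then load and plot to reconstruct the trajectory. The framing format is an assumption carried over from that earlier sketch, not a format specified in this paper.

#include <stdio.h>

int main(void)
{
    int c;
    printf("x,y\n");                        /* CSV header for MATLAB             */
    while ((c = getchar()) != EOF) {
        if (c != 'S')
            continue;                       /* resynchronise on the frame header */
        int b[4];
        int ok = 1;
        for (int i = 0; i < 4; i++) {
            if ((b[i] = getchar()) == EOF) { ok = 0; break; }
        }
        if (!ok)
            break;
        unsigned x = ((unsigned)b[0] << 8) | (unsigned)b[1];
        unsigned y = ((unsigned)b[2] << 8) | (unsigned)b[3];
        printf("%u,%u\n", x, y);            /* one data point per line           */
    }
    return 0;
}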

V. CONCLUSION

Several methods are available for gesture recognition. This paper describes a person-independent gesture recognition system using MEMS accelerometers. The recognition system consists of sensor data collection, segmentation, and recognition. After acceleration data are received from the sensing device, a segmentation algorithm is applied to determine the starting and end points of every input gesture automatically. The sign sequence of a gesture is extracted as the classifying feature, i.e., a gesture code. Finally, the gesture code is compared with the stored standard patterns to determine the most likely gesture. Since the standard gesture patterns are generated by motion analysis and are simple features represented by eight numbers per gesture, the recognition system does not require a large database, nor does it need to collect gestures from as many different people as possible to improve recognition accuracy. This result encourages us to further investigate the possibility of using our digital pen as an effective tool for human-computer interaction applications. In the future, an interface system can be constructed that allows an advanced user to interact with the computer.
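
As a concrete illustration of the gesture-code idea, the C sketch below reduces a segmented X/Y acceleration record to an eight-number sign sequence (four window signs per axis) and matches it against stored templates by counting mismatching signs. The windowing scheme, the example templates, and the mismatch-count matching are illustrative assumptions; the standard patterns used in this work are derived from motion analysis as described above.

#include <stdio.h>

#define CODE_LEN      8     /* eight numbers per gesture, as in the paper */
#define NUM_TEMPLATES 4     /* hypothetical stored standard patterns      */

/* Reduce one axis of a segmented gesture to CODE_LEN/2 window signs. */
static void sign_code(const float *a, int n, int *code)
{
    int windows = CODE_LEN / 2;
    for (int w = 0; w < windows; w++) {
        int start = w * n / windows, end = (w + 1) * n / windows;
        float sum = 0.0f;
        for (int i = start; i < end; i++)
            sum += a[i];
        code[w] = (sum >= 0.0f) ? 1 : -1;        /* keep only the sign    */
    }
}

/* Distance between two gesture codes: number of mismatching signs. */
static int code_distance(const int *a, const int *b)
{
    int d = 0;
    for (int i = 0; i < CODE_LEN; i++)
        d += (a[i] != b[i]);
    return d;
}

int main(void)
{
    /* Hypothetical stored patterns (e.g. four different stroke shapes). */
    const int templates[NUM_TEMPLATES][CODE_LEN] = {
        { 1,  1,  1,  1, -1, -1, -1, -1},
        {-1, -1, -1, -1,  1,  1,  1,  1},
        { 1,  1, -1, -1,  1,  1, -1, -1},
        {-1, -1,  1,  1, -1, -1,  1,  1},
    };

    /* Toy segmented input: eight X samples and eight Y samples. */
    const float ax[8] = { 0.4f, 0.6f, 0.5f, 0.3f, -0.2f, -0.5f, -0.6f, -0.4f};
    const float ay[8] = { 0.2f, 0.4f, 0.3f, 0.1f, -0.2f, -0.4f, -0.3f, -0.5f};

    int code[CODE_LEN];
    sign_code(ax, 8, code);                      /* first four numbers: X */
    sign_code(ay, 8, code + CODE_LEN / 2);       /* last four numbers: Y  */

    int best = 0, best_d = CODE_LEN + 1;
    for (int t = 0; t < NUM_TEMPLATES; t++) {
        int d = code_distance(code, templates[t]);
        if (d < best_d) { best_d = d; best = t; }
    }
    /* With this toy input the code matches template 2 exactly. */
    printf("closest gesture template: %d (distance %d)\n", best, best_d);
    return 0;
}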

Figures at a glance

Figure 1 Figure 2 Figure 3
Figure 4 Figure 5 Figure 6
 

References