ISSN: 2278-8875 (Online), 2320-3765 (Print)


A Portable Wireless Head Movement Controlled Human-Computer Interface for People with Disabilities

 

M. Venkata Suresh Kumar, B. Neelima
Assistant Professor, Department of ECE, A.I.T.S, Tirupati, A.P, India


Abstract

This paper describes an economical head-operated computer mouse for people with disabilities. It focuses on a head-operated mouse that uses a single tilt sensor mounted in a headset to determine head position and act as a simple pointing device. An accelerometer-based tilt sensor detects the user's head tilt and directs the cursor movement on the computer screen, while mouse clicks are triggered by the user's eyebrow movement through a dedicated sensor. The keyboard function allows the user to scroll through letters by tilting the head and to select them with an eyebrow movement. A voice recognition section is also built into the headset to identify letters spoken by the paralyzed user. The system was developed to help people with disabilities lead an independent professional life.

Keywords

Computer mouse, head operated, people with disabilities, tilt sensor.

INTRODUCTION

OWING to the lack of appropriate input devices, people with disabilities often encounter obstacles when using computers. Currently, the keyboard and mouse are the most common input devices. With the widespread adoption of the Microsoft Windows interface, i.e., Windows 98 and NT, the computer mouse has become even more important. It is therefore necessary to devise a simple mouse system that allows people with disabilities to operate their computers.
People with spinal cord injuries (SCIs) and people who are paralyzed increasingly use electronic assistive devices to improve their ability to perform essential functions. Electronic equipment modified to benefit people with disabilities includes communication and daily-activity devices as well as powered wheelchairs. Our literature review shows that many computer input devices are available. A finger-mounted device using pressure sensors has been proposed, but no hardware has been realized so far and it requires physical interaction with the computer. A wide range of interfaces exists between the user and the device, from enlarged keyboards to complex systems that let the user control a movement with a mouth stick; however, for many people the mouth stick is neither accurate nor comfortable to use. Eye-image input systems, electrooculography (EOG) signals [5], electromyogram (EMG) signals [5], and electroencephalogram (EEG) signals [1], [2], [4], [5], [7]–[19] can provide only a few controlled movements, have slow response times because of signal processing, and require substantial motor coordination. In infrared- or ultrasound-controlled mouse systems (Origin Instruments' HeadMouse and Prentke Romich's HeadMaster) [3]–[6], two primary considerations concern the user: first, whether the transmitter is aimed within an effective range of the receiver, and second, whether the cursor actually follows the user's head movement. These considerations increase the load on people with disabilities. Thus, alternative systems that use commercially available electronics, are easy to operate, and offer a simple interface are sorely needed.
The ability to operate a computer mouse has become increasingly important to people with disabilities, especially as advancing technology allows more and more functions to be controlled by computer. There are many reasons for people with disabilities to operate a computer: they need to acquire new knowledge and communicate with the outside world through the Internet, and they need to work at home, enjoy leisure activities, and manage many other tasks such as home shopping and Internet banking. This research focuses on a tilt-sensor-controlled computer mouse. Tilt sensors, or inclinometers, detect the angle between a sensing axis and a reference vector such as gravity or the earth's magnetic field. In medicine, tilt sensors have been used mainly in occupational-medicine research; for example, their application in gait analysis is currently being investigated. Andrews et al. [20] used tilt sensors attached to a floor-reaction-type ankle-foot orthosis as a biofeedback source, via an electrocutaneous display, to improve postural control during functional electrical stimulation (FES) standing. Bowker and Heath [21] recommended using a tilt sensor to synchronize peroneal nerve stimulation to the gait cycle of hemiplegics by monitoring angular velocity. Tilt sensors therefore have potential for improving the abilities of persons with other disabilities [18]. The present system uses MEMS accelerometers to detect the user's head tilt and thereby direct the cursor movement on the computer screen. Mouse clicks are activated by the user's eyebrow movement through a sensor. The keyboard function allows the user to scroll through letters by tilting the head and to select them with an eyebrow movement. A voice recognition section in the headset identifies letters spoken by the paralyzed user. Because the tilt sensors can sense the operator's head motion up, down, left, and right, the cursor direction can be determined accordingly.

METHODS

The system replaces the conventional computer mouse with tilt sensors mounted on a headset worn by the user.
By moving the head, the user controls the mouse cursor and performs all necessary functions in Windows 98. The mouse-controlled functions include: up, down, left, right, upper-left, upper-right, lower-left, and lower-right. The block diagram of the tilt-sensor-controlled computer mouse is shown in Fig. 1. The mouse interface circuit is composed of six major elements: 1) the tilt sensor module; 2) the voice recognition module; 3) the eyebrow sensor module; 4) the signal-processing module; 5) the microcontroller module; and 6) the wireless communication module.
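The paper does not list the firmware's command encoding; purely as an illustration, the eight cursor motions plus the eyebrow-triggered click could be represented by an enumeration such as the following sketch in C (the names are hypothetical, not taken from the paper):

/* Illustrative command set: eight cursor directions plus a click,
 * matching the mouse functions listed above. Names are hypothetical. */
typedef enum {
    CMD_UP,
    CMD_DOWN,
    CMD_LEFT,
    CMD_RIGHT,
    CMD_UPPER_LEFT,
    CMD_UPPER_RIGHT,
    CMD_LOWER_LEFT,
    CMD_LOWER_RIGHT,
    CMD_CLICK          /* triggered by the eyebrow sensor */
} mouse_command_t;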

A. The Tilt Sensor Module

The tilt sensor module, as shown in Fig. 1(A), links the computer mouse interface to the user. The module weighs roughly 8 grams. It is a three-axis, low-g, micromachined accelerometer module with selectable sensitivity, based on the MMA7260Q accelerometer from Freescale.
The device consists of two surface-micromachined capacitive sensing cells (g-cells) and a signal-conditioning ASIC contained in a single IC. The sensing elements are hermetically sealed at the wafer level using a bulk-micromachined cap wafer.
The g-cell is a mechanical structure formed from semiconductor material (polysilicon) using standard semiconductor processes. It can be modeled as a set of movable beams attached to a central mass that move between fixed beams; the movable beams are deflected from their rest position when the system is subjected to acceleration.
As the central mass and its attached beams move, the distance from the movable beams to the fixed beams on one side increases by the same amount that the distance to the fixed beams on the other side decreases. This change in distance is a measure of acceleration.
The g-cell beams form two back-to-back capacitors. As the center beam moves with acceleration, the distance between the beams changes and the value of each capacitor changes, following C = Aε/D, where A is the beam area, ε is the dielectric constant, and D is the distance between the beams. The ASIC uses switched-capacitor techniques to measure the g-cell capacitors and to extract the acceleration data from the difference between the two capacitances. The ASIC also conditions and filters (via switched capacitors) the signal, providing a high-level output voltage that is ratiometric and proportional to acceleration.
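For clarity, the relation between beam displacement and the measured capacitance difference can be written out explicitly (a standard parallel-plate derivation consistent with C = Aε/D above; the proof mass m and spring constant k are generic symbols, not values from the datasheet):

C1 = εA / (D − x),  C2 = εA / (D + x)
ΔC = C1 − C2 = 2εA·x / (D² − x²) ≈ (2εA/D²)·x   for x ≪ D

Since the suspension behaves like a spring-mass system, x = m·a/k, so the capacitance difference ΔC is, to first order, proportional to the acceleration a.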
The g-Select feature allows the selection of one of four sensitivities. Depending on the logic levels applied to pins 1 and 2, the internal gain of the device changes, allowing it to operate with a 1.5 g, 2 g, 4 g, or 6 g full-scale range. With a supply voltage between 2.2 V and 3.6 V, the device operates as a fully calibrated linear accelerometer; outside these supply limits it may still behave as a linear device but is not guaranteed to remain in calibration. In this work the device is operated in the 1.5 g mode.
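The paper does not show how the g-Select pins are driven; the sketch below illustrates one way a microcontroller could do it, using the nominal sensitivities from the MMA7260Q datasheet (800/600/300/200 mV/g). The GPIO helper set_pin() is a hypothetical placeholder for the actual MCU routine.

#include <stdint.h>

typedef enum { RANGE_1G5 = 0, RANGE_2G, RANGE_4G, RANGE_6G } g_range_t;

/* Nominal MMA7260Q sensitivity in mV/g for the 1.5 g, 2 g, 4 g, and 6 g ranges. */
static const uint16_t sensitivity_mv_per_g[] = { 800, 600, 300, 200 };

void set_pin(int pin, int level);   /* hypothetical GPIO write */

void select_range(g_range_t r)
{
    /* g-Select1 (pin 1) carries the low bit, g-Select2 (pin 2) the high bit:
     * 00 -> 1.5 g, 01 -> 2 g, 10 -> 4 g, 11 -> 6 g. */
    set_pin(1, (r >> 0) & 1);
    set_pin(2, (r >> 1) & 1);
}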
The output rises above VDD/2 for positive acceleration and falls below VDD/2 for negative acceleration. The detectable range of the tilt sensors in this study is any angle within ±45°.
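A minimal sketch of how one axis of this ratiometric output could be converted to a tilt angle is given below. It assumes the 1.5 g setting (nominally 800 mV/g), a 3.3 V supply, and a 10-bit ADC referenced to VDD; adc_read() is a hypothetical helper, and none of the constants are taken from the paper.

#include <math.h>
#include <stdint.h>

#define VDD_MV          3300.0f   /* assumed supply voltage in millivolts */
#define SENS_MV_PER_G    800.0f   /* nominal sensitivity at the 1.5 g setting */
#define ADC_FULL_SCALE  1023.0f   /* 10-bit converter */

uint16_t adc_read(int channel);   /* hypothetical ADC read */

float tilt_degrees(int channel)
{
    float v_mv = adc_read(channel) * (VDD_MV / ADC_FULL_SCALE);
    float g    = (v_mv - VDD_MV / 2.0f) / SENS_MV_PER_G;  /* 0 g sits at VDD/2 */
    if (g >  1.0f) g =  1.0f;                             /* clamp before asin */
    if (g < -1.0f) g = -1.0f;
    return asinf(g) * 180.0f / 3.14159265f;               /* used within ±45° here */
}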
The cursor controlled by the tilt sensor can move not only vertically or horizontally, but also diagonally. A single 9-V battery supplies the power required for all devices contained in the control box, including the tilt sensor, voice recognition circuitry, signal-processing circuitry, and microprocessor circuitry.

B. Voice Recognition Module

The voice recognition kit analyzes the spoken input, performs the recognition, and controls the corresponding system functions. The voice recognition system consists of an external microphone, a keypad, 64K of SRAM, and some additional components; combined with the microcontroller, it forms an intelligent recognition system. It can recognize words of up to 1.92 s in length, and its response time is less than 300 ms. The HM2007 IC is used for voice recognition, and a single chip can recognize up to 40 words. When the user speaks into the microphone, the input is passed to the recognition process, where it is compared with the stored voice patterns, and the resulting code is sent to the microcontroller for further processing.
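The paper does not describe how the recognized word is mapped to a letter; as an illustration, if the vocabulary were trained so that words 1–26 correspond to the spoken letters 'a'–'z', the microcontroller could translate the HM2007 result as in the sketch below (hm2007_read_index() is a hypothetical helper, not a documented routine).

#include <stdint.h>

uint8_t hm2007_read_index(void);   /* hypothetical: returns 1..40, 0 = no match */

char index_to_letter(uint8_t idx)
{
    /* Assumes words 1..26 were trained as the spoken letters 'a'..'z'. */
    if (idx >= 1 && idx <= 26)
        return (char)('a' + idx - 1);
    return '\0';                   /* unrecognized or unmapped word */
}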

C. Eye Brow Sensor Module

The eyebrow sensor contains an IR LED operating at 900 nm. It shines invisible IR light on the user's eye region; this light does not cause any harm to the eye. A 900-nm IR sensor detects the reflected IR light when the user moves the eyebrow or blinks. This signal is passed to the signal-conditioning section and then to the microcontroller for further processing.
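One plausible way to turn the conditioned IR-reflection signal into a single click event is sketched below; the threshold, sample count, and read_ir_level() helper are illustrative assumptions rather than values from the paper.

#include <stdbool.h>
#include <stdint.h>

#define BLINK_THRESHOLD    512   /* ADC counts; would be tuned per user */
#define MIN_BLINK_SAMPLES    3   /* debounce over consecutive samples */

uint16_t read_ir_level(void);    /* hypothetical: conditioned eyebrow-sensor ADC */

bool blink_detected(void)
{
    static uint8_t count = 0;
    if (read_ir_level() > BLINK_THRESHOLD) {
        if (++count >= MIN_BLINK_SAMPLES) {
            count = 0;
            return true;         /* one click per sustained reflection change */
        }
    } else {
        count = 0;
    }
    return false;
}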

D. The Signal-Processing Module

The signal-processing module, as shown in Fig. 1(B), consists primarily of three components: an amplifier, a low-pass filter, and analog-to-digital (A/D) converters. Because the tilt sensors produce only a small signal, a high-performance amplifier is employed. A second-order low-pass filter with a 2-Hz cutoff is designed for the system, which reduces acceleration artifacts and removes noise frequencies. Ten-bit A/D converters digitize the signals from the tilt sensor and the voice recognition circuitry.
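The 2-Hz filter in the paper is an analog stage; purely to illustrate the same smoothing behaviour in software, two cascaded one-pole sections give a comparable second-order roll-off, as sketched below. The sample rate is an assumption, and this is not the paper's circuit.

#define SAMPLE_RATE_HZ  100.0f   /* assumed sampling rate */
#define CUTOFF_HZ         2.0f   /* cutoff matching the analog stage */

typedef struct { float y1, y2, alpha; } lp2_t;

void lp2_init(lp2_t *f)
{
    float rc = 1.0f / (2.0f * 3.14159265f * CUTOFF_HZ);   /* equivalent RC */
    float dt = 1.0f / SAMPLE_RATE_HZ;
    f->alpha = dt / (rc + dt);
    f->y1 = f->y2 = 0.0f;
}

float lp2_update(lp2_t *f, float x)
{
    f->y1 += f->alpha * (x     - f->y1);   /* first pole  */
    f->y2 += f->alpha * (f->y1 - f->y2);   /* second pole */
    return f->y2;
}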

E. The Main Microcontroller

The ARM microcontroller is the main controller of the system, as shown in Fig. 1(C). Port 0 and Port 1 of the microcontroller receive the digitized signals from the tilt sensor via the signal-processing module. At the same time, Port 1 receives the trigger signal from the eyebrow sensor to perform the click actions. A parallel-to-serial method is used via Port 1 to dispatch the signals that control the mouse input motion (COM1). Port 1 also dispatches control signals back to the operator to confirm that an input motion has been completed.
Lateral and up-and-down motions of the user's head are detected by the tilt sensors and fed into the microcontroller for analysis and processing. Because Port 1 receives the signal from only one A/D converter, the microcontroller maps the incoming signal immediately to its command code. It then commands the mouse to move the cursor vertically, horizontally, or diagonally, i.e., up, down, left, right, upper or lower left, upper or lower right. Port 0 of the microcontroller converts the parallel data into serial data and transmits it to the computer over a radio-frequency (RF) link. The serial port (COM1) of the computer forwards both the command codes and the digitized trigger signals to the application. An application program written in Visual Basic (VB) reads the command codes arriving on the serial port (COM1) and carries out the reported mouse activity through the Windows API. These codes are converted into the cursor motions of up, down, left, right, upper or lower left, and upper or lower right. A speed-control function for the cursor and click is also built into the application, and a set of controlling parameters can be preset to suit the user, depending on how familiar the operator is with the system. The application runs at the top level of the Windows operating system so that the head-driven mouse works with the rest of the Windows-based applications.
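The firmware itself is not listed in the paper; the sketch below shows one plausible way the two filtered tilt angles could be classified into the eight cursor commands and packed into a single byte for the serial link. The dead-band value, the bit layout, and uart_send_byte() are all assumptions.

#include <stdint.h>

#define DEAD_BAND_DEG  10.0f     /* ignore small head movements around centre */

void uart_send_byte(uint8_t b);  /* hypothetical UART transmit */

void send_cursor_command(float pitch_deg, float roll_deg)
{
    int up    = pitch_deg >  DEAD_BAND_DEG;
    int down  = pitch_deg < -DEAD_BAND_DEG;
    int left  = roll_deg  < -DEAD_BAND_DEG;
    int right = roll_deg  >  DEAD_BAND_DEG;

    /* Pack the four direction flags into one byte; diagonals are simply two
     * flags set at once (e.g. up + right = upper-right). */
    uint8_t cmd = (uint8_t)((up << 3) | (down << 2) | (left << 1) | right);
    if (cmd != 0)
        uart_send_byte(cmd);
}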
For system evaluation, 12 people (all men, 23–33 years old; six nondisabled and six with quadriplegia) who had experience operating a computer were selected. The six nondisabled individuals had their whole bodies constrained in a fixed, stationary position, with only neck and head movements left free, as in the spinal-cord-injured group, and served as the control group. The six SCI participants with quadriplegia formed the experimental group. All of them received 30 min of training before using the newly developed computer mouse. They also received instructions on inputting 30 commands for controlling the computer mouse (up, down, left, right, upper-left, upper-right, lower-left, lower-right). They were then asked to input the commands as accurately as possible and not to correct their errors. During the test, the clinician read each command to prompt the user's input motion: the clinician read the first command to prompt the first input motion, and then read each subsequent command as soon as the user had completed the previous one, so the speed of using the interface was determined by the user. The clinician recorded the number of correct input motions and the time needed to finish the 30 input motions. Both the percentage accuracy (number of correct input motions divided by 30, multiplied by 100%) and the time needed by every user were then calculated for the control group and the experimental group.

F. Wireless Communication Module

The XBee and XBee-PRO OEM RF modules are engineered to meet the IEEE 802.15.4 standard and to address the needs of wireless sensor networks for low cost and low power. The modules require minimal power and provide reliable data delivery between devices. They operate in the 2.4-GHz ISM band and are pin-for-pin compatible with each other. The modules interface to a host device through an asynchronous logic-level serial port; through this serial port, a module can communicate with any logic- and voltage-compatible UART, or with any serial device through a level translator. The XBee/XBee-PRO RF modules are designed to mount into a receptacle (socket) and therefore require no soldering when mounted on a board. The XBee development kits include RS-232 and USB interface boards that use two 20-pin receptacles to receive the modules.
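In the modules' default transparent mode, every byte written to one module's UART appears at the paired module's UART, so sending a command over the radio reduces to writing it to the serial port. A minimal sketch follows; uart_init() and uart_write() are hypothetical helpers, and 9600 baud is the XBee factory default rather than a value stated in the paper.

#include <stdint.h>

void uart_init(uint32_t baud);   /* hypothetical: configure an 8-N-1 UART */
void uart_write(uint8_t byte);   /* hypothetical: blocking UART transmit */

void radio_link_init(void)
{
    uart_init(9600);             /* XBee factory-default baud rate */
}

void radio_send_command(uint8_t cmd)
{
    uart_write(cmd);             /* transparent mode forwards the byte to the
                                    receiving module's serial output */
}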

RESULTS

To operate the computer and communicate a message through the World Wide Web (WWW), all the individual needs to do is put on the newly developed headset, as shown in Fig. 2. The test results for the users in both the control and experimental groups are listed in Table I.
The average accuracy for the control group and the experimental group is 97.8 ± 2.6% and 95.1 ± 4.9%, respectively. The average time needed for the control group and the experimental group is 3.5 ± 1.1 min and 4.9 ± 2.0 min, respectively. An independent t-test revealed that the differences in average accuracy and average time between the control group and the experimental group are not significant. This means that the newly designed computer mouse interface is equally user friendly for nondisabled people and for people with disabilities (SCI with quadriplegia).

SUMMARY

The increasing number of accidental injuries over the years has resulted in a dramatic increase in the population of individuals with disabilities. Although numerous devices can compensate for the loss of function in people with spinal cord injuries, they still differ substantially in convenience and accuracy. Most such devices serve as a computer mouse substitute for individuals with disabilities, using a mouth stick, eyeball movements, or eyeball imaging to complete the input motion [7], [12], [14], [15], [16], [17]. Although the mouth stick provides reasonable function and allows successful input through the computer mouse, it frequently lacks good sanitation and convenience because it is orally activated. Similarly, eyeball-movement and eyeball-imaging systems rely on high-level image analysis (with questionable accuracy) and require a much longer operating time to input a number or a letter. A head-controlled mouse that relies on infrared or ultrasonic signals uses a transmitter placed on the head to send signals to a remote receiver once a motion is detected; however, the user must focus on the cursor's movement on the screen while ensuring that the transmitted signals stay within the reception range of the receiver. As a result, these devices cause difficulties for people with disabilities. In addition, such systems have disadvantages such as high instrument cost and the need for extended operational training.
In the new millennium, it is our concern that individuals with disabilities not become technological orphans in the areas of electronics and computers. Specifically, to help people with disabilities overcome inconveniences in their daily lives, we have used a minimal amount of circuitry together with a highly accurate control system. The system presented in this paper allows people with disabilities to avoid uncomfortable input methods such as clutching a mouth stick; instead, it employs a tilt sensor module to control the computer mouse in response to rotations of the neck. Several user-friendly features are also included. As a result, the system outperforms mouth-stick-based systems in convenience, accuracy, and sanitation. In addition, a headset-type control method is especially helpful for those who are quadriplegic due to spinal cord injuries. The quantitative data also show that, after limited training, both nondisabled people and people with disabilities can operate the system with an accuracy exceeding 95%; on average, a single mouse motion took 7–9 s to complete. The results show that people with disabilities can operate the system as well as nondisabled users. Furthermore, compared with the previously developed infrared-controlled human–computer mouse interface [3], the new system completes a single mouse command 3–4 s faster, which demonstrates its practical value. People with disabilities can also mount the tilt sensor module on a prosthesis, protective gear, or a powered wheelchair to use the computer mouse easily and hygienically.
This tilt-sensor-controlled computer mouse interface uses current circuit technology to control a computer mouse system effectively. In the future, the interface can be introduced into many control systems in the home, such as powered wheelchairs, telephones, and appliances, for which there is great market potential.

Tables at a glance

Table 1

Figures at a glance

Figure 1, Figure 2, Figure 3

References