ISSN: 2319-8753 (Online), 2347-6710 (Print)


Design of Human Assist System for Communication

V.G.Vijaya, V.Prabhakaran
Department of Mechatronics Engineering, Center for Nano Technology, Bharath Institute of Science & Technology, Bharath University, Chennai, India

International Journal of Innovative Research in Science, Engineering and Technology

Abstract

With the development of Human-Computer Interfaces (HCIs), methods have been developed to help people with severe disabilities communicate. Unlike traditional HCIs (a keyboard, a mouse, etc.), modern HCIs play an important role in the area of rehabilitation. However, users with severe paralysis have only a few ways to control and work with applications; for these people, methods based on eye movement, blinking, or voice can be selected. In this project, we focus on implementing an IR-sensor-based (EOG-style) HCI and a voice-to-text processor that is cheap, portable, and non-invasive. In the eye-ball control section, IR sensors are placed close to the eye; when they sense eye-ball movement, the sensed value is passed to a comparator that compares the voltage received from the sensors. If the voltage is about 0.5 V, a signal is automatically passed to the remote section to control applications on the PC connected to it. This voltage is produced only when eye-ball movement is sensed. The receiver section consists of a microcontroller interfaced to a Zigbee module and a PC: the Zigbee receiver receives the value, and the controller interprets the sensed movement and controls the application on the PC. In the tooth-click application, a micro-switch sensor is placed at the tooth; when the user clicks the teeth, the sensor detects it and a mouse-click operation is performed for the selected application. In the last phase of the project, a MEMS sensor is used for the head-nodding application. The MEMS sensor can be tilted in three positions, like the head, so for each head position a particular application can be controlled.

INTRODUCTION

1.1 Electronic Eye Gesture

Gesture recognition is a topic in computer science and language technology with the goal of interpreting human gestures via mathematical algorithms. Gestures can originate from any bodily motion or state, but commonly originate from the face or hand. Current focuses in the field include emotion recognition from the face and hand gesture recognition. Many approaches have been made using cameras and computer vision algorithms to interpret sign language. However, the identification and recognition of posture, gait, proxemics, and human behaviors is also the subject of gesture recognition techniques. Based on the fact that there are many challenging cases of infirm persons who are able to control only their eye muscles, a low-cost mobile device for electronic eye gesture recognition has been designed as a human-machine interface, which enables the control of different applications and home appliances by the user's eye gestures via IR and Bluetooth wireless technology. In this paper the embedded system design of the device is presented in detail, including hardware and software design, modes of operation, and the methods used for eye gesture recognition. Besides these, measurements of the differential amplifier used and the achieved eye gesture recognition efficiency are presented within the test results. Furthermore, a newly designed adjustable head-mounted EOG acquisition device with permanent surface electrodes is proposed.
1.2 Pattern Recognition
Pattern recognition is the scientific discipline whose goal is the classification of objects into a number of categories or classes. In machine learning, pattern recognition is the assignment of some sort of output value (or label) to a given input value (or instance), according to some specific algorithm. An example of pattern recognition is classification, which attempts to assign each input value to one of a given set of classes (for example, determine whether a given email is "spam" or "non-spam"). However, pattern recognition is a more general problem that encompasses other types of output as well. Other examples are regression, which assigns a real-valued output to each input; sequence labeling, which assigns a class to each member of a sequence of values (for example, part of speech tagging, which assigns a part of speech to each word in an input sentence); and parsing, which assigns a parse tree to an input sentence, describing the syntactic structure of the sentence.
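As a small, generic illustration of this labeling idea (not the method used in this work), the sketch below assumes a nearest-centroid classifier in C that assigns a two-feature input to one of three hypothetical gesture classes; the class names and centroid values are made-up placeholders.

```c
/* Minimal, generic illustration of pattern recognition as label assignment
 * (not the method used in this work): a nearest-centroid classifier that
 * maps a two-feature input to one of three hypothetical gesture classes. */
#include <stdio.h>
#include <math.h>

#define NUM_CLASSES  3
#define NUM_FEATURES 2

static const char *labels[NUM_CLASSES] = { "left", "center", "right" };

/* Hypothetical class centroids, e.g. mean sensor readings per gesture. */
static const double centroid[NUM_CLASSES][NUM_FEATURES] = {
    { -0.5, 0.1 },   /* left   */
    {  0.0, 0.0 },   /* center */
    {  0.5, 0.1 }    /* right  */
};

/* Assign the input to the class whose centroid is closest (Euclidean). */
static int classify(const double x[NUM_FEATURES])
{
    int best = 0;
    double best_d = INFINITY;
    for (int c = 0; c < NUM_CLASSES; c++) {
        double d = 0.0;
        for (int f = 0; f < NUM_FEATURES; f++) {
            double diff = x[f] - centroid[c][f];
            d += diff * diff;
        }
        if (d < best_d) { best_d = d; best = c; }
    }
    return best;
}

int main(void)
{
    double sample[NUM_FEATURES] = { -0.45, 0.05 };
    printf("predicted class: %s\n", labels[classify(sample)]);
    return 0;
}
```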
1.3 EOG
According to Brandon Peters, M.D., the electrooculogram is a measurement of the electrical activity associated with eye movements, recorded by placing small metal discs called electrodes on the skin near the eyes. It is useful for monitoring eye-ball movement in REM and non-REM sleep. Electrooculography (EOG) is a technique for measuring the resting potential of the retina; the resulting signal is called the electrooculogram. Its main applications are in ophthalmological diagnosis and in recording eye movements. Unlike the electroretinogram, the EOG does not represent the response to individual visual stimuli. The principle of electrooculography is that the eye acts as a dipole in which the anterior pole is positive and the posterior pole is negative. (1) Left gaze: the cornea approaches the electrode near the outer canthus, producing a positive-going change in the potential difference recorded from it. (2) Right gaze: the cornea approaches the electrode near the inner canthus, producing a positive-going change in the potential difference recorded from it (A, an AC/DC amplifier). Below each diagram is a typical tracing displayed by a pen recorder. Electrooculography was used by Robert Zemeckis and Jerome Chen, the visual effects supervisor, in the movie Beowulf during the enhanced performance capture to correctly capture and animate the eye movements of the actors; it was an improvement over The Polar Express.
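The dipole principle above lends itself to a very simple decision rule. The following C sketch is an assumption-laden simplification, not a clinical algorithm: it treats a positive-going change at the outer-canthus electrode as left gaze and one at the inner-canthus electrode as right gaze, with an arbitrary drift-rejection threshold.

```c
/* Illustrative sketch of the EOG dipole principle described above:
 * the sign of the potential change at each canthus electrode indicates
 * gaze direction. Threshold and example values are assumptions. */
#include <stdio.h>

typedef enum { GAZE_NONE, GAZE_LEFT, GAZE_RIGHT } gaze_t;

/* delta_outer / delta_inner: change in potential (volts) measured at the
 * outer- and inner-canthus electrodes; threshold rejects small drift. */
static gaze_t gaze_from_eog(double delta_outer, double delta_inner,
                            double threshold)
{
    if (delta_outer > threshold) return GAZE_LEFT;   /* cornea toward outer canthus */
    if (delta_inner > threshold) return GAZE_RIGHT;  /* cornea toward inner canthus */
    return GAZE_NONE;
}

int main(void)
{
    gaze_t g = gaze_from_eog(0.0008, 0.0001, 0.0005); /* example values in volts */
    printf("gaze = %d (0=none, 1=left, 2=right)\n", (int)g);
    return 0;
}
```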

LITERATURE REVIEW

A. Inference From Literature Review
A newly designed, adjustable head-mounted EOG acquisition device with permanent surface electrodes is proposed. Pattern recognition is the scientific discipline whose goal is the classification of objects into a number of categories or classes. Electrooculography (EOG) is a technique for measuring the resting potential of the retina. Many approaches have been made using cameras and computer vision algorithms to interpret sign language.

METHOD OF IMPLEMENTATION

3.1 Embedded Systems
An embedded system is a computer system designed to perform one or a few dedicated functions, often with real-time computing constraints. It is embedded as part of a complete device, often including hardware and mechanical parts. By contrast, a general-purpose computer, such as a personal computer, is designed to be flexible and to meet a wide range of end-user needs. Embedded systems control many devices in common use today. Embedded systems are controlled by one or more main processing cores, typically either a microcontroller or a digital signal processor (DSP). The key characteristic, however, is being dedicated to handle a particular task, which may require very powerful processors. For example, air traffic control systems may usefully be viewed as embedded, even though they involve mainframe computers and dedicated regional and national networks between airports and radar sites. (Each radar probably includes one or more embedded systems of its own.)

Since the embedded system is dedicated to specific tasks, design engineers can optimize it, reducing the size and cost of the product and increasing its reliability and performance. Some embedded systems are mass-produced, benefiting from economies of scale. Physically, embedded systems range from portable devices such as digital watches and MP3 players to large stationary installations like traffic lights, factory controllers, or the systems controlling nuclear power plants. Complexity varies from low, with a single microcontroller chip, to very high, with multiple units, peripherals, and networks mounted inside a large chassis or enclosure.

In general, "embedded system" is not a strictly definable term, as most systems have some element of extensibility or programmability. For example, handheld computers share some elements with embedded systems, such as the operating systems and microprocessors which power them, but they allow different applications to be loaded and peripherals to be connected. Moreover, even systems which don't expose programmability as a primary feature generally need to support software updates. On a continuum from "general purpose" to "embedded", large application systems will have subcomponents at most points, even if the system as a whole is "designed to perform one or a few dedicated functions" and is thus appropriate to call "embedded".
3.2 Characteristics of Embedded Systems
1. Embedded systems are designed to do some specific task, rather than be a general-purpose computer for multiple tasks. Some also have real-time performance constraints that must be met, for reasons such as safety and usability; others may have low or no performance requirements, allowing the system hardware to be simplified to reduce costs.
2. Embedded systems are not always standalone devices. Many embedded systems consist of small, computerized parts within a larger device that serves a more general purpose. For example, the Gibson Robot Guitar features an embedded system for tuning the strings, but the overall purpose of the Robot Guitar is, of course, to play music.[5] Similarly, an embedded system in an automobile provides a specific function as a subsystem of the car itself.
3. The program instructions written for embedded systems are referred to as firmware and are stored in read-only memory or flash memory chips. They run with limited computer hardware resources: little memory and a small or nonexistent keyboard and/or screen.
3.3 Processors in Embedded Systems
Embedded processors can be broken into two broad categories: ordinary microprocessors (μP) and microcontrollers (μC), which have many more peripherals on chip, reducing cost and size. In contrast to the personal computer and server markets, a fairly large number of basic CPU architectures are used; there are Von Neumann as well as various degrees of Harvard architectures, RISC as well as non-RISC and VLIW; word lengths vary from 4 bits to 64 bits and beyond (mainly in DSP processors), although the most typical remain 8/16-bit. Most architectures come in a large number of different variants and shapes, many of which are also manufactured by several different companies.

MODULE EXPLANATION

4.1 Transmitter Section
[Figure: Transmitter section block diagram]
The transmitter part consists of the sensing unit, controller section, comparator circuit, Zigbee transceiver module, and charging unit. A battery source provides the power supply, since the nodes will be at a remote location. An IR sensor with transmitter and receiver is placed close to the eye ball; the signal received from the sensor is compared in the comparator circuit and transmitted to the remote location through the Zigbee transceiver module for device control. Next, values from the micro-switch sensor are transmitted to the receiver over the same medium for mouse control. Finally, MEMS values (X, Y, Z) are acquired for head-motion control, as sketched below.
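A minimal sketch of one iteration of this transmitter loop is shown below. The frame layout, start byte, and scaling are assumptions, and the peripheral drivers are reduced to stub functions so the control flow can be compiled and followed on any platform; none of these details are taken from the paper.

```c
/* Sketch of the transmitter-side firmware loop described above. Pin
 * assignments, sensor access and the frame layout are assumptions for
 * illustration; hardware access is replaced by stub functions. */
#include <stdint.h>
#include <stdio.h>

/* --- stubs standing in for the real peripheral drivers --- */
static int  comparator_output(void)    { return 1; }  /* 1 when IR/EOG signal exceeds ~0.5 V */
static int  micro_switch_pressed(void) { return 0; }  /* tooth-click micro switch */
static void read_mems(int16_t *x, int16_t *y, int16_t *z) { *x = 10; *y = -3; *z = 250; }
static void zigbee_send(const uint8_t *buf, int len)
{
    for (int i = 0; i < len; i++) printf("%02X ", buf[i]);
    printf("\n");
}

int main(void)
{
    uint8_t frame[8];
    /* one iteration shown; on the target this would run in a timed loop */
    frame[0] = 0xAA;                              /* start-of-frame marker (assumed) */
    frame[1] = (uint8_t)comparator_output();      /* eye-movement flag */
    frame[2] = (uint8_t)micro_switch_pressed();   /* tooth-click flag */
    int16_t x, y, z;
    read_mems(&x, &y, &z);
    frame[3] = (uint8_t)(x >> 8); frame[4] = (uint8_t)x;   /* MEMS X axis */
    frame[5] = (uint8_t)(y >> 8); frame[6] = (uint8_t)y;   /* MEMS Y axis */
    frame[7] = (uint8_t)(z >> 4);                          /* coarse Z tilt */
    zigbee_send(frame, sizeof frame);
    return 0;
}
```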
4.2 Receiver Section
The receiver part consists of a Zigbee transceiver module, controller section, power supply, switching circuit, and PC. The Zigbee module receives the sensor values, and applications on the PC are controlled by the signals received from the eye-ball section, the tooth-click section, and the MEMS-based head-movement section.
[Figure: Receiver section block diagram]
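The corresponding receiver-side logic can be sketched as follows, under the same assumed frame layout as the transmitter sketch; the action codes and thresholds are illustrative only, not the paper's actual mapping.

```c
/* Sketch of the receiver-side logic described above: parse a frame received
 * from the Zigbee module and map it to an action code forwarded to the PC.
 * Frame layout, codes and thresholds are assumptions. */
#include <stdint.h>
#include <stdio.h>

enum { ACT_NONE = 0, ACT_SELECT_APP, ACT_MOUSE_CLICK, ACT_HEAD_TILT };

static int decode_frame(const uint8_t *frame, int len)
{
    if (len < 8 || frame[0] != 0xAA) return ACT_NONE;  /* bad or short frame */
    if (frame[2]) return ACT_MOUSE_CLICK;              /* tooth click has priority */
    if (frame[1]) return ACT_SELECT_APP;               /* eye movement detected */
    if (frame[7] > 20) return ACT_HEAD_TILT;           /* coarse head-tilt threshold */
    return ACT_NONE;
}

int main(void)
{
    uint8_t frame[8] = { 0xAA, 1, 0, 0, 10, 0xFF, 0xFD, 15 };
    printf("action code sent to PC: %d\n", decode_frame(frame, 8));
    return 0;
}
```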
4.3 Peripheral Devices
4.3.1 Zigbee Module – Zigbee Physical Layer
ZigBee is a wireless technology developed as an open global standard to address the unique needs of low-cost, low-power wireless M2M networks. The ZigBee standard is built on the IEEE 802.15.4 physical radio specification and operates in unlicensed bands including 2.4 GHz, 900 MHz, and 868 MHz. The 802.15.4 specification upon which the ZigBee stack operates was ratified by the Institute of Electrical and Electronics Engineers (IEEE) in 2003. The specification is a packet-based radio protocol intended for low-cost, battery-operated devices. The protocol allows devices to communicate in a variety of network topologies and can have a battery life lasting several years.
4.3.2 Zigbee Protocol
The ZigBee protocol has been created and ratified by member companies of the ZigBee Alliance. Over 300 leading semiconductor manufacturers, technology firms, OEMs and service companies comprise the ZigBee Alliance membership. The ZigBee protocol was designed to provide an easy-to-use wireless data solution characterized by secure, reliable wireless network architectures.
4.3.3 Zigbee Advantages
ZigBee protocol features include:
Support for multiple network topologies such as point-to-point, point-to-multipoint and mesh networks
Low duty cycle – provides long battery life
Low latency
Direct Sequence Spread Spectrum (DSSS)
Up to 65,000 nodes per network
128-bit AES encryption for secure data connections
4.3.4 PIC Microcontroller
[Figure: PIC microcontroller]
4.3.5 Mesh Networks
A key component of the ZigBee protocol is the ability to support mesh networking. In a mesh network, nodes are interconnected with other nodes so that multiple pathways connect each node. Connections between nodes are dynamically updated and optimized through a sophisticated, built-in mesh routing table. Mesh networks are decentralized in nature; each node is capable of self-discovery on the network. Also, as nodes leave the network, the mesh topology allows the remaining nodes to reconfigure routing paths based on the new network structure. The characteristics of mesh topology and ad-hoc routing provide greater stability under changing conditions or failure at single nodes, as illustrated in the sketch below.
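The following is a highly simplified conceptual sketch of this rerouting behaviour (it is not ZigBee's actual routing protocol): when the preferred next hop for a destination becomes unreachable, the node falls back to another reachable neighbour in its table.

```c
/* Simplified conceptual sketch of mesh rerouting (not ZigBee's actual
 * routing protocol): when the preferred next hop becomes unreachable,
 * fall back to any other reachable neighbour in the table. */
#include <stdio.h>

#define MAX_NEIGHBOURS 4

typedef struct {
    int id;
    int reachable;   /* updated as nodes join or leave the network */
} neighbour_t;

static int next_hop(const neighbour_t *tbl, int n, int preferred)
{
    for (int i = 0; i < n; i++)                    /* try the preferred route first */
        if (tbl[i].id == preferred && tbl[i].reachable) return preferred;
    for (int i = 0; i < n; i++)                    /* otherwise pick any live neighbour */
        if (tbl[i].reachable) return tbl[i].id;
    return -1;                                     /* no route available */
}

int main(void)
{
    neighbour_t tbl[MAX_NEIGHBOURS] = { {2, 0}, {3, 1}, {4, 1}, {5, 0} };
    printf("next hop: %d\n", next_hop(tbl, MAX_NEIGHBOURS, 2)); /* node 2 left -> reroute */
    return 0;
}
```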
4.3.6 Zigbee Applications
ZigBee enables broad-based deployment of wireless networks with low-cost, low-power solutions. It provides the ability to run for years on inexpensive batteries for a host of monitoring and control applications. Smart energy/smart grid, AMR (Automatic Meter Reading), lighting controls, building automation systems, tank monitoring, HVAC control, medical devices and fleet applications are just some of the many spaces where ZigBee technology is making significant advancements.

CONCLUSION

The main advantage of this work is that it reduces the barriers faced by people with disabilities so that they can experience the world as others do. Such users can control or operate computer applications through eye-movement gestures, perform interactive operations such as mouse clicks through a tooth click, and carry out gaming, swiping, page scrolling, and similar tasks through head movements sensed by a MEMS device.
The system also provides a complete replacement for wired communication. Because PIR sensors do not emit IR radiation, the sensor can be placed near the eye for long periods. With its higher-performance microcontroller, the system provides a better human-computer interface and gives disabled users a way to operate the computer fully, with typing also possible.

References

  1. B. Bridgeman, "Conscious vs. unconscious processes: The case of vision", Theory and Psychology, Vol. 2, No. 1, pp. 73-88, 1992.
  2. Q. Ji, H. Wechsler, A. Duchowski, M. Flickner, "Special issue: eye detection and tracking", Computer Vision and Image Understanding, Vol. 98, No. 1, pp. 1-3, 2005.
  3. S. Kawato, N. Tetsutani, "Detection and tracking of eyes for gaze-camera control", The 15th International Conference on Vision Interface, Calgary, May 27-29, 2002.
  4. J. Kim, "A simple pupil-independent method for recording eye movements in rodents using video", Journal of Neuroscience Methods, Vol. 138, No. 1-2, pp. 165-171, 2004.
  5. C. Morimoto, M. Mimica, "Eye gaze tracking techniques for interactive applications", Computer Vision and Image Understanding, Vol. 98, No. 1, pp. 4-24, 2005.
  6. Q. Ding, K. Tong, G. Li, "Development of an EOG (Electro-Oculography) Based Human-Computer Interface", Engineering in Medicine and Biology 27th Annual Conference, Shanghai, China, September 1-4, 2005.
  7. N. Phuong, "Digital Correlation", Vietnam OpenCourseWare module, Jul 2008.