Robots are employed in a variety of applications and are available in a wide range of configurations. The need to respond to the environment without using the nervous system’s efferent pathways has initiated a new interaction system that can boost and speed up the human sensor-effector system. To maximize human and machine interaction, the Human Threading™ technique has been developed to merge observations made in the human cognitive system, neuro-anatomical structures, finite state machines and their associated relationships. The Brain-Computer Interface (BCI) is used to create a robust communication system that can translate human intentions and cognitive emotions, reflected by appropriate brain signals, into control signals for robotic manipulation. Efficient brain-computer interfaces use neural signal recording devices that are able to record neural signals continuously over long periods of time through Positron Emission Tomography (PET), functional Magnetic Resonance Imaging (fMRI), functional Near-Infrared Imaging (fNIR), Electroencephalography (EEG) and Electrocorticography (ECoG). This paper presents a critical review of brain-computer interface systems and robotics for manufacturing applications.
Keywords
BCI, EEG, ECoG, Robotic Control, Human Threading
INTRODUCTION
The need to respond to our environment without using our nervous system’s efferent pathways has initiated new interaction systems that can boost and speed up the human sensor-effector system. The recent trend in the study of neuroscience has created avenues for improving the brain-computer interface (BCI), and research has started exploring the vast range of applications in different fields that can benefit from such improvements to the BCI system. These applications include mechatronic systems control and robotics, communication, neuroprosthetics, environmental control, and electronic device coordination and control [10].
Robots are employed in a variety of applications and are available in a wide range of configurations. Recent academic research has aimed at improving the usage of robots through advanced control methods. These advanced control methods include model-based techniques for adaptive control [32, 33] and intelligent control methods using computational and neurodynamic techniques [34, 35] modelled to predict human cognitive states. Varying degrees of success have been demonstrated in the use of neuroscience in robotics, and the applications of the several advanced methods are often restricted to the development of commercial systems [25].
Millions of people in the world suffer from severe motor dysfunctions, with or without lower and upper extremity impairments. For a person with such motor dysfunction, it is almost impossible to interact with the environment. Efficient robotic systems that integrate a sensory subsystem and a brain-machine interface and provide autonomous or semi-autonomous movements are desired by such individuals, as their scalp electric potentials can be exploited to their advantage [9]. Parikh et al. [18] provided an integrated solution for motion planning and control with human inputs that includes interactions from the user’s brain with the controller in generating commands for controlling a wheelchair. Mazo [16] demonstrated the possibility of controlling a wheelchair using head movements, signals from electro-oculography and other sensors.
The non-invasive EEG-based brain-computer interface (BCI) provides an integrated communication channel that does not require motor function, allowing individuals to interact with the environment around them. They can interact with the external world by controlling devices such as a wheelchair, a robotic arm or a computer. The brain-computer interface is also useful to able-bodied human beings for interaction with media applications, virtual environments and games. Most brain-computer interface research has been carried out on trial-based continuous control systems. The trials require that the participants maintain sustained attention and regulate their brain activities in order to obtain the desired results. The trial-based approach has prompted the development of self-paced or asynchronous systems for continuous BCI evaluation. The self-paced system differentiates between the “Intentional Control” state and the “No Control” state of the human mind [20] (a minimal sketch of this distinction is given below). To maximize human-machine interaction, the Human Threading™ technique has been developed to merge observations made in the human cognitive system, neuro-anatomical structures, finite-state machines and their associated relationships [15]. This technique is used to clear the uncertainties in the physiological inefficiencies that exist between human beings and machines. The concept of using interwoven technological designs in current research involving cognitive neuroscience, electrical engineering, computer science, psychology, mechatronic systems and robotics may provide an unlimited array of artefact creation if particular guiding principles are followed, in contrast to design science.
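The distinction between the “Intentional Control” and “No Control” states can be made concrete with a small sketch. The example below is only an illustration of the idea under assumed conditions (a single channel, a fixed sampling rate, a mu-band power threshold calibrated from a resting baseline); it is not the paradigm of [20].

```python
import numpy as np

def band_power(epoch, fs, low, high):
    """Mean spectral power of an EEG epoch within a frequency band (FFT estimate)."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / epoch.size
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def classify_state(epoch, fs, baseline_mu, erd_ratio=0.5):
    """Label an epoch 'IC' (Intentional Control) when mu-band (8-12 Hz) power drops
    below a fraction of the resting baseline (event-related desynchronisation),
    otherwise 'NC' (No Control)."""
    mu = band_power(epoch, fs, 8.0, 12.0)
    return "IC" if mu < erd_ratio * baseline_mu else "NC"

if __name__ == "__main__":
    fs = 256                                   # assumed sampling rate in Hz
    t = np.arange(fs) / fs                     # one-second epochs
    rest = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(fs)          # strong mu rhythm
    intent = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.randn(fs)  # suppressed mu
    baseline_mu = band_power(rest, fs, 8.0, 12.0)
    print(classify_state(rest, fs, baseline_mu), classify_state(intent, fs, baseline_mu))
```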
The Human Threading™ system consists of recursive linear systems. These include the observation of human interaction with a machine, the design of an efficient system of interaction between human and machine, and the output system for the new relationship formed between humans and machines at the least cost and highest operational efficiency. The efficiency of the Human Threading™ methodology relies on its ability to combine measurements from functional Magnetic Resonance Imaging (fMRI), neural firings, Electroencephalography (EEG), infrared spectral analysis, Transcranial Doppler Sonography (TDS), interaction-based time complexities and galvanic skin response to refine human physiological dynamics and determine the efficient usage of brain resources [15].
BRAIN-COMPUTER INTERFACE (BCI)
The prime purpose of a brain-computer interface is to create a robust communication system that can translate human intentions and cognitive emotions, reflected by appropriate brain signals, into control signals for robotic manipulation. In addition, the BCI system is designed to increase the autonomy of individuals with severe motor disabilities by providing new communication pathways and control options [3]. According to the definition put together at the first international meeting on BCI systems, the data handled by a BCI system “must not depend totally on the brain’s normal output pathways of peripheral nerves and muscles” [30]. The definition created reasonable bounds for harnessing signals with useful information regardless of their origin on the human body. The different methods used in tapping EEG signals rely on non-invasive and invasive EEG systems. The non-invasive EEG system uses a BCI system that analyses signals arising from non-evoked potentials.
In contrast, BCI systems using evoked potentials achieve higher data transfer rates than BCI systems that work with un-stimulated brain signals. The inefficiency of evoked systems lies in the user being exhausted after long usage of the system, as the user is constantly faced with stimuli [5]. An invasive BCI system makes use of single-neuron activity and outputs signals with higher spatial resolution. The signals from the invasive system depend on the electrodes placed on the cortex and provide control signals with many degrees of freedom. The limitation faced in the usage of EEG signals for communication and control lies in the fact that an EEG-based BCI system has limited resolution and requires extensive training. The single-neuron system also has significant clinical risks and limited stability. These limitations are overcome through the use of Electrocorticographic (ECoG) activity recorded from the surface of the brain. ECoG activity allows users to control one-dimensional robotic signals rapidly and accurately. The identification and training in the usage of ECoG signals provides the platform for a closed-loop control system for one-dimensional binary activity. It is also useful and stable for applications requiring open-loop control such as two-dimensional joystick movements [14].
The transformation of brain activity into the direct control of computer components and mechanical hardware without the use of the peripheral nervous system is gaining attention as a way to provide control options for paraplegic patients and robotics in general. The need for brain activity transformation has led to the development of methods that can acquire EEG signals, analyse them in the temporal or frequency domain, and translate them into appropriate control commands for hardware manipulation. The Brain-Computer Interface (BCI), also known as the Brain-Machine Interface (BMI), is a system that translates neural activity of the human brain into signals and commands that can be used in controlling machines and robots. The three main sub-systems of the BCI are listed below, followed by a minimal pipeline sketch:
• The Electrodes: These are the devices used for recording neural activity from the brain. The recordings can be invasive or non-invasive analog neural population signals, for example scalp field potentials measured with electrodes placed on the scalp. The readings and measurements of the field potentials can restrict the manner in which the potential functionality of a BCI is implemented.
• The End-effector: The end-effector is the device controlled by the neural signals measured from the scalp. The end-effector can be anything from a robotic arm, visual signal or computer game to a complicated prosthetic system.
• The Algorithm: The algorithm analyses and interprets the measured neural signals into command signals. The algorithm forms the link between the measuring device and the end-effector. It determines which sections of the recorded neural activity can be used for robotic movements and control and which commands can be generated from the recorded activity.
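How the three sub-systems fit together can be shown with a minimal closed-loop pipeline sketch in Python. The signal source, the toy decoding rule and the command names are assumptions made purely for illustration; a real BCI would replace each stage with actual acquisition hardware, a trained decoder and a physical end-effector.

```python
import numpy as np

def read_electrodes(n_channels=8, n_samples=256):
    """Electrode sub-system: stand-in for a scalp EEG acquisition device."""
    return np.random.randn(n_channels, n_samples)   # placeholder signal

def decode(signal):
    """Algorithm sub-system: map measured neural activity to a command.
    Here a toy rule on channel variance; a real BCI would use a trained model."""
    left, right = signal[:4].var(), signal[4:].var()
    return "MOVE_LEFT" if left > right else "MOVE_RIGHT"

def actuate(command):
    """End-effector sub-system: forward the decoded command to a robot arm,
    wheelchair, cursor or prosthesis."""
    print(f"end-effector received: {command}")

if __name__ == "__main__":
    for _ in range(3):                        # one closed-loop iteration per epoch
        actuate(decode(read_electrodes()))
```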
There have been substantial applications of BCI in the rehabilitation, treatment and care of disabled and paralysed patients, with the intent of developing an efficient communication channel that restores and improves their social interaction with the outside world. The application also extends to the restoration of movement capabilities of patients by using signals from neural activity to drive prosthetic devices [28].
The EEG Electrode
The motor pathways in the human body, which the brain uses for communication and control of emotions and motions, can be disrupted by many disorders such as brain-stem stroke and amyotrophic lateral sclerosis. Individuals with communication difficulty resulting from a damaged nervous system that cannot be repaired can restore their communication capabilities through functional augmentation of the remaining pathways, data diversion around points of damage, or providing the brain with a whole new set of channels for communication and control. EEG activity can provide the platform for creating such communication channels, and studies have shown that humans have the ability to control EEG phenomena. Single-channel EEG-based BCI systems have a low data transfer rate that can nevertheless be useful for individuals with severe motor disabilities. The development of multi-channel BCI systems increases the capacity of EEG-based communication systems, thereby increasing the possibilities and applications in communication and control of robots [31].
Brain signals are detected and measured using various techniques. The techniques include the recording of electric or magnetic fields, Positron Emission Tomography (PET), functional Magnetic Resonance Imaging (fMRI) and functional Near-Infrared Imaging (fNIR). Brain activity can be recorded at the scalp using EEG methods, at the cortical surface using electrocorticographic (ECoG) methods, or within the brain through local field potentials, neuronal action potentials or spikes [7]. Efficient brain-computer interfaces use neural signal recording devices that are able to record neural signals continuously over long periods of time. EEG recordings are made from electrodes placed on the scalp, capturing the average electrical activity directly below each electrode. The recordings reflect the electrical activity of synchronously firing pyramidal cells. EEG signals are obtained non-invasively by placing Ag/AgCl electrodes on the scalp and contain data in a relatively narrow frequency band. Recent BCI research has introduced the use of intra-cortical extracellular microelectrodes inserted into the cerebral cortex [4].
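Because the useful content of scalp EEG sits in a relatively narrow frequency band, a typical first processing step is band-pass filtering of each channel. The sketch below is a minimal illustration using SciPy; the sampling rate and cutoff frequencies are assumed values, not prescriptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass(eeg, fs, low=1.0, high=40.0, order=4):
    """Zero-phase Butterworth band-pass filter for a single EEG channel."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, eeg)

if __name__ == "__main__":
    fs = 256                                   # assumed sampling rate in Hz
    t = np.arange(4 * fs) / fs
    raw = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)  # mu rhythm + mains noise
    clean = bandpass(raw, fs)
    print(raw.std(), clean.std())              # the 60 Hz component is attenuated
```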
SPREADSHEET AND EEG ANALYSIS
Numerical data are often used in the analysis of robotic signals and commands. The spreadsheet is a handy tool because its utilization cuts across various disciplines; its popularity is a result of its simplicity, short learning curve, functional power, attractiveness and high productivity in usage. EEG data sets can be made of millions of rows and several columns corresponding to electrode recordings, and research in neuroscience has demonstrated that such data sets can be used to classify electrodes. EEG data sets are so huge that it became necessary to develop and use tools such as TableLab to manage them. TableLab expanded the common functionality of spreadsheets with huge text file partitioning, long table visualization and processing, random number generation, signal analysis and generation, and EEG cluster analysis [1].
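In the same spirit (though not TableLab itself), very large electrode-by-sample tables can be processed in manageable pieces. The sketch below streams a hypothetical CSV file in chunks with pandas and accumulates per-electrode statistics; the file name and column layout are assumptions.

```python
import pandas as pd

def electrode_summary(csv_path, chunksize=100_000):
    """Stream a huge EEG table (one column per electrode, one row per sample)
    and accumulate per-electrode mean and variance without loading it all."""
    count, total, total_sq = 0, None, None
    for chunk in pd.read_csv(csv_path, chunksize=chunksize):
        count += len(chunk)
        total = chunk.sum() if total is None else total + chunk.sum()
        total_sq = (chunk ** 2).sum() if total_sq is None else total_sq + (chunk ** 2).sum()
    mean = total / count
    var = total_sq / count - mean ** 2
    return pd.DataFrame({"mean": mean, "variance": var})

# Example (file name is hypothetical):
# print(electrode_summary("eeg_recording.csv"))
```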
THE NEURAL CODE
The growing interest in neuroscience has been in how to make sense of the signals measured from the human brain so as to expand the field of robotics. The “rate code” view, which models neural activity as a signal plus uncorrelated noise, holds that the temporal structure of neural spike trains in EEG data is uncorrelated noise that is not suitable for brain data processing. Event-related potentials (ERPs) are recovered by averaging the noisy signals experimentally over repeated trials [22], as in the sketch below. The method assumes that the variability reflects noise which, being uncorrelated with the true signal, could be overcome by the brain through relevant averaging of the neural signals. The temporal code, in contrast, suggests that precise neural spike timing represents time-varying cognitive, sensory or motor signals. The temporal code treats high-frequency EEG components as signals instead of noise, even during spontaneous activity [24].
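The trial-averaging step behind ERP recovery is easy to illustrate. The sketch below uses synthetic data under the rate-code assumption that trial-to-trial variability is uncorrelated noise, so the average converges to the event-related waveform as more trials are included; the waveform shape, noise level and trial count are assumed values.

```python
import numpy as np

fs, n_trials = 256, 100
t = np.arange(fs) / fs                               # one-second epochs
erp = 2.0 * np.exp(-((t - 0.3) ** 2) / 0.002)        # synthetic event-related deflection
trials = erp + np.random.randn(n_trials, fs) * 3.0   # ERP buried in uncorrelated noise

average = trials.mean(axis=0)                        # noise shrinks roughly as 1/sqrt(n_trials)
print(np.abs(average - erp).max())                   # residual noise after averaging
```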
The output spike train of the integrate-and-fire neuron model is usually regular. The integrate-and-fire neuron transforms the current signal into a frequency-modulated neural spike train based on this regular output. As a result of the regular output of the integrate-and-fire neuron, the efficiency of the model may be limited by the presence of discrete spectral components at the output frequency and its multiples [2]. The limitations associated with the regular output of the integrate-and-fire neuron model are eliminated in the Poisson neuron model, which has a random output. The randomness of the output improves the efficiency of the process that transforms the continuous somatic signal into a neural spike train. The Poisson output has a white-noise component resulting from the randomness of the output but no discrete spectral noise components, as opposed to the regular output of the integrate-and-fire neuron [11].
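The contrast between the regular output of an integrate-and-fire neuron and the random output of a Poisson neuron can be seen in a short simulation. The parameter values below (input current, firing rate, time step) are illustrative assumptions.

```python
import numpy as np

def integrate_and_fire(current, dt=1e-3, threshold=1.0, duration=1.0):
    """Leak-free integrate-and-fire: a constant current gives a perfectly regular spike train."""
    v, spikes = 0.0, []
    for step in range(int(duration / dt)):
        v += current * dt
        if v >= threshold:
            spikes.append(step * dt)
            v = 0.0
    return np.array(spikes)

def poisson_neuron(rate, dt=1e-3, duration=1.0):
    """Poisson neuron: the same mean rate, but spike times are random."""
    steps = int(duration / dt)
    return np.nonzero(np.random.rand(steps) < rate * dt)[0] * dt

if __name__ == "__main__":
    regular = integrate_and_fire(current=20.0)             # ~20 spikes/s, evenly spaced
    random_ = poisson_neuron(rate=20.0)                    # ~20 spikes/s, irregular
    print(np.diff(regular).std(), np.diff(random_).std())  # inter-spike variability: ~0 vs large
```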
Neural coding shaped through the understanding of noise in EEG data sets offers better precision through adaptive modelling of the white noise generated from the random output. In the neural spike encoding and signal reconstruction process based on noise-shaping neural coding, the somatic current signal i(t), after passing through a dendritic low-pass filter for band limiting, is encoded into a neural impulse train y(t) = Σδ(t - ti). A change in the input frequency at the electrodes leads to a linearly proportional change in the output frequency, and this change transforms the underlying somatic membrane potential v(t) into a Poisson-like random neural spike train with additive white noise E, as expressed in (1), where fm is the mean firing frequency. The system operates as a closed negative-feedback loop in order to minimize the underlying somatic membrane potential [22], as represented in (2), where * denotes the convolution integral, i(t) the input current and if(t) the total negative-feedback current after a spike. The noise spectrum N(f) of the output neural spike train y(t) of the noise-shaping neuron, expressed in (3), is shaped by the simplest noise-shaping filter G(f) [23], where N denotes the order of noise shaping associated with the negative-feedback function hf(t) [22].
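The expressions referred to above as (1)-(3) are not reproduced here; the block below sketches the general form such noise-shaping relations take, under the assumption that the model follows the standard formulation of [22, 23].

```latex
% General form only; the exact expressions are those of [22, 23].
\begin{align}
  y(t) &\approx f_m \,[\, 1 + v(t) \,] + E(t), \tag{1}\\
  v(t) &= h(t) * [\, i(t) - i_f(t) \,], \tag{2}\\
  N(f) &\propto |\, 1 - G(f) \,|^{2N}\, S_E(f), \tag{3}
\end{align}
```

where, in this sketch, h(t) stands for the dendritic low-pass impulse response and S_E(f) for the flat spectrum of the white noise E.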
Neural code classification
Human beings exchange information and communicate with each other through verbal and non-verbal means. The extension of non-verbal modes of communication to computer agents, robots and machines is becoming more and more interesting as its applications are widening on a daily basis. Emotion recognition, a non-verbal mode of communication, has been investigated using speech, images, gestures, facial expressions and physiological signals. The implementation of emotion recognition technology using EEG signals and gestures has added weight to the advancement of control and coordination of robots in recent years. The EEG signals are generated from bio-potential signals on the scalp, and gestures are generated by moving the wrist and the hand. The study of gesture recognition is crucial in understanding and recognising human emotion. Important to the study is the use of adequate EEG and action-recognition equipment to capture the bio-potential signals, as emotions are caused or induced through the stimulus of objects in the environment [13].
The different states of emotion induced through a picture-based stimulus system are standardised by the International Affective Picture System (IAPS). The system allows for classification of the human mental condition into tension/relaxation, pleasant/unpleasant and excitement/calmness [21]. These emotions can be used to generate EEG signals that are useful for robotic communication and control. Adaptation to events and tasks, decision making and the social interaction of robots are highly dependent on the ability of human beings to embed human moods and emotional states in robots. In social interaction it is critical that emotional intelligence plays an important role in the EEG signal adaptation and learning process. Research in cognitive intelligence and neuroscience has demonstrated that emotions are major components of intelligent thinking and intelligent behaviour [19]. Renowned techniques use statistical-based [27] and wavelet-based [17] analysis of EEG signals for feature extraction, coupled with support vector machines (SVM) [6], fuzzy k-means [8] and fuzzy c-means [26] for classification. The recognition of emotions using an EEG-based recognition system through artificial stimulation of emotional states removes the disadvantages introduced by other emotion recognition techniques, as the technique has minimal influence on the central nervous system signals [19]. One implementation of the technique is an EEG-based user-independent emotion recognition system using features derived from higher-order-crossings analysis [12, 19].
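A minimal sketch of the feature-extraction-plus-SVM route is given below using scikit-learn. The synthetic epochs, the simple statistical features and the two class labels are all assumptions for illustration; they do not reproduce the higher-order-crossings features of [12, 19].

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

def statistical_features(epoch):
    """Per-epoch statistical features: mean, standard deviation, mean absolute first difference."""
    return [epoch.mean(), epoch.std(), np.abs(np.diff(epoch)).mean()]

# Synthetic "calm" vs "excited" epochs (placeholders for real labelled EEG).
rng = np.random.default_rng(0)
calm = [np.sin(2 * np.pi * 10 * np.linspace(0, 1, 256)) + 0.3 * rng.standard_normal(256)
        for _ in range(50)]
excited = [np.sin(2 * np.pi * 25 * np.linspace(0, 1, 256)) + 0.3 * rng.standard_normal(256)
           for _ in range(50)]

X = np.array([statistical_features(e) for e in calm + excited])
y = np.array([0] * 50 + [1] * 50)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf").fit(X_train, y_train)   # SVM classifier on the extracted features
print("held-out accuracy:", clf.score(X_test, y_test))
```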
EEG SIGNAL PERFORMANCE MEASURE
Bit rate is the standard yardstick for measuring the data transfer capability of a BCI system, and it shows a direct relationship to how efficient and responsive the system would be. The bit rate expressed in (4) depends on both speed and accuracy. Consider a selection task with N possible targets, each of which has the same probability of being the one that the user desires. If the probability P that the desired target will actually be selected is always the same, and if each of the undesired targets has the same probability of being chosen, then the bit rate B for transferring such data is expressed as [5]:

B = log2 N + P log2 P + (1 - P) log2 [(1 - P)/(N - 1)]    (4)
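The expression can be evaluated directly; the short sketch below implements it, with the selections-per-minute figure an assumed value used only to convert bits per selection into bits per minute.

```python
import math

def bit_rate(n_targets, p, selections_per_minute=12):
    """Bits per selection for N equiprobable targets with accuracy p, and the
    corresponding bits per minute at an assumed selection speed."""
    bits = math.log2(n_targets)
    if 0 < p < 1:
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n_targets - 1))
    return bits, bits * selections_per_minute

print(bit_rate(n_targets=4, p=0.9))   # e.g. 4 targets selected with 90 % accuracy
```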
THE PERFORMANCE OF HUMANS AND THE MANUFACTURING ENVIRONMENT
Recent research has shown that there are marked differences in the performance of individual human beings and, as such, there are variations in their EEG signals. This phenomenon makes training algorithms for emotion recognition and for physiological changes in the brain difficult, and at the same time it has prompted the development of learning and adaptation algorithms for EEG pattern recognition. Understanding the sequence of these changes goes hand in hand with understanding the techniques, including training and pharmacological interventions, by which the changes can be controlled. Recently, there has been clear substantiation that interventions based on brain plasticity can fix deficits arising from degeneration, environmental stress, disease, psychiatric problems and trauma. The neurological basis for brain plasticity is the biochemical processes concerned with transmitting signals between neurons, thereby generating EEG signals. Brain plasticity is the process of change in synapses, during which rewiring or refining of brain function can occur [29]. The plasticity of the brain allows for neuroplasticity-based techniques which are useful in enhancing the effectiveness of cognitive recognition.
The control of prosthetic devices and robotic arms using EEG signals has created a new level of communication between humans and machines that can be extended to the manufacturing environment. The search for better ways of coordination and control within and among robots has prompted the integration of brain waves into the communication systems of robots.
CONCLUSION
The response of the human brain to events in the environment has proven to be a source of EEG signal generation for the coordination and control of robots. The development of the communication interface between the human mind and machines has increased the chances of integrating valuable human capital back into the manufacturing environment and of interacting effectively with that environment. Adaptation and decision making in robots are improved through social interaction that can be coordinated using the brain-computer interface. The brain-computer interface has made it possible for humans to communicate with machines using human thoughts, intentions, and the cognitive and affective states of the mind. The integration of human threading into machine and robot communication systems will improve the efficiency and performance of machines and robots in a human-coordinated environment. The brain-computer interface has provided the next level of communication between the human mind, robots and machines.
|
References
[1] M. Ayala, M. Cabrerizo, M. Tito, A. Barreto, and M. Adjouadi. "A Spreadsheet Application for Processing Long-term EEG Recordings". Computers in Biology and Medicine, vol. 39, pp. 844-851, 2009.
[2] E. J. Bayly. "Spectral Analysis of Pulse Frequency Modulation in the Nervous System". IEEE Transactions on Biomedical Engineering, vol. BME-15, pp. 257-265, 1968.
[3] B. Blankertz, G. Dornhege, M. Krauledat, K.-R. Muller, V. Kunzmann, and F. Losch. "The Berlin Brain-Computer Interface: EEG-based Communication Without Subject Training". IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 20, pp. 1-6, 2006.
[4] J. S. Brumberg, A. Nieto-Castanon, P. R. Kennedy, and F. H. Guenther. "Brain-Computer Interfaces for Speech Communication". Speech Communication, vol. 52, pp. 367-379, 2010.
[5] M. Cheng, X. Gao, S. Gao, and D. Xu. "Design and Implementation of a Brain-Computer Interface with Higher Transfer Rates". IEEE Transactions on Biomedical Engineering, vol. 49, no. 10, pp. 1181-1186, 2002.
[6] N. Cristianini and J. Shawe-Taylor. "An Introduction to Support Vector Machines and Other Kernel-Based Learning Methods". Cambridge, UK: Cambridge University Press, 2000.
[7] J. J. Daly and J. R. Wolpaw. "Brain-Computer Interfaces in Neurological Rehabilitation". Lancet Neurology, vol. 7, pp. 1032-1043, 2008.
[8] J. J. De Gruijter and A. McBratney. "A Modified Fuzzy k-means for Predictive Classification". In H. H. Bock (Ed.), Classification and Related Methods of Data Analysis, pp. 97-104. Amsterdam, the Netherlands: Elsevier, 1988.
[9] C. De La Cruz, W. C. Celeste, and T. F. Bastos. "A Robust Navigation System for Robotic Wheelchairs". Control Engineering Practice, vol. 19, pp. 575-590, 2011.
[10] F. Galan, M. Nuttin, E. Lew, P. W. Ferrez, G. Vanacker, and J. Philips. "A Brain-Actuated Wheelchair: Asynchronous and Non-Invasive Brain-Computer Interfaces for Continuous Control of Robots". Clinical Neurophysiology, vol. 119, pp. 2159-2169, 2008.
[11] G. Gestri. "Pulse Frequency Modulation in Neural Systems". Biophysical Journal, vol. 11, pp. 98-109, 1971.
[12] B. Kedem. "Time Series Analysis by Higher Order Crossings". New Jersey: IEEE Press, 1994.
[13] H.-D. Kim, H.-C. Yang, and K.-B. Sim. "Emotion Recognition Method for Driver Services". International Journal of Fuzzy Logic and Intelligent Systems, vol. 7, no. 4, pp. 221-308, 2007.
[14] E. C. Leuthardt, G. Schalk, J. R. Wolpaw, J. G. Ojemann, and D. W. Moran. "A Brain-Computer Interface using Electrocorticographic Signals in Humans". Journal of Neural Engineering, vol. 1, pp. 63-71, 2004.
[15] C. Liapis. "A Primer to Human Threading". Computers in Human Behavior, vol. 27, pp. 138-143, 2011.
[16] M. Mazo. "An Integrated System for Assisted Mobility". IEEE Robotics and Automation Magazine, vol. 8, no. 1, pp. 46-56, 2001.
[17] M. Murugappan, M. Rizon, R. Nagarajan, S. Yaacob, I. Zunaidi, and D. Hazry. "EEG Feature Extraction for Classifying Emotions using FCM and FKM". Computers and Communication, vol. 1, pp. 21-25, 2007.
[18] S. P. Parikh, V. Grassi, V. Kumar, and J. Okamoto. "Integrating Human Inputs with Autonomous Behaviors on an Intelligent Wheelchair Platform". IEEE Intelligent Systems, vol. 22, no. 2, pp. 18-24, 2007.
[19] P. C. Petrantonakis and L. J. Hadjileontiadis. "Emotion Recognition from EEG Using Higher Order Crossings". IEEE Transactions on Information Technology in Biomedicine, vol. 14, no. 2, pp. 186-197, 2010.
[20] K. Qian, P. Nikolov, D. Huang, D.-Y. Fei, X. Chen, and O. Bai. "A Motor Imagery-Based Online Interactive Brain-Controlled Switch: Paradigm Development and Preliminary Test". Clinical Neurophysiology, vol. 121, pp. 1304-1313, 2010.
[21] J. A. Russell. "Evidence of Convergent Validity on the Dimensions of Affect". Journal of Personality and Social Psychology, vol. 36, no. 10, pp. 1152-1168, 1978.
[22] J. Shin. "A Unifying Theory on the Relationship Between Spike Trains, EEG, and ERP Based on the Noise Shaping/Predictive Neural Coding Hypothesis". BioSystems, vol. 67, pp. 245-257, 2002.
[23] J. Shin. "Adaptive Noise Shaping Neural Spike Encoding and Decoding". Neurocomputing, vol. 38, pp. 369-381, 2001.
[24] J. Shin and A. Talnov. "A Single Trial Analysis of Hippocampal Theta Frequency during Nonsteady Wheel Running in Rats". Brain Research, vol. 897, pp. 217-221, 2001.
[25] M. Short and K. Burn. "A Generic Controller Architecture for Intelligent Robotic Systems". Robotics and Computer-Integrated Manufacturing, vol. 27, pp. 292-305, 2011.
[26] K. G. Srinivasa, K. R. Venugopal, and L. M. Patnaik. "Feature Extraction using Fuzzy C-Means Clustering for Data Mining Systems". International Journal of Computer Science and Network Security, vol. 6, no. 3, pp. 230-236, 2006.
[27] K. Takahashi. "Remarks on Emotion Recognition from Bio-Potential Signals". 2nd International Conference on Autonomous Robots and Agents, pp. 186-191, Palmerston North, New Zealand, 2004.
[28] S. Waldert, T. Pistohl, C. Braun, T. Ball, A. Aertsen, and C. Mehring. "A Review on Directional Information in Neural Signals for Brain-Machine Interfaces". Journal of Physiology - Paris, vol. 103, pp. 244-254, 2009.
[29] E. Williams. "Human Performance". McLean, Virginia: JASON, The MITRE Corporation, 2008.
[30] J. R. Wolpaw, N. Birbaumer, W. J. Heetderks, D. J. McFarland, P. H. Peckham, and G. Schalk. "Brain-Computer Interface Technology: A Review of the First International Meeting". IEEE Transactions on Rehabilitation Engineering, vol. 8, no. 2, pp. 164-173, 2000.
[31] J. R. Wolpaw and D. J. McFarland. "Multichannel EEG-Based Brain-Computer Communication". Electroencephalography and Clinical Neurophysiology, vol. 90, pp. 444-449, 1994.
[32] H. Seraji. "Nonlinear and Adaptive Control of Force and Compliance in Manipulators". International Journal of Robotics Research, vol. 17, no. 5, pp. 467-484, 1998.
[33] M. Tarokh. "A Discrete-Time Adaptive Control Scheme for Robot Manipulators". Journal of Robotic Systems, vol. 7, no. 2, pp. 145-166, 1990.
[34] R. Kozma. "Intentional Systems: Review of Neurodynamics, Modelling and Robotics Implementation". Physics of Life Reviews, vol. 5, pp. 1-21, 2008.
[35] O. Bai, V. Rathi, P. Lin, D. Huang, H. Battapady, D.-Y. Fei, L. Schneider, E. Houdayer, X. Chen, and M. Hallett. "Prediction of Human Voluntary Movement Before it Occurs". Clinical Neurophysiology, vol. 122, pp. 364-372, 2011.