Adaptive Brain Interfaces (ABI) is part of the European Union's ESPRIT programme in Information Technology, with the central aim of extending the capabilities of physically impaired people to access new services and opportunities. The ABI is a portable brain-computer interface based on the analysis of electroencephalogram (EEG) signals, combined with a P300-based speller interface.
A cap with a few integrated electrodes acquires brain signals, which are pre-processed and sent to a computer for further analysis. The portable brain interface has an embedded neural network classifier that recognizes which mental task the wearer is concentrating on. It does so by analyzing continuous variations of EEG signals over several cortical areas of the brain. Each mental task is associated with a simple command. This enables people to communicate using their brain activity: the interface only requires users to be conscious of their thoughts and to concentrate sufficiently on the mental expression of the commands required to carry out the desired task. By composing command sequences (thoughts), the user can read a web page, interact with games, turn on appliances, or even guide a wheelchair.
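As a rough illustration of associating recognized mental tasks with commands, the following Python sketch maps a hypothetical classifier's output probabilities to device commands and rejects uncertain decisions. The task names, commands, and threshold are illustrative assumptions, not the ABI's actual vocabulary.

```python
# Minimal sketch (not the ABI implementation): map the classifier's output
# for a recognized mental task to a device command, with a confidence
# threshold so uncertain predictions are rejected instead of triggering an
# unwanted action. Task names, commands, and the threshold are assumptions.

MENTAL_TASK_COMMANDS = {
    "imagine_left_hand":  "move_cursor_left",
    "imagine_right_hand": "move_cursor_right",
    "mental_arithmetic":  "select_item",
    "relax":              "no_operation",
}

REJECT_THRESHOLD = 0.7  # below this confidence, do nothing

def translate(task_probabilities):
    """Pick the most likely mental task and return its command,
    or None if the classifier is not confident enough."""
    task, confidence = max(task_probabilities.items(), key=lambda kv: kv[1])
    if confidence < REJECT_THRESHOLD:
        return None  # reject: better no command than a wrong one
    return MENTAL_TASK_COMMANDS[task]

# Example: classifier output for one EEG window
print(translate({"imagine_left_hand": 0.82, "imagine_right_hand": 0.10,
                 "mental_arithmetic": 0.05, "relax": 0.03}))
# -> 'move_cursor_left'
```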
A brain interface will be most successful when it is adapted to its owner. The approach is based on a mutual learning process in which the user and the ABI interface are coupled together and adapt to each other. The neural network has been specifically designed to cope with the challenging problem of recognizing mental tasks from spontaneous on-line EEG signals. Although the immediate application of ABI is to help physically disabled or impaired people by increasing their independence and facilitating access to the Information Society, the benefits of such a system are extensive. Anyone can use it for other purposes, such as health and safety (for example, monitoring a person's level of alertness), and ABI could also contribute to the medical diagnosis of brain disorders.
This report presents a subject-independent EEG (electroencephalogram) classification technique and its application to a P300-based word speller. It also presents the use of signals recorded from the brain to operate robotic or prosthetic devices. Both invasive and noninvasive approaches have proven effective in achieving the speed, accuracy, and reliability necessary for real-world applications.

INTRODUCTION
Adaptive Brain Interface (ABI) is a human-computer interface system that accepts voluntary commands directly from the brain to interact with the surrounding environment or to perform a particular task. It is sometimes called a direct neural interface, a brain-computer interface (BCI), or a brain-machine interface: a direct communication pathway between a brain and an external device. BCIs are aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions. The approach on which the ABI is based is, as the name implies, adaptiveness: both the system and the user adapt to each other, as explained before. In ABI the adaptive part is the local neural classifier, which is responsible for classifying the input signal, while the user adapts by training on the mental tasks he or she finds most comfortable and effective to use. A second important goal is that the system should work reliably outside the laboratory environment, i.e. in normal everyday life. This calls for an easy-to-use, wearable (small and light) system. When compared with other BCIs, one of the ABI's strengths is the short training time required: a user can acquire good control over the system in just five days.
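To make the mutual-learning idea more concrete, here is a minimal sketch that stands in for the local neural classifier with a simple nearest-mean (prototype) classifier: each labeled training trial nudges a class prototype toward the new EEG feature vector, so the system keeps adapting to the user while the user practices the tasks. The feature dimensionality and learning rate are illustrative assumptions.

```python
# Minimal sketch of the mutual-learning / adaptation idea, assuming a simple
# nearest-mean (prototype) classifier instead of the ABI's actual local
# neural classifier. Feature dimensions and learning rate are assumptions.
import numpy as np

class AdaptivePrototypeClassifier:
    def __init__(self, tasks, n_features, learning_rate=0.05):
        self.learning_rate = learning_rate
        # one prototype (mean feature vector) per mental task
        self.prototypes = {t: np.zeros(n_features) for t in tasks}

    def predict(self, features):
        # classify an EEG feature vector as the task with the closest prototype
        return min(self.prototypes,
                   key=lambda t: np.linalg.norm(features - self.prototypes[t]))

    def adapt(self, features, true_task):
        # during training, nudge the prototype of the task the user was
        # actually performing toward the new example (system adapts to user)
        p = self.prototypes[true_task]
        self.prototypes[true_task] = p + self.learning_rate * (features - p)

# Example: a few simulated training trials for two mental tasks
clf = AdaptivePrototypeClassifier(["imagine_left_hand", "mental_arithmetic"],
                                  n_features=8)
rng = np.random.default_rng(0)
for _ in range(50):
    clf.adapt(rng.normal(1.0, 0.3, 8), "imagine_left_hand")
    clf.adapt(rng.normal(-1.0, 0.3, 8), "mental_arithmetic")
print(clf.predict(rng.normal(1.0, 0.3, 8)))  # -> 'imagine_left_hand'
```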
Each mental task is associated with a simple command such as "select right item". For example, the interface can be used to select letters from a virtual keyboard on a computer screen and write a message. The ABI project seeks to develop individual brain interfaces: the same system is not suitable for everybody, as no two people are identical, either physically or psychologically. This means that an interface will be most successful when it is adapted to its owner through the mutual learning process described above.
RESEARCH HISTORY
1990:
Monkeys in North Carolina have remotely operated a robotic arm 600 miles away in MIT's Touch Lab using their brain signals. The feat is based on a neural-recording system in which tiny electrodes implanted in the animals' brains detected their brain signals as they controlled a robot arm to reach for a piece of food.
According to the scientists from Duke University Medical Center, MIT and the State University of New York (SUNY) Health Science Center, the new system could form the basis for a brain-machine interface that would allow paralyzed patients to control the movement of prosthetic limbs. The Internet experiment "was a historic moment, the start of something totally new," Mandayam Srinivasan, director of MIT's Touch Lab, said in a November 15 story in the Wall Street Journal. The work also supports new thinking about how the brain encodes information, by spreading it across large populations of neurons and by rapidly adapting to new circumstances.
In the Nature paper, the scientists described how they tested their system on two owl monkeys, implanting arrays of as many as 96 electrodes, each less than the diameter of a human hair, into the monkeys' brains. The technique they used allows large numbers of single neurons to be recorded separately, then combines their information using a computer coding algorithm. The scientists implanted the electrodes in multiple regions of the brain's cortex, including the motor cortex, from which movement is controlled. They then recorded the output of these electrodes as the animals learned reaching tasks, including reaching for small pieces of food.

2000:
Two monkeys have been trained to eat morsels of food using a robotic arm controlled by thoughts that are relayed through a set of electrodes connecting the animal's brain to a computer, scientists have announced. The astonishing feat is being seen as a major breakthrough in the development of robotic prosthetic limbs and other automated devices that can be manipulated by paralysed patients using mind control alone.
"The monkey learns by first observing the movement,which activates the brain cells as if he were doing it. It's a lot like sports training, where trainers have athletes first imagine that they are performing the movements they desire," Dr Schwartz said. The robotic arm used in the experiment had five degrees of freedom – three at the shoulder, one at the elbow and one at the hand, which was supposed to emulate the movement of the human arm.
The training of the monkeys took several days, using food as rewards. Previous work by the group had concentrated on training monkeys to move cursors on a computer screen, but the latest study using a robotic arm involved a more complicated system of movements, the scientists said.
Alongside these developments, interfaces based on the P300 potential have also been explored. The P300 is an endogenous, positive-polarity component of the event-related potential (ERP) generated in the brain in response to infrequent ("oddball") auditory, visual, or somatosensory stimuli within a stream of frequent stimuli. Farwell and Donchin first demonstrated the use of the P300 for brain-computer interfaces (BCIs). In their paradigm, the computer displays a matrix of cells representing different letters and flashes each row and column alternately. A user trying to input a letter needs to pay attention to that letter for a short while. When the row or column containing the intended letter flashes, a P300 is elicited in the EEG, which can then be detected by an appropriate algorithm and used for word spelling.

It is recognized that large inter-subject variations exist: the amplitude and latency of the P300 vary considerably across individuals, in both normal and clinical populations, and this variation has been linked to individual differences in cognitive capability. From the pattern-recognition viewpoint, therefore, a computational P300 model built for one person would not apply well to another. Existing P300-based BCIs address this problem directly by training subject-specific P300 models. Before a person can operate the BCI, he or she must go through a special training process, usually following instructions to stare at a particular cell at a given time while the concurrent EEG is recorded. With the recorded data, a computer algorithm performs signal analysis and learns the subject-specific P300 model. However, this process is normally tedious and complicated.
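The row/column paradigm above can be sketched in a few lines of Python: the epochs following each row and column flash are averaged, the average is scored in the time window where a P300 is expected, and the intended letter is taken at the intersection of the best-scoring row and column. The matrix layout, epoch length, and scoring window below are illustrative assumptions rather than the parameters of any particular system.

```python
# Minimal sketch of the row/column P300 speller paradigm described above
# (not the algorithm of any specific system). Matrix layout, epoch length,
# and the scoring window are illustrative assumptions.
import numpy as np

MATRIX = [list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
          list("STUVWX"), list("YZ1234"), list("56789_")]

def p300_score(epochs, fs=250):
    """Average epochs (n_epochs x n_samples) time-locked to one row/column
    flash and return the mean amplitude in a 250-500 ms window, where a
    P300 deflection is expected."""
    avg = epochs.mean(axis=0)
    lo, hi = int(0.25 * fs), int(0.50 * fs)
    return avg[lo:hi].mean()

def decode_letter(row_epochs, col_epochs, fs=250):
    """row_epochs/col_epochs: lists (one entry per row/column) of arrays
    holding the epochs recorded after that row's/column's flashes."""
    best_row = int(np.argmax([p300_score(e, fs) for e in row_epochs]))
    best_col = int(np.argmax([p300_score(e, fs) for e in col_epochs]))
    return MATRIX[best_row][best_col]

# Example with simulated data: the user attends letter 'H' (row 1, col 1)
rng = np.random.default_rng(1)
n_flashes, n_samples, fs = 15, 200, 250

def make_epochs(is_target):
    e = rng.normal(0, 1, (n_flashes, n_samples))
    if is_target:                       # add a P300-like bump around 300 ms
        e[:, int(0.25 * fs):int(0.50 * fs)] += 1.0
    return e

rows = [make_epochs(i == 1) for i in range(6)]
cols = [make_epochs(j == 1) for j in range(6)]
print(decode_letter(rows, cols, fs))    # -> 'H' (most of the time)
```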
WORKING OF ADAPTIVE BRAIN INTERFACE
Electrodes placed on the scalp or within the head acquire signals from the brain, and the BCI system processes them to extract specific signal features that reflect the user’s intent. The BCI translates these features into commands that operate a device—for example, a word-processing program, speech synthesizer, robotic arm, or wheelchair.
It works because of the way our brains function. Our brains are filled with neurons, individual nerve cells connected to one another by dendrites and axons. Every time we think, move, feel or remember something, our neurons are at work. That work is carried out by small electric signals that zip from neuron to neuron as fast as 250 mph [source: Walker]. The signals are generated by differences in electric potential carried by ions on the membrane of each neuron.
Although the paths the signals take are insulated by a substance called myelin, some of the electric signal escapes. Scientists can detect those signals with tools such as the electrodes used in EEG, interpret what they mean, and use them to direct a device of some kind. The electrodes measure minute differences in voltage between neurons. The signal is then amplified and filtered. In current BCI systems, it is then interpreted by a computer program.
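The acquire-amplify-filter-interpret chain described above can be sketched as follows, assuming band-pass filtering and band-power features computed with NumPy/SciPy; the frequency bands, feature choice, and toy translation step are illustrative assumptions, not the ABI's actual pipeline.

```python
# Minimal sketch of the processing chain described above (acquire ->
# filter -> extract features -> translate into a command). The filter band,
# band-power feature, and command mapping are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, welch

FS = 250  # sampling rate in Hz (assumed)

def bandpass(eeg, low=8.0, high=30.0, fs=FS, order=4):
    """Band-pass filter one EEG channel to the mu/beta range."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg)

def band_power(eeg, band=(8.0, 12.0), fs=FS):
    """Feature: average spectral power of a channel in a frequency band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def translate(features, threshold=1.0):
    """Toy translation step: turn the feature vector into a command."""
    return "select_item" if features.mean() > threshold else "no_operation"

# Example: one second of simulated 4-channel EEG
rng = np.random.default_rng(0)
eeg = rng.normal(0, 1, (4, FS))
filtered = np.array([bandpass(ch) for ch in eeg])
features = np.array([band_power(ch) for ch in filtered])
print(translate(features))
```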