Brain/machine interface redefines cogito, ergo sum

By Peter Clarke and Ian Cameron
Electronic Engineering Times - November 13, 2000 - page 4

LONDON - An arm of the Joint Research Center of the European Commission has developed an interface for using thoughts to control a computer. The Institute for Systems, Informatics and Safety (Ispra, Italy) demonstrated its Adaptive Brain Interface (ABI) at the European Information Society Technology exhibition in Nice, France, last week.

By composing sequences of thoughts, ABI users can read a Web page, interact with games, turn on appliances or even guide a wheelchair. Researchers think the ABI initially will be used to help the disabled improve their access to computers and the Internet. However, they said, the system ultimately could have wide-ranging uses. The ABI could, for example, monitor a person's level of alertness or provide new forms of education and entertainment. It could also aid in the diagnosis of brain disorders, and be used for teaching and research.

However, system speed and accuracy are still issues. Researchers reported that a trained ABI correctly recognizes key thoughts in about 70 percent of attempts. This modest rate is offset by a false-recognition rate below 5 percent. The ABI system makes decisions in about half a second. Some training is also required, since the interface uses a neural network to adapt to individual users. One of the test subjects reportedly achieved excellent control after five hour-long training sessions spread over five days.

How it works - The interface is a cap fitted with electroencephalograph (EEG) sensors. The EEG signals go into a digital signal processing unit for feature extraction and then on to a neural-network classifier that can be trained to recognize particular EEG patterns associated with key conscious thoughts.
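The pipeline just described - EEG sensing, feature extraction, then a trainable classifier that favors a low false-recognition rate over raw accuracy - can be sketched in Python. Everything here is illustrative: the article does not disclose the JRC team's actual features, network architecture or thresholds, so a toy band-power feature and a nearest-centroid classifier with an abstain margin stand in for them.

```python
# Hypothetical sketch of an ABI-style pipeline:
# EEG window -> feature extraction -> classifier that may abstain
# ("unknown") rather than guess, one way to keep false recognitions
# rare at the cost of recognizing fewer attempts overall.
import math

def band_power(samples):
    """Toy feature: mean squared amplitude of one EEG channel window."""
    return sum(s * s for s in samples) / len(samples)

def extract_features(window):
    """window: dict mapping channel name -> list of raw samples."""
    return {ch: band_power(s) for ch, s in window.items()}

class ThoughtClassifier:
    """Nearest-centroid stand-in for the neural-network classifier."""

    def __init__(self, reject_margin=0.5):
        self.centroids = {}              # label -> feature dict
        self.reject_margin = reject_margin

    def train(self, label, feature_sets):
        """Average several feature dicts into one centroid per thought."""
        chans = feature_sets[0].keys()
        self.centroids[label] = {
            ch: sum(f[ch] for f in feature_sets) / len(feature_sets)
            for ch in chans
        }

    def classify(self, features):
        def dist(c):
            return math.sqrt(sum((features[ch] - c[ch]) ** 2 for ch in c))
        ranked = sorted((dist(c), lbl) for lbl, c in self.centroids.items())
        best = ranked[0]
        runner_up = ranked[1] if len(ranked) > 1 else (float("inf"), None)
        # Abstain when the two closest thought patterns are too close to call.
        if runner_up[0] - best[0] < self.reject_margin:
            return "unknown"
        return best[1]
```

The abstain margin is the design point the article hints at: widening it lowers the false-recognition rate (reported below 5 percent) while also lowering the raw recognition rate (about 70 percent).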
This portable system has successfully selected keys from a virtual keyboard displayed on a computer screen just by having the user think previously agreed-upon thoughts, such as right-hand movement, left-hand movement and "relax." As the user concentrates on different mental tasks, the keyboard is successively split into smaller parts until a single character is selected. While wearing the cap, a person can pick a letter of the alphabet within about 20 seconds and "write" a sentence in a few minutes.

While that may seem intolerably slow, project leaders say their main intent was to investigate ways to help the severely disabled interact with computers and other electronically enabled equipment, which could include wheelchairs and doors.

The approach is based on a mutual-learning process in which the user and the ABI adapt to each other. The neural network learns user-specific EEG patterns that describe the desired mental tasks, while the user learns to think in a way that enables the personal interface to better understand him or her. In other words, every user chooses his or her most natural mental tasks to concentrate on, and the preferred strategies to undertake those tasks.

Though it set out to help the disabled, the ABI may have its biggest impact on interfaces for electronic tools and toys. Potentially, the ultimate couch potato might not even need to press the remote to change TV channels. Just think.

Ian Cameron is a reporter with Electronic Times, sister publication to EE Times in the UK.

Photo caption: A cap with EEG sensors transmits brain waves through a neural net to control things like computers and games.

http://www.icsi.berkeley.edu/talks/Millan.html
http://sta.jrc.it/sba/abi.htm
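The keyboard-splitting selection described earlier can be sketched in a few lines of Python. The command names and the exact halving scheme are assumptions for illustration, not the ABI's actual protocol: each recognized thought narrows the candidate keys to one half until a single character remains.

```python
# Illustrative sketch of successive keyboard splitting: each recognized
# thought ("left" or "right" here, purely as stand-in names) keeps one
# half of the remaining keys until only one character is left.
def select_character(keys, commands):
    """keys: ordered characters on the virtual keyboard.
    commands: iterable of recognized thoughts, each picking a half."""
    candidates = list(keys)
    it = iter(commands)
    while len(candidates) > 1:
        mid = (len(candidates) + 1) // 2   # split point, rounded up
        cmd = next(it)
        candidates = candidates[:mid] if cmd == "left" else candidates[mid:]
    return candidates[0]
```

Halving a 26-letter keyboard takes at most five recognized thoughts. At roughly half a second per decision plus the time the user needs to settle on each mental task, that is broadly consistent with the 20 seconds per letter reported above.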