NASA's Hands-Free Approach to Landing Jets

In simulations at NASA's Ames Research Center (Moffett Field, CA), researchers recently demonstrated the capability to control a 757 aircraft using only human muscle-nerve signals linked to a computer. Engineers outfitted the pilot with an armband embedded with electrodes. The sensors read muscle-nerve signals as the pilot made the gestures needed to land a computer-generated aircraft at San Francisco International Airport. The pilot also demonstrated the ability to land a damaged aircraft during emergency drills.

"This is a fundamentally new way to communicate with machines," said Dr. Charles Jorgensen, head of the neuroengineering lab at NASA Ames. "Human nerve signals can be linked directly to devices without the aid of joysticks or mice, thereby providing rapid, intuitive control."

Neuroelectric control uses neural network software that "learns" patterns that slowly change and evolve over time. Nerve signal patterns, which tell muscles to move in a certain way, are an ideal application for neural net software. A computer can match the patterns - each as unique as a fingerprint - to particular gestures, such as pointing or making a fist. Researchers designed the software to adjust to each pilot's nerve patterns, which can be affected by caffeine use, biorhythms, and performance stress.

NASA hopes to apply the technology in space to help astronauts control tools while performing extravehicular activity (EVA) tasks such as construction and spacecraft maintenance.

Visit http://www.nasatech.com/cgi-bin/200101.pl?2-6B
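
To make the pattern-matching idea concrete, the sketch below shows, in Python, how feature vectors from an electrode armband might be mapped to gestures by a small neural network. The eight-channel layout, the gesture set, the synthetic training data, and the use of scikit-learn's MLPClassifier are all illustrative assumptions, not the software NASA Ames actually used.

    # Illustrative sketch only: a small neural-network classifier that maps
    # synthetic muscle-nerve signal features to gestures, in the spirit of the
    # pattern-matching approach described above. Channel count, gesture set,
    # and network shape are assumptions, not NASA's design.
    import numpy as np
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    GESTURES = ["point", "fist", "wrist_left", "wrist_right"]  # hypothetical gesture set
    N_CHANNELS = 8            # assumed number of armband electrodes
    SAMPLES_PER_GESTURE = 200

    # Fabricated training data: each gesture gets its own mean activation
    # pattern across the channels, plus noise standing in for drift caused by
    # caffeine, biorhythms, and performance stress.
    means = rng.uniform(0.2, 1.0, size=(len(GESTURES), N_CHANNELS))
    X = np.vstack([
        means[i] + 0.1 * rng.standard_normal((SAMPLES_PER_GESTURE, N_CHANNELS))
        for i in range(len(GESTURES))
    ])
    y = np.repeat(np.arange(len(GESTURES)), SAMPLES_PER_GESTURE)

    # Train a small multilayer perceptron to "learn" this pilot's signal patterns.
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
    clf.fit(X, y)

    # Classify a new reading and translate it into a control gesture.
    new_reading = means[1] + 0.1 * rng.standard_normal(N_CHANNELS)
    gesture = GESTURES[clf.predict(new_reading.reshape(1, -1))[0]]
    print(f"Detected gesture: {gesture}")

Retraining or fine-tuning the classifier on fresh samples from each pilot is one plausible way the per-pilot adjustment described above could be handled; the article does not specify the actual adaptation method.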