A User-Dependent Easily-Adjusted Static Finger Language Recognition System for Handicapped Aphasiacs

From: Applied Artificial Intelligence - Vol 23 Issue 10
By: Yu-Fen Fu and Cheng-Seen Ho

Unlike sign language, which usually involves large-scale movements to form a gesture, finger language, which is suitable for handicapped aphasiacs, is expressed through relatively small-scale hand gestures produced simply by changing how a patient bends his or her fingers. A recognition system therefore needs to accommodate the specific characteristics of each handicapped aphasiac. The proposed system meets this requirement by employing a programmable data glove to capture small movement-related finger gestures, a function parameterized on the glove's optical signal values to calculate finger bending degrees, and an automatic regression module to extract the most adequate finger features for a specific patient. The selected features are fed into a neural network, which learns a finger language recognition model for that patient, after which the system is ready for that patient's use. At the time of this writing, the reported average success rate from unbiased field experiments was 100%.

Full Text PDF: http://www.informaworld.com/smpp/373798240-56457533/ftinterface~db=all~content=a916863703~fulltext=713240930
Full Text HTML: http://www.informaworld.com/smpp/373798240-56457533/ftinterface~db=all~content=a916863703~fulltext=713240928

Contributed by Rich Simpson
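
The abstract outlines a three-stage per-user pipeline: raw optical glove signals are converted to finger bending degrees, a regression step selects the features that matter for that particular patient, and a neural network is trained on those features. The sketch below illustrates that flow only; the calibration values, the linear bending-degree mapping, the f_regression-based feature selection, the 16-unit network, and all data are assumptions for illustration, not the authors' actual design.

    # Illustrative sketch of the described pipeline; every constant and
    # function choice below is an assumption, not taken from the paper.
    import numpy as np
    from sklearn.feature_selection import f_regression
    from sklearn.neural_network import MLPClassifier

    def bending_degrees(optical_values, v_straight, v_bent):
        """Map raw optical signal values to bending degrees (0 = straight, 90 = fully bent).
        Linear interpolation between per-sensor calibration values is an assumption."""
        frac = (optical_values - v_straight) / (v_bent - v_straight)
        return np.clip(frac, 0.0, 1.0) * 90.0

    def select_features(X, y, k):
        """Pick the k finger features that vary most systematically with the gesture label.
        Stands in for the paper's automatic regression module (hypothetical)."""
        scores, _ = f_regression(X, y)
        return np.argsort(scores)[-k:]

    # --- toy usage with synthetic data for one user ---
    rng = np.random.default_rng(0)
    n_sensors = 10                                    # e.g. 2 optical sensors per finger (assumed)
    v_straight = rng.uniform(100, 120, n_sensors)     # per-sensor "straight" calibration (assumed)
    v_bent = rng.uniform(300, 340, n_sensors)         # per-sensor "fully bent" calibration (assumed)

    raw = rng.uniform(100, 340, size=(200, n_sensors))  # raw glove readings (synthetic)
    labels = rng.integers(0, 8, size=200)               # 8 finger-language symbols (assumed)

    X = bending_degrees(raw, v_straight, v_bent)
    idx = select_features(X, labels, k=6)

    # Small per-user neural network trained on the selected features.
    clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    clf.fit(X[:, idx], labels)
    print(clf.predict(X[:5, idx]))

Because the model is trained per patient, recalibrating for a new user amounts to collecting a fresh set of labeled glove readings and rerunning the feature selection and training steps.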