Vision-Based Hand-Gesture Applications
From: Communications of the ACM, February 2011, page 60
By: Juan Pablo Wachs, Mathias Kölsch, Helman Stern, Yael Edan

Assistive Technologies

For the impaired, the critical requirements of a hand-gesture interface system are "user adaptability and feedback" and "come as you are." In this context, wheelchairs, as mobility aids, have been enhanced through robotic/intelligent vehicles able to recognize hand-gesture commands (such as in Kuno et al. [28]). The Gesture Pendant [44] is a wearable gesture-recognition system used to control home devices and provide additional functionality as a medical diagnostic tool. The Staying Alive [3] virtual-reality imagery and relaxation tool satisfies the "user adaptability and feedback" requirement, allowing cancer patients to navigate through a virtual scene using 18 traditional T'ai Chi gestures. In the same vein, a tele-rehabilitation system [18] for kinesthetic therapy (treatment of patients with arm-motion coordination disorders) uses force feedback of patient gestures. Force feedback was also used by Patel and Roy [36] to guide a teachable interface for individuals with severely dysarthric speech. In addition, Boian et al. [5] used a hand-worn haptic glove to help rehabilitate post-stroke patients in the chronic phase.

These systems illustrate how medical systems and rehabilitative procedures promise to provide a rich environment for the potential exploitation of hand-gesture systems. Still, additional research and evaluation procedures are needed to encourage system adoption. For sign languages (such as American Sign Language), hand-gesture-recognition systems must be able to recognize a large lexicon of single-handed and two-handed gestures.

Read the entire article at: http://cacm.acm.org/magazines/2011/2/104397-vision-based-hand-gesture-applications/fulltext

References:

28. Kuno, Y., Murashima, T., Shimada, N., and Shirai, Y. Intelligent wheelchair remotely controlled by interactive gestures. In Proceedings of the 15th International Conference on Pattern Recognition (Barcelona, Sept. 3-7, 2000), 672-675.

44. Starner, T., Auxier, J., Ashbrook, D., and Gandy, M. The gesture pendant: A self-illuminating, wearable, infrared computer-vision system for home-automation control and medical monitoring. In Proceedings of the Fourth International Symposium on Wearable Computers (Atlanta, Oct. 2000), 87-94.

3. Becker, D.A. and Pentland, A. Staying alive: A virtual reality visualization tool for cancer patients. In Proceedings of the AAAI Workshop on Entertainment and Alife/AI. AAAI Technical Report WS-96-03, 1996.

18. Gutierrez, M., Lemoine, P., Thalmann, D., and Vexo, F. Telerehabilitation: Controlling haptic virtual environments through handheld interfaces. In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (Hong Kong, Nov. 10-12, 2004). ACM Press, New York, 195-200.

36. Patel, R. and Roy, D. Teachable interfaces for individuals with dysarthric speech and severe physical disabilities. In Proceedings of the AAAI Workshop on Integrating Artificial Intelligence and Assistive Technology (Madison, WI, July 26-30, 1998), 40-47.

5. Boian, R., Sharma, R., Han, C., Merians, A., Burdea, G., Adamovich, S., Recce, M., Tremaine, M., and Poizner, H. Virtual reality-based post-stroke hand rehabilitation. Studies in Health Technology and Informatics (2002), 64-70.