Users Control Ear-Mounted Computer with Facial Movements
From: Computer - 04/2014 - page 17

Hiroshima City University researchers have developed a tiny prototype computer that a user can wear in one ear and control with eyeblinks, tongue clicks, and other facial movements. It could help people who cannot use their hands because they are driving, disabled, or frail. An accelerometer could detect when an older user falls, prompting the computer to call relatives or an ambulance.

Links:
Hiroshima City University working on ear-mounted computing device
http://www.itpro.co.uk/mobile/21739/hiroshima-city-university-working-on-ear-mounted-computing-device
Japan researchers testing tiny ear computer
https://au.news.yahoo.com/technology/a/21762947/japan-researchers-testing-tiny-ear-computer/
This Wearable Computer is Like Google Glass for Your Ears
http://mashable.com/2014/03/01/this-wearable-computer-is-like-google-glass-for-your-ears/

---

A Japanese firm, NS West, has developed a wearable PC for the ear. It looks like a hearing aid and includes a compass, GPS tracker, gyro sensor, barometer, speaker, and microphone. It also has infrared sensors that monitor the movements of the ear as the wearer changes facial expressions, allowing hands-free operation. The device is battery powered and includes built-in storage, so it could accept software, apps, and files; it also includes a pulse monitor and thermometer for health monitoring. The initial prototype was built at Hiroshima City University, and the developers expect to have a finished product available by the end of 2015.

A PC For Your Ear (with video, 1:30, in Japanese with English captions)
http://www.deskeng.com/edge/2014/03/a-pc-for-your-ear/

---

Created by Hiroshima City University researcher Kazuhiro Taniguchi, who calls the prototype Bluetooth ear-worn computer an Ear Switch, the device packs gyro sensors, a compass, GPS, a barometer, a speaker, a microphone, and a battery into a form-fitting earpiece.
The device's interface also uses infrared to detect when the wearer opens and closes his or her mouth; head and mouth movements send corresponding command signals back to the device. Apps could be created that would let the device, say, access weather or traffic information using head/mouth gestures.

The next 'smart thing'
http://www.electronicproducts.com/Sensors_and_Transducers/Sensors/The_next_39_smart_thing_39.aspx

---
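The clippings above describe two software behaviors — mapping discrete facial-gesture events to commands, and accelerometer-based fall detection — without any implementation detail. A minimal sketch of how such behaviors might fit together, assuming hypothetical gesture names, command names, a `GestureMapper` helper, and a simple impact-magnitude threshold (none of which come from the articles):

```python
import math

# Hypothetical sketch of the two behaviors described above. The gesture
# names, commands, threshold value, and GestureMapper class are all
# illustrative assumptions, not the device's actual API.

FALL_THRESHOLD_G = 2.5  # assumed impact threshold, in g


def detect_fall(ax, ay, az):
    """Flag a possible fall when the acceleration magnitude (in g)
    exceeds the assumed impact threshold."""
    return math.sqrt(ax * ax + ay * ay + az * az) > FALL_THRESHOLD_G


class GestureMapper:
    """Maps discrete facial-gesture events to device commands."""

    def __init__(self):
        self._bindings = {}

    def bind(self, gesture, command):
        """Associate a gesture event name with a zero-argument command."""
        self._bindings[gesture] = command

    def handle(self, gesture):
        """Run the command bound to a gesture; ignore unknown gestures."""
        command = self._bindings.get(gesture)
        return command() if command else None


mapper = GestureMapper()
mapper.bind("eyeblink", lambda: "read_weather")
mapper.bind("tongue_click", lambda: "read_traffic")
mapper.bind("fall_detected", lambda: "call_emergency_contact")

if detect_fall(0.1, 0.2, 3.0):             # strong impact reading
    print(mapper.handle("fall_detected"))  # call_emergency_contact
```

In a real device the gesture events would come from classifying the infrared and gyro sensor streams; the dispatcher above only illustrates the mapping step.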