A Computer That Reads Body Language
From: ECN Magazine, 07/06/2017

Researchers at Carnegie Mellon University's Robotics Institute have enabled a computer to understand the body poses and movements of multiple people from video in real time, including, for the first time, the pose of each individual's fingers. The new method was developed with the help of the Panoptic Studio, a two-story dome embedded with 500 video cameras. The insights gained from experiments in that facility now make it possible to detect the poses of a group of people using a single camera and a laptop computer.

Detecting the nuances of nonverbal communication between individuals will allow robots to serve in social spaces, perceiving what the people around them are doing, what moods they are in, and whether they can be interrupted. A self-driving car could get an early warning that a pedestrian is about to step into the street by monitoring body language. Enabling machines to understand human behavior could also open new approaches to behavioral diagnosis and rehabilitation for conditions such as autism, dyslexia, and depression.

Read the entire article at:
https://www.ecnmag.com/news/2017/07/computer-reads-body-language

Links:
Robots Learn to Speak Body Language
http://spectrum.ieee.org/video/robotics/robotics-software/robots-learn-to-speak-body-language
Panoptic Studio
http://www.cs.cmu.edu/~hanbyulj/panoptic-studio
Yaser Sheikh
http://www.cs.cmu.edu/~yaser

Related:
App Couples with Sensors to Assist Blind
https://www.mdtmag.com/news/2015/10/app-couples-sensors-assist-blind
Mobile Control With Facial Gestures
https://www.ecnmag.com/news/2017/07/mobile-control-facial-gestures