Monkeys "Move and Feel" Virtual Objects Using Only Their Brains
From: Duke Medicine - 10/05/2011

In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to use brain activity alone to move an avatar hand and identify the texture of virtual objects.

"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton," said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Read the entire article and view a video at:
http://www.dukehealth.org/health_library/news/monkeys-move-and-feel-virtual-objects-using-only-their-brains

Links:

Monkey brains 'feel' virtual objects
http://www.nature.com/news/2011/111005/full/news.2011.576.html

Active tactile exploration using a brain–machine–brain interface
http://www.nature.com/nature/journal/vaop/ncurrent/full/nature10489.html

Duke University Center for Neuroengineering
http://www.duke.edu/~ch/Neuroeng/Neuro.htm

Miguel Nicolelis, MD, PhD
http://www.neuro.duke.edu/faculty/nicoleli

Walk Again Project
http://www.walkagainproject.org/

Monkeys Use Brain Activity to Move and Feel Virtual Objects
http://www.techbriefs.com/component/content/article/11593

---

Monkeys "Move and Feel" Virtual Objects Using Only Their Brains
From: ACM Tech News

Duke University researchers recently conducted an experiment in which monkeys used brain-machine-brain interfaces (BMBIs) to control virtual avatars and receive tactile feedback from them.
The monkeys were able to use their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and to differentiate their textures. The texture of each virtual object was expressed as a pattern of small electrical signals transmitted to the monkeys' brains. "In this BMBI, the virtual body is controlled directly by the animal's brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal's cortex," says Nicolelis. He says the experiments provide further evidence that it could be possible to develop a robotic exoskeleton that paralyzed patients could wear to sense the world and move autonomously.
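The closed loop described above can be sketched in code: decoded motor-cortex activity moves the avatar hand, and when the hand contacts a virtual object, the object's texture is mapped to a stimulation pulse pattern delivered back to the brain. This is a minimal illustrative sketch only; the decoder, the texture-to-frequency mapping, and all names and numbers here are hypothetical, not the actual Duke system.

```python
# Illustrative sketch of a brain-machine-brain interface (BMBI) loop.
# Hypothetical: a linear decoder, a 1-D avatar hand, and two virtual
# objects that are identical except for their texture.

def decode_velocity(spike_counts, weights):
    """Linear decoder: map neural spike counts to a 1-D hand velocity."""
    return sum(c * w for c, w in zip(spike_counts, weights))

def texture_to_pulses(texture):
    """Map a texture label to a stimulation pulse rate (Hz) -- values are made up."""
    return {"coarse": 200, "fine": 50}[texture]

def bmbi_step(hand_pos, spike_counts, weights, objects, dt=0.1):
    """One loop iteration: decode movement, update the avatar hand,
    and return the feedback pulse rate (0 when nothing is touched)."""
    hand_pos += decode_velocity(spike_counts, weights) * dt
    for start, end, texture in objects:
        if start <= hand_pos <= end:   # avatar hand is touching this object
            return hand_pos, texture_to_pulses(texture)
    return hand_pos, 0                 # no contact, no stimulation

# Usage: the hand moves toward the first object and receives its texture code.
objects = [(1.0, 2.0, "coarse"), (3.0, 4.0, "fine")]
pos, feedback = bmbi_step(0.9, [3, 1, 2], [0.5, 0.2, 0.1], objects)
```

Because the visually identical objects differ only in the feedback pattern they evoke, the animal can tell them apart only through the stimulation channel, which is the core of the experiment's design.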