Brain-machine interface lets monkeys control two virtual arms (w/ Video)

Monkeys use minds to move 2 virtual arms
Large-scale brain activity from a rhesus monkey was decoded and used to simultaneously control reaching movements of both arms of a virtual monkey avatar towards spherical objects in virtual reality. Credit: Duke Center for Neuroengineering

In a study led by Duke researchers, monkeys have learned to control the movement of both arms on an avatar using just their brain activity.

The findings, published Nov. 6, 2013, in the journal Science Translational Medicine, advance efforts to develop bilateral movement in brain-controlled prosthetic devices for severely paralyzed patients.

To enable the monkeys to control two virtual arms, researchers recorded nearly 500 neurons from multiple areas in both cerebral hemispheres of the animals' brains, the largest number of neurons recorded and reported to date.

Millions of people worldwide suffer from sensory and motor deficits caused by spinal cord injuries. Researchers are working to develop tools to help restore their mobility and sense of touch by connecting their brains with assistive devices. The brain-machine interface approach, pioneered at the Duke University Center for Neuroengineering in the early 2000s, holds promise for reaching this goal. However, until now brain-machine interfaces could only control a single prosthetic limb.

"Bimanual movements in our daily activities—from typing on a keyboard to opening a can—are critically important," said senior author Miguel Nicolelis, M.D., Ph.D., professor of neurobiology at Duke University School of Medicine. "Future brain-machine interfaces aimed at restoring mobility in humans will have to incorporate multiple limbs to greatly benefit severely paralyzed patients."

Virtual monkey avatar shown from a third-person perspective as the movements of the two arms are decoded in real time from the brain of a rhesus monkey. In the experiment, the virtual arms and 3D target objects appear to the monkey on the screen from a first-person perspective, and the monkey receives a juice reward for correctly performed trials. Credit: Duke Center for Neuroengineering

Nicolelis and his colleagues studied large-scale cortical recordings to determine whether they could provide signals sufficient for a brain-machine interface to accurately control bimanual movements.
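The paper itself details the decoding algorithm the team used; as a rough illustration of the general approach, the sketch below fits a linear, Wiener-filter-style readout, a common baseline in brain-machine interface work, that maps binned population firing rates to the 2D positions of both virtual hands. The population size, bin history, ridge regularizer, and all data below are assumptions for illustration, not the study's implementation.

```python
# Illustrative sketch only: a linear (Wiener-filter-style) decoder mapping
# binned population firing rates to 2D positions of two virtual hands.
# All shapes and parameters here are hypothetical, not the study's values.
import numpy as np

n_neurons = 500   # roughly the population size reported in the study
n_lags = 10       # history of rate bins fed to the decoder (assumed)
n_outputs = 4     # (x, y) for the left hand plus (x, y) for the right

def build_design(rates, n_lags):
    """Stack lagged firing-rate bins so each row holds recent neural history."""
    T, N = rates.shape
    X = np.zeros((T, N * n_lags))
    for lag in range(n_lags):
        X[lag:, lag * N:(lag + 1) * N] = rates[:T - lag]
    return X

def fit_decoder(rates, hand_positions, ridge=1.0):
    """Ridge-regularized least squares from neural history to hand positions."""
    X = build_design(rates, n_lags)
    XtX = X.T @ X + ridge * np.eye(X.shape[1])
    return np.linalg.solve(XtX, X.T @ hand_positions)

# Synthetic stand-in data, just to show the shapes involved.
rng = np.random.default_rng(0)
rates = rng.poisson(5.0, size=(2000, n_neurons)).astype(float)
positions = rng.normal(size=(2000, n_outputs))

W = fit_decoder(rates, positions)
predicted = build_design(rates, n_lags) @ W  # decoded (x, y) for both arms
```

One reason very large ensembles matter for this class of decoder is that a linear readout like this averages over many noisy neurons, so prediction quality tends to improve as the recorded population grows.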

Screen as viewed by the monkey during experiments: a first-person perspective of the rhesus monkey avatar limbs. The movements of both virtual arms are decoded in real time from brain activity while the monkey's own arms are not permitted to move freely. The monkey must move the virtual arms to the circular targets to receive a small juice reward. Credit: Duke Center for Neuroengineering

The monkeys were trained in a virtual environment in which they viewed realistic avatar arms on a screen and were encouraged to place their virtual hands on specific targets in a bimanual motor task. The monkeys first learned to control the avatar arms using a pair of joysticks, but eventually learned to move both avatar arms using their brain activity alone, without moving their own arms.

Duke Ph.D. student Peter Ifft observes brain activity recorded from nearly 500 neurons as the monkey performs bimanual reaching movements with the arms of the virtual monkey avatar. Credit: Duke Center for Neuroengineering

As the animals' performance in controlling both virtual arms improved over time, the researchers observed widespread plasticity in cortical areas of their brains. These results suggest that the monkeys' brains may incorporate the avatar arms into their internal image of their bodies, a finding recently reported by the same researchers in the journal Proceedings of the National Academy of Sciences.

The researchers also found that cortical regions showed specific patterns of neuronal electrical activity during bimanual movements that differed from the neuronal activity produced for moving each arm separately.

The study suggests that very large neuronal ensembles—not single neurons—define the underlying physiological unit of normal motor functions. Small neuronal samples of the cortex may be insufficient to control complex motor behaviors using a brain-machine interface.

"When we looked at the properties of individual neurons, or of whole populations of cortical cells, we noticed that simply summing up the correlated to movements of the right and left arms did not allow us to predict what the same individual neurons or neuronal populations would do when both arms were engaged together in a bimanual task," Nicolelis said. "This finding points to an emergent brain property—a non-linear summation—for when both hands are engaged at once."

Nicolelis is incorporating the study's findings into the Walk Again Project, an international collaboration working to build a brain-controlled neuroprosthetic device. The Walk Again Project plans to demonstrate its first brain-controlled exoskeleton, which is currently being developed, during the opening ceremony of the 2014 FIFA World Cup.

More information: P.J. Ifft, S. Shokur, et al., "A Brain-Machine Interface Enables Bimanual Arm Movements in Monkeys," Science Translational Medicine, 2013.

