Model describes complete grasping movement planning in the brain

A rhesus macaque (Macaca mulatta) wearing a data glove for detailed hand and arm tracking. Credit: Ricarda Lbik

Neuroscientists at the German Primate Center (DPZ)-Leibniz Institute for Primate Research in Göttingen have developed a model that seamlessly represents the entire planning of a movement, from seeing an object to grasping it. Comprehensive neural and motor data from grasping experiments with two rhesus monkeys were decisive for the development of the model, an artificial neural network that can simulate processes and interactions in the brain after being trained with images of specific objects. The neuronal data generated by the artificial network model were able to explain the complex biological data from the animal experiments, confirming the validity of the functional model. In the long term, the model could be used to develop better neuroprostheses, for example to bridge the damaged nerve connection between brain and extremities in paraplegia and thus restore the transmission of movement commands from the brain to arms and legs.

Rhesus monkeys, like humans, have a highly developed nervous and visual system as well as dexterous hand motor control. For this reason, they are particularly well suited for research into grasping movements. From previous studies in monkeys it is known that the interaction of three brain areas is responsible for grasping a targeted object. Until now, however, there has been no detailed model at the neural level that represents the entire process from the processing of visual information to the control of arm and hand muscles for grasping that object.

In order to develop such a model, two male rhesus monkeys were trained to grasp 42 objects of different shapes and sizes presented to them in random order. The monkeys wore data gloves that continuously recorded the movements of arm, hand and fingers. In each trial, the object to be grasped was first briefly illuminated while the monkeys fixated a red dot below it; after a short delay, a blinking signal prompted them to perform the grasping movement. These conditions provide information about when the different brain areas become active in order to generate the grasping movement and the associated muscle activations based on the visual signals.
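To make the structure of such a trial concrete, the sketch below shows one way recordings from the data gloves could be segmented into task epochs (fixation, object illumination, delay, movement). The epoch names, durations and sampling rate are illustrative assumptions, not the exact parameters of the experiment.

```python
# Illustrative sketch (not the authors' code): representing the delayed grasping
# task as trial epochs and slicing a continuous recording into those epochs.
# Epoch names and durations are assumptions for illustration only.
import numpy as np

# Hypothetical epoch boundaries (seconds, relative to trial start).
EPOCHS = {
    "fixation": (0.0, 0.6),   # monkey fixates the red dot below the object
    "cue":      (0.6, 1.3),   # object briefly illuminated
    "delay":    (1.3, 2.3),   # short memory period before the go signal
    "movement": (2.3, 3.3),   # blink of the fixation dot cues the grasp
}

def slice_epochs(signal, sample_rate_hz, epochs=EPOCHS):
    """Split a (time x channels) recording into per-epoch segments."""
    out = {}
    for name, (t0, t1) in epochs.items():
        i0, i1 = int(t0 * sample_rate_hz), int(t1 * sample_rate_hz)
        out[name] = signal[i0:i1]
    return out

# Example: 3.3 s of simulated 27-channel data-glove data sampled at 100 Hz.
glove = np.random.randn(330, 27)
segments = slice_epochs(glove, sample_rate_hz=100)
print({name: seg.shape for name, seg in segments.items()})
```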

Primates are able to perform different grasping movements. The picture shows six different objects that were presented to the monkeys together with the corresponding grip types. Credit: Stefan Schaffelhofer

In the next step, images of the 42 objects, taken from the perspective of the monkeys, were used to train an artificial neural network that mimics the biological processes in the brain. The network model consisted of three interconnected stages corresponding to the three cortical brain areas of the monkeys, and provided meaningful insights into the dynamics of the brain networks. After appropriate training with the behavioral data of the monkeys, the network was able to reflect the grasping movements of the rhesus monkeys. It could process images of recognizable objects and reproduce the muscle dynamics required to grasp the objects accurately.
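As a rough illustration of the idea of a network with three interconnected recurrent stages mapping visual input to muscle-like output, the following Python sketch chains three simple rate-based modules. The module sizes, the purely feedforward wiring between stages and the random untrained weights are assumptions made for illustration; they are not the published architecture or its training procedure.

```python
# Minimal sketch of a modular network with three recurrent stages (loosely
# analogous to three cortical areas) driving a muscle-like readout.
# All sizes, wiring and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_VISUAL, N_HIDDEN, N_MUSCLES, T = 64, 100, 50, 30

class RecurrentStage:
    """One recurrent module: leaky rate units driven by an external input."""
    def __init__(self, n_in, n_units):
        self.W_in  = rng.normal(scale=1 / np.sqrt(n_in),    size=(n_units, n_in))
        self.W_rec = rng.normal(scale=1 / np.sqrt(n_units), size=(n_units, n_units))
        self.state = np.zeros(n_units)

    def step(self, x, dt=0.1):
        drive = self.W_in @ x + self.W_rec @ np.tanh(self.state)
        self.state = (1 - dt) * self.state + dt * drive
        return np.tanh(self.state)

# Three chained stages: visual input -> intermediate -> motor-like stage.
stage1 = RecurrentStage(N_VISUAL, N_HIDDEN)
stage2 = RecurrentStage(N_HIDDEN, N_HIDDEN)
stage3 = RecurrentStage(N_HIDDEN, N_HIDDEN)
W_out  = rng.normal(scale=1 / np.sqrt(N_HIDDEN), size=(N_MUSCLES, N_HIDDEN))

visual_features = rng.normal(size=N_VISUAL)   # stand-in for image features of one object
muscle_trace = []
for t in range(T):
    r1 = stage1.step(visual_features)
    r2 = stage2.step(r1)
    r3 = stage3.step(r2)
    muscle_trace.append(W_out @ r3)           # simulated muscle activations at time t

muscle_trace = np.array(muscle_trace)         # shape (T, N_MUSCLES)
print(muscle_trace.shape)
```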

The results obtained using the artificial network model were then compared with the biological data from the monkey experiment. It turned out that the neural dynamics of the model were highly consistent with the neural dynamics of the cortical brain areas of the monkeys. "This artificial model describes for the first time in a biologically realistic way the neuronal processing from seeing an object for recognition, to action planning and hand muscle control during grasping," says Hansjörg Scherberger, head of the Neurobiology Laboratory at the DPZ, and he adds: "This model contributes to a better understanding of the neuronal processes in the brain and in the long term could be useful for the development of more efficient neuroprostheses."
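The article does not state how the consistency between model and brain was quantified. One common way to compare two sets of population dynamics is canonical correlation analysis (CCA), sketched below with hypothetical stand-in data; the data shapes and the choice of CCA are assumptions for illustration only.

```python
# Hedged sketch: quantifying similarity between model unit activity and recorded
# neural activity with canonical correlation analysis (CCA). Stand-in data only.
import numpy as np
from sklearn.cross_decomposition import CCA

rng = np.random.default_rng(1)
T = 200                                  # time points (e.g., trial-averaged, concatenated conditions)
model_units, neurons = 100, 80

model_activity  = rng.normal(size=(T, model_units))   # stand-in for network unit rates
neural_activity = rng.normal(size=(T, neurons))       # stand-in for recorded firing rates

cca = CCA(n_components=10)
cca.fit(model_activity, neural_activity)
X_c, Y_c = cca.transform(model_activity, neural_activity)

# Correlation of each pair of canonical variates: values near 1 would indicate
# shared low-dimensional dynamics between model and brain.
canon_corrs = [np.corrcoef(X_c[:, k], Y_c[:, k])[0, 1] for k in range(X_c.shape[1])]
print(np.round(canon_corrs, 2))
```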

More information: Jonathan A. Michaels et al. A goal-driven modular neural network predicts parietofrontal neural dynamics during grasping, Proceedings of the National Academy of Sciences (2020). DOI: 10.1073/pnas.2005087117