
Patterns of brain activity accurately predict tongue shape while feeding

Quantifying intraoral tongue deformation and related cortical responses. a A constellation of 7 radio-opaque markers (blue spheres) was implanted into the tongue body to capture whole-tongue kinematics. b Multielectrode arrays were implanted in the orofacial region of the primary motor cortex (M1o; dark blue, Utah array; light blue, floating microelectrode array) and somatosensory cortex (SCo; red, Utah array; orange, floating microelectrode array). c, d While the subjects fed on grapes, biplanar videoradiography (c) recorded the 2D, intraoral motion of the markers, from which 3D trajectories (d) were computed. In d, different colors represent different tongue markers, S-I, superoinferior; A-P, anteroposterior; M-L, mediolateral, all relative to the cranium. e Spike raster of neural data (100 representative neurons from M1o) collected synchronously with the kinematics shown in d. f Top, digital renders of tongue and mandible posture at three timepoints in a chewing cycle and computed kinematic variables. Bottom, a constrained Procrustes superimposition was performed to remove translational, rotational, and scale changes in marker positions, leaving only shape change. g Percent variance explained by first 10 components of a principal component analysis on marker positions (relative to the cranium) across all trials. h Same as g, but computed on marker positions in Procrustes shape space. See “Methods: XROMM data processing” for details on image generation. Credit: Nature Communications (2023). DOI: 10.1038/s41467-023-38586-3
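For readers curious how the Procrustes-and-PCA analysis described in the caption works in practice, the sketch below aligns synthetic tongue-marker constellations with a plain orthogonal Procrustes fit and then runs PCA on the resulting shapes. The data and the unconstrained alignment are illustrative assumptions; the study itself used a constrained Procrustes superimposition on real XROMM data.

```python
# Minimal sketch: Procrustes alignment of tongue-marker constellations
# followed by PCA. Synthetic data; not the authors' actual pipeline.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_frames, n_markers = 500, 7

# Synthetic stand-in for 3D trajectories of 7 tongue markers:
# array of shape (frames, markers, xyz)
markers = rng.normal(size=(n_frames, n_markers, 3))

def procrustes_align(X, ref):
    """Remove translation, scale, and rotation from one frame's
    marker constellation X (7x3), leaving only shape change."""
    Xc = X - X.mean(axis=0)            # remove translation
    Xc = Xc / np.linalg.norm(Xc)       # remove scale
    Rc = ref - ref.mean(axis=0)
    Rc = Rc / np.linalg.norm(Rc)
    # Optimal rotation via SVD (orthogonal Procrustes problem)
    U, _, Vt = np.linalg.svd(Xc.T @ Rc)
    R = U @ Vt
    if np.linalg.det(R) < 0:           # avoid improper reflections
        U[:, -1] *= -1
        R = U @ Vt
    return Xc @ R

ref = markers[0]
shapes = np.stack([procrustes_align(f, ref) for f in markers])

# PCA on the aligned shapes, each frame flattened to a 21-dim vector,
# analogous to the variance-explained plot in panel h of the figure
pca = PCA(n_components=10)
pca.fit(shapes.reshape(n_frames, -1))
print(pca.explained_variance_ratio_)
```

Removing translation, rotation, and scale before PCA is what lets the principal components capture pure shape change rather than gross motion of the tongue within the mouth.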

Neuroscientists have learned a great deal about how the brain interprets and controls the movements that make up everyday actions like walking, reaching, and grasping objects. But the mechanics of fundamental behaviors like eating, drinking, and communicating have been more difficult to measure, largely because a crucial component, the tongue, is mostly hidden from view.

New research from the University of Chicago takes up that challenge by using 3D X-ray videography and machine learning to record intricate movements of the tongue in non-human primates while they are feeding.

When combined with recordings of neural activity taken simultaneously from the sensorimotor cortex, the study, published in Nature Communications, shows that the 3D shape of the tongue can be accurately decoded from brain activity, opening possibilities for brain-computer interface-based prosthetics to restore lost functions of feeding and speech.

Infinite degrees of freedom

In addition to being tucked away inside the mouth, the tongue presents another biomechanical challenge. Movements of the arms or legs are constrained by bones and joints of the skeleton, giving them a certain amount of predictability. Since the tongue is made entirely of muscle and other soft tissue, its freedom of movement is almost limitless (except for the unlucky few who can't roll their tongue into a U-shape).

"When we think of the brain controlling muscles, we almost invariably think about it like actuating an arm or a leg, which has rigid bones moving about a joint," said J.D. Laurence-Chasen, Ph.D., the study's lead author and a former postdoctoral scholar at UChicago who now works as a researcher at the National Renewable Energy Laboratory in Golden, Colorado.

"The tongue has a totally different anatomy. There are no rigid internal structures. There's a ton of different muscles with overlapping functions, and so, it has functionally infinite degrees of freedom."

As a Ph.D. student and postdoctoral scholar, Laurence-Chasen used 3D X-ray videography and machine learning tools to study how the brain controls the dynamic tongue and jaw movements that are crucial for feeding and speech. In the latest study, he worked with Nicho Hatsopoulos, Ph.D., and Callum Ross, Ph.D., both professors in Organismal Biology and Anatomy, to capture the tongue movements of two male rhesus macaque monkeys while they were feeding on grapes.

The monkeys each had a set of seven radio-opaque markers attached to their tongues. These markers could be detected by two X-ray video cameras, recording the movement and shape of the tongue while it was still inside the mouth, much like the motion capture technology used for special effects in movies or video games.

The monkeys eat fast, chewing two to three times per second, so the researchers used a novel 3D imaging technology called X-ray Reconstruction of Moving Morphology (XROMM) to capture and process high-speed data on the tongue's movements, shape changes, and deformations.
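To make the biplanar reconstruction step concrete: given calibrated projection matrices for the two X-ray views, each marker's 3D position can be triangulated from its pair of 2D detections. The following is a minimal sketch using toy camera matrices and the standard direct linear transformation (DLT); it illustrates the geometry rather than XROMM's actual processing code.

```python
# Minimal sketch: triangulating a marker's 3D position from two
# calibrated views via the direct linear transformation (DLT).
# Camera matrices here are toy values, not the study's calibration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Recover a 3D point from 2D observations uv1, uv2 in two
    views with 3x4 projection matrices P1, P2 (homogeneous DLT)."""
    A = np.stack([
        uv1[0] * P1[2] - P1[0],
        uv1[1] * P1[2] - P1[1],
        uv2[0] * P2[2] - P2[0],
        uv2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest
    # singular value of A (least-squares solution of A @ X = 0)
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]           # dehomogenize

# Toy example: an identity view and a horizontally shifted view
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
pt = np.array([0.2, -0.1, 4.0, 1.0])      # ground-truth 3D point
uv1 = (P1 @ pt)[:2] / (P1 @ pt)[2]        # its 2D projection, view 1
uv2 = (P2 @ pt)[:2] / (P2 @ pt)[2]        # its 2D projection, view 2
print(triangulate(P1, P2, uv1, uv2))      # ~ [0.2, -0.1, 4.0]
```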

At the same time, microelectrode arrays implanted in the orofacial regions of the primary motor and somatosensory cortices recorded neural activity while the monkeys were feeding. Laurence-Chasen and the team employed decoding algorithms, a form of machine learning software, to analyze the brain activity and learn from this information.

When the researchers matched the neural recordings with the actual movements captured by the X-ray cameras, they found that information about the 3D shape and movement of the tongue is present in the motor cortex. They could then use that information to accurately decode and predict the shape of the tongue from the neural activity alone.
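The article doesn't spell out which decoders the team used, so as a hypothetical illustration of the general idea, the sketch below trains a simple ridge-regression decoder that maps recent binned spike counts onto marker coordinates. All data here is synthetic, and the lag stacking, bin counts, and regularization strength are assumptions made for the sake of the example.

```python
# Minimal sketch of neural decoding: predict 3D tongue-marker
# coordinates from binned spike counts with ridge regression.
# Synthetic data; the study's actual decoders may differ.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_bins, n_neurons, n_outputs = 2000, 100, 21   # 7 markers x 3 coords

# Synthetic spike counts and kinematics linearly related to them
spikes = rng.poisson(2.0, size=(n_bins, n_neurons)).astype(float)
W_true = rng.normal(scale=0.1, size=(n_neurons, n_outputs))
kinematics = spikes @ W_true + rng.normal(scale=0.5,
                                          size=(n_bins, n_outputs))

# Stack several history bins so the decoder sees recent neural context
lags = 5
X = np.stack([spikes[i - lags:i].ravel() for i in range(lags, n_bins)])
y = kinematics[lags:]

# Hold out the final 20% of the session for evaluation
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2,
                                          shuffle=False)
decoder = Ridge(alpha=10.0).fit(X_tr, y_tr)
print("held-out R^2:", decoder.score(X_te, y_te))
```

A high held-out R-squared on this synthetic data only confirms the pipeline runs; real neural recordings are far noisier and typically call for cross-validated, often nonlinear, decoders.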

"We knew from some earlier research that basic movements of the tongue involved the cortex, but we were surprised by the extent and resolution of information about the tongue shape that we could extract so readily," Laurence-Chasen said.

A future for soft prosthetics

Intriguingly, this data is represented in the same way that arm movements and 3D positions of the hand are represented in the brain. Hatsopoulos and Sliman Bensmaia, Ph.D., James and Karen Frank Family Professor of Organismal Biology and Anatomy at UChicago, have already used that body of research to translate signals from the motor cortex into software algorithms that drive robotic prosthetic limbs, which amputees and quadriplegics can move with their minds while receiving natural sensations of touch in return.

While the technology for tongue-based applications isn't nearly as far along, a similar approach could help patients who have lost functions of feeding and speech.

"Dysphagia and difficulty swallowing is a big problem, especially with the elderly," Hatsopoulos said. "If we could use this information about the tongue and its shape to decode when a swallow is about to happen, then you could connect that to a device that could stimulate the right set of muscles to help them swallow."

"What J.D. has been able to do here to decode the shapes of soft tissue, not a skeletal system, is novel," he said. "I think it's super exciting."

More information: Jeffrey D. Laurence-Chasen et al, Robust cortical encoding of 3D tongue shape during feeding in macaques, Nature Communications (2023). DOI: 10.1038/s41467-023-38586-3

