Examining how 'polyglot' neurons encode and decode sensorimotor 'chatter'

During sensorimotor processing in the brain, neurons are constantly bombarded with information from other neurons. When we use our eyes to interact with our environment, thousands of neurons communicate with each other to make sense of all the information coming in and react to it: If someone throws you a ball, your eyes track the ball, and a chain of neuron communication informs your hand where it must go to catch it.

But how these neurons communicate in between seeing and acting is a complex—and important—consideration. New research led by the Cognition and Sensorimotor Integration Lab at the University of Pittsburgh Swanson School of Engineering has uncovered how neurons encode and decode that information and differentiate between motor and sensory signals.

"We wanted to figure out how a decoder knows exactly when to initiate a if it is also getting signals when a movement isn't desired," said Uday K. Jagadisan, lead author and former graduate student in the Cognition and Sensorimotor Integration Lab. "We not only were able to uncover a reliable temporal pattern in the that was tied to movement, but we were also able to replicate it with microstimulation." 

The researchers studied how decoding happens when the signals lead to movement, and how that differs from the way information is encoded during visual processing. In other words, if the neurons are receiving both sensory and motor signals, how do they tell them apart? How does the brain know when to make the body move?

"The same groups of neurons can communicate information about sensations and movement, and the brain knows which signal is which. We found it's as if groups of neurons encode the same information in one 'language' to send messages about sensation and in another 'language' to send information about movement," explained Neeraj Gandhi, professor of bioengineering who leads the Cognition and Sensorimotor Integration Lab at Pitt. "The receiving groups of only act on one of the languages—that's the key." 

The research is the first to both pinpoint the encoding and decoding process and verify the findings using microstimulation. The researchers were able to repeat the pattern of neural activity in non-human primate brains and elicit the intended motor reaction. 

This discovery is vital for applications like brain-computer interfaces and neuroprosthetics. These artificial systems can assist people who have suffered brain injuries or other disorders that affect motor or sensory processes, but in order to work reliably, they need to decode brain activity and understand the intentions behind the patterns of activity.

"For neuroprosthetics, this research could create a way to put the brakes on and inhibit response when you don't need it, and release when actually needed, all based on neuron chatter," said Jagadisan. "Current technology is just delivering a pulse every few milliseconds. If you have the ability to control the time when each pulse is delivered, you can select the patterned microstimulation to achieve the effect that you want." 

The paper was published in the journal Current Biology.

More information: Uday K. Jagadisan et al, Population temporal structure supplements the rate code during sensorimotor transformations, Current Biology (2022). DOI: 10.1016/j.cub.2022.01.015
