Computing hubs in the hippocampus and cortex

Recording paradigm. Schematic representation of (A) the simultaneous mEC/HPC recording setup and (B) the simultaneous mPFC/HPC setup during anesthesia and natural sleep. The Nissl-stained sections display the anatomical regions recorded by the different silicon probes used (yellow boxes). Arrows represent the anatomical connectivity (●: source layer; arrowhead: target layer) between the dorsal hippocampus CA1 region (SOr: stratum oriens; SP: stratum pyramidale; SR: stratum radiatum; SLM: stratum lacunosum moleculare) and the dorso-medial entorhinal cortex (mEC, layers I to VI) and medial prefrontal cortex (mPFC, layers I to VI). (C) Number of recorded single units (color-coded on the right scale bar) per anatomical layer (rows), for each of the 30 recordings (columns). (D) Number of recordings (color-coded on the right scale bar) simultaneously targeting pairs of two different anatomical layers. Credit: Science Advances, doi: 10.1126/sciadv.aax4843

Neural computation occurs in large neural networks within dynamic brain states, yet it remains poorly understood whether such functions are performed by specific subsets of neurons or whether they emerge in specific, dynamically changing assemblies. In a recent study, Wesley Clawson and co-workers at the Institute of Neuroscience Systems in France used high-density recordings in the hippocampus, medial entorhinal cortex and medial prefrontal cortex of the rat. Using this animal model, they identified computing substates in which specific computing hub neurons performed well-defined operations of information storage and sharing in a brain state-dependent manner.

The scientists retrieved distinct computing substates within each global brain state, including REM (rapid-eye-movement) and NREM (non-rapid-eye-movement) sleep. The results suggested that these functional roles were not hardwired but were reassigned at fast timescales. Clawson et al. identified sequences of substates whose temporal organization fluctuated between order and disorder. The results of the study are published in Science Advances.

Information processing in the brain can be approached on three levels: (1) biophysical, (2) algorithmic and (3) behavioral. The algorithmic level remains the least understood; it describes how emergent functional computations can be decomposed into simpler processing steps implemented by complex architectures. At the lowest level of individual system components, such as single neurons, the building blocks of distributed information processing can be modeled as primitive operations of storing, transferring or nonlinearly integrating information streams. During resting-state conditions, both blood-oxygen-level-dependent (BOLD) and electroencephalogram (EEG) signals are characterized by discrete periods of functional connectivity or topographical stability, known as resting-state networks and microstates. Neuroscientists have demonstrated that transitions between these large-scale epochs are neither periodic nor random but follow a complex, fractal syntax that is hitherto not understood.

For instance, does this macroscale organization also occur at the microscale? Is switching between states at the level of the microcircuit associated with different styles of information processing? To answer these questions, the first goal of Clawson and co-workers was to determine whether information processing at the local neuronal circuit level is structured into discrete sequences of substates, a hallmark of computation. For this, they focused on low-level computing operations at the single-neuron level, such as basic information storage and sharing. They studied two conditions, anesthesia and natural sleep, characterized by theta (THE)/slow oscillation (SO) states and REM/nonREM sleep episodes, respectively.

Unsupervised extraction of states and hubs. (A) Cartoon representing the approximate recording locations (mEC and CA1; mPFC and CA1) during the two experiment types, anesthesia and sleep. (B) Example LFP traces taken from the 32 channels in CA1 (blue) and 32 channels in mEC (orange), with examples of isolated unit activity from the same recording shown below. (C) For each time window (t), different features are extracted and represented by FeatureVector(t), which holds a feature value for each channel or single unit recorded. Four features were considered: spectral band-averaged powers (from LFP channels), single-unit firing rates, information storage, and information sharing. (D) Left: Cartoon representation of Msim. To extract substates and their temporal dynamics, the scientists constructed a feature similarity matrix Msim in which the entry Msim(ta, tb) measured the Pearson correlation between the vectors FeatureVector(ta) and FeatureVector(tb). Time flows from the top-left corner horizontally to the top-right corner and vertically to the bottom-left corner. A block (square) along the diagonal in the resulting image identifies a period of feature stability, i.e., a substate. A block appearing several times horizontally or vertically indicates that a feature pattern is repeated several times. Middle: Unsupervised clustering identifies the different substates (indicated by a number) and their temporal dynamics (the vertical axis corresponds to that of the similarity matrix). Right: The scientists identified computing hub cells, i.e., neurons that display exceptionally high values for a given feature, associated with given substates. Note that reoccurring states have the same hub cells (state 3 in this example). The asterisks (*) mark neurons in the top 5% for the examined feature. Credit: Science Advances, doi: 10.1126/sciadv.aax4843.
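The construction of Msim is simple enough to illustrate in a few lines. The following Python sketch (not the authors' published code) builds the feature similarity matrix from per-window feature vectors via Pearson correlation and then groups windows into substates with unsupervised clustering; the window count, feature dimensionality and the choice of K-means are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy data standing in for real recordings (dimensions are assumptions):
# one feature vector per time window, one column per channel or unit.
rng = np.random.default_rng(0)
features = rng.normal(size=(500, 64))    # (windows, channels/units)

# Msim(ta, tb) = Pearson correlation between FeatureVector(ta) and
# FeatureVector(tb); np.corrcoef correlates the rows directly.
Msim = np.corrcoef(features)

# Unsupervised clustering groups windows with similar feature vectors
# into substates; diagonal blocks of Msim correspond to runs of windows
# carrying the same substate label.
n_substates = 6                           # assumed, set per recording
labels = KMeans(n_clusters=n_substates, n_init=10,
                random_state=0).fit_predict(features)
print(labels[:20])                        # substate sequence over time
```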

During the work, Clawson and colleagues considered the CA1 region of the hippocampus (HPC), the medial entorhinal cortex (mEC) and the medial prefrontal cortex (mPFC) to investigate the algorithmic properties shared between the regions. Their second goal was to determine whether primitive processing operations were localized or distributed in the microcircuit, as previously proposed. This concept raised two key questions:

(1) Are specific operations driven by a few key neurons in a rich-club architecture?

(2) Do neurons have predetermined computing roles as information "sharers" or "storers," with rigidly defined partners in their functional interactions? More specifically, is information routed through a hardwired neuronal "switchboard" system?

Overall, the findings suggest a more distributed and less hierarchical style of information processing in neuronal microcircuits, closer to emergent liquid-state computation than to pre-programmed processing pipelines.

Clawson et al. recorded neurons simultaneously from the CA1 region of the HPC (hippocampus) and the mEC (medial entorhinal cortex) under anesthesia, and from CA1 and the mPFC (medial prefrontal cortex) during natural sleep. They focused on two quantities:

(1) How much information could a neuron buffer in time? This was quantified as active information storage, a measure of how much a neuron's past activity predicts its present activity, and

(2) Information sharing: how much of the information in a neuron's activity was available to other neurons? This was quantified as mutual information between the activities of pairs of neurons.
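Both measures can be illustrated compactly from discretized spike trains. The sketch below is a simplified illustration, not the published pipeline: it estimates a one-step active information storage for a single neuron and the mutual information shared between two neurons using binary activity bins; the bin size, the one-bin history and the toy data are simplifying assumptions.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Mutual information I(X;Y) in bits between two discrete series."""
    n = len(x)
    pxy, px, py = Counter(zip(x, y)), Counter(x), Counter(y)
    bits = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        bits += p_ab * np.log2(p_ab / ((px[a] / n) * (py[b] / n)))
    return bits

rng = np.random.default_rng(1)
spikes = (rng.random(10_000) < 0.1).astype(int)   # binned activity, neuron A

# Active information storage (with a one-bin history, an assumption):
# how much the neuron's immediate past predicts its present activity.
storage = mutual_information(spikes[:-1], spikes[1:])

# Information sharing: mutual information with a second neuron whose
# activity is a noisy copy of the first (a toy stand-in for coupling).
noise = rng.random(10_000) < 0.05
other = np.where(noise, 1 - spikes, spikes)
sharing = mutual_information(spikes, other)
print(f"storage = {storage:.4f} bits, sharing = {sharing:.4f} bits")
```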

LEFT: Firing substates. Examples of similarity matrices Msim obtained from Firing(t) at different times in mEC during anesthesia (A) and in mPFC during natural sleep (D), measured in two animals. The bar below Msim indicates the transitions occurring between THE/REM (dark blue) and SO/nonREM (light blue). Although there were only two global brain states, six (A) and five (D) firing substates were identified. (B and E) Examples of the firing density of three neurons (a, b, and c) recorded in mEC and mPFC, respectively, with amplitude normalized for visualization. Neurons tended to fire in specific substates, indicated here with a color code. These examples also illustrate the switching between different firing substates inside a given global oscillatory state and their overlap across different global oscillatory states. The analysis of all recordings revealed that a majority of firing substates tended to occur during a preferred global oscillatory state, as indicated by the bimodal histograms during anesthesia (C) and natural sleep (F), respectively. RIGHT: Information storage substates. Examples of similarity matrices Msim obtained from Storage(t) at different times in mEC during anesthesia (A) and in CA1 during natural sleep (D). As for firing substates, more storage substates were identified (six and seven, respectively, in the examples shown) than global oscillatory states. Panels (B) and (E) show that the participation of three individual neurons in information storage (indicated in arbitrary units for visualization) was substate dependent. The values reported above the plots correspond to the average firing rate of neuron b (green) during the corresponding epochs within consistent storage substates. The analysis of all recordings showed that storage substates tended to occur during a preferred global oscillatory state, as indicated by the bimodal histograms for anesthesia (C) and natural sleep (F). Credit: Science Advances, doi: 10.1126/sciadv.aax4843.

The neuroscientists identified global brain states by clustering analysis of field recordings performed in the CA1 region. Using unsupervised clustering, they identified two states of anesthesia corresponding to periods dominated by slow oscillations (SO state) and theta oscillations (THE state), as well as two states of natural sleep corresponding to REM and nonREM episodes.
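As a rough illustration of this step, the sketch below clusters time windows of a single LFP channel into two global states from their spectral band powers; the sampling rate, window length, band limits and the use of Welch spectra plus K-means are assumptions made for the example, not the paper's exact pipeline.

```python
import numpy as np
from scipy.signal import welch
from sklearn.cluster import KMeans

fs, win = 1250, 10 * 1250                 # assumed: 1250-Hz LFP, 10-s windows
rng = np.random.default_rng(2)
lfp = rng.normal(size=win * 100)          # noise standing in for a CA1 LFP

def band_powers(segment):
    """Mean spectral power in slow (0.5-4 Hz) and theta (5-10 Hz) bands."""
    f, pxx = welch(segment, fs=fs, nperseg=2 * fs)
    return [pxx[(f >= lo) & (f < hi)].mean() for lo, hi in ((0.5, 4), (5, 10))]

powers = np.array([band_powers(lfp[i:i + win])
                   for i in range(0, len(lfp) - win + 1, win)])

# Two clusters separate SO-dominated from THE-dominated epochs under
# anesthesia (or nonREM from REM epochs in natural sleep).
states = KMeans(n_clusters=2, n_init=10,
                random_state=0).fit_predict(np.log(powers))
print(states[:20])                        # global state label per window
```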

During proof-of-principle experiments in the animal models, the scientists assessed brain state-dependent firing substates and revealed, for example, six firing substates in mEC during anesthesia and five in mPFC during natural sleep. The scientists showed that neuronal activity was compartmentalized, with discrete switching events from one substate to another. The firing substates were brain state and region specific, without strict entrainment by the global oscillatory state. At any given time, Clawson et al. quantified the amount of information conveyed by neuronal activity using Shannon entropy.
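Shannon entropy itself is straightforward to compute from an empirical distribution of activity. The following minimal sketch (toy data, not values from the study) measures the entropy of one neuron's spike-count distribution in bits.

```python
import numpy as np

def shannon_entropy(counts):
    """Shannon entropy H = -sum(p * log2 p) of an empirical distribution."""
    p = counts / counts.sum()
    p = p[p > 0]                           # 0 * log 0 is taken as 0
    return -(p * np.log2(p)).sum()

# Toy example: spike counts per time bin for one neuron.
rng = np.random.default_rng(3)
spike_counts = rng.poisson(2.0, size=1000)
hist = np.bincount(spike_counts).astype(float)
print(f"H = {shannon_entropy(hist):.3f} bits")
```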

Information sharing substates. The cartoon in (A) shows an example of a sharing assembly for a given sharing hub neuron across three nonsequential occurrences of the same substate. The total strength of in- and out-going sharing is equal (large, external arrows) during ta, tb, and tc, while the assembly changes (smaller, internal arrows). The changing size of the internal arrows represents the sharing strength of that particular functional connection between the sharing hub and its source and target neurons. (B) Similarity matrices Msim for sharing strengths Sharing_S(t) (top) and sharing assemblies Sharing_A(t) (bottom) in mEC during anesthesia (left) and mPFC during natural sleep (right). The scientists identified a multiplicity of substates within each global oscillatory state, as shown by the colored bars below the feature similarity matrices. The similarity matrices for sharing strengths and assemblies have a matching block structure. However, sharing strengths were very stable within a substate (red-hued blocks), while sharing assemblies were highly volatile (light blue-hued blocks). (C) This is quantified for each sharing assembly substate by a liquidity coefficient. For all observed sharing substates across all regions and global oscillatory states in all animals, the liquidity of sharing assemblies was much larger than that of sharing strengths. (D) Most sharing substates occurred preferentially during one global oscillatory state, for anesthesia and natural sleep combined. Credit: Science Advances, doi: 10.1126/sciadv.aax4843.

Similar to the firing substates, the information storage substates did not strictly align between the studied regions. During anesthesia, the scientists observed the absolute storage values to be stronger in mEC than in CA1. However, during natural sleep, the storage values for CA1 were two orders of magnitude larger than during anesthesia. The results showed that information storage was dynamically distributed in discrete substates in a brain state- and brain region-dependent manner. As a result, the storage capacity of a neuron could vary substantially across time.

Following up on the single-cell level analysis conducted so far, the scientists next determined with which partner neurons the sharing cells exchanged information. Although the sum of incoming and outgoing information remained constant within each sharing substate, the information was shared across different cell assemblies from one time period to the next. All three brain regions showed remarkably liquid-like sharing assemblies across brain states, with region-specific characteristics.
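The contrast between stable sharing strengths and volatile assemblies can be illustrated numerically. The article does not give the exact formula for the liquidity coefficient, so the sketch below uses an illustrative proxy of our own: one minus the mean pairwise Pearson correlation between feature vectors taken from different occurrences of the same substate.

```python
import numpy as np

def liquidity(vectors):
    """Illustrative liquidity proxy (an assumption, not the paper's exact
    coefficient): 1 - mean pairwise correlation across occurrences."""
    corr = np.corrcoef(vectors)
    upper = corr[np.triu_indices_from(corr, k=1)]
    return 1.0 - upper.mean()

rng = np.random.default_rng(4)
base = rng.normal(size=50)
# Sharing strengths: nearly identical across occurrences -> low liquidity.
strengths = np.stack([base + 0.05 * rng.normal(size=50) for _ in range(3)])
# Sharing assemblies: partners reshuffle each time -> high liquidity.
assemblies = rng.normal(size=(3, 50))
print(f"strengths: {liquidity(strengths):.2f}, "
      f"assemblies: {liquidity(assemblies):.2f}")
```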

Since functional, effective and anatomical hub neurons had previously been identified in the brain, Clawson et al. complemented the concept by introducing storing and sharing hubs: neurons that displayed elevated storage or sharing values, respectively. The scientists observed a stronger general tendency for inhibitory interneurons than for excitatory cells to serve as computing hubs, a tendency that held in the cortical regions during both anesthesia and sleep. In total, only about 12 percent of the neurons functioned as "multifunction hubs" serving both storage and sharing.
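Hub detection itself follows the top-5% criterion mentioned in the figure captions. The sketch below (toy data; the per-substate, column-wise application of the threshold is our assumption) flags a neuron as a computing hub in a substate when its value exceeds the 95th percentile, and shows how few neurons are hubs at any one time while many are hubs at least once.

```python
import numpy as np

rng = np.random.default_rng(5)
n_neurons, n_substates = 120, 6
# Toy per-substate storage values for each neuron.
storage = rng.lognormal(size=(n_neurons, n_substates))

# A neuron is a storage hub in a substate if its value is in the top 5%
# for that substate (threshold computed per substate column).
thresholds = np.percentile(storage, 95, axis=0)
is_hub = storage >= thresholds

per_substate = is_hub.mean(axis=0)     # ~5% of neurons hub per substate
ever_hub = is_hub.any(axis=1).mean()   # fraction hub in at least one substate
print(per_substate.round(2), f"hub in >=1 substate: {ever_hub:.0%}")
```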

A democracy of computing hubs. (A) Within every computing substate, some neurons exhibited significantly strong values of information storage or sharing (computing hubs). However, these computing hubs generally changed from one substate to the other, as shown in this example. Different rows correspond to different single units recorded in mEC during anesthesia, and different columns correspond to different computing substates (left, storage substates 1 to 6; right, sharing substates 1 to 4). An entry is colored yellow when the neuron is a computing hub within the corresponding substate. In the example shown, while ~9% of neurons on average acted as computing hubs at any one time, more than 40% of the recorded units were recruited as hubs for at least one substate when considering all the computing substates together (vertical bar on the right). (B and C) The probability that a neuron acted as a hub depended only loosely on its anatomical localization. Panel (B) shows that for all regions and layers, the probability that a neuron acted as a computing hub at least once was always larger than 30%. Inhibitory (i) neurons tended to be recruited as hubs more frequently than excitatory (e) neurons. Analogously, panel (C) shows that none of the layers displayed a specialization in either of the two processing operations of information storage or sharing. Asterisks denote statistically significant comparisons (lack of overlap between 95% confidence intervals for the probability, reported as vertical ranges on top of the histogram bars). In (C), a yellow horizontal line indicates the fraction of computing hub cells that also happen to be high-firing-rate cells; many computing hubs thus have an average or low firing rate. In (B) and (C), for CA1, light blue represents anesthesia and dark blue represents natural sleep. Credit: Science Advances, doi: 10.1126/sciadv.aax4843.

The findings showed the existence of substate sequences in three different brain regions, the HPC, the mEC and the mPFC, during anesthesia and natural sleep. Since the analysis was limited to a few brain states, the scientists note that they may have underestimated the proportion of GABAergic neurons acting as computational hubs. The observed capacity to generate complex sequences of patterns is a hallmark of self-organizing systems and is associated with their emergent potential to perform universal computation. Such dynamics at the "edge of chaos" (the transition between order and disorder) offer advantages for information processing. Understanding such patterns and dynamics within brain states could benefit the early interpretation of neurological disorders.

In this way, Wesley Clawson and co-workers revealed a rich algorithmic-level organization of brain computation during natural sleep and anesthesia. The work indicates the existence of a basic architecture for low-level computations shared by diverse neuronal circuits. Although the work did not prove the functional relevance of substate dynamics, it can serve as a platform for exploring previously unidentified neural computations. The neuroscientists next aim to perform similar analyses during behavioral tasks, such as goal-driven maze navigation.

More information:
1. Wesley Clawson et al. Computing hubs in the hippocampus and cortex, Science Advances (26 June 2019). advances.sciencemag.org/content/5/6/eaax4843

2. James Crutchfield. Between order and chaos, Nature Physics (22 December 2011). www.nature.com/articles/nphys2190

3. Olivier Marre et al. Reliable Recall of Spontaneous Activity Patterns in Cortical Networks, Journal of Neuroscience (November 2009). www.jneurosci.org/content/29/46/14596

4. P. Bonifazi et al. GABAergic Hub Neurons Orchestrate Synchrony in Developing Hippocampal Networks, Science (December 2009). science.sciencemag.org/content/326/5958/1419

© 2019 Science X Network

