This article has been reviewed according to Science X's editorial process and policies. Editors have highlighted the following attributes while ensuring the content's credibility: peer-reviewed publication; trusted source.
Reading a mouse's mind from its face: New tool decodes neural activity using facial movements
Fast and accurate mouse orofacial keypoint tracking. a, A total of 13 distinct keypoints selected for tracking the eye, mouth, whiskers and nose on the mouse face. b, Architecture of the Facemap network, a U-Net-style convolutional neural network. c, Error percentiles across test frames from a new mouse, where error is defined as the Euclidean distance between the ground-truth label and the prediction. d, Summary of Facemap performance on test data for different subgroups of keypoints. Human error is shown for a subset of the test frames labeled in two different sessions by a human annotator. Error bars represent s.e.m.; n = 400, 95, 361 and 300 keypoint labels for eye, mouth, nose and whiskers, respectively, across 100 test frames. e, Average error, in pixels, and processing speed, in video frames processed per second, of the Facemap tracker compared with other pose estimation tools. Error bars represent s.e.m.; n = 1,156 keypoint labels. f–h, Traces of x and y coordinates of keypoints during different orofacial behaviors. i, Prediction of keypoint traces into the future (test data). j, Variance explained of future prediction at different time lags, summarized for each face region. Error bars represent s.e.m.; n = 16 recordings. k, Decay time to 50% of variance explained at a 20 ms time lag. The 'x' represents the average. Two-sided Wilcoxon signed-rank test, ***P < 0.001 (eye versus whisker, P = 3.05 × 10⁻⁵; eye versus nose, P = 3.05 × 10⁻⁵; whisker versus nose, P = 1.53 × 10⁻⁴). Credit: Nature Neuroscience (2023). DOI: 10.1038/s41593-023-01490-6

Mice are always in motion. Even if there's no external motivation for their actions—like a cat lurking a few feet away—mice are constantly sweeping their whiskers back and forth, sniffing around their environment and grooming themselves.

These spontaneous actions light up neurons across many different regions of the brain, providing a moment-by-moment readout of what the animal is doing. But how the brain uses these persistent, widespread signals remains a mystery.

Now, scientists at HHMI's Janelia Research Campus have developed a tool that could bring researchers one step closer to understanding these enigmatic brain-wide signals. The tool, known as Facemap, uses deep neural networks to relate information about a mouse's eye, whisker, nose, and mouth movements to neural activity in the brain.

The findings are published in the journal Nature Neuroscience.

"The goal is: What are those behaviors that are being represented in those brain areas? And, if a lot of that information is in the face, then how can we track that better?" says Atika Syeda, a graduate student in the Stringer Lab and lead author of a new paper describing the research.

Creating Facemap

The idea to create a better tool for understanding brain-wide signals grew out of previous research from Janelia Group Leaders Carsen Stringer and Marius Pachitariu. They found that activity in many different areas across a mouse's brain, long thought to be noise, is actually driven by these spontaneous behaviors. Still unclear, however, was how the brain uses this information.

"The first step in really answering that question is understanding what are the movements that are driving this activity, and what exactly is represented in these brain areas," Stringer says.

To do this, researchers need to be able to track and quantify movements and correlate them with brain activity. But the tools enabling scientists to do such experiments weren't optimized for use in mice, so researchers haven't been able to get the information they need.

Janelia scientists have developed a tool that could bring researchers one step closer to understanding brain-wide signals driven by spontaneous behaviors. The tool, known as Facemap, uses deep neural networks to relate information about a mouse's eye, whisker, nose, and mouth movements to neural activity in the brain. Credit: Atika Syeda/HHMI Janelia Research Campus

"All of these different brain areas are driven by these movements, which is why we think it is really important to get a better handle on what these movements actually are, because our previous techniques really couldn't tell us what they were," Stringer says.

To address this shortcoming, the team looked at 2,400 video frames and labeled distinct points on the mouse face corresponding to different facial movements associated with spontaneous behaviors. They homed in on 13 key points on the face that represent individual behaviors, like whisking, grooming, and licking.

The team first developed a neural network-based model that could identify these key points in videos of mouse faces collected in the lab under various experimental setups.
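As the figure caption notes, the tracker is a U-Net-style network, which typically outputs one heatmap per keypoint; the keypoint's coordinates are then read off as the location of each heatmap's peak. The sketch below illustrates only that final peak-extraction step on synthetic heatmaps; the function name and data are illustrative, not Facemap's actual code.

```python
import numpy as np

def heatmaps_to_keypoints(heatmaps):
    """Convert a stack of heatmaps, one per keypoint, into (x, y)
    coordinates by locating each map's peak. Shape: (K, H, W)."""
    n_keypoints, height, width = heatmaps.shape
    coords = np.zeros((n_keypoints, 2))
    for k in range(n_keypoints):
        flat_index = np.argmax(heatmaps[k])
        y, x = np.unravel_index(flat_index, (height, width))
        coords[k] = (x, y)
    return coords

# Toy example: 13 keypoints (the number used in the paper) on a
# 64 x 64 grid, each heatmap a Gaussian bump at a known location.
rng = np.random.default_rng(0)
true_xy = rng.integers(5, 59, size=(13, 2))
ys, xs = np.mgrid[0:64, 0:64]
heatmaps = np.stack([
    np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / 8.0)
    for x, y in true_xy
])
pred = heatmaps_to_keypoints(heatmaps)
assert np.allclose(pred, true_xy)  # peaks recovered exactly here
```

Real trackers often refine the integer peak with sub-pixel interpolation, but the argmax step above is the core of turning network output into the keypoint traces described next.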

They then developed another deep neural network-based model to correlate this key facial point data representing mouse movement to neural activity, allowing them to see how a mouse's spontaneous behaviors drive neural activity in a particular brain region.
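Facemap's second model is a deep network, which the article does not detail; as a simpler stand-in, a linear (ridge) regression from keypoint traces to neural activity illustrates the underlying idea and the variance-explained metric used to compare models. All data below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins: 13 keypoints give 26 (x, y) traces over
# 1,000 frames, and 50 "neurons" whose activity is partly
# movement-driven plus noise.
T, D, N = 1000, 26, 50
keypoints = rng.standard_normal((T, D))
weights = rng.standard_normal((D, N))
activity = keypoints @ weights + 0.5 * rng.standard_normal((T, N))

# Ridge regression (closed form) from keypoints to activity.
lam = 1.0
XtX = keypoints.T @ keypoints + lam * np.eye(D)
W_hat = np.linalg.solve(XtX, keypoints.T @ activity)
pred = keypoints @ W_hat

# Variance explained: the fraction of neural variance the
# movement-based prediction accounts for.
residual = np.var(activity - pred, axis=0).sum()
total = np.var(activity, axis=0).sum()
var_explained = 1.0 - residual / total
print(f"variance explained: {var_explained:.2f}")
```

A deep network like Facemap's can capture nonlinear relationships between movements and activity that this linear sketch cannot, which is one reason it predicts substantially more neural variance.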

Facemap is more accurate and faster than previous methods used to track orofacial movements and behaviors in mice. The tool is also specifically designed to track mouse faces and has been pretrained to track many different mouse movements. These factors make Facemap a particularly effective tool: The model can predict twice as much neural activity in mice compared to prior methods.

In earlier work, the team found that spontaneous behaviors activated neurons in the visual cortex, the brain region that processes visual information from the eye. Using Facemap, they discovered that these neuronal activity clusters were more spread out across this region of the brain than previously thought.

Facemap is freely available and easy to use. Hundreds of researchers around the world have already downloaded the tool since it was released last year.

"This is something that if anyone wanted to get started, they could download Facemap, run their videos, and get their results on the same day," Syeda says. "It just makes research, in general, much easier."
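For readers who want to try the workflow Syeda describes, Facemap's project documentation (at the time of writing) describes installation via pip and a graphical interface for loading videos; exact commands and requirements may differ by version.

```shell
# Install Facemap into a Python environment (per the project's docs).
pip install facemap

# Launch the graphical interface to load a video and run tracking.
python -m facemap
```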

More information: Atika Syeda et al, Facemap: a framework for modeling neural activity based on orofacial tracking, Nature Neuroscience (2023). DOI: 10.1038/s41593-023-01490-6

Journal information: Nature Neuroscience

Citation: Reading a mouse's mind from its face: New tool decodes neural activity using facial movements (2023, November 20) retrieved 6 December 2023 from
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
