

An AI-powered system decodes emotional valence from mouse facial expressions, identifying ear dynamics as key indicators. By integrating facial kinematics with synchronized neural recordings, this approach establishes a direct link between behavior and brain activity, advancing automated emotion recognition in animal models.

Abstract

Despite advances in linking mouse facial expressions to emotional states, the specific facial features and neural signatures remain elusive. An artificial intelligence (AI)-based framework that decodes mouse facial expressions is presented, revealing stable valence and arousal dimensions analogous to those described in human emotion models. Facial expressions emerge as robust indicators of positive and negative emotional responses, validated through pharmacological manipulations, while responses to hallucinogens highlight the potential of valence-specific prototype modeling for interpreting previously uncharacterized emotional states. Using automated multikeypoint tracking, patterned facial kinematics consistent within the same emotional valence are identified. Ear dynamics, in particular, emerge as critical features, offering distinct and sensitive markers of subtle emotional distinctions. Neurocorrelational analyses and optogenetic inhibition targeting the ventral tegmental area further demonstrate the intricate link between facial expressions and valence-specific neural activity in dopaminergic and GABAergic neurons. These findings establish a precise, high-temporal-resolution platform for objectively decoding murine emotional states, advancing the understanding of emotional processing mechanisms and informing the development of mood-regulating therapies.

Advanced Science, Volume 12, Issue 42, November 13, 2025.
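The abstract does not detail how valence-specific prototype modeling works; a minimal sketch of the general idea, under the assumption that each frame's facial kinematics are reduced to a feature vector and compared against mean "prototype" vectors learned from known positive and negative stimuli, could look like the following. All vectors, names, and numbers here are illustrative, not the authors' implementation:

```python
import numpy as np

def valence_prototype_scores(features, prototypes):
    """Cosine similarity of one frame's facial-kinematic feature vector
    to each valence prototype; the best-matching prototype labels the frame."""
    scores = {}
    for label, proto in prototypes.items():
        denom = float(np.linalg.norm(features) * np.linalg.norm(proto)) or 1.0
        scores[label] = float(np.dot(features, proto)) / denom
    return scores

# Hypothetical prototypes: mean keypoint-displacement vectors (e.g., ear-tip
# trajectories) averaged over trials with known positive/negative stimuli.
prototypes = {
    "positive": np.array([0.9, 0.2, -0.1]),
    "negative": np.array([-0.8, 0.4, 0.3]),
}
sample = np.array([0.7, 0.1, -0.2])  # one frame's ear-kinematic features
scores = valence_prototype_scores(sample, prototypes)
label = max(scores, key=scores.get)
```

A frame whose kinematics resemble neither prototype (both similarities near zero) could then be flagged as an uncharacterized state, which is how such a scheme would surface the atypical expressions the abstract describes under hallucinogens.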
Medical Journal
|15th Jan, 2026
|Nature Medicine's Advance Online Publication (AOP) table of contents.
Medical Journal
|15th Jan, 2026
|Wiley