Cross-modal deep generative models reveal the cortical representation of dancing
Yu Takagi,
Daichi Shimizu,
Mina Wakabayashi,
Ryu Ohata and
Hiroshi Imamizu
Additional contact information
Yu Takagi: The University of Tokyo, Department of Psychology, Graduate School of Humanities and Sociology
Daichi Shimizu: Kobe University, Graduate School of Human Development and Environment
Mina Wakabayashi: The University of Tokyo, Department of Psychology, Graduate School of Humanities and Sociology
Ryu Ohata: The University of Tokyo, Department of Psychology, Graduate School of Humanities and Sociology
Hiroshi Imamizu: The University of Tokyo, Department of Psychology, Graduate School of Humanities and Sociology
Nature Communications, 2025, vol. 16, issue 1, 1-12
Abstract:
Dance is an ancient, holistic art form practiced worldwide throughout human history. Although it offers a window into cognition, emotion, and cross-modal processing, fine-grained quantitative accounts of how its diverse information is represented in the brain remain rare. Here, we relate features from a cross-modal deep generative model of dance to functional magnetic resonance imaging (fMRI) responses recorded while participants watched naturalistic dance clips. We demonstrate that cross-modal features explain dance-evoked brain activity better than low-level motion and audio features. Using encoding models as in silico simulators, we quantify how dances that elicit different emotions yield distinct neural patterns. While expert dancers' brain activity is more broadly explained by dance features than that of novices, experts exhibit greater individual variability. Our approach links cross-modal representations from generative models to naturalistic neuroimaging, clarifying how motion, music, and expertise jointly shape aesthetic and emotional experience.
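The abstract describes a voxelwise encoding-model analysis: candidate stimulus feature spaces (cross-modal, low-level motion, low-level audio) are regressed onto fMRI responses and compared by cross-validated prediction accuracy. The Python sketch below illustrates that general approach on synthetic data; the feature dimensions, ridge regularization, and scoring choices are illustrative assumptions, not the authors' actual pipeline.

    # Minimal sketch of a voxelwise encoding-model comparison (synthetic data).
    # All dimensions and feature spaces here are hypothetical stand-ins.
    import numpy as np
    from sklearn.linear_model import RidgeCV
    from sklearn.model_selection import KFold

    rng = np.random.default_rng(0)
    n_trs, n_voxels = 1200, 500                # fMRI time points and voxels (synthetic)
    features = {
        "crossmodal": rng.standard_normal((n_trs, 512)),  # deep generative model features (assumed size)
        "motion":     rng.standard_normal((n_trs, 64)),   # low-level motion features (assumed size)
        "audio":      rng.standard_normal((n_trs, 128)),  # low-level audio features (assumed size)
    }
    bold = rng.standard_normal((n_trs, n_voxels))         # voxel responses (synthetic)

    def encoding_score(X, Y, n_splits=5, alphas=np.logspace(0, 4, 9)):
        """Mean cross-validated prediction correlation (Pearson r) per voxel."""
        scores = np.zeros(Y.shape[1])
        for train, test in KFold(n_splits=n_splits).split(X):
            model = RidgeCV(alphas=alphas).fit(X[train], Y[train])
            pred = model.predict(X[test])
            # Standardize predicted and observed responses, then average the
            # elementwise product over time to get per-voxel correlations.
            p = (pred - pred.mean(0)) / pred.std(0)
            o = (Y[test] - Y[test].mean(0)) / Y[test].std(0)
            scores += (p * o).mean(0)
        return scores / n_splits

    for name, X in features.items():
        r = encoding_score(X, bold)
        print(f"{name:>10}: mean prediction r = {r.mean():.3f}")

On real data, the feature space whose held-out predictions correlate best with measured responses (here, the cross-modal features, per the abstract's finding) is taken to best explain dance-evoked activity; the fitted models can then be probed with new feature inputs as in silico simulators.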
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-65039-w Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-65039-w
Ordering information: This journal article can be ordered from
https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-65039-w