A neural circuit for context-dependent multimodal signaling in Drosophila
Elsa Steinfath,
Afshin Khalili,
Melanie Stenger,
Bjarne L. Schultze,
Sarath Ravindran Nair,
Kimia Alizadeh and
Jan Clemens
Additional contact information
Elsa Steinfath: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Afshin Khalili: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Melanie Stenger: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Bjarne L. Schultze: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Sarath Ravindran Nair: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Kimia Alizadeh: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Jan Clemens: a Joint Initiative of the University Medical Center Göttingen and the Max Planck Institute for Multidisciplinary Sciences
Nature Communications, 2025, vol. 16, issue 1, 1-15
Abstract: Many animals produce multimodal displays that combine acoustic, visual, or vibratory signals, yet the neural mechanisms coordinating these behaviors remain unclear. Using Drosophila courtship as a model, we reveal how a single neural circuit integrates sensory cues and motivational state to orchestrate multimodal signaling. Male flies produce both air-borne song and substrate-borne vibrations during courtship, but in distinct, largely non-overlapping contexts. We demonstrate that the same brain neurons that drive song also control vibrations through separate pre-motor pathways, with cell-type specific dynamics. This shared circuit coordinates multimodal displays with locomotion, ensuring vibrations are produced only when they can effectively reach the female. The circuit employs shared motifs, recurrence and mutual inhibition, that enable dynamic control of multimodal signals by external cues and internal state. A computational model confirms that these motifs are sufficient to explain the observed behavioral dynamics. Our findings illustrate how simple neural circuit elements can be combined to select and coordinate complex multimodal behaviors.
Date: 2025
Downloads: https://www.nature.com/articles/s41467-025-64907-9 Abstract (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:nat:natcom:v:16:y:2025:i:1:d:10.1038_s41467-025-64907-9
Ordering information: This journal article can be ordered from https://www.nature.com/ncomms/
DOI: 10.1038/s41467-025-64907-9
Nature Communications is currently edited by Nathalie Le Bot, Enda Bergin and Fiona Gillespie
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.