
An improved ViT model for music genre classification based on mel spectrogram

Pingping Wu, Weijie Gao, Yitao Chen, Fangfang Xu, Yanzhe Ji, Juan Tu and Han Lin

PLOS ONE, 2025, vol. 20, issue 3, 1-12

Abstract: Automating the task of music genre classification offers opportunities to enhance user experiences, streamline music management processes, and unlock insights into the rich and diverse world of music. In this paper, an improved ViT model is proposed to extract more comprehensive music genre features from Mel spectrograms by leveraging the strengths of both convolutional neural networks and Transformers. The paper also incorporates a channel attention mechanism that amplifies differences between channels within the Mel spectrograms of individual music genres, thereby facilitating more precise classification. Experimental results on the GTZAN dataset show that the proposed model achieves an accuracy of 86.8%, outperforming earlier approaches and paving the way for more accurate and efficient music genre classification.
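The model described above takes Mel spectrograms as input. As a rough illustration only (not the authors' code), a Mel spectrogram can be computed from a raw waveform by windowing the signal, taking the power spectrum of each frame, and projecting it onto a bank of triangular mel-scale filters. The parameter choices below (22050 Hz sample rate, 1024-point FFT, hop of 512, 64 mel bands) are illustrative assumptions, not those of the paper:

```python
import numpy as np

def hz_to_mel(f):
    # standard mel-scale mapping
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(sr, n_fft, n_mels):
    # triangular filters with centers spaced evenly on the mel scale
    mels = np.linspace(hz_to_mel(0.0), hz_to_mel(sr / 2.0), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mels) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(1, n_mels + 1):
        left, center, right = bins[i - 1], bins[i], bins[i + 1]
        for k in range(left, center):
            if center > left:
                fb[i - 1, k] = (k - left) / (center - left)
        for k in range(center, right):
            if right > center:
                fb[i - 1, k] = (right - k) / (right - center)
    return fb

def mel_spectrogram(signal, sr=22050, n_fft=1024, hop=512, n_mels=64):
    # frame and window the signal, then take the power spectrum per frame
    window = np.hanning(n_fft)
    n_frames = 1 + (len(signal) - n_fft) // hop
    frames = np.stack([signal[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, n=n_fft)) ** 2
    # project onto mel bands and convert to a log (dB-like) scale
    mel = power @ mel_filterbank(sr, n_fft, n_mels).T
    return 10.0 * np.log10(mel + 1e-10)

# toy usage: one second of a 440 Hz sine tone
sig = np.sin(2 * np.pi * 440 * np.arange(22050) / 22050)
S = mel_spectrogram(sig)
print(S.shape)  # (frames, mel bands)
```

In practice a library such as librosa (`librosa.feature.melspectrogram`) is typically used for this step; the resulting 2-D log-mel image is what gets fed to the CNN/ViT front end.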

Date: 2025
References: View complete reference list from CitEc

Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0319027 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 19027&type=printable (application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0319027

DOI: 10.1371/journal.pone.0319027


More articles in PLOS ONE from Public Library of Science
Bibliographic data for series maintained by plosone.

 
Page updated 2025-05-10
Handle: RePEc:plo:pone00:0319027