Dynamic Convolution Neural Networks with Both Global and Local Attention for Image Classification
Chusan Zheng,
Yafeng Li,
Jian Li,
Ning Li,
Pan Fan,
Jieqi Sun and
Penghui Liu
Additional contact information
Chusan Zheng: School of Mathematics and Information Sciences, Baoji University of Arts and Science, Baoji 721013, China
Yafeng Li: School of Computer, Baoji University of Arts and Science, Baoji 721016, China
Jian Li: School of Electrical and Control Engineering, School of Mathematics and Data Science, Shaanxi University of Science and Technology, Xi’an 710016, China
Ning Li: School of Computer, Baoji University of Arts and Science, Baoji 721016, China
Pan Fan: School of Computer, Baoji University of Arts and Science, Baoji 721016, China
Jieqi Sun: School of Electrical and Control Engineering, School of Mathematics and Data Science, Shaanxi University of Science and Technology, Xi’an 710016, China
Penghui Liu: School of Computer, Baoji University of Arts and Science, Baoji 721016, China
Mathematics, 2024, vol. 12, issue 12, 1-19
Abstract:
Convolution is a crucial component of convolution neural networks (CNNs). However, standard static convolution has two primary defects: it is data-independent, and it integrates global and local features poorly. This paper proposes a novel and efficient dynamic convolution method with global and local attention to address these issues. A building block called the Global and Local Attention Unit (GLAU) is designed, in which the proposed dynamic convolution kernels are generated by a weighted fusion of global channel attention kernels and local spatial attention kernels. The GLAU is data-dependent, adapts better to its input, and integrates global and local features into each layer. CNNs modified with GLAUs are referred to as "GLAUNets". Extensive image classification experiments comparing GLAUNets with classical CNNs and state-of-the-art dynamic convolution neural networks were conducted on popular benchmark datasets. In terms of classification accuracy, number of parameters, and computational complexity, the experimental results demonstrate the outstanding performance of GLAUNets.
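The abstract does not give the exact formulation of the GLAU, so the following is a minimal PyTorch sketch of the general idea only: a squeeze-and-excitation style global channel branch and a convolutional local spatial branch each modulate a base kernel, and the two data-dependent kernels are fused by a learnable weight. The class name GLAUSketch, the layer sizes, and the fusion rule are illustrative assumptions, not the authors' implementation.

    # Hypothetical sketch of a GLAU-style dynamic convolution block (not the authors' code).
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GLAUSketch(nn.Module):
        """Dynamic convolution whose kernel is a weighted fusion of a
        global-channel-attention kernel and a local-spatial-attention kernel.
        All layer sizes and the fusion rule are illustrative assumptions."""

        def __init__(self, in_ch, out_ch, k=3, reduction=4):
            super().__init__()
            self.k = k
            # Static base kernel that both attention branches modulate.
            self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.02)
            # Global branch: squeeze-and-excitation style channel attention.
            self.global_fc = nn.Sequential(
                nn.Linear(in_ch, in_ch // reduction),
                nn.ReLU(inplace=True),
                nn.Linear(in_ch // reduction, in_ch),
                nn.Sigmoid(),
            )
            # Local branch: a spatial attention map pooled down to the k x k kernel grid.
            self.local_conv = nn.Conv2d(in_ch, 1, kernel_size=7, padding=3)
            self.pool_to_kernel = nn.AdaptiveAvgPool2d(k)
            # Learnable fusion weight between the two data-dependent kernels.
            self.alpha = nn.Parameter(torch.tensor(0.5))

        def forward(self, x):
            b, c, _, _ = x.shape
            # Global channel attention: one scale per input channel, per sample.
            g = self.global_fc(x.mean(dim=(2, 3)))                       # (B, C_in)
            k_global = self.weight.unsqueeze(0) * g.view(b, 1, c, 1, 1)
            # Local spatial attention: a k x k map applied over kernel positions.
            s = torch.sigmoid(self.pool_to_kernel(self.local_conv(x)))   # (B, 1, k, k)
            k_local = self.weight.unsqueeze(0) * s.view(b, 1, 1, self.k, self.k)
            # Weighted fusion yields one kernel per sample: (B, C_out, C_in, k, k).
            kernel = self.alpha * k_global + (1 - self.alpha) * k_local
            # Apply per-sample kernels via the grouped-convolution trick.
            out_ch = kernel.shape[1]
            x = x.reshape(1, b * c, *x.shape[2:])
            kernel = kernel.reshape(b * out_ch, c, self.k, self.k)
            y = F.conv2d(x, kernel, padding=self.k // 2, groups=b)
            return y.reshape(b, out_ch, *y.shape[2:])

For example, GLAUSketch(16, 32)(torch.randn(2, 16, 28, 28)) returns a (2, 32, 28, 28) tensor; in the paper's GLAUNets, blocks of this kind would stand in for standard static convolution layers.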
Keywords: attention mechanism; convolution neural network; dynamic kernel; image classification
JEL-codes: C
Date: 2024
Downloads:
https://www.mdpi.com/2227-7390/12/12/1856/pdf (application/pdf)
https://www.mdpi.com/2227-7390/12/12/1856/ (text/html)
Persistent link: https://EconPapers.repec.org/RePEc:gam:jmathe:v:12:y:2024:i:12:p:1856-:d:1414829