LKDA-Net: Hierarchical transformer with large Kernel depthwise convolution attention for 3D medical image segmentation
Ming Li,
Jingang Ma and
Jing Zhao
PLOS ONE, 2025, vol. 20, issue 8, 1-23
Abstract:
Since Transformers have demonstrated excellent performance in two-dimensional medical image segmentation, recent works have also introduced them into 3D medical segmentation tasks. For example, hierarchical transformers such as Swin UNETR have reintroduced several forms of convolutional prior knowledge, further enhancing volumetric segmentation on three-dimensional medical datasets. The effectiveness of these hybrid architectures is largely attributed to their large number of parameters and the large receptive fields of non-local self-attention. We believe that large-kernel volumetric depthwise convolutions can obtain large receptive fields with fewer parameters. In this paper, we propose a lightweight three-dimensional convolutional network, LKDA-Net, for efficient and accurate three-dimensional volumetric segmentation. The network adopts a large-kernel depthwise convolution attention mechanism to simulate the self-attention of Transformers. First, inspired by the Swin Transformer block, we investigate large-kernel convolutional attention with different kernel sizes to obtain larger global receptive fields, and replace the MLP in the Swin Transformer with an inverted bottleneck with depthwise convolutional augmentation to reduce channel redundancy and enhance feature expression and segmentation performance. Second, we propose a skip-connection fusion module for smooth feature fusion, enabling the decoder to effectively utilize the encoder's features. Finally, in experimental evaluations on three public datasets (Synapse, BTCV and ACDC), LKDA-Net outperforms existing models of various architectures in segmentation performance while using fewer parameters. Code: https://github.com/zouyunkai/LKDA-Net.
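The abstract's central idea, using large-kernel depthwise convolution as an attention mechanism in 3D, can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' implementation (see their repository above): the decomposition into a local depthwise conv, a dilated depthwise conv and a pointwise conv, and the specific kernel sizes and dilation, are assumptions based on common large-kernel attention designs.

import torch
import torch.nn as nn

class LKDA3D(nn.Module):
    """Illustrative 3D large-kernel depthwise convolution attention block."""
    def __init__(self, channels: int, kernel_size: int = 7, dilation: int = 3):
        super().__init__()
        # Local depthwise conv captures fine detail in a small neighbourhood.
        self.dw = nn.Conv3d(channels, channels, kernel_size=5, padding=2,
                            groups=channels)
        # Dilated depthwise conv approximates a much larger receptive field
        # with few parameters (effective kernel = dilation * (k - 1) + 1).
        self.dw_dilated = nn.Conv3d(channels, channels, kernel_size=kernel_size,
                                    padding=(kernel_size // 2) * dilation,
                                    dilation=dilation, groups=channels)
        # Pointwise conv mixes channels to form the attention map.
        self.pw = nn.Conv3d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The convolutional branch produces a map that reweights the input,
        # mimicking the gating effect of self-attention.
        attn = self.pw(self.dw_dilated(self.dw(x)))
        return attn * x

if __name__ == "__main__":
    block = LKDA3D(channels=32)
    x = torch.randn(1, 32, 16, 64, 64)   # (batch, channels, D, H, W)
    print(block(x).shape)                # torch.Size([1, 32, 16, 64, 64])

Because every spatial convolution is depthwise (groups equal to the channel count), the parameter count grows roughly linearly in the kernel volume per channel rather than quadratically in channels, which is the lightweight trade-off the abstract emphasizes.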
Date: 2025
Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0329806 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 29806&type=printable (application/pdf)
Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0329806
DOI: 10.1371/journal.pone.0329806