Contrast limited adaptive histogram equalization (CLAHE) and colour difference histogram (CDH) feature merging capsule network (CCFMCapsNet) for complex image recognition

Steve Okyere-Gyamfi, Michael Asante, Kwame Ofosuhene Peasah, Yaw Marfo Missah and Vivian Akoto-Adjepong

PLOS ONE, 2025, vol. 20, issue 10, 1-27

Abstract: Detecting leaf diseases has become a crucial research focus for enhancing crop yield. Deep learning and computer vision excel at digital image processing, and various deep-learning techniques have been applied to plant leaf disease detection; however, achieving high accuracy remains a challenge. Basic convolutional neural networks (CNNs) struggle with issues such as abnormal image orientation and rotation, resulting in subpar performance, and they require extensive data covering a wide range of variations to perform well. CapsNet is an innovative deep-learning architecture designed to address these limitations of CNNs: it performs well without needing vast amounts of data in many variations. CapsNets nevertheless have limitations of their own, such as the encoder network attending to every element in the image and the crowding problem; as a result, they perform well on simple image recognition tasks but struggle with more complex images. To address these challenges, we introduce a new CapsNet model, CCFM-CapsNet. This model incorporates CLAHE to reduce image noise and CDH to extract crucial features. Max-pooling and dropout layers are also incorporated into the original CapsNet architecture. The model is applied to identifying and classifying diseases in apples, bananas, grapes, corn, mangoes, peppers, potatoes, rice, and tomatoes, as well as to classifying the Fashion-MNIST and CIFAR-10 datasets. The proposed CCFM-CapsNet achieves high validation accuracies of 99.53%, 95.24%, 99.75%, 97.40%, 99.13%, 100%, 99.77%, 100%, 98.54%, 93.48%, and 82.34%, with corresponding parameter counts in millions (M) of 4.68M, 4.68M, 4.68M, 4.68M, 4.79M, 4.63M, 4.66M, 4.68M, 4.84M, 2.39M, and 4.84M for the aforementioned datasets, respectively, outperforming the traditional CapsNet and other advanced CapsNet models. Consequently, the CCFM-CapsNet model can be used effectively as a smart tool for identifying plant diseases, contributing to Sustainable Development Goal 2 (Zero Hunger), which aims to end global hunger by 2030.

Date: 2025

Downloads: (external link)
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0335393 (text/html)
https://journals.plos.org/plosone/article/file?id= ... 35393&type=printable (application/pdf)



Persistent link: https://EconPapers.repec.org/RePEc:plo:pone00:0335393

DOI: 10.1371/journal.pone.0335393



Handle: RePEc:plo:pone00:0335393