
Sensitive crop leaf disease prediction based on computer vision techniques with handcrafted features

Manoj A. Patil and Manohar Manur
Additional contact information
Manoj A. Patil: Christ (Deemed to be University) School of Engineering and Technology
Manohar Manur: Christ (Deemed to be University) School of Engineering and Technology

International Journal of System Assurance Engineering and Management, 2023, vol. 14, issue 6, No 16, 2235-2266

Abstract: Agricultural production is the primary source of the economy in many countries, and tomatoes and potatoes are among the most sensitive and widely consumed vegetables worldwide. During growth, however, these crops suffer from many leaf diseases, which reduce productivity and farmers' income. Most farmers still detect plant diseases by naked-eye inspection, which is time-consuming, expensive, and requires expert judgment. Early and accurate diagnosis of tomato and potato leaf diseases therefore plays a vital role in sustainable agriculture. This paper proposes an efficient leaf disease classification model based on computer vision techniques. The proposed Adaptive Deep Neural Network (ADNN) is a hybrid model that combines an optimized long short-term memory (OLSTM) network with a convolutional neural network (CNN); the weight values of the LSTM classifier are selected using the Adaptive Raindrop Optimization (ARDO) algorithm. Handcrafted features extracted from the segmented image are fused with the hybrid deep neural network to improve classifier performance. The ADNN method consists of five steps: preprocessing, feature extraction, segmentation, handcrafted feature extraction, and classification. First, the leaf images are preprocessed to remove noise. The affected portion of each image is then segmented using an enhanced radial basis function neural network, and the segmented image is passed to the ADNN, which classifies the various diseases of potato and tomato leaves. The efficiency of the OLSTM-CNN-based ADNN model is evaluated in terms of accuracy, precision, recall, F-measure, specificity, and sensitivity. The ADNN model achieved a best accuracy of 98.02% for tomato and 98% for potato, and a comparison with existing state-of-the-art CNN, LSTM, ResNet50, and MobileNet techniques showed that it improves on all metrics.
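The fusion architecture the abstract describes (a CNN branch and an LSTM branch whose outputs are concatenated with a handcrafted feature vector before classification) can be sketched in code. The following Python/Keras snippet is a minimal illustrative sketch, not the authors' implementation: the layer sizes, the 128x128 input shape, the 10-class output, and the 32-dimensional handcrafted feature vector are all assumptions, and the ARDO-based selection of the LSTM weights is replaced here by standard gradient training. Feeding the image rows to the LSTM as a sequence of pixel vectors is likewise one common convention, assumed for concreteness.

# Minimal sketch of a hybrid CNN + LSTM classifier fused with
# handcrafted features (illustrative assumptions throughout).
import tensorflow as tf
from tensorflow.keras import layers, Model

NUM_CLASSES = 10          # assumption: number of tomato/potato disease classes
IMG = (128, 128, 3)       # assumption: segmented-leaf input size
HANDCRAFTED_DIM = 32      # assumption: length of color/texture/shape features

img_in = layers.Input(shape=IMG, name="segmented_leaf")
hand_in = layers.Input(shape=(HANDCRAFTED_DIM,), name="handcrafted")

# CNN branch: deep spatial features from the segmented image
x = layers.Conv2D(32, 3, activation="relu")(img_in)
x = layers.MaxPooling2D()(x)
x = layers.Conv2D(64, 3, activation="relu")(x)
x = layers.GlobalAveragePooling2D()(x)

# LSTM branch: treat image rows as a sequence of flattened pixel vectors
seq = layers.Reshape((IMG[0], IMG[1] * IMG[2]))(img_in)
s = layers.LSTM(64)(seq)

# Fuse deep features from both branches with the handcrafted vector
fused = layers.Concatenate()([x, s, hand_in])
out = layers.Dense(NUM_CLASSES, activation="softmax")(fused)

model = Model([img_in, hand_in], out)
model.compile(optimizer="adam",                       # ARDO replaced by Adam
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()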

Keywords: ADNN; OLSTM-CNN; CNN; LSTM; ARDO
Date: 2023

Downloads: (external link)
http://link.springer.com/10.1007/s13198-023-02066-0 Abstract (text/html)
Access to the full text of the articles in this series is restricted.

Persistent link: https://EconPapers.repec.org/RePEc:spr:ijsaem:v:14:y:2023:i:6:d:10.1007_s13198-023-02066-0

Ordering information: This journal article can be ordered from
http://www.springer.com/engineering/journal/13198

DOI: 10.1007/s13198-023-02066-0

International Journal of System Assurance Engineering and Management is currently edited by P.K. Kapur, A.K. Verma and U. Kumar

More articles in International Journal of System Assurance Engineering and Management from Springer, the Society for Reliability, Engineering Quality and Operations Management (SREQOM), India, and the Division of Operation and Maintenance, Lulea University of Technology, Sweden
Bibliographic data for series maintained by Sonal Shukla and Springer Nature Abstracting and Indexing.

 
Handle: RePEc:spr:ijsaem:v:14:y:2023:i:6:d:10.1007_s13198-023-02066-0