
Transfer Learning-Based Search Model for Hot Pepper Diseases and Pests

Helin Yin, Yeong Hyeon Gu, Chang-Jin Park, Jong-Han Park and Seong Joon Yoo
Additional contact information
Helin Yin: Department of Computer Science and Engineering, Sejong University, Seoul 05006, Korea
Yeong Hyeon Gu: Department of Computer Science and Engineering, Sejong University, Seoul 05006, Korea
Chang-Jin Park: Department of Bioresources Engineering, Sejong University, Seoul 05006, Korea
Jong-Han Park: Horticultural and Herbal Crop Environment Division, National Institute of Horticultural and Herbal Science, Rural Development Administration, Wanju 55365, Korea
Seong Joon Yoo: Department of Computer Science and Engineering, Sejong University, Seoul 05006, Korea

Agriculture, 2020, vol. 10, issue 10, 1-16

Abstract: The use of conventional classification techniques to recognize diseases and pests can lead to incorrect judgments on whether crops are diseased. Additionally, hot pepper diseases such as “anthracnose” and “bacterial spot” can be confused with one another, leading to incorrect disease recognition. To address these issues, multi-recognition methods such as Google Cloud Vision suggest multiple disease candidates and allow the user to make the final decision. Similarity-based image search techniques, along with multi-recognition, can also be used for this purpose. Several conventional similarity-based image searches use content-based image retrieval techniques, relying on descriptors to extract features such as image color and edges. In this study, we instead use eight pre-trained deep learning models (VGG16, VGG19, ResNet50, etc.) to extract deep features from images. We conducted experiments on 28,011 images covering 34 types of hot pepper diseases and pests. Images similar to a query image were retrieved by applying the k-nearest neighbor method to these deep features. For top-1 through top-5 results, the deep features from the ResNet50 model achieved recognition accuracies of approximately 88.38–93.88% for diseases and approximately 95.38–98.42% for pests. The deep features extracted from the VGG16 and VGG19 models recorded the second and third highest performances, respectively. For top-10 results, the deep features from the ResNet50 model achieved accuracies of 85.60% and 93.62% for diseases and pests, respectively. In a performance comparison between the proposed method and a simple convolutional neural network (CNN) model, the proposed method recorded 8.62% higher accuracy for diseases and 14.86% higher accuracy for pests than the CNN classification model.
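The retrieval pipeline described in the abstract (extract a deep feature vector per image, then rank gallery images by distance to the query) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the CNN feature-extraction step is stood in for by random 2048-dimensional vectors (the pooled output size of ResNet50), and `knn_search` is a hypothetical helper name.

```python
import numpy as np

def knn_search(query, gallery, k=5):
    """Return indices of the k gallery vectors closest to the query (Euclidean distance)."""
    dists = np.linalg.norm(gallery - query, axis=1)
    return np.argsort(dists)[:k]

# Hypothetical stand-in for deep features: 100 gallery images, 2048-dim vectors.
rng = np.random.default_rng(0)
gallery = rng.standard_normal((100, 2048))

# A query that is a slightly perturbed copy of gallery item 42,
# mimicking a new photo of a disease already in the database.
query = gallery[42] + 0.01 * rng.standard_normal(2048)

top5 = knn_search(query, gallery, k=5)
print(top5[0])  # item 42 ranks first among the retrieved candidates
```

In the paper's setting, the gallery vectors would come from a pre-trained model's penultimate layer, and the top-k retrieved images (with their disease/pest labels) would be shown to the user as candidates, matching the multi-recognition idea described above.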

Keywords: deep feature; k-nearest neighbor; hot pepper disease; similarity-based image retrieval; transfer learning
JEL-codes: Q1 Q10 Q11 Q12 Q13 Q14 Q15 Q16 Q17 Q18
Date: 2020
References: View complete reference list from CitEc
Citations: View citations in EconPapers (4)

Downloads: (external link)
https://www.mdpi.com/2077-0472/10/10/439/pdf (application/pdf)
https://www.mdpi.com/2077-0472/10/10/439/ (text/html)



Persistent link: https://EconPapers.repec.org/RePEc:gam:jagris:v:10:y:2020:i:10:p:439-:d:420779


Agriculture is currently edited by Ms. Leda Xuan

More articles in Agriculture from MDPI
Bibliographic data for series maintained by MDPI Indexing Manager ().

 
Page updated 2025-03-19
Handle: RePEc:gam:jagris:v:10:y:2020:i:10:p:439-:d:420779