Effective fashion labelling via image-based recommendation with deep learning
P. Valarmathi,
R. Dhanalakshmi and
Narendran Rajagopalan
International Journal of Business Innovation and Research, 2023, vol. 32, issue 1, 47-59
Abstract:
The tremendous increase in the volume of e-commerce data makes recommender systems an efficient approach to overcoming this information overload. Recently, deep learning has been applied in varied business fields, such as image processing and natural language processing, for higher performance. In particular, fashion organisations have started applying deep learning methods to their online businesses. Classification of objects/images is the most significant backbone of these applications. Generally, existing techniques depend on traditional features/attributes to characterise an image, for example, the visual properties retrieved by convolutional neural networks. We have developed a two-tier deep learning system that recommends apparel images similar to the apparel images passed as input. To accomplish this, a neural network classification system is employed as a visually aware, data-driven feature extractor. The extracted features and the ranking matrix are taken as the input for similarity-based recommendation employing a significant nearest neighbour algorithm.
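The second tier described above, similarity-based recommendation over extracted features, can be sketched roughly as follows. The paper's significant nearest neighbour (SNN) algorithm and ranking matrix are not detailed in the abstract, so this sketch substitutes a plain cosine-similarity nearest-neighbour lookup over hypothetical precomputed CNN feature vectors (e.g. from VGG16); the function and variable names are illustrative, not the authors' own.

```python
import numpy as np

def recommend_similar(features, query_idx, k=3):
    """Return the indices of the k items most similar to the query item,
    ranked by cosine similarity over CNN feature vectors.

    features : (n_items, n_dims) array of precomputed feature vectors
    query_idx : row index of the query item
    """
    # Normalise each feature vector so the dot product equals cosine similarity.
    normed = features / np.linalg.norm(features, axis=1, keepdims=True)
    sims = normed @ normed[query_idx]
    sims[query_idx] = -np.inf          # exclude the query item itself
    return np.argsort(-sims)[:k]       # indices of the k highest similarities

# Toy example: 5 items with 4-dimensional feature vectors.
rng = np.random.default_rng(0)
feats = rng.random((5, 4))
recs = recommend_similar(feats, query_idx=0, k=3)
print(recs)
```

In the paper's actual pipeline, `features` would come from the penultimate layer of a trained classifier such as VGG16 or AlexNet, and the neighbour selection would additionally incorporate the ranking matrix; neither detail is reproduced here.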
Keywords: image recommendation; convolutional neural network; significant nearest neighbour; SNN; VGG16; AlexNet. (search for similar items in EconPapers)
Date: 2023
Downloads: (external link)
http://www.inderscience.com/link.php?id=134306 (text/html)
Access to full text is restricted to subscribers.
Persistent link: https://EconPapers.repec.org/RePEc:ids:ijbire:v:32:y:2023:i:1:p:47-59
More articles in International Journal of Business Innovation and Research from Inderscience Enterprises Ltd
Bibliographic data for series maintained by Sarah Parker.